Reconstructing MIDI from Synthesia videos (and similar)
Once, while browsing YouTube for interesting educational melodies, I came across some Synthesia videos, a few of which I really liked and decided to download and learn... =) But alas, it turned out that the videos were there, yet nobody wanted to upload the MIDI files =(
I went to Google to see whether there were ready-made solutions that would suit me, but all I could find were audio-to-MIDI converters, which disappointed me a little... Without thinking twice, I decided that it should be possible to reconstruct the MIDI frame by frame from the video itself, and I set out to implement it.
I didn't want to write everything from scratch, so I decided to build on the ready-made components that Debian GNU/Linux provides, and Python turned out to be the best fit.
At first I planned to work with ready-made pictures (frames pulled out of the videos), but after the first runs I realized it made no sense: it was very slow and also ate up a lot of disk space... So I decided to try something new to me, OpenCV (I had wanted to get a feel for it for a long time), and it turned out that OpenCV works very well with a video stream, providing all the functions I needed (reading a pixel, drawing frames and text).
For example, opening a video file and reading one frame takes just two lines:
import cv2
vidcap = cv2.VideoCapture('test.mp4')   # open the video file
success, image = vidcap.read()           # read one frame; success is False at the end of the stream
And if you want, you can dump frames straight to disk:
cv2.imwrite("/tmp/frame%d.jpg" % frame, image)   # frame is the current frame counter
After some time, I wrote a function that generates the positions of the keys of the virtual keyboard and draws them (as rectangles) on top of the stream image, then dumped a frame to check how well the rectangles lined up with the on-screen keyboard.
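The original function isn't shown here, so the following is only a minimal sketch of the idea. The keyboard geometry (the width the keyboard spans, its vertical band, the number of keys and the starting pitch) is an assumption that has to be adjusted per video, and the naive equal-width layout ignores the narrower black keys:

import cv2

def generate_key_positions(frame_width, keyboard_top, keyboard_bottom, num_keys=88, first_pitch=21):
    """Return (midi_pitch, x, y) sample points, one per key (MIDI pitch 21 = A0)."""
    key_width = frame_width / float(num_keys)      # naive: equal width per key
    y = (keyboard_top + keyboard_bottom) // 2      # sample in the middle of the key strip
    return [(first_pitch + i, int((i + 0.5) * key_width), y) for i in range(num_keys)]

def draw_key_positions(image, positions):
    """Draw small rectangles at each sample point to verify the alignment visually."""
    for pitch, x, y in positions:
        cv2.rectangle(image, (x - 3, y - 3), (x + 3, y + 3), (0, 255, 0), 1)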
So I decided that, frame by frame, while reading images from the video stream, I would read the active notes from the positions of the virtual keys: a note counts as active only when the pixel at its key position matches the reference color (or is close to it), and those notes are then sent to MIDI. I couldn't register notes quite as simply as with a real MIDI keyboard, where things are a little easier, because here I have to track when a key lights up and when it goes dark (a rough sketch of this loop follows the MIDIUtil example below).

I tested it on a video, saw how many notes it picked up (and there were a lot of them) and thought: not bad. All that was left was to figure out how to write the notes to a file. After a little searching I found the excellent python-midiutil package, and after some time I was able to write notes to MIDI. As it turned out, python-midiutil is a very simple and user-friendly package. For example, creating a file and adding notes takes just a couple of lines:
from midiutil.MidiFile import MIDIFile   # provided by the python-midiutil package

mf = MIDIFile(1)                         # a file with one track
track, channel, time = 0, 0, 0
mf.addTrackName(track, time, "Sample Track")
mf.addTempo(track, time, 60)
# pitch, keytime, duration and volume come from the detection step
mf.addNote(track, channel, pitch, keytime, duration, volume)
with open(outputmid, 'wb') as outf:
    mf.writeFile(outf)
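And this is roughly how the per-frame detection loop described above could look. It is only a sketch under my own assumptions: the names (detect_notes, color_close, positions, ref_color) and the color tolerance are mine, not from the actual tool, and the key positions and reference color have to be chosen per video:

import cv2
import numpy as np

def color_close(pixel, reference, tolerance=40):
    """True if a BGR pixel is within `tolerance` of the reference color."""
    return np.all(np.abs(pixel.astype(int) - np.asarray(reference, dtype=int)) < tolerance)

def detect_notes(video_path, positions, ref_color):
    """Return (pitch, start_frame, duration_in_frames) tuples for every detected note."""
    vidcap = cv2.VideoCapture(video_path)
    active = {}   # pitch -> frame index where the key first lit up
    notes = []
    frame = 0
    while True:
        success, image = vidcap.read()
        if not success:
            break
        for pitch, x, y in positions:
            pressed = color_close(image[y, x], ref_color)
            if pressed and pitch not in active:
                active[pitch] = frame                          # note on
            elif not pressed and pitch in active:
                start = active.pop(pitch)
                notes.append((pitch, start, frame - start))    # note off
        frame += 1
    return notes

The frame indices can then be converted into beats from the video's frame rate and the chosen tempo before being passed to addNote.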
Loading the resulting MIDI into LMMS went quite well, and first of all I restored a couple of my favorite tunes. Then it became clear that the key-position generating function was not very convenient: the keyboard's location changes from video to video, so I decided to write a GUI. I made a simple one, but with a key-placement function.
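The GUI itself lives in the repository; just to illustrate the idea, a key-placement step could be done even with OpenCV's mouse callbacks alone. The snippet below is my own sketch, not the program's actual interface: clicking on a paused frame records sample points for the keys in order, and Esc finishes the placement.

import cv2

clicked = []   # collected (x, y) sample points, one per key, in keyboard order

def on_mouse(event, x, y, flags, param):
    if event == cv2.EVENT_LBUTTONDOWN:
        clicked.append((x, y))

vidcap = cv2.VideoCapture('test.mp4')
success, image = vidcap.read()
cv2.namedWindow('place keys')
cv2.setMouseCallback('place keys', on_mouse)
while True:
    preview = image.copy()
    for x, y in clicked:
        cv2.circle(preview, (x, y), 3, (0, 255, 0), -1)   # show the points placed so far
    cv2.imshow('place keys', preview)
    if cv2.waitKey(20) & 0xFF == 27:                      # Esc ends placement
        break
cv2.destroyAllWindows()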
I think this program can be useful to many people, so I have posted everything on GitHub.