How we won the internal hackathon by learning skibidi, flossing and JavaScript

    VK has a cool tradition: an internal hackathon that only VKontakte employees can take part in. I'll tell you about it on behalf of the team that won first place this year, collapsed from exhaustion in full strength, but still managed to prototype a dance motion detector for the Stories camera.



    My name is Pavel, I lead the research team at VKontakte, and I have a soft spot for hackathons: as a participant (Junction, a number of DeepHacks) and more recently as a curator (VK Hackathon, the VK case at Junction - by the way, that was the first time a Russian company took part). We have been running the open-to-everyone VK Hackathon for the fourth year (last time we took over the General Staff Building of the Hermitage), and a significant part of our technical team competed in it before settling down at VK.

    The internal hackathon lets the team itself experiment freely with the platform, try out different ideas and just have fun. An important difference is that the solutions can be much more deeply integrated into VK, which lets us build some really interesting prototypes.

    The hackathon takes place in the Singer House and lasts a full day and night: right in the middle of the week the headquarters turns into an all-night hustle. It's funny to watch the cleaners look around in surprise in the morning: the office, usually empty at 6 a.m., suddenly turns out to be full of disheveled people who shuffle around like zombies and shout "Five hours left!" Or when you walk into the kitchen at three in the morning and it smells like a university dorm during exam season: energy drinks, pizza and panic. That happens on a regular day too, of course, but rarely on this scale.

    The three previous internal hackathons were held in summer. In 2019 we decided not to hold back and run one in winter as well: two hackathons are much better than one, because it's a great chance to experiment and build an idea you can never find time for at a normal pace. The rules changed too: before, a team could have at most three people, and this year four, with one of them not writing code but specializing in something else. You could bring designers, product managers, testers, marketers and so on onto the team. In total, 38 teams took part in this hackathon.

    Dream team (more precisely, one of 38)


    Dania and I conspired and together talked Yegor and Tyoma into joining the team. The models were on the two of us, Yegor was responsible for iOS, and Tyoma for production and design. Mobile development + design + a bit of machine learning and backend is the recipe for success at a 2k19 hackathon.

    Another new thing this year was the division into tracks, which hadn't existed before: Media (the one we entered), Communications, Infrastructure, Content and Entertainment. We had strong competitors. For example, we were constantly kept on our toes by VK senior designer Ilya, who kept coming into our room and showing prototypes of his team's ideas.

    Idea


    “I've taken prizes at almost every hackathon I've participated in, and I expected the same from the internal one this winter.” (Dania, full of confidence)

    Our idea (Dania's, to be precise) started out like this: we wanted to explore music generation and get everything running on the device, otherwise it would be "too much backend". The hackathon began with a brainstorm: we sat around thinking about what we could come up with. Generating music is interesting, but we wanted it to depend on the user. Some buttons? Maybe draw on the screen and generate music based on that? In parallel, the guys from the Music team were figuring out how to add the tracks we needed. But it still didn't feel quite right. The neighboring teams were cheerfully hacking away at something behind their laptops, which only added to the frustration.
    - What if we recognize air guitar: you pretend you're playing a guitar, and depending on that we play guitar sounds? (Tyoma)

    Bingo! It's a workable idea, and everything it needs is well within our power. For recognizing movements there is PoseNet, and it's pretty solid (and mobile-friendly, too). Let's get to it!



    Solution


    The main tasks were to get the neural net running on the device (it had to be the real thing) and to learn to recognize movements. Yegor took on the porting, Tyoma worked out which moves would be interesting to add (just a guitar is boring), and Dania and I handled recognizing them. But that requires data. What's the difference between a pro and an amateur? A pro has a cluster with GPUs, that's one; and two, a pro will collect their own data when they need it. Dania set up a rig where raw coordinates of the detected skeleton were logged from the camera, and then we danced! That night we learned to do the floss, the skibidi and the pipe.
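    To give a feel for what the rig produced, here is a minimal Python sketch of the logging side. The 17 keypoint names are PoseNet's standard COCO-style output; the CSV layout and the log_frame helper are purely illustrative (the real rig ran PoseNet in the browser and logged from JavaScript).

```python
import csv
import time

# PoseNet reports 17 keypoints in COCO order; we flatten them into CSV columns.
KEYPOINTS = [
    "nose", "leftEye", "rightEye", "leftEar", "rightEar",
    "leftShoulder", "rightShoulder", "leftElbow", "rightElbow",
    "leftWrist", "rightWrist", "leftHip", "rightHip",
    "leftKnee", "rightKnee", "leftAnkle", "rightAnkle",
]

def log_frame(writer, label, keypoints):
    """Append one labeled frame; keypoints is {name: (x, y, score)}."""
    row = [time.time(), label]
    for name in KEYPOINTS:
        x, y, score = keypoints.get(name, (0.0, 0.0, 0.0))
        row += [x, y, score]
    writer.writerow(row)

# Usage: open a CSV and call log_frame() for every camera frame while
# someone dances the move named by `label`.
with open("dances.csv", "a", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["ts", "label"] +
                    [f"{k}_{c}" for k in KEYPOINTS for c in ("x", "y", "score")])
    log_frame(writer, "floss", {"nose": (0.52, 0.21, 0.98)})  # toy example frame
```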




    As the rig for recording movements we used a work laptop, which at first mostly captured Dania's face (he had never written a single line of JS before) every time he saw yet another incomprehensible JS error.



    - I don't get it, I'm hitting errors on the level of "print has disappeared from Python"! (Dania)

    Night dancing (literally)


    Over the night we filmed many hours of continuous movement in front of the camera. We recorded ourselves, and we also caught developers who wandered onto our floor and MADE THEM DANCE. We ended up with seven different moves, and now we had to learn to tell them apart.




    - I dropped in every three hours to check whether the guys were still alive. Pasha was shouting "We have a pivot!" and Dania was flexing with the last of his strength. Then everyone danced the pipe. When Dania ran out of energy, Pasha opened the window and said: "Boys, we need to freshen up." (Madina)

    The skeleton data was preprocessed: we threw out the legs, averaged the head points and converted everything to polar coordinates relative to the torso. The motion detector was trained with CatBoost; the model takes a three-second snippet of the data stream as input. We had never worked with the library before that night: it turned out to be battle-ready, and you can even embed it in iOS.
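    In code, that preprocessing step looks roughly like this minimal sketch (keypoint names follow PoseNet's COCO layout; the choice of torso reference point and the feature order here are illustrative assumptions, not necessarily exactly what ran that night):

```python
import numpy as np

HEAD = ["nose", "leftEye", "rightEye", "leftEar", "rightEar"]
LEGS = ["leftKnee", "rightKnee", "leftAnkle", "rightAnkle"]
TORSO = ["leftShoulder", "rightShoulder", "leftHip", "rightHip"]

def frame_to_features(keypoints):
    """keypoints: {name: (x, y)} for one PoseNet frame -> 1D feature vector."""
    pts = {k: np.asarray(v, dtype=float) for k, v in keypoints.items()}

    # Average the head points into a single "head" point and drop the legs.
    head = np.mean([pts[k] for k in HEAD if k in pts], axis=0)
    kept = {"head": head}
    kept.update({k: p for k, p in pts.items() if k not in HEAD and k not in LEGS})

    # The torso center is the origin for the polar coordinates.
    center = np.mean([pts[k] for k in TORSO], axis=0)

    features = []
    for name in sorted(kept):            # fixed order -> stable feature layout
        dx, dy = kept[name] - center
        features += [np.hypot(dx, dy),   # radius from the torso
                     np.arctan2(dy, dx)] # angle
    return np.asarray(features)          # 9 points x (r, phi) = 18 values
```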



    We trained a multiclass classifier, with one class made as boring as possible: just milling about in front of the camera. The hardest move to record was "rock": we headbanged so selflessly that after a while our heads started spinning. We also threw up our hands with the devil horns, although that was pointless: PoseNet has only one point for the entire hand and can't see fingers.
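    The training itself can be sketched like this: per-frame feature vectors are stacked into three-second windows and fed to a multiclass CatBoost model, with "idle" as the extra boring class. The frame rate, hop size and hyperparameters below are placeholder assumptions; only the multiclass setup and the three-second window come from what we actually did.

```python
import numpy as np
from catboost import CatBoostClassifier

FPS = 15                       # assumed camera / PoseNet frame rate
WINDOW = 3 * FPS               # a three-second snippet of frames
N_FEATURES = 18                # 9 kept points x (r, phi) from the preprocessing step

def make_windows(frame_features, labels, hop=WINDOW // 2):
    """Stack consecutive per-frame feature vectors into flat three-second samples."""
    X, y = [], []
    for start in range(0, len(frame_features) - WINDOW, hop):
        chunk = frame_features[start:start + WINDOW]
        X.append(np.concatenate(chunk))
        y.append(labels[start + WINDOW - 1])   # label of the window's last frame
    return np.array(X), np.array(y)

# Tiny synthetic demo so the sketch runs end to end: one dance class plus
# the deliberately boring "idle" class (in reality there were seven moves).
rng = np.random.default_rng(0)
frames = [rng.normal(size=N_FEATURES) for _ in range(10 * WINDOW)]
labels = ["idle"] * (5 * WINDOW) + ["floss"] * (5 * WINDOW)
X, y = make_windows(frames, labels)

model = CatBoostClassifier(loss_function="MultiClass", iterations=50, depth=4, verbose=0)
model.fit(X, y)
print(model.predict(X[:3]))
```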



    - Somewhere around 3 a.m. Pasha climbed into a sleeping bag and spent an hour moving around exclusively inside it, hopping like a real kangaroo. (Madina)

    At around 8 a.m. a small crisis hit: everything broke and nothing worked, and then suddenly everything worked again. Wiring both models into the app turned out to be the biggest challenge; Yegor finished the build just five minutes before the deadline. We'll let him tell it:

    - Once we found the idea, everything went very smoothly and productively. The guys trained the network and danced, and I wired the JavaScript PoseNet into the Stories camera, running it directly in the browser. The first test runs worked well and were surprisingly fast. So when it turned out in the morning that WebGL inside the WebView unexpectedly crashes on some nonsense when working with textures, and no workaround could be found, I almost fell into despair. But it was too late to give up: we were burning with the idea. So, with our last strength and on the last can of Red Bull, we dragged an alternative CoreML model into the iOS client on the fly and started tracking poses natively, so that we could then feed them to the dance model and get a result out of it. Essentially, we did the same work twice! Another challenge was the second model, which suddenly expected more than a thousand input arguments! Xcode generated an interface for it that would have been simply unrealistic to use head-on. Luckily, Objective-C didn't let us down, and an elegant solution was found. (Egor)

    Pitching


    On Friday at 2 p.m. we had to upload a video about the project; several teams didn't make it in time and were disqualified. And at 14:40 we pitched to the track's curators on the product side. Ours were people from the Video and Music teams, and it seems they liked everything in the pitch. We took second place in our track (we wanted first, because our project is so cool!) and made it to the final (two teams advanced from our track).

    - This year was my first time curating the internal hackathon. I'll put it mildly: judging the work was extremely difficult. The level of every team, without exception, was somehow off the charts. A feature shouldn't be merely technological, merely "close to production", or merely "potentially useful to our products". The winning project has to meet all of these criteria at once. It seems the guys pulled it off. (Andrey)

    The final pitch was at 17:40. By then we had to prepare another demo, this time for the whole Team, and the jury was different: the technical director, the product director and the marketing director.

    At five in the evening it was all over — we drove home to sleep, not knowing anything about the results.

    The results, finally


    The results weren't announced until Monday. First they awarded the track winners (not us - remember, we came second in our track), then the audience award (not us), then third place (also not us), second place (again not us) and, finally, us.

    Here are the projects we had to compete with:

    2nd place - a responsive voice assistant;
    3rd place - a timeline of internal errors;
    Audience Award - a reminder about upcoming meetings in chats.

    “This is the best hackathon I've ever been to. There was even more drive than at Junction.” (Dania)

    - I really enjoyed working with colleagues from a completely different department. I had never touched machine learning before, it seemed like some kind of magic to me, but not anymore. (Egor)
    - It was very cool to become part of such a cool team with such a cool project. Over the day I managed to be a designer, videographer, sound engineer, editor, musician and copywriter! And I was the only one who managed to get some sleep. (Tyoma)

    Life after the hackathon


    Most projects built at hackathons never make it to production, for various reasons: a shift in focus, the complexity of implementation, something unforeseen along the way. The internal hackathon is no exception.

    Nevertheless, here are the projects that did see the light of day:

    • Vinci, beloved by everyone;
    • the user compatibility check, which launched on February 14, 2018;
    • beautiful posters for short posts;
    • and a number of internal features that we would be happy to tell you about, but can't :)

