Finger tracking with Microsoft Kinect

    I recently returned to work after my summer adventures, and the first thing my boss sent my way was a link, which is what we will discuss here.

    Since one of my current projects is gesture recognition with a Kinect camera (analyzing gestures and having a robot act on them), the link, as you might guess, was on a related topic.

    So, meet Kinect 3D Hand Tracking or, in plain words, tracking the 3D position of a hand with a Kinect.



    What came before

    Until now I had personally seen something like this only here, and for me the demo at that link never did work stably.
    Besides, if I moved more than two meters away from the sensor, it could not track my fingers at all (in fact, the Kinect lost my fingers at about a meter and a half, but maybe my fingers are just wrong). To be fair, everything ran quite fast even on my laptop, without using a GPU. And the code is open, since that demo project is part of what I am not afraid to call the largest open-source robotics project, ROS.
    But that is not what we will talk about here. Imagine instead that we want to track the movements of individual fingers very precisely.

    Who are all these people and what did they write

    Three researchers, Iason Oikonomidis, Nikolaos Kyriazis, and Antonis Argyros from the Computer Science Department of the University of Crete, wrote a demo that you can try with your own hand: it tracks the hand, including all five fingers.
    What exactly did these three write?
    (The following list is translated directly from the project page.)
    Their software tracks the 3D position, orientation, and full articulation of a human hand from visual data, without the use of any markers. The method they developed:
    • models the full articulation of the hand (26 degrees of freedom) performing any natural movements;
    • works on depth data obtained with an easily accessible and widely used RGB-D camera (Kinect, Xtion);
    • does not require markers or special gloves;
    • runs at a rate of 20 fps, albeit on a GPU;
    • does not require calibration;
    • does not use any proprietary tracking technology such as NiTE, OpenNI, or the Kinect SDK.
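
    To make it concrete what kind of input such a tracker consumes, here is a minimal sketch of grabbing one depth frame from a Kinect using the open libfreenect driver (its Python bindings). This is just an illustration under my own assumptions, not the authors' code or their actual pipeline:

    # A minimal sketch, assuming the `freenect` Python bindings for the
    # open libfreenect driver are installed; NOT the authors' tracker.
    import freenect
    import numpy as np

    # Grab one depth frame synchronously: an array of raw 11-bit depth
    # readings plus a timestamp.
    depth, timestamp = freenect.sync_get_depth()

    depth = np.asarray(depth)
    print("frame shape:", depth.shape)                  # typically (480, 640)
    print("raw depth range:", depth.min(), depth.max())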


    After clarifying that it is better to wear sleeves so that the program can track your hand more easily, they go straight to the video demonstrations, which I include below for those too lazy to follow any of the links above.
    Example 1
    Example 2
    Example 3
    Example 4
    Example 5

    How fast does it all work

    And a few words about the hardware all of this is worth running on:
    • CPU: Pentium® Dual-Core T4300 @ 2.10GHz with 4096 MB of RAM; GPU: GeForce GT 240M with 1024 MB of RAM; Tracking FPS: 1.73792
    • CPU: Intel® Core™2 6600 @ 2.40GHz with 4096 MB of RAM; GPU: GeForce 9600 GT with 1024 MB of RAM; Tracking FPS: 2.15686
    • CPU: Intel® Core™2 Duo T7500 @ 2.20GHz with 4096 MB of RAM; GPU: Quadro FX 1600M with 256 MB of RAM; Tracking FPS: 2.66695
    • CPU: Intel® Core™ i7 950 @ 3.07GHz with 6144 MB of RAM; GPU: GeForce GTX 580 with 1536 MB of RAM; Tracking FPS: 19.9447
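
    In case it is unclear what "Tracking FPS" means here: it is simply the number of frames the tracker gets through per second. A trivial way to measure such a figure is sketched below, with track_frame() as a hypothetical placeholder of my own, not an API of this project:

    import time

    # Hypothetical sketch: measure tracking throughput in frames per second.
    # track_frame is a placeholder for whatever per-frame tracking step runs.
    def measure_fps(track_frame, frames):
        start = time.perf_counter()
        for frame in frames:
            track_frame(frame)  # one full tracking iteration per frame
        elapsed = time.perf_counter() - start
        return len(frames) / elapsed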

    Summary

    The system turned out to be slow, but it already looks quite promising, although of course it still makes mistakes. For what I am doing it is currently useless, because the robot runs on an Atom. So finger-level gesture recognition will have to wait, and I will stick to gestures that use the whole hand.
    Let's hope the whole thing gets optimized, and that sooner or later we will be able to control a robot or a computer, among other things, with precise hand gestures.

    UPD:
    A direct link to download the demo itself: a binary for Windows x64.
    You can see what needs to be installed here.

    P.S. I would be grateful if you point out any inaccuracies; I am open to feedback. Thanks.
