Track an object by its color using AForge.NET

    Hello. As the common phrase goes, "my first post" :). In it I want to tell you about my small project for tracking an object by its color. Color tracking now has a fairly wide range of applications, for example the motion controllers of the Wii and PlayStation 3. The work is built on Andrew Kirillov's AForge.NET, a rather powerful framework for do-it-yourself image processing.
    The code does not pretend to be "the ultimate truth": much has been simplified (in one place I even allowed a certain amount of duplication: for quick pixel access I wrote my own class, although AForge already has similar facilities). Nevertheless, the code works: it tracks the object, reports its location, and can dynamically recalculate the object's shade (in case the lighting changes).

    If you are interested, the details are under the cut.


    A short excursion into AForge


    The framework is a set of libraries, each designed to solve a particular kind of problem:
    • AForge.Imaging - a library with filters and calculations for image processing;
    • AForge.Vision - a machine vision library;
    • AForge.Neuro - a library for working with neural networks;
    • AForge.Genetic - a library for working with genetic algorithms;
    • AForge.Fuzzy - a library of fuzzy calculations;
    • AForge.MachineLearning - a machine learning library;
    • AForge.Robotics - a library that provides support for some Robotics kits;
    • AForge.Video - a set of libraries for video processing.

    A set of examples ships with the libraries.

    User interface


    I did not write my project from scratch; I took the Vision\MotionDetector sample as the basis. It already knows how to connect to a webcam or a remote camera (via a JPEG/MJPEG URL), as well as open certain video files (which, I admit, I have not experimented with):
    image

    The original sample can detect motion in a stream using several algorithms, the simplest of which finds the difference between two consecutive frames.
    The form code was reworked by hand and tailored specifically to this task. On the Motion tab you select the object-search algorithm:
    image

    Then select the object through the Define color tracking object form:
    image

    Information about the object will be displayed in the status bar of the main form:
    image

    As an additional setting there is a color difference threshold: the ability to track not one exact color, but to allow for its variations:
    image

    Additionally, the user can specify whether the object's color should also be tracked during processing (that is, not only is the object itself tracked by its color, but its new color is recalculated as frames are processed):
    image

    The next nicety comes standard with the AForge motion detector: the object can be highlighted on the image in different ways:
    image

    Implementation


    AForge.Vision.Motion.IMotionDetector is the interface through which the difference between images is searched for. The ColorDetection class that performs the processing implements it.
    To interact with the user interface, an Initialize(Image image, Rectangle rect) method was added; it initializes the processing of subsequent frames. Here information about the target (the rectangle selected on the image) is collected: the dominant color of the selected area (this property will serve as the basis for tracking from here on) is determined, and the target's position is remembered.
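    For illustration, here is a minimal sketch of how the dominant color of the selected rectangle could be estimated by simple averaging (this is not the project's actual code; the helper name GetAverageColor is mine, and the incoming Image is assumed to be a Bitmap):

    using System.Drawing;

    static class ColorSampling
    {
        // Average the RGB components over the selected rectangle.
        // Bitmap.GetPixel is slow, but acceptable for a one-off initialization step.
        public static Color GetAverageColor(Bitmap frame, Rectangle rect)
        {
            long r = 0, g = 0, b = 0;
            int count = 0;

            for (int y = rect.Top; y < rect.Bottom; y++)
            {
                for (int x = rect.Left; x < rect.Right; x++)
                {
                    Color c = frame.GetPixel(x, y);
                    r += c.R; g += c.G; b += c.B;
                    count++;
                }
            }

            return count == 0
                ? Color.Black
                : Color.FromArgb((int)(r / count), (int)(g / count), (int)(b / count));
        }
    }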

    IMotionDetector has the following methods:
    • ProcessFrame(AForge.Imaging.UnmanagedImage) - passes the next frame received from the capture device to the object. The frame is an AForge.Imaging.UnmanagedImage, a class that gives far faster access to pixels than Bitmap.GetPixel(x, y) (see the sketch after this list).
    • Reset() - resets the internal state of the class.
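    To give a feel for why UnmanagedImage is preferred over Bitmap.GetPixel, here is a hedged sketch of reading a pixel straight from the unmanaged buffer. It assumes a 24bpp RGB frame (3 bytes per pixel, stored B-G-R), requires compiling with unsafe code enabled, and is not taken from the project itself:

    using System.Drawing;
    using AForge.Imaging;

    static class FastPixels
    {
        // Read one pixel directly from the unmanaged buffer instead of calling Bitmap.GetPixel.
        public static unsafe Color GetPixelFast(UnmanagedImage frame, int x, int y)
        {
            // Each row starts Stride bytes after the previous one; 24bpp RGB uses 3 bytes per pixel.
            byte* p = (byte*)frame.ImageData.ToPointer() + y * frame.Stride + x * 3;
            return Color.FromArgb(p[2], p[1], p[0]); // bytes are stored as B, G, R
        }
    }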

    Properties:
    • AForge.Vision.Motion.IMotionDetector.MotionFrame - a property of type AForge.Imaging.UnmanagedImage that is responsible for highlighting the region containing the object.
    • AForge.Vision.Motion.IMotionDetector.MotionLevel - the "level of motion" (from 0 to 1); a rather ephemeral value, not implemented here.

    To update the information in the application's status bar, getter properties were added:
    • Point center
    • Rectangle BoundsBox
    • Color ObjectColor

    To match not only the exact target color but also nearby shades, the DifferenceThreshold setter property is used.
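    The threshold check itself can be as simple as comparing each channel of a pixel against the target color. A minimal sketch (the method name IsSimilar is hypothetical, not part of the project):

    using System;
    using System.Drawing;

    static class ColorMatch
    {
        // A pixel counts as "object-colored" if every channel lies within
        // DifferenceThreshold of the tracked color.
        public static bool IsSimilar(Color pixel, Color target, int differenceThreshold)
        {
            return Math.Abs(pixel.R - target.R) <= differenceThreshold &&
                   Math.Abs(pixel.G - target.G) <= differenceThreshold &&
                   Math.Abs(pixel.B - target.B) <= differenceThreshold;
        }
    }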
    The main processing of a frame happens in the ProcessFrame function. The algorithm can be broken into the following steps (a simplified sketch follows the list):
    1. Expand the region where the object is expected. The new position is searched for not across the whole frame, but only in the area adjacent to the previous position. This makes the search more robust in the sense that the target will not be confused with another object of the same color in a different part of the image.
    2. Compute the object's boundaries within that area by finding the extreme points of the object's dominant color (the allowed color deviation, DifferenceThreshold, is taken into account here).
    3. Build the MotionFrame "mask", which lets the MotionDetector highlight the target object in the image.
    4. Compute the "average color" and the size of the new object.
    5. If the object is too small (for example, in this frame the target was completely covered by another object), keep the position and color inherited from the previous frame unchanged.
    6. Otherwise, remember the new position and boundaries of the object, and, if the algorithm is tracking color changes (controlled by the bool DynamicColorTracking property), remember the newly computed color as well.
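    To make those steps concrete, below is a condensed sketch of one tracking pass over the expanded search region: it finds the bounding box and the average color of pixels matching the target. It assumes a 24bpp RGB UnmanagedImage and a search area already clipped to the frame; the names TrackingStep, TryLocate and minPixels are hypothetical, and this is not the project's actual code.

    using System;
    using System.Drawing;
    using AForge.Imaging;

    static class TrackingStep
    {
        // One simplified tracking pass: scan the search region, find the bounding box
        // of pixels matching the target color, and compute their average color.
        public static unsafe bool TryLocate(
            UnmanagedImage frame, Rectangle searchArea, Color target, int threshold,
            int minPixels, out Rectangle bounds, out Color averageColor)
        {
            int minX = int.MaxValue, minY = int.MaxValue, maxX = -1, maxY = -1;
            long sumR = 0, sumG = 0, sumB = 0;
            int matched = 0;

            byte* basePtr = (byte*)frame.ImageData.ToPointer();

            for (int y = searchArea.Top; y < searchArea.Bottom; y++)
            {
                byte* row = basePtr + y * frame.Stride + searchArea.Left * 3;
                for (int x = searchArea.Left; x < searchArea.Right; x++, row += 3)
                {
                    // bytes are stored as B, G, R for a 24bpp RGB image
                    if (Math.Abs(row[2] - target.R) <= threshold &&
                        Math.Abs(row[1] - target.G) <= threshold &&
                        Math.Abs(row[0] - target.B) <= threshold)
                    {
                        if (x < minX) minX = x;
                        if (x > maxX) maxX = x;
                        if (y < minY) minY = y;
                        if (y > maxY) maxY = y;
                        sumR += row[2]; sumG += row[1]; sumB += row[0];
                        matched++;
                    }
                }
            }

            // Too few matching pixels: the object is probably occluded, so keep the old state.
            if (matched < minPixels)
            {
                bounds = Rectangle.Empty;
                averageColor = target;
                return false;
            }

            bounds = Rectangle.FromLTRB(minX, minY, maxX + 1, maxY + 1);
            averageColor = Color.FromArgb(
                (int)(sumR / matched), (int)(sumG / matched), (int)(sumB / matched));
            return true;
        }
    }

    The caller would update the stored position and bounds only when TryLocate succeeds, and would adopt the new averageColor only if DynamicColorTracking is enabled.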

    This completes the image processing.

    Possible improvements


    Since we already brought up game-console controllers: their color usually contrasts sharply with any other object in the frame. Therefore, an initial search for the target color could be performed over the entire frame (and not only in the adjacent area). This would make it possible to keep tracking the object during faster movement.
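    A hedged sketch of that idea, reusing the hypothetical TryLocate helper from the previous section: if the object is not found near its last known position, the search is simply retried over the whole frame.

    using System.Drawing;
    using AForge.Imaging;

    static class FallbackSearch
    {
        // If the object is not found near its previous position, retry over the whole frame.
        public static bool Locate(UnmanagedImage frame, Rectangle localArea, Color target,
                                  int threshold, int minPixels,
                                  out Rectangle bounds, out Color averageColor)
        {
            if (TrackingStep.TryLocate(frame, localArea, target, threshold, minPixels,
                                       out bounds, out averageColor))
                return true;

            Rectangle fullFrame = new Rectangle(0, 0, frame.Width, frame.Height);
            return TrackingStep.TryLocate(frame, fullFrame, target, threshold, minPixels,
                                          out bounds, out averageColor);
        }
    }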

    Related Links


    AForge project site.
    AForge - an article on CodeProject.

    PS Link to the source on depositfiles .
    Link to the source in Google Docs

    UPD. Thanks for the karma, transferred to the thematic blog.
