What happens if you mix nuts, Arduino, OpenCV and Delphi. Part 2

    In the first part, I tried to sort nuts without OpenCV, and that approach failed.
    I have been programming in Delphi since my university days, starting with version 2, and although I am fairly familiar with other languages, I looked for OpenCV headers specifically for Delphi. And found them.
    Having compiled the EdgeDetect example and seen the results, I realized that OpenCV is a really powerful, simple and fast tool. My thanks to the good people behind the Pascal header files for the C interface of this wonderful library: they gave me the opportunity to write in my usual RAD environment. Having settled on the language, I started developing the software from scratch. This article describes my victories and misadventures; please don't judge too harshly, as this is only my second article on Habr.

    The first pitfall was a rather noticeable memory leak: it turned out that after each cvFindContours you need to call cvClearMemStorage.
    I soon realized that at the 30 FPS my Logitech C270 delivered, I could not detect nuts in free fall, so I started looking for high-speed cameras. For the experiments, a PS3 Eye camera was acquired, which produced a sky-high 187 FPS at 320x240. Then another "feature" was found: a rendering limit of 65 FPS under Win7. As it turned out, cvWaitKey was the bottleneck, and the solution was found right there, namely: call cvWaitKey not on every processed frame, but less frequently.
     if gettickcount - rendertickcount >= 33 then begin   // 1000 / 33 = ~30 FPS
       rendertickcount := gettickcount;
       cc := cvWaitKey(1);
     end;

    Now I will describe the algorithm itself.
    For each sample in the database, an "album" of rotated copies is generated in increments of 10 degrees. This makes it possible to store far fewer samples in the reference database and not waste resources on rotation "on the fly". Primitive perspective correction I do implement on the fly, using cvResize.
    procedure createAlbum(nsIndex: integer);
    var i: integer;
        rot_mat: pCvMat;
        scale: Double;
        center: TcvPoint2D32f;
        width, height: integer;
    begin
      with nsamples[nsIndex] do begin
        width := nutimgs[0].width;
        height := nutimgs[0].height;
        center.x := width div 2;
        center.y := height div 2;
        scale := 1;
        rot_mat := cvCreateMat(2, 3, CV_32FC1);  // one matrix reused for all angles
        for i := 1 to 35 do begin
          nutimgs[i].width := width;
          nutimgs[i].height := height;
          cv2DRotationMatrix(center, i * 10, scale, rot_mat);
          cvWarpAffine(nutimgs[0], nutimgs[i], rot_mat, CV_INTER_LINEAR or CV_WARP_FILL_OUTLIERS, cvScalarAll(0));
        end;
        cvReleaseMat(rot_mat);  // release the matrix to avoid yet another leak
      end;
    end;

    As the nuts slide along the chutes, they quickly smear these very chutes with the fat they are so rich in. This makes accurate detection of the nut contours harder. I tried both plain cvThreshold and cvThreshold with cvCanny on top; both worked poorly on the dirty background. The shadow that a nut casts while flying a short distance above the background also interfered. To solve this problem, I came up with my own filter. Its essence is that it replaces the most "non-color" (low-saturation) pixels with white ones.
    procedure removeBack(var img: PIplImage; k: integer);
    var x: integer;
        sat: byte;
        framesize: integer;
    begin
      cvCvtColor(img, hsv, CV_BGR2HSV);      // hsv is a pre-allocated buffer
      framesize := img.width * img.height * 3;
      x := 1;                                // index of the S channel of the first pixel
      while x <= framesize do begin
        sat := hsv.imageData[x];
        if sat < k then begin                // low saturation = "non-color" pixel
          hsv.imageData[x - 1] := 255;       // H
          hsv.imageData[x + 1] := 255;       // V = 255
          hsv.imageData[x] := 0;             // S = 0 -> pure white
        end;
        inc(x, 3);
      end;
      cvCvtColor(hsv, img, CV_HSV2BGR);
    end;
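The idea of the filter can be illustrated without the HSV round-trip at all: a pixel is "non-color" when its channel spread (max minus min) is small, which is roughly what saturation measures. A numpy sketch of my own simplification, with invented names:

```python
import numpy as np

def remove_back(img_bgr, k):
    """Replace near-gray (low-saturation) pixels with pure white.

    A rough stand-in for the Delphi HSV filter above: the channel
    spread max(B,G,R) - min(B,G,R) serves as a saturation proxy,
    so gray background and shadow pixels (spread ~0) turn white,
    while saturated nut pixels are kept.
    """
    img = img_bgr.astype(np.int16)               # avoid uint8 wrap-around
    spread = img.max(axis=2) - img.min(axis=2)
    out = img_bgr.copy()
    out[spread < k] = 255                        # paint "non-color" pixels white
    return out
```

Note the proxy is not identical to HSV saturation (which normalizes by brightness), but for picking a threshold k against a dirty grayish background the behavior is similar.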

    The nuts slide on a white background, so each one yields a contour. From the contour a mask is made, which allows each nut to be copied with transparency into an array of PIplImage. Contours that are too small or too large are skipped.
            frame := cvQueryFrame(capture);
            cvCopy(frame, oframe);
            cvCvtColor(frame, gframe, CV_BGR2GRAY);
            cvThreshold(gframe, gframe, LowThreshVal, HighThreshVal, CV_THRESH_BINARY_INV);
            cvFindContours(gframe, storage, @contours, SizeOf(TCvContour), CV_RETR_EXTERNAL, CV_CHAIN_APPROX_NONE, cvPoint(0, 0));
            b := contours;
            NutIndex := 0;
            while b <> nil do begin
              asize := cvContourArea(b, CV_WHOLE_SEQ);
              if (asize > tbminObjSize) and (asize < tbmaxObjSize) then begin
                _rect := cvBoundingRect(b);
                cvDrawContours(mask, b, CV_RGB(255, 0, 255), CV_RGB(255, 255, 0), -1, CV_FILLED, 1, cvPoint(0, 0));
                snuts[NutIndex].snut.width := _rect.width;
                snuts[NutIndex].snut.height := _rect.height;
                cvSetImageROI(oframe, _rect);
                cvSetImageROI(mask, _rect);
                cvCopy(oframe, snuts[NutIndex].snut, mask);
                cvResetImageROI(oframe);
                cvResetImageROI(mask);
                snuts[NutIndex].rect := _rect;
                inc(NutIndex);
              end;
              b := b.h_next;
            end;
            cvClearMemStorage(storage);  // avoid the leak mentioned above

    The frame is divided into regions ("lines"); in reality these are the separate chutes along which the nuts slide. At the end of each line there is an actuator: a nozzle that controls a jet of compressed air.
    In the application, each line is served by a separate thread. Inside the thread we find the nut closest to the nozzle and determine its "similarity" to the base of reference samples. Below is the piece of code that computes the "affinity" via cvAbsDiff:
                cvAbsDiff(tnut, nsamples[tp1].nutimgs[angle], matchres);
                cvCvtColor(matchres, gmatchres, CV_BGR2GRAY);
                cvThreshold(gmatchres, gmatchres, tbminTreshM, 255, 0);
                wcount := cvCountNonZero(gmatchres);
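The same score in a numpy sketch (names are mine; the Delphi code diffs the color images and converts to gray before thresholding, while this sketch assumes grayscale inputs for brevity):

```python
import numpy as np

def diff_score(nut_gray, sample_gray, pixel_thresh):
    """Count pixels whose absolute difference exceeds pixel_thresh.

    The equivalent of cvAbsDiff + cvThreshold + cvCountNonZero:
    a low score means the nut closely matches the reference sample,
    a high score means they differ.
    """
    diff = np.abs(nut_gray.astype(np.int16) - sample_gray.astype(np.int16))
    return int(np.count_nonzero(diff > pixel_thresh))
```

In practice this score would be computed against every rotated copy in the album and the minimum taken as the final "affinity".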

    The value of the variable wcount is the coefficient of similarity between the nut and the reference, measured in arbitrary units ("parrots"). If this value exceeds the threshold, we send the line number to the Arduino over the COM port. The controller opens that line's nozzle for a predetermined time, which blows the nut off; in the normal state the nozzles are closed. The following sketch was written for asynchronous operation of the actuators.
    int timeout = 75;                  // nozzle-open time, ms
    int comm;
    unsigned long timeStamps[8];
    int ePins[] = {2, 3, 4, 5, 6, 7, 8, 9};

    void setup() {
      for (int i = 0; i <= 7; i++) {
        pinMode(ePins[i], OUTPUT);
      }
      Serial.begin(9600);              // baud rate must match the PC side
      while (!Serial) {
        ; // wait for serial port to connect. Needed for Leonardo only
      }
    }

    void loop() {
      if (Serial.available() > 0) {
        comm = Serial.read();
        if (comm >= 0 && comm <= 7) {
          digitalWrite(ePins[comm], HIGH);   // open the nozzle for this line
          timeStamps[comm] = millis();
        }
        if (comm == 66) {
          Serial.write(103); // for device autodetection, 103 means version 1.03
        }
      }
      for (int i = 0; i <= 7; i++) {
        if (millis() - timeStamps[i] >= timeout) {
          digitalWrite(ePins[i], LOW);       // close nozzles whose time is up
        }
      }
    }

    The nozzles are electromagnetic solenoids. This load is switched according to the following schematic; each nozzle needs its own transistor switch.

    And this is how the assembled device looks:

    At the customer's request, I cannot publish images of the finished device. I hope the following video gives an idea of the final result.

    I have tried to describe the most difficult and interesting moments I encountered while working on this interesting project. Feel free to ask questions if anything, in your opinion, is described in insufficient detail.
    Thanks for your attention.

    UPD: Added a slow-motion video, 75 FPS -> 1 FPS
