
Flies, math ... Robots?
- Translation

By formalizing the activity of fly brain cells involved in visual processing, scientists have found a new way to extract motion from raw visual data.
Although they built the system themselves, the researchers do not quite understand how it works. Yet despite the mysteriousness of the resulting formulas, they could be used to program vision systems for miniature combat drones, search-and-rescue robots, car navigation systems and other applications where computing power is at a premium.
“We can create a system that works very well, inspired by nature, without fully understanding how the elements of that system interact. It's a non-linear system,” said David O'Carroll, a computational neuroscientist who studies insect vision at the University of Adelaide in Australia. “The amount of computation is quite small. We can get the result with tens of thousands of times fewer floating-point operations than traditional methods require.”
The best known of these is the Lucas-Kanade method, which computes shifts - up and down, side to side - by comparing, frame by frame, how each pixel in the field of view changes. It is used for guidance in many experimental unmanned vehicles, but such brute force demands serious processing power, which makes the method impractical in small systems.
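The core of the Lucas-Kanade idea fits in a few lines: gather brightness gradients in a small window and solve a least-squares system for the shift. The sketch below is a minimal, single-point, single-scale illustration (real implementations are pyramidal and far more robust); the function name and window size are illustrative choices, not from the article.

```python
import numpy as np

def lucas_kanade_point(frame1, frame2, y, x, win=7):
    """Estimate the optical flow (vx, vy) at pixel (y, x) between two
    grayscale frames with the basic Lucas-Kanade least-squares method."""
    # Spatial brightness gradients of the first frame, and the
    # temporal gradient between the two frames.
    Ix = np.gradient(frame1, axis=1)
    Iy = np.gradient(frame1, axis=0)
    It = frame2 - frame1
    # Collect gradients over a small window centered on the point.
    h = win // 2
    ix = Ix[y - h:y + h + 1, x - h:x + h + 1].ravel()
    iy = Iy[y - h:y + h + 1, x - h:x + h + 1].ravel()
    it = It[y - h:y + h + 1, x - h:x + h + 1].ravel()
    # Solve A @ [vx, vy] = -it in the least-squares sense:
    # every pixel in the window contributes one constraint.
    A = np.stack([ix, iy], axis=1)
    v, *_ = np.linalg.lstsq(A, -it, rcond=None)
    return v  # [vx, vy]
```

Even this toy version makes the article's point about cost: every tracked point requires building and solving a linear system per frame, which is why the method strains small, power-limited hardware.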
To make small flying robots feasible, researchers wanted a simpler way to compute motion. They turned to flies, which use a relatively small number of neurons to maneuver with extraordinary dexterity. For more than a decade, O'Carroll and other researchers have carefully studied the optics and flight behavior of flies, measuring the activity of their cells and turning the products of evolution into a set of computational principles.
In a paper published Friday in PLoS Computational Biology, O'Carroll and his colleague Russell Brinkworth describe the results of putting this method into practice.
“Laptops use at least tens of watts of power. What we have developed can run on chips that consume only tenths of a milliwatt,” said O'Carroll.

The researchers' algorithm consists of a series of five formulas through which camera data is passed. Each formula captures a technique flies use to process changes in brightness, contrast and motion, and its parameters adapt continuously to the incoming signal. Unlike the Lucas-Kanade method, the algorithm does not perform a frame-by-frame comparison of every pixel; instead it emphasizes large-scale changes between frames. In this sense it resembles video-compression schemes that ignore uniform areas.
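The paper's five formulas are not reproduced in this article, but the flavor of insect-style motion detection can be illustrated with the classic Hassenstein-Reichardt correlator, a long-established model of the fly's elementary motion detector: delay one photoreceptor's signal, multiply it by its neighbor's, and subtract the mirror-image term. This sketch is illustrative only - it is a textbook model, not the authors' algorithm - and the time constant is an arbitrary choice.

```python
import numpy as np

def reichardt_correlator(left, right, tau=5.0):
    """Hassenstein-Reichardt elementary motion detector applied to two
    neighboring photoreceptor time series (1-D arrays).
    A positive mean output indicates motion from `left` toward `right`."""
    def lowpass(x, tau):
        # First-order low-pass filter acting as the delay line.
        a = 1.0 / tau
        y = np.empty_like(x)
        acc = x[0]
        for i, v in enumerate(x):
            acc += a * (v - acc)
            y[i] = acc
        return y
    dl, dr = lowpass(left, tau), lowpass(right, tau)
    # Delay-and-correlate, with opponent subtraction of the two
    # mirror-image half-detectors for direction selectivity.
    return dl * right - dr * left
```

Note how cheap this is compared to the least-squares machinery above: per sample it needs only a couple of multiplications and additions, which is the kind of economy the article attributes to the fly-derived approach.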
To test the algorithm, O'Carroll and Brinkworth ran several high-resolution animated sequences through software of the sort that could run on a robot. When they compared the output with the input, they found that the algorithm worked across a variety of natural lighting conditions that usually confound conventional motion detectors.
“It's amazing work,” said Sean Humbert, an aerospace engineer at the University of Maryland who builds miniature autonomous flying robots, some of which run on early versions of O'Carroll's algorithm. “To carry traditional navigation systems, a vehicle has to handle a fairly large payload. But the payload of small flying robots is tiny - a couple of Tic Tacs. You'll never fit a few dual-core processors into a couple of Tic Tacs. The algorithms insects use are very simple compared to the things we design, and they are even better suited to small vehicles.”
Interestingly, the algorithm stops working as well if any single operation is skipped. The whole is greater than the sum of its parts, and O'Carroll and Brinkworth do not know why. Because the parameters are locked in constant feedback, the result is a cascade of nonlinear equations that is difficult to analyze in retrospect and almost impossible to predict.
“We were inspired by insect vision and built a model that is suitable for real use, but in doing so we built a system almost as complex as the insect itself,” said O'Carroll. “That is one of the most fascinating things here. It won't necessarily lead us to a full understanding of how this system works, but it may bring us closer to understanding how nature got it right.”
The researchers derived their algorithm from neural circuits involved in processing sideways motion, but O'Carroll believes similar formulas can likely be used to compute other optic flows, such as those generated by moving forward and backward through three-dimensional space.