Modern analog computers: is there a future?

    Most of us associate the development of information technology with the digital revolution. The advent of microprocessors, of course, took electronics to a whole new level. By now the race to own the most powerful supercomputer has lost its scientific charm: teraflops depend directly on the amount of money and free floor space. Buy more servers and your computing capacity grows.

    Ever since my university days, I have been haunted by an idea that I would like to put up for discussion before the Habr community.

    Before the digital age, analog computing was a field of active development.

    An analog computer is a device that performs computational tasks by operating not on discrete but on continuous data. A bit is a discrete value: one or zero. Current, voltage, pressure, temperature, brightness, power (the list goes on) are continuous quantities; their exact value cannot be measured in principle, since everything is limited by the accuracy of the measuring instrument.

    If the ideal domain for digital technology is processing digital data, then by the same logic the ideal domain for an analog computer should be processing real-world data, such as images or sound. Yet for reasons I do not understand, this area of knowledge has been practically abandoned. Perhaps the answer lies in some insurmountable difficulties, perhaps in something else, but over the past ten years there has been practically no progress in this direction.

    Most often, "analog computer" is taken to mean purely hydraulic or mechanical devices that convert an input signal to an output through a structurally programmed function, like the Kaufmann posograph, which determines the best exposure for a photograph, or the Antikythera mechanism, which predicted the positions of the planets and the Sun.

    A classic example of a modern analog computer is the automatic car transmission. When the torque changes, the fluid pressure in the hydraulic drive changes with it, and the shape of this "function" can be altered by the design.

    But such examples look indecently dated in the 21st century. Science has moved so far ahead that implementing the simplest function this way should rightfully have stayed in the middle of the last century. Yet for some reason nothing has come to replace it.

    I would like to raise the question of automatic electronic devices that process real-world signals without digitizing them. Or else to get a convincing answer as to why, at this stage in the development of civilization, no such examples exist.

    Look: on the one hand, almost all our interfaces to the real world are analog: a microphone, a webcam, a mouse. On the way from a physical phenomenon (the mouse was moved, a sound was made, a light was switched on) to the signals recorded by a computer, the signal passes through an ADC, an analog-to-digital converter, where it is digitized. As a result, we "coarsen" the original signal to an acceptable level. And, say what you will, we are still not very good at seriously processing high-quality video in real time (recognizing objects in it, for example).
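    To make this "coarsening" concrete, here is a minimal Python sketch of an ideal ADC: a continuous sample is rounded to the nearest of 2^bits levels, and the information lost is bounded by half a quantization step. The function name and parameters here are illustrative, not a model of any real converter.

```python
import math

def quantize(x, bits, full_scale=1.0):
    """Simulate an ideal ADC: round a continuous sample to the nearest
    of 2**bits levels spanning [-full_scale, +full_scale]."""
    levels = 2 ** bits
    step = 2 * full_scale / (levels - 1)
    return round(x / step) * step

# A continuous sine sample and its 8-bit "coarsened" version.
t = 0.3
analog = math.sin(2 * math.pi * t)
digital = quantize(analog, bits=8)
error = abs(analog - digital)
# The quantization error never exceeds half a step: step/2 ~ 0.0039 for 8 bits.
```

    However many bits you add, the error only shrinks; it never disappears, which is exactly the difference between a digitized copy and the continuous original.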

    If you think about it, digital signal processing has practically no analogues in nature, unlike almost everything else mankind has invented. Every living organism is arranged differently: it is an exclusively analog computer. Chemical reactions and neurons alike work with continuous physical parameters, not with "numbers". When some pattern coincides with what we receive from the real world, the brain latches onto these "bursts", directs our memories and past experience toward them, and makes our senses listen for or peer at certain key details.

    All this would be impossible if the brain were digital in nature. But how can all this be translated into technology?

    By analogy with bit operations, any physically continuous quantities can be added, subtracted, divided, or multiplied. More interestingly, there are solutions that can integrate and differentiate analog signals. These signals might be laser light in an optical computer, or information about the brightness of individual regions of space. A hypothetical processor could superimpose a two- or three-dimensional template field onto a two- or three-dimensional field projected from the real world, find a surge, a resonance, at the point of overlap, and then analyze the detected configuration more precisely until the desired confidence threshold is reached.
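    The "superposition and resonance" idea is, in essence, cross-correlation of a template with a scene. A digital simulation of what such an analog processor would compute in parallel might look like this (a toy sketch; the function name and the tiny 5x5 "world" are invented for illustration):

```python
def correlate2d(field, template):
    """Slide `template` over `field` and record the correlation score at
    each offset -- the analog 'superposition' would do all offsets at once."""
    fh, fw = len(field), len(field[0])
    th, tw = len(template), len(template[0])
    scores = {}
    for i in range(fh - th + 1):
        for j in range(fw - tw + 1):
            scores[(i, j)] = sum(field[i + di][j + dj] * template[di][dj]
                                 for di in range(th) for dj in range(tw))
    return scores

# A 5x5 "projection of the real world" containing a bright 2x2 patch...
field = [[0, 0, 0, 0, 0],
         [0, 0, 0, 0, 0],
         [0, 0, 1, 1, 0],
         [0, 0, 1, 1, 0],
         [0, 0, 0, 0, 0]]
# ...and a 2x2 template we are looking for.
template = [[1, 1],
            [1, 1]]
scores = correlate2d(field, template)
best = max(scores, key=scores.get)  # the "resonance" peak lands at (2, 2)
```

    A serial machine pays for every offset with a loop iteration; an analog optical correlator would evaluate the whole score map in a single pass of light, which is exactly the appeal of the approach.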

    As a result, a whole class of tasks related to decision making, recognition of patterns and sounds, and any interaction with the outside world should have a very efficient implementation in analog logic, thanks to the parallelization of computation.

    Solving real-world data processing problems digitally is like hammering nails with a microscope. To flip a picture, we would sooner use an ordinary lens than perform the equivalent operation on a digitized copy. How much would headphones cost if their noise-reduction system were built as an ADC-processor-DAC chain?
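    In fairness, the core of analog noise cancellation is trivially simple: invert the picked-up noise and sum it with the incoming sound, two operations an op-amp performs directly, with no ADC/DAC round trip. A digital simulation of that analog principle, as a sketch (the signals and names are invented for illustration):

```python
import math

def cancel(signal, noise):
    """Mix the desired signal with ambient noise, then add a
    phase-inverted copy of the noise; ideally only the signal remains."""
    noisy = [s + n for s, n in zip(signal, noise)]
    anti = [-n for n in noise]              # inverted copy of the noise
    return [x + a for x, a in zip(noisy, anti)]

samples = [math.sin(2 * math.pi * k / 16) for k in range(16)]   # the "music"
hum = [0.5 * math.sin(2 * math.pi * k / 4) for k in range(16)]  # the "noise"
restored = cancel(samples, hum)
# restored matches samples up to floating-point error
```

    In an analog circuit the inversion and summation happen continuously and instantaneously; a digital implementation must first sample, then compute, then reconstruct, and every stage adds cost and latency.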

    I think the next big step in electronics will be quantum and analog systems, systems built on the principles of neural networks rather than on a digital foundation. It would have to be a significantly "advanced" analog technology, specialized for a specific task. We need to move away from analyzing "screenshots" toward a "live image" model, from discreteness to continuity.

    There are very few new developments in this area.

    One of the most interesting technologies, though very poorly covered on the Russian-language internet, is built on the principle of Cellular Neural Networks (CNN). The architecture of such systems resembles a neural network in which each cell is an independent state element, informationally connected to several of its neighbors. Commercial real-time image analysis solutions based on CNNs come from, for example, Anafocus and Eutecus. The latter claims on its website that its systems operate at 10^12 operations per second. Similar performance is shown by the Lenslet EnLight256, an optical processor built on a different principle, VCSEL lasers.

    It is also clear that full-fledged decision-making systems and robotic control systems require more information about the world or the subject under study than an ordinary camera provides. Look at nature: smells, brightness, temperature, and sound all complement each other. Stereo vision and the ability to look at the world from different angles play a significant role in understanding what is happening around you. All this means that the amount of information to be processed by fuzzy logic will be enormous. The current immaturity of speech and image recognition systems stems precisely from the fact that they all receive very limited information, with heavy losses, distortion, and noise, and there is simply nothing to process a large volume of information with.

    I would like to hope that over the next ten to twenty years we will not mindlessly multiply processor counts and frequencies, or keep building systems on the clumsy ADC-processor-DAC chain where only the central element really needs to stay, but will instead find a fundamentally different solution, one better suited to these tasks.

    So is there a future for analog computers?
