Bremermann's limit and computable problems
Jokes aside, what if we actually estimate the capacity of such a "hard drive"?
In 1962, Hans Bremermann published an article entitled "Optimization through evolution and recombination", whose main conclusion is as follows:
No data processing system, whether artificial or natural, can process more than 2 × 10^47 bits per second per gram of its mass.
The mass of the Earth is estimated at about 6 × 10^27 g, its age at about 10^10 years, and a year contains roughly 3.14 × 10^7 seconds. An imaginary computer with the mass of the Earth, working at this limit over its entire lifetime, could therefore process about 3.8 × 10^92 bits, i.e. on the order of 10^93 bits.
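The estimate above can be checked with a quick back-of-the-envelope computation (a sketch that uses only the figures quoted in the text):

```python
# Order-of-magnitude check of the "Earth-sized computer" estimate.
# All input values are taken from the article; nothing else is assumed.

bremermann_limit = 2e47      # bits per second per gram
earth_mass_g = 6e27          # mass of the Earth, grams
earth_age_years = 1e10       # age of the Earth, years
seconds_per_year = 3.14e7    # seconds in a year (approx.)

total_bits = (bremermann_limit * earth_mass_g
              * earth_age_years * seconds_per_year)
print(f"{total_bits:.2e}")   # prints 3.77e+92, i.e. on the order of 10^93
```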
Problems that require processing more than 10^93 bits of information are called transcomputational. Unfortunately, combinatorics tells us that this limit is reached even for problems of relatively modest size. Take pattern recognition: given a q × q array whose every element can be colored with one of k colors, there are k^(q·q) possible colorings in total. For an 18 × 18 array the problem is transcomputational with just two colors; for a 10 × 10 array it becomes transcomputational with 9 colors. Clearly, it is impossible to enumerate all possible images on the retina, which contains about a million photosensitive cones. Another example is exhaustive testing of a microcircuit with 308 inputs and a single output: it requires trying 2^308 ≈ 5 × 10^92 input combinations, on the order of the transcomputational limit.
The most natural ways to attack transcomputational problems are to reformulate them, to relax their conditions, or to accept probabilistic (approximate) solutions. One would like to believe that artificial-intelligence algorithms will nevertheless appear, in particular for pattern recognition, that can handle transcomputational problems automatically, much as the human brain does.
Proof of Bremermann's limit:
Information must be encoded in order to be processed. Suppose it is encoded as energy levels within an interval [0, E], and let the energy be measured to an accuracy of dE. The whole interval then contains N = E / dE subintervals, each of which is either occupied or not (1 or 0).
A single marker distinguishing these states represents at most log2(N + 1) bits. More can be represented by dividing the energy among several markers; in the limiting case of N markers of energy dE each, up to N bits are representable, so N bounds the amount of information.
To represent more information, dE must be reduced. By the Heisenberg uncertainty principle, energy can be measured to an accuracy dE only if dE · dt >= h, where dt is the duration of the measurement and h = 6.625 × 10^-27 erg·s is Planck's constant.
We obtain N <= E · dt / h.
Expressing the available energy E through the equivalent mass via Einstein's formula E = mc^2, we get
N = m · c^2 · dt / h
Substituting c = 3 × 10^10 cm/s and h, we have N ≈ 1.36 × 10^47 · m · dt.
For a mass of 1 g (m = 1) and a time of 1 s (dt = 1) this gives N ≈ 1.36 × 10^47 bits, which agrees to within the order of magnitude with the stated limit of about 2 × 10^47 bits per second per gram.
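The final substitution is easy to reproduce numerically in CGS units (a sketch using only the constants quoted above):

```python
# Reproducing the substitution N = m * c^2 * dt / h in CGS units.
c = 3e10          # speed of light, cm/s
h = 6.625e-27     # Planck constant, erg*s

coefficient = c**2 / h      # bits per gram per second
m, dt = 1.0, 1.0            # 1 gram, 1 second
N = coefficient * m * dt
print(f"{N:.2e}")           # prints 1.36e+47
```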
P.S. Based on the book by George J. Klir, "Systemology: Automation of the Solution of System Problems".