Kolmogorov complexity and our search for meaning
What can mathematics tell us about finding order in the chaos of life?
Was meeting the person dearest to you an accident, or was there some hidden reason behind it? And what about yesterday's strange dream: was it merely the random firing of synapses in your brain, or did it reveal something deep about your subconscious? Perhaps the dream was trying to tell you something about your future. Perhaps not. Does the fact that a close relative is ill with a dangerous form of cancer carry some deep meaning, or is it just the consequence of random DNA mutations?
In our lives, we often reflect on the patterns in the events around us. We ask ourselves whether our lives are random or whether they carry some meaning, uniquely true and deep. As a mathematician, I often turn to numbers and theorems for insight into such questions. And, as it happens, I learned something about the search for meaning in life from one of the most profound theorems of mathematical logic. Simply put, this theorem demonstrates that it is impossible, even in principle, to know whether an explanation of a pattern is the deepest or most interesting explanation there is. Just as in life, the search for meaning in mathematics has no limit.
A small warm-up first. Consider the following three strings of characters.
1. 100100100100100100100100100100100100100100100100100100100100100100100100100100100100100100
2. 2, 3, 5, 7, 11, 13, 17, 19, 23, 29, 31, 37, 41, 43, 47, 53, 59, 61, 67, 71, 73, 79, 83, 89, 97
3. 38386274868783254735796801834682918987459817087106701409581980418
How can we describe them? We could, of course, simply write them out, just as we did above. But it is immediately clear that the first two strings can be described more briefly. The first is simply the pattern "100" repeated over and over. The second is a list of the first few prime numbers. What about the third? We can describe it by listing out the entire string. But is there a better, shorter description?
In the early 1960s, the American teenager Gregory Chaitin, the world-famous Soviet mathematician Andrei Nikolaevich Kolmogorov, and the computer-science pioneer Ray Solomonoff independently formulated a way to measure the complexity of strings of characters. Their ideas came to be known as Kolmogorov complexity theory, or algorithmic information theory. They postulated that the complexity of a string is determined by the length of the shortest computer program that can produce it. That is, take a string and look for the shortest program that outputs it; a program is one kind of description of a string. If the shortest such program is very short, the string contains a simple pattern and is not very complex; we say it has little algorithmic content. Conversely, if a long program is required to produce the string, the string is complex and its algorithmic content is greater. For any string, the length of the shortest program that produces it is called the Kolmogorov complexity of the string.
Let's go back to our three strings. The first two can be described by relatively short computer programs:
1. Print “100” 30 times.
2. Print the first 25 prime numbers.
The Kolmogorov complexity of the first string is less than that of the second, since the first program is shorter. What about the third? This string has no obvious pattern. Still, we can write a trivial program that outputs it:

3. Print "38386274868783254735796801834682918987459817087106701409581980418"

Such a program does the job, but unsatisfyingly. Perhaps there is a shorter program, one that would reveal a pattern in the string. When the shortest program that outputs a string is simply "print the string itself," we say the string is very complex and contains no discernible pattern. A string with no pattern is called random. But even if we see no pattern, one may still exist. In mathematics, as in life, we encounter many patterns that merely look random.
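There is no way to compute the true Kolmogorov complexity of these strings, but we can get a computable upper bound on their description length with an off-the-shelf compressor. Here is a minimal sketch in Python, using zlib purely as a stand-in: compressed size is only a crude proxy for Kolmogorov complexity, not the real thing.

```python
import zlib

# The three example strings from the text.
strings = {
    "repeating": "100" * 30,  # thirty copies of "100"
    "primes": "2 3 5 7 11 13 17 19 23 29 31 37 41 43 47 53 59 61 67 71 73 79 83 89 97",
    "random-looking": "38386274868783254735796801834682918987459817087106701409581980418",
}

for name, s in strings.items():
    compressed = zlib.compress(s.encode())
    # The compressed size is an upper bound on the length of a description
    # of the string, up to the fixed overhead of the zlib format.
    print(f"{name:15s} length={len(s):3d} compressed={len(compressed):3d}")
```

The repetitive string shrinks dramatically, while the random-looking digits resist compression (the format's overhead can even make the output longer than the input). This mirrors, but does not compute, the difference in their Kolmogorov complexity.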
We might try to harness the amazing capabilities of modern computers to find the pattern and the shortest program. Wouldn't it be great if there were a computer that could simply calculate the Kolmogorov complexity of any string? It would accept a string as input and output the length of the shortest program capable of producing that string. Surely, with all these newfangled things like AI, deep learning, big data, quantum computing, and so on, it should be easy to build such a computer.
Alas, no such computer can ever be built! However powerful modern computers may be, this task is impossible. That is the content of one of the deepest theorems of mathematical logic: the Kolmogorov complexity of a string cannot be computed. There is no mechanical procedure that determines the size of the smallest program producing a given string. The point is not that our current technology falls short, or that we are not clever enough to write the algorithm. It follows from the very notions of description and computation that no computer can, in principle, perform this task for every string. A computer may be able to find some patterns in a string, but it cannot find the best pattern. We may find a short program that outputs a given sequence, but a still shorter one could always exist, and we will never know.
The actual proof that Kolmogorov complexity is uncomputable is rather formal. But it is a proof by contradiction, and we can get a rough sense of how it works by examining a couple of small, sweet paradoxes.
The interesting-numbers paradox turns on the claim that every natural number is interesting. 1 is the first number, and that is interesting. 2 is the first even number. 3 is the first odd prime. 4 is interesting because 4 = 2 × 2 and 4 = 2 + 2. One can continue in this way, finding interesting properties of many numbers. At some point we may reach a number with no interesting properties. We could call it the first uninteresting number, but that is itself an interesting property. In the end, even the uninteresting numbers turn out to be interesting!
The ideas in the Kolmogorov proof are closer to those of the Berry paradox, which concerns descriptions of large numbers. Notice that the more words we use, the larger the numbers we can describe. For example, in three words we can describe "a trillion trillion," and in five words "a trillion trillion trillion trillion," a far larger number. Now consider the number described by the following phrase:
The smallest number that cannot be described in fewer than fifteen words.
By definition, this number requires fifteen, sixteen, or even more words to describe; it cannot be described in twelve, thirteen, or fourteen words. But here is the problem: the phrase above describes this number in just twelve words. Our description of the number contradicts what the number is supposed to be, and there you have a paradox.
In both the interesting-numbers paradox and the Berry paradox, we reach a contradiction by assuming there is an exact way to describe something. In the same way, the proof that Kolmogorov complexity is not computable works by showing that if it were computable, we would arrive at a contradiction.
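The shape of that contradiction can be sketched in code. Suppose we had a function K(s) that returned the true Kolmogorov complexity of s. Then the short program below would find the first string whose complexity exceeds any bound n; yet the program itself is only a few hundred characters long, so for large n it would constitute a description of that string far shorter than n, a Berry-style contradiction. In this runnable sketch, zlib's compressed size stands in for the uncomputable K (an assumption for illustration only; the real K cannot be implemented):

```python
import itertools
import zlib

def K(s):
    """Stand-in for Kolmogorov complexity: compressed size in bytes.
    The true K is uncomputable; this is merely a computable upper bound."""
    return len(zlib.compress(s.encode()))

def first_complex_string(n, max_len=18):
    """Return the first binary string, in length-then-lexicographic order,
    whose 'complexity' is at least n. If the real K were computable, this
    short program would itself describe that string in far fewer than n
    symbols, which is the contradiction at the heart of the proof."""
    for length in range(1, max_len + 1):
        for bits in itertools.product("01", repeat=length):
            s = "".join(bits)
            if K(s) >= n:
                return s
    return None  # not found within max_len (cannot happen for the real K)

s = first_complex_string(18)
print(s, K(s) if s is not None else None)
```

With the compression stand-in the search terminates quickly; with the genuine K it would be a logical impossibility, which is exactly the point.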
The uncomputability of Kolmogorov complexity is a result of pure mathematics, and we should not confuse that ideal world with our far more complex and messy reality. Still, there are some broad lessons of Kolmogorov complexity theory that we can carry into the real world.
Many times we have encountered things that seemed utterly chaotic. Randomness unnerves us, and we look for patterns that partially dispel the chaos. When we find a pattern, it remains unclear whether it is the best pattern for explaining our observations. We may wonder whether a deeper pattern exists that gives a better explanation. Kolmogorov complexity theory teaches us that, at the most basic level, there is no guaranteed way to identify the best pattern. We simply never know whether the pattern we have found is the best one.
But this is exactly what makes the search endlessly interesting. By definition, something is interesting if it demands further thought. A fact that is obvious and fully understood requires no further reflection. That six times seven is forty-two is completely understood and uninteresting. It is only when we are unsure of our ideas that we need to confirm them and think them through. The search for better patterns will always be interesting.
The real world adds a further complication. Whereas the world of strings and computer programs leaves no room for error, in the real world mistakes are easy to make. We can readily check whether a particular program prints a given string or not: even if we cannot determine the optimal program for producing a string, we can at least determine whether a given program produces it. The real world, by contrast, is much messier. We may believe we see a pattern when, in fact, there is none.
Our picture of the search for meaning now begins to take shape. We loathe randomness and love patterns. We are biologically programmed to find patterns that explain what we see. But we can never be sure that the pattern we have found is the right one. Even if we could somehow guarantee we made no mistake and achieve machine-like perfection, a deeper truth could always lie beneath. This tension feeds our love of literature, theater, and film. When we read a novel or watch a play, the author or director presents us with a sequence of events unified by a theme, a pattern, or a moral. Literature, plays, and movies offer a wonderful escape from the usually incomprehensible, meaningless chaos we see in the world around us. Truly good literature goes further still and leaves room for many interpretations. Here we run up against the uncomputability of Kolmogorov complexity itself.
This tension also shapes how we live our lives. Traveling through seemingly random events, we look for patterns and structure. Life is full of ups and downs: the joy of falling in love, the fun of playing with children, the satisfaction of great accomplishment at the end of hard work; the pain of a collapsing relationship, the agony of failure after striving to complete a task, the tragedy of a loved one's death. We try to find meaning in all of it. We loathe the feeling of total randomness and the idea that we are merely following the chaotic, uncaring laws of physics. We want to know whether there is meaning, purpose, and significance in the world around us. We want our lives to be a magical story, and so we tell ourselves stories.
Sometimes those stories are simply false. Sometimes we deceive ourselves and others. And sometimes the patterns we identify are correct. But even when a story is true, it is not necessarily the best one. We can never be sure that there is no deeper, more fundamental, more accurate story beneath it. As we grow older and gain experience, we arrive at ideas about the Universe that were inaccessible to us before. We find better patterns. Perhaps we begin to see things more clearly. Or perhaps not. We will never know. But we do know that the search is guaranteed never to end.
Noson Yanofsky holds a Ph.D. in mathematics from the Graduate Center of the City University of New York and is a professor of computer science at Brooklyn College of the same university.