Philosophy of Information, Chapter 2. The Existence of Information


    Before reading this text, it is worth reading the beginning of this story; otherwise it will not be clear why it was necessary to build such an elaborate structure instead of proceeding in the usual, simple way.


    Chapter 2. The existence of information


    Signals and Contexts


    We need to learn to get rid of the illusion that information is contained in books, on hard drives, in cables, in radio waves, and in the other objects from which we are used to "extracting" it. If we have finally accepted that reifying the concept of "information" is inadmissible, then we simply have to admit that, for example, when reading a book we acquire information, but the information is not in the object we must use to do so. The object must be present (it is impossible to read a book without having one), but a physical object cannot contain information.

    Let's carefully analyze what happens when we read a book. Certainly a physical process takes place, and some stages of reading are most conveniently described in physical terms. In particular, if we read a paper book with our eyes, the book must exist as a material object, and an acceptable level of illumination must be provided. The optical system of the eye must also exist and be operational. Other ways of reading (Braille, text-to-speech software) do not change the situation much: in those cases, too, there is a material component that must be present.

    What happens in our heads after the content has been delivered in some way can also be described in physical terms, but this approach holds little promise. Something, of course, is happening, and a material component undoubtedly exists, but we currently have no way to translate even such a simple and obvious situation as "being surprised by an unexpected plot twist" into material terms. It cannot be ruled out that we will never have such a method, if only because the mechanism of surprise at an unexpected plot twist can be implemented differently in different heads.

    The specificity of information processes, in contrast to material ones, is that the same information process can be implemented "in matter" in fundamentally different ways while remaining itself. For example, the sum of two numbers can be found using an electronic calculator, a wooden abacus, counting sticks, a pen and paper, or even in one's head. The meaning and result of the operation remain the same. A book can be received on paper by post or electronically by e-mail. The method of implementation, of course, affects many nuances, but the essence and meaning of what is happening remain unchanged. Any attempt to "ground" an information process in its material component ("surprise is nothing but internal secretion of dopamine", "delight is nothing but internal secretion of endorphins") is akin to saying that adding two numbers is nothing more than moving wooden beads along metal rods. Material reality is total, so any information process must have a material aspect, but what is happening cannot and should not be reduced to that aspect; otherwise the addition of numbers would have to become the exclusive prerogative of the wooden abacus. Turning to the informational aspect of what is happening, we need to be able to abstract from the material aspect, while naturally acknowledging that it certainly exists; what it specifically is, however, is not very important to us.

    We will continue considering the process of reading a book, abstracting from the details of its material realization. For the reader to successfully read the text delivered to his receptors, a number of conditions must be met. First, he must know the language in which it is written. Second, he must be able to read. Third, he must understand why this particular occupation is now preferable for him to all others. It is easy to see that all these conditions concern information the reader already has, because "knowledge", "skill" and "understanding" are all synonyms of the concept "information". Thus, for reading a book we have two sets of conditions for the process to succeed: the delivery of the text in some form and the prior readiness of the reader. The condition of text delivery we will denote as the requirement of a signal. The condition of the reader's readiness we will denote as the requirement of a context.

    What is important, these same two sets of conditions are observed in any process that we can identify as the acquisition of information. Even in something as simple as a radio-controlled toy car, commands can be received only when, first, the radio signal is delivered properly (the antenna is not broken and the car has not rolled too far from the remote control) and, second, the car's control unit "understands" the commands sent by the remote. It turns out that even though everything seems to happen in reliably deterministic hardware, the knowledge that the receiver's designer got from the transmitter's designer is the most important component ensuring that the receiver successfully receives data from the transmitter. It was this knowledge that made the receiver a material object arranged in a special way. The radio wave that reached the antenna is not all the information that entered the receiver. There was also, perhaps, an e-mail that the developer of the car's control unit received from the colleague who developed the remote.

    Both components, the signal and the context, can be considered both in the material aspect and in the informational aspect. But while it is sometimes possible to abstract from the informational aspect of the signal (especially when the channel capacity is deliberately redundant), it is impossible to abstract from the informational aspect of the context, which in its essence is the ability to interpret the signal. Context is information about how a signal can be interpreted, and therefore we must regard it as an intangible entity.

    It may seem that shifting the mysterious immateriality onto this equally mysterious "context" involves an element of cheating. But it is not difficult to notice that the information perceived and the information constituting the context are different information. The plot of a book and knowledge of the language in which it is written are different knowledge. If the resulting recursiveness of the structure (the existence of a second-order context requires a third-order context, and so on into infinity) causes some anxiety, then, looking ahead a little, I note that this is not a defect of the signal-context construction but probably its most valuable property. We will return to this topic in the fifth chapter, where the recursiveness of the signal-context construction will let us prove an extremely useful theorem.

    For solving our metaphysical problems, the essential benefit of viewing information as what happens when a signal meets a context is that this construction turns out to be the very bridge between the worlds that we lacked so much. If in a particular situation we manage to abstract from the informational aspects of the signal (which is often not particularly difficult), we become able to reason about the participation of material objects in the information process. If at the same time we manage to consider the context in the fullness of its dual nature (in our age of information technology this is a common thing), then as a result we have, for that specific situation, a complete bridge between the material and informational worlds. It should be noted at once that the presence of the bridge still does not give us the right to reify information again. The signal, considered as a material object, can be reified (the file is written to a flash drive, the flash drive is in a pocket), but the context, that is, the ability to interpret the signal, cannot be reified.

    When considering the classical data transmission situation from the point of view of information theory, we have a transmitter that "puts" information into a signal and a receiver that "extracts" information from it. Hence the persistent illusion that information is something that exists inside the signal. But one must understand that the interpretation of a specially prepared signal is far from the only scenario of acquiring information. Paying attention to what happens around us, we get plenty of information that nobody sent us. The chair does not send us information that it is soft, the table does not send information that it is hard, the black ink on the page of a book does not send us information through its absence of photons, the switched-off radio does not send information that it is silent. We are able to comprehend the material phenomena around us, and they become information for us because we possess, in advance, a context that allows us to interpret what is happening. When we wake up at night, open our eyes and see nothing, we extract the information that dawn has not yet come not from a physical phenomenon that is present but from its absence. The absence of an expected signal is also a signal, and it too can be interpreted. But the absence of context cannot act as some special "null" context. If there is no context, there is no place for information to arise, no matter what signal arrives.

    We all know perfectly well what information is (for creatures living in an information spacesuit there can be no other way), but we are accustomed to call information only that part of it which is designated here as the "signal". The context is, for us, a matter of course, taken for granted, and therefore we habitually leave it out of the brackets. And having bracketed the context out, we are forced to pack all the "information" exclusively into the signal and thus to mercilessly reify it.

    There is nothing difficult about getting rid of the reification of "information". You just need to learn to remember, in time, that besides the signal there is always the context. A signal is just raw material that acquires meaning (value, usefulness, importance and, yes, information) only when it falls into an appropriate context. And context is a thing that must necessarily be spoken of in non-material terms (otherwise such talk will definitely make no sense).

    Let us briefly recall the topic “properties of information” and evaluate how these properties fit into the two-component “signal-context” construction.

    1. Novelty. If the reception of a signal adds nothing at all to the informational aspect of an already existing context, then the event of signal interpretation does not occur.

    2. Credibility. The interpretation of the signal by the context should not produce false information ("true" and "false" are concepts applicable to information but not to material objects).

    3. Objectivity. The same as credibility, but with an emphasis on the fact that the signal may arise from the work of another context. If the context trying to obtain information and the mediating context lack mutual understanding (above all about the goals pursued), the information will not be reliable.

    4. Completeness. The signal is objective and reliable, but it is insufficient for the context to acquire complete information.

    5. Value (utility, significance). There is a signal, but no suitable context. All the words are clear, but the meaning is not grasped.

    6. Availability. A characteristic of the signal. If the signal cannot be obtained, even the presence of the most beautifully suited context will not help the information to arise. For example, anyone could easily figure out what to do with accurate data on how tomorrow's football match will end. But, unfortunately for many, this signal will appear only after the end of the match, when its usefulness and significance will no longer be the same.

    In my opinion, the items listed above resemble not properties but a list of possible faults. Properties should describe what we can expect from the subject in question and what we cannot. Let's try to derive from the "signal + context" construction at least a few obvious consequences, which will in fact be properties not of some specific piece of information but of information in general:

    1. Information is subjective. The signal may be objective, but the context is always subjective. Therefore information by its nature can only be subjective. One can speak of the objectivity of information only if it has been possible to ensure the unity of context across different subjects.

    2. The informational inexhaustibility of the signal. The same signal, falling into different contexts, yields different information. That is why, re-reading a favorite book from time to time, one can find something new in it each time.

    3. There is no law of conservation of information. None at all. We like it when the objects we operate with strictly obey conservation laws, are not inclined to appear out of nowhere, and even less have the habit of vanishing into nowhere. Information, unfortunately, is not such an object. Only the signal can be relied on to obey conservation laws, but there is no information inside the signal and there cannot be. We just need to get used to the idea that, in its normal mode, information simply comes from nowhere and goes nowhere. The only thing we can do to preserve it somehow is to take care of preserving the signal (which is, in principle, not a problem), the context (which is much harder, because it is changeable), and the reproducibility of the situation in which the signal meets the context.

    4. Information is always the complete and undivided property of the subject in whose context it arose. A book (a physical object) may be someone's property, but the thought generated by reading it is always the undivided property of the reader. Private ownership of information could be legalized only if we first legalized private property in other people's souls. This, however, does not cancel the author's right to be considered the author. Especially if it is true.

    5. Characteristics that apply only to information cannot be ascribed to a signal. For example, the characteristic "true" can apply only to information, that is, to the combination of a signal with a context. The signal itself can be neither true nor false. The same signal combined with different contexts can yield true information in one case and false information in another. I have two pieces of news for followers of "book" religions, one good and one bad. The good: their holy books are not a lie. The bad: they do not contain truth in themselves either.

    To answer the question "where does information exist?" without using the two-component signal-context construction, one has to resort to one of the following popular approaches:

    1. "Information can exist in material objects." For example, in books. Carried to its logical conclusion, this approach inevitably forces one to admit the existence of an "info-substance", a subtle matter present in books in addition to paper fibers and particles of ink. But we know how books are made. We know for certain that no magical substance is poured into them. The presence of subtle substances in the objects we use to acquire information contradicts our everyday experience. The signal-context construction does perfectly well without subtle substances, and it also gives an exhaustive answer to the question "why do we need the book itself in order to read the book?"

    2. "The world is permeated with information fields, in whose subtle structure everything we know is recorded." A beautiful and very poetic idea, but if so, it is unclear why you need a volume of Hamlet in order to read Hamlet. Does it work as an antenna tuned to a specific Hamlet wave? We know how volumes of Hamlet are made. We know for certain that no detector circuits tuned to receive otherworldly fields are embedded in them. The signal-context construction requires no assumptions about the existence of parallel invisible worlds. It does perfectly well without these superfluous entities.

    3. "Information can exist only in our heads." A very popular idea, and the most insidious and tenacious version of reification. Its cunning is due primarily to the fact that science has not yet developed any coherent understanding of what happens in our heads, and in the darkness of this obscurity it is convenient to hide any loose ends. In our large and diverse world it happens that a person writes a work and then dies without having had time to show it to anyone. Years later the manuscript is found in the attic, and people learn what none of them has known all this time. If information can exist only in heads, how does it skip over the period when not a single head possesses it? The signal-context construction explains this effect simply and naturally: the signal (the manuscript) was preserved all along, and the information arose anew as soon as the signal once again met a suitable context, that of the reader who found it.

    Let's see how the idea of signals and contexts fits into what happens during the transfer of information. It would seem that something surprising must be going on: there is information on the transmitter's side, then the transmitter gives the receiver a signal in which there is no information, and then there is information on the receiver's side again. Suppose Alice intends to ask Bob to do something. Note at once that Alice and Bob do not have to be living people. Alice can be, for example, a business logic server and Bob a database server; the essence of what happens does not change. So, Alice has information which, of course, exists inside her as a combination of signal and context. Using this information, as well as information about what signals Bob can receive and interpret, she makes some changes in the material world (for example, she writes a note and attaches it to the fridge with a magnet or, if Alice and Bob are servers, she uses the network infrastructure). If Alice was not mistaken about Bob, then Bob receives the signal into his context and finds information about what he should do now. The key point is the commonality of the context. If we are talking about people, the commonality of context is ensured by a common language and involvement in joint activities. If we are talking about servers, the commonality of contexts is realized through the compatibility of data exchange protocols. It is the commonality of contexts that allows information, as it were, to jump over that part of the path where it cannot exist and appear on the receiver's side. Generally speaking, information, of course, does not jump anywhere. That Alice possesses the same information as Bob can be said only if they have indistinguishably identical signals and indistinguishably identical contexts. In people's lives this does not happen. It is impossible to see the color green exactly as another person sees it, but it is possible to agree between ourselves that we will designate such a color with the signal "green".
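The dependence of information on context can be shown with a minimal Python sketch (the byte string and both interpretations are arbitrary choices for illustration): one and the same signal yields different information depending on the context applied to it.

```python
# One and the same signal (a byte string) gives different information
# when combined with different contexts (ways of interpreting it).
signal = b"Hi!"

as_text = signal.decode("ascii")           # context 1: the bytes are ASCII text
as_number = int.from_bytes(signal, "big")  # context 2: the bytes are a big-endian integer

print(as_text)    # Hi!
print(as_number)  # 4745505
```

Neither interpretation is "in" the bytes themselves; each arises only when the receiver applies its own convention, which is exactly the role the text assigns to context.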

    The signal-context construction is not entirely new to world philosophy. 250 years ago Immanuel Kant wrote that "our knowledge (information?), though derived from experience (signal?), is completely impossible without a priori knowledge (context?)".

    Measuring information


    Measuring information in bits is a favorite pastime. It is impossible to deny oneself the pleasure of speculating about it while trying the method of calculation against the now familiar and, I hope, understandable signal-context construction.

    If we recall the classical theory of information, the generalized formula for the amount of information (in bits) looks like this:

    I = - SUM(i = 1..n) p_i × log2(p_i)

    where n is the number of possible events and p_i is the probability of the i-th event. Let's consider what plays what role in this formula from the points of view of the transmitter and the receiver. The transmitter can report, for example, one of a hundred events, of which the first, second and third each have a probability of 20%, and the remaining 40% is spread evenly over the other ninety-seven events. It is easy to calculate that, from the transmitter's point of view, the amount of information in a report about a single event is approximately 4.56 bits:
    I = -(3 × 0.2 × log2(0.2) + 97 × (0.4/97) × log2(0.4/97)) ≈ -(-1.393156857 - 3.168736375) ≈ 4.56

    Please do not be surprised by the fractional result. In engineering practice such values, of course, have to be rounded up, but the exact value is often of interest as well.

    If the receiver knows nothing about the distribution of probabilities (and how would he know?), then from his point of view the amount of information received is 6.64 bits (this, too, is easily calculated with the formula). Now imagine that for the receiver's needs only events number 1 ("execute"), 2 ("pardon") and 100 ("award a medal") are interesting, and everything else is an uninteresting "other". Suppose the receiver already has statistics from previous episodes and knows the probability breakdown: execute, 20%; pardon, 20%; award a medal, 0.4%; other, 59.6%. Substituting into the formula, we get 1.41 bits.
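All three figures are easy to check with a short Python sketch of the formula (the probability lists simply mirror the example above):

```python
import math

def shannon_bits(probs):
    """Average amount of information per event: I = -sum(p * log2(p))."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# The transmitter's view: 100 events, three with probability 0.2 each,
# the remaining 40% spread evenly over the other 97 events.
print(round(shannon_bits([0.2] * 3 + [0.4 / 97] * 97), 2))  # 4.56

# A receiver that knows nothing assumes 100 equally probable events.
print(round(shannon_bits([1 / 100] * 100), 2))  # 6.64

# A receiver that distinguishes only four outcomes:
# execute 20%, pardon 20%, award a medal 0.4%, other 59.6%.
print(round(shannon_bits([0.2, 0.2, 0.004, 0.596]), 2))  # 1.41
```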

    The spread is substantial. Let's look for an explanation of this phenomenon. If we recall that information is not just the objectively existing signal but the combination "signal + context", then it is not at all surprising that the amount of information arising when a signal is received must also be context-dependent. Thus we have good agreement between the signal-context concept and the mathematical theory of information.

    The value of I calculated by the formula is usually used for solving the following problems:

    1. For building data transmission systems. If the coding problem is formulated as "transmit everything there is, but do it as efficiently as possible", then, solving it for the case described in the example above, one needs to aim at the value of 4.56 bits per event. That is, try to make a million transmission cycles fit, on average, as close as possible to 4,561,893 bits. You should not count on managing to squeeze into a smaller volume: mathematics is relentless.

    2. For understanding how the recipient's uncertainty decreases when a signal arrives. It is believed that incoming information reduces the receiver's informational entropy by its amount. If we consider the amount of information in this sense, then the correct answers, depending on the properties of the receiver, are 6.64 and 1.41 bits. The value of 4.56 bits is also a correct answer, but only if the receiver is interested in all the events and knows their probabilities in advance.
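For the first task, coding efficiency, a sketch may help. A Huffman code built for the example distribution is guaranteed by Shannon's source coding bound to have an average code length between the entropy and the entropy plus one bit; the code below is an illustration of that bound, not tied to any particular transmission hardware:

```python
import heapq
import math

def huffman_lengths(probs):
    """Return the Huffman code length (in bits) for each symbol."""
    heap = [(p, i, [i]) for i, p in enumerate(probs)]  # (prob, tiebreaker, symbols)
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    tie = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:      # every symbol in the merged subtree
            lengths[s] += 1    # sinks one level deeper, gaining one bit
        heapq.heappush(heap, (p1 + p2, tie, s1 + s2))
        tie += 1
    return lengths

probs = [0.2] * 3 + [0.4 / 97] * 97
entropy = -sum(p * math.log2(p) for p in probs)
avg_len = sum(p * l for p, l in zip(probs, huffman_lengths(probs)))
print(round(entropy, 2))                 # 4.56
print(entropy <= avg_len < entropy + 1)  # True (Shannon's bound for Huffman codes)
```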

    In the overwhelming majority of cases, when we speak of bits, bytes, megabytes or, say, gigabits per second, we mean the first interpretation. We all like using broadband Internet much more than a stunted dial-up connection. But sometimes it happens that we spend half a day on the Internet, read a mountain of texts and watch a pile of videos only to finally get a simple binary answer, in the style of "yes or no", to the question that interests us. In that case our uncertainty is reduced not by the tens of gigabytes we had to download but by just one bit.

    The entropic interpretation of the nature of information raises more questions than it answers. Even from a purely everyday point of view we can see that minimal uncertainty is observed in those citizens who have not read a single book and whose educational contacts with the outside world are limited to television shows and sports broadcasts. These respected subjects enjoy complete, happy certainty on all conceivable questions of the universe. Uncertainty appears only with the broadening of one's horizons and the acquisition of the pernicious habit of thinking. A situation in which obtaining information (reading good, intelligent books) increases uncertainty is impossible from the point of view of entropy-based information theory, but from the standpoint of the signal-context view it is quite an ordinary phenomenon. Indeed, if the reception of a signal results in the formation of a new context, then to feed that context we need more and more new signals, and each of them can, as a side effect, create yet another primordially hungry context. Or even several.

    No less surprising is the reasoning that information must somehow be related to orderliness (if entropy is a measure of chaos, then negentropy, that is, information, should be a measure of order). Let's look at the following sequences of zeros and ones:

    1. 0000000000000000000000000000000000000000. Perfect order, a "housewife's dream". But there is no information here, just as there is none on a blank sheet of paper or a freshly formatted hard disk.
    2. 1111111111111111111111111111111111111111. Essentially the same.
    3. 0101010101010101010101010101010101010101. Already more interesting. The order is still perfect, but information is still scarce.
    4. 0100101100001110011100010011100111001011. This one I was not too lazy to produce by tossing a coin: 0 for heads, 1 for tails. I tried to toss honestly, so it may be assumed that a perfect mess resulted. Is there information here? And if so, about what? The answer is "about everything", but if so, how can it be extracted in a usable form?
    5. 1001100111111101000110000000111001101111. Similar to the coin, but produced by a pseudo-random number generator.
    6. 0100111101110010011001000110010101110010. It also looks like random nonsense, but it is not. Below I will say what it is.

    If we remove the accompanying comments and pose the riddle of which of these sequences could be the result of tossing a coin, the first three options are dismissed immediately. The fifth also falls under suspicion, because it has more ones than zeros. But this reasoning is wrong: with an honest coin, each of these sequences occurs with the same probability of 2^(-40). If I kept tossing a coin without sleep or rest, hoping to reproduce at least one of the six options presented, then, with luck, I could expect to succeed in about a hundred thousand years. But which of the options would be reproduced first is impossible to predict, since they are all equally probable.

    The sixth sequence, by the way, is the word "Order" in eight-bit ASCII code.
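This decoding step is easy to reproduce (a small Python sketch; the bit string is the sixth sequence above):

```python
bits = "0100111101110010011001000110010101110010"

# Context: read the signal as groups of eight bits,
# each group being one ASCII character code.
chars = [chr(int(bits[i:i + 8], 2)) for i in range(0, len(bits), 8)]
print("".join(chars))  # Order
```

Without the "eight bits per ASCII character" context, the same forty bits remain indistinguishable from the coin-toss sequences around them.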

    It turns out that there is no information either in perfect order or in perfect disorder. Or is there? Imagine that a perfectly random sequence of zeros and ones like No. 4 was obtained by tossing a coin not by me but by an employee of the cipher center of an enemy army, and is now used as part of a secret key that encrypts dispatches. In that case these zeros and ones immediately cease to be meaningless digital junk and become supremely important information, for which codebreakers would be ready to sell their souls. No wonder: the signal found its context and thereby became highly informative.

    I do not wish to claim that the entropic theory of information is entirely incorrect. There are a number of highly specialized applications in which it gives an adequate result; one just needs to understand clearly the limits of its applicability. It can be assumed that one such limit should be the requirement that the received signal not lead to the formation of new context. Most communication tasks fit this criterion, and there it really does make sense to speak of extracting a signal from noise as a struggle against entropy.

    Measuring information has one more aspect that is better not forgotten. The result of any single measurement is a number; in our case, bits, bytes, gigabytes. Having obtained numbers, we usually expect to be able to operate on them in the usual ways: compare, add, multiply. Consider two examples of applying the operation of addition to amounts of information:

    1. There are two flash drives: the first 64 GB, the second 32 GB. In total we can write 96 GB onto them. Everything here is fair and correct.

    2. There are two files. The first is 12 MB, the second 7 MB. How much information do we have? The hand itches to add them up and get 19 MB. But let's not hurry. First, feed these files to an archiver. Suppose the first file shrinks to 4 MB, the second to 3 MB. Can we now add the numbers and get the true total amount of available data? I would suggest not hurrying, and instead looking at the contents of the source files. We look and see that the entire contents of the second file are also present in the first. It turns out that it makes no sense to add the size of the second file to the size of the first at all. If the first file were different, the addition would make sense, but in this particular case the second file adds nothing to the first.
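    This non-additivity is easy to observe with any real compressor. A sketch using Python's zlib, with hypothetical file contents chosen so that the second "file" is wholly contained in the first:

```python
import zlib

# Hypothetical stand-ins for the two files; the second file's content
# is entirely contained in the first, as in the example above.
file1 = b"All happy families are alike. " * 2000 + b"A unique tail."
file2 = b"All happy families are alike. " * 500   # fully contained in file1

c1 = len(zlib.compress(file1))
c2 = len(zlib.compress(file2))
c_both = len(zlib.compress(file1 + file2))

# The compressed concatenation is far smaller than the sum of the parts:
# the second file adds almost nothing to the first.
print(c1, c2, c_both, c1 + c2)
```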

    From the point of view of the amount of information, the situation with quines, programs one of whose functions is to output their own source code, is very interesting. In addition to this function, such a program may contain something else: some useful algorithm, texts, images, and the like. It turns out that inside the program there is this "something else", and in addition there is the program itself, which within itself once again contains the whole of itself plus that same "something else". This can be expressed by the formula A = A + B, where B is not equal to zero. For additive quantities such an equality is impossible.
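    A = A + B is easy to exhibit concretely. Below is a minimal Python quine carrying a hypothetical payload (the `payload` line stands in for the "something else"); running it prints its own complete source text, which contains both the payload and the machinery that reproduces everything, payload included:

```python
payload = "something else: a useful text"  # the B part
s = 'payload = "something else: a useful text"  # the B part\ns = %r\nprint(s %% s)'
print(s % s)
```

The output of `print` is character-for-character the three lines of the program itself, so the program "contains" itself plus B, which no additive measure of size can accommodate.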

    Thus a very strange situation obtains with the amount of information. It can be said that the amount of information is a conditionally additive quantity: in some cases we have the right to add the available numbers, and in others not. When we speak of the capacity of a data transmission channel (in particular, a flash drive can be considered a data transmission channel from the past to the future), addition is correct; but when weighing a particular signal we get a value whose addability to other similar values is determined by external factors, of whose existence we may not even know. For example, one can speak of the information capacity of the human genome (DNA can be considered a data transmission medium, and, as far as I know, there are groups of researchers trying to design DNA-based storage), but the question "How much information is recorded specifically in my genome?" will be meaningless. The most that can be asserted is that whatever calculation method is used, the result cannot exceed those same 6.2 Gbit. Or, if reality suddenly turns out to be such that not only the sequence of nucleotide bases must be taken into account, then it can. And if we ask about the total amount of information contained in a living cell, then apparently the answer cannot be obtained at all, because the cell itself is a living being and not a data transmission medium.

    To close the topic of measuring information, I would like to introduce the concept of an "informativity class", which allows one to estimate the amount of information, if not quantitatively, then at least qualitatively:

    1. Finite informativity: the situation when the entire signal needed by the context can be encoded by a discrete sequence of finite length. For such situations the measurement of information in bits is applicable. Examples:

      • The text of "Hamlet."
      • All extant texts ever written by mankind.
      • Information in the genome.

      The information technologies currently available work precisely with finite informativity.

    2. Infinite informativity: a situation when a discrete sequence of infinite length is required to encode the signal, and any restriction to finite length ("lossy compression") is unacceptable. Example: the data on the positions of the balls that must be maintained in an ideal simulation of billiards, so that if the process is then run in the reverse direction, the initial position is restored. In this case the speeds and positions of the balls must be kept infinitely precisely (an infinite number of decimal places), because, owing to the strong nonlinearities involved, an error in any digit tends to accumulate and lead to a qualitatively different result. A similar situation arises in the numerical solution of nonlinear differential equations.
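    The accumulation of error can be sketched with an even simpler nonlinear system than billiards, the logistic map (an illustration of the same point, not a model of billiards): two trajectories that initially differ in roughly the twelfth decimal place become completely different within a few dozen steps.

```python
# Two runs of the chaotic logistic map x -> 4x(1-x), with starting
# points differing by 1e-12; the tiny error roughly doubles per step.
x, y = 0.3, 0.3 + 1e-12
gap = []
for _ in range(60):
    x, y = 4 * x * (1 - x), 4 * y * (1 - y)
    gap.append(abs(x - y))

print(f"gap after 10 steps: {gap[9]:.1e}")          # still tiny
print(f"largest gap within 60 steps: {max(gap):.2f}")  # of order one
```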

      Despite the seeming transcendence, there are no fundamental reasons why, as technology develops, we could not acquire means of working with infinite informativities.

    3. Unresolvable informativity: a situation in which the required data cannot be obtained in any way because of fundamental limitations of either a physical or a logical nature. Examples:

      • It is impossible to find out what happened yesterday on a star 10 light-years away from us.
      • It is impossible to find out simultaneously and with absolute precision the momentum and the position of a particle (quantum uncertainty).
      • Being in a situation of decision-making, the subject cannot know in advance which of the available alternatives he will decide upon. Otherwise (if he already knows the decision) he is not in a decision-making situation.
      • A complete deterministic description of the universe cannot be obtained in any way. The whole complex of fundamental constraints, both physical and logical, works against it at once. On top of that come the effects associated with the barber paradox.

      If, with regard to the physical limitations, there is still some hope that a clearer picture of reality will allow some seemingly unresolvable informativities to be transferred to the finite, or at least the infinite, class, then the logical constraints cannot be overcome by any technological development.

    "Information" in physics


    Historically, the connection between the topics of "information" and "entropy" arose from the discussion of Maxwell's demon. Maxwell's demon is a fantastic creature sitting by a door in the wall separating two halves of a chamber filled with gas. When a fast molecule flies at the door from the left, the demon opens it, and when a slow one does, it closes it. If a fast molecule arrives from the right, the demon closes the door, and if a slow one, it opens it. As a result, slow molecules accumulate on the left and fast ones on the right. The entropy of the closed system decreases, and on the temperature difference created by the demon we could happily run a perpetual motion machine of the second kind.

    A perpetuum mobile is impossible, and therefore, to bring the situation into line with the law of conservation of energy, and at the same time with the law of non-decreasing entropy, one had to reason as follows:

    1. When the demon is at work, the entropy of the gas decreases.
    2. But since the molecules interact with the demon, the gas is not an isolated system.
    3. The system "gas + demon" should be considered the isolated system.
    4. The entropy of an isolated system cannot decrease, so the entropy of the gas plus the entropy of the demon does not decrease.
    5. From this it follows that the entropy of the demon grows.

    So far, everything is logical. But what does "the demon's entropy grows" mean? The demon receives information (we are still working in traditional terminology) about the approaching molecules. If information is negative entropy, then the demon's entropy should decrease, not grow. Suppose the demon performs a simple mental effort and, through the mechanism of the door, transmits information to a flying molecule (or, alternatively, does not transmit it). Negative entropy returns to the molecule and thereby reduces the entropy of the gas. But why then does the demon's entropy grow? Why do we consider only the information flow outgoing from the demon, but not the incoming flow? What happens if the demon does not immediately forget the signals it received from the arriving molecules, but memorizes them? Can we say in that case that the demon's entropy does not increase?

    Norbert Wiener, considering Maxwell's demon in "Cybernetics", writes that a perpetual motion machine cannot be assembled on this basis, because sooner or later the demon's growing entropy will reach a critical limit and the demon will break down. In principle this is logical, but it is unlikely that the demon's breakdown should be explained by its distributing its original wisdom to the molecules and thereby itself growing stupid. From an informational point of view the demon's work is very simple and tedious; there can be no talk of any "waste of mental powers". Similarly, we do not say that every file passed through an archiver program increases the archiver's entropy and thereby gradually reduces its ability to compress data. Most likely, the impossibility of a perpetual motion machine based on Maxwell's demon should be explained not by informational considerations, but by physical ones.

    The formulas by which thermodynamic and informational entropies are calculated are broadly similar. Thermodynamic entropy (compare with formula (1) above):

    S = −k_B · Σ_i p_i · ln p_i     (2)

    where p_i is the probability of the i-th state and k_B is the Boltzmann constant. But this formula is inevitably tied to the existence of a subject who has classified the states and singled out a finite number of groups of interest. If we try to get rid of this interested subject, there is a high risk of discovering that the expression should correctly be written like this:

    S = −k_B · ∫ p(x) · ln p(x) dx

    The total probability is 1 (the system must be in one of the states):

    ∫ p(x) dx = 1

    An infinite number of possible states is much closer to the truth of life than a finite one. It is easy to show that if, in the system under consideration, the proportion of states x for which the probability p(x) is nonzero does not tend to zero, then the integral entropy tends to infinity. In terms of formula (2): S → ∞.

    Thus, if the assumption that the operation of integration is appropriate here is correct (and for that it suffices that at least one of the physical quantities involved has the property of continuity), then the "information" capacity of practically any material system (that is, any except degenerate cases) turns out to be unlimited. This destroys any point in equating thermodynamic entropy with informational entropy. The similarity of the formulas can be attributed to the fact that in our world there are many fundamentally different things expressed by similar formulas. There are other arguments in favor of identifying thermodynamic and informational entropies, but, as far as I know, they have either never been subjected to experimental verification, or (as, for example, Landauer's principle) are themselves derived from the assumption that the entropies are equal.
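    The dependence of the result on the subject's classification is easy to demonstrate numerically: the Shannon entropy of the very same continuous quantity grows without bound as the observer refines the partition into states. A sketch (the uniform samples are an arbitrary illustrative choice):

```python
import math
import random

random.seed(0)
# 100000 samples of a continuous quantity, uniform on [0, 1).
samples = [random.random() for _ in range(100_000)]

def binned_entropy(data, n_bins):
    """Shannon entropy in bits after classifying data into n_bins states."""
    counts = [0] * n_bins
    for x in data:
        counts[min(int(x * n_bins), n_bins - 1)] += 1
    n = len(data)
    return -sum(c / n * math.log2(c / n) for c in counts if c)

# The finer the observer's classification, the larger the "entropy":
# roughly log2(n_bins), with no finite limit as the bins shrink.
for n_bins in (2, 16, 256, 4096):
    print(n_bins, round(binned_entropy(samples, n_bins), 2))
```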

    Speaking of the connection between the topic of "information" and physics, one cannot fail to mention the concept of "quantum information". The laws of quantum mechanics are such that in some cases it really does make sense to use informational terms in describing what happens. For example, according to the Heisenberg uncertainty principle, we can precisely know either the momentum of a particle or its position, but not both at once. From this arises the illusion that by taking a measurement we can obtain no more than some maximum amount of information, and from this, as it were, the conclusion automatically follows that there is information inside the particle, and moreover that its volume is strictly limited. I cannot say anything about the productivity or counterproductivity of such a use of informational concepts, but there is a strong suspicion that a bridge is here being drawn between the purely physical concept of "quantum information" and the "information" we speak of in all other cases.

    To transmit our macro-level information we use not only physical objects and phenomena but also their absence. The text in a book is encoded not only by the substance of the ink but also by the unpainted gaps (nothing can be read from a uniformly colored sheet). One can also easily think up many situations in which a very important signal is transmitted not by an energy impact but by its absence. I am still prepared to imagine that inside a particle there is some mysterious substance called information, but to imagine that inside the absence of a particle there is also information is something completely counter-logical.

    At the current level of our knowledge of how the world works, it seems to me that the concept of "quantum information" should be treated the same way as the concept of "color" applied to quarks. That is, yes, "quantum information" may well deserve recognition as a valuable concept, but it should be clearly understood that it relates only indirectly to the "information" we speak of in all other cases. Perhaps the conflict can be resolved by the consideration that physics can quite productively study the material basis of the transmitted signal (in particular, answer questions about the maximum possible capacity of a data transmission channel), but the presence of a signal is a necessary, not a sufficient, condition for us to have the right to say that there is information in the object under consideration.

    We need to understand clearly that we have no physical basis of information (an analogue of the phlogiston theory, only applied not to heat but to information) not because we do not yet know everything, but because there cannot be one in principle. One of the most essential requirements of the natural-scientific method, applied most clearly and consistently in physics, is the expulsion from the phenomenon under study of anything possessing the free will of an acting subject. The subject (the so-called "implicit observer") must of course be near the phenomenon under consideration, but he has no right to interfere in anything. The mechanicalness of the phenomena studied, that is, the total absence of purposeful activity, is what makes physics physics. But as soon as we start talking about information, we cannot get away from the fact that the signals received by a subject are raw material for decision-making. The implicit observer of physical phenomena must not care what he observes, and he cannot in principle be an actor living simultaneously in the material world and in informational reality. From this diametrical opposition in the requirements placed on the subject situated inside the phenomena under study, it follows that the phenomenon of "information" cannot be reduced to any physical phenomena, including even those not yet discovered.

    What is particularly surprising is that an excellent consensus has been reached between materialists and idealists on the need for a deep physicalization of "information". It plays into the materialists' hands that physics thereby achieves totality of description of reality (nothing remains that is not physical reality). And the idealists celebrate victory because in this way their "spirit" is officially recognized as the basis of the universe. The two long-warring camps celebrate victory not so much over each other as over common sense. Both materialists and idealists react very aggressively to any attempt to link the material and ideal worlds in any way alternative to banal reification.

    Data


    As mentioned above, not only a material object but also an intangible one can be considered a signal. By the principle of the totality of physical reality, a signal must of course have a physical embodiment, but quite often situations arise in which the physical side of the signal does not interest us at all; only its intangible component does. In such cases we completely abstract away from the physics of the signal, and as a result obtain a rather strange subject for further discussion. We have discarded the physics, and yet we still cannot speak of information being present inside this object, since it is only a signal, and for information to arise it needs a context. Such objects we will call data. Data is an intangible signal. It is intangible not because it has some otherworldly nature and travels through subtle astral planes, but because in this particular case it turned out to be unimportant to us how it travels. For example, a small volume of "Hamlet" in a beautiful binding, and moreover some rare edition of it, is a signal in which we are interested in both the material and the intangible components. But if we just need to refresh the "to be or not to be" monologue in memory, we are looking for the text, and we do not care where we find it: a paper book, a file on a flash drive, or an online library service will all do. The text of "Hamlet" is data, while the volume of the gift edition of "Hamlet" is no longer just data.

    Of particular interest is the case of an object for which not only is the physics inessential, but a suitable context is also missing. Imagine an inscription in an unfamiliar language (I do not know Chinese, so let it be Chinese). I want to know what the inscription means, and so I take a piece of paper and carefully copy out the characters. For me they are just dashes and squiggles. The meaning of the picture will appear only after I show this sheet to someone who knows Chinese, and he translates the inscription into some language I can understand. Until that happens, I have on the piece of paper an informational object in which there certainly is a signal, but a signal for a context that is absent at the moment.

    In the case of copying the Chinese characters, I could have spared myself the trouble of redrawing the data (for this is data) onto paper, and instead photographed it on my phone and sent it to a friend by email. During this signal's journey to my friend, the absence of a context for interpreting the inscription would be observed not only in me but also in the phone's software, the mail program, and all the magnificence of Internet protocols participating in the data transmission. One could say that such a thing as understanding is peculiar exclusively to us, super-complex creatures of flesh and blood, but that would not be entirely true. For example, when transmitting the picture with the characters, the transport layer of the network supplements the transmitted data with its own service data, which is understandable (that is, will be correctly interpreted) to the mechanisms implementing the transport layer of the data network. If we assume that understanding is not necessarily something mysterious and lofty, penetrating with its gaze into the very essence of phenomena, but merely the presence of an adequate context (in the case of the network transport layer this context is formed by the fact that network infrastructure developers honor the TCP protocol), then we can confidently say that our technical systems are also endowed with the ability to understand. True, this understanding is not much like our ability to grasp the essence of phenomena, which we observe from within ourselves, but that does not change matters.
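    The dependence of "understanding" on context can be shown even at the level of four bytes: the same signal yields entirely different readings under different interpretation rules. A small Python sketch (the byte values are an arbitrary example):

```python
import struct

# One and the same "signal": four bytes with no inherent meaning.
signal = b"\x42\x28\x00\x00"

as_int = struct.unpack(">i", signal)[0]    # context: big-endian 32-bit integer
as_float = struct.unpack(">f", signal)[0]  # context: big-endian IEEE 754 float
as_text = signal.decode("latin-1")         # context: one byte per character

print(as_int)         # 1109917696
print(as_float)       # 42.0
print(repr(as_text))  # 'B(\x00\x00'
```

The bytes did not change; only the context did, and with it the "information" obtained.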

    The notion of "data", although it introduces nothing fundamentally new into the metaphysics of information, is nevertheless extremely useful from a practical point of view. The two-component "signal-context" construction, although complete (no third component is needed), produces a mass of inconveniences as soon as one tries to apply it in everyday life. The source of the inconvenience is that the concept of "signal" is clearly associated with the material side of the process, and when the material side has to be ignored, the "grounding" force of the word "signal" begins to get badly in the way. Imagine that a friend of yours is going to take a trip to Bremen and asks you how he could learn more about the city. The first thing that comes to mind is Wikipedia. Looking over the different language sections, you notice that the Russian article, though good, is very short, while the English one, though much longer, is still inferior to the German article (which is not at all surprising). Now you need to tell your friend that there is more information in the English article than in the Russian one, but then, remembering the philosophy of information, you realize that there can be no information in any of the sections. A Wikipedia article is a signal that becomes information only when it falls into a context. A problem. "The signal recorded on the hard drives of the English-language Wikipedia servers, when it gets into the context of your perception..." Ugh, how awful. And how is the friend to reach those hard drives with his context? "The signal delivered via Wi-Fi from the English-language servers..." is also somehow wrong. What does Wi-Fi have to do with it, if the friend can just as well reach Wikipedia over the mobile Internet?
    When the concept of "signal" is replaced by the synonym "data" (in this case it does turn out to be a synonym), all the inconveniences disappear: "You can look at Wikipedia, but keep in mind that the English article, and especially the German one, contains much more data about Bremen." We have taken advantage of the fact that, although, as we now know, there can be no information in the article, the data is, in essence, the article itself: a signal whose physical implementation is unimportant to us in this particular case.

    Speaking from my own practice: having experimented with switching to the correct terminology in everyday life and in professional activity (information technology), I have never once had an interlocutor notice that anything had changed. The only thing is that now one has to pay attention to what is being discussed, data or, after all, information. For example, it is not information that is stored in a database but data; yet users, by passing this data through the database, thereby exchange information. The system still remains an information system, but it functions on the basis of accumulated data.

    With the development of transmission networks, we have acquired a fairly simple criterion for determining whether we have the right to abstract completely from the physics of a particular object and, as a result, to speak of it as an informational object (that is, as data). The criterion: if the object can be transmitted via the Internet, then we have every right to speak of it as an informational object.

    Examples:

    • A cutlet is not an informational object, because it interests us (tasty and nutritious) precisely in its physical incarnation.
    • A recipe for cooking cutlets is an informational object. It can be transmitted without loss via the Internet, with all the details and subtleties, with pictures, and even with video.
    • A coin is not quite an informational object. Especially if it has numismatic value.
    • Money is an informational object. Many of us, including me, have paid via the Internet. In general, money is an extremely interesting object from the standpoint of the philosophy of information. Perhaps you remember it was said above that information does not obey conservation laws; but for money to work, it must obey a conservation law. Therefore, for the informational object "money", an infrastructure has been artificially created that purposefully maintains the balance "if something has increased somewhere, then exactly the same amount has decreased somewhere else". We will return to the phenomenon of money when we discuss subjects and system formation.

    For purity of terminology, of course, it would be better to speak not of an "informational" object but of an intangible one. But the term "informational" is much more convenient, since it contains no negation.

    Note that the simple empirical rule just considered for identifying an informational object has an "if-then" structure, and therefore works in one direction only. That is, from the fact that we cannot transmit something via the Internet, it does not follow that the object is not informational. For example, we cannot transmit the number pi in "live" form (that is, as its full sequence of digits). We can transmit the recipe for cooking this "cutlet" (that is, a program that successively calculates the digits of pi after the decimal point), we can transmit a picture with its symbol, but we cannot take and transmit the "cutlet" itself.

    Information in pi


    Since we have mentioned pi, it makes sense to examine one amusing case connected with it.

    Rumor has it that among the digits making up the infinitely long tail of pi one can in theory find any predetermined sequence of digits. To be completely accurate, this is still only a hypothesis, neither proven nor refuted. Real numbers with the property of containing every finite sequence of digits do exist (they are called "normal"), but the hypothesis that pi is normal has not yet been proved. In particular, a normal number containing any sequence of zeros and ones can be obtained by successively appending to the tail after the point all combinations of digits, enumerated in order of increasing length. Like this:
    0, (0) (1) (00) (01) (10) (11) (000) (001) (010) (011) (100) (101) (110) (111) (0000) ... and so Further.

    In decimal notation this number is slightly more than 0.27638711, and it is guaranteed to contain the contents of any file on your hard disk, even one you have not written there yet.
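    The construction and the guarantee of containment can be checked directly. A sketch that generates the first few thousand bits of this number and verifies that an arbitrary 8-bit pattern (here the ASCII code of the letter "O") occurs in them, along with the decimal value quoted above:

```python
from itertools import count, product

def normal_bits(n):
    """First n bits after the point of 0.(0)(1)(00)(01)(10)(11)... ."""
    bits = []
    for width in count(1):
        for combo in product("01", repeat=width):
            bits.extend(combo)
            if len(bits) >= n:
                return "".join(bits[:n])

prefix = normal_bits(4000)   # covers every block up to 8 bits long
print("01001111" in prefix)  # -> True (every 8-bit pattern is enumerated)

# Decimal value of the fractional part, estimated from the first 60 bits:
value = sum(int(b) / 2 ** (i + 1) for i, b in enumerate(normal_bits(60)))
print(value)  # slightly more than 0.27638711, as stated above
```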

    But let us close our eyes to the fact that the normality of pi is unproven, and treat it as normal in our reasoning. The number pi is wreathed in a mass of stories, riddles, and superstitions, and it is therefore more interesting to talk about than some mere algorithmic artifact. If the mathematical liberty bothers you, simply assume that in what follows I am speaking not about pi but about any number normal to base 2.

    A very majestic picture emerges. Imagine that in your declining years you sit down, write your detailed biography, and save it to a file. It turns out that this sequence of zeros and ones is already present among the digits of pi right now. And the same sequence is there too, but supplemented with the exact date and circumstances of your death. A veritable book of fate, is it not?

    The beginning of our book of fates (the integer part and the first 20 digits of the endless tail) looks like this:
    11.00100100001111110110...
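    The binary digits of pi can be reproduced from the double-precision value, which is accurate to about 48 fractional binary digits, more than the 20 we need:

```python
import math

# Extract the first 20 binary digits of the fractional part of pi
# by repeated doubling of the double-precision value.
frac = math.pi - 3.0
bits = ""
for _ in range(20):
    frac *= 2
    bits += str(int(frac))
    frac -= int(frac)

print("11." + bits)  # -> 11.00100100001111110110
```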

    Let us think about how such a book of fate could be read. Suppose I have written my biography up to the present moment, taken a calculator of fantastic power, and made it find the beginning of my biography among the digits of pi. It would be foolish to expect the first occurrence to have a meaningful continuation; most likely a meaningless jumble of zeros and ones follows it. After tinkering a little with the calculator's algorithm, I teach it not only to find occurrences of the known part of the biography, but also to analyze whether the continuation is meaningful text written in roughly the same style. And finally my calculator finds such a fragment. I do not know whether it will gladden or sadden me, but I will not stop the calculator; let it continue its work. After some time it will shower me with a heap of versions of my further biography found among the digits of pi. Some will be quite ordinary ("worked, then retired, grew old, fell ill, then died"), but the rest will be far more interesting. For example, one version will say that tomorrow, no earlier and no later, a global zombie apocalypse will break out and the bloodthirsty dead will devour me. And another (necessarily, since all combinations of zeros and ones are there) will say that I will acquire immortality and omnipotence and become master of the universe. And still the endless stream of further options creeps out of the computer. Which of these versions should I believe? Perhaps the very first? But why that one?

    To simplify our task, let us try divination by pi in a slightly simpler form. Let us ask it a simple binary question. For example: would it be profitable for me to buy the block of shares I was looking at today? If the first digit in the fractional part of pi is a one, that means the omniscient oracle has answered that it is profitable; if a zero, that means I should wait. We look. A zero stands right in the first position, while a one, behold, comes not even in the second position but in the third. Hmm, something tells me that with such an oracle I will never buy a single share in my life. One would need to attach to this oracle some additional oracle suggesting which position to look at.

    It turns out that to extract information from the data of the book of fate we lack one very small key: the one that tells us from which position the book should be read. And without the key, the only information contained for us in the endless tail of the digits of pi is the ratio of a circle's circumference to its diameter. Somehow that is even a little sad...

    Chapter Summary


    In this chapter, using the two-component "signal-context" construction, we not only learned to rid ourselves of the reification of "information", but also obtained a tool that lets us build a bridge between the material and non-material aspects of reality without resorting to mystical practices.

    The main concepts considered are:

    1. Information as a combination of signal and context.
    2. A signal as a certain circumstance that can be interpreted.
    3. Context as information about how a signal can be interpreted.
    4. The connection between information and entropy exists, but it should not be absolutized. In some situations the acquisition of information can be seen as a victory over chaos; in others, the opposite; in still others it is impossible even to identify what exactly is being ordered. The connection with entropy is traced most clearly when solving the problem of data transmission through a noisy channel, but that problem is far from everything we can do with information.
    5. Each time we measure information, we must ask ourselves whether we end up with an additive quantity. If the quantity is non-additive, it is better not to add or multiply anything.
    6. The class of informativeness as a means of assessing, at a qualitative level, the prospects of obtaining the required information. There are three classes: finite, infinite, and undecidable informativeness.
    7. Information cannot have a direct physical basis. Any attempt to search for a physical basis of information can and should be viewed as a metastasis of reification. Physics should be connected with information only through the concept of the "signal".
    8. Data as a signal from which the material component has been abstracted away. The concept of "data", although it has no separate philosophical value, makes it possible to get rid of the inconveniences caused by the materialistic orientation of the concept of "signal".
    9. The instrumental test "can it be transmitted via the Internet" for quickly determining whether the subject in question is an informational object.

    From here on it will only get more interesting, but if you have not understood how, with the help of signals and contexts, we managed to reconcile physics with lyrics, the rest of the reading will be a sad affair.



    Continued: Chapter 3. Foundations
