An excerpt from the book “A Day in the Life of the Brain: The Neuroscience of Consciousness from Dawn till Dusk”

    You come back home and turn on the radio. Now you have time for a quick breakfast of cereal and a cup of hot coffee. For the next twenty minutes or so, while you eat, the active stimulation of your ears, eyes, tongue, fingertips and nose will govern your mind. Of course, in some cases consciousness exists without any explicit stimulation of the senses (during meditation, for example, or when you are simply intensely focused on a particular thought), but that requires special skills. Most of the time, what happens in the mind is determined by what is happening directly around you: the five senses continuously fill the brain with a host of signals. To one degree or another, sensations affect consciousness at every moment of wakefulness: they keep us in touch with the outside world and allow us to orient ourselves in it. Returning to the metaphor of a stone thrown into the water, we can now pose the question on which this chapter will focus. It is a question about the power of the throw: how do sensations, pure and simple, affect the mind? Here we immediately run into two problems, one connected with space and the other with time.

    THE FIVE SENSES: THE SPATIAL PROPERTIES OF THE BRAIN


    The spatial problem is rooted in neuroanatomy and consists in the fact that signals from the different sense organs are processed differently. At first glance everything seems simple: you either see something, or hear it, feel its touch, taste it, smell it. We have five senses at our disposal, clearly differentiated from one another. Yet even at the most basic level, the areas of the brain set aside for processing signals from the various sense organs are not inherently specific. In adults, different sensory systems can even violate the formal anatomical boundaries: the visual cortex of blind people, for example, is activated by touch when they read Braille. Moreover, it is well known that if you lose one of the five senses, the others become sharper. The neuroscientist Helen Neville has demonstrated that deafness improves vision and that deaf people use the auditory areas of the brain to process visual signals. Blind people, meanwhile, can distinguish sounds better than sighted people and can locate the source of a sound more accurately. People with impaired vision also show better development of other abilities, such as speech perception and voice recognition. And experiments on animals deprived of one or another sense have revealed that these changes can be enormous: rats, for example, show a threefold improvement in hearing after only a few days spent in complete darkness.

    Even without direct stimulation of the senses, however, the brain can perform interesting tricks when processing signals of different modalities. The phenomenon of synesthesia (literally, a “union of the senses”) has been known to science for some three centuries. In synesthesia, stimulation of one sense organ, which the vast majority of people would associate with only one category of sensation, produces sensations in two different modalities. Colors and shapes, for example, may be “seen” while listening to music.

    The point here is not that one region intrudes into the territory of another, but rather that the connections between brain regions are extraordinarily rich and multifaceted: activation of one region (say, the one responsible for recognizing letters) also directly activates another, for example the one involved in recognizing color. There may be a blocking mechanism between different parts of the cortex that normally enforces a clear segregation of feedback and so avoids any ambiguity; in synesthesia, evidently, this normally impassable barrier is breached. If feedback signals returning from the later stages of multisensory processing are not interrupted in the usual way, they can influence the earlier stages of processing to the point where auditory signals begin to activate visual areas.

    In any case, the existence of synesthesia, along with the compensation for lost channels of perception by the strengthening of others, leads us to an unavoidable but intriguing paradox: while the subjective experience of sensory perception is highly diverse and individual, the neuronal mechanisms that mediate the act of perception are standardized and interchangeable. As soon as a signal from the outside world is converted into volleys of action potentials, its echoes are instantly dispatched to different parts of the brain, where they arrive in their respective areas of the cortex, areas that are nevertheless similar in structure and in their principles of signal processing. Everything, it seems, is cut from the same pattern.

    So what accounts for the qualitative difference in subjective experiences? How does the subjective experience of one particular modality arise? What is the reason for such selective sorting, if the physiological mechanisms of processing are almost identical? Answering these questions will help us understand the connection between the objective and the subjective, the physical and the mental.

    THE FIVE SENSES: THE TEMPORAL PROPERTIES OF THE BRAIN


    The second problem concerns time: signals from different sensory systems are processed in the brain at different speeds, and yet you experience the whole ensemble of sensations as simultaneous. You can hear a clap and see the palms meeting, and you will perceive these events as simultaneous, even though auditory processing is faster than visual processing. And if at that moment you also feel a tactile sensation on your face, say a touch on the tip of your nose, all these events will merge into one multimodal moment of consciousness, although the signal from your nose reaches the brain fastest of all, since it travels a much shorter distance. This implies the existence of time windows that underlie what seems to be a single moment of consciousness: a window is the time during which sensations can catch up with one another and unite into the familiar multisensory whole we call a “moment of consciousness”. Your brain must somehow synchronize events. To line up all the different sensory modalities, it has to allow for the appropriate delays, and the slowest sensory signal, naturally, sets the pace.
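    Purely as an illustration (this sketch is not from the book), the idea of a binding window can be written down in a few lines of code: each signal arrives with its own latency, and a single “moment” can only be assembled once the slowest signal is in, provided they all fall within the window. The latency figures and the window width below are arbitrary assumptions.

```python
# Illustrative sketch only: binding sensory signals that arrive with different
# latencies into a single multisensory "moment". All numbers are arbitrary
# assumptions chosen for demonstration, not measured values.

SENSORY_LATENCY_MS = {"touch on nose": 20, "hearing": 50, "vision": 80}
BINDING_WINDOW_MS = 200  # hypothetical width of the integration window

def bind_into_moment(event_time_ms: float) -> dict:
    """Return when each signal becomes available and the earliest time
    a unified moment could be assembled (set by the slowest signal)."""
    arrivals = {sense: event_time_ms + lag
                for sense, lag in SENSORY_LATENCY_MS.items()}
    spread = max(arrivals.values()) - min(arrivals.values())
    if spread > BINDING_WINDOW_MS:
        raise ValueError("signals arrive too far apart to be bound together")
    return {"arrivals_ms": arrivals,
            "moment_ready_at_ms": max(arrivals.values())}

if __name__ == "__main__":
    # A clap at t = 0 is felt, heard and seen at slightly different times,
    # yet all three fall inside one window and are experienced as one event.
    print(bind_into_moment(0.0))
```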

    It turns out that these time windows can span up to several hundred milliseconds. “We are not aware of the actual moment of the present. We are always a little late.” Almost half a century ago the brilliant physiologist Benjamin Libet came to this conclusion while examining patients in the neurosurgery department of a local hospital who had an opening in the skull giving access to the cortex. In one experiment Libet used an electrode to stimulate particular parts of the brain, which caused the patient to feel a tingling sensation in various parts of the body. The patient did not report being aware of the stimulus for a strikingly long time: as much as 500 milliseconds. That half second is an eternity on the scale of brain processes, given that an action potential lasts only about a thousandth of a second. Libet also demonstrated that when stimulation was applied to a distant part of the body, the foot for example, a considerable time passed between the moment the event was registered in the brain and the moment the patient became aware of it. And it is not just that a time window exists to guarantee that even the slowest signals are processed in time: conscious awareness seems to come even later still. Studies show that when subjects sort images presented in random order into categories (say, “animals” and “vehicles”), the brain registers the difference at an early stage of processing, while the “conscious” decision occurs much later, after roughly 250 milliseconds. These intervals evidently provide the optimal time for neural ensembles to form and dissolve.

    Neurons within an ensemble do not work like isolated telephone cables, each independently transmitting information. Instead, the ensemble is a self-organizing, holistic structure that persists for hundreds of milliseconds. The region of this self-organization slowly spreads out from its epicenter, like ripples, and only when it reaches a substantial extent can we speak of a moment of consciousness. Seen this way, it no longer seems surprising that the process takes up to half a second.

    But the problem of space remains unsolved. It is still unclear how the location of the corresponding cortical structures relates to the subjective differences between hearing and vision. It may be that the differences in how we perceive different modalities are somehow connected with differences in the properties of the neural ensembles of the visual and auditory cortex, differences that reveal themselves only over a certain period of time. If that were so, we could characterize the phenomenology of hearing and vision by some criterion of objective physiology. But how do we identify such a criterion?

    It is still very difficult to map phenomenology onto what we objectively observe in the brain. I do, however, have one suggestion. In physiological terms, vision primarily (though not exclusively) captures differences in the spatial arrangement of elements, while hearing primarily (though not exclusively) captures temporal differences. The spatial features of neural ensembles, and how they change over a given period of time, could then help us develop a new addition to the neuroscience toolkit. Ideally, we would arrive at a unified spatio-temporal criterion, a kind of phenomenological mathematical equation, which could also be applied to the description of subjective consciousness.

    MULTISENSORY PERCEPTION


    But how does consciousness really work? Is perception a single thing, or should each of the five senses be considered separately? Everyone would agree that there are five distinct types of sensation, so it would seem reasonable to conclude that consciousness is likewise divided, and that the brain maintains five independent processing channels, clearly distinguishing five separate categories of sense, which then contribute to the formation of consciousness. This reasoning may seem crude and simplistic, but, as we know, it was the view held by the late Francis Crick and his colleague Christof Koch, who sought to identify the neural correlate of consciousness separately for visual perception, on the assumption that it can exist fully independently of the other senses.

    Back in 1978, a new approach to learning was developed on the basis of this idea. The idea was to distinguish three “learning styles”: visual (“V”), auditory (“A”) and kinesthetic (“K”), hence “VAK”. VAK was originally proposed by the American educators Rita and Kenneth Dunn more than thirty years ago as a way of explaining individual differences in children's learning abilities, and teaching methods were developed on its basis to optimize the learning process. But the theory went much further, suggesting that some people are by their very nature predominantly “visual”, others “auditory”, and still others “kinesthetic”.

    Yet not a single independent study has found support for the VAK theory, and the only factor that seems to influence the results of the corresponding teaching methods is the teacher's enthusiasm. Why, then, did this theory seem so attractive for so long? The explanation again lies in the deceptive notion of autonomous brain structures, a set of “modules” each performing its own independent function. Over millions of years of evolution many specialized structures have arisen and been refined in the brain, and modern humans have adapted many of them to perform the most complex cognitive functions. What undermines the VAK theory, however, is that these functional modules work properly only when interconnected; they cannot function in isolation.

    An experiment by the cognitive neuroscientist Stanislas Dehaene serves as confirmation. He asked his subjects to carry out a series of simple arithmetic calculations during a brain scan: for example, subtract seven from a hundred, then subtract seven from the remainder, and so on. When Dehaene examined the resulting images to identify areas of significant activity, it turned out that a good dozen different brain areas are involved in even these simple calculations. In other words, yet another study showed that the brain always functions as a whole.
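    For reference, the task itself (“serial sevens”) is trivial to write down; the point of Dehaene's result is not the arithmetic but how many brain regions even this simple loop recruits. A toy sketch, purely illustrative:

```python
def serial_sevens(start: int = 100, step: int = 7):
    """Generate the 'serial sevens' sequence used in the scanner task."""
    value = start
    while value >= step:
        value -= step
        yield value

# 93, 86, 79, ... down to 2
print(list(serial_sevens()))
```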

    From incoming visual signals the brain builds spatial “maps” of the world. This is true even for people who have been blind from birth: their brains build such maps too. The blind, obviously, obtain the initial information not visually but by relying on touch and sound, yet these data are processed in much the same way as in sighted people. There is, in other words, a multisensory, cross-modal process in which information, whether kinesthetic, auditory or visual, is interlinked and built up into a single informational picture of the world.

    You may have noticed that reading someone's lips helps you to hear what they are saying, even against strong background noise. Multisensory stimuli increase the efficiency of information processing, even in areas of the cortex that are tuned to the initial processing of a single sensory modality.

    Although we can distinguish five different senses, our brain nonetheless usually perceives the picture as a whole. Every kind of thinking includes an element of abstraction. Whatever the sensory input through which we receive information, consciousness places the emphasis on meaning. A good example of such “abstraction” is a walk through a morning forest: breathing in the cool, moist air, watching the play of sunlight, listening to the rustle of the treetops, what you feel above all is peace and tranquility. You feel no need to separate out the individual sensations. The moment of consciousness is more than the sum of its components.

    There is, however, a view that the different modalities of perception correspond to different “amounts” of consciousness. Vision takes the largest share, followed by taste, touch, hearing and, finally, smell. But the term “consciousness” here may be misleading. Consciousness involves not only the vividness of direct sensory experience but also the contribution of personal meaning. As the anthropologist Clifford Geertz put it so well: “Man is an animal suspended in webs of significance he himself has spun.” It is therefore worth re-ranking the senses not so much by their “quantity” of consciousness as by context and meaning.

    Take vision, which is surely the most concrete and least abstract of the senses. The world around us consists of silhouettes, patterns, shades of highlight and shadow, and all these colorful shapes usually carry a clear meaning for us. What you see, as we discussed in the previous chapter, invariably “means” something personal to you; there is always a context. When you look around, you do not just see abstract colors and shapes, you gain access to your personal memories, associations and feelings at a particular point in your life: this stone will be relatively large.

    Next comes taste. Again, the context will be clear: you feel the very specific properties of a food or drink. One of the factors that determines taste is comparison. In one study, subjects rated a sample of lemonade on how sweet or sour it was. After the first tasting, the volunteers were offered another sample of lemonade that contained less sugar and more lemon juice. When the turn came for a third drink, which was in fact identical to the first sample, most people rated it as the sweetest of the three. The presentation of a dish, its texture, its temperature and so on can all strongly influence its taste. And since taste depends so heavily on the accompanying sensations, together they will determine the context and, therefore, the experience itself.

    Vision and taste are considered to be 90 per cent and 80 per cent “conscious” respectively, though a more accurate term would be context-dependent. The exact percentages mean little in themselves; what matters is their relative weight compared with the other senses. Touch is much less context-dependent. The touch of velvet, silk, bark or bare skin can be felt in a wide variety of situations, but usually what matters to you is the sensation itself, here and now, while the wider context into which the object fits is less significant. More attention now goes to the direct sensation of contact with the surface: this stone is much smaller, and the force of the throw becomes extremely important.

    Next comes hearing. Compared with vision, taste and touch, hearing is more passive and less context-dependent. Sound finds you, not the other way round. Fewer of those webs of meaning are needed. It is hearing that disappears last under general anesthesia, and the first sense to return when the patient awakens. This stone is small, and the force of the throw is paramount.

    Finally, smell. Of all the senses it is the freest of context. Interestingly, loss of smell is one of the earliest signs of Alzheimer's disease, because the pathway connecting the nose to the brain leads directly into the “limbic system”, an extensive cluster of brain structures associated with the early stages of memory formation and, most importantly here, with the emotions. It is not surprising, then, that a smell can evoke such strong and immediate emotions, being the most primitive of all the senses. Smell is undoubtedly a powerful, primitive stimulus that lets an animal instantly determine whether something is edible, wounded or sexually receptive, and allows it to track prey over long distances. It is logical that in humans this sense, tied as it is to instant, instinctive reactions, has become blunted. And yet, compared with other species, larger areas of the human brain are given over to olfactory perception; the quantity and quality of olfactory processing in humans may be oriented more towards the formation of memories. Even for us, however, the “subconscious” effects of smells should not be underestimated.

    Take pheromones. These insidious chemicals are found throughout the animal world in a wide range of contexts, from marking the boundaries of a territory to signaling readiness to mate. In humans, pheromones act mainly as a regulating factor in social and sexual behavior. Although the mechanisms by which pheromones act remain controversial, there is evidence that these chemicals really do have a surprising effect on us. Using smell alone, for example, people can identify blood relatives. Mothers can recognize their children by the smell of their bodies, and vice versa. Children can likewise recognize their siblings, an ability that may serve to prevent incest. Evidently this primitive sense of kinship rests on non-cognitive factors. In this case the stone itself is quite small, but it is thrown with considerable force.

    Activating the different senses is like throwing stones of different sizes with different force. Vision, with its strong dependence on context, corresponds to a large stone that need not be thrown very hard, while smell lies at the opposite extreme: a strong, raw sensation without an immediate and obvious context, a tiny stone. Yet such a stone, thrown with great force, can still send tangible ripples across the water. Perhaps one of the best examples of this is music.

    BRAIN AND MUSIC


    The dictionary defines music as “vocal or instrumental sounds (or both) combined in such a way as to produce harmony, beauty of form and the expression of emotion”. Yet this definition does not capture the colossal significance of music for our species. Consider that the music industry actually accounts for a larger share of the economy than the pharmaceutical industry. In the centuries-old debate about what makes us human, some scientists argue that limited sign-language skills are possible in other, specially trained primates; but none of these fellow creatures has ever been observed to create music and enjoy it the way a human being can.

    Music is an integral part of our lives. But does it have evolutionary value, or is it a “by-product of evolution”, “auditory cheesecake” as the psychologist Steven Pinker put it, pleasant enough but hardly evolutionarily significant? On that view it would affect the brain's reward systems in much the same way as recreational drugs, hijacking mechanisms that originally evolved to serve survival needs such as food and sex.

    Among the factors that give music its significance we can count social cohesion and the development of perception and motor skills. Robin Dunbar, an anthropologist at the University of Oxford, puts music and dance on a par with religion and folklore: these are phenomena that foster “social cohesion”. He believes that without music and dance, social bonding would never have reached the strength and sophistication it has in humans. We are capable of entering into complex social interactions, and music is sometimes the direct link. Perhaps the value of music derives from ritual: rituals provide a structured community and a strong bond between generations that is not found in any comparable form in other species.

    At first glance music seems too culturally diverse to be an integral part of life. However, Ian Cross, a musicologist at the University of Cambridge, argues that all kinds of music share one common feature: “regular, periodic temporal organization”. It is no surprise that instruments providing a basic rhythm, such as rattles, shakers and drums, were among the first that people created. But why is rhythm so important?

    The significance of rhythm in the interaction between an adult and an infant is that the infant focuses its attention on, and responds to, temporal cycles in the sound of the adult's voice and in the adult's movements that would not be available in ordinary speech. This is the most widespread form of communication with a child in the world: a mother bounces a baby on her lap, holding its hands and singing or reciting a rhyme. Along with training sensory-motor coordination, this game is also an experience of interpersonal interaction and a reinforcement of communication skills. Cross describes music as “the natural desire for socio-cultural learning, which begins in infancy”.

    No doubt everyone will agree that music inevitably entails movement. If, as we saw in Chapter 3, “thinking is movement confined to the brain”, then music awakens that movement. Exposure to music certainly helps the brain to develop.

    Taken together, these data suggest that music can stand in for biologically beneficial stimuli, mimicking pleasant experiences such as the pleasure of eating chocolate or of taking drugs such as cocaine. At the same time, a drop in the activity of the amygdala means that a positive feeling can also arise from the blocking of fear responses, and it is interesting that music has been noted as one of the few positively stimulating stimuli capable of reducing activity in this area of the brain. Thus the “pleasure” of music may be driven by both push and pull: on the one hand the drive to activate the brain areas associated with pleasure, and on the other the drive to be rid of fears and negative emotions.

    No wonder, then, that so many areas of the brain are activated when we listen to music. The rhythm, pitch and harmony of music provide a repeating cycle of expectation and reward. We can also add the cerebellum to the long list of brain regions involved. This structure, a kind of mini-brain, is a characteristic feature of all vertebrates; it serves as an “autopilot”, carrying out sensory-motor coordination of the most automated kind. If you find yourself “unconsciously” tapping out a rhythm, that is most likely a greeting from your cerebellum.

    Needless to say, if music evokes strong emotions, it would be strange if dopamine were not involved. Dr Valorie Salimpoor and her team at the Rotman Research Institute in Toronto reasoned that if music can produce feelings of euphoria and sensations akin to the anticipation of reward, which are mediated by the dopaminergic system, then listening to music should promote the release of dopamine. Her team subsequently confirmed this prediction.

    It has long been known that activity in the right hemisphere correlates strongly with emotion, and the fact that this part of the brain is sensitive to music suggests a connection between emotion and musical tone. This is quite plausible, since melody in music can be seen as an analogue of tone in human speech, which in turn signals emotional coloring: musical tones may simply be exaggerations of ordinary speech intonation. Further similarities between music and language are even more obvious: both phenomena are unique to our species, and both reflect the characteristics of different cultures and historical eras. Both have clear, culturally dependent rules and a framework for expression.

    But there are also significant differences between the two, indicating that they complement rather than duplicate each other. Whereas spoken language originally arose to support effective interaction with a small number of people, music is a more collective way of conveying information. Most importantly, music is not confined to describing specific facts or ideas. Moreover, music can evoke emotions without evoking memories: we have already seen that hearing is among the least context-dependent of the senses. As the well-known neurologist Oliver Sacks eloquently put it: “Music has no concepts and makes no propositions; it has no power of representation; it has no necessary relation to the world.” The words of David Huron are even more concise: “Music can never achieve the unambiguous clarity of language.”

    Language and music, then, are two sides of the same coin, and possession of that coin is a unique privilege of our species. Music gives us the chance to experience life more brightly, more clearly and in more facets, while language is needed to refer to things that cannot be detected directly with the senses. Music absorbs us, but it does not demand an immediate response the way a parachute jump or white-water rafting does.

    But if music plunges us into a state of “here and now” that is nevertheless different from the interactivity of, say, downhill skiing, then, returning to our allegory, how wide are the ripples on the water's surface?

    You, however, pay no attention to any of these tricks your brain is performing. You only feel Mozart filling your ears, brazenly seizing your consciousness and stirring up a storm of sensations in an arbitrary, illogical sequence, while your eyes, arms and legs seem to exist on their own. But suddenly something intrudes into your mind. Your precious inner world filled with music now recedes into the background: you have arrived at the door of your office.
