Three of the most popular misconceptions about emotions in Affective Computing

    Today, emotions play an increasingly important role in business. At some point, people realized that a buyer makes a purchasing decision based not only on what they think about a product, but also on how they feel about it. That is why companies are actively trying to add an emotional dimension to their work: to analytics, service, and technology.

    The era of unemotional rationalism ended for humans long ago, but for modern machines the dawn of emotional intelligence is only now arriving. Over the past ten years we have seen the rapid development of emotional technologies, an area commonly referred to as affective computing. But where there are emotions, there are always plenty of mysteries.

    Some erroneous clichés are especially common. Here we examine the three most popular myths about emotions in Affective Computing that are actively spreading in business and the media.

    Myth 1: Paul Ekman's Legacy

    In short, Paul Ekman's theory can be summarized as follows: when it comes to facial expressions, people are able to express and recognize a certain set of emotions, which he calls "basic". Regardless of where we are and with whom we speak, we can always recognize when our interlocutor shows one of five* emotions: anger, fear, disgust, joy, sadness.
    * After the theory was revised, surprise was excluded from the list of basic emotions.

    The basic emotions of Paul Ekman (plus a neutral state) performed by Tim Roth, the actor who played the main character of the series “Lie to Me.”

    James Russell, one of the first critics of Ekman's theory, rejected the idea of the universality of emotions. He believed that the relationship between the face and emotion is not as straightforward as Ekman originally claimed, and that emotions take on meaning depending on the context. Later, in her book Emotions and the Body, Beatrice de Gelder wrote that fMRI experiments found no neurological basis confirming the universality of particular emotions.

    Not so long ago, one of the most prominent critics of the theory of basic emotions, the well-known neuroscientist Lisa Feldman Barrett, argued that emotions are not innate but are qualities acquired through experience: the understanding of emotions manifests differently across people and cultures. There is a series of studies in which a research team traveled to Namibia to examine how members of the isolated Himba people recognize joyful, sad, angry, frightened, or neutral facial expressions. They had no trouble perceiving positive emotions, but the Himba often confused anger and disgust. Similar experiments with other communities produced similar results. This allowed Barrett to conclude that our explanation and understanding of emotions is rooted in culture: we give similar names to things that in reality mean different concepts.

    Although in 2011 Ekman changed his definition of "emotions" to include cultural and individual aspects, and even excluded one of the basic emotions, many companies still base their work in affective computing on the old theory. They still build the concept of "basic emotions" into their databases, and, according to Lisa Feldman Barrett, it is this approach that will become their Achilles heel. However, she adds, if external and internal context is taken into account, the technology has great potential to revolutionize the science of emotions.

    Laboratories and companies that work with emotional analytics should not treat emotions as universal. First, affective datasets must be specific: since they are used to train algorithms, they must include information about culture, language, gender, and even age in order to determine emotion correctly. Second, emotion recognition algorithms must be context-sensitive. It is worth noting that some laboratories have tried to take context into account, but not a single "big" affective-computing company has yet made such an attempt.
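    To make the point about dataset specificity concrete, here is a minimal sketch of what a single record in a context-aware affective dataset might look like. All field names, values, and the structure itself are illustrative assumptions, not a description of any real company's data format.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class AffectiveSample:
    """One labeled sample in a hypothetical context-aware affective dataset."""
    video_path: str        # path to the raw facial recording
    emotion_label: str     # annotated emotion, e.g. "joy"
    culture: str           # cultural background of the subject
    language: str          # language spoken during the recording
    gender: Optional[str]  # self-reported gender, if available
    age: Optional[int]     # subject age, if available
    social_context: str    # e.g. "alone", "conversation", "group"

sample = AffectiveSample(
    video_path="clips/0001.mp4",
    emotion_label="joy",
    culture="Himba",
    language="Otjiherero",
    gender="female",
    age=27,
    social_context="group",
)
print(sample.culture, sample.social_context)
```

    The point is not the specific fields but the principle: without this metadata, an algorithm trained on the dataset has no way to condition its predictions on culture or situation.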

    Myth 2: A smile is an indicator of happiness

    Ekman's theory also led to the natural conclusion that an expressed emotion is related to the feeling a person is actually experiencing.
    For example, a smile, which algorithms detect most easily, may carry different meanings: happiness, joy, satisfaction, support, and so on. Hence the question: what is its function?

    In a recent study [1], subjects were asked to solve nine difficult problems displayed on a monitor. When participants managed to give the correct answer to one of these tasks, they smiled, even though there was only a computer screen in front of them. At the same time, the theory of social displays holds that the function of a smile can vary depending on whether a person is alone or in a particular social environment.

    In affective computing, at least in its commercial form, modern recognition technologies can analyze emotions only in isolation from the social context. Thus, in order to truly understand the meaning of a smile, we must teach the machine to distinguish emotions across situations, both social and solitary. The way we express happiness depends on the context: sometimes we smile, and sometimes we don't. That is why the nature of emotions should be approached more seriously. Analysis of facial expressions can be combined with acoustic parameters, body-movement analysis, or physiological signals; this approach is called multimodal emotion recognition.
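    A toy sketch of the simplest multimodal approach, late fusion: each modality independently produces a probability distribution over emotions, and the distributions are combined with a weighted average. The emotion set, the weights, and the scores below are invented for illustration; real systems learn the fusion rather than hard-coding it.

```python
# Emotion set follows the "basic emotions" list discussed above,
# purely for illustration.
EMOTIONS = ["anger", "fear", "disgust", "joy", "sadness"]

def fuse_modalities(scores_by_modality, weights):
    """Weighted average of per-modality emotion probabilities (late fusion)."""
    fused = {e: 0.0 for e in EMOTIONS}
    total = sum(weights[m] for m in scores_by_modality)
    for modality, scores in scores_by_modality.items():
        w = weights[modality] / total  # normalize over modalities present
        for emotion, p in scores.items():
            fused[emotion] += w * p
    # predicted label is the emotion with the highest fused probability
    return max(fused, key=fused.get), fused

label, fused = fuse_modalities(
    {
        "face":  {"anger": 0.1, "fear": 0.1, "disgust": 0.1, "joy": 0.6, "sadness": 0.1},
        "voice": {"anger": 0.2, "fear": 0.1, "disgust": 0.1, "joy": 0.3, "sadness": 0.3},
        "body":  {"anger": 0.1, "fear": 0.2, "disgust": 0.1, "joy": 0.5, "sadness": 0.1},
    },
    weights={"face": 0.5, "voice": 0.3, "body": 0.2},
)
print(label)  # joy
```

    Even this crude scheme shows why multiple channels matter: a voice that contradicts the face pulls the fused estimate away from a naive face-only prediction.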

    Myth 3: Body "language"?

    So, we have come to the conclusion that emotions are not universal, that the concept of "basic emotions" is controversial, and that the manifestation of emotions is directly tied to cultural, individual, and contextual aspects. Since the expression of emotions is not limited to the face but also includes voice, body movements, interpersonal distance, and various physiological signals, the picture becomes even more complicated.

    Just as people watch an interlocutor's face to judge whether they are being deceived, they also watch the body. Body movements have been linked to almost anything: the best-known claims are that a person touches their mouth when lying, or adopts an open posture when feeling calm and safe. This theory became so widespread that its echoes reached stress management, security screening, and even cinema.

    For example, airport security has always been a priority. The first automatic behavior detection systems were installed at US airports at the end of the 20th century, and they have since spread around the world. Typically, the probability that a passenger is potentially dangerous is calculated from key characteristics associated with high risk. Yet to this day, many scientists argue that no particular psychological traits characteristic of terrorists have been identified [2]. Similarly, the correlation between how a person moves and whether they are lying at that moment is not as straightforward as popular psychology asserts.

    Whether the popular version of body "language" that reveals a person's true feelings actually exists is more than debatable. Of course, links can be drawn between non-verbal cues and emotional behavior. Today there is a whole family of body-tracking technologies; in Affective Computing, tracking is used to collect statistics on the relationship between movements and particular emotions.
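    Such statistics-gathering can be sketched as a simple co-occurrence tally: for every tracked movement, count how often each annotated emotional state accompanies it, and check whether the claimed link holds. The observations below are invented toy data, not results from any real study.

```python
from collections import Counter

# Toy (movement, annotated_state) observations from a hypothetical
# body-tracking session; invented purely for illustration.
observations = [
    ("touch_mouth", "neutral"), ("touch_mouth", "deception"),
    ("open_posture", "calm"), ("open_posture", "calm"),
    ("touch_mouth", "neutral"),
]

cooccurrence = Counter(observations)

def freq(movement, state):
    """Conditional frequency of an annotated state given a movement."""
    total = sum(c for (m, _), c in cooccurrence.items() if m == movement)
    return cooccurrence[(movement, state)] / total

# In this toy sample, touching the mouth accompanies deception only
# a third of the time - hardly a reliable marker on its own.
print(freq("touch_mouth", "deception"))
```

    This is exactly the kind of base-rate check that popular body-language claims tend to skip: a gesture that sometimes accompanies lying is useless as evidence if it accompanies honest behavior just as often.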


    Affective computing is an amazing but difficult area, both for science and for business. It truly sits at the cutting edge of high technology. Yet in many cases, the approach to using emotion recognition in commerce remains old-fashioned: some are drawn by the authority of the founder of a famous approach, while others limit themselves to goals that are easy to achieve.

    Of course, everyone would like to be able to "read" emotions like the protagonist of the TV series "Lie to Me." But one should not forget that emotions are far more complex and mysterious, and one should not slide into phrenology and palmistry.

    We have discussed the three most common myths about emotions in Affective Computing. It is important to eradicate such misconceptions so that these technologies can serve humanity with precision and impartiality.

    Coauthor: Olga Perepelkina, Chief Researcher, Neurodata Lab.

    References:

    [1] Harry J. Witchel et al. The 36th European Conference on Cognitive Ergonomics (ECCE'18), 2018. DOI: 10.1145/3232078.3232084

    [2] Yu-Jun Zheng et al. Airline Passenger Profiling Based on Fuzzy Deep Machine Learning (2016). DOI: 10.1109/TNNLS.2016.2609437
