Amazon will teach its AI to recognize human emotions



If people are ever to truly trust AI in their homes, it must learn to distinguish their emotions: to understand what they need not only from their explicit requests, but also from their general mood and well-being, so that even when a person says or does nothing, a smart assistant can read the situation from indirect cues. So says Rana el Kaliouby, founder of the startup Affectiva, which is working on exactly this problem. Her stated goal is to "teach a computer to tell the difference between a smile and a smirk."


There are three signals a machine must read to truly understand a person's mood: facial expression, tone of voice, and the sharpness (or relaxedness) of movement. Engineers have already handled the first one well: anyone can use, say, Microsoft's emotion-recognition API and teach a robot to tell in real time whether you are sad or happy. But for the other two "emotional indicators" there were, until recently, no solutions on the market.
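For illustration, here is a minimal Python sketch of that facial-expression part, assuming the classic Azure Face "detect" call with the emotion attribute as it was documented at the time; the region placeholder and subscription key are hypothetical:

```python
# A minimal sketch, assuming the classic Azure Face "detect" endpoint with emotion attributes.
# ENDPOINT and KEY are placeholders; the "requests" package is required.
import requests

ENDPOINT = "https://<your-region>.api.cognitive.microsoft.com/face/v1.0/detect"
KEY = "<subscription-key>"

def dominant_emotion(image_path: str) -> str:
    """Return the highest-scoring emotion label for the first face in the image."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    resp = requests.post(
        ENDPOINT,
        params={"returnFaceAttributes": "emotion"},
        headers={
            "Ocp-Apim-Subscription-Key": KEY,
            "Content-Type": "application/octet-stream",
        },
        data=image_bytes,
    )
    resp.raise_for_status()
    faces = resp.json()
    if not faces:
        return "no face detected"
    scores = faces[0]["faceAttributes"]["emotion"]  # e.g. {"happiness": 0.92, "sadness": 0.01, ...}
    return max(scores, key=scores.get)

# print(dominant_emotion("webcam_frame.jpg"))  # -> "happiness", "sadness", "neutral", ...
```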




At the same time, understanding human emotions promises huge profits for business. It lets you spot, in the moment, who is ready to be sold something, and it is a way for a brand to appear more "human" to its customers. According to el Kaliouby, two-thirds of the Fortune 500 are interested in measuring the emotional impact of their ads on viewers (so as to avoid a fiasco like Pepsi's "stop the violence" commercial). But there have been no technologies that let them check people's emotions before and after viewing.


Emotions have therefore become the next big target for the large companies. Google has patented a system that detects when a user is out of sorts; if at that moment the user is struggling with some site, program, or service, it would automatically offer ways to fix the problem.


IBM recently filed a patent on a technology for search engines (apparently in the hope that it will catch on some day and Google and Bing will pay it billions). The patent describes serving different results to the same person depending on their current emotional state. If someone is sad, the query "events near me" would return a classical music concert in the top spot rather than a circus with clowns. "Interesting podcasts" or "good music" would likewise return completely different results. A user in an aggressive state who asks "where can I buy a gun?" would be shown only children's toys. And if a person appears depressed, the query "tallest bridge within 10 km" could automatically place a call to a support hotline. IBM assumes emotions would be identified from a webcam image.
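The patent itself is not code, but the core idea can be sketched in a few lines; the mood tags and scores below are invented purely for illustration, not taken from IBM's filing:

```python
# An illustrative sketch only (not IBM's implementation): re-rank search results by how
# well each result's mood tags match the user's detected emotional state.
from dataclasses import dataclass, field

@dataclass
class Result:
    title: str
    mood_fit: dict = field(default_factory=dict)  # invented per-result scores, e.g. {"sad": 0.8}

def rerank(results: list, user_emotion: str) -> list:
    """Put results that suit the current emotion first; ties keep the original order."""
    return sorted(results, key=lambda r: r.mood_fit.get(user_emotion, 0.0), reverse=True)

events = [
    Result("Circus with clowns", {"happy": 0.9, "sad": 0.1}),
    Result("Classical music concert", {"happy": 0.4, "sad": 0.8}),
]
# For a user detected as sad, the classical concert comes first.
print([r.title for r in rerank(events, "sad")])
```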


Spotify has gone furthest of all. Since 2014 it has been associating particular music with your emotions and giving advertisers the option of "emotional targeting." Those who listen to Adele's intimate songs may notice they are offered ice cream or tissues more often, while a playlist full of hip-hop brings ads for sneakers, pizza, or barbecue sauce.



Amazon patent illustration

Now Amazon has joined the friendly company of developers of "human emotion readers." The technology giant has patented an artificial-intelligence system that recognizes the user's illnesses and mood from the timbre of their voice (compared against previous recordings) and from their query history. It is expected to be built into the Alexa voice assistant in the future.


After examining the pages a person has visited and analyzing their questions to Alexa, the neural network will be able to determine their state, for example, whether they are bored or sad. The conclusion is additionally confirmed by the sound of the voice, if its timbre and tone have changed compared to usual. The smart speaker can then ask what the owner would like to do, or even suggest putting on suitable music or a movie. Even just hearing "Oh, it sounds like you're in a good mood!" is already worth a lot, and Amazon knows it.
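Amazon's exact method is not disclosed beyond the patent, but the general idea of comparing a new utterance against a user's usual voice profile can be sketched with standard audio tooling; the features, file names, and threshold below are assumptions, not the patented algorithm:

```python
# A conceptual sketch, not Amazon's patented method: compare simple voice features
# (average pitch plus a rough MFCC timbre profile) of a new utterance against a stored
# baseline for the same user; a large deviation is treated as a possible mood/health change.
import numpy as np
import librosa

def voice_profile(path: str) -> np.ndarray:
    """Average pitch and mean MFCCs as a crude 'how this person usually sounds' vector."""
    y, sr = librosa.load(path, sr=None)
    f0 = librosa.yin(y, fmin=65, fmax=400, sr=sr)        # rough human speech pitch range
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)
    return np.concatenate([[np.nanmean(f0)], mfcc.mean(axis=1)])

def sounds_unusual(new_clip: str, baseline: np.ndarray, threshold: float = 3.0) -> bool:
    """True if the new utterance differs markedly from the user's usual voice profile."""
    return float(np.linalg.norm(voice_profile(new_clip) - baseline)) > threshold

# baseline = voice_profile("usual_voice_sample.wav")
# if sounds_unusual("todays_request.wav", baseline):
#     print("Voice sounds different from usual -- tired, upset, or hoarse?")
```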


The Amazon patent, approved by the US regulator, describes several situations in which the technology could be used. For example, if you ask Alexa to order food, she might tell from the characteristic tone of your voice that you are celebrating something and, instead of the usual pizza, bring you a cake or another treat.


Another scenario describes a singer who has paid Amazon so that Alexa will play their songs if she senses the user is tired or bored. And then, of course, there is illness, which will apparently be the main application of the new system. From the characteristic hoarseness of your voice, Alexa will be able to tell that you are unwell and immediately offer some "very good" cough drops or throat medicine. In June Amazon bought the digital pharmacy PillPack and is now actively trying to enter the drug market, which in the US alone is estimated at $400 billion a year. With a voice assistant actively recommending your products on your side, business will, of course, move faster.




As planned, Alexa will be able to recognize happiness, joy, anger, sorrow, sadness, fear, disgust, boredom, and stress, and respond to commands with that context in mind. In the future Amazon is expected to use the technology to help diagnose mental disorders as well. But the online store warns right away that it will not replace a visit to the doctor: the AI will only suggest a couple of over-the-counter medicines and recommend whom to contact (a recommendation that a local clinic may well have paid for in advance).
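As a toy illustration of "responding with context": the emotion labels below follow the reported list, but the reply logic is entirely invented and is not Amazon's system.

```python
# A toy illustration of emotion-aware responses; the reply logic is invented.
def respond(command: str, detected_emotion: str) -> str:
    """Adapt the reply to a food-ordering command based on the detected emotion."""
    if detected_emotion in {"happiness", "joy"}:
        return f"Sounds like a celebration! Add a cake to the usual {command}?"
    if detected_emotion in {"sorrow", "sadness", "boredom"}:
        return f"Ordering {command}. Want me to put on a comedy while you wait?"
    if detected_emotion in {"anger", "stress", "fear", "disgust"}:
        return f"Ordering {command}, keeping it short."
    return f"OK, ordering: {command}"  # neutral or unrecognized state

print(respond("pizza", "joy"))      # celebratory upsell
print(respond("pizza", "sadness"))  # softer reply plus a distraction
```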


Targeted advertising has traditionally been built on demographics. On social networks, cosmetics and jewelry are shown more often to women, grills to men, anti-acne products to teenagers, heart medications to older people. Algorithmic profiling lets you narrow the audience down to, say, single mothers with a university degree under the age of 25. But all of these categories are fairly static, while shopping is an impulsive thing. Amazon's new technology will be able to react dynamically to changes in our mood, offering whatever is most relevant right now. It will target not the category we belong to in principle, but who we are at this moment. It seems we are slowly entering the age of advertising targeted at emotions (a minimal sketch of the difference follows below).
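The shift can be put in a few lines: the static demographic segment stays fixed, while the ad creative changes with the detected mood. Segments, moods, and creatives here are all invented for illustration:

```python
# A conceptual sketch: same static segment, different current mood -> different ad.
AD_CATALOG = {
    ("young_adult", "happy"): "new sneaker drop - treat yourself",
    ("young_adult", "sad"):   "comfort-food delivery, 20% off tonight",
    ("parent", "stressed"):   "15-minute meal kits, delivered tomorrow",
    ("parent", "happy"):      "weekend family park passes",
}

def pick_ad(segment: str, current_mood: str) -> str:
    """Choose a creative for this segment right now, with a generic fallback."""
    return AD_CATALOG.get((segment, current_mood), "seasonal sale - up to 30% off")

print(pick_ad("young_adult", "happy"))
print(pick_ad("young_adult", "sad"))
```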


