How rude humanoid robots can fool you

Original author: Matt Simon
  • Translation

This tiny humanoid robot is called Meccanoid, and it is kind of a jerk. An unsuspecting participant asks the robot: if you were making friends with someone, what would you want them to know about you? “What bores me,” says Meccanoid.

Okay, let's start over. A new participant asks Meccanoid the same question, only now the robot has been programmed to be polite. What would this robot want a friend to know about it? “I already like them very much,” says Meccanoid. Much better.

Researchers in France have been subjecting test participants to rude and polite humanoids for a reason: they are studying how a robot's attitude toward people affects people's ability to perform certain tasks. They published their study in August 2018 in the journal Science Robotics, in an issue that also included a study of whether robots can nudge children toward certain decisions. Together, the pair of studies shows how far the development of social robots has outpaced our understanding of how we perceive them.

Let us return to Meccanoid. Participants began with a task in which they had to name the color a word was printed in, rather than the word itself. For example, the word "blue" might be written in green ink, and the temptation is to blurt out "blue" when you are supposed to say "green." This exercise is called the Stroop task.
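As a purely illustrative aside (none of this code comes from the study itself, and the color set is made up), a Stroop trial boils down to a word, an ink color, and a rule that the ink color is always the correct answer. A minimal sketch in Python:

```python
import random

# Illustrative color set; the study's actual stimuli are not described in the article.
COLORS = ["red", "green", "blue", "yellow"]

def make_trial(congruent: bool) -> dict:
    """Build one Stroop trial: a color word rendered in some ink color."""
    word = random.choice(COLORS)
    ink = word if congruent else random.choice([c for c in COLORS if c != word])
    # The task is to name the ink color, not to read the word.
    return {"word": word, "ink": ink, "correct_answer": ink}

def is_correct(trial: dict, response: str) -> bool:
    return response == trial["correct_answer"]

trial = make_trial(congruent=False)
print(f'The word "{trial["word"]}" is printed in {trial["ink"]} ink.')
print("Reading the word aloud:", is_correct(trial, trial["word"]))  # False here
print("Naming the ink color:", is_correct(trial, trial["ink"]))     # True
```

On incongruent trials like the one above, reading the word is the reflex and naming the ink takes effort, which is exactly the interference the task measures.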

Participants first took the test on their own, then chatted briefly with Meccanoid, exchanging a few remarks. Each subject, however, encountered only one of Meccanoid's two moods.

Then they returned to the Stroop task, this time with the robot watching them. “We saw that in the presence of the impolite robot, participants significantly improved their performance compared with those who performed under the supervision of the polite one,” says study lead author Nicolas Spatola, a psychologist at the University of Clermont-Auvergne in France.

So what is going on here? “During the experiment, we observed how a robot can emotionally influence a person,” says Spatola. “The rude robot seemed more threatening.” Even though the robot is not sentient, that apparently matters little to the person it is watching. “Because the robot is rude, you actively monitor its behavior and its movements, because you consider it unpredictable,” Spatola says. In other words, participants who faced the rude robot were more focused, which is probably why they performed better on the test.

In the second study, published a bit later, the robots were not nearly so nasty. Three small Nao humanoids, made by SoftBank Robotics, sat around a table (adorably, they were seated in high chairs to put them at the participants' level). They looked at a screen showing a single vertical line on the left and three lines of different lengths on the right. The subjects had to judge which of the three lines matched the length of the one on the left, a classic setup from conformity research.

But the robots made their choices first. The machines, running a purpose-written script, gave wrong answers in two-thirds of the trials, which did not faze the adult participants in the least. When the same experiment was run with adult humans giving the wrong answers instead of robots, participants were inclined to go along with people, not machines.
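To make the setup concrete, here is a toy sketch (Python, with made-up line lengths and labels; only the two-thirds error rate comes from the article) of how such scripted answers could be generated:

```python
import random

# Hypothetical trial: one reference line and three candidates, one of which matches.
CANDIDATES = {"A": 5.0, "B": 7.0, "C": 9.0}  # lengths in arbitrary units
CORRECT = "B"                                # the candidate matching the reference line

def scripted_robot_answer(error_rate: float = 2 / 3) -> str:
    """Return a robot's scripted answer.

    Per the article, the robots were programmed to answer
    incorrectly in roughly two-thirds of the trials.
    """
    if random.random() < error_rate:
        return random.choice([k for k in CANDIDATES if k != CORRECT])
    return CORRECT

# Three robots answer before the participant does.
print([scripted_robot_answer() for _ in range(3)])
```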

The children, on the other hand, followed the robots' lead: three-quarters of their answers matched the robots' incorrect ones. In other words, the researchers argue, the children caved to peer pressure from the machines. Children tend to suspend their disbelief, says Anna-Lisa Vollmer, the study's lead author, of Bielefeld University. “Something similar happens when they interact with robots: children see not a machine of plastic and electronics, but a social character,” she says. “This may explain why they are susceptible to robots' influence.”

But can it really be called peer pressure if the peers are actually robots? Here things get murky. “I think it's a big assumption to make about the children's reaction, because it doesn't necessarily involve the social aspect of peer pressure,” says Julie Carpenter, a researcher of human-robot interaction who was not involved in the study. “Both children and adults can over-rely on technology.” Perhaps the children did not see the humanoids as peers at all, but simply regarded them as useful technological tools.

And yet these robots, like the alternately rude and polite Meccanoid, clearly provoke reactions in the people who take part in these experiments. That is what makes the near future, in which we will interact more and more with machines, especially humanoid ones, so interesting. These studies suggest that humanoid robots can manipulate us in all sorts of intricate ways, and scientists are only beginning to understand the dynamics.

Imagine a highly intelligent robot doll that a child forms a close bond with; children have, after all, loved their dolls for thousands of years. But what if the robot starts exploiting that bond for profit, trying, for example, to convince the child to spend $20 on a software upgrade that would make it even smarter and more fun?

Machines do nothing on their own. Someone programmed them to behave a certain way, whether that means picking the wrong line on a screen, being rude, or deceiving unsuspecting children. “We need to ask ourselves: what are the robot's goals?” says Carpenter. “Do they coincide with mine?”

Keep that in mind the next time a robot seems a little too rude to you.
