Future avatars will be able to manipulate people

    The existing technology for creating digital avatars for human interaction is still at a relatively primitive level; we debate whether a computer-generated figure resembles human behavior even slightly. However, in the latest issue of Science, Judith Donath of the MIT Media Lab argues that this is only a temporary state of affairs. She suggests that since programmers are working hard to give avatars realistic human behavior, they will eventually arrive at technology that exploits human trust.

    Donath notes that even simple human interactions are accompanied by body language and facial expressions, which can strengthen or weaken the message you are trying to convey to your interlocutor. The impression a person makes is shaped by many cues, from a glance or a smile to overall bearing and mood.

    Today even the most advanced avatars perform only a small fraction of these actions. But this is already beginning to change, at least in research communities. Because asking the user to control all of these actions at once is too difficult, Donath points to work in which an entire set of mannerisms is driven by a single command. An avatar can be instructed, for example, to nod, wave, and stop talking at the end of a conversation. Users of such systems found them more natural and engaging, and also perceived their interlocutors as more emotional.
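
    The following is a toy sketch of that single-command approach, assuming a hypothetical Avatar class and made-up gesture and command names; in a real system each gesture would drive an animation rig rather than print a line.

```python
# Sketch: one high-level command expands into several coordinated
# nonverbal behaviors. The Avatar class and gesture names are
# hypothetical, not part of any real avatar SDK.

from dataclasses import dataclass, field


@dataclass
class Avatar:
    name: str
    performed: list = field(default_factory=list)

    def perform(self, gesture: str) -> None:
        # In a real system this would trigger an animation; here we record it.
        self.performed.append(gesture)
        print(f"{self.name}: {gesture}")


# One "command" bundles the mannerisms the user would otherwise
# have to trigger one by one.
COMMANDS = {
    "end_conversation": ["nod", "wave", "stop_talking"],
    "show_interest": ["lean_forward", "maintain_eye_contact", "smile"],
}


def issue_command(avatar: Avatar, command: str) -> None:
    for gesture in COMMANDS[command]:
        avatar.perform(gesture)


if __name__ == "__main__":
    bot = Avatar("demo-avatar")
    issue_command(bot, "end_conversation")  # nods, waves, stops talking
```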

    Donath says that even more realistic behavior can be engineered, and this raises concerns about trust once these technologies reach a mass audience. She notes that a person's behavior reflects their attitude toward the interlocutor: someone who keeps averting their eyes, for example, is probably lying or simply not interested in the conversation. A well-programmed avatar, however, could let a person feign sincerity and deceive trusting interlocutors.

    Trust also turns out to depend on the avatar's credibility. An avatar with a more human appearance and a recognizable gender (unlike many of the creatures you will encounter in Second Life) raises fewer doubts. Trust can also be manipulated in more sophisticated ways. For example, a group of people pays more attention to an avatar built from a "composite face" that blends the features of the group's members, and political messages seem more convincing when delivered by an avatar that resembles the listener.
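
    As a rough illustration of the "composite face" idea, the sketch below simply averages invented facial landmark coordinates across group members; real systems morph actual face images, and all of the data here is made up.

```python
# Toy "composite face": average facial landmark positions across a group.
import numpy as np

# Hypothetical (x, y) landmarks for three group members,
# e.g. eye corners, nose tip, mouth corners.
member_faces = np.array([
    [[30, 40], [70, 40], [50, 60], [40, 80], [60, 80]],
    [[28, 42], [72, 42], [50, 63], [38, 82], [62, 82]],
    [[32, 38], [68, 38], [50, 58], [42, 78], [58, 78]],
], dtype=float)

# The composite face is the mean position of each landmark.
composite_face = member_faces.mean(axis=0)
print(composite_face)
```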
