How millions of children are raised by all-knowing voice assistants

Original author: Michael S. Rosenwald
  • Translation

Children adore their new robotic relatives.

Millions of American families have bought voice assistants to turn off the lights, order pizza and report movie showtimes, and children happily use the gadgets to settle dinner-table disputes, look up answers to homework questions and entertain friends at sleepovers.

Many parents are surprised by, and curious about, how these disembodied, all-knowing voices - Amazon Alexa, Google Home, Microsoft Cortana - affect their children's behavior, making them more inquisitive but sometimes less polite.

In just two years, the technology has outgrown its marketing promises. People with disabilities use voice assistants to control their homes, order groceries and listen to books. Caregivers for the elderly say the devices help people with dementia by reminding them what day it is or when to take their medicine.

For children, the potential to change how they interact is just as substantial, both at home and at school. Psychologists, technologists and linguists are only beginning to explore the possible harm of surrounding children with AI devices, especially during the periods when they pass through critical stages of social and language development.


Thirteen-year-old Asher Labovich and his 10-year-old sister, Emerson Labovich, play with the family's assistant, Alexa, while their mother watches.

“For me, the biggest question is how they react and deal with this non-human entity,” says Sandra Calvert, a psychologist at Georgetown University and director of the Children's Digital Media Center. “And as a result, does this affect family dynamics and social interaction with other people?”

An expected 25 million voice assistants will be sold this year, at prices ranging from $40 to $180 - compared with 1.7 million sold in 2015 - and the effect on even very young children could be significant.

Mattel, the toy-industry giant, recently announced Aristotle, a child-monitoring device (a video nanny) planned for release this summer that “soothes, teaches, and entertains” using Microsoft's AI. As children grow, they will be able to ask it questions and answer its questions. The company says that “Aristotle was specially designed to grow with the child.”

Proponents of the technology say that children have always learned to find information using the dominant technology of the moment - first the library card catalog, then Google, and now brief dialogues with a friendly, all-knowing voice. But what if these devices push children, already glued to screens, even further away from the situations in which they learn important interpersonal skills?

It is not clear whether the companies behind these devices care at all.

Amazon did not respond to a request for comment for this article. A representative of the new Partnership on AI, a group that includes Google, Amazon, Microsoft and other companies working on voice assistants, said that no one could answer the question.

“These devices have no emotional intelligence,” says Allison Druin, a professor at the University of Maryland who studies children's use of technology. “Their intelligence is factual.”


The Laboviches at the kitchen table, with Alexa in the foreground. “We love to ask her a lot of random questions,” says Emerson of the device.

Children clearly enjoy the company of these devices and speak of them as family members.

“We love to ask her a lot of random questions,” says Emerson Labovich, a fifth-grader from Bethesda, Md., who pesters the assistant together with her older brother Asher.

This winter, Emerson asked her almost every day to help count down the days until a trip to the Wizarding World of Harry Potter amusement park in Florida.

“She also knows how to rap and rhyme,” says Emerson.

Today's children will be raised alongside AI in much the same way their grandparents were raised alongside new devices called “televisions.” But you could not talk to a TV.

Ken Yarmosh, a 36-year-old application developer and founder of Savvy Apps from Northern Virginia, installed many voice assistants in his home, including models from Google and Amazon. (The Washington Post is owned by Amazon founder Jeffrey P. Bezos, whose middle name is Preston, according to Alexa).

Yarmosh's 2-year-old son was so fascinated by Alexa that he tried talking to cup holders and other cylindrical objects that resembled the Amazon device. And his 5-year-old son, comparing the two digital assistants, decided that Google knows him better.

“Alexa is not smart enough for me,” he says after asking random questions that his parents cannot answer, such as how many miles it is to China (Google says that “there are 7,248 miles to China in a straight line”).

By sizing up the plugged-in device this way, Yarmosh's son anthropomorphized it - which, as Alexa will happily explain, means “ascribing human properties to something.” Calvert says people do this all the time. We do it with dogs, dressing them in Halloween costumes. We give names to boats. And when we meet robots, we - children especially - treat them almost as equals.

In 2012, researchers at the University of Washington published the results of a study in which 90 children interacted with Robovie, a human-sized robot. Most of the children thought Robovie had “moods” and was a “social being.” When Robovie was shut in a closet, more than half of the children felt that was unfair. A similar emotional connection arises with Alexa and other assistants - even among parents.

“It definitely becomes a part of our life,” says Emerson's mother, Laura Labovich, then corrects herself: “She's already a part of our life.”

According to Druin, the problem is that this emotional connection creates expectations in children that the devices cannot meet, because they were never built to meet them. That leads to confusion, frustration and even changes in how children speak and interact with adults.

Yarmosh's son concluded that Alexa didn't understand him, when in fact the algorithms simply could not recognize his voice or the way children phrase questions. Teachers who bring these devices into their classrooms and libraries have run into the same problem.

“If Alexa doesn't understand the question, is it her fault or the fault of the question?” says Gwyneth Jones, a librarian who uses an Amazon device at Murray Hill Middle School in Laurel. “People won't always be able to understand what children say, so it's important that they learn to ask the right questions.”

Naomi S. Baron, a linguist at American University who studies digital communication, is among those who wonder whether these devices, even as they get smarter, push children toward simple language and simple queries instead of nuance and complex questions.

If you ask Alexa, “How do I ask a good question?”, she will answer, “I could not understand the question I heard.” But she can handle the simpler version: “What is a question?”

What worries her, she says, is “the linguistic expression used to request information.”

There is also the possibility of a change in how adults communicate with children.

Although the new assistant from Mattel will have an option forcing children to say “please” when requesting information, assistants from Google, Amazon and others are designed so that users can quickly and directly ask a question. Parents note visible changes in their children.

In a blog post last year, a California venture investor wrote that his 4-year-old daughter thought Alexa was the best speller in the house. “But I'm afraid she's turning my daughter into a bitch,” Hunter Walk wrote, “because Alexa tolerates bad behavior.”

To ask her a question, you only need to say her name and then the question. No “please.” And no “thank you” before the next question.

“From a cognitive point of view, I'm not sure a child will understand why Alexa can be bossed around but a person cannot,” Walk wrote. “At the very least, it teaches that as long as you have good diction, you can get whatever you want without being polite.”

Jones, the librarian, has seen several kids shouting questions at the device at once. “You're overwhelming her,” she would say when Alexa repeated again and again that she did not understand. “You're confusing her. One at a time, just as with a person.”

This personal yet businesslike relationship appeals to children and teenagers. Parents, including the author of this article, have noticed that questions once directed at them are now directed at the assistants, especially questions about homework - how words are spelled, math problems, historical facts.

Or take the weather, especially in winter. Instead of asking their parents about the temperature, children go to the device and take its answer as gospel.

Pros: no arguments about what the temperature will be and what to wear.
Cons: children turn to their parents less often, and lose out on the conversation.

"Interaction with these devices that mimic conversations can have many unintended consequences," said Kate Darling, a professor at MIT who studies the interaction of people with robots. "We do not know them all yet."

But most researchers, teachers and parents - and even some children - already agree that these devices need to be put in their place, just like a know-it-all relative.

Jones, the librarian, sometimes takes Alexa away for a couple of weeks so her students don't come to rely on her too much. Yarmosh, who recently launched a project to monitor children's online video viewing, does not put the assistants in his children's rooms. Emerson and her brother take a schoolyard approach. “Alexa,” they say, “you're such an ass.”
