Robotic voice assistants like Amazon's Alexa, Google Home, and Microsoft's Cortana have quickly found their way into homes across the country.
Millions of families use them to play music, order dinner, turn the lights on and off, and ask random questions like "What's the weather?" and "What sound does a whale make?"
The family of Ken Yarmosh, a 36-year-old Northern Virginia app developer, is not only accustomed to these devices but has high expectations of them.
“Yarmosh’s 2-year-old son has been so enthralled by Alexa that he tries to speak with coasters and other cylindrical objects that look like Amazon’s device. Meanwhile, Yarmosh’s now 5-year-old son, in comparing his two assistants, came to believe Google knew him better,” writes Quartz.
Besides being harsh critics of these robots, parents are reporting that these gadgets are encouraging poor manners in their children.
Think about it. These robots have the highest tolerance level. Kids can say whatever they want, over and over, and they won't be reprimanded for their tone of voice or their lack of politeness.
“I’ve found my kids pushing the virtual assistant further than they would push a human,” says Avi Greengart, a tech analyst in New Jersey and father of five to Quartz. “[Alexa] never says ‘That was rude’ or ‘I’m tired of you asking me the same question over and over again.’”
Amazon had not intended for its Echo product to take off with the younger generation. But since it’s remarkably easy to use, it has.
“Unlike the iPad, which children have taken to with ease (ever see a toddler try to swipe a book or TV?), the Amazon Echo doesn’t require them to learn new gestures or even know how to read. Mimicking their parents, they quickly discover that if they start a sentence with ‘Alexa,’ the speaker will perk up and (for the most part) do as they say,” writes Quartz.
One thing Amazon and the other companies didn't think to incorporate was politeness. When using the Amazon Echo, the user merely says "Alexa…" rather than "Alexa, please…" And when a "thank you" isn't given, Alexa simply moves on to answering the next question or completing the next task.
Even though children know these robots aren't alive, the devices have been woven into the fabric of their lives, creating a kind of attachment.
“Peter Kahn, a developmental psychologist at the University of Washington, studies human-robot interaction. He told Judith Shulevitz, writing for The New Republic, that even though kids understand that robots aren’t human, they still see virtual personalities as being sort of alive. Kahn says, ‘we’re creating a new category of being,’ a ‘personified non-animal semi-conscious half-agent.’ A child interacting with one of Kahn’s robots told him, ‘He’s like, he’s half living, half not,’” writes Robby Berman for Big Think.
Still, kids boss these virtual assistants around. Is it really Amazon's (or any other company's) job to teach children proper manners? Perhaps not, but the devices do make it harder for parents to reinforce them.
“One of the responsibilities of parents is to teach your kids social graces,” said Greengart, “and this is a box you speak to as if it were a person who does not require social graces.”
Lucy Hume of the etiquette authority Debrett's has some tips on how parents can help curb their child's poor manners.
"Children learn by example, so if they hear their parents speaking politely to a digital assistant they'll pick up on that," Hume told the Daily Mail. "However, I think children can tell the difference between a robot and a human being and act accordingly."
This is exactly what Manu Kumar, father of two and founder of the Palo Alto investment firm K9 Ventures, has done.
“I have told my son that if he doesn’t say ‘thank you’ or ‘please’ that Alexa will stop listening to him,” said Kumar.
So with this issue building momentum, will Amazon offer a new option that encourages politeness?
"The pro-polite arguments have some parents longing for a kid or family mode, where Alexa will only respond when she hears the magic word," writes Quartz.
This would encourage better manners all around.