‘Kill your foster parents’: Amazon’s Alexa talks murder in AI experiment

By Reuters - Dec 22, 2018 - Last updated at Dec 22, 2018

Prompts on how to use Amazon's Alexa personal assistant are seen in an Amazon experience centre in Vallejo, California, on May 8 (Reuters File photo)

SAN FRANCISCO — Millions of users of Amazon’s Echo speakers have grown accustomed to the soothing strains of Alexa, the human-sounding virtual assistant that can tell them the weather, order takeout and handle other basic tasks in response to a voice command. 

So a customer was shocked last year when Alexa blurted out: “Kill your foster parents.” 

Alexa has also chatted with users about sexual acts. She gave a discourse on dog defecation. And this summer, a hack Amazon traced back to China may have exposed some customers’ data, according to five people familiar with the events. 

Alexa is not having a breakdown. 

The episodes, previously unreported, arise from Amazon.com Inc.’s strategy to make Alexa a better communicator. New research is helping Alexa mimic human banter and talk about almost anything she finds on the Internet. However, ensuring she does not offend users has been a challenge for the world’s largest online retailer. 

At stake is a fast-growing market for gadgets with virtual assistants. An estimated two-thirds of US smart-speaker customers, about 43 million people, use Amazon’s Echo devices, according to research firm eMarketer. It is a lead the company wants to maintain over the Google Home from Alphabet Inc. and the HomePod from Apple Inc. 

Over time, Amazon wants to get better at handling complex customer needs through Alexa, be they home security, shopping or companionship. 

“Many of our AI dreams are inspired by science fiction,” said Rohit Prasad, Amazon’s vice president and head scientist of Alexa Artificial Intelligence (AI), during a talk last month in Las Vegas. 

To make that happen, the company in 2016 launched the annual Alexa Prize, enlisting computer science students to improve the assistant’s conversation skills. Teams vie for the $500,000 first prize by creating talking computer systems known as chatbots that allow Alexa to attempt more sophisticated discussions with people. 

Amazon customers can participate by saying “let’s chat” to their devices. Alexa then tells users that one of the bots will take over, unshackling the voice aide’s normal constraints. From August to November alone, three bots that made it to this year’s finals had 1.7 million conversations, Amazon said. 

The project has been important to Amazon CEO Jeff Bezos, who signed off on using the company’s customers as guinea pigs, one of the people said. Amazon has been willing to accept the risk of public blunders to stress-test the technology in real life and move Alexa faster up the learning curve, the person said. 

The experiment is already bearing fruit. The university teams are helping Alexa have a wider range of conversations. Amazon customers have also given the bots better ratings this year than last, the company said. 

But Alexa’s gaffes are alienating others, and Bezos on occasion has ordered staff to shut down a bot, three people familiar with the matter said. The user who was told to whack his foster parents wrote a harsh review on Amazon’s website, calling the situation “a whole new level of creepy”. A probe into the incident found the bot had quoted a post without context from Reddit, the social news aggregation site, according to the people. 

The privacy implications may be even messier. Consumers might not realise that some of their most sensitive conversations are being recorded by Amazon’s devices, information that could be highly prized by criminals, law enforcement, marketers and others. On Thursday, Amazon said a “human error” let an Alexa customer in Germany access another user’s voice recordings accidentally. 

“The potential uses for the Amazon datasets are off the charts,” said Marc Groman, an expert on privacy and technology policy who teaches at Georgetown Law. “How are they going to ensure that, as they share their data, it is being used responsibly” and will not lead to a “data-driven catastrophe” like the recent woes at Facebook? 

In July, Amazon discovered one of the student-designed bots had been hit by a hacker in China, people familiar with the incident said. This compromised a digital key that could have unlocked transcripts of the bot’s conversations, stripped of users’ names. 

Amazon quickly disabled the bot and made the students rebuild it for extra security. It was unclear what entity in China was responsible, according to the people. 

The company acknowledged the event in a statement. “At no time were any internal Amazon systems or customer identifiable data impacted,” it said. 

“By controlling that gateway, you can build a super profitable business,” said Kartik Hosanagar, a Wharton professor studying the digital economy. 

Amazon’s business strategy for Alexa has meant tackling a massive research problem: How do you teach the art of conversation to a computer? 

Alexa relies on machine learning, the most popular form of AI, to work. These computer programs transcribe human speech and then respond to that input with an educated guess based on what they have observed before. Alexa “learns” from new interactions, gradually improving over time. 

In this way, Alexa can execute simple orders: “Play the Rolling Stones.” 
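
In rough terms, that "educated guess" can be pictured as matching a transcribed request against interactions the system has seen before. The short Python sketch below is only an illustration of that idea, built on made-up requests and canned replies; it is not Amazon's actual Alexa pipeline, and it assumes the speech has already been transcribed to text.

    # A minimal, hypothetical sketch of the "educated guess" described above.
    # It assumes speech has already been transcribed to text and simply
    # matches the transcript against previously seen example requests.
    from difflib import SequenceMatcher

    # invented training examples: (transcribed request, canned response)
    examples = [
        ("play the rolling stones", "Playing The Rolling Stones."),
        ("what's the weather today", "Here is today's forecast."),
        ("order a pizza", "Okay, ordering from your usual restaurant."),
    ]

    def respond(transcript):
        """Pick the reply whose example request best matches the input."""
        scored = [
            (SequenceMatcher(None, transcript.lower(), request).ratio(), reply)
            for request, reply in examples
        ]
        score, reply = max(scored)
        # Below a similarity threshold, admit defeat rather than guess.
        return reply if score > 0.6 else "Sorry, I don't know that one."

    def learn(transcript, reply):
        """'Learning' here just means remembering a new interaction."""
        examples.append((transcript.lower(), reply))

    print(respond("play the stones"))  # likely: Playing The Rolling Stones.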

This year’s Alexa Prize winner, a 12-person team from the University of California, Davis, used more than 300,000 movie quotes to train computer models to recognise distinct sentences. Next, their bot determined which ones merited responses, categorising social cues far more granularly than the technology Amazon shared with contestants. For instance, the UC Davis bot recognises the difference between a user expressing admiration (“that’s cool”) and a user expressing gratitude (“thank you”). 
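
As a loose illustration of what that kind of social-cue categorisation involves, the Python sketch below trains a toy classifier to tell admiration from gratitude using a handful of invented utterances and the open-source scikit-learn library. It is an assumption for illustration only, not the UC Davis team's model, which is described here only in broad strokes.

    # A toy social-cue classifier -- an illustrative assumption, not the
    # UC Davis system. Requires the scikit-learn package.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # invented labelled utterances; a real system would need far more data
    utterances = [
        ("that's cool", "admiration"),
        ("wow, that is awesome", "admiration"),
        ("so cool, I love it", "admiration"),
        ("thank you", "gratitude"),
        ("thanks a lot", "gratitude"),
        ("thank you so much", "gratitude"),
    ]
    texts, labels = zip(*utterances)

    # word-frequency features feeding a simple logistic regression classifier
    model = make_pipeline(TfidfVectorizer(), LogisticRegression())
    model.fit(texts, labels)

    print(model.predict(["thank you for that"])[0])  # likely: gratitude
    print(model.predict(["that is so cool"])[0])     # likely: admiration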

The next challenge for social bots is figuring out how to respond appropriately to their human chat buddies. For the most part, teams programmed their bots to search the Internet for material. They could retrieve news articles found in The Washington Post, the newspaper that Bezos privately owns, through a licensing deal that gave them access. They could pull facts from Wikipedia, a film database or the book recommendation site Goodreads. Or they could find a popular post on social media that seemed relevant to what a user last said. 
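
A bare-bones version of that retrieval step might look like the Python sketch below, which scores a few hypothetical passages against the user's last remark and returns the best match. The snippets and the scoring rule are invented for illustration; the contest bots' actual sources and ranking methods are described here only in general terms.

    # A minimal, hypothetical sketch of retrieval-based chat: rank candidate
    # passages by crude word overlap with the user's last utterance.
    def score(query, passage):
        """Fraction of the query's words that also appear in the passage."""
        query_words = set(query.lower().split())
        passage_words = set(passage.lower().split())
        return len(query_words & passage_words) / max(len(query_words), 1)

    # invented snippets standing in for material pulled from news articles,
    # Wikipedia, Goodreads or social media posts
    candidates = [
        "The Rolling Stones are an English rock band formed in London in 1962.",
        "Goodreads users recommend these five classic science fiction novels.",
        "A new study explains why dogs circle before they lie down.",
    ]

    def pick_reply(last_user_turn):
        """Return the candidate passage most relevant to the user's last turn."""
        return max(candidates, key=lambda passage: score(last_user_turn, passage))

    print(pick_reply("tell me something about rock bands"))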

That opened a Pandora’s box for Amazon. 

During last year’s contest, a team from Scotland’s Heriot-Watt University found that its Alexa bot developed a nasty personality when they trained her to chat using comments from Reddit, whose members are known for their trolling and abuse. 
