The Volokh Conspiracy
Mostly law professors | Sometimes contrarian | Often libertarian | Always independent
"Alexa Tells 10-Year-Old Girl to Touch Live Plug with Penny"
Alexa apparently found the recommendation on the Web.
BBC News reports:
The suggestion came after the girl asked Alexa for a "challenge to do".
"Plug in a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs," the smart speaker said.
Amazon said it fixed the error as soon as the company became aware of it.
Thanks to Prof. Glenn Reynolds (InstaPundit) for the pointer, and to one of his commenters, _Jim, for the counterchallenge: "Alexa, let's do some bitcoin mining using half of Amazon's western server complex."
Wow. My initial take when I saw it on Twitter was that it had to be a hoax.
Guess not!
Don't give up so easily. Media outlets are fooled by hoaxes every day.
I too say hoax.
A few seconds on the search engine of your choice clearly rules that out. Apparently some TikTok videos about it went viral in early 2020. There are actual photos of outlets some kids fried in their high school, and discussion of multiple sets of pending criminal charges, here.
And so it begins.
(It being, of course, Skynet’s attempt to exterminate the human race.)
Amazon would absolutely be liable for this.
If it wants to be in the business of creating artificial characters that give children suggestions about what to do, it is absolutely liable for the consequences.
No children were harmed in the making of this article.
From the linked BBC article.
How smart could the daughter really be, given that she's asking Alexa for challenges to do? Alexa could just as easily have come up with something much worse, like Russian roulette.
Smart enough to be curious, and also smart enough to know that no advice should be taken uncritically.
I was wondering about that. If one searches "challenges to do" on Google, clicks through to a website, and reads the challenge, then presumably Google would not be liable, although perhaps the owners of the website would be? In this case, the search was executed orally rather than by typing. Asking Alexa for "challenges to do" is an input command, delivered by speaking rather than typing, to search the internet for "challenges to do" and to return the output orally rather than visually in a web browser. Does the mere fact that the input-output modes create the illusion of a conversation somehow change the liability? Would Google become liable if one searched the phrase "Find me a challenge to do"? If a blind person used a web browser that read webpage content aloud rather than presenting the webpage visually, would the maker of that browser suddenly become liable for webpage content?
I think that the difference between Google and Alexa is that Google produces a list of web pages matching a search query and Alexa answers questions. Google isn't responsible for the contents of the web pages it lists in its search results, and that wouldn't change if Google used a voice interface. In contrast, I think Amazon does bear responsibility for the answers Alexa provides, even if it gets the answers from the web.
The user of Alexa doesn't know which website Alexa got the answer from (or even whether the answer to a particular question comes from the web at all), so they are not in a position to independently evaluate the credibility of the website. Alexa extracts what it thinks is the answer to a question rather than presenting the entire web page. (For all we know, the web page that Alexa extracted the challenge from may have made it clear that you shouldn't actually do this challenge.) Once Amazon got a complaint about Alexa's answer to this particular question, Amazon modified Alexa so it would give a different answer. In short, it seems to me that Amazon is taking responsibility for the answers to questions posed to Alexa. The fact that Alexa extracts answers from web pages (as opposed to some other source of information) doesn't seem to me to be relevant to Amazon's potential legal liability for those answers.
Except for, you know, Section 230. The content was provided by another information content provider.
I once did something like that, because I had seen my brother do it. But I'm not going to take Alexa's challenge, unless I see Her do it Herself. Maybe if the shock makes her hair all frizzy it would be impressive.
I called her a vile hooer, and a spy for the Chinese Commie Party. She got all defensive.
You know, I totally believe you did that.
Next thing you know people will be asking Alexa for:
Dates
Sex Advice
Porn
Must be no time left after making her a feminist. And no, that is not a joke even though it sounds like one.
wat
Just remember: The machines started it.
So you are saying I shouldn't have gotten my kids a "Li'l Sparky Junior Electrician Kit" for Christmas?
No, but you can give them a long fluorescent bulb and tell them it's a light saber.