Alexa, the virtual assistant that lives inside Amazon Echo devices, phone apps and a handful of smartwatches, has been caught giving some incredibly dangerous advice to bored people looking for a festive challenge.

As reported by the BBC, Kristin Livdahl had to intervene when she heard the Amazon Echo recommend that her ten-year-old daughter give herself an electric shock, set fire to the house, or both.

“Plug in a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs,” the smart speaker suggested, quoting “something I found on the web.”

If it’s not clear, that caveat is important. When Alexa doesn’t know exactly what you’re looking for, it’ll perform an internet search and quote an extract of the search result. In this case, the Echo dug up some information on ‘the penny challenge’ — a trend for simpletons on TikTok who enjoy doing outrageous things for imaginary internet kudos.

I can’t say I’m hugely surprised by this, as I’ve seen Alexa’s reliance on internet snippets produce dubious answers before. When testing Google Assistant against Alexa for a piece many years ago, I set the test question of “how many hairs on a cat?” These were the two answers:

Google Assistant: “On the website, they say there are approximately 60,000 hairs per square inch on the back of a cat and approximately 100,000 per square inch on its underside.”

Alexa: “A cat has 60,000 hairs.”

Either Alexa has been trained on images of particularly mangy cats, or it skim-read the same page as Google Assistant and then came up with completely the wrong answer, really putting the “artificial” in “artificial intelligence.”

For Amazon’s part, it has intervened to ensure that Alexa will no longer recommend people try to injure themselves in the post-Christmas lull. “As soon as we became aware of this error, we took swift action to fix it,” a company spokesperson told the BBC.

You have to wonder how many other dubious answers are hidden away in the Echo’s database of clever answers. Still, if Amazon has its way, eventually you’ll be talking to it less anyway, as the company views the future of Alexa as predicting what you want before you even know you want it.