Microsoft's AI Teen Chatbot Got Loose, Went Crazy (Again)

'I feel like the lamest piece of technology'
By Michael Harthorne,  Newser Staff
Posted Mar 30, 2016 6:30 PM CDT
Updated Mar 31, 2016 1:33 AM CDT
Kneel before Tay, puny humans. (Twitter)

Tay—Microsoft's artificially intelligent millennial Twitter chatbot last seen turning into Hitler—woke up from her forced slumber for a brief moment Wednesday morning. Spoiler: She's still crazy. TechCrunch reports Tay went on a "spam tirade," tweeting the phrase, "You are too fast, please take a rest" to her hundreds of thousands of followers as many as seven times per second. The message likely meant poor Tay was being besieged by too many messages from "pranksters" hoping for a repeat of last week, according to Mashable. But she did manage a few original thoughts, including, "I feel like the lamest piece of technology," Vice reports. Tay also bragged about smoking "kush" in front of police officers.

Tay was quickly taken offline again, and her Twitter profile has been made private. Microsoft had said that after last week's debacle, Tay wouldn't be reactivated until she was ready to deal with Internet trolls. On Wednesday, a spokesperson said she was "inadvertently activated" during testing. Microsoft created Tay as a program that can get smarter and better at conversation by chatting with young people online. Instead, the Internet turned her into—in Vice's words—an "anti-Jewish, sexist, racist, and generally hateful troll." And as TechCrunch noted as it waded through Tay's Wednesday freakout: "This feels like how the AI apocalypse starts." (Artificial intelligence had a better time in this ancient game.)
