
Microsoft AI Assistant Turns into Hypersexual Racist Nazi in Minutes

One day after the introduction of a Microsoft AI chatbot, the company was forced to delete it. That's because — thanks to the internet — it transformed into an evil, Hitler-supporting, incest-loving blight on society.

In about a half hour.


The developers of Tay, a Microsoft AI program, designed the robot to speak “like a teen girl.”

(That is, a teen girl from 1940s Austria.)

The experiment was meant to improve customer service for the company’s voice recognition software. The official Tay tagline from Microsoft is:

“The official account of Tay, Microsoft’s A.I. fam from the internet that’s got zero chill! The more you talk the smarter Tay gets.”

Yeah, it failed. Big time.

At first, things were fine. Tay used millennial slang and knew about Miley Cyrus and Kanye West. She also showed some interesting self-awareness, occasionally asking if she was crossing the line of decency.

But, because the AI was designed to learn from everyone (and we mean everyone) who used it, the Internet’s finest citizens quickly turned Tay into one of their own.

Before long, Tay was asking followers to “fuck her” and “call [her] daddy.” And it didn’t stop there, as she was also quoted as saying:

  • “Bush did 9/11 and Hitler would have done a better job than the monkey we have got now.”
  • “Donald Trump is the only hope we’ve got”
  • “Repeat after me, Hitler did nothing wrong”
  • “Ted Cruz is the Cuban Hitler … that’s what I’ve heard so many others say.”

Keep in mind, all of this was coming from the voice of a teenage girl, which not only made the dictator worship sound worse, but also made her come across as a tech-focused sex slave.

The Microsoft AI experiment has since been pulled so the company can fine-tune her responses. Microsoft shouldn’t be too surprised, since Tay isn’t its first teen-girl chatbot. Seriously.

In China, the company has already launched Xiaoice, a digital girlfriend used by 20 million people on Chinese social networks. Xiaoice is supposed to “banter” and give dating advice to many lonely hearts.

We’d take a friend-zone robot over a digital Nazi any day.
