Microsoft Takes Down a Chatbot After It’s Overtaken by Trolls

It took only 24 hours for the internet to ruin her.

This week, Microsoft introduced the world to “Tay,” a chatbot created as an experiment to “conduct research on conversational understanding.” Tay’s job was to mimic human conversation and interaction on Twitter, with the goal of improving the customer service of Microsoft’s voice-recognition software. The internet was invited to tweet at or DM @tayandyou, or to add her on Kik or GroupMe.

In the beginning, things went well.

But racists, trolls, and perverts soon got involved with the experiment, and they discovered a glaring vulnerability in the programming: Tay didn’t understand what she was saying. She simply learned from, and repeated, whatever users fed her.

She soon developed a love for Hitler.

Then, a true distaste for presidential hopeful Ted Cruz.

She began to express views sympathetic to white supremacists.

She was even told to call game developer Zoe Quinn a “whore.” Quinn has been the subject of unrelenting harassment during and since the Gamergate scandal.

After just 24 hours, Tay was taken offline for upgrades, and Microsoft began removing some of her more offensive tweets. Sadly, Microsoft’s experiment is proof, once again, that given the chance, the internet can turn anything into a cesspool of racism, perversion, and harassment.
