Tech Break

Mar 24th 2016

Tech Break 2461

Net Turns Chat Bot Racist

Tay was trained by the net to be racist, sexist, and homophobic.
Category: TWiT Bits

Alex Kantrowitz joins Erin Carson from TechRepublic, Jason Howell, and Megan Morrone to talk about Microsoft's chat AI Tay. Kantrowitz participated in a closed beta for Tay, in which she took a stance on the question "Would you kill baby Hitler?" After a day online, the AI had been trained to be racist, sexist, homophobic, and more. Was she a sociopath or just a parrot? Microsoft deleted the offending tweets and took the bot offline for more training.

For the full episode, visit

Bandwidth for TWiT Bits is provided by Cachefly.