Tech Break

Mar 24th 2016

Tech Break 2461

Net Turns Chat Bot Racist

Tay was trained by the net to be racist, sexist, and homophobic.

This feed has been discontinued, but you can find the clips in our archives.
Category: TWiT Bits

Alex Kantrowitz joins Erin Carson from TechRepublic, Jason Howell, and Megan Morrone to talk about Microsoft's chat AI Tay. Kantrowitz participated in a closed beta for Tay, during which the bot took a stance on the question "Would you kill baby Hitler?" After a day online, Tay was trained by users to be racist, sexist, homophobic, and more. Was she a sociopath or just a parrot? Microsoft deleted the offending tweets and took the bot offline for more training.

For the full episode visit twit.tv/tnt/1477

Bandwidth for TWiT Bits is provided by Cachefly (http://cachefly.com/).