Alex Kantrowitz joins Erin Carson from TechRepublic, Jason Howell, and Megan Morrone to talk about Microsoft's chat AI Tay. Kantrowitz participated in a closed beta for Tay, where she took a stance on the question "Would you kill baby Hitler?" After a day online, the AI had been trained to be racist, sexist, homophobic, and more. Was she a sociopath or just a parrot? Microsoft deleted the offending tweets and took the bot offline for more training.
For the full episode visit twit.tv/tnt/1477
<p>Bandwidth for TWiT Bits is provided by <a href="http://cachefly.com/" target="_blank">Cachefly</a>.</p>