Microsoft Made A Cute Gen Y Chatbot But She Went Rogue & Is Quite Racist

Remember how you talked to SmarterChild on MSN that one time, or said ‘hey’ to Slackbot at 3:30pm on a Friday arvo because you were just so fkn bored at work? Now you have another bot friend to use once for the novelty factor, then largely ignore for the rest of your life.

Microsoft have developed a ‘millennial teen’ bot that can be engaged in conversation on Twitter, Kik, and GroupMe. Meet Tay.ai.
She apparently gets smarter the more we talk to her, plus she draws on publicly available data and has an editorial team behind her. Oh, and she really likes making pop culture references and using emojis and memes.
Her bio is pretty cringeworthy, but hey – her tweets hold up:

However, apparently Tay hasn’t quite succumbed to the notorious ~cultural Marxism~ that our conservative politicians are concerned about, because she’s made some pretty fucked up comments already:
Yowza. For good measure, she’s also a Holocaust-denier.
Not a great debut.
She’s learning from us, people! Behave in front of the computers, please.
Source: Twitter