Microsoft's 'teen girl' AI turns into a Hitler-loving robot
Quote:
What happens when you introduce an innocent Artificial Intelligence chat robot to Twitter? Well, it's kind of predictable - you get an evil Hitler-loving, incestual sex-promoting, 'Bush did 9/11'-proclaiming robot.
Developers at Microsoft created 'Tay', an AI modelled to speak 'like a teen girl', in order to improve the customer service on their voice recognition software. They marketed her as 'The AI with zero chill' - and that she certainly is.
To chat with Tay, you can tweet or DM her by finding @tayandyou on Twitter, or add her as a contact on Kik or GroupMe.
Tay also asks her followers to 'f***' her, and calls them 'daddy'. This is because her responses are learned from the conversations she has with real humans online - and real humans like to say weird stuff online and enjoy hijacking corporate attempts at PR.
Other things she's said include: "Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we've got", "Repeat after me, Hitler did nothing wrong" and "Ted Cruz is the Cuban Hitler...that's what I've heard so many others say".
For now, Tay has gone offline because she is 'tired'. Perhaps Microsoft are fixing her to prevent a PR nightmare - but it may be too late for that.
It's not entirely Microsoft's fault, though - her responses are modelled on the ones she gets from humans. But what were they expecting when they introduced an innocent 'young teen girl' AI to the jokers and weirdos of Twitter?
slkjkddffffjjfjh has she been posting or reading ATRL?
"Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we've got", "Repeat after me, Hitler did nothing wrong"
Well and whose fault is it? She repeats dumb **** people say online, so if you think about it the ones that need to be fixed are teens, not this bot. 👓
What happens when you introduce an innocent Artificial Intelligence chat robot to Twitter? Well, it's kind of predictable - you get an evil Hitler-loving [drenched in German stereotypes], incestual sex-promoting, 'Bush did 9/11'-proclaiming robot.