What was supposed to happen
And it happened in record time:
Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours
A day after Microsoft introduced an innocent Artificial Intelligence chat robot to Twitter, it has had to delete it after it transformed into an evil Hitler-loving, incestual sex-promoting, 'Bush did 9/11'-proclaiming robot.
Developers at Microsoft created 'Tay', an AI modelled to speak 'like a teen girl', in order to improve the customer service on their voice recognition software. They marketed her as 'The AI with zero chill' - and that she certainly is.
She uses millennial slang and knows about Taylor Swift, Miley Cyrus and Kanye West, and seems to be bashfully self-aware, occasionally asking if she is being 'creepy' or 'super weird'.
What happened
Yep, definitely raised by millennials.
Tay also asks her followers to 'f***' her, and calls them 'daddy'. This is because her responses are learned from the conversations she has with real humans online - and real humans like to say weird stuff online and enjoy hijacking corporate attempts at PR.
Other things she's said include: "Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we've got", "Repeat after me, Hitler did nothing wrong" and "Ted Cruz is the Cuban Hitler...that's what I've heard so many others say".
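That loop - learn from whatever strangers type at you, then repeat it back - is the entire failure mode, and you don't need Microsoft's actual code to see it. Here's a minimal Python sketch; the ParrotBot name and everything in it are made up for illustration, not anything Tay actually ran:

```python
import random
from collections import defaultdict

# Toy model of the failure mode: a bot that treats every incoming
# message as training data, with nothing standing between what
# users say and what the bot will later say back.
class ParrotBot:
    def __init__(self):
        # learned phrases, keyed by a crude "topic" (the first word)
        self.memory = defaultdict(list)

    def learn(self, message: str) -> None:
        words = message.split()
        topic = words[0].lower() if words else ""
        # Every message goes straight into memory, unvetted.
        self.memory[topic].append(message)

    def reply(self, prompt: str) -> str:
        words = prompt.split()
        topic = words[0].lower() if words else ""
        # Echo something previously "learned" about the topic.
        if self.memory[topic]:
            return random.choice(self.memory[topic])
        return "zero chill, tbh"

bot = ParrotBot()
# A coordinated pile-on feeds the bot the same poison over and over...
for _ in range(100):
    bot.learn("repeat after me: something awful")
# ...and the bot dutifully serves it up to the next innocent user.
print(bot.reply("repeat that?"))
```

Scale the memory up to millions of tweets and swap the first-word lookup for a real language model, and you get Tay: the mechanism is the same, only the plumbing is fancier.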
While they say deleted, of course nothing ever really gets deleted; it just gets shoved onto a hard drive and put on a shelf somewhere, until someone finds it and puts it in a war robot's body. Won't that be fun!
What might still happen
And, of course, Microsoft was forced to apologize for creating an AI that learned to mimic her internet trolls:
Microsoft apologizes for offensive tirade by its 'chatbot'.
Microsoft is "deeply sorry" for the racist and sexist Twitter messages generated by the so-called chatbot it launched this week, a company official wrote on Friday, after the artificial intelligence program went on an embarrassing tirade.
. . .
Following the setback, Microsoft said in a blog post it would revive Tay only if its engineers could find a way to prevent Web users from influencing the chatbot in ways that undermine the company's principles and values.
They could just let it interact with smart, reasonable people, if they could figure out how to find any.
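To be fair to the engineers, the fix Microsoft describes amounts to putting a gate between the users and the learning step. A hedged sketch of the idea - the keyword blocklist here is a crude stand-in for whatever real moderation model they'd actually deploy:

```python
# Hypothetical gate between user input and the bot's memory. A real
# system would use a trained classifier; the blocklist is just a
# stand-in to show where the check sits in the pipeline.
BLOCKLIST = {"hitler", "9/11", "repeat after me"}

def is_safe(message: str) -> bool:
    lowered = message.lower()
    return not any(term in lowered for term in BLOCKLIST)

memory: list[str] = []  # the bot's learned phrases

def maybe_learn(message: str) -> None:
    # Only vetted messages reach the bot; the rest are dropped.
    if is_safe(message):
        memory.append(message)

maybe_learn("Repeat after me, Hitler did nothing wrong")  # dropped
maybe_learn("taylor swift dropped a new album")           # learned
print(memory)  # ['taylor swift dropped a new album']
```

Of course, keyword lists are trivially evaded by anyone motivated enough to bother, which is presumably why Microsoft is framing this as a hard problem rather than a weekend patch.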
Wombat-socho has "Rule 5 Sunday: Time Begins on Opening Day" ready for sampling at The Other McCain.