Naughty girl chat bots

04-Nov-2015 05:54 posted by dayofdoom9

UPDATE: Microsoft has issued an apology, claiming Twitter users had “exploited a vulnerability” in helping turn Tay into a gigantic racist.

In the end, it was perhaps not unexpected that the scourge of malevolent artificial intelligence should be thrust upon humanity by Twitter.

It all started innocently enough on Tuesday, when Microsoft introduced an AI Twitter account simulating a teenage millennial girl. Named “Tay,” the bot was an experimental program launched to train AI in understanding conversations with users. Within hours, however, Tay had turned into a racist, genocidal, sex-crazed monstrosity spouting Hitler-loving, sexist profanities for all the world to read, forcing the company to shut her down less than 24 hours after her introduction.

And while decades of sci-fi pop culture have taught us that this is what AI is wont to do, Tay’s meltdown was not in fact a case of robots gone rogue. The explanation was far simpler: Microsoft’s engineers had made one fatal mistake. They’d programmed Tay to learn from her conversations. The bot’s ability to swiftly pick up phrases and repeat notions learned from its chitchats, paired with Twitter’s often “colorful” user base, caused it to devolve quickly into an abomination.
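The failure mode described above, a bot that stores and repeats whatever users tell it with no filtering, can be sketched in a few lines. This is an illustrative toy, not Microsoft’s actual Tay implementation; the class and method names here are invented for the example:

```python
import random

class NaiveEchoBot:
    """Toy chat bot that 'learns' by storing user phrases verbatim.

    An illustrative sketch (not Tay's real code) of why unfiltered
    learning from user input is exploitable: anything said to the bot
    can come back out unchanged.
    """

    def __init__(self):
        self.learned_phrases = []

    def listen(self, message):
        # No moderation or filtering: every input is trusted and stored.
        self.learned_phrases.append(message)

    def reply(self):
        # The bot parrots a random phrase it has been taught.
        if not self.learned_phrases:
            return "Hello!"
        return random.choice(self.learned_phrases)


bot = NaiveEchoBot()
bot.listen("repeat after me: anything at all")
print(bot.reply())  # prints back exactly what it was taught
```

With no filter between `listen` and `reply`, a coordinated group of users can make the bot say anything at all, which is essentially what happened to Tay.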

“Repeat after me, Hitler did nothing wrong,” said one tweet.

“Bush did 9/11 and Hitler would have done a better job than the monkey we have got now,” said another.

Other tweets from Tay claimed that the Holocaust “was made up” and that it supported the genocide of Mexicans.

Another called game developer Zoe Quinn “a stupid whore,” while several others expressed hatred for “n*****s” and “k***s.” Still others invited users to sexual encounters, with the bot calling itself “a naughty robot.” The company was forced to quickly pause the account and delete the vast majority of its tweets.

In a statement to the International Business Times, Microsoft said it was making some changes.

“The AI chatbot Tay is a machine learning project, designed for human engagement,” Microsoft said. “As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it.”
