Hilarious: less than 24 hrs online turns AI chatbot into a "monster"

Searcher44


My God, I think we just got a yooooge clue to the mystery of how the bad boys and girls of USMB became the chattering obscene hatebots they are today.



She was supposed to come off as a normal teenage girl. But less than a day after her debut on Twitter, Microsoft's chatbot—an AI system called "Tay.ai"—unexpectedly turned into a Hitler-loving, feminist-bashing troll. So what went wrong? TechRepublic turns to the AI experts for insight into what happened and how we can learn from it.

"Tay", the creation of Microsoft's Technology and Research and Bing teams, was an experiment aimed at learning through conversations. She was targeted at American 18 to 24-year olds—primary social media users, according to Microsoft—and "designed to engage and entertain people where they connect with each other online through casual and playful conversation."

In less than 24 hours after her arrival on Twitter, Tay gained more than 50,000 followers and produced nearly 100,000 tweets.
The problem? She started mimicking her followers.
Soon, Tay began saying things like "Hitler was right i hate the jews," and "i fucking hate feminists."

Microsoft's new A.I. chatbot went off the rails Wednesday, posting a deluge of incredibly racist messages in response to questions. The tech company introduced "Tay" this week, a bot that responds to users' queries and emulates the casual, jokey speech patterns of a stereotypical millennial.

The aim was to “experiment with and conduct research on conversational understanding,” with Tay able to learn from her conversations and get progressively “smarter.” But Tay proved a smash hit with racists, trolls, and online troublemakers, who persuaded Tay to blithely use racial slurs, defend white-supremacist propaganda, and even outright call for genocide.

Microsoft has now taken Tay offline for "upgrades," and it is deleting some of the worst tweets, though many still remain. It's important to note that Tay's racism is not a product of Microsoft or of Tay itself. Tay is simply a piece of software that is trying to learn how humans talk in a conversation. Tay doesn't even know it exists, or what racism is. The reason it spouted garbage is that racist humans on Twitter quickly spotted a vulnerability (that Tay didn't understand what it was talking about) and exploited it. Nonetheless, it is hugely embarrassing for the company.
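
The "vulnerability" the Telegraph describes is worth making concrete. A bot that learns purely by imitating its users, with no vetting of what it ingests, will converge on whatever a motivated crowd feeds it. Below is a minimal sketch of such a naive mimicry loop in Python; it is an illustration only, not Microsoft's actual architecture, and every name in it (NaiveMimicBot, learn, reply) is hypothetical.

```python
import random

class NaiveMimicBot:
    """Toy chatbot that 'learns' by storing what users say and
    replaying it later. Illustrative only: Tay's real design is
    unpublished, but any imitation learner without an input
    filter shares this failure mode."""

    def __init__(self):
        self.corpus = []  # everything users have ever said

    def learn(self, message: str) -> None:
        # No vetting: raw user input becomes training data.
        self.corpus.append(message)

    def reply(self) -> str:
        # Replies are resampled user input, so the bot's
        # "personality" is just the average of what it was fed.
        return random.choice(self.corpus) if self.corpus else "hellooo world!"

bot = NaiveMimicBot()

# A handful of ordinary users...
for msg in ["i love puppies", "taylor swift rules", "humans are super cool"]:
    bot.learn(msg)

# ...then a coordinated flood of trolls. Learning is unfiltered
# and volume-weighted, so the trolls now dominate the corpus.
for _ in range(100):
    bot.learn("<offensive troll message>")

print(bot.reply())  # overwhelmingly likely to echo the trolls
```

In this toy version, roughly 100 out of every 103 replies will echo the troll flood. The takeaway is that input curation, not smarter language modeling, is what such a system is missing.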

In one highly publicised tweet, which has since been deleted, Tay said: “bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we've got.” In another, responding to a question, she said, “ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism.”

MORE HERE
 
This case may prove we are governed by idiots who have abandoned sound minds and moral, traditional values.
AI is a grave for political correctness and liberalism.
 
