YouTube announces it will no longer recommend conspiracy videos.

Thanks for the article. This was the best part:

"He pointed to a different artificial intelligence that was also shaped by the bias of its users: Microsoft's chatbot "Tay."

"
Tay was a Twitter chatbot produced by Microsoft, designed to interact with users like a human and learn from them.

Within 24 hours of its release, Tay went from innocent chatbot to full-blown misogynist and racist, according to The Verge. The AI behind Tay learned from, and was biased by, the engagement it received from Twitter users who spammed the bot with those ideologies, according to CNBC.

