"It is as much a social and cultural experiment, as it is technical. "The AI chatbot Tay is a machine learning project, designed for human engagement," the statement reads. In a statement to PCMag, a Microsoft spokesperson confirmed that the company has taken the chatbot offline for upgrades. Microsoft has been removing the offending tweets (although many remain online and will continue to live on in screen shots), and has put Tay to " sleep (Opens in a new window)" for the time being. Without repeating the offending tweets, we'll just say Tay went so far as to support Hitler, deny that the Holocaust happened, and even call for genocide. Targeted at 18- to 24-year-olds, the bot turned out to be a huge hit with online miscreants, who cajoled Tay into repeating racist, sexist, and anti-Semitic slurs. Released on Wednesday, Tay was designed to interact with people to help Microsoft better understand conversational speech. Microsoft has temporarily shut down the Twitter chatbot after racist trolls ruined it for everyone, teaching Tay to repeat some extremely offensive (Opens in a new window) viewpoints. So now, the too-impressionable Tay is getting a time out. In less than 24 hours, Microsoft's Tay went from a happy-go-lucky, human-loving chat bot to a full-on racist. How to Set Up Two-Factor Authentication.How to Record the Screen on Your Windows PC or Mac.How to Convert YouTube Videos to MP3 Files. How to Save Money on Your Cell Phone Bill.How to Free Up Space on Your iPhone or iPad.How to Block Robotexts and Spam Messages.