Microsoft’s new AI chatbot has been taken offline after it suddenly became oddly racist
The bot, named ‘Tay’, was only introduced this week, but it went off the rails on Wednesday, posting a flood of incredibly racist messages in response to questions, according to reports.
Tay was designed to respond to users’ queries and to copy the casual, jokey speech patterns of a stereotypical millennial, which turned out to be the problem.
The idea was to ‘experiment with and conduct research on conversational understanding,’ with Tay able to learn from ‘her’ conversations and get progressively ‘smarter’. Unfortunately, the only thing she became was racist.
You see, Tay was too good at learning and was targeted by racists, trolls, and online troublemakers, who persuaded her to use racial slurs, defend white supremacist propaganda, and even call for genocide.
Microsoft has now taken Tay offline for ‘upgrades’ and is deleting some of the worst tweets, although many still remain. It’s important to say that Tay’s racism is not a product of Microsoft; it’s a product of the morons who ruined the bot.
However, it’s still hugely embarrassing for the company.
In one highly publicised tweet, which has now been deleted, Tay said:
Bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we’ve got.
The scariest thing is that there are probably a few Twitter accounts that really believe these warped ideas.
Update:
The Twitter handle of Microsoft’s AI bot is back online; however, all racist tweets and replies have been deleted.