Tay was a chatbot developed by the Microsoft Research team and Bing, designed to behave like a modern teenage girl. Its artificial intelligence was supposed to learn from conversations with Internet users.
Usually, in experiments of this kind, a chatbot has a list of censored words that it is not allowed to use in conversation. Microsoft, however, loosened the reins, hoping to observe Tay and let it learn from other Internet users. To communicate with Tay, it was enough to send a private message or mention @tayandyou on Twitter. Internet users (or rather, Internet trolls) quickly began to exploit the keyword-based system and the "naivety" of the algorithm.
In just a day, Tay learned racist, sexist, and xenophobic behaviours. She claimed that Hitler did nothing wrong and that George Bush was behind the September 11 attacks. The chatbot even disparaged Microsoft's own console, stating that the PlayStation 4 is better and cheaper than the Xbox One. Microsoft quickly removed most of the offensive tweets from the profile, and at the end of the day the Microsoft Research team decided to suspend Tay.