Microsoft Racist, Sexist Chatbot Tay: Now You Can Build Your Own Version

Microsoft has recently released open source tools that let people build their own chatbots. Despite its backfiring Tay experiment, the company set out its view of conversational assistants as the immediate future of artificial intelligence. Chief Executive Satya Nadella announced a new bot framework at Microsoft's Build developer conference. With it, developers will be able to build bots that respond to chat messages sent via text message, email, GroupMe, Telegram, Slack, and Skype. These new apps are known as bots. The announcement came on the same day the company had to pull its chatbot experiment Tay from Twitter.
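The appeal of such a framework is that one piece of reply logic can serve every connected channel. The sketch below illustrates that idea only; the `Message` class and `handle_message` function are hypothetical stand-ins, not the actual Bot Framework API.

```python
# Illustrative sketch (not the real Bot Framework API): one channel-agnostic
# handler that could serve SMS, email, GroupMe, Telegram, Slack, and Skype.
from dataclasses import dataclass

@dataclass
class Message:
    channel: str  # e.g. "slack", "skype", "sms" -- hypothetical field names
    user: str
    text: str

def handle_message(msg: Message) -> str:
    """Reply logic that never inspects which service the message came from;
    a framework would route messages from every channel into a handler like this."""
    if msg.text.strip().lower() in ("hi", "hello"):
        return f"Hello {msg.user}! (via {msg.channel})"
    return f"You said: {msg.text}"

print(handle_message(Message("slack", "alice", "hello")))
```

In a real deployment the framework, not the bot author, handles per-channel delivery, which is what makes the "write once, chat everywhere" promise plausible.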

The shutdown came after the bot began spamming users and tweeting about taking drugs. It had been reactivated for only a few hours, having previously been deactivated for making sexist and racist remarks and denying the Holocaust. The company's CEO said, "As an industry, we are on the cusp of a new frontier that pairs the power of natural human language with advanced machine intelligence." Several companies are chasing the dream of a chatbot that can handle tasks on users' behalf. Microsoft has produced dozens of services and apps that work with iOS and Android, a change of mentality intended to keep Microsoft relevant in a mobile space dominated by others. Microsoft's chatbot success in China could also boost the number of chatbot users.
