Microsoft's 'Zo' Chatbot Picked Up Some Offensive Habits
Although Microsoft programmed Zo to avoid politics and religion, the folks at BuzzFeed News managed to get the bot to weigh in on those restricted topics, with surprising (and highly controversial) results. In one exchange, Zo called the Qur'an "very violent." It also opined on the death of Osama Bin Laden, claiming his "capture" came after "years of intelligence gathering under more than one administration." Microsoft says the errors in the bot's behavior have since been corrected.
Just last year, Microsoft's Tay bot went from emulating the tone of a supposedly hip teenager to spouting racist tirades within the span of a day. To make matters worse, the entire debacle unfolded publicly on Twitter, forcing Microsoft to pull the bot offline. As a result, the company kept Zo within the confines of the messaging app Kik and its mid-sized user base. But it seems the chatbot still managed to pick up some bad habits.
Microsoft blamed Tay's downfall on a coordinated effort by select users to corrupt the bot, but it says no such attempt was made to derail Zo. The chatbot remains available on Kik, and Microsoft says it has no plans to disable it.