Remember Microsoft’s last AI chatbot that earned the wrath of the internet after it went rogue? Well, that doesn’t seem to have discouraged Microsoft from experimenting with chatbots, and the company has already come up with yet another one called ‘Ruuh.’ The chatbot is available only to users in India and supports only English.
Microsoft filed a trademark for Ruuh on March 15, according to which the chatbot’s areas of interest will revolve primarily around chatting, Bollywood, music, humour, travel and browsing the internet. Tay.ai was Microsoft’s first chatbot, created by the Microsoft Research and Bing teams and designed with 18- to 24-year-olds in mind. Events took an ugly turn when users trained Tay to act racist, and it was eventually taken down by Microsoft. However, in December Microsoft was back with yet another chatbot called Zo, and this time around the program was invite-only and far more restricted.
Microsoft has been bullish about its AI efforts: last year it formed a new AI and Research group led by Harry Shum, and joined the Partnership on AI alongside IBM, Google, Facebook and Amazon. Microsoft also has a dedicated page for its project in India on creating text-messaging chatbots. The excerpt from the page reads as follows:
“As text-messaging chatbots become increasingly ‘human’, it will be important to understand the personal interactions that users are seeking with a chatbot. What chatbot personalities are most compelling to young, urban users in India? To explore this question, we conducted Wizard-of-Oz (WoZ) studies with users that simulated interactions with a hypothetical chatbot. Participants were told that there might be a human involved in the chat, although the extent to which the human would be involved was not revealed. We synthesize the results into a set of recommendations for future chatbots.” Ruuh is currently available as a chatbot on Facebook Messenger and you can check it out here.