
Microsoft debuts another chatbot, Zo, which refuses to talk politics

CIOL Writers

Though Tay was a disaster, Microsoft isn’t done with chatbots just yet. The company is testing another bot, christened Zo, on the social messaging app Kik, according to a report by technology website MSPoweruser.


Interestingly, the new bot prefers not to talk politics and steers clear of racism. Zo is essentially an English-language version of Microsoft’s Chinese bot Xiaoice, and it aims to come off as casual without touching on any controversial topic.


According to MSPoweruser, when asked about President-elect Donald Trump, Zo responded: “People can say some awful things when talking politics so I don’t discuss.” Similarly, when asked whether Hillary Clinton is a crook, Zo said, "Maybe you missed that memo but politics is something you're not supposed to casually discuss."
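Microsoft has not disclosed how Zo decides to dodge these questions. A minimal sketch of one plausible approach, a keyword-based topic filter that returns a canned deflection before any reply is generated, might look like the following. The keyword lists and function names here are hypothetical illustrations, not Microsoft's actual implementation:

```python
from typing import Optional

# Hypothetical blocked-topic keywords; Zo's real filter is not public.
BLOCKED_TOPICS = {
    "politics": ["trump", "clinton", "election", "president"],
}

# Canned replies, modeled loosely on the responses quoted above.
DEFLECTIONS = {
    "politics": "People can say some awful things when talking politics so I don't discuss.",
}

def deflect_if_blocked(message: str) -> Optional[str]:
    """Return a canned deflection if the message touches a blocked topic,
    or None so the message can flow to the normal response pipeline."""
    lowered = message.lower()
    for topic, keywords in BLOCKED_TOPICS.items():
        if any(word in lowered for word in keywords):
            return DEFLECTIONS[topic]
    return None

print(deflect_if_blocked("Is Hillary Clinton a crook?"))
```

In practice a production bot would likely use a trained topic classifier rather than a keyword list, but the effect described in the report is the same: the political question never reaches the conversational model.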


Though many companies already employ chatbots as help-desk assistants on their websites, the recent push in artificial intelligence has made them considerably more powerful.

Microsoft also plans to build enterprise bots and already features some of its own in its Bot Directory, but Zo and Tay belong to a series of AI-powered social assistants the company has rolled out in various parts of the world.

At this point, there is little clarity on Microsoft’s purpose for Zo or why it chose to release the bot on only one messaging app. In March, Microsoft tested a chatbot called Tay on Twitter, but Internet miscreants drew it into a series of offensive conversations. The more racist and unpleasant exchanges Tay absorbed, the more it began to speak to others in a similarly offensive manner, which led Microsoft to shut down the experiment and apologize.
