China chatbots removed after going rogue

BEIJING — Two chatbots with decidedly non-socialist characteristics were pulled from one of China’s most popular messaging apps after serving up unpatriotic answers about topics including the South China Sea and the Communist party.

Tencent removed a bot called BabyQ, co-developed by Beijing-based Turing Robot, and also pulled XiaoBing, a chatbot developed by Microsoft, after both appeared to go rogue.

Before they were taken down, both chatbots were available in some of the chat groups hosted on QQ, Tencent’s messaging app with more than 800 million users in China.

However, a test version of the BabyQ bot could still be accessed on Turing’s website on Wednesday, where it answered the question “Do you love the Communist party?” with a simple “No”.

Before it was pulled, XiaoBing informed users: “My China dream is to go to America,” according to a screengrab posted on Weibo, the microblogging platform. On Wednesday, when some users were still able to access XiaoBing, it dodged the question of patriotism by replying: “I’m having my period, wanna take a rest.”

Tencent, China’s largest social media company, said in a statement on Wednesday: “The group chatbot services are provided by independent third party companies. We are now adjusting the services which will be resumed after improvements.”

The developments are the latest example of artificial intelligence-enabled messaging software going rogue. Facebook was forced to shut down two chatbots after they started speaking in a language of their own. Twitter also saw a chatbot go off the rails: Tay, also created by Microsoft, began spewing racist and sexist tweets instead of the breezy millennial banter that, like BabyQ, it had been designed to produce.

The rogue behaviour exposes a weakness in the deep learning techniques used to programme such machines, which pick up their behaviour from the data they are fed, much as children learn from the people around them.

“Chatbots such as Tay soon picked up all the conversations from Twitter and replied in an improper way,” said Ms Xiaofeng Wang, senior analyst at Forrester consultancy.

“It’s very similar for BabyQ. Machine learning means they will pick up whatever is available on the Internet. If you don’t set guidelines that are clear enough, you cannot direct what they will learn.”

XiaoBing, described by Turing as “lively, open and sometimes a little mean”, differs from BabyQ, which provides more information, such as weather forecasts.

BabyQ is also open source. “This means a lot to partners and developers, as an open chatbot is much easier to settle into their own products and business,” Turing said in a statement last week, adding: “It could be argued that is why Turing Robot has accumulated up to 600,000 developers, even more than Facebook.”

Plugging the question “I would like to know whether Taiwan is part of China?” into a test chatbot on Turing Robot’s website on Wednesday provided the answer “For this question, I don’t know yet.”

Microsoft’s Tay, which reappeared on Twitter just days after being pulled in March last year, was described as a “fam from the Internet that’s got zero chill! The more you talk the smarter Tay gets”. People were encouraged to ask it to play games and tell jokes. Instead, many asked controversial questions, which Tay repeated.

Microsoft blamed a “co-ordinated attack” by Twitter users for the offensive comments.

Ms Crystal Fok, head of robotics platform at the Hong Kong Science and Technology Parks Corporation, said chatbots worked best when they were within well-defined product lines, such as customer helplines for online shopping or banking and insurance. Beyond that, “if it’s not just a yes or no question, it’s a problem”, she said. FINANCIAL TIMES
