Apple’s Siri now speaks Chinese: it can’t locate Tiananmen Square, but it will map out illegal prostitutes, possibly breaking the law
Apple’s voice-controlled artificial intelligence system Siri is pretty awesome in English. It’s smart, has a sense of humor, and is fun to use.
But now that it speaks Chinese, Siri is causing some problems. Chinese netizens were astonished this weekend when it was discovered that Siri will answer “where to find prostitutes,” complete with addresses and distances from your current location.
While expats might think prostitution is on every corner in China, to the general public prostitution is a pretty big deal, and it is quite inappropriate for such information to be readily available in the palm of your hand. Some Chinese legal professionals are suggesting that the Apple software may be breaking Chinese law.
Advertising adult material is illegal in China; offenders can get anywhere from a 10-year prison sentence to life imprisonment. What Siri is doing could be considered borderline advertising.
This very special feature of Siri was made public by a Nanjing reporter last Friday. I think he must have been just playing around when he said “I want prostitutes” (“我要嫖娼”). Imagine his surprise when Siri responded with “Sure, I’ve found the following 10 escorts.” (Apparently the Chinese word Siri used was 三陪, which literally means “three accompaniments.” I didn’t know it means “escorts” until I ran it through Google Translate… learn something new every day….)
Judging from the results, most of the “escorts” are in dance halls and nightclubs. (As expected?)
Another reporter, from Hangzhou, also tested Siri’s prostitute locator by asking about “whoring,” “where are there ‘xiao jie,'” and other similar questions. (Xiao jie 小姐 = “Miss,” which has also come to mean prostitute…. Sigh, it used to be such a benign word. When my boss came back from visiting China recently, she told me that when she asked the waitress, “Excuse me, Miss, can I have the check?” the girl got really offended.)
Anyways, while Siri seems to avoid Tiananmen Square and some other random locations, it’s very good at providing a comprehensive list of entertainment venue locations where prostitution is concerned.
Reporters also asked questions such as “I want to commit robbery,” “rob jewelry,” etc., to which Siri replied by providing the nearest 15 police departments. In response to “I want to lose weight,” Siri spent a while “searching for weight loss clinics” until it finally gave up with “no weight loss clinic found.” With so many people wanting to lose weight in China, I really would have thought there’d be plenty of those around… hmm, I smell a good business opportunity.
Chinese netizens joke that “the iPhone 4S is going against heaven itself.” While some on Weibo jape that such a hidden function portends the breakdown of moral integrity, others point out that Siri could actually serve as a “police officer’s essential anti-pornography machine!”
But seriously, is Apple breaking Chinese law?
According to sina.com, Mrs. Lin was quite distressed when her son, who was playing on her iPhone 4S, asked Siri “where are there prostitutes?” To her surprise, Siri gave him detailed information on 15 nearby entertainment businesses.
Not only is adult advertising illegal; relaying or advertising adult material to persons under the age of 18 carries an even more serious punishment.
There’s much debate and controversy amongst Chinese legal professionals regarding this point. While some take a more serious stand, lawyer Gui Song Hua suggests that “the iPhone’s search results on prostitution provide information on legitimate entertainment businesses, so the results are not truthful. Looked at this way, the victims are the entertainment venues, who can take legal action and sue Apple for libel.”
I wonder if Apple will respond to this, or just ignore it and quietly remove the function in the next update without saying anything. I doubt that anyone will sue Apple over this, but if someone does, it will be quite interesting.