GoingCellular.com
http://goingcellular.com/google/googles-nexus-one-censorship-extends-further-than-cuss-words-442178/
by Michelle L on February 1, 2010
Google’s Nexus One censorship extends further than cuss words
News of Google’s Nexus One censoring swear words has been all over the Internet for the past week or so. The phone’s voice-to-text feature doesn’t transcribe cuss words, instead replacing them with hash marks. In a personal experiment, author Neil Gaiman found that the built-in censor could be circumvented by following the swear word with the words “dot com.” But it seems the Nexus One is much more prudish than initially thought. Boston resident Zechariah-Aloysius Hillyard discovered that Google went to great lengths to avoid offending people, also censoring references to certain works of literature.
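For readers curious what that behavior might look like in practice, here is a purely illustrative sketch of a blocklist-style filter that masks flagged words with hash marks unless they’re followed by “dot com.” This is not Google’s actual implementation; the word list and the exemption rule are assumptions based only on the behavior reported here.

```python
# Illustrative only -- NOT Google's code. Mimics the reported behavior:
# blocklisted words become hash marks unless followed by "dot com".
BLOCKLIST = {"damn", "hell"}  # placeholder words for illustration

def filter_transcript(words):
    """Replace blocklisted words with hash marks, unless followed by 'dot com'."""
    filtered = []
    for i, word in enumerate(words):
        is_blocked = word.lower() in BLOCKLIST
        followed_by_dot_com = words[i + 1:i + 3] == ["dot", "com"]
        if is_blocked and not followed_by_dot_com:
            filtered.append("#" * len(word))
        else:
            filtered.append(word)
    return " ".join(filtered)

print(filter_transcript("well damn that is neat".split()))     # well #### that is neat
print(filter_transcript("damn dot com is a website".split()))  # damn dot com is a website
```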
According to CNET, Hillyard put the Nexus One through an experiment of his own, trying out several combinations of swear words, all of which were censored by the device and replaced with the now-familiar hash marks. So he took it a step further and, using the voice search feature, tried to search for Vladimir Nabokov’s novel Lolita. “Nabokov” went through just fine, but “Lolita” became a series of hash marks, even with SafeSearch turned off.
He kept trying other words and combinations, with increasingly surprising results. A search for “incest” turned into hash marks, but “bestiality” went through just fine. Really? Google has a problem with Lolita, but not bestiality?
In that light, Google’s initial statement to Reuters looks less and less satisfactory. A spokeswoman said:
“We filter potentially offensive or inappropriate results because we want to avoid situations whereby we might misrecognize a spoken query and return profanity when, in fact, the user said something completely innocent.”
Does this mean Google thinks there are words out there that the Nexus One’s software might misrecognize as “incest,” but nothing it could mistake for “bestiality”? How did they come to that conclusion? What kinds of tests at the Googleplex resulted in this selective censorship? Did they have a linguistics expert on hand? Or was it some poor tech’s job to sit at a computer and try to think of every possibly naughty word people might say? Did he just miss bestiality?
CNET has been trying to elicit a second, hopefully more detailed response from Google, but so far, the only thing they’re saying is ####.