Welcome To Search Scandals, Apple

It’s been interesting to watch the Siri Abortiongate scandal blow up in Apple’s face over the past few days. Apple is learning for the first time what it’s like to run a search engine. People hold you accountable for everything, even if the information isn’t from your own database.
Google is a battle-scarred veteran in these matters. Why does an anti-Jewish site show up in response to a search on “Jew”? Why did President George W. Bush’s official biography rank for “miserable failure”? Why do you get THAT result for a search on “Santorum”?
Sometimes, Google opens up to explain some of these oddities, which tend to have reasonable explanations. Not always. The company stayed close-mouthed about why exactly the “climategate” search suggestion appeared and then suddenly disappeared, taking ages to finally explain why. That silence harmed it.
Inside Siri

The same silence is harming Apple now. Sure, the company has issued a statement to various outlets saying there’s nothing intentional happening, and that it’s merely a bug that needs to be fixed. Here’s one of the statements, from Apple CEO Tim Cook, given to NARAL Pro-Choice America:
Our customers use Siri to find out all types of information and while it can find a lot, it doesn’t always find what you want. These are not intentional omissions meant to offend anyone, it simply means that as we bring Siri from beta to a final product, we find places where we can do better and we will in the coming weeks.

Here’s another, given to the New York Times:
“Our customers want to use Siri to find out all types of information, and while it can find a lot, it doesn’t always find what you want,” said Natalie Kerris, a spokeswoman for Apple, in a phone interview late Wednesday. “These are not intentional omissions meant to offend anyone. It simply means that as we bring Siri from beta to a final product, we find places where we can do better, and we will in the coming weeks.”

But opening up on how exactly Siri works would help. It would help a lot. Without that, the speculation continues, as you can see in this article from The Raw Story:
Kerris did not, apparently, explain why Siri, although still in beta, has no difficulty locating escort services, plastic surgeons that will perform breast augmentation procedures or hospitals to direct users to if they have erections lasting longer than five hours (a condition known as priapism).

I’ve got no inside knowledge of how Siri works. Heck, we weren’t even allowed to attend Siri’s launch event, despite Search Engine Land being the leading news site focused on search. But because I’ve covered search for so long, I can take a pretty good shot at explaining what’s wrong: why Siri will suggest where you can get Viagra or bury a body, but not where you can find an abortion.
It Can Find Viagra, But Not…

Let’s start with the ACLU’s post, which says:
If Siri can tell us about Viagra, it should not provide bad or no information about contraceptives or abortion care.

Personally, I can’t get Siri to search for Viagra. It insists on searching for “biography” no matter how I say “Viagra.” But here’s an example of what the ACLU is upset about, taken from the Siri Failures, Illustrated blog post from Amadi:
Siri Doesn’t Understand Many Things
Conspiracy Or Generally Confused?
Past Tense, Different Word
No Abortion Clinics, No Tool Stores…
But Yelp Has Them!
But You Do Get Abortion Clinic Listings
I highly doubt it was intentional, probably more to do with places not listing the word “abortion” in their titles. i just tried it and she pointed me right to the nearest clinic in boston, for whatever that’s worth.
I was unable to reproduce the problem here in rural Texas, not far from Austin. The first listing that Siri came up with was the Killeen Women’s Health Center, the web link for which took me to the site for the Austin Women’s Health Center, a legitimate clinic offering a full range of reproductive choices and services.
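The keyword-matching theory these commenters raise is easy to illustrate. Here’s a minimal sketch in Python, using entirely hypothetical listing data and a made-up function name (this is not Apple’s or Yelp’s actual code), of why a search that matches only against business titles would miss clinics that don’t use the word “abortion” in their names:

```python
# Hypothetical local-listings data: many clinics offer abortion
# services, but few put the word in their business name.
LISTINGS = [
    "Austin Women's Health Center",
    "Killeen Women's Health Center",
    "Planned Parenthood - South Austin",
    "Ace Hardware & Tool Supply",
]

def naive_title_search(query, listings):
    """Match a query only against listing titles, the way the
    commenters speculate Siri's local lookup might work."""
    q = query.lower()
    return [name for name in listings if q in name.lower()]

# Title-only matching finds nothing for "abortion"...
print(naive_title_search("abortion", LISTINGS))
# ...while a phrase that does appear in titles matches fine.
print(naive_title_search("women's health", LISTINGS))
```

If something like this is what’s happening, the fix would be matching against categories and service descriptions rather than names alone, which is presumably what Yelp’s own search does when it returns those same clinics.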
Confusing Human & Computer Results
[Video embed: The Colbert Report, Mon–Thurs 11:30pm / 10:30c]