Last week, Jorge Rivas reported on the discovery of some shortcomings of Apple’s "intelligent assistant" Siri; as feminist bloggers discovered soon after the iPhone 4S’s release, Siri is pre-equipped with quips on Viagra, brothels, and dead bodies, but is stumped by woman-centric issues like birth control and rape. And while Siri isn’t billed as much more than entertainment, it’s hard to forget that this is the same Apple that blocked an app exposing its sweatshop-labor practices, and that posed as ICE agents to intimidate an employee. In other words, Apple (and the rest of Silicon Valley) has a bad record on respecting women and people of color, even as those demographics buy smartphones in record numbers.
What’s the duty of a corporation to a community’s traditionally vulnerable groups — and who draws the lines? As corporations, rather than elected governments, become the stewards of everything from our privacy to our prisons, it’s a question that must be asked. Here’s what you had to say.
I wanna know who at Apple is responsible for Siri’s programming monstrosity. The damn software is literally making snide comments about rape. Responding to a survivor’s coming out with a dismissive "Is that so?" is likely to get your ass kicked in real life — especially with the survivors I know.
Plus — Viagra, brothels, and hiding dead bodies? WTF? This kind of software design has me longing for the days of employee morals clauses. I can only imagine what some iPhone programmers are doing in their spare time.
Siri is not Google. Have you seen Siri’s Tumblr page, where people ask ridiculous questions and get humorous answers? Honestly, who would tell Siri they got raped? I get the point that Siri should include health problems regarding all genders, but men’s sexual health problems — such as the use of Viagra — tend to be the butt of jokes (which is sexist because it implies men shouldn’t have sexual problems). However, Siri is not a lifeline service, and, judging by its responses, I don’t think it takes itself seriously.
Note that Siri is not magic but artificial intelligence that detects keywords. I really don’t think this Siri issue is an attack on women. We’ve already seen Siri answer questions about a woman’s health when phrased differently — which, YES, can make a difference. On the other hand, Siri should not be used in case of an emergency, so it’s probably a good thing if the developers avoided triggering subjects.
On a side note, people have reported that Siri can have trouble finding pizza parlors, so there you go.
[…] However, blogs bringing up Siri’s technical issues will hopefully influence Apple to expand the range of women’s issues that Siri can recognize.
WHO WOULD SERIOUSLY PICK UP A PHONE AND TELL SIRI THEY WERE RAPED, ROBBED, OR HIT? That would be the LAST thing on my mind. Why are people mad that Siri doesn’t direct them to the obvious? If you were raped — male or female — common sense would tell you to call the police and get the situation handled. Come on, people, you can’t rely on technology for everything! You should only rely on Siri for what it was built for: to schedule reminders, find restaurants, check the weather, add events to your calendar, organize contacts… NOT SERIOUS LIFE ISSUES!
Services like this are never impartial — but this is a particularly dangerous example. We’re not talking about one local restaurant paying for a top listing so their competitors don’t get as many customers… we’re talking about access to health care and emergency services. Apple better figure this out pretty damned quick and make it right.
And as @IAmMikeBrowne summarized the article on Twitter:
Anyone confused about the term #structuralsexism should check it out.
Each week, we round up the best comments in our community. Join the conversation here on Colorlines.com, and on Facebook and Twitter.