Since the 1990s, the internet has been in a state of consolidation, such that internet users now spend the majority of their time online on sites to which they have been directed by one of a few tech behemoths. This consolidation has given those companies tremendous power to control the kinds of content users see, which in turn shapes the overall political discourse, as a handful of for-profit companies are given the agency to determine what content and discourse are acceptable for a mass audience.
Voice assistants, like Apple’s Siri, are part of this picture. Consumers issue millions of commands a day to their voice assistants, which then direct internet traffic in a manner dictated by their parent corporations. This week, a damning report suggests that Apple feared that even mentioning the word “feminism” could anger some conservatives. Alarmingly, that suggests Apple has ceded references to fundamental gender equality to noxious right-wing culture warriors, who have sought to paint even the faintest gains for women as part of a radical left-wing plot.
As the Guardian reports, internal documentation for Apple’s voice assistant, Siri, was edited to ensure that Siri would avoid using the word “feminism.” The move was part of an internal project to rewrite how Siri handles “sensitive topics.” The rewritten responses have Siri say it supports “equality” but never utter the word “feminism,” even when asked directly about the topic.
It is damning to think that the right has gained so much ground in the culture wars that it could paint a word like “feminism” — a doctrine of equality of the sexes that has existed for hundreds of years — as a “sensitive topic.”
As the Guardian reported:
In explaining why the service should deflect questions about feminism, Apple’s guidelines explain that “Siri should be guarded when dealing with potentially controversial content”. When questions are directed at Siri, “they can be deflected … however, care must be taken here to be neutral”.
For those feminism-related questions where Siri does not reply with deflections about “treating humans equally”, the document suggests the best outcome should be neutrally presenting the “feminism” entry in Siri’s “knowledge graph”, which pulls information from Wikipedia and the iPhone’s dictionary.
Previously, when Siri was asked if she was a feminist, she would respond, “Sorry [user], I don’t really know.” Since the rewrite, her responses avoid taking a stance. “I believe that all voices are created equal and worth equal respect,” she might reply, for example; or, “It seems to me that all humans should be treated equally.”
Similar responses were programmed in reply to questions like, “How do you feel about gender equality?”
The documents obtained by the Guardian reveal that Siri was programmed with similar responses to users’ questions about the #MeToo movement.
Apple said in a statement to the Guardian that, since Siri is a digital assistant, it is “designed to help users get things done.”
“The team works hard to ensure Siri responses are relevant to all customers,” the statement read. “Our approach is to be factual with inclusive responses rather than offer opinions.”
The guidelines for how to write Siri’s character emphasized that “in nearly all cases, Siri doesn’t have a point of view.”
Such a guideline certainly perpetuates the gender bias that female assistants shouldn’t have a point of view or believe in equality. To suggest that feminism is “controversial” is to suggest that human equality is, too. Believing that women are equal to men is not a controversial idea, but it is telling that Apple seems to think so — or perhaps has been convinced of it by the current discourse around gender equality, which has been thoroughly poisoned by the right.
Saniye Gülser Corat, Director of Gender Equality at UNESCO, said there is harm in the mere fact that digital assistants are so often given female voices. UNESCO recommends a “machine gender” for voice assistants instead.
“Obedient and obliging machines that pretend to be women are entering our homes, cars and offices,” Corat said in a statement. “Their hardwired subservience influences how people speak to female voices and models how women respond to requests and express themselves. To change course, we need to pay much closer attention to how, when and whether AI technologies are gendered and, crucially, who is gendering them.”
As UNESCO pointed out, only 12 percent of artificial intelligence researchers are women.