Chances are, you have an Alexa at home. So do I. So, this post is for you.
A UN study has found that voice assistants such as Alexa and Siri are designed to be excessively servile, perpetuating gender stereotypes about women. The report recommended that companies stop making digital voice assistants female by default, and explore gender-neutral options.
They are programmed to be submissive and servile – including politely responding to insults – meaning they reinforce gender bias and normalise sexist harassment, said researchers from the U.N. scientific and cultural body UNESCO.
The study highlighted that Siri was previously programmed to respond to users calling her a “bitch” by saying “I’d blush if I could” as an example of the issue.
“Siri’s submissiveness in the face of gender abuse – and the servility expressed by so many other digital assistants projected as young women – provides a powerful illustration of gender biases coded into technology products,” it said.
Earlier this year, a team of technologists at EqualAI developed Q, a gender-neutral voice assistant, to promote gender equality in technology and to address concerns about sexism. A Quartz article explained why genderless technology such as Q matters:
Adding a voice like Q’s to a menu of audio options would address more than one ethical dilemma. As Q articulates in its introductory recording, it would make tech more inclusive by recognizing people who identify as non-binary, a population that’s becoming increasingly visible as social norms change. “It’s because Q is likely to play with our minds that it is important,” Kristina Hultgren, a linguist who was not part of the project, told Wired. “It plays with our urge to put people into boxes and therefore has the potential to push people’s boundaries and broaden their horizons.”
Wide adoption of a genderless voice would also pave the way for some much needed women’s liberation among AI assistants, which are infiltrating our lives at a rate that has even surprised industry analysts. Currently, all of the major digital voices who answer our questions about the weather, or provide the exchange rate between the peso and a dollar, or remind us to make a phone call, are undeniably feminine, even though their makers claim the bots are genderless.
Yet, however big a deal Q might be, gender bias in technology can't be fully removed until diversity and inclusion improve across technical (AI included), creative, and leadership roles. Until then, robots and digital assistants will continue to reflect what they learn from human behavior.