

Siri. Alexa. Google Home. Cortana. What do they have in common?

They’re all digital personal assistants. And they all present as women.

Digital assistants have assumed the voice and stereotypical role of women. With their comforting, soothing, non-threatening voices, feminized digital assistants help people all over the world make calls, play music, find directions and much more.

They’re incredibly convenient.

But, in order to maintain feminist integrity, it’s important to question why digital assistants assume a female voice and identity. It’s time to reflect on the implications feminized digital assistants have on our culture. 

The history of assistance 

Historically, women worked under the power and control of their male bosses. Deeply ingrained gender roles worked to keep women in their place, under the thumb of male domination.

Feminist scholars have identified gender discrimination and segregation as a fundamental part of professional organization and hierarchy. Historically thought to be inferior to men, women were, and still are, relegated to positions of servitude.

Not only in the workplace but also at home, many women are still expected to cater to the needs of family. Cooking, cleaning and assisting others is still considered, explicitly or not, a “woman’s job.”

Unfortunately, the history of strict gender roles is hard to shake.

Gendering digital assistance

These assistants may sound like women, but women they are not. Digital assistants haven’t cognitively experienced the hurt and suffering caused by sexism. Nor do digital assistants possess the lived perspective of a real-life woman.

“Female” digital assistants don’t have feelings that can be hurt, a body that can be harmed or a mind that can fight back. The assignment of feminine names and voices to digital assistants offers users control and domination over virtual “women” without consequences. And while Siri and Alexa don’t have feelings or emotions, real women who suffer the brunt of sexism do.

Feminized digital assistants, therefore, provide society with a consequence-free outlet to continue the subordination and commodification of women.

What’s more, these assistants don’t need a gender at all. Postmodern feminist scholar Donna Haraway famously argued that robots have the potential to represent a third, non-binary gender. Unfortunately, the tech industry failed to take advantage of the opportunity to further normalize non-binary gender through virtual intelligence.

Rather, tech giants such as Amazon, Apple and Google continue to gender their digital assistants.

Further sexism in technology

A spokesperson for Amazon reportedly told PC Magazine that the company chose the female voice in response to audience preference. While audiences may prefer a female voice for service-oriented artificial intelligence, studies suggest that a male voice is more frequently used for prestigious research projects.

Not only does this virtual gender dichotomy reflect a culture of sexism, but it works to further fix women within strict gender roles.

So next time you’re using your handy dandy digital assistant, think about why it sounds the way it does and what that means.

And if you’re really trying to “smash the patriarchy,” why not go ahead and change that default female voice to a male one? See for yourself what it’s like to be served by a virtual man.

Edited by Shahrazad Encinias