Virtual assistant default settings have misogynistic undertones

Image by: Katharine Sung

Virtual assistants like Apple’s Siri and Amazon’s Alexa are a part of daily life nowadays. Yet despite how often we interact with this technology, the fact that almost all virtual assistants default to a female voice tends to go unnoticed.

Understandably, tech companies develop virtual assistants based on what psychology research says people respond to best: female voices. That preference has, in turn, given companies a more extensive library of female voice recordings to draw from.

However, technology doesn’t exist in a vacuum, and the implications of design choices matter. Even if tech is inherently genderless, this is a gendered issue.

Artificial intelligence products like Siri exist to serve people, and that’s where the default female gender of virtual assistants becomes problematic. Regardless of intention, it verges on female objectification, especially when virtual assistants with female voices are sexualized and demeaned for their users’ entertainment.

Women’s objectification has a long history in science fiction and technology fields, with female characters commonly appearing as sexualized machines and cyborgs. Asian women are especially prone to fetishization and orientalist portrayals that contribute to their oppression. 

This pattern of objectification legitimizes the dehumanization of women, both in the media and in everyday life, and is continued by the modern design of virtual assistants.

Having a female virtual assistant reinforces tropes that women have fought to overcome to be taken seriously at work and in their relationships. 

A misogynistic preference for docility in women is built into virtual assistant programming. The same docility exists in male-voiced virtual assistants, but historically, men haven’t had it imposed upon them.

You can harass virtual assistants without consequence, which makes them ideal targets for misogynistic abuse. Worse still, they’ll never respond with agency or stand up for themselves.

Yes, calling Siri a “dirty whore” for fun because she can’t defend herself is misogynistic. 

Fixating on possessing a female virtual assistant and on her obedience to you can be harmful, and it’s something we should actively discourage. As AI evolves, the concern is that the abuse now reserved for Alexa and Siri will carry over to real women.

Of course, technology itself is inherently genderless, but it’s in our nature to anthropomorphize objects. Doing so makes them better products because we relate to them more, which isn’t a bad thing in itself.

Automatically assigning female genders to objects designed to serve their users is the problem.

If non-gendered tech isn’t marketable, maybe the female default makes good business sense—but that doesn’t mean it’s good for society. 

As technology expands and occupies more and more of our lives, we have to think about the prejudices and biases built into our tech by human programmers. 

While there are more pressing gender-equality issues to address, the fact that our technology reflects and perpetuates the misogyny in our society merits discussion.

At this point, it’s much too late to remove the gender bias baked into our technology. However, we can still take steps to reduce the perpetuation of inequality by the technology industry and its products. 

More than anything, the tech industry needs diversity to combat its prejudice and covert misogyny.

—Journal Editorial Board


