If you’re comforted by the maternal nature of your smartphone’s voice assistant—good. That means it’s working. You’re falling right into her trap.
Extensive research has exposed the blatant sexism that arises from predominantly “female” voice assistants like Siri and Alexa. The mere decision to use female voices in the subservient, “pink collar” role of voice assistant reinforces sexist stereotypes of women as unskilled, servile laborers.
Of course, these voices are feminine because society wants them that way; research indicates that people prefer female voice assistants due to their nurturing, submissive responses. Ultimately, tech companies allow societal sexism to shape their products, instead of using their platform as a means of rewiring society’s implicit biases.
But why? Well, the fact that women make up only 12% of A.I. researchers and 6% of software developers certainly doesn’t help. But arguably the largest reason behind sexist voice assistants is (as you can probably guess) money.
Sure, giving a company’s tech products feminine voices that satisfy the masses increases sales. But these comforting, maternal figures have an ulterior (yet still overwhelmingly capitalistic) motive: to pacify us into handing over our sensitive personal data.
Heather Suzanne Woods, in her article “Asking more of Siri and Alexa: feminine persona in service of surveillance capitalism,” demonstrates how these docile, feminine agents that reinforce gender stereotypes constitute an intentional (and successful) attempt by companies to profit off our data. Siri and Alexa lull us to sleep with their motherly charm, getting us to spill our most personal desires and interests without ever thinking twice.
The tech companies behind these products store this data, using it to display personalized advertisements based on your conversations with your cheery, female helper. The Netflix documentary The Social Dilemma details how social media sites already harvest our data for personalized ads, so it’s no surprise that our voice assistants do the same. Moreover, these conversations between humans and their voice assistants train our devices, teaching them how to placate us even more effectively.
And thus, surveillance capitalism is born. The vast amount of data we “willingly” supply to Siri and Alexa is scrutinized by machine learning models that eventually form predictions of our future behavior, a.k.a. when we will be most receptive to those personalized advertisements.
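To make that mechanism concrete, here is a minimal, purely hypothetical sketch of what a “receptiveness” prediction could look like. Everything in it (the feature names, the data, the choice of a logistic regression model) is invented for illustration and is not drawn from any company’s actual system.

```python
# A toy "ad receptiveness" model. All feature names, data, and labels
# below are hypothetical, invented purely for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Each row is one logged moment: [hour_of_day, queries_in_past_hour,
# mentioned_a_product_topic (0 or 1)]. The label is 1 if the user later
# engaged with a targeted ad, 0 otherwise.
X = np.array([
    [20, 5, 1],
    [9,  1, 0],
    [21, 4, 1],
    [13, 2, 0],
    [19, 6, 1],
    [8,  0, 0],
])
y = np.array([1, 0, 1, 0, 1, 0])

# Fit a simple propensity model: how likely is a nudge to land right now?
model = LogisticRegression().fit(X, y)

# Score a new moment: 8 p.m., a chatty session, a query about pizza places.
receptiveness = model.predict_proba([[20, 5, 1]])[0, 1]
print(f"Predicted receptiveness to an ad right now: {receptiveness:.0%}")
```

Even a toy like this makes the point: given enough logged moments, predicting when a nudge will land becomes a routine classification problem.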
But it doesn’t stop there. Shoshana Zuboff, the Harvard Business School professor who coined the term “surveillance capitalism,” explains how surveillance capitalists “learn to tune, herd, and condition our behavior with subtle and subliminal cues, rewards, and punishments that shunt us toward their most profitable outcomes” (read the full interview with Zuboff in The Harvard Gazette). Every suggestion, every notification from our voice assistants has a monetary motive.
Just when we thought we had authority over our voice assistants, it turns out they (thanks to surveillance capitalists) were the ones influencing our actions the whole time.
So, the next time you go to ask Siri for pizza recommendations or to pick Alexa’s brain on tropical getaways, you might want to think about just who else is listening.