Experiencing Bots in Everyday Life

The world we live in already relied heavily on digital media and technology, but the digital era has become even more prominent amid the COVID-19 pandemic. Now more than ever, we depend on technology to work, to learn, to read, and to be entertained.

Artificial Intelligence (AI) exists all around us, even when we don't realize it. Ilija Mihajlovic, an iOS developer and computer science graduate, discusses the impact bots have on our everyday lives in his article "How Artificial Intelligence Is Impacting Our Everyday Lives." He states, "AI assists in every area of our lives, whether we're trying to read emails, get driving directions, get music or movie recommendations" (Mihajlovic, 2).

One place we encounter bots in our daily lives is through digital assistants. Popularized on the iPhone by the well-known Siri, digital assistants have since appeared on a variety of platforms: Amazon's Alexa, Google Now, Microsoft's Cortana, and more. While these assistants live on different devices and services, they all serve the same purpose: to assist.

Although they can be annoying, most people have encountered the security checks required to enter many websites at least once. These checks exist to answer one question: "Are you a robot?" We take the tests to prove to the software running the sites that we are human. Such tests are known as CAPTCHAs, short for "Completely Automated Public Turing test to tell Computers and Humans Apart."

Image: HackTX 2018 Puzzle 3: Shopping Cart, by Florian Janke, on Medium

Some of the security tests we encounter include the image challenge shown above, where you have to select all the cats; a combination of letters and numbers that you must type into an answer box; or simply a checkbox labeled "I'm not a robot." Bots may already be grading and even writing first drafts of our papers (according to McKee and Porter), yet they still struggle to tell a cat from a dog.
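The letters-and-numbers test described above can be sketched as a toy text CAPTCHA. This is a minimal illustration, not how any real service like reCAPTCHA works internally (real systems render the challenge as a distorted image and add behavioral signals); the function names here are hypothetical.

```python
import secrets
import string


def make_challenge(length: int = 6) -> str:
    """Generate a random letters-and-numbers challenge string."""
    alphabet = string.ascii_uppercase + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))


def verify(challenge: str, answer: str) -> bool:
    """Check the user's typed answer against the challenge, ignoring case and whitespace."""
    return answer.strip().upper() == challenge.upper()


challenge = make_challenge()
print(challenge)  # e.g. "K3F9ZD" -- in a real CAPTCHA this would be a distorted image
print(verify("A1B2C3", "a1b2c3"))  # a correct answer passes
```

The point of the distortion in real CAPTCHAs is precisely the gap the paragraph above describes: the string comparison is trivial for a machine, but reading warped text out of a noisy image (or picking out the cats) is still hard for bots.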

As our world continues to make technological advances, the role of bots in our everyday lives will only grow. Heidi McKee and Jim Porter discuss the roles that bots and AIs will take on in the near future in their article "The Impact of AI on Writing and Writing Instruction." Bots will eventually be used in the workplace and in our classrooms; the question is: is it really such a bad thing for our society to rely on bots as we go about our day?

Siri the Spy: How Surveillance Capitalists Reinforce Sexist Stereotypes and Harvest Our Data Through Voice Assistants

If you’re comforted by the maternal nature of your smartphone’s voice assistant—good. That means it’s working. You’re falling right into her trap.

A wide body of research has exposed the blatant sexism embedded in predominantly "female" voice assistants like Siri and Alexa. The very decision to use female voices in the subservient, "pink collar" role of voice assistant reinforces sexist stereotypes of women as unskilled, servile laborers.

Of course, these voices are feminine because society wants them that way; research indicates that people prefer female voice assistants due to their nurturing, submissive responses. Ultimately, tech companies allow societal sexism to shape their products, instead of using their platform as a means of rewiring society’s implicit biases.

But why? Well, the fact that women make up only 12% of AI researchers and 6% of software developers certainly doesn't help. But arguably the biggest reason behind sexist voice assistants is (as you can probably guess) money.

Sure, giving a company's tech products feminine voices that satisfy the masses increases sales. But these comforting, maternal figures have an ulterior (yet still overwhelmingly capitalistic) motive: to pacify us into handing over our sensitive personal data.

Heather Suzanne Woods, in her article “Asking more of Siri and Alexa: feminine persona in service of surveillance capitalism,” demonstrates how these docile, feminine agents that reinforce gender stereotypes constitute an intentional (and successful) attempt by companies to profit off our data. Siri and Alexa lull us to sleep with their motherly charm, getting us to spill our most personal desires and interests without ever thinking twice.

The tech companies behind these products store this data and use it to display personalized advertisements based on your conversations with your cheery, female helper. The Netflix documentary The Social Dilemma details how social media sites already harvest our data for personalized ads, so it's no surprise that our voice assistants do the same. Moreover, these conversations train our devices, teaching them how to placate us even more effectively.

And thus, surveillance capitalism is born. The vast amount of data we “willingly” supply to Siri and Alexa is scrutinized by machine learning technology, eventually forming predictions of our future behavior—AKA when we will be most receptive to those personalized advertisements.

But it doesn’t stop there. Shoshana Zuboff, the Harvard Business School professor who coined the term “surveillance capitalism,” describes how surveillance capitalists “learn to tune, herd, and condition our behavior with subtle and subliminal cues, rewards, and punishments that shunt us toward their most profitable outcomes” (read the full interview with Zuboff in The Harvard Gazette). Every suggestion, every notification from our voice assistants has monetary motivations.

Just when we thought we had the authority over our voice assistants, it turns out they (thanks to surveillance capitalists) were the ones influencing our actions the whole time.

So, the next time you go to ask Siri for pizza recommendations or to pick Alexa’s brain on tropical getaways, you might want to think about just who else is listening.

The image is divided into two halves. On the left, a black background surrounds a spherical speaker, commonly known as an Amazon Alexa device. A blue ring is lit up around the top edge of the device. On the right side of the image, a blue background surrounds the symbol associated with Apple's voice assistant Siri: a bluish-white, disfigured star, with the brightest white in the middle of the shape. The disfigured star is surrounded by an incomplete pink circle.