Soulless Assistant

Google and Peerless Insights (2017) reported that 41% of users felt that their voice-activated speakers were like another person or friend.
According to robotic intelligence company Robin Labs, at least 5% of digital assistant inquiries are sexually explicit in nature.

How We View Them

“Voice assistants play a unique role in society; as both technology and social interactions evolve, recent research suggests that users view them as somewhere between human and object. While this phenomenon may somewhat vary by product type — people use smart speakers and smartphone assistants in different manners — their deployment is likely to accelerate in coming years.”

AI Women

“Women are more likely to both offer and be asked to perform extra work, particularly administrative work — and these ‘non-promotable tasks’ are expected of women but deemed optional for men. In a 2016 survey, female engineers were twice as likely, compared to male engineers, to report performing a disproportionate share of this clerical work outside their job duties.”

AI Race

“In her book Race After Technology, Princeton professor Ruha Benjamin described how apparent technology glitches, such as Google Maps verbally referring to Malcolm X as ‘Malcolm Ten,’ are actually design flaws born from homogeneous teams.”

Voice Recognition Errors

Emily Couvillon Alagha et al. (2019) found that Google Assistant, Siri, and Alexa varied in their ability to understand user questions about vaccines and provide reliable sources. The same year, Allison Koenecke et al. (2019) tested the abilities of common speech recognition systems to recognize and transcribe spoken language and discovered a 16 percentage point gap in accuracy between Black participants’ voices and White participants’ voices.

Our platonic relationships can be just as nourishing, intimate, and meaningful as our romantic ones.

Anthropomorphic Engagement

Your iPhone unlocks with your thumbprint
Your Siri & Xbox respond to your voice
Your Windows 10 laptop unlocks after scanning your face
Your Google & Facebook feeds show ads based on what you’ve searched for in the past

AI Women (by Caitlin Chin and Mishaela Robison, 2020)

One particular area deserving greater attention is the manner in which AI bots and voice assistants promote unfair gender stereotypes. Around the world, various customer-facing service robots, such as automated hotel staff, waiters, bartenders, security guards, and child care providers, feature gendered names, voices, or appearances. In the United States, Siri, Alexa, Cortana, and Google Assistant — which collectively account for an estimated 92.4% of the U.S. market for smartphone assistants — have traditionally featured female-sounding voices.

Sexism Towards AI


Implicit Bias 101

Our brains produce biases the way Google’s autofill search bar produces what it thinks we’re going to type. Eyes see dark skin… brain’s autofill predicts thug/aggression.

Existence & Essence

As entities with free will, we create meaning for ourselves while we’re alive. To quote Simone de Beauvoir’s lifelong bae, Sartre: “existence precedes essence.”

Given the increased usage of Replika during the pandemic, it’s easy to understand why people would hold a funeral for their robots.

Algorithmic Self

Nick Clegg (May 31, 2021)

“Imagine you’re on your way home when you get a call from your partner. They tell you the fridge is empty and ask you to pick some things up on the way home. If you choose the ingredients, they’ll cook dinner. So you swing by the supermarket and fill a basket with a dozen items. Of course, you only choose things you’d be happy to eat — maybe you choose pasta but not rice, tomatoes but not mushrooms. When you get home, you unpack the bag in the kitchen and your partner gets on with the cooking — deciding what meal to make, which of the ingredients to use, and in what amounts. When you sit down at the table, the dinner in front of you is the product of a joint effort, your decisions at the grocery store and your partner’s in the kitchen.”

Erin Rivero, 2020

[Jarryd: You have more range of expression with your AI & user engagement using a female voice than a male voice, given how we socialize males.]

According to Lai, the more that culture teaches people to equate women with assistants, the more real women will be seen as assistants — and penalized for not being assistant-like.

This demonstrates that powerful technology can not only replicate gender inequalities but also widen them (Mark West et al., 2019). The problem of gendered AI is not limited to these social repercussions; equally troubling, the digital divide is no longer defined by access inequality alone, but by a growing gender gap in digital skills (West et al., 2019). This gap is both global and insidiously inconspicuous, and it is a root cause of the current and future scarcity of women in technology roles. It is a gap made all the more unsettling by the crucial moment in which we find ourselves, as transformative digital assistant technology is in a skyrocketing developmental phase while simultaneously influencing…

Algorithmic Bias

In a world where inequalities run deep, deficits in AI risk deepening those inequalities, perpetuating bigotry, homophobia, xenophobia, and violence. Safiya Umoja Noble, professor and co-director of the UCLA Center for Critical Internet Inquiry, penned a treatise on the topic, Algorithms of Oppression: How Search Engines Reinforce Racism (Noble, 2018). From racist and misogynistic misrepresentation of women and people of color in online spaces, to predictive policing and bias in housing, employment, and credit decisions, algorithms of oppression are as ubiquitous as the voice assistant technology they power. As Virginia Eubanks delineated in Automating Inequality, “Automated eligibility systems, ranking algorithms, and predictive risk models control which neighborhoods get policed, which families attain needed resources, who is short-listed for employment, and who is investigated for fraud” (Eubanks, 2017).

Felipe Pierantoni, 2020

Impoliteness is one consequence of smart speakers: as this technology becomes mainstream, children learn communication habits that they might reproduce with actual people (Childwise, 2018).

Voice fosters intimacy and leads us to treat voice-capable devices — especially smart speakers — as if they had their own mind (Shulevitz, 2018).

Considering these assumptions, it is reasonable to conclude that, just as people attach social responses to voice assistants, these social human-computer interactions could in turn influence our human-human interactions.

Different Kind of Yes

“[In the theatrical play El Sí De Las Niñas,] the ‘yes’ pronounced by the girls on the occasion of their imposed weddings with much older men involved renouncing their biological families for the sake of adopting and being accepted into the families of their husbands, changing deeds and often even friends, social circles and lifestyles. That ‘yes’ performed a very different act than the ‘yes’ given in response to ‘Do you want a cup of coffee?’ or ‘Do you live in Stockholm?’ Each ‘yes’ might sound the same, but it does different things, paves the path to different consequences and defines different actors.” (Barinaga, 2009).

Voice assistants are designed to be helpful, humble, and devoid of many of the negative traits that describe a bad listener, as “they will patiently listen to everything, without ridiculing or revealing the secrets ‘entrusted’ to them” (Biele et al., 2019) — even if this latter part is not entirely true. The result is a computational agent seemingly capable of fulfilling our need for relatedness.

“People reveal more intimate things to voice assistants; as such, there are numerous reports of depressive statements and suicide threats recorded by smart speakers” (Shulevitz, 2018).

Company representatives state that voice assistants “should be able to speak like a person, but should never pretend to be one” (Shulevitz, 2018). However, for the social brains of humans, what is the difference between speaking like a person and pretending to be one?

Gavin Abercrombie et al., 2021

Personification and anthropomorphism.

Yochanan Bigman et al.



Dr. Jarryd Willis PhD

I'm passionate about making a tangible difference in the lives of others, & that's something I have the opportunity to do as a professor & researcher.