Siri’s 2nd Language
Students whose first language isn’t English are more likely to use Siri in English than in their first language
A female Siri was the overwhelming preference (86.83%), χ2(1, N = 372) = 201.82, p < .001.
Students whose 1st language is English mostly use Siri in English (97.71%), χ2(1, N = 175) = 159.37, p < .001.
Students whose 1st language is NOT English also mostly use Siri in English (81.52%), χ2(1, N = 92) = 36.57, p < .001.
Most students indicated their first language is English (65.54%), χ2(1, N = 267) = 25.80, p < .001.
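The reported χ2 values can be checked by hand: each test compares observed counts against a 50/50 split (df = 1). The counts below are reconstructed from the reported percentages and Ns (e.g. 86.83% of 372 ≈ 323), so they are back-calculated assumptions rather than the raw data. A minimal sketch:

```python
def chi_square_1df(observed_a, observed_b):
    """Goodness-of-fit chi-square against a 50/50 expectation (df = 1)."""
    n = observed_a + observed_b
    expected = n / 2  # equal expected count in each of the two cells
    return ((observed_a - expected) ** 2 / expected
            + (observed_b - expected) ** 2 / expected)

# Counts back-calculated from the reported percentages (assumed, not raw data)
print(round(chi_square_1df(323, 49), 2))   # female vs. male Siri preference -> 201.82
print(round(chi_square_1df(171, 4), 2))    # L1-English students using English Siri -> 159.37
print(round(chi_square_1df(75, 17), 2))    # non-L1-English students using English Siri -> 36.57
print(round(chi_square_1df(175, 92), 2))   # first language English vs. not -> 25.8
```

Each result matches the slide's reported statistic, which supports the reconstructed cell counts.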
What vocal gender is your Siri (or other voice assistant if not iPhone) & which language do you use?
🔲 Female Siri — I use English, my first language
🔲 Female Siri — I use English, though it’s not my first language
🔲 Female Siri — In my first language, which isn’t English
🔲 Female Siri — Not in English, though English is my first language
🔲 Male Siri — I use English, my first language
🔲 Male Siri — I use English, though it’s not my first language
🔲 Male Siri — In my first language, which isn’t English
🔲 Male Siri — Not in English, though English is my first language
🎼Gender Play Gap (Caruso, 2019; Lamere, 2014) 🎶
— Men’s most-listened-to artists are mostly men, whereas women listen to both men & women.
✝️☪️🔯🕉️☯️Gender Pray Gap (Landon Schnabel, 2019)
— Women pray more than men (citations assume Christian samples unless noted otherwise)
(Argyle, 1958; Baker, 2008; Batson et al., 1993; Bradshaw & Ellison, 2009; Collett & Lizardo, 2009; Firth, 1997 [Hinduism]; Ghorbani et al., 2014 [Iran Muslims]; Gonzalez, 2011 [Kuwait Muslims]; Jones, 2021; Jung, 2014 [South Korea]; Khan et al., 2015 [Pakistan Muslims]; Krause & Chatters, 2005; Miller & Hoffman, 1995; Ryberg et al., 2018; Schnabel, 2015 [Judaism]; Schnabel, 2019 [Hindu women & Buddhist women]; Voas et al., 2013).
— Women are more likely to be baptized than men
(Notarianni, 1996; Johnson, 1982; Klein, 2018; Peay, 2005; Jonsson et al., 2020).
🤖 The Gender AI Gap
— Female voices are preferred when interacting with AI / voice assistants / technology
(Borau et al., 2021; Rivero, 2020; Kuyda & Replika, 2020; Martin, 2010; Hennig, 2018; Schwar & Moynihan, 2020; Shead, 2017; Stern, 2017), as people perceive greater warmth, friendliness, & emotional IQ from a female AI-bot than from a male one (Eyssel & Hegel, 2012; Gustavsson, 2005; Lopatovska et al., 2021; Otterbacher & Talias, 2017; Stroessner & Benitez, 2019), and view female AI as more trustworthy (Siegel et al., 2009).
The issue is that “constantly representing digital assistants as female gradually “hard-codes” a connection between a woman’s voice and subservience; [thus,] as female digital assistants spread, the frequency and volume of associations between “woman” and “assistant” increase dramatically” (West et al., 2019; EQUALS & UNESCO, 2019).
West & Fenstermaker (1995) emphasized the interactive nature of how we “do” identities. Markus & Moya (2010) extend this to racialization, arguing “that race is not something that people or groups have or are, but rather a set of actions that people do” (emphasis in original).
They write that “doing race always involves creating groups based on perceived physical and behavioral characteristics, associating differential power and privilege with these characteristics, and then justifying the resulting inequalities” (2010, 4). The theorizing of doing race is consistent with the conceptualization of race as a social construct formed over time and influenced by context (Omi & Winant, 1994).
The Female Voice of Zoom
Siri, Alexa, Cortana, Bank of America’s Erica, self-checkout, grocery intercoms, GPS, Far Cry 6 opening menu, headphones, &…
Replika — Sex, Gender, & Soulless SAvIors
Whether human or AI, most people seek emotional support from women
Our brains produce biases the way Google’s autofill search bar produces what it thinks we’re going to type. Eyes see…