Posted by & filed under Consciousness, Cultural Variation, Group Processes, Intergroup Relations, Language-Thought, Moral Development, Motivation-Emotion, Social Cognition, Social Psychology, The Self.

Description: Do inanimate objects have a gender? While your quick answer to this question might be "no, of course not," it is worth thinking about the question a bit more deeply. Since ancient times, for example, ships have been referred to as female (as 'she'). Why? Well, the Latin word for ship, 'navis', is grammatically feminine, but that only tells us we have been feminizing ships for ages and ages. How about this: ships are vessels that contain and sustain life, which sounds like a feminine characteristic, stereotypically speaking. Poetic, perhaps, but why do it at all? Certainly this seemingly simple question is tied to historical and ongoing concerns over the broad stereotypes associated with a binary definition of gender as either male or female and all that has been associated with such binary stereotypes. Jumping up closer to current times, why were virtual assistant technologies (e.g., Alexa, Siri, and 'Google') all initially shipped with female voices, and what is added to this question when we appreciate that such devices do not seem nearly as inanimate as most other inanimate objects? Rather than dismissing such questions as gender-politics overreach, consider how this is yet another problematic feature of our tendency to process information about the entire world in stereotypic terms, adding or attributing binary gender concepts to things all along the way. So, think about what some of the psychological and sociological issues may be that are associated with calling our virtual assistants Alexa instead of Alex, or with giving them names at all, and then have a read through the article linked below to learn more about these questions and about possible areas for research and debate.

Source: New psychology research finds people feel more attached to gendered technology, Laura Staloch, PsyPost.

Date: November 12, 2022

Image by mcmurryjulie from Pixabay

Article Link:

So, what were your take-aways from the research described in the linked article? It is not surprising that people who genderize their technology as female tend to think and 'talk' (write) about it using more attachment terms and phrases, as relational qualities are stereotypically seen as feminine. But are there broader implications here than might first appear to be the case? Are gender stereotypes (and perhaps even prejudices) reified by anthropomorphically genderizing our technologies? Perhaps that is a question we should be asking ourselves rather than our Google Nest Minis ("I am happy to serve") or other virtual assistant technologies.

Questions for Discussion:

  1. Why do we typically refer to ships as female?
  2. What are some gender stereotype issues related to the typical release of virtual assistant devices with female 'voices'?
  3. What should we consider and what might we do to address issues associated with our tendencies to anthropomorphize and genderize our technologies and devices?

References (Read Further):

Martin, A. E., & Mason, M. F. (2023). Hey Siri, I love you: People feel more attached to gendered technology. Journal of Experimental Social Psychology, 104, 104402. Abstract

Bray, F. (2007). Gender and technology. Annual Review of Anthropology, 36, 37-53. Link

Waytz, A., Heafner, J., & Epley, N. (2014). The mind in the machine: Anthropomorphism increases trust in an autonomous vehicle. Journal of Experimental Social Psychology, 52, 113-117. Link

Burkett, C. (2017). I call Alexa to the stand: The privacy implications of anthropomorphizing virtual assistants accompanying smart-home technology. Vanderbilt Journal of Entertainment & Technology Law, 20, 1181. Link

Waytz, A., Cacioppo, J., & Epley, N. (2010). Who sees human? The stability and importance of individual differences in anthropomorphism. Perspectives on Psychological Science, 5(3), 219-232. Link

Ha, Q. A., Chen, J. V., Uy, H. U., & Capistrano, E. P. (2021). Exploring the privacy concerns in using intelligent virtual assistants under perspectives of information sensitivity and anthropomorphism. International Journal of Human–Computer Interaction, 37(6), 512-527. Link

Zheng, J. F., & Jarvenpaa, S. (2021). Thinking technology as human: Affordances, technology features, and egocentric biases in technology anthropomorphism. Journal of the Association for Information Systems, 22(5), 1429-1453. Link

Lim, W. M., Kumar, S., Verma, S., & Chaturvedi, R. (2022). Alexa, what do we know about conversational commerce? Insights from a systematic literature review. Psychology & Marketing, 39(6), 1129-1155. Link

Tassiello, V., Tillotson, J. S., & Rome, A. S. (2021). “Alexa, order me a pizza!”: The mediating role of psychological power in the consumer–voice assistant interaction. Psychology & Marketing, 38(7), 1069-1080. Link