Description: We humans are remarkably good at expressing our emotions through our facial expressions (think about how hard it is to hide them when, say, you are playing poker or otherwise trying not to show others what you are feeling). We are rather good at reading other people’s facial expressions too. Darwin wrote about how the sending and receiving of emotional facial expressions provided us with an evolutionary advantage, and as such the skill strengthened over many generations. But how to study this skill? Paul Ekman and his research colleagues studied this ability to read facial expressions, first among North American research participants and then cross-culturally, showing a certain level of universality in the ability to present or read expressions of basic emotional states like happiness, sadness, anger, fear, disgust, and surprise.

Think about one of the big challenges in this sort of research. If you want to study people’s ability to read facial expressions of emotion, you need to be able to present them with consistent facial expressions to read. To control for variability of expression, one could use photographs, but photographs are NOT the same as live faces, and using live faces introduces a potentially large amount of variance. For example, what are people really feeling? Are faked emotional displays as good as the real thing? Ekman worked on this by training a group of “facial experts” to control the 42 (yup, 42) muscles in their faces that are involved in expressing emotions and, at the same time, trained them to read the use of those same muscles in other people. And they got pretty good at it… pretty good at using the Facial Action Coding System (FACS). Even so, variability was an issue. So, what if someone created an android face/head with actuators under its “skin” that could be used to consistently (without uncontrolled variability) produce facial expressions of emotion that FACS experts would agree are recognizable?
Would that be of value in research into human face-reading of emotions? Once you have decided what you think, read the articles linked below (the second link has pictures) to see what the creators of Nikola have to say on this topic.
Source: Introducing Nikola, the emotional android kid, Science News, ScienceDaily.
Date: Feb 16, 2022
Article Link: https://www.sciencedaily.com/releases/2022/02/220216095846.htm or here for pictures of Nikola: https://www.designboom.com/technology/android-kid-nikola-expresses-six-basic-emotions-02-17-2022/
So, are you convinced of the research utility of Nikola? As well, were you able to see how, as the researchers claim, such an android might be of assistance in caring for isolated people with mobility needs? I am unsure, but there has been some interest and success in using robot pets with dementia patients, as well as an animated interactive character also developed for use in dementia care, so perhaps there is a booming future for full-bodied Nikola offspring.
Questions for Discussion:
- How effectively can people read the facial expressions of emotion in others?
- What are some of the variability challenges of conducting research into our ability to read facial expressions of emotion?
- What do you make of the potential application claims made by the creators of Nikola? What possible applications can you think of that seem possibly viable to you?
References (Read Further):
Sato, W., Namba, S., Yang, D., Nishida, Y., Ishi, C., & Minato, T. (2022). An android for emotional interaction: Spatiotemporal validation of its facial expressions. Frontiers in Psychology, 6521. Link
Ekman, P., Friesen, W. V., & Ancoli, S. (1980). Facial signs of emotional experience. Journal of Personality and Social Psychology, 39(6), 1125. Link
Ekman, P., & Friesen, W. V. (1974). Detecting deception from the body or face. Journal of Personality and Social Psychology, 29(3), 288. Link
Ekman, P. (1976). Movements with precise meanings. Journal of Communication, 26(3), 14-26. Link
Ekman, P. (2003). Darwin, deception, and facial expression. Annals of the New York Academy of Sciences, 1000(1), 205-221. Link
Hess, U., & Thibault, P. (2009). Darwin and emotion expression. American Psychologist, 64(2), 120. Link
Pessoa, L. (2017). Do intelligent robots need emotion? Trends in Cognitive Sciences, 21(11), 817-819. Link
Breazeal, C. (2003). Emotion and sociable humanoid robots. International Journal of Human-Computer Studies, 59(1-2), 119-155. Link
Cañamero, L. (2005). Emotion understanding from the perspective of autonomous robots research. Neural Networks, 18(4), 445-455. Link