
Description: Consider this premise. These days (in what we hope are the later stages of the COVID-19 pandemic) there are troublingly high rates of loneliness, anxiety, depression, and other mental health challenges among the general population. Therapy is hard to access, due both to isolation and to the fact that it is not routinely covered as health care in North America. So, might there be an opportunity to use Artificial Intelligence or AI-bots to provide support, or perhaps even a form of therapy, at little or no cost to users? Now, before you dismiss this as science fiction or fantasy, go and have a chat with Eliza. Eliza has been around for over 50 years. She may be a bit frustrating, as she is a Rogerian and tends to reflect most of what you say back to you so that you can think about it more deeply. Eliza is an AI-bot, though her programming has not been updated for years, so perhaps she is not a good example of the potential of AI-bots for Cognitive Behavior Therapy. So, what do you think? Is there potential here? What are the potential pitfalls? Once you have your thoughts sorted in relation to these questions, have a read through the article linked below to see what the research, and what some Cognitive Behavior Therapists (in the video), have to say on these matters.
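If you are curious about what Eliza's "reflection" actually involves under the hood, the sketch below gives a rough idea. It is a minimal, hypothetical illustration of ELIZA-style Rogerian reflection (invented rules and names, not Weizenbaum's original script): the program matches simple patterns, swaps first- and second-person words, and turns the user's statement back into a question.

```python
# Minimal, hypothetical sketch of ELIZA-style Rogerian reflection.
# The rules and templates here are illustrative only, not the original ELIZA script.
import re

# Pronoun swaps applied to the user's words before reflecting them back.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "yours": "mine",
}

# (pattern, response template) pairs; the captured text is reflected back.
RULES = [
    (r"i feel (.+)", "Why do you feel {0}?"),
    (r"i am (.+)", "How long have you been {0}?"),
    (r"i can'?t (.+)", "What makes you think you can't {0}?"),
    (r"(.+)", "Can you tell me more about that?"),  # fallback
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in a captured fragment."""
    words = fragment.lower().split()
    return " ".join(REFLECTIONS.get(w, w) for w in words)

def respond(user_input: str) -> str:
    """Return a Rogerian-style reflection of the user's statement."""
    text = user_input.lower().strip().rstrip(".!?")
    for pattern, template in RULES:
        match = re.match(pattern, text)
        if match:
            return template.format(*(reflect(g) for g in match.groups()))
    return "Please, go on."

if __name__ == "__main__":
    print(respond("I feel lonely during the pandemic"))
    # -> Why do you feel lonely during the pandemic?
```

As the example suggests, there is no understanding here at all, only pattern matching, which is part of why Eliza can feel both surprisingly engaging and quickly frustrating.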

Source: COVID-19 has made Americans lonelier than ever – here’s how AI can help, Laken Brooks, The Conversation.

Date: February 12, 2021

Photo Credit: Image by Stefan Dr. Schulz from Pixabay

Article Link: https://theconversation.com/covid-19-has-made-americans-lonelier-than-ever-heres-how-ai-can-help-152445

So, what do you think? I hope you had at least one or two thoughts about the ethical issues in making AI-bot “therapists” generally available (and yes, I know, human therapists will understandably object to their designation being used this way). Certainly Replika, Tess, and Woebot are more advanced than Eliza, and their creators seem to be aware that they are not actually replacements for human therapists so much as aids that may help people access human therapy. In addition, there are research data indicating that people feel better after interacting with an AI chatbot, and vulnerable populations such as young people and the elderly may particularly benefit. Yes, of course, more research AND ethical caution are needed, but there are some interesting possibilities here.

Questions for Discussion:

  1. Can an AI-Chatbot make people feel better?
  2. When or under what conditions might an AI-Chatbot be an appropriate part of general approaches to mental health issues?
  3. What limitations should be placed on the application of this technology, and what additional research is needed in this area?

References (Read Further):

Simonite, T. (2020). The Therapist Is In – and It’s a Chatbot. Wired. Link

Bendig, E., Erb, B., Schulze-Thuesing, L., & Baumeister, H. (2019). The next generation: chatbots in clinical psychology and psychotherapy to foster mental health–a scoping review. Verhaltenstherapie, 1-13. Link

Fulmer, R., Joerin, A., Gentile, B., Lakerink, L., & Rauws, M. (2018). Using psychological artificial intelligence (Tess) to relieve symptoms of depression and anxiety: randomized controlled trial. JMIR mental health, 5(4), e9782. Link

Metz, C. (2020). Riding Out Quarantine With a Chatbot Friend: ‘I Feel Very Connected’. The New York Times. Link

Bell, S., Wood, C., & Sarkar, A. (2019, May). Perceptions of chatbots in therapy. In Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (pp. 1-6). Link

Cameron, G., Cameron, D., Megaw, G., Bond, R., Mulvenna, M., O’Neill, S., … & McTear, M. (2017, July). Towards a chatbot for digital counselling. In Proceedings of the 31st International BCS Human Computer Interaction Conference (HCI 2017) 31 (pp. 1-7). Link

Lee, Y. C., Yamashita, N., Huang, Y., & Fu, W. (2020, April). “I Hear You, I Feel You”: Encouraging Deep Self-disclosure through a Chatbot. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (pp. 1-12). Link

Gentner, T., Neitzel, T., Schulze, J., & Buettner, R. (2020, July). A Systematic Literature Review of Medical Chatbot Research from a Behavior Change Perspective. In 2020 IEEE 44th Annual Computers, Software, and Applications Conference (COMPSAC) (pp. 735-740). IEEE. Link

Cameron, G., Cameron, D., Megaw, G., Bond, R., Mulvenna, M., O’Neill, S., … & McTear, M. (2018, October). Assessing the usability of a chatbot for mental health care. In International Conference on Internet Science (pp. 121-132). Springer, Cham. Link
