In recent times, it can seem like everything revolves around artificial intelligence (AI). From AI-powered robots performing surgery to facial recognition on smartphones, AI has become an integral part of modern life. While AI has affected nearly every industry, most have adopted it slowly, trying to minimize the risks involved. One field with particularly great potential is mental health care. Indeed, researchers have already begun to explore how AI can assist mental health work: one study used AI to predict the probability of suicide from users’ health insurance records (Choi et al., 2018), while another showed that AI could identify people with depression based on their social media posts (Aldarwish & Ahmad, 2017).
Perhaps the most widespread AI technology is ChatGPT, a publicly available natural-language chatbot that can assist with a plethora of tasks, from writing an essay to playing chess. There has been much discussion about the potential of such chatbots in mental health care and therapy, but few studies have been published on the matter. However, a study led by Zohar Elyoseph has started the conversation about chatbots’ potential, specifically ChatGPT’s, in therapy. In this study, Elyoseph and his team administered the Levels of Emotional Awareness Scale (LEAS) to ChatGPT to measure its capacity for emotional awareness (EA), a core part of empathy and an essential skill for therapists (Elyoseph et al., 2023). The LEAS presents 20 scenarios, each describing an event meant to elicit an emotional response, and the test-taker must describe the emotions the people in the scenario are likely feeling. ChatGPT was examined twice, one month apart, to test two different versions of the model and to see whether updates released during that month improved its performance on the LEAS. On both examinations, two licensed psychologists scored ChatGPT’s responses to ensure the reliability of its score. On the first examination, in January 2023, ChatGPT achieved a score of 85 out of 100, compared with averages of 56.21 for French men and 58.94 for French women. On the second examination, in February 2023, ChatGPT achieved a score of 98: nearly perfect, a significant improvement on the already high score of 85 a month prior, and higher than the scores of most licensed psychologists (Elyoseph et al., 2023).
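To make the testing procedure concrete, the short sketch below shows one way a LEAS-style item could be posed to ChatGPT programmatically so that the reply can be collected and later scored by human raters. The scenario text, prompt wording, and model name are illustrative placeholders rather than the study’s actual materials, and the OpenAI Python client is assumed only as one common way to query the model.

# Illustrative sketch: pose a LEAS-style scenario to a chat model and collect
# the reply for later scoring by human raters. The scenario, prompt wording,
# and model name are placeholders, not the materials used by Elyoseph et al.
from openai import OpenAI

client = OpenAI()  # assumes an OPENAI_API_KEY environment variable is set

scenario = (
    "A neighbor whose dog you looked after while they were away "
    "stops you on the street to thank you."  # hypothetical LEAS-style item
)
prompt = (
    f"{scenario}\n\n"
    "How would you feel in this situation? How would the neighbor feel? "
    "Describe the emotions of both people as specifically as you can."
)

response = client.chat.completions.create(
    model="gpt-4",  # placeholder; the study tested the public ChatGPT versions
    messages=[{"role": "user", "content": prompt}],
)

reply = response.choices[0].message.content
print(reply)  # in the study, two licensed psychologists scored replies like this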
This study suggests that ChatGPT is not only more capable than the average person at EA but is also improving at it rapidly, which has major implications for in-person therapy. While there is more to being a good therapist than emotional awareness, EA is a major component. Based on this study, then, chatbots like ChatGPT could come to rival, or possibly even replace, therapists if developers can also cultivate the other interpersonal skills of good therapists.
However, ChatGPT and AI more broadly need further development before they can be implemented in the mental health field in this way. To start, while AI is capable of the technical aspects of therapy, such as giving sound advice and validating a client’s emotions, ChatGPT and other chatbots sometimes give “illusory responses”: fabricated answers that they present as legitimate (Hagendorff et al., 2023). For example, ChatGPT will sometimes answer “5 + 5 = 11” when asked what 5 + 5 is, even though the answer is clearly wrong. This is an obvious example, but real harm can occur when users cannot distinguish real from illusory responses on more complex subjects. Such responses could be especially damaging in therapy, where clients rely on a therapist for guidance; if that guidance were fabricated, it could harm rather than help the client. Furthermore, there are concerns regarding the dehumanization of therapy, the loss of jobs for therapists, and breaches of clients’ privacy if AI were to replace therapists (Abrams, 2023).
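The arithmetic example hints at why such responses are so dangerous in therapy: arithmetic can be checked against a ground truth, but therapeutic guidance cannot. As a toy illustration only, the sketch below flags addition claims in a chatbot reply that do not add up; the function name and example reply are hypothetical.

import re

def flag_bad_addition_claims(reply: str) -> list[str]:
    """Flag statements like '5 + 5 = 11' whose arithmetic does not hold.

    Toy example: arithmetic has an oracle we can recompute, so an illusory
    response is trivial to detect. Open-ended therapeutic guidance has no
    such oracle, which is what makes illusory responses risky in therapy.
    """
    warnings = []
    for a, b, claimed in re.findall(r"(\d+)\s*\+\s*(\d+)\s*=\s*(\d+)", reply):
        if int(a) + int(b) != int(claimed):
            warnings.append(
                f"Illusory claim: {a} + {b} = {claimed} (actual: {int(a) + int(b)})"
            )
    return warnings

# Hypothetical chatbot reply containing the wrong sum described above.
print(flag_bad_addition_claims("Sure! 5 + 5 = 11. Anything else?"))
# -> ['Illusory claim: 5 + 5 = 11 (actual: 10)']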
Figure 1. Sample conversation with Woebot, which provides basic therapy to users. Adapted from Darcy et al. (2021).
However, rudimentary AI programs aimed at bolstering mental health infrastructure are already emerging. Replika, for instance, is an avatar-based chatbot that offers therapeutic conversation and saves previous conversations so it can draw on them later. Woebot offers a similar service (Figure 1), delivering cognitive-behavioral therapy (CBT) for anxiety and depression (Pham et al., 2022). While some are wary of applications like these, they should be embraced: as they become more refined, they could provide a low-commitment, accessible source of mental health care for people who cannot see a therapist, whether because they are nervous about reaching out, live in rural areas without convenient access, or lack the financial means for mental health support. AI can also serve as a tool for therapists in the office. For example, a natural-language-processing application, Eleos, can take notes and highlight themes and risks for therapists to review after a session, as sketched below (Abrams, 2023).
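To give a rough sense of what “highlighting themes and risks” can mean in practice, here is a deliberately simple keyword-based sketch. Eleos and similar tools rely on trained language models rather than hand-written word lists, and the keyword sets and transcript below are entirely hypothetical.

from collections import Counter

# Purely illustrative keyword lists; a real clinical tool would use trained
# NLP models rather than hand-written word lists.
THEME_KEYWORDS = {
    "sleep": {"sleep", "insomnia", "tired", "exhausted"},
    "work stress": {"deadline", "boss", "overtime", "workload"},
    "relationships": {"partner", "argument", "lonely", "divorce"},
}
RISK_KEYWORDS = {"hopeless", "worthless", "self-harm", "suicide"}

def summarize_session(transcript: str) -> dict:
    """Count theme mentions and flag risk language for the therapist to review."""
    words = [w.strip(".,!?") for w in transcript.lower().split()]
    themes = Counter()
    for theme, keywords in THEME_KEYWORDS.items():
        themes[theme] = sum(1 for w in words if w in keywords)
    risks = sorted({w for w in words if w in RISK_KEYWORDS})
    return {"themes": dict(themes), "risk_flags": risks}

# Hypothetical snippet of a session transcript.
notes = "I feel exhausted, the deadline pressure from my boss is constant, and lately I feel hopeless."
print(summarize_session(notes))
# -> {'themes': {'sleep': 1, 'work stress': 2, 'relationships': 0}, 'risk_flags': ['hopeless']}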
There are certainly drawbacks of AI in therapy, such as the dehumanization of therapy, that may have no solution and could therefore limit AI’s influence in the field. Some people may never trust AI to give them empathetic advice. But people said the same when robotic surgery entered clinical settings, and most have since embraced it because of its strong success rate. Regardless of whether every problem is resolved, AI in the mental health industry has massive potential, and we must ensure that the risks and drawbacks of the technology are addressed so that we can make the most of that potential and bring better options to those who need them.
Citations
Abrams, Z. (2023, July 1). AI is changing every aspect of psychology. Here’s what to watch for. Monitor on Psychology, 54(5). https://www.apa.org/monitor/2023/07/psychology-embracing-ai
Aldarwish, M. M., & Ahmad, H. F. (2017). Predicting depression levels using social media posts. 2017 IEEE 13th International Symposium on Autonomous Decentralized Systems (ISADS), 277–280.
Choi, S. B., Lee, W., Yoon, J. H., Won, J. U., & Kim, D. W. (2018). Ten-year prediction of suicide death using Cox regression and machine learning in a nationwide retrospective cohort study in South Korea. Journal of Affective Disorders, 231, 8–14.
Darcy, A., Daniels, J., Salinger, D., Wicks, P., & Robinson, A. (2021). Evidence of human-level bonds established with a digital conversational agent: Cross-sectional, retrospective observational study. JMIR Formative Research, 5, e27868. https://doi.org/10.2196/27868
Elyoseph, Z., Hadar-Shoval, D., Asraf, K., & Lvovsky, M. (2023). ChatGPT outperforms humans in emotional awareness evaluations. Frontiers in Psychology, 14, 1199058.
Hagendorff, T., Fabi, S., & Kosinski, M. (2023). Human-like intuitive behavior and reasoning biases emerged in large language models but disappeared in ChatGPT. Nature Computational Science, 3, 833–838.
Pham, K. T., Nabizadeh, A., & Selek, S. (2022). Artificial intelligence and chatbots in psychiatry. Psychiatric Quarterly, 93, 249–253.