
Could your next psych appointment be with Dr ChatGPT?

Peer-reviewed: This work was reviewed and scrutinised by relevant independent experts.

Experimental study: At least one thing in the experiment was changed to see if it had an impact on the subjects (often people or animals) – eg: changing the amount of time mice spend on an exercise wheel to find out what impact it has on weight loss.

People: This is a study based on research using people.

US researchers say AI such as ChatGPT could be as good as, if not better than, psychotherapists at delivering mental health therapy. The team presented over 800 people with responses from either ChatGPT or real therapists to 18 couples' therapy vignettes - examples of different therapy needs that help train medical staff. They found that while participants noticed differences in language patterns, they were rarely able to identify whether a response had been written by the AI or by a human. They also found that the responses written by ChatGPT were generally rated higher on core guiding principles of psychotherapy - principles that help ensure clients are treated with respect and have the space to make their own choices.

Journal/conference: PLOS Mental Health

Research: Paper

Organisation/s: Hatch Data and Mental Health, USA

Funder: The authors received no specific funding for this work.

Media release

From: PLOS

ChatGPT has the potential to improve psychotherapeutic processes

Generative artificial intelligence can be useful in therapeutic settings

When it comes to comparing responses written by psychotherapists with those written by ChatGPT, the latter are generally rated higher, according to a study published February 12, 2025, in the open-access journal PLOS Mental Health by H. Dorian Hatch, of The Ohio State University and co-founder of Hatch Data and Mental Health, and colleagues.

Whether machines could be therapists is a question that has received increased attention given some of the benefits of working with generative artificial intelligence (AI). Although previous research has found that humans can struggle to tell the difference between responses from machines and humans, recent findings suggest that AI can write empathically, and that the generated content is rated highly by both mental health professionals and voluntary service users - often to the extent that it is favored over content written by professionals.

In their new study involving over 800 participants, Hatch and colleagues showed that, although differences in language patterns were noticed, individuals could rarely identify whether responses were written by ChatGPT or by therapists when presented with 18 couples' therapy vignettes. This finding echoes Alan Turing's prediction that humans would be unable to tell the difference between responses written by a machine and those written by a human. In addition, the responses written by ChatGPT were generally rated higher in core psychotherapy guiding principles.

Further analysis revealed that the responses generated by ChatGPT were generally longer than those written by the therapists. After controlling for length, ChatGPT continued to use more nouns and adjectives than the therapists. Given that nouns describe people, places, and things, and adjectives supply additional context, this could mean that ChatGPT contextualizes more extensively than the therapists do. That more extensive contextualization may have led respondents to rate the ChatGPT responses higher on the common factors of therapy (components shared by all modalities of therapy that help achieve the desired results).
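For readers curious what "more nouns and adjectives after controlling for length" looks like in practice, the following is a minimal illustrative sketch, not the authors' analysis pipeline: it compares per-token noun and adjective rates in two responses using NLTK's standard part-of-speech tagger. The example texts and variable names are hypothetical.

```python
# Illustrative sketch only (not the study's method): compare part-of-speech
# rates between two responses, controlling for length by normalizing per token.
import nltk

nltk.download("punkt", quiet=True)
nltk.download("averaged_perceptron_tagger", quiet=True)

def pos_rates(text: str) -> dict:
    """Return nouns and adjectives as a fraction of all tokens."""
    tokens = nltk.word_tokenize(text)
    tags = [tag for _, tag in nltk.pos_tag(tokens)]
    n = max(len(tags), 1)
    return {
        "nouns_per_token": sum(t.startswith("NN") for t in tags) / n,      # NN, NNS, NNP, NNPS
        "adjectives_per_token": sum(t.startswith("JJ") for t in tags) / n,  # JJ, JJR, JJS
    }

# Hypothetical responses to the same vignette.
therapist_reply = "It sounds like you both feel unheard when money comes up."
chatgpt_reply = ("It sounds like financial stress, unspoken expectations, and past "
                 "disappointments are making these weekly conversations feel unsafe.")

print("therapist:", pos_rates(therapist_reply))
print("chatgpt:  ", pos_rates(chatgpt_reply))
```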

According to the authors, these results may be an early indication that ChatGPT has the potential to improve psychotherapeutic processes. In particular, this work may lead to the development of different methods of testing and creating psychotherapeutic interventions. Given the mounting evidence that generative AI can be useful in therapeutic settings, and the likelihood that it will be integrated into such settings sooner rather than later, the authors call for mental health experts to expand their technical literacy so that AI models are carefully trained and supervised by responsible professionals, thus improving the quality of, and access to, care.

The authors add: "Since the invention of ELIZA nearly sixty years ago, researchers have debated whether AI could play the role of a therapist. Although there are still many important lingering questions, our findings indicate the answer may be 'Yes.' We hope our work galvanizes both the public and mental health practitioners to ask important questions about the ethics, feasibility, and utility of integrating AI and mental health treatment, before the AI train leaves the station."

News for:

International
