HAILEY: the AI that helps mental health support workers with empathy

Publicly released:
International
Image by Mohamed Hassan from Pixabay

An artificial intelligence (AI)-based chat interface, named HAILEY, can assist mental health peer supporters who interact online with individuals seeking support, according to international research. HAILEY suggests phrases to replace or insert into a conversation, which supporters can adopt or ignore, to increase the empathy conveyed in their responses. For example, HAILEY might suggest replacing the phrase “Don’t worry” with “It must be a real struggle.” The authors found that this human–AI collaboration led to an almost 20% increase in conversational empathy, as evaluated by a previously validated AI model, and an almost 40% increase for peer supporters who had reported difficulties in providing support.

News release

From: Springer Nature

Technology: Enhancing empathic support with artificial intelligence

An artificial intelligence (AI)-based chat interface, named HAILEY, which offers assistance to mental health peer supporters who interact online with individuals seeking support, is described in a paper published in Nature Machine Intelligence. Peer supporters who took part in the study reported an overall increase in conversational empathy when using the interface, and also felt more confident about providing support.

Mental health disorders have a serious impact on global health, affecting over 400 million individuals worldwide. Although access to therapy and counselling is limited, peer-to-peer platforms in non-clinical settings can provide some support; such support offers health benefits and is strongly correlated with improvement in mental health symptoms.

Tim Althoff and colleagues designed HAILEY, a chat interface built on a previously developed language model trained specifically for empathic writing. They recruited 300 mental health peer supporters from the peer-to-peer platform TalkLife. In a controlled trial, participants were split into two groups, with one group using HAILEY to receive feedback on the responses they wrote to real-world posts, which were filtered to exclude harm-related content. HAILEY provided suggestions for phrases to either replace or insert, which participants could adopt or ignore; for example, it suggested replacing the phrase “Don’t worry” with “It must be a real struggle.” The authors found that this human–AI collaboration led to an almost 20% increase in conversational empathy, as evaluated by a previously validated AI model, and an almost 40% increase for peer supporters who had reported difficulties in providing support.
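The replace-or-insert suggestion workflow described above can be pictured with a small sketch. This is a hypothetical illustration, not the authors' code: the `Suggestion` structure and `apply_suggestion` helper are assumptions made here purely to show how an adopt-or-ignore suggestion might be applied to a draft response.

```python
# Hypothetical sketch of a HAILEY-style suggestion being applied to a draft.
# The data structure and helper names are illustrative assumptions, not the
# system described in the paper.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Suggestion:
    kind: str              # "replace" or "insert"
    target: Optional[str]  # phrase to replace (used when kind == "replace")
    text: str              # suggested empathic phrase

def apply_suggestion(draft: str, s: Suggestion, accept: bool) -> str:
    """Return the draft with the suggestion applied, or unchanged if declined."""
    if not accept:
        return draft  # peer supporters are free to ignore any suggestion
    if s.kind == "replace" and s.target and s.target in draft:
        return draft.replace(s.target, s.text, 1)
    if s.kind == "insert":
        return draft + " " + s.text
    return draft

draft = "Don't worry, things will get better."
s = Suggestion(kind="replace", target="Don't worry",
               text="It must be a real struggle")
print(apply_suggestion(draft, s, accept=True))
# -> It must be a real struggle, things will get better.
```

The key design point from the study is preserved here: the human remains in control, and a declined suggestion leaves the response untouched.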

The authors suggest that their findings show the potential of human–AI collaboration for tasks such as empathic conversation, but they note that further research is needed to ensure the safety of such applications.

Journal/conference: Nature Machine Intelligence
Research: Paper
Organisation/s: University of Washington, USA
Funder: T.A., A.S. and I.W.L. were supported in part by NSF grant IIS-1901386, NSF CAREER IIS-2142794, NSF grant CNS-2025022, NIH grant R01MH125179, Bill & Melinda Gates Foundation (INV-004841), the Office of Naval Research (#N00014-21-1-2154), a Microsoft AI for Accessibility grant and a Garvey Institute Innovation grant. A.S.M. was supported by grants from the National Institutes of Health, National Center for Advancing Translational Science, Clinical and Translational Science Award (KL2TR001083 and UL1TR001085) and the Stanford Human-Centered AI Institute. D.C.A. was supported by NIH career development award K02 AA023814.