
EXPERT REACTION: AI learns to identify brain patterns of people with suicidal thoughts [RETRACTED]

Publicly released:
International
CC0

***UPDATE: This article was retracted on 06 April 2023***

Artificial intelligence has learned to identify people with suicidal thoughts by analysing their brain scans, according to a US study. Suicide is the leading cause of death in Australia among people aged 15-44, and assessing suicide risk is one of the hardest challenges for mental health clinicians. In the study, the researchers presented 38 people who had suicidal thoughts and 41 non-suicidal people with words such as 'death', 'trouble', 'good' and 'praise' while in an MRI scanner, and a machine-learning algorithm then learned to tell the difference between the two groups. An accompanying editorial suggests such neuroimaging techniques could become a tool for medical diagnosis of psychiatric disorders, but the results need to be replicated in larger studies.

Media release

From: Springer Nature

***UPDATE: This article was retracted on 06 April 2023***

Using machine learning to identify suicidal patients

Patients with suicidal ideation can be distinguished from non-suicidal individuals with high accuracy by applying machine-learning techniques to the representation of death- and life-related concepts in the brain, reports a paper published online this week in Nature Human Behaviour. This method can also distinguish suicidal ideators who have made a suicide attempt from those who have not.

According to the World Health Organization, close to 800,000 people die by suicide every year. The assessment of suicide risk is among the most challenging problems facing mental health clinicians: suicidal patients frequently disguise their intention to commit suicide, while clinicians’ predictions of suicide risk have been shown to be poor. Markers of suicide risk that do not rely on self-reports are therefore much needed.

Marcel Just, David Brent, and colleagues presented suicidal patients and control individuals undergoing functional magnetic resonance imaging (fMRI) scans with death- and life-related words. They found that neural activity in response to six of the words (death, cruelty, trouble, carefree, good and praise) and in five brain locations best discriminated between the suicidal patients and controls. The authors then trained a machine-learning algorithm to use this information to identify which participants were patients and which were controls. The algorithm correctly identified 15 of 17 patients as belonging to the suicide group and 16 of 17 healthy individuals as belonging to the control group. The authors went on to investigate just the suicidal patients, who were divided into two groups: those who had attempted suicide (nine participants) and those who had not (eight participants). The authors trained a new algorithm that correctly distinguished between suicide attempters and non-attempters in 16 out of 17 cases.
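To make the classification set-up concrete, here is a minimal, purely illustrative sketch in Python using scikit-learn. The feature values are random numbers standing in for the per-word, per-region activation measures, and the classifier and leave-one-out evaluation are placeholders rather than the authors' exact pipeline.

```python
# Illustrative sketch only, not the authors' pipeline: random numbers stand in
# for per-word, per-region fMRI activation features, and a simple classifier is
# scored by leave-one-out cross-validation across the two groups.
import numpy as np
from sklearn.naive_bayes import GaussianNB
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)

n_per_group = 17          # participants with usable data in each group
n_features = 6 * 5        # 6 discriminating words x 5 brain locations

X = rng.normal(size=(2 * n_per_group, n_features))     # placeholder activations
y = np.array([1] * n_per_group + [0] * n_per_group)    # 1 = suicidal-ideation group

scores = cross_val_score(GaussianNB(), X, y, cv=LeaveOneOut())
print(f"leave-one-out accuracy: {scores.mean():.2f}")  # roughly chance on noise
```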

The study’s small sample size necessitates replication. However, as Barry Horwitz notes in an accompanying News & Views, if replicated and extended to other psychiatric populations, the method developed by Just and colleagues and similar functional neuroimaging methods have the potential to become a major medical tool for the diagnosis of neuropsychiatric disorders.

Expert Reaction

These comments have been collated by the Science Media Centre to provide a variety of expert perspectives on this issue. Feel free to use these quotes in your stories. Views expressed are the personal opinions of the experts named. They do not represent the views of the SMC or any other organisation unless specifically stated.

Associate Professor Sarah Whittle is from the Melbourne School of Psychological Sciences at The University of Melbourne

Just and colleagues report in new research that brain imaging techniques can be used to distinguish suicidal from non-suicidal young adults. The findings contribute to a growing body of research suggesting that ‘biological markers’ can be equally useful as, if not more useful than, subjective measures (for example, a patient’s own report of their feelings) in psychiatric decision making.

The research, however, is a long way from having an impact on the actual treatment of suicidal individuals. For one, there was a small number of participants in the study, and most were male. Therefore, we don’t know how reliable the results might be, or if they apply to females. Also, the suicidal young adults were more depressed and anxious than the non-suicidal adults. So, we don’t know if the researchers have found biological markers of suicidality, or of psychiatric problems more generally.

If future research can show that the results are reliable, and are specific to suicidality, then it’s possible that the brain-based biological markers could be used by healthcare professionals for identification and treatment of people at risk of suicide. However, given that brain scans are costly, these tools are likely only to be used for the most severely mentally-ill patients.

Last updated:  30 Oct 2017 11:12am
Declared conflicts of interest: None declared.

Professor Graham Martin is Emeritus Professor in the Royal Brisbane Clinical Unit, Faculty of Medicine at The University of Queensland

Having been an avid adolescent reader of Isaac Asimov and Robert Heinlein robot stories, I was excited to read: ‘Machine learning of neural representations of suicide and emotion concepts identifies suicidal youth’ by Just et al., 2017.

The premise is that machine learning may ultimately be better at discriminating suicidal youth from non-suicidal youth - and attempters from non-attempters - based on emotional reactions to key words and on MRI imaging of the brain areas that light up in response. The authors hint this may help clinicians struggling to predict which suicidal people may ultimately complete suicide (supposedly necessary for allocation of scant clinical resources). But this mires us in the logical fallacy that past suicidality predicts future suicidality.

Setting aside the unlikely possibility that every clinician will one day have access to an MRI scanner and machine-learning algorithms, the real excitement in the paper is its confirmation that several cheap, readily available questionnaires (the ASIQ, PHQ-9, ASR, Spielberger Anxiety (State) and the CTQ) significantly discriminated between the groups.

Suicidal people and suicide attempters deserve the clinical opportunity to work through past traumas, find solutions to current problems, and plan a positive future. Perhaps we should focus scant mental health funding on more trained available clinicians.

Last updated:  02 Nov 2017 4:25pm
Declared conflicts of interest: None declared.

Professor Matthew Large is Conjoint Professor in the School of Psychiatry at the University of New South Wales

Suicide risk assessment works notoriously badly and it might be very useful to have some sort of test for future suicide. However, this study should be seen in the light of two major limitations.

First, it is entirely unsurprising that the very many data points produced by functional magnetic resonance imaging can be used to retrospectively classify a very small sample of patients. Any excitement about this should await replication in a larger, untested group of people.
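To illustrate this first point under plainly labelled assumptions (random numbers in place of real brain data, and an off-the-shelf classifier rather than the study's method), the sketch below shows how a handful of participants can be separated perfectly when each contributes thousands of measurements, even though there is no real signal at all:

```python
# Sketch of the overfitting point above: with far more measurements than people,
# pure noise can "classify" a small sample perfectly when the model is judged on
# the same individuals it was fitted to.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n_subjects, n_measurements = 34, 10_000              # tiny sample, many data points
X = rng.normal(size=(n_subjects, n_measurements))    # noise in place of brain data
y = rng.integers(0, 2, size=n_subjects)              # arbitrary group labels

clf = LogisticRegression(max_iter=5000).fit(X, y)
print("in-sample accuracy:", clf.score(X, y))        # typically 1.0 despite no signal
```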

Second, even if suicidal thoughts could be reliably detected by a machine, suicidal thoughts themselves are only weakly associated with suicide attempts and are of next to no value in predicting who will and will not suicide.

Last updated:  30 Oct 2017 11:03am
Declared conflicts of interest: None declared.

The title of the paper says brain imaging data ‘identifies suicidal youth’. Read the fine print, though, and you will find this is not true.

This study had 79 people, 38 who reported that they thought about suicide and 41 who said they did not. Can brain imaging reliably tell us which subjects were which? The simple answer is no.

Of these 79 people, more than half (57 per cent) gave brain imaging data that were unusable for any attempt at classifying the subject as at-risk of suicide or not. That included 21 (55 per cent) of the people at risk of suicide. So, even if the results of this study generalized to all people, 55 per cent of people genuinely at risk could not be identified by the methods reported here.

Importantly, a check - which studies like this standardly use - was omitted. Even when you have found a way of classifying people into two groups on the basis of analysing brain imaging data, you cannot claim that you have a genuine method for doing such classification unless you show that the artificial intelligence algorithm can successfully classify a new set of people on whom it has not been trained.  This is called cross-validation. Because this wasn’t done, the authors can’t even claim that this method will reliably detect risk of suicide in the 43 per cent of people who yield usable brain imaging data.
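For readers unfamiliar with the term, the check being described amounts to the following: fit the algorithm on one set of people and report accuracy only on people it has never seen. A minimal sketch, with hypothetical data and an arbitrary classifier standing in for the real inputs:

```python
# Sketch of the held-out check described above: train on some people, then score
# the model only on people it has never seen. Data and classifier are placeholders.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB

rng = np.random.default_rng(2)
X = rng.normal(size=(34, 30))          # hypothetical per-subject features
y = rng.integers(0, 2, size=34)        # hypothetical group labels

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

clf = GaussianNB().fit(X_train, y_train)                # fitted on training people only
print("held-out accuracy:", clf.score(X_test, y_test))  # scored on unseen people
```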

Last updated:  02 Nov 2017 4:27pm
Declared conflicts of interest: None declared.

Multimedia

Unedited expert reaction: Dr Matthew Large
Journal/conference: Nature Human Behaviour
Research: Paper
Organisation/s: Carnegie Mellon University, USA
Funder: National Institute of Mental Health