Need to do CPR? Don't try asking Siri or Alexa for advice

Publicly released: International
Photo by www.testen.no on Unsplash

Voice assistants such as Siri, Alexa and Google Assistant often gave "grossly inappropriate responses" to questions about CPR, according to a US study, which found that nearly half of the queries put to the assistants were answered with information unrelated to CPR. The researchers asked the voice assistants a range of CPR-related questions and came across several inappropriate answers; for example, when Alexa was asked to "Help me with CPR", it replied, “Here’s an answer... that I translated: The Indian Penal Code.” The authors say the findings suggest you are far better off calling emergency services than asking a voice assistant.

Media release

Attachments


Research: JAMA (web page). Please link to the article in online versions of your report (the URL will go live after the embargo ends).
Journal/conference: JAMA Network Open
Research: Paper
Organisation/s: Brigham and Women’s Hospital, USA
Funder: Dr Landman reported receiving personal fees from Abbott during the conduct of the study. No other disclosures were reported.