Media release
Many New Zealand GPs have taken up the use of AI scribes to transcribe patient notes during consultations, despite ongoing challenges around legal and ethical oversight, data security, patient consent, and the impact on the doctor-patient relationship, a study led by the University of Otago, Wellington – Ōtākou Whakaihu Waka, Pōneke has found.
The researchers surveyed 197 health providers working in primary care in February and March 2024, providing a snapshot in time of the use of AI-scribes in clinical practice. Most of the respondents were GPs, but others included nurses, nurse practitioners, rural emergency care providers and practice managers. Their early experiences with AI-scribes were mixed, with users expressing enthusiasm and optimism alongside concerns and frustrations.
Forty per cent of those surveyed reported using AI scribes to take patient notes. Only 66 per cent had read the terms and conditions on the use of the software, and 59 per cent reported seeking patient consent.
Lead researcher Professor Angela Ballantyne, a bioethicist in the Department of Primary Health Care and General Practice, says AI transcription services are being rapidly taken up by primary care practices, even though national regulations and guidelines are still being developed.
Most of those surveyed who used AI-scribes found them helpful or very helpful, with 47 per cent estimating that using them in every consultation could save between 30 minutes and two hours a day. A significant minority, however, said the software did not save time overall because it took so long to edit and correct AI-generated notes.
Health professionals who responded to the survey mentioned concerns about the accuracy, completeness and conciseness of the patient notes produced by AI-scribes.
One doctor said: “(It) missed some critical negative findings. This meant I didn’t trust it.” Another commented that they had stopped using AI transcriptions because the ‘hallucination rate’ was quite high and the errors were often quite subtle.
Others expressed concern about the inability of AI-scribes to understand New Zealand accents, vocabulary and te reo Māori. One mentioned pausing recordings if they needed to discuss information which identified the patient, such as a name or date of birth.
Over half of those surveyed said using an AI-scribe changed the dynamic of consultations with patients, as they needed to verbalise physical examination findings and their thought processes to allow the transcription tool to capture information.
One of the GPs surveyed commented: “Today someone said, ‘I’ve got pain here’, and pointed to the area, and so I said out loud ‘oh, pain in the right upper quadrant?’”
Professor Ballantyne says there is a need to track and evaluate the impact of AI tools on clinical practice and patient interactions.
Those using an AI-scribe felt it enabled them to focus more on their patients and build better engagement and rapport through more eye contact and active listening.
There was concern among those surveyed about whether the use of an AI-scribe complied with New Zealand’s ethical and legal frameworks.
Professor Ballantyne says health practitioners have a professional and legal responsibility to ensure their clinical notes are accurate, whether or not they have used AI transcription tools.
“They need to be vigilant about checking patient notes for accuracy. However, as many survey respondents noted, carefully checking each AI-generated clinical note eats into, and sometimes negates, any time savings.”
Professor Ballantyne says it is vital that the benefits which AI-scribes can deliver are balanced against patient rights and the need to ensure data security.
“Most AI-scribes rely on international cloud-based platforms (often privately owned and controlled) for processing and storing data, which raises questions about where data is stored, who has access to it, and how it can be protected from cyber threats.
“There are also Aotearoa-specific data governance issues that need to be recognised and resolved, particularly around Māori data sovereignty.”
In July, the National Artificial Intelligence and Algorithm Expert Advisory Group (NAIAEAG) at Health New Zealand – Te Whatu Ora endorsed two ambient AI-scribe tools, Heidi Health and iMedX, for use by its clinicians in Aotearoa. NAIAEAG considers privacy, security, ethical and legal issues.
Professor Ballantyne says that, to the extent AI tools are novel, it cannot be assumed that patients consent to their use.
“Patients should be given the right to opt out of the use of AI and still access care, and adequate training and guidelines must be put in place for health providers.”
The Medical Council of New Zealand is expected to release guidance on the use of AI in health later this year, which is likely to require patients to give consent to the use of AI transcription tools.
Professor Ballantyne says AI tools are improving over time, which may ameliorate some of the ethical concerns.
“Coupled with appropriate training, good governance and patient consent, the future of AI scribes holds much promise.”