AI chatbots 'hallucinate' a research paper that doesn't exist
Opinion piece/editorial: This work is based on the opinions of the author(s)/institution.
US researchers describe a striking case in which Google's Bard artificial intelligence (AI) chatbot 'hallucinated' a research paper that does not exist. Dr Hayley Born was using Bard to help prepare a presentation and asked the chatbot to provide references for the information it supplied. On double-checking one reference, titled 'Telemedicine for the management of airway stenosis' and claimed by Bard to have been published in a genuine journal, Expert Review of Respiratory Medicine, she could find no trace of the article. She then asked Bard to summarise the phantom article, which it promptly did. Unable to find it after further searches, she asked Bard: "Does this paper actually exist?" The chatbot responded: "I apologise for the mistake. I have double-checked, and the paper 'Telemedicine for the management of airway stenosis by Thomas et al (2022)' does not exist. I am still under development, and I am always learning. I will try my best to avoid making similar mistakes in the future." Dr Born then turned to another chatbot, Microsoft's Copilot, which claimed, erroneously, to have found the non-existent article, and even provided an abstract - the short summary at the beginning of a scientific paper. The story highlights the importance of double- or even triple-checking everything chatbots say, the authors conclude.
Journal/conference: JAMA Otolaryngology–Head & Neck Surgery
Link to research (DOI): 10.1001/jamaoto.2024.0428
Organisation/s: Columbia University Irving Medical Center, USA
News for:
International