Could blind and low-vision people use bat vision glasses to hear their surroundings?

Publicly released:
Australia; NSW
Photo taken by Lil Deverell (co-author) at the Motion Platform and Mixed Reality Lab in Techlab at the University of Technology Sydney (UTS), Australia, CC-BY 4.0

Smart glasses that use a technique similar to a bat's echolocation could help blind and low-vision people navigate their surroundings, according to Australian researchers. Bats navigate by emitting soundwaves that echo back to them, giving them information about the size and distance of objects around them. The team developed smart glasses using a similar principle, creating audio icons that play when an object enters the field of view. The glasses were tested by a small group of blind, low-vision and blindfolded sighted people, and the researchers say participants were able to use them to recognise and reach for objects around them more effectively, without much additional mental effort.

Media release

From: University of Technology Sydney (UTS)

Australian researchers have developed cutting-edge technology known as “acoustic touch” that helps people ‘see’ using sound. The technology has the potential to transform the lives of those who are blind or have low vision.

Around 39 million people worldwide are blind, according to the World Health Organisation, and an additional 246 million people live with low vision, impacting their ability to participate in everyday life activities.

The next-generation smart glasses, which translate visual information into distinct sound icons, were developed by researchers from the University of Technology Sydney and the University of Sydney, together with Sydney start-up ARIA Research.

“Smart glasses typically use computer vision and other sensory information to translate the wearer’s surroundings into computer-synthesized speech,” said Distinguished Professor Chin-Teng Lin, a global leader in brain-computer interface research from the University of Technology Sydney.

“However, acoustic touch technology sonifies objects, creating unique sound representations as they enter the device's field of view. For example, the sound of rustling leaves might signify a plant, or a buzzing sound might represent a mobile phone,” he said.
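The idea described above can be sketched in a few lines of code: detected objects are mapped to distinctive audio icons, which are triggered only while the object sits inside the device's field of view. This is a minimal conceptual sketch, not the authors' implementation; the object labels, sound names and field-of-view width are illustrative assumptions.

```python
# Conceptual sketch of "acoustic touch" sonification (illustrative only):
# each recognised object class gets a unique audio icon, played when the
# object enters the glasses' field of view.

# Hypothetical mapping from object labels to audio icons; the plant and
# phone examples come from the release, the rest is assumed.
AUDIO_ICONS = {
    "plant": "rustling_leaves",
    "phone": "buzzing",
    "cup": "clink",
}

def icons_to_play(detections, fov_degrees=60.0):
    """Return audio icons for objects inside the wearer's field of view.

    detections: list of (label, bearing_degrees) pairs, where bearing is
    the object's horizontal angle relative to where the glasses point
    (0 = straight ahead). fov_degrees is an assumed field-of-view width.
    """
    half_fov = fov_degrees / 2.0
    cues = []
    for label, bearing in detections:
        # Only sonify objects that are both recognised and within view.
        if abs(bearing) <= half_fov and label in AUDIO_ICONS:
            cues.append(AUDIO_ICONS[label])
    return cues

# Example: a phone slightly left of centre is sonified; a plant at 80
# degrees is outside the assumed field of view, so it stays silent.
print(icons_to_play([("phone", -10.0), ("plant", 80.0)]))  # → ['buzzing']
```

In a real device the cue list would drive spatialised audio playback as the wearer scans their head, but the core mapping from "object in view" to "distinct sound" is what the sketch shows.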

A study into the efficacy and usability of acoustic touch technology to assist people who are blind, led by Dr Howe Zhu from the University of Technology Sydney, has just been published in the journal PLOS ONE.

The researchers tested the device with 14 participants: seven individuals with blindness or low vision, and seven blindfolded sighted individuals who served as a control group.

They found that the wearable device, equipped with acoustic touch technology, significantly enhanced the ability of blind or low-vision individuals to recognise and reach for objects, without causing too much mental effort.

“The auditory feedback empowers users to identify and reach for objects with remarkable accuracy,” said Dr Zhu. “Our findings indicate that acoustic touch has the potential to offer a wearable and effective method of sensory augmentation for the visually impaired community.”

The research underscores the importance of developing assistive technology to overcome challenges such as locating specific household items and personal belongings.

By addressing these day-to-day challenges, the acoustic touch technology opens new doors for individuals who are blind or have low vision, enhancing their independence and quality of life.

With ongoing advancements, the acoustic touch technology could become an integral part of assistive technologies, supporting individuals to access their environment more efficiently and effectively than ever before.

Attachments

Note: Not all attachments are visible to the general public. Research URLs will go live after the embargo ends.

Research: PLOS ONE, web page (the URL will go live after the embargo ends)
Journal/conference: PLOS ONE
Research: Paper
Organisation/s: University of Technology Sydney (UTS), The University of Sydney
Funder: This work was supported by the Australian Cooperative Research Centres Projects (CRC-P) Round 11 CRCPXI000007, ARIA Research, the University of Technology Sydney, and the University of Sydney. Received by C.J., V.N. and C.L. Website: https://business.gov.au/grants-and-programs/cooperative-research-centres-crc-grants
Media Contact/s
Contact details are only visible to registered journalists.