Audio Cues are Provided to Visually Impaired Users through Glasses

Scientists from the University of Technology, Sydney have developed a groundbreaking aid for people with blindness or low vision. This system features a pair of glasses with an integrated camera that utilizes computer vision technology to analyze nearby items and produce a sound to alert the user about their environment. These “audio symbols” could vary from the sound of leaves rustling to a dog barking, offering added environmental cues to those with limited vision and helping them in everyday activities.

The fast-paced evolution of technology designed to assist those with visual impairments is revolutionizing their perception of the world. Such systems hold the promise of greatly enhancing their ability to carry out everyday tasks, minimizing their reliance on others, and bolstering their self-confidence and autonomy.

This new technology bears some resemblance to the echolocation method bats use, but it relies on computer vision instead of soundwaves to detect nearby items. However, sound is still employed to communicate the identity of the observed object via audio symbols.

“Smart glasses generally employ computer vision and other sensory data to transform the wearer’s environment into computer-generated speech,” elaborated Chin-Teng Lin, one of the creators of this system. “But acoustic touch technology sonifies objects, generating distinct sound representations as they come within the device’s field of vision. For instance, the rustling of leaves could signify a plant, or a buzzing might indicate a mobile phone.”
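The mapping Lin describes — distinct sounds standing in for object classes as they enter the camera's view — can be sketched in code. The snippet below is purely illustrative and not the researchers' implementation; the object labels, sound names, and field-of-view check are all assumptions made for the example.

```python
# Illustrative sketch only (not the UTS system): map object classes
# detected by a vision model to "audio symbols", playing a cue only
# when the object falls inside the camera's field of view.

# Hypothetical label-to-sound mapping, echoing the article's examples.
AUDIO_SYMBOLS = {
    "plant": "leaves_rustling.wav",
    "mobile_phone": "buzzing.wav",
    "dog": "dog_barking.wav",
}

def cues_for_detections(detections, fov_degrees=60):
    """Return the audio cues to play for detected objects.

    Each detection is a (label, bearing_degrees) pair, where bearing
    is the object's angle from the camera's center line. Objects
    outside the assumed field of view, or with no mapped sound,
    produce no cue.
    """
    half_fov = fov_degrees / 2
    cues = []
    for label, bearing in detections:
        if abs(bearing) <= half_fov and label in AUDIO_SYMBOLS:
            cues.append(AUDIO_SYMBOLS[label])
    return cues
```

For example, a plant 10 degrees off-center would trigger its rustling cue, while a dog at 50 degrees (outside a 60-degree field of view) would stay silent until the wearer turns toward it.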

So far, 14 individuals have tested the glasses. Half of these participants were blind or had low vision, while the other half had normal vision but wore blindfolds for the tests. The glasses allowed users to accurately identify and handle items within the system’s field of view.

“The audio feedback enables users to recognize and grasp objects with impressive precision,” stated Howe Zhu, another researcher involved in the project. “Our results indicate that acoustic touch could offer a wearable and efficient means of sensory augmentation for the visually impaired community.”

The study was published in the journal PLOS ONE.