Acoustic Maps to Aid the Blind

Researchers at the University of Bristol in England have developed a new method to convert images from lasers and digital cameras into real-time, three-dimensional acoustic maps that help the blind navigate around obstacles in their path. The images are converted to sounds that grow louder as objects draw nearer, accurately reflecting their orientation with respect to the user. Coupled with related work from the University of La Laguna in Spain and several other institutions, these maps could result in a workable assistive technology for the visually impaired in the near future.
A blind man navigating with the help of an acoustic map (Credit: University of Bristol)

The Bristol system integrates real-time image processing with new algorithms designed to identify specific objects such as trees, furniture, and people. The algorithms can also identify objects in motion and predict their trajectory and speed. The images and related data are then transformed into sound using a method designed by scientists at the University of La Laguna in Spain. The resulting acoustic maps are fed to blind users through a pair of headphones, enabling them to navigate successfully around both static and moving obstacles.

The headphones use stereo sound to pinpoint a location in space. The principle is similar to the localization tests frequently included in standard hearing exams, in which sound is fed into only one ear and the patient is asked to identify which side it came from. Here, the directional abilities are significantly more robust and take into account the rotational position of the wearer’s head at any given moment using an integrated gyroscope developed by scientists at the Marche Polytechnic University in Italy. Distance is encoded using an intensity factor – the closer the object, the louder the sound. Imminent collisions trigger a loud warning sound, alerting the user to get out of the way.
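The article does not publish the actual mapping used by the researchers, but the two cues it describes – loudness rising as distance shrinks, and stereo balance encoding direction corrected for head rotation – can be sketched roughly as follows. All function names, the linear loudness falloff, and the constant-power panning law are illustrative assumptions, not the project’s implementation:

```python
import math

def obstacle_to_stereo(distance_m, azimuth_deg, max_range_m=5.0):
    """Map an obstacle's position to a (left, right) loudness pair.

    Hypothetical sketch of the two cues described in the article:
    - intensity: louder as the object gets closer (zero at the 5 m limit),
    - direction: stereo balance across the 60-degree field of view.
    """
    # Loudness in [0, 1]: 1.0 at zero distance, 0.0 at the sensing limit.
    loudness = max(0.0, 1.0 - distance_m / max_range_m)
    # Clamp azimuth to the field of view: -30 deg = far left, +30 deg = far right.
    pan = max(-1.0, min(1.0, azimuth_deg / 30.0))
    # Constant-power pan: theta sweeps 0..pi/2 as pan goes from left to right,
    # keeping left^2 + right^2 constant for a given loudness.
    theta = (pan + 1.0) * math.pi / 4.0
    return loudness * math.cos(theta), loudness * math.sin(theta)

def world_azimuth(azimuth_deg, head_yaw_deg):
    """Subtract the gyroscope's head yaw so cues stay anchored to the world,
    not to wherever the wearer happens to be looking."""
    return azimuth_deg - head_yaw_deg
```

For example, an obstacle dead ahead produces equal left and right levels, while one at the edge of the field of view plays almost entirely in one channel; an object at the 5-meter limit is silent until it moves closer.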

Two prototypes currently exist. The first uses infrared lasers mounted on the inside of a pair of glasses; with a 60-degree field of view, it detects objects up to 5 meters away. The second adds digital cameras on the sides of a test helmet worn by users, greatly increasing the field of view covered by the map. Although not currently integrated into the device, researchers are also exploring an onboard GPS system to help direct wearers away from known, unchanging obstacles. This could free up processing power for additional applications and allow improvements in the speed, range, or angular precision of the detected data.

Considerable testing with both prototypes has been successful, but researchers say more is needed before bringing the device to market. In particular, extensive reliability testing must be performed to ensure that the device won’t suddenly stop working while a user is crossing a busy street or in some other dangerous situation.

TFOT has previously reported on other innovative assistive technologies, including a robot that can open doors for people who have trouble turning handles, a wheelchair that reacts to the thoughts of its user and moves accordingly, two devices from Honda that help people walk with a more even stride and lift and squat more easily, and a personalized user interface that adapts itself to the specific visual and motor abilities of its users.

Read more about the new assistive technology and view a video of it in use in this news site, designed to promote research findings funded by the European Union.

Icon image credit: Wikimedia Commons user Anthere.
