MyVoice – Sign Language to English Translator (and Vice Versa)
Engineering and design students from the University of Houston in Texas developed a prototype device called MyVoice that can translate spoken words into sign language and sign language into spoken words.
The hearing impaired often face a communication barrier with those who can hear. They may have difficulty reading lips (not something that works in every situation), while hearing people typically do not understand sign language and often struggle to understand what a hearing-impaired person says.
Between 500,000 and 1 million people living in the U.S. are considered deaf, according to a 2005 study by the Gallaudet Research Institute. This large population needs a solution to the complex two-way communication problem described above. To assist them, as well as anyone who needs to communicate with them, a group of University of Houston students teamed up to develop the concept and prototype for MyVoice, a device that reads sign language and translates its motions into audible words.
The prototype, which recently earned first prize among student projects at the American Society for Engineering Education (ASEE) Annual Conference, is built around a handheld device with a built-in microphone, speaker, soundboard, video camera and screen. Placed on a hard surface, the device reads a user's sign language movements, processes them with dedicated algorithms, and translates them into an electronic voice. It can also listen to a person's voice and translate the words into sign language, which is shown on its LCD display.
According to the developers, the most difficult part of the project was assembling a database of sign language images, which required about 200-300 images per sign. The team hopes to continue the project and is looking for the right partner to help turn it from a prototype into a fully functioning product.
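The team has not published its algorithms, but the basic idea of matching a captured gesture against a database of a few hundred stored samples per sign can be illustrated with a toy nearest-neighbor classifier. Everything here is an assumption for illustration: real images are stood in for by small feature vectors, and the function and variable names are hypothetical.

```python
# Illustrative sketch only: the MyVoice algorithms are not public.
# Each "image" is reduced to a feature vector; a sign is recognized
# by finding the closest sample in the labeled database.

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

def classify_sign(frame_features, database):
    """Return the label of the nearest stored sample.

    database maps a sign label (e.g. "hello") to a list of feature
    vectors -- roughly the 200-300 samples per sign the article
    mentions.
    """
    best_label, best_dist = None, float("inf")
    for label, samples in database.items():
        for sample in samples:
            d = distance(frame_features, sample)
            if d < best_dist:
                best_label, best_dist = label, d
    return best_label

# Toy database: two signs, each with a handful of hypothetical
# feature vectors standing in for real training images.
database = {
    "hello": [(0.9, 0.1), (0.8, 0.2), (1.0, 0.0)],
    "thanks": [(0.1, 0.9), (0.2, 0.8), (0.0, 1.0)],
}

print(classify_sign((0.85, 0.15), database))  # closest to the "hello" samples
```

A production system would of course extract features from live video frames and use a far more robust classifier, but the sketch shows why collecting hundreds of samples per sign matters: more samples cover more variation in how each gesture is performed.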
Although the device is still just a prototype, it raises the question of why not simply build a smartphone application that uses the same algorithms. After all, a smartphone has a powerful processor, an LCD, a microphone, and essentially everything else MyVoice might require; anything missing could be supplied by a small accessory connected physically or wirelessly.
Iddo has a B.A. in Philosophy and Cognitive Science and an M.A. in Philosophy of Science from the Hebrew University of Jerusalem. He is currently writing his Ph.D. thesis on the relationship between the scientific community and industry. Iddo was awarded the 2006 Bar Hillel philosophy of science prize for his work on the relationship between science and technology. He is a member of the board of the Lifeboat Foundation and has been the editor of several high-profile science and technology websites since 1999.