MotionSavvy’s Beta translator program running on an Android tablet (via MotionSavvy)
A team of deaf students from the University of Rochester recently teamed up with Leap Motion to create a device that can translate American Sign Language (ASL) into spoken words, helping the deaf communicate with the world around them.
More than 70 million people worldwide are deaf. The vast majority, roughly 90 percent, are born into hearing families, making communication a challenge from the start. Most deaf people rely on human interpreters or written conversation to communicate, but what if they had a voice?
That’s what the MotionSavvy team was determined to find out. Three deaf students from the University of Rochester, Wade Kellard, Ryan Hait-Campbell, and Jordan Stemper, put their development skills to the test along with others to create a mobile communication device that would help deaf people interact with the world in a natural way.
The product, although still under development, is powered by Leap Motion software. The team envisions an affordable, mobile, tablet-like device that users can easily take anywhere they go. The current prototype is being beta-tested on an Android tablet and relies on gesture technology to analyze ASL hand motions and translate them into spoken English.
The current challenge lies in translating the gestures into proper English. American Sign Language has a different grammatical structure and is a language all its own. Converting this complex structure into everyday English is one of the hurdles MotionSavvy faces.
Another challenge is creating a program flexible enough to handle the gestural differences of each user. Some users may flick their wrists differently or sign at a faster or slower pace. The team is working hard on developing software that can do it all, accurately and affordably.
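To picture how software might absorb those per-user differences, here is a minimal, purely hypothetical Python sketch (not MotionSavvy’s actual software): each known sign is stored as a feature-vector template, incoming hand data is normalized to downplay hand size, and the closest template wins. Real systems are far more sophisticated, but the idea of matching against per-user templates is the same.

```python
# Illustrative sketch only: nearest-neighbor matching of a signed gesture
# against stored templates. Feature vectors here are made up; a real system
# would derive them from tracked hand and finger positions.
import math

def normalize(features):
    """Scale a feature vector to unit length so hand size matters less."""
    magnitude = math.sqrt(sum(f * f for f in features)) or 1.0
    return [f / magnitude for f in features]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def classify(sample, templates):
    """Return the sign label whose template is closest to the sample."""
    sample = normalize(sample)
    best_label, best_dist = None, float("inf")
    for label, template in templates.items():
        d = distance(sample, normalize(template))
        if d < best_dist:
            best_label, best_dist = label, d
    return best_label

# Hypothetical per-user templates: one averaged feature vector per sign.
templates = {
    "hello": [0.9, 0.1, 0.2],
    "thanks": [0.1, 0.8, 0.4],
}

print(classify([0.85, 0.15, 0.25], templates))  # closest to "hello"
```

Calibrating those templates per user, rather than shipping one fixed set, is one plausible way software could adapt to an individual’s signing style.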
The current prototype can accurately translate the entire alphabet, all numbers, and roughly 100 words and basic phrases. The team plans to conduct trials with some of the best ASL users to discover the differences in each person’s expression of the language. In the same way that someone can say “I love you” in a tone that clearly means “I love you like I love ice cream,” they can also say it with a romantic twinkle in their eyes. The team at MotionSavvy wants to get the proper interpretation each time, sparing friend-zone dwellers any unnecessary heartbreak.
The band of students dropped out of school to pursue their vision under the LEAP.AXLR8R program. The former students report that they are learning more within this program than they did at school (go figure), and they are determined to see the technology succeed. After all, half of the development team is deaf and has a personal reason to ensure its success. Sad but true, oftentimes selfish reasons trump benevolent ones. Nonetheless, kudos, boys. Our hat is off to you.
See more news at:
http://twitter.com/Cabe_Atwell