Roozbeh Jafari, an associate professor in the Department of Biomedical Engineering at Texas A&M University, is leading the development of a tool for American Sign Language (ASL) translation. While previous attempts at automatic ASL translation have largely relied on cameras and visual tracking technology, Jafari’s project tracks muscle movement and external motion. “The sensor is based on EMG, or electromyogram technology,” Jafari said. “Combined with the external motion sensors, which show us the overall hand movement, the EMG allows us to discriminate between gestures,” he said. “A fine-grain of interpretation […] motion sensors give us the overall sense and muscle activities give us information about the fine-grained intent.”
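The team’s own software has not been published, but the fusion idea Jafari describes, combining coarse motion data with fine-grained muscle activity, can be illustrated with a minimal sketch. The example below concatenates hypothetical EMG and inertial-motion features and trains an off-the-shelf classifier on synthetic stand-in data; the feature choices, window counts, and use of scikit-learn are assumptions for illustration, not details of Jafari’s system.

```python
# Hypothetical sketch: fuse EMG features (fine-grained muscle activity) with
# inertial-motion features (overall hand movement) for gesture classification.
# All data here is synthetic; feature names and the classifier are illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_windows = 400                                     # gesture windows (synthetic)
emg_features = rng.normal(size=(n_windows, 8))      # e.g. RMS per EMG channel
motion_features = rng.normal(size=(n_windows, 6))   # e.g. mean accel/gyro axes
labels = rng.integers(0, 40, size=n_windows)        # prototype recognizes ~40 signs

# Sensor fusion by simple feature concatenation.
X = np.hstack([emg_features, motion_features])
X_train, X_test, y_train, y_test = train_test_split(X, labels, random_state=0)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))
```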
The prototype was revealed this past June at the Institute of Electrical and Electronics Engineers (IEEE) 12th Annual Body Sensor Networks Conference, but it still has a ways to go. It currently recognizes only 40 ASL signs and transmits them to a computer via Bluetooth. The goal is to eliminate the need for a separate device by building the computer into the wearable itself and to refine the system’s sensitivity, allowing for more natural conversation.
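On the receiving end, the current setup sends recognized signs to a computer over Bluetooth. As a hedged illustration only, the sketch below reads newline-delimited gesture labels from a Bluetooth serial (RFCOMM) link using pyserial; the port name, baud rate, and text protocol are assumptions, since the prototype’s actual protocol has not been published.

```python
# Hypothetical sketch: receive recognized sign labels from a wearable over a
# Bluetooth serial (RFCOMM) link. The port name and the newline-delimited
# text format are assumed for illustration.
import serial

PORT = "/dev/rfcomm0"   # e.g. "COM5" on Windows; assumed binding for the device
with serial.Serial(PORT, baudrate=115200, timeout=1.0) as link:
    while True:
        line = link.readline().decode("utf-8", errors="ignore").strip()
        if line:
            print("recognized sign:", line)   # e.g. display or speak the word
```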
Jafari’s ASL translation system is still in development and does not yet have a name, but its implications lay significant groundwork for the future of ASL and motion-sensor technology. “When you think about it, you can use this for many other applications,” he said. “Think about your house […] you might have a smart house, but right now to turn on and off all your devices you need to go to your mobile phone, each app, and then use them. What if you could control your house with hand gestures, communicating with each of your devices?”
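To make the smart-home idea concrete, here is a minimal sketch that maps recognized gesture labels to device commands. The gesture names, device identifiers, and the send_command stub are all invented for illustration; no real home-automation API or any part of Jafari’s system is implied.

```python
# Hypothetical sketch: route recognized hand gestures to smart-home commands.
# Gesture names, devices, and send_command are stand-ins for illustration.
GESTURE_TO_COMMAND = {
    "lights_on":     ("living_room_lights", "on"),
    "lights_off":    ("living_room_lights", "off"),
    "thermostat_up": ("thermostat", "raise_setpoint"),
}

def send_command(device: str, action: str) -> None:
    # Stand-in for whatever protocol the devices actually speak (e.g. MQTT).
    print(f"-> {device}: {action}")

def handle_gesture(label: str) -> None:
    if label in GESTURE_TO_COMMAND:
        device, action = GESTURE_TO_COMMAND[label]
        send_command(device, action)

handle_gesture("lights_on")   # example: turn on the living-room lights
```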
#ASL #tech