A group of University of Washington researchers is working on MobileASL (American Sign Language), software that would allow deaf and hard-of-hearing people to use sign language on cellphones.
Texting over your cellphone works fine to instruct, remind or arrange a meeting time. If you want company or context, however, you use it the old-fashioned way and speak. It’s no different for the deaf and hard-of-hearing who sign. Texting works, but they want conversation, company, context.
That has spurred University of Washington researchers to develop software efficient enough to support real-time, two-way video on cellphones, allowing users to converse in sign language. Supported mainly by grants from the National Science Foundation, the team plans to conduct further field studies on the device, called MobileASL (American Sign Language), next year.
The main obstacle is low data-transmission rates on U.S. cellular networks and limited processing power on mobile devices. That has prevented real-time video transmission at frame rates high enough to carry sign language. (People are already able to use sign language effectively in Japan and Sweden because of higher-bandwidth networks.)
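The trade-off described above comes down to simple arithmetic: the frame rate a link can sustain is its data rate divided by the size of each compressed frame. A small sketch makes this concrete; the specific numbers below are illustrative assumptions, not figures from the article or the MobileASL project.

```python
def max_fps(network_kbps, kbits_per_frame):
    """Highest frame rate a link can sustain, given the network's
    data rate and the size of one compressed video frame."""
    return network_kbps / kbits_per_frame

# Hypothetical figures: a slow cellular link at about 30 kbps versus a
# faster 3G-class link at 384 kbps, with roughly 3 kilobits per
# heavily compressed frame.
print(max_fps(30, 3))    # 10.0 fps ceiling on the slow link
print(max_fps(384, 3))   # 128.0 fps ceiling on the faster link
```

Under these assumed numbers, the slower link tops out around 10 frames per second, which shows why aggressive compression matters for making signing intelligible on low-bandwidth networks.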
As a way around the limits, researchers have employed video-compression technology that devotes more bits to what is important in sign language — the face and hands — while allocating fewer bits to the rest of the image. During eye-tracking studies, researchers found that signers spend most of their time focused on the other person’s face and take in the hands peripherally.
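The idea of spending more bits on the face and hands can be sketched as a weighted split of each frame’s bit budget across image blocks. This is a minimal illustration of region-of-interest bit allocation, not MobileASL’s actual encoder; the 4-to-1 weighting is a hypothetical choice.

```python
def allocate_bits(blocks, total_bits, roi_weight=4):
    """Split a frame's bit budget across blocks, weighting
    region-of-interest blocks (face, hands) more heavily than
    background blocks. Each block is a dict with a boolean 'roi' flag."""
    weights = [roi_weight if b["roi"] else 1 for b in blocks]
    total_weight = sum(weights)
    return [total_bits * w / total_weight for w in weights]

# A toy frame: two ROI blocks (face, hands) and two background blocks.
frame = [{"roi": True}, {"roi": True}, {"roi": False}, {"roi": False}]
print(allocate_bits(frame, 1000))  # [400.0, 400.0, 100.0, 100.0]
```

With these assumed weights, each face or hand block receives four times the bits of a background block, which mirrors the eye-tracking finding that viewers fixate on the face and perceive the hands peripherally.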
A short video describing the project is posted at youtube.com/watch?v=FaE1PvJwI8E.
The MobileASL project is led by principal investigator Eve Riskin, a UW professor of electrical engineering. Co-leaders include Information School professor Jacob Wobbrock and Richard Ladner, a longtime professor of Computer Science and Engineering.
Ladner, who is fluent in sign language, has a special connection to the work. Both his parents were deaf, and he saw firsthand how their world opened when they got a bulky teletypewriter (which allowed for texting conversations) in the ’70s.
“I saw the impact it had on their lives and how much technology could make a difference,” he says. “Texting today is popular among the deaf and hard-of-hearing, but for a lot of deaf people sign is more comfortable. It’s certainly faster.”
Ladner, 65, and a UW professor for 38 years, has run workshops for students with disabilities and operated weeklong summer academies for “Deaf and Hard of Hearing in Computing.” He also is a board member for Gallaudet University, the world’s only liberal-arts college for the deaf.
See mobileasl.cs.washington.edu/index.html for more information.