SEATTLE
A mouse pointer that moves accurately in response to vocal sounds could mean the difference between dependence and independence for someone with motor impairments.
A University of Washington researcher with a specialty in speech recognition is putting his skills to use teaching computers to do just that.
Jeff Bilmes, an associate professor of engineering who leads the team creating the Vocal Joystick under a National Science Foundation grant, said most existing controllers are far from ideal.
Some are controlled by the breath or the tongue, but then users can’t talk while operating their PC, and if the device falls out of place, someone else must put it back in.
“The one last bit of independence that such individuals have has been lost,” Bilmes said.
Other solutions, like sensors implanted under the skin or eye-movement trackers, can be invasive or expensive, he noted. Vocal Joystick is neither.
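The core idea behind a system like this is mapping continuous vocal qualities to continuous cursor motion. Below is a minimal illustrative sketch of that concept, not the actual Vocal Joystick implementation: the vowel-to-direction table, the function name, and the loudness-to-speed mapping are all assumptions invented for illustration.

```python
import math

# Hypothetical mapping of recognized vowel sounds to compass directions
# (angles in degrees). This table is an illustrative assumption, not the
# actual mapping used by the Vocal Joystick.
VOWEL_DIRECTIONS = {
    "a": 0,    # right
    "e": 90,   # up
    "i": 180,  # left
    "o": 270,  # down
}

def cursor_velocity(vowel: str, loudness: float, max_speed: float = 10.0):
    """Map a detected vowel and a loudness estimate (0..1) to an
    (dx, dy) cursor velocity: vowel picks the direction, loudness
    scales the speed."""
    angle = math.radians(VOWEL_DIRECTIONS[vowel])
    # Clamp loudness so out-of-range audio levels can't fling the cursor.
    speed = max(0.0, min(loudness, 1.0)) * max_speed
    return (speed * math.cos(angle), speed * math.sin(angle))
```

In this sketch, a loud "a" sound would push the cursor quickly to the right, while a quiet "e" would nudge it slowly upward; a real system would derive these parameters from live speech-recognition output rather than pre-labeled inputs.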