New Scientist published a great article last week summarizing two new gesture computing technologies developed by colleagues at Microsoft Research and the University of Washington and presented at CHI 2012 earlier this month:
The advent of multi-touch screens and novel gaming interfaces means the days of the traditional mouse and keyboard are well and truly numbered. With Humantenna and SoundWave, you won’t even have to touch a computer to control it; gesturing in its direction will be enough…
As the name suggests, Humantenna uses the human body as an antenna to pick up the electromagnetic fields — generated by power lines and electrical appliances — that fill indoor and outdoor spaces. Users wear a device that measures the signals picked up by the body and transmits them wirelessly to a computer. “It’s just an electrode that measures voltage, digitises it and sends the signal for processing,” says Desney Tan of Microsoft Research in Redmond, Washington [more, including a video describing how Humantenna works, after the jump].
By studying how the signal changes as users move through the electromagnetic fields, the team was able to identify gestures, such as a punching motion or swipe of the hand. In all, the researchers found that the technology could detect 12 gestures with over 90 per cent accuracy.
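To make the idea concrete, here is a minimal sketch of how a gesture could be recognised from a digitised voltage signal like the one Humantenna captures: window the signal, summarise it as energy in a few frequency bands, and match against per-gesture templates. This is purely illustrative; the function names, band choices, and nearest-centroid matching are assumptions, not the team's actual pipeline.

```python
import math

def band_energies(samples, fs, bands):
    """Energy of the signal in each (lo, hi) frequency band in Hz,
    computed with a naive DFT (fine for short windows)."""
    n = len(samples)
    feats = []
    for lo, hi in bands:
        energy = 0.0
        for k in range(n // 2):
            f = k * fs / n
            if lo <= f < hi:
                re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
                im = sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
                energy += re * re + im * im
        feats.append(energy)
    return feats

def classify(feats, templates):
    """Nearest-centroid match: return the gesture whose stored feature
    template is closest (squared Euclidean distance) to this window."""
    return min(templates,
               key=lambda g: sum((a - b) ** 2 for a, b in zip(feats, templates[g])))
```

In use, each gesture would be represented by a template of band energies learned from training examples, and each incoming window of body-antenna voltage would be classified against those templates.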
One version of the system, presented this week at the Conference on Human Factors in Computing Systems in Austin, Texas, runs off a sensor that sits in a small bag. With training, that sensor can learn to recognise specific gestures. Another paper, under review, describes a version that relies on a much smaller wristwatch-sized sensor. Thanks to advances in processing techniques, this newer system needs no training to recognise the same 12 gestures.
The team was able to do away with training after realising that low-frequency components of the signal are similar, no matter which electrical objects are producing them. By focusing on these common patterns, the system can detect the same gesture even when it is performed in a different location with different electromagnetic fields. “That’s a pretty big step,” says Tan.
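The location-independence idea above can be sketched as a feature extractor that discards everything above a low-frequency cutoff and normalises what remains, so that gesture signatures look alike even when the surrounding electrical interference differs. Again, this is a toy illustration under assumed names and parameters, not the processing described in the paper.

```python
import math

def dft_mag(samples):
    """Magnitude spectrum of a short window via a naive DFT."""
    n = len(samples)
    mags = []
    for k in range(n // 2):
        re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
        im = sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
        mags.append(math.hypot(re, im))
    return mags

def low_freq_signature(samples, fs, cutoff_hz=100.0):
    """Keep only spectral bins below cutoff_hz and L1-normalise them,
    so location-specific high-frequency interference drops out."""
    n = len(samples)
    mags = dft_mag(samples)
    keep = [m for k, m in enumerate(mags) if k * fs / n < cutoff_hz]
    total = sum(keep) or 1.0
    return [m / total for m in keep]
```

With this kind of signature, the same gesture performed in two rooms with different appliances (different high-frequency interference) yields nearly identical features, which is the intuition behind dropping the training step.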
All sorts of applications would open up if Humantenna can be commercialised. The body could become a kind of universal remote control, and basic gestures such as pointing or swiping might be used to control lights, appliances and computers in the home. Fitness monitoring is another possibility, says Tan. We already have devices that can infer how hard a person is exercising by tracking step patterns, but Humantenna could provide a more holistic measure by monitoring whole body movements…
Simple as it is, Humantenna still requires users to wear a sensor. But Tan’s team, working in collaboration with researchers at the University of Washington in Seattle, has developed another gesture-recognition device that will need no new hardware.
SoundWave, which is also being presented in Austin, relies on an inaudible tone generated by a laptop loudspeaker. When a hand moves in front of the laptop, it changes the frequency of the sound, which the machine’s microphone picks up. By matching characteristic frequency changes with specific hand movements, SoundWave can detect a handful of gestures with an accuracy of 90 per cent or more, even in noisy environments such as a cafeteria…
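The physics SoundWave exploits is the Doppler effect: a hand moving toward the laptop shifts the reflected pilot tone up in frequency, and a hand moving away shifts it down. A minimal sketch of that detection idea follows — compare spectral energy just above versus just below the pilot frequency. The function names and thresholds are assumptions, and the test below uses audible frequencies for simplicity, whereas the real system uses an inaudible tone near the top of the speaker's range.

```python
import math

C_SOUND = 343.0  # approximate speed of sound in air, m/s

def doppler_shift(f0, v):
    """Two-way Doppler shift (Hz) of a tone at f0 reflected off a
    surface moving at v m/s (positive v = toward the microphone)."""
    return 2 * v * f0 / C_SOUND

def motion_direction(samples, fs, f0, guard_hz=20.0, band_hz=200.0):
    """Compare DFT energy in a band just above vs just below the pilot
    tone: more energy above suggests an approaching hand, below a
    receding one."""
    n = len(samples)

    def band_energy(lo, hi):
        e = 0.0
        for k in range(n // 2):
            f = k * fs / n
            if lo <= f < hi:
                re = sum(samples[t] * math.cos(2 * math.pi * k * t / n) for t in range(n))
                im = sum(samples[t] * math.sin(2 * math.pi * k * t / n) for t in range(n))
                e += re * re + im * im
        return e

    above = band_energy(f0 + guard_hz, f0 + guard_hz + band_hz)
    below = band_energy(f0 - guard_hz - band_hz, f0 - guard_hz)
    if above > 2 * below:
        return "approaching"
    if below > 2 * above:
        return "receding"
    return "still"
```

Mapping characteristic sequences of such shifts (fast vs slow, repeated vs single) to gestures is, in spirit, how a handful of motions can be distinguished with just a speaker and a microphone.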
To learn more, check out the full article or see a video of Humantenna in action below.
And if you have a cool research result to report, please share it with us here – and we’ll feature it as a Computing Research Highlight of the Week.
(Contributed by Erwin Gianchandani, CCC Director)