Can you hear me now?
Touch screens will one day be antiquated, just like every other technology that was once cutting-edge. But what does the future hold?
Some will say that the ultimate interface will eventually be your brain, but the next step in that journey seems to be gesture control.
These days, gesture control is accomplished using your device’s camera, but Elliptic Labs may have just come up with the next-gen solution.
Their “multi-layer interaction” uses sound, specifically ultrasound, to literally listen to your gestures.
With Elliptic Labs’ gesture recognition technology the entire zone above and around a mobile device becomes interactive and responsive to the smallest gesture. The active area is 180 degrees around the device, and up to 50 cm with precise distance measurements made possible by ultrasound… The interaction space can also be customized by device manufacturers or software developers according to user requirements.
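Those "precise distance measurements" come from a basic property of ultrasound: you can time how long an echo takes to bounce off your hand and come back. As a rough, hypothetical sketch (not Elliptic Labs' actual code), the math looks like this:

```python
# Time-of-flight ranging: distance = speed of sound * round-trip time / 2.
# The speed-of-sound value assumes air at roughly room temperature.
SPEED_OF_SOUND = 343.0  # meters per second

def echo_distance(round_trip_seconds):
    """Distance to a reflecting hand, given the echo's round-trip time."""
    return SPEED_OF_SOUND * round_trip_seconds / 2

# A hand at the 50 cm edge of the interactive zone echoes back in ~2.9 ms:
print(echo_distance(0.0029))
```

Since ultrasound travels far slower than light, those millisecond-scale delays are easy for a phone to measure, which is what makes fine-grained distance tracking practical.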
It uses less power than your camera and is more accurate. Sounds like the “wave” of the future to me. Hehe.
[via]