The Norwegian firm Elliptic Labs’ booth at CES next week will feature several implementations of a touchless, gesture-based interface for tablets and mobile devices, according to the company. Unlike the Kinect, Elliptic’s interface, called Ultrasonic Touchless Input, tracks hand movements using echolocation. Bathing the user in a silent ultrasonic torrent, it measures the return time of rebounding sonic impulses to map gestures in three dimensions. Paul Marks at New Scientist limns a few of the most obvious uses of such an interface:
Imagine you’re reading a cake recipe onscreen and you don’t want to get flour or butter on your £500 gadget: simply sweeping your hand in front of the screen will allow you to turn to the next page. Ditto engine oil/axle grease covered hands when you’re reading a motorbike manual. Clearly, applications in sign language beckon, too – and gaming of course.
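The echolocation idea behind the interface is classic time-of-flight ranging: a pulse goes out, bounces off the hand, and the round-trip time yields the distance. A minimal sketch of that principle (illustrative only; nothing here reflects Elliptic Labs’ actual implementation, and the function name is hypothetical):

```python
# Time-of-flight ranging sketch: distance to a reflecting hand from
# the round-trip time of an ultrasonic echo. Illustrative only; this
# is not Elliptic Labs' implementation.

SPEED_OF_SOUND_M_S = 343.0  # approximate speed of sound in air at 20 °C


def echo_distance_m(round_trip_s: float) -> float:
    """Distance to the reflecting surface, in metres.

    The pulse travels out and back, so the one-way distance is half
    the total path covered during the round trip.
    """
    return SPEED_OF_SOUND_M_S * round_trip_s / 2.0


# A hand that returns an echo in 1 ms is roughly 0.17 m away.
distance = echo_distance_m(0.001)
```

With several emitters and receivers, distances like this one measured from different vantage points can be combined to place the hand in three dimensions, which is how a gesture can be mapped rather than merely detected.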
According to Marks, however, Elliptic Labs hasn’t signed a deal with any manufacturers yet, so it may be a while before we see a consumer application of this technology.
Another potential use would be in netbooks and laptops, where touchscreens aren’t kinesthetically feasible and touchpads take up precious real estate. At the rollout of the new MacBook Airs, Steve Jobs noted that Apple experimented with touchscreens for the ultralight notebooks, but found them unwieldy. We’re all used to gesturing at our laptops already; Elliptic’s touchless interface would be a natural extension of an already complicated relationship. And even in demo, the device hints at fascinating addenda to the history of gesture, as motions born in the real world put down roots in the virtual.
[via New Scientist]