At last, phones will get ultrasound gesture control in the first half of 2015
We’ve been following Elliptic Labs’ work on ultrasound gesture control for quite a while, but until now no time frame had been given. Ahead of CEATEC in Tokyo, the company finally announced that its input technology, developed in partnership with Murata, will arrive on phones in the first half of 2015. And that’s not the only good news: on top of the usual swiping gestures for images, games and navigation (we saw some of this last year), there’s now a new capability called “multi layer interaction,” which uses your hand’s proximity to the screen to toggle between different actions or layers. It’s potentially useful for glancing at different types of messages on the lock screen, as demoed in the video after the break.
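To give a rough sense of how proximity-based layer switching could work, here’s a tiny conceptual sketch. The thresholds, layer names and function are all invented for illustration; this is not Elliptic Labs’ actual SDK or API.

```python
# Conceptual sketch: map a hand-distance reading (in cm) to a UI "layer".
# All names and thresholds below are hypothetical, not from any real SDK.
LAYERS = ["lock_screen_summary", "message_previews", "full_notifications"]

def layer_for_distance(distance_cm: float) -> str:
    """Pick a display layer based on how close the hand is to the screen."""
    if distance_cm > 20:
        return LAYERS[0]  # hand far away: just a glanceable summary
    elif distance_cm > 8:
        return LAYERS[1]  # mid-range: show message previews
    return LAYERS[2]      # hand close: show full detail

print(layer_for_distance(30))
print(layer_for_distance(5))
```

In practice a real implementation would also debounce the raw distance readings so the UI doesn’t flicker between layers as the hand hovers near a threshold.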
Compared to its optical counterparts, this ultrasound solution is more convenient for everyday use, as it has a 180-degree active area around the entire face of the device. The others need your hand to be positioned in front of a camera or a dot sensor, which can easily be missed if you’re not waving carefully; though in their defense, laser-based gesture cameras capture more detail, which is useful for other applications like 3D scanning and precise point-and-click. At the end of the day, it’s all about who can perfect the basic user experience, so stay tuned as we hit the show floor tomorrow to see whether this is as good as it claims to be.