The world is always moving towards new and creative ideas, focused on bringing dreams, visions, and even movie scenes to reality. In fact, many of the technologies and devices we see today were inspired by science fiction or literature from the past. In the original Star Trek series, we caught glimpses of Captain Kirk speaking into the 'communicator', the earliest portable handheld communication device, later brought to life by Martin Cooper. In Star Trek: The Next Generation, we saw the first tablet computer, and Steve Jobs gave the series its due credit when he launched the first iPad. Submarines, satellites, rockets, and even the Internet are great examples of literature or fictional works that inspired scientists and engineers to push the boundaries of technology and reality. Movies like Iron Man and Minority Report show gesture-based interaction and holographic projection as futuristic technologies, but these may fast become reality. We have already seen the rapid success of motion and gesture recognition, first with the Nintendo Wii, then with the Leap Motion and Kinect, which use state-of-the-art sensors and cameras to track movement.
At CEATEC Japan, Elliptic Labs released a new Android SDK which harnesses ultrasound technology for gesture control. What makes this different from the camera-based control on the Galaxy S4 is that ultrasound doesn't require the camera at all. It will perform better at night and under low light, and it avoids the battery and processor drain involved in image processing. It also works outside a field of view (the major constraint of camera-based recognition), meaning that even motion behind or below the device can be used as a gesture, opening up a world of new possibilities for waking up, silencing, or otherwise interacting with devices.
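The physics behind this kind of sensing is time-of-flight ranging: a transducer emits an ultrasonic pulse, microphones pick up the echo off your hand, and the round-trip delay gives the hand's position. Here is a minimal sketch of that core calculation; the function name and numbers are illustrative assumptions, not Elliptic Labs' actual implementation:

```python
# Time-of-flight ranging: the principle behind ultrasonic gesture sensing.
# A speaker emits a pulse, a microphone hears the echo, and the delay
# between the two tells you how far away the reflecting hand is.

SPEED_OF_SOUND_M_S = 343.0  # speed of sound in air at roughly 20 °C

def echo_delay_to_distance(delay_s: float) -> float:
    """Convert a round-trip echo delay (seconds) to one-way distance (metres)."""
    # The pulse travels to the hand and back, so halve the total path.
    return SPEED_OF_SOUND_M_S * delay_s / 2.0

# A hand ~15 cm from the device returns an echo in under a millisecond:
print(round(echo_delay_to_distance(0.000875), 3))  # → 0.15
```

Because each echo round trip takes well under a millisecond at hand-waving distances, the sensing itself is nearly instantaneous; the latency budget is dominated by signal processing, which is where the sub-100 ms claim below comes in.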
The company claims sub-100 ms latency. To put that in perspective, the touchscreen on the Galaxy S4 has a latency of 114 ms, the HTC One around 120 ms, and many less powerful phones much more. In practice, swiping in the air with Elliptic Labs' hardware could help your Fruit Ninja skills work a little faster, or your Angry Birds avatar hit that pig a little sooner, than a touchscreen would. Ultrasound is also far less battery-hungry and processor-intensive than camera-based recognition, and it can recognize gestures beyond the cone of the camera, which makes it very promising.
Elliptic Labs already has a Windows 8 product using ultrasound recognition, with gestures powering Windows 8 laptops and tablets, but the Android SDK allows the technology to be baked into smartphones and tablets, a much hotter market. All a phone needs is a few small Murata chips, and the SDK handles the rest. The boxes shown are just for demonstration; in reality, the chips will take up only a fraction of the space on a phone's circuit board. The company claims full compatibility with ARM chipsets.
Elliptic Labs says a few microphones and transceivers are all it takes to hook a phone up with ultrasound recognition and gesture control via its SDK. Android OEMs are expected to release the first gesture-powered handsets with this next year. CTO Haakon Bryhni says the company is already in advanced prototyping stages with three OEMs: one building a tablet and two building smartphones. Given how constantly smartphone manufacturers look for ideas to differentiate their products, this may be their line in the sand. We may even see some special-edition ultrasound phones, like the HP Leap Motion laptop, and this may even become a standard in the future. Technology is moving in very interesting ways now, and there truly is no time like the present.
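From an app developer's point of view, gesture SDKs like this one typically surface recognized gestures as named events you register callbacks for. The sketch below shows that general pattern only; the class and gesture names here are hypothetical illustrations, not the real Elliptic Labs API:

```python
# A generic gesture-event dispatcher, sketching how an app might consume
# gestures from a recognizer: register handlers by gesture name, then the
# recognizer calls dispatch() whenever it detects a gesture.

class GestureDispatcher:
    def __init__(self):
        self._handlers = {}  # gesture name -> list of callbacks

    def on(self, gesture: str, handler) -> None:
        """Register a callback for a named gesture, e.g. 'swipe_left'."""
        self._handlers.setdefault(gesture, []).append(handler)

    def dispatch(self, gesture: str) -> int:
        """Invoke every handler registered for this gesture.

        Returns the number of handlers that ran, so callers can tell
        whether the gesture was handled at all.
        """
        handlers = self._handlers.get(gesture, [])
        for handler in handlers:
            handler()
        return len(handlers)

# Example wiring: wave at the phone to wake the screen.
dispatcher = GestureDispatcher()
dispatcher.on("wave", lambda: print("waking screen"))
dispatcher.dispatch("wave")  # prints "waking screen"
```

The appeal of ultrasound here is that gestures like "wave" can fire even when the hand is beside or behind the device, outside any camera's field of view.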