Countless companies are working on ways to make input on mobile devices and wearables more efficient. Some use prediction, others use your voice. Google’s SoliType uses radar, Geek reports.
Google’s ATAP (Advanced Technology and Projects) group has built a “full gesture recognition pipeline” that lets it map hand gestures with a very high degree of accuracy.
Add radar to a smartwatch, and you could scroll through menus and make selections without touching the crown. Add it to a smartphone, and you could type out messages simply by moving your fingers in the air.
That’s what engineer Alex Bravo demonstrated in a video he recently posted to his Google+ page. The current prototype is a USB peripheral that looks a bit like a trackpad. The big difference, of course, is that you don’t have to physically touch it.
You can see from the video that SoliType has a long way to go before it’s ready to replace our physical and on-screen keyboards. Banging out even a short sentence takes quite a while, and there’s a fairly steep learning curve to overcome. Typing in the air isn’t going to be anything like hammering out messages with your thumbs.