Google's Project Soli: Radar-driven gestures up close
How do we get the most out of ever-shrinking form factors without sacrificing usability? It turns out that we ourselves are the ultimate user interface, and our fingers are high-bandwidth tools: they are easy to use, they provide instant feedback (no haptics required), and, of course, they are very ergonomic.
By tracking natural hand movements with radar, Google was able to get machines to identify gestures made in thin air and translate them into commands. This could make touching a wearable (or any device, really) unnecessary in many situations.
Equally remarkable is the astounding pace of development behind Project Soli. In just 10 months, Google went from a PC-sized radar emitter to a chip no bigger than a dime. On top of that, Google has built a close approximation of the developer-ready test board it plans to release later this year. The APIs will also arrive later this year, fully accessible to the developer community.
ATAP showed off some impressive features of the technology during its presentation. The Project Soli exhibit in the convention hall was more back to basics, but it demonstrated how effectively this can work. Unfortunately, there was not enough room for us to capture all the action in one frame, so we kept our panning up and down as smooth and unobtrusive as possible.