Watch Google's revolutionary 'Motion Sense' navigation work with Spotify on Pixel 4
There is a reason why Google is going with a giant top bezel on the Pixel 4, at a time when the Note 10 ranks as the most compact large-screen phone in our database precisely because of the shaved-off strip up there. No, that reason is not a 3D face-scanning kit, although there will be one, and Google is collecting faces to perfect it at $5 a pop.
Hand gestures come more naturally than pushing against a piece of glass, yet so far the technology for recognizing them on a phone has been imperfect, as it relied on camera sensors. Google is aiming to revolutionize the way we interact with our mobile devices by employing its radar-based Motion Sense technology, which would include, but not be limited to, natural gestures that can be employed in any orientation of the phone, day or night.
The chip that makes it possible is dubbed "Soli," and employs "a new sensing technology that uses miniature radar to detect touchless gesture interactions." Google has successfully miniaturized the technology into a Soli chip as small as a pinky nail, yet one that can detect the minutest of motions. It works on the same principle as the big flight radars that detect airplane movements in the sky.
Unfortunately, this was precisely why the FCC didn't let the Soli chip fly until December 31, 2018, when it granted Google a waiver from some of its requirements for radars in the commercial 57-64 GHz frequency band.
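To get a feel for why such a high frequency band suits tiny gesture radars, here is a rough back-of-the-envelope sketch (our own illustration of the general Doppler radar principle, not Google's published Soli specifications): the Doppler shift a radar sees is f_d = 2·v·f_c/c, so at a 60 GHz carrier even a slow hand motion produces a shift of hundreds of hertz, which is easy to measure.

```python
# Illustrative Doppler-shift calculation for a 60 GHz radar.
# These numbers are a generic physics sketch, not Soli's actual parameters.

C = 3.0e8  # speed of light, m/s


def doppler_shift(velocity_mps: float, carrier_hz: float) -> float:
    """Return the round-trip Doppler shift in Hz for a target
    moving at velocity_mps toward a radar at carrier_hz."""
    return 2.0 * velocity_mps * carrier_hz / C


# A hand swiping at 1 m/s in front of a 60 GHz radar:
print(doppler_shift(1.0, 60e9))  # 400.0 Hz
```

A 400 Hz shift on a 60 GHz carrier is tiny in relative terms, but trivially detectable in baseband, which is part of why millimeter-wave radar can resolve such small, slow movements at close range.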
This is also the reason why some countries won't be getting the Motion Sense feature on their Pixel 4. If you want to see how Google's miniaturized Soli radar works on the phone that will be announced tomorrow, look no further than the video below, demonstrating how to swap Spotify songs with a wave of your hand.
We have to say that it seems to work better than similar solutions done only with the front camera, but let's wait to see how it performs on a retail unit before we pass any judgment.