Can Pixel 4's radical Motion Sense navigation spell 'the end of the touchscreen'?
Looking at the leaked renders and screen protectors, there was a mysterious and quite large cutout to the right of the Face ID-style module. The best guess was that Google would use it to revolutionize interface navigation on its Pixel phones by introducing the gesture recognition platform it has only had in prototype form so far, and Google has now confirmed it will indeed ship it under the official name Motion Sense. The chip that makes it possible is dubbed "Soli," and employs "a new sensing technology that uses miniature radar to detect touchless gesture interactions."
Google's Project Soli sounds suspiciously like what LG did with the Multi ID and hand-tracking algorithms enabled by the 3D-sensing front camera kit on the G8 ThinQ, yet it relies on radar waves rather than cameras to detect the motion of the human hand.
Google's successful miniaturization of the technology fits in a Soli chip that is as small as a pinky nail, yet can detect the minutest of motions. It works on the same principle as the big flight radars that detect airplane movements in the sky. Unfortunately, this was precisely why the FCC didn't let the Soli chip fly until December 31, 2018, when it granted Google a waiver from some of its requirements for radars in the commercial 57-64 GHz frequency band.
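To put that 57-64 GHz band in perspective, here is a back-of-the-envelope sketch (illustrative radar math, not anything Google has published about Soli's actual signal chain): the wavelength at 60 GHz is about 5 mm, and a radar sweeping the full 7 GHz of bandwidth has a nominal range resolution of roughly 2 cm.

```python
# Rough millimeter-wave radar figures for the 57-64 GHz band.
# Generic radar formulas for illustration only; Soli's real
# processing is not public.

C = 299_792_458.0  # speed of light, m/s


def wavelength(freq_hz):
    """Wavelength of a carrier at the given frequency."""
    return C / freq_hz


def range_resolution(bandwidth_hz):
    """Nominal radar range resolution: c / (2 * B)."""
    return C / (2 * bandwidth_hz)


print(wavelength(60e9))        # ~0.005 m, i.e. a ~5 mm wavelength
print(range_resolution(7e9))   # ~0.021 m with the full 7 GHz band
```

The short wavelength is the key figure: sub-millimeter motion shows up as a measurable fraction of a 5 mm wave, which is what lets such a tiny chip notice tiny finger movements.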
How does Motion Sense work?
Google claims that the radar and the accompanying software can "track sub-millimeter motion at high speeds with great accuracy." The Soli chip does it by pushing out electromagnetic waves in a broad sweep that get reflected back to the tiny antenna inside.
A combination of sensors and software algorithms then accounts for the energy these reflected beams carry, the time they took to come back, and how they changed on the way, determining "the object’s characteristics and dynamics, including size, shape, orientation, material, distance, and velocity." While the small chip can't match the spatial resolution of larger installations, Google has refined its motion-sensing and prediction algorithms to tolerate slight variations in gestures, mapping them all to one and the same interface action.
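The time-of-flight and motion relationships described above can be sketched with textbook radar formulas (again illustrative, not Soli's implementation): distance comes from halving the echo's round trip, and a reflector that moves shifts the echo's phase by a large, easily measured amount relative to the ~5 mm wavelength.

```python
# Textbook time-of-flight and phase relationships for a 60 GHz
# radar. Illustrative only; not Soli's actual signal processing.
import math

C = 299_792_458.0  # speed of light, m/s


def distance_from_round_trip(t_seconds):
    """The echo covers the path twice (out and back), so halve it."""
    return C * t_seconds / 2


def echo_phase_shift_rad(displacement_m, wavelength_m):
    """Moving a reflector by delta_d shifts the echo's phase by
    4 * pi * delta_d / lambda, since both legs of the path change."""
    return 4 * math.pi * displacement_m / wavelength_m


# A 2-nanosecond round trip corresponds to a target ~30 cm away.
print(distance_from_round_trip(2e-9))

# A 0.5 mm finger movement against a 5 mm wave shifts the echo's
# phase by ~1.26 rad (~72 degrees) -- far from sub-noise.
print(echo_phase_shift_rad(0.0005, 0.005))
```

This phase sensitivity is what makes "sub-millimeter motion at high speeds" plausible from a fingernail-sized chip, even though its raw range resolution is far coarser.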
The technology is thus superior to the 3D-sensing cameras at the front or back of some phones, which depend on line of sight and lighting conditions. Here's the initial brief on Google's Soli tech, which will debut on the Pixel 4 in a retail version for the first time.
Motion Sense brings a universal set of gestures
Hand gestures come more naturally than pushing against a piece of glass, yet so far the technology for recognizing them on a phone has been imperfect, as it relied on camera sensors. Google is aiming to revolutionize the way we interact with our mobile devices by employing the radar-based Motion Sense technology, built around a set of natural gestures that can be used in any orientation of the phone, day or night.
So, can the world be your interface?
In fact, the man behind Project Soli, Google's Ivan Poupyrev, recently gave a TED talk explaining how this radar-based gesture navigation can be deployed everywhere, in a "the world is your interface" kind of moment.
While we can't really comment on the practicality of Google and Levi's Project Jacquard idea, which put the motion-sensing gizmo in a jeans jacket, getting it into the Pixel 4 is a whole different ball game.
Google's Pixel 4 and Motion Sense
Just as we were preparing this primer on Project Soli, Google confirmed that this will indeed be the tech occupying the mysterious openings at the right of the thick top bezel on the Pixel 4. In its blog post, the company went through the same points and advantages we list above in more detail. Apparently, it all ties in with Google hardware chief Rick Osterloh's "ambient computing" strategy.
What's a bit worrying, however, is that Google lists the Motion Sense abilities on the Pixel 4 as "skip songs, snooze alarms, and silence phone calls, just by waving your hand." Not for nothing, but those are the things that LG does on the G8 ThinQ just with the front camera kit, no fancy miniaturized radars, no bezelicious sprawling at the top.
What about scrolling through long articles with an air flick of the finger, or going back in the interface with a simple thumb twitch, though? Google does wax poetic that this is just the start and that "Motion Sense will evolve," but we've heard many a marketing writeup for features that ultimately proved slow on the uptake.
The bit about Motion Sense being "available in select Pixel countries" is also raising a few eyebrows: why would a Pixel 4 model come with the radar-based gesture navigation in one place but not in others? Is it because different countries have different rules on commercial radars in the 57-64 GHz frequency band, and some are reluctant to accept the FCC's waiver?
We probed Google with the question and will get back to you here when we get a nod. What do you think, could Pixel 4's Motion Sense be the "end of the touchscreen" and the beginning of the "world is your interface" era indeed, and is it too early to tell, or too complex of an interaction paradigm to take off?