Can Pixel 4's radical Motion Sense navigation spell 'the end of the touchscreen'?

There is a reason why Google is going with a giant top bezel on the Pixel 4, at a time when the Note 10 ranks as the most compact large-screen phone in our database precisely because it sawed off that very strip. No, that reason is not the 3D face-scanning kit, although there is one, and Google has been collecting faces at $5 a pop to perfect it.

Google is using the space to revolutionize interface navigation on its Pixel phones by introducing a gesture recognition platform that has existed only in prototype form until now, under the official name Motion Sense. The chip that makes it possible is dubbed "Soli," and it employs "a new sensing technology that uses miniature radar to detect touchless gesture interactions."

Google's Project Soli sounds suspiciously like what LG did with Multi ID and the hand-tracking algorithms enabled by the 3D-sensing front camera kit on the G8 ThinQ, yet it relies on radar waves to detect the motion of the human hand. You know, like in the GIF below:

So, what can you do with Motion Sense on the Pixel 4?

You can skip songs in Spotify and YouTube, and the display will emit a subtle glow when a gesture is recognized so that you know your command has registered, like we showed you yesterday. You can also snooze alarms, dismiss timers, and silence incoming calls with a wave, and the features can be turned on and off from Settings > System > Motion Sense.

The fun part of Motion Sense will be gaming and the so-called "Come Alive" series of wallpapers that respond to your motions, like the Pikachu one Google showed during the presentation. You can pet the Pokemon, and it will apparently wake up when you reach for the phone. Specialized games are also coming, and Google promises many more Motion Sense features down the road.

How does Motion Sense work?

Google's successful miniaturization of the technology fits into a Soli chip as small as a pinky nail, yet it can detect the minutest of motions. It works on the same principle as the big flight radars that track airplane movements in the sky. Unfortunately, this was precisely why the FCC didn't let the Soli chip fly until December 31, 2018, when it granted Google a waiver from some of its requirements for radars in the commercial 57-64 GHz frequency band.

Google claims that the radar and the accompanying software can "track sub-millimeter motion at high speeds with great accuracy." The Soli chip does this by pushing out electromagnetic waves in a broad sweep, which get reflected back to the tiny antenna inside.

A combination of sensors and software algorithms then accounts for the energy these reflected beams carry, the time they took to come back, and how they changed along the way, determining "the object’s characteristics and dynamics, including size, shape, orientation, material, distance, and velocity." While the small chip can't match the spatial resolution of larger installations, Google has refined its motion-sensing and prediction algorithms so that slight variations of a gesture are mapped to one and the same interface action.
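To put the paragraph above in numbers, two textbook radar relations already tell you a lot: the bandwidth of the sweep sets how finely the chip can resolve distance, and the roughly 5 mm wavelength at 60 GHz means even a slow-moving finger produces a measurable Doppler shift in the echo. Below is a back-of-the-envelope sketch of those two standard formulas, not Google's actual Soli processing pipeline; the 7 GHz sweep (the full 57-64 GHz band) and the 1 m/s hand speed are assumed figures for illustration.

```python
C = 3.0e8  # speed of light in m/s (rounded)

def range_resolution(bandwidth_hz):
    """Smallest resolvable distance between two reflectors: c / (2 * B)."""
    return C / (2 * bandwidth_hz)

def doppler_shift(velocity_ms, carrier_hz):
    """Frequency shift of an echo from a target closing at velocity_ms: 2 * v / wavelength."""
    wavelength = C / carrier_hz
    return 2 * velocity_ms / wavelength

# Assumed: sweeping the full 57-64 GHz band gives 7 GHz of bandwidth
print(range_resolution(7e9))     # ~0.021 m, i.e. about 2 cm of range resolution
# Assumed: a hand gesture moving at ~1 m/s, measured against a 60 GHz carrier
print(doppler_shift(1.0, 60e9))  # 400 Hz Doppler shift
```

The takeaway matches the article: raw range resolution alone (centimeters) is far too coarse for "sub-millimeter" gestures, which is why the software side, tracking Doppler and fine phase changes over time, is doing the heavy lifting.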

The technology is thus superior to the 3D-sensing cameras on the front or back of some phones, which depend on line of sight and lighting conditions. Here's the initial brief on Google's Soli tech, which debuts in a retail version for the first time on the Pixel 4.

Motion Sense brings a universal set of gestures

Hand gestures come more naturally than pushing against a piece of glass, yet so far the technology for recognizing them on a phone has been imperfect, as it relied on camera sensors. Google is aiming to revolutionize how we interact with our mobile devices by employing the radar-based Motion Sense technology, which includes, but is not limited to, natural gestures that can be performed in any orientation of the phone, day or night.

So, can the world be your interface?

Google is known for its "moonshots," or crazy-sounding projects that it thinks will ultimately prove their mettle as time goes by. So far, the Google X moonshot lab has bagged incredible technology leaps like the self-driving car efforts, smart contact lenses, and other eye-poppers. The Motion Sense gear on the Pixel 4 could be one small step for Google's hardware department, and one giant leap for the future of the smartphone interaction. 

In fact, the man behind Project Soli, Google's Ivan Poupyrev, whom you saw in the video above, recently gave a TED talk explaining how this radar-based gesture navigation could be deployed everywhere in a "the world is your interface" kind of moment.

While we can't really comment on the practicality of Google and Levi's Project Jacquard idea, which puts the motion-sensing gizmo in a jeans jacket, getting it into the Pixel 4 is a whole different ball game.

Google's Pixel 4 and Motion Sense

Just as we were preparing this primer on Project Soli, Google came out and confirmed that this is indeed the tech occupying the mysterious openings to the right of the thick top bezel on the Pixel 4. In its blog post, the company went through the same points and advantages we list above in more detail. Apparently, it all ties in with Google hardware chief Rick Osterloh's "ambient computing" strategy.

What's a bit worrying, however, is that Google lists the Motion Sense abilities on the Pixel 4 as "skip songs, snooze alarms, and silence phone calls, just by waving your hand." Not for nothing, but those are things the LG G8 ThinQ does with just its front camera kit, no fancy miniaturized radar, no bezelicious sprawl at the top.

What about scrolling through long articles with an air flick of the finger, or going back in the interface with a simple thumb twitch, though? Google does wax poetic that this is just the start and that "Motion Sense will evolve," but we've heard many a marketing writeup for options and features that ultimately proved slow on the uptake.

The bit about Motion Sense being "available in select Pixel countries" is also raising a few eyebrows: why would a Pixel 4 model in one place come with radar-based gesture navigation while in others it won't? Is it because different countries have different rules on commercial radars in the 57-64 GHz frequency band, and some are reluctant to accept the FCC's waiver? Yep, Google now lists it as "functional in the US, Canada, Singapore, Australia, Taiwan, and most European countries," with availability "coming soon in Japan."

What do you think: could Pixel 4's Motion Sense indeed be the "end of the touchscreen" and the beginning of the "world is your interface" era? Is it too early to tell, or is it too complex an interaction paradigm to take off?

Related phones

Pixel 4
  • Display 5.7" 1080 x 2280 pixels
  • Camera 12.2 MP / 8 MP front
  • Processor Qualcomm Snapdragon 855, Octa-core, 2840 MHz
  • Storage 64 GB
  • Battery 2800 mAh



1. ssallen

Posts: 202; Member since: Oct 06, 2017

I am a bit bummed that they are adopting face unlock but the Soli stuff looks cool. I guess with the ambient computing initiative face unlock is within the metaphor, but yuck.

15. sgodsell

Posts: 7451; Member since: Mar 16, 2013

I was hoping that Google would use the ultrasonic fingerprint scanner under the screen as well. If they are using face only, and dropping the fingerprint scanner altogether, then Pixel 4 won't be for me.

2. Cat97

Posts: 1933; Member since: Mar 02, 2017

Google is also known for its many closed projects and apps, too many to count.

5. TBomb

Posts: 1574; Member since: Dec 28, 2012

I personally think we need to cut Google some slack in that department. It also has some very long-running projects and initiatives. There was a period where they were trying anything and everything - those projects came to a close because they were part of the "try it" phase. It's true, a lot of non-moonshot projects ended as well. But you have to look at what Google's intentions were for each project: what was their goal, and is there a better way to reach that goal?

30. vincelongman

Posts: 5724; Member since: Feb 10, 2013

Same with Apple, Samsung, Microsoft and every other major tech company lol

3. Plutonium239

Posts: 1232; Member since: Mar 17, 2015

In order for this sort of thing to revolutionize how we all interact with our devices, it would need widespread adoption by many manufacturers, not just a presence on the upcoming Pixel. The technical achievement of shrinking radar down so far is nonetheless impressive.

6. slim3bdo

Posts: 186; Member since: Jun 05, 2017

And how would you revolutionize the industry without implementation? The 1st iPhone was garbage as a phone compared to other phones, but it brought the multi-touch interface and everybody followed. So if Google's implementation is good enough, the whole industry will follow. If it works as described, it's a breakthrough in a stagnant industry and it WILL revolutionize every gadget we have today.

4. surethom

Posts: 1720; Member since: Mar 04, 2009

It looks like cool & impressive technology, but for most it will just be a gimmick: most will try it out at first, then never use it again and just use touch.

7. adecvat

Posts: 647; Member since: Nov 15, 2013

Another gimmick

45. mackan84

Posts: 557; Member since: Feb 13, 2014

Yep, MKBHD got his swipes to work on 10% of his tries.

8. cmdacos

Posts: 4266; Member since: Nov 01, 2016

Amazing technology, but are motion gestures easier to use than my finger on a handheld device? Seems awkward to use while you are holding the device.

9. Poptart2828

Posts: 428; Member since: Jan 23, 2018

Everything is a gimmick around here in PA.

10. dumpster666

Posts: 92; Member since: Mar 07, 2019

it's not meant to replace touch... it's there to complement it.

42. mackan84

Posts: 557; Member since: Feb 13, 2014

So was 3D touch

11. kevv2288

Posts: 300; Member since: Jul 30, 2015

You mean you have to use your hands? That's like a baby's toy.

14. nuumuun

Posts: 4; Member since: Mar 12, 2015

I see a back to the future reference ... I like :D

12. mackan84

Posts: 557; Member since: Feb 13, 2014

Wasn’t it a display a couple of years ago that registered touch when hovering your finger precisely above it?

13. torr310

Posts: 1677; Member since: Oct 27, 2011

Just by reading the title... nope! We still can't get rid of the touchscreen for typing. LOL

17. japkoslav

Posts: 1518; Member since: Feb 19, 2017

BS, thank you for your contribution PA.

18. Takeharu

Posts: 286; Member since: Oct 28, 2013

It's just like motion controls for gaming. For years Sony, Nintendo and Microsoft tried to make gaming more interactive and immersive with motion controls but in the end nothing is more intuitive than just moving my thumb a few centimeters; this goes for both a touch screen and a controller

19. lyndon420

Posts: 6834; Member since: Jul 11, 2012

I appreciate the tech involved, but saying 'the end of the touchscreen' is a bit misleading. We still need to touch the screen to type so...

20. TBomb

Posts: 1574; Member since: Dec 28, 2012

I see some games and apps even making use of the tech: learn sign language by having it track your hand and correct you when you do it wrong; Harry Potter spell-casting games or other gesture-based games; Snapchat and Instagram filters with better tracking.

23. Demo-jay

Posts: 78; Member since: Feb 13, 2018

This is probably gonna fail... there is definitely gonna be a problem with this if Google doesn't really polish their software... customers are beta testers

26. berns

Posts: 36; Member since: Jun 07, 2014

It will be a very cool addition to the phone's sensors, but ending the touchscreen is far off from an ease-of-use perspective: how would you hold the phone if you need to actuate more than one control at the same time, as in games? Nevertheless, it is a welcome add-on, and one very good use I see for it is scanning objects to create 3D models.

29. mootu

Posts: 1530; Member since: Mar 16, 2017

I wonder if anyone has thought about the effects on human health? It is still unknown whether low doses of radar RF waves cause ill effects in humans; the tech has been around for a very long time, but they have never been able to declare it safe.

32. Mike88

Posts: 438; Member since: Mar 05, 2019

They actually want to make people sick

33. justchilliando

Posts: 13; Member since: Oct 19, 2017

This will change the world, not so much on Pixel phones but on everything else: clothes, TVs, smart speakers, and over time you will be able to do more

34. shonasof

Posts: 34; Member since: Mar 18, 2019

It's certainly interesting technology, and I can't wait to see how it develops! But it's not going to be fine enough for precise casual use. This technology (if used as the primary input) also would cause problems for people with physical impairments.

35. kennybenny

Posts: 218; Member since: Apr 10, 2017

Didn't I have this feature on my 2012 Samsung Galaxy S3? (my first cell phone) I seem to remember doing Air gestures so this feature is not 100% new.....

36. midan

Posts: 3015; Member since: Oct 09, 2017

Did anyone say it's new? Even the article mentions that it's not. The tech behind it is new for a phone and should make it a lot better and more precise
