"Our Live Photos are better!" Google talks behind the scenes on Motion Photos — stabilization secrets revealed

"Our Live Photos are better!" Google talks behind the scenes on Motion Photos — stabilization secrets revealed

Ever since its Pixel phones hit the market, it has been clear that Google is hard at work on its cameras. With the impressive HDR+ feature, which works unlike anything the competition offers, the Pixel phones can take sharp, detailed, vivid, well-balanced, and stable photos with little effort on the user's part.

With the Pixel 2, Google upped the ante and introduced Motion Photo. Yeah, it's like Apple's Live Photo — when you take a picture, the camera will also save a 3-second video recording, showing what happened before and immediately after you pressed the shutter. Then, Motion Photos are viewable and easily shareable via Google Photos.

But do you know what makes them different from Apple's version? They are extremely, extremely stable, making them look that much more like “moving photos” and less like “I accidentally pressed record at the wrong time”. How does Google do it? In a recent blog post, the company revealed some of the magic that goes on behind the scenes.

"Our Live Photos are better!" Google talks behind the scenes on Motion Photos — stabilization secrets revealed


So, since the HDR+ technology is already capturing frames constantly while your viewfinder is open, Google put them to use in the Motion Photos feature: whenever you press the shutter button, you also get a 3-second video clip of what happened before and after that moment. Alongside this footage, the phone also stores readings from its gyroscope and the optical image stabilization (OIS) sensor. Then, the software begins working its magic.
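
To picture what gets saved, here is a minimal sketch of such a capture bundle: the clip's frames plus time-stamped gyroscope and OIS readings. The field names and layout are illustrative assumptions, not Google's actual format.

```python
# Hypothetical container for a captured Motion Photo clip; field names are
# assumptions for illustration, not Google's real data format.
from dataclasses import dataclass, field
from typing import List, Tuple
import numpy as np

@dataclass
class MotionPhotoClip:
    frames: List[np.ndarray]       # ~3 seconds of frames around the shutter press
    frame_times_us: List[int]      # capture timestamp of each frame (microseconds)
    # (timestamp_us, angular velocity around x, y, z) from the gyroscope
    gyro_samples: List[Tuple[int, float, float, float]] = field(default_factory=list)
    # (timestamp_us, lens shift x, lens shift y) reported by the OIS module
    ois_samples: List[Tuple[int, float, float]] = field(default_factory=list)
```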

First, it scans the clip to find the object in focus and the background. By tracking various motion vectors on a frame-by-frame basis, it can easily separate background from foreground. However, when the background is multi-layered or color-rich, additional work is required.
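
As a rough idea of how motion vectors can separate background from foreground, here is a minimal sketch using OpenCV: sparse points are tracked between consecutive frames, a dominant (background) motion model is fitted with RANSAC, and points that do not follow it are treated as foreground candidates. This is a generic technique under stated assumptions, not Google's actual pipeline.

```python
# Minimal per-frame background/foreground split from motion vectors,
# assuming OpenCV and grayscale input frames.
import cv2
import numpy as np

def split_motion(prev_gray, curr_gray):
    # Track sparse feature points between two consecutive frames.
    pts_prev = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400,
                                       qualityLevel=0.01, minDistance=8)
    pts_curr, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray,
                                                   pts_prev, None)
    ok = status.ravel() == 1
    pts_prev, pts_curr = pts_prev[ok], pts_curr[ok]

    # Fit the dominant (background) motion with RANSAC; inliers follow the
    # global camera motion, outliers are candidate foreground "action" points.
    model, inliers = cv2.estimateAffinePartial2D(pts_prev, pts_curr,
                                                 method=cv2.RANSAC,
                                                 ransacReprojThreshold=3.0)
    inliers = inliers.ravel().astype(bool)
    background_pts = pts_curr[inliers]
    foreground_pts = pts_curr[~inliers]
    return model, background_pts, foreground_pts
```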

This is where the gyroscope and OIS sensor metadata comes into play. The Motion Photos algorithm analyzes the phone's orientation and movement speed, then compares the already-computed motion vectors against this data. With this type of “parallax mapping”, the software is better able to tell foreground “action” objects apart from the background.
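
A hedged sketch of that idea: if the camera intrinsics K and the gyroscope-derived rotation R over a frame interval are known, a pure camera rotation predicts where background points should land, and tracked points that disagree strongly with the prediction are more likely to be independently moving foreground. K, R, and the point format here are assumptions for illustration, not values from Google's post.

```python
import numpy as np

def gyro_residuals(pts_prev, pts_curr, R, K):
    """Pixel-level disagreement between gyro-predicted and tracked motion.

    pts_prev, pts_curr: (N, 2) arrays of matched pixel coordinates.
    R: 3x3 camera rotation between the two frames (integrated from gyro data).
    K: 3x3 camera intrinsics matrix.
    """
    # A pure camera rotation moves image points by the homography K @ R @ K^-1.
    H = K @ R @ np.linalg.inv(K)

    # Predict where each point should land if only the camera rotated.
    ones = np.ones((len(pts_prev), 1))
    projected = np.hstack([pts_prev, ones]) @ H.T
    predicted = projected[:, :2] / projected[:, 2:3]

    # Points whose tracked position disagrees strongly with the prediction are
    # likely part of an independently moving foreground object.
    return np.linalg.norm(predicted - pts_curr, axis=1)
```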

"Our Live Photos are better!" Google talks behind the scenes on Motion Photos — stabilization secrets revealed


Once all that is done, Motion Photos chooses where the short video should be centered, finds a movement path if there is one in the clip, and rotates, skews, and stitches together every frame so that it looks like the phone was held steady the whole time. Additionally, if the clip ends with the user putting the phone away and into their pocket, that part is automatically trimmed out, which is why you will very, very rarely end up with a Motion Photo that has an awkward beginning or end. It's almost always centered around your photo subject and the moment you captured.
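
For a sense of how such stabilization and trimming can work in general, here is a minimal sketch: per-frame transforms are accumulated into a camera path, the path is smoothed, each frame is warped toward the smoothed path, and the clip is cut short once the per-frame motion spikes (as it does when the phone gets lowered). The smoothing window and trim threshold are illustrative assumptions, and this is a textbook approach rather than Google's exact method.

```python
# Minimal stabilize-and-trim sketch, assuming per-frame 2x3 affine transforms
# between consecutive frames (e.g. from split_motion above) and OpenCV warping.
import cv2
import numpy as np

def stabilize(frames, transforms, smooth_window=15, trim_thresh=40.0):
    # Per-frame motion deltas: dx, dy, rotation angle.
    deltas = np.array([[t[0, 2], t[1, 2], np.arctan2(t[1, 0], t[0, 0])]
                       for t in transforms])
    # Accumulate into a camera trajectory, then smooth it with a moving average.
    traj = np.cumsum(deltas, axis=0)
    kernel = np.ones(smooth_window) / smooth_window
    smoothed = np.column_stack([np.convolve(traj[:, i], kernel, mode='same')
                                for i in range(3)])
    adjusted = deltas + (smoothed - traj)

    out = []
    for frame, delta, adj in zip(frames, deltas, adjusted):
        # Drop the tail once the per-frame translation spikes (phone lowered).
        if np.hypot(delta[0], delta[1]) > trim_thresh:
            break
        dx, dy, da = adj
        warp = np.array([[np.cos(da), -np.sin(da), dx],
                         [np.sin(da),  np.cos(da), dy]])
        h, w = frame.shape[:2]
        out.append(cv2.warpAffine(frame, warp, (w, h)))
    return out
```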

"Our Live Photos are better!" Google talks behind the scenes on Motion Photos — stabilization secrets revealed

source: Google


8 Comments

1. bucknassty

Posts: 1263; Member since: Mar 24, 2017

man... the only phone i will suggest an iOS user is the pixel... google is killing it

2. Sammy_DEVIL737

Posts: 1497; Member since: Nov 28, 2016

I have this thing on my Nokia Lumia 720 called Nokia Cinemagraph, but it isn't as stable as this and the quality gets reduced too. Aah, Nokia was the king of all these camera features back then.

3. KickRocks

Posts: 260; Member since: Mar 22, 2011

Enjoying it on my Pixel 2 XL

4. rsiders

Posts: 1813; Member since: Nov 17, 2011

Google really came in and started killing the game.

5. rouyal

Posts: 1550; Member since: Jan 05, 2018

Just came to say, the girl with skateboard, smokin’

6. Fiat_Punto

Posts: 4; Member since: Sep 18, 2014

These tech giants will make us tech slaves.

7. Marcwand3l

Posts: 381; Member since: May 08, 2017

Google's camera software is unmatched at the moment. I installed Gcam on a Mi A1, which performed quite s**tty with its stock camera app. The Gcam app improved the quality so much that at times it feels like a different phone. The selfie cam went from a 4/10 to an 8/10 easily. That is like multiple generational improvements. I don't know what algorithms Google's camera app uses, but the pictures with the front-facing camera are way, way better. And the Portrait mode with the back and front cameras is simply brilliant. The back camera performs incredibly in low light compared to the stock app (like a 40% improvement at least). I can honestly say that the Mi A1 with Google's Pixel Camera app performs close to flagship levels in photos. I've seen people who own other phones install Gcam and get similar results, much better photos than with the stock camera. I know a guy with a V20 who said he never knew his phone could take such great photos with both the front and back cameras.

8. Seenvim

Posts: 1; Member since: Mar 17, 2018

This picture with buildings on both sides is very far from perfect. You can see that on the right side the first building gets smaller as the recorder moves away, while the second building stays the same size. And on the left side, the building, especially the windows, looks squeezed together, creating the illusion that the camera is moving away, but it looks very crippled.
