"Deep Fusion" explained: First look at Apple's most innovative camera feature

The new iPhone 11 family comes with new and better cameras, but one ground-breaking camera feature will not be available on these iPhones from the very start; it will instead arrive later as a software update.

Apple calls this special feature "Deep Fusion," and it is a brand-new way of taking pictures in which the Neural Engine inside the Apple A13 chip uses machine learning to create the output image.


The result is a photo with a stunning amount of detail, great dynamic range, and very low noise. The feature works best in low to medium light.

Phil Schiller, Apple's chief camera enthusiast and also head of marketing, demonstrated the feature with a single teaser picture and explained how it works.

How "Deep Fusion" works:

  • it shoots a total of 9 images
  • even before you press the shutter button, it has already captured 4 short images and 4 secondary images
  • when you press the shutter button, it takes 1 long exposure photo
  • then, in just 1 second, the Neural Engine analyzes the fused combination of long and short images, picks the best among them, and goes through the image pixel by pixel, all 24 million pixels, to optimize for detail and low noise
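Apple has not published how Deep Fusion actually works internally, so the steps above can only be illustrated with a toy sketch. The NumPy pipeline below mimics the described flow, with small arrays standing in for camera frames; the variance-based sharpness metric and the fixed blend weight are assumptions for illustration, not Apple's method (which runs a neural network).

```python
# Toy sketch of the Deep Fusion capture flow described above.
# NOT Apple's algorithm: the sharpness metric and blend weight are
# illustrative assumptions; a real pipeline uses a neural network.
import numpy as np

def sharpness(frame: np.ndarray) -> float:
    """Crude detail metric: variance of local pixel differences (assumption)."""
    return float(np.var(np.diff(frame, axis=0)) + np.var(np.diff(frame, axis=1)))

def deep_fusion_sketch(short_frames, secondary_frames, long_frame):
    """Pick the sharpest buffered frame, then blend it with the long exposure.

    Mirrors the steps above: 4 short + 4 secondary frames are already
    buffered before the shutter press; the long exposure arrives last."""
    candidates = short_frames + secondary_frames
    best = max(candidates, key=sharpness)  # "picking the best among them"
    # Pixel-by-pixel blend: the short frame contributes detail, the long
    # exposure contributes low noise (fixed 50/50 weight is a toy choice).
    w = 0.5
    return w * best + (1 - w) * long_frame

rng = np.random.default_rng(0)
shorts = [rng.random((4, 4)) for _ in range(4)]       # 4 short images
secondaries = [rng.random((4, 4)) for _ in range(4)]  # 4 secondary images
long_exp = rng.random((4, 4))                         # 1 long exposure
fused = deep_fusion_sketch(shorts, secondaries, long_exp)
print(fused.shape)  # → (4, 4)
```

On a real iPhone the inputs would be full 12MP sensor frames and the per-pixel weighting would be learned, but the overall shape of the pipeline, buffer, capture, select, merge, is what the keynote described.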


It is truly the arrival of computational photography, and Apple claims this is the "first time a neural engine is responsible for generating the output image". In typical Apple fashion, it also jokingly calls this "computational photography mad science". Whichever definition you pick, we can't wait to see this new era of photography on the latest iPhones this fall (if you are wondering exactly when, there are no specifics, but practice shows it will likely be the end of October).


19 Comments

1. User123456789

Posts: 918; Member since: Feb 22, 2019

Subject 100% still ... Otherwise, blur

10. Phullofphil

Posts: 1789; Member since: Feb 10, 2009

Wow you got the phone and the update already. I would imagine that this phone takes great pictures. But I’ll make sure to ask you which photo should go because you must be the master

11. User123456789

Posts: 918; Member since: Feb 22, 2019

Why do you think there is a minimum shutter speed to freeze motion? Why do you think Google's HDR+ photos introduce ghosting for moving subjects? Why do you think night modes have lots of blur if people in the scene are walking?

14. Nexus4lifes

Posts: 294; Member since: Feb 13, 2014

I guess you have used an iphone and say bababaababbaaaaa...

2. bucknassty

Posts: 1337; Member since: Mar 24, 2017

lol... trying to catch google i see

7. Phullofphil

Posts: 1789; Member since: Feb 10, 2009

f**k damn if they do damn if they don’t. But no they are just trying to improve. This s**t has been in development long before they could copy. Pretty much how it goes. We see such a small window of what goes on behind the scenes.

3. darkkjedii

Posts: 31045; Member since: Feb 05, 2011

Mehhh. My next iPhone will be the 2020 5G model.

4. peschiera

Posts: 21; Member since: Sep 17, 2017

All relevant smartphone makers (Apple included) have been offering advanced algorithms in their camera systems. Apple gives a nice name to one single feature, focuses its entire marketing on it, and everybody believes in a revolution in smartphone photography. I must admit, marketing at its best, but it works only if you have an army of worshipers (consumers and wannabe journalists). Edit: the early announcement indicates that Apple is aware of a competitor offering a similar mode which would probably be unveiled earlier than Apple's solution, and Apple would cry again: copycat.

8. Phullofphil

Posts: 1789; Member since: Feb 10, 2009

Marketing 101 dips**t

5. pogba

Posts: 110; Member since: Jun 13, 2018

"It is truly the arrival of computational photography"....... No, Victor, computational photography truly arrived with the Pixels back in 2016.

6. peschiera

Posts: 21; Member since: Sep 17, 2017

Viktor is a typical worshiper.

9. Phullofphil

Posts: 1789; Member since: Feb 10, 2009

No, you're a troll. 99 percent of people that have an iPhone are not. But what you say makes me believe you are a typical worshiper of Android.

12. peschiera

Posts: 21; Member since: Sep 17, 2017

Oh sweetie ... did I hurt your feelings? No, I did not, 'cause trolls like you have none. Maybe you try it on the subject: did "computational photography" arrive today? Do you believe that BS?

13. Xavier1415

Posts: 190; Member since: Feb 26, 2012

Doesn't the Pixel have this?

20. QuantumRazer

Posts: 131; Member since: Apr 27, 2019

No, the Pixel's Night Sight takes 15 underexposed images and stacks them together in pixel-shift mode to produce more detailed 12MP images (which, looking at all the samples, seem to have detail roughly equivalent to a 16MP image). Deep Fusion, on the other hand, constantly buffers 4 underexposed images and 4 regular images and instantly merges them with the long exposure frame (captured when the shutter is pressed) in pixel-shift mode to produce a single 24MP image.

15. gadgetpower

Posts: 166; Member since: Aug 23, 2019

Wow, this is truly amazing!

16. Back_from_beyond

Posts: 1420; Member since: Sep 04, 2015

Yeah, they've taken something that already existed and rebranded it. Innovation at its best...

18. Whitedot

Posts: 811; Member since: Sep 26, 2017

I bet $5, Victor, that the iPhone's review score will be the highest smartphone score again this year.

19. talon95

Posts: 998; Member since: Jul 31, 2012

They can pretend everything they do is new. Apple users won't know the difference.
