A feature coming with iOS 13.2 will take photography on the 2019 Apple iPhones to a new level

If you bought one of the new 2019 iPhone models and have been impressed with the photos you've been taking with the device, you have not seen what the handset is fully capable of. That's because Apple has yet to push out the promised software update that adds the AI-based Deep Fusion technology to the phones. Apple introduced this back on September 10th at the new products event that outed the iPhone 11, iPhone 11 Pro and iPhone 11 Pro Max.

With Deep Fusion, four frames are captured using a fast shutter speed and four at a standard speed, all before you press the shutter button. When the shutter button is pressed, a long-exposure image is taken to capture detail. Three of the regular shots and the long-exposure shot are merged into one picture, which is then merged with the short-exposure frame that has the best detail. This is all done using AI: in a split second, the image is optimized pixel by pixel to produce a final picture with great detail and low noise.
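At its core, the pipeline described above is a form of exposure stacking. Here is a minimal, illustrative sketch in Python; the variance-based "sharpness" proxy, the frame weights, and the function names are all assumptions for illustration, since Apple's actual per-pixel, machine-learning-driven merge is not public:

```python
import numpy as np

def fuse_frames(short_frames, standard_frames, long_frame):
    """Toy multi-frame fusion, loosely following the Deep Fusion
    description: pick the most detailed short exposure, build a
    low-noise "synthetic long" frame from the standard shots and
    the long exposure, then blend the two."""
    # Crude "detail" proxy: the frame with the most variance.
    # (Real pipelines score sharpness far more carefully.)
    best_short = max(short_frames, key=lambda f: f.var())
    # Merge the standard shots with the long exposure into one
    # low-noise frame (a simple mean here).
    synthetic_long = np.mean(list(standard_frames) + [long_frame], axis=0)
    # Final image: equal-weight blend of detail and low noise.
    # Apple's merge is per-pixel and adaptive; 50/50 is a toy choice.
    return 0.5 * best_short + 0.5 * synthetic_long
```

This is only a shape-of-the-idea sketch; the real system decides the blend weight per pixel, which is what "optimized pixel by pixel" refers to.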

Deep Fusion has three different modes based on the ambient lighting and the lens being employed. The primary Wide-angle camera will use Deep Fusion for photos in medium to low light, with Night Mode automatically turning on for anything darker. The telephoto camera works at the extremes; in very bright light Deep Fusion will get the nod, while Night Mode takes over in very dark conditions. And with the Ultra-wide camera, neither Deep Fusion nor Night Mode is supported.
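The per-lens behavior above boils down to a simple selection table. A sketch of that logic follows; the light-level thresholds, mode names, and the fallback "smart_hdr" mode are illustrative assumptions, since Apple does not expose the actual cutoffs:

```python
def capture_mode(lens: str, light: float) -> str:
    """Pick a processing mode per the behavior described above.
    `light` is an assumed 0.0 (dark) to 1.0 (bright) level; the
    real cutoffs are Apple-internal and not public."""
    if lens == "ultra_wide":
        return "standard"           # no Deep Fusion or Night Mode
    if lens == "wide":
        if light < 0.2:
            return "night_mode"     # anything darker than low light
        if light < 0.7:
            return "deep_fusion"    # medium to low light
        return "smart_hdr"          # bright light (assumed fallback)
    if lens == "telephoto":
        if light < 0.2:
            return "night_mode"     # very dark conditions
        return "deep_fusion"        # everything else, incl. bright
    raise ValueError(f"unknown lens: {lens}")
```

Notably, the user never picks a mode; the phone applies this kind of decision automatically per shot.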

Deep Fusion is expected to come with iOS 13.2 and will make photos more detailed with less noise

The good news for those with a 2019 iPhone is that Deep Fusion will be available soon, once iOS 13.2 is released to the public. The developer beta version of the build was expected to have been released today, but the status now is "coming," according to a tweet from TechCrunch's Matthew Panzarino (@panzer). If you are a member of the Apple Beta Software Program, you should have the opportunity to try out Deep Fusion pretty soon. But remember, since it won't be a final build, you will have to weigh the risk of running unstable software against waiting for the final release. If you've never loaded an iOS beta on your iPhone before, you are probably best served by waiting for the stable build.

The camera wars are heating up after the release of the Apple iPhone 11 family. Two weeks from today, on October 15th, Google will introduce the new Pixel 4 series. While Apple embraced computational photography this year, it is something the Pixels have long been known for. The Pixel line is considered among the best smartphones for photography (if not the best); Google has flexed its processing muscles over the years while equipping the Pixels with a single 12.2MP camera on the back. Times have really changed when it seems odd for a phone to have only a single camera adorning the rear panel.

Meanwhile, the new Pixels will once again feature a 12.2MP primary camera, although this time the aperture is a little wider at f/1.6, allowing more light to be captured when a photo is taken. For the first time, there will be a second sensor on the back: a 16MP telephoto camera with a 5x optical zoom. And while Apple has been praised for its Night Mode, which lets users snap viewable photos in low-light and dark conditions, Google is expected to respond with a second-generation version of Night Sight. The latter could add longer exposure times, allowing Pixel 4 users to engage in astrophotography; the phones could be used to shoot stunning (we assume) photos of the night sky, stars included.



1. DBozz

Posts: 71; Member since: Sep 19, 2019

Deep Fusion is Apple's gimmicky name for Google's existing HDR+.

8. gadgetpower

Posts: 283; Member since: Aug 23, 2019

It's a revolutionary feature that is very useful in photography. Apple is beating every Android camera feature; its natural look is the best.

12. DBozz

Posts: 71; Member since: Sep 19, 2019

Ok... now get the hell outta here!

14. Tizo101

Posts: 572; Member since: Jun 05, 2015


15. darkkjedii

Posts: 31337; Member since: Feb 05, 2011

Tell em peaceboy

16. higeyuki

Posts: 24; Member since: Aug 06, 2012

Deep Fusion has little similarity with HDR+; it is better compared to pixel shift using IBIS, combined with exposure stacking.

3. dnomadic

Posts: 426; Member since: Feb 20, 2015

This will be good for subjects that are not moving. It will be interesting to see how this works with subjects in motion. Seems like the Pixel gets it right regardless, though it often has a lower exposure value.

4. vincelongman

Posts: 5730; Member since: Feb 10, 2013

The way Pixels handle motion is so underrated. Even Night Sight handles motion surprisingly well.

13. DBozz

Posts: 71; Member since: Sep 19, 2019

Pixel uses accelerometer sensors to track hand movements and adjust the frames, which is exactly what it uses for video stabilization... so it doesn't matter whether it's day or night; we get sharp images in any situation.

23. TerryD

Posts: 555; Member since: May 09, 2017

'My Pixel takes rubbish photos'... said no one ever. 'I wish the camera on my Pixel was better'... said no one ever. I just get the impression that even though they only have one lens and all the other manufacturers have 2, 3 or even 4 lenses, they're still the one to beat.

7. midan

Posts: 3019; Member since: Oct 09, 2017

Already very impressed by what the new iPhone cameras can do, and loving the new crop-out-from-frame feature and the new super smooth zoom "wheel". Can't wait to test this.

9. gadgetpower

Posts: 283; Member since: Aug 23, 2019

This is a really cool feature.

27. Watchkc

Posts: 2; Member since: 3 days ago

Hmmm... how sure are you?

10. Feanor

Posts: 1390; Member since: Jun 20, 2012

Computational photography may have made huge strides, but sometimes it can also get a bit silly. My friend and I recently took a picture together using her iPhone XS Plus (main camera, no selfie), and the iPhone blew out the colour of our faces into a silly orange while leaving my neck and her hands as white as they are in reality. The HDR was also a bit overbearing, with the scenery being merged with the sky in a very flat, un-contrasty way. I have an Xperia XZ3 and, despite its camera having been heavily criticized, it still manages some great shots without these unexpected results. I find that I may prefer a slightly less heavy-handed HDR approach, where some sky areas remain slightly burned or some shadows are lost in darkness but the image still has some contrast, rather than this overly flat look. And as for the forced face colour boost, well....

11. midan

Posts: 3019; Member since: Oct 09, 2017

According to reviews, it's a lot more reliable with the new iPhones. So far I've only gotten very good results from the camera.

26. Watchkc

Posts: 2; Member since: 3 days ago

Apple to the next level? Lol... that's definitely a lie.

