The magic behind the iPhone XS camera – best iPhone camera yet?

Looking back at Apple events from the past couple of years, it is evident how much the company's focus on the smartphone camera has grown. Apple yesterday unveiled three new iPhone models – the premium iPhone XS and XS Max and the more affordable iPhone XR. But since the XR has a single camera on its back, and is not meant to be on the bleeding edge of technology in this regard, we are going to focus on the iPhone XS and XS Max instead. Just like Apple did in the keynote.

On paper, the cameras of the new iPhone XS models are not too different from last year's iPhone X. In fact, judging by the camera specs alone, one may wonder what the big deal is. A wide-angle lens with a fixed f/1.8 aperture and an f/2.4 telephoto snapper, when the Galaxy S9+ and the Galaxy Note 9 have variable apertures that open up to f/1.5? That may be a valid concern for some people out there, but Apple still showed off some very interesting, very impressive camera tech during the announcement. So, what's happening behind the curtains in the new iPhone XS camera? Here's an in-depth look at everything we know so far.

Silicon, glass, and AI



Though the cameras of the iPhone XS and XS Max are not a big step forward in terms of optics, the photos they produce are the result of clever new hardware and software tricks. Apple made a good point about the A12 Bionic chip and its new Neural Engine, which has been vastly improved over the previous generation. The A12 Neural Engine has an 8-core design and can handle up to 5 trillion operations per second. For reference, the A11 Neural Engine had two cores and could do about 600 billion operations per second. This will not only speed up and improve AR experiences and take some load off the CPU, but will also play a big role in how the iPhone XS takes photos.
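To put those numbers into perspective, here's some quick back-of-the-envelope math based on Apple's quoted figures:

```python
# Rough throughput comparison based on Apple's quoted figures.
a11_ops_per_sec = 600e9   # A11 Neural Engine: ~600 billion ops/s, 2 cores
a12_ops_per_sec = 5e12    # A12 Neural Engine: up to 5 trillion ops/s, 8 cores

speedup = a12_ops_per_sec / a11_ops_per_sec
print(f"A12 vs A11 Neural Engine: ~{speedup:.1f}x more operations per second")
# -> A12 vs A11 Neural Engine: ~8.3x more operations per second
```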


Smart HDR


Until a couple of years ago, HDR photography on smartphones was a matter of flipping a switch (or leaving your phone to decide) – it was either ON or OFF. But with the introduction of the first generation of Pixel phones, things changed. Google's HDR+ mode offered a middle ground. It was smarter, subtler, and produced more natural-looking images with greatly improved dynamic range. The formula was refined for the Pixel 2 and the Pixel 2 XL, which were equipped with dedicated Visual Core chips tasked with image processing. Other smartphone makers took note, and Apple was among them. The iPhone X had a very capable HDR mode in its own right, and it has been improved further in the iPhone XS and XS Max.

Apple's new Smart HDR leverages the power of multiple technologies — including the upgraded Image Signal Processor (ISP), the improved CPU, and advanced algorithms — to vastly enhance dynamic range in photos, without making them look artificial. But that's just the beginning.
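Apple hasn't published how Smart HDR works under the hood, but the general idea behind this kind of multi-frame processing is well known: exposure fusion, where each frame contributes most in the areas where it is best exposed. Here's a deliberately simplified Python sketch of that concept (real implementations also weigh contrast and saturation and blend across image pyramids):

```python
import numpy as np

def exposure_fusion(frames, sigma=0.2):
    # Weight every pixel by how close it sits to mid-gray (0.5), so each
    # frame contributes most where it is best exposed, then blend.
    stack = np.stack([f.astype(np.float64) for f in frames])  # (N, H, W)
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * sigma ** 2))
    weights /= weights.sum(axis=0, keepdims=True)             # normalize per pixel
    return (weights * stack).sum(axis=0)

# Toy usage with an under-, normally, and overexposed version of one scene:
scene = np.random.rand(4, 4)
frames = [np.clip(scene * s, 0.0, 1.0) for s in (0.4, 1.0, 1.6)]
fused = exposure_fusion(frames)
```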


Smart HDR works well even when shooting high-speed scenes

Another point Apple made during the announcement was about shooting HDR photos of moving subjects. This is doable now, but it often results in motion blur, depending on the speed of the objects in the frame. Smart HDR aims to rectify this. Harnessing the power of the new ISP and processor, Smart HDR lets you shoot pictures of moving objects with zero shutter lag. This is quite impressive, considering that the camera needs to take multiple photos at the exact moment you press the shutter button.
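Apple doesn't detail how it achieves this, but zero shutter lag is commonly implemented with a ring buffer: the camera streams frames continuously, and pressing the shutter simply grabs the frames that were captured just before the press. A minimal sketch of the idea, with hypothetical names:

```python
from collections import deque

class ZeroShutterLagBuffer:
    """Keep the most recent frames so a shutter press can use
    frames captured *before* the button was actually hit."""

    def __init__(self, depth=4):
        self.frames = deque(maxlen=depth)  # oldest frames drop off automatically

    def on_new_frame(self, frame):
        # Called for every frame the sensor streams, even before any capture.
        self.frames.append(frame)

    def on_shutter_press(self):
        # The buffered frames *are* the photo's raw material: no waiting,
        # hence no perceptible lag between the press and the exposure.
        return list(self.frames)
```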

The A12 Bionic allows the camera to shoot a four-frame buffer of the scene, so it can freeze the motion in the frame. At the same time, the chip captures secondary frames at different exposure levels, which are used to bring out details in the highlights and shadows. But that's not all. Smart HDR also shoots a long exposure during all of this, so it can pay special attention to the shadows and possibly recover even more detail. Of course, this would vary on a per-scene basis, but it could be especially useful in high-contrast scenes where you have bright highlights and deep shadows.

But wait, there's more! After all of this is done, which doesn't take long (zero shutter lag, remember?), Smart HDR then takes all the images, analyzes them, and decides how to match up the best parts of the best photos for the best possible result.
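How exactly Smart HDR decides what "the best parts" are is Apple's secret sauce, but a crude stand-in would be to score each candidate frame region by region and keep the winner, say, by local sharpness. A toy Python sketch (a real pipeline would blend tiles smoothly rather than hard-copy them, to avoid visible seams):

```python
import numpy as np

def sharpness(tile):
    # Crude sharpness score: variance of the horizontal gradient.
    return np.diff(tile, axis=1).var()

def merge_best_tiles(frames, tile=8):
    # For each tile, copy it from whichever frame scores sharpest there.
    out = np.empty_like(frames[0])
    h, w = out.shape[:2]
    for y in range(0, h, tile):
        for x in range(0, w, tile):
            candidates = [f[y:y + tile, x:x + tile] for f in frames]
            out[y:y + tile, x:x + tile] = max(candidates, key=sharpness)
    return out

merged = merge_best_tiles([np.random.rand(32, 32) for _ in range(3)])
```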

Better portrait mode




Portrait Mode now lets you change the depth of field after taking the photo

Ah, bokeh – the quality of the out-of-focus blur in photographs. The thing that separates the pros from the amateurs (or so many people will tell you). It doesn't matter if you've just propped your model on a table with random lights behind them. As long as the subject is in sharp focus against a creamy background, that's guaranteed Instagram likes!

To improve the shallow depth of field simulation for the new iPhones, Apple engineers have studied the qualities of many professional lenses and cameras. The result is better object separation and vastly improved bokeh rendering, if the official camera samples are anything to go by.

Another new feature, one that Huawei users have been enjoying for years, is the ability to adjust the depth of field (DoF) after capturing an image. This is done by dragging a slider, which is meant to simulate opening up and stopping down the aperture of a camera lens. The slider goes from f/1.4, which is the widest setting and has the shallowest DoF, all the way to f/16, which is the narrowest and has the deepest DoF.
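The physics being simulated here is straightforward: background blur scales with the aperture's diameter, which is the focal length divided by the f-number, so roughly speaking, halving the f-number doubles the blur. A hypothetical mapping from the slider's f-number and a depth-map value to a blur radius might look like this (the constants are illustrative, not Apple's):

```python
def blur_radius_px(f_number, depth_m, focus_m, max_blur=25.0):
    """Map the slider's f-number and a pixel's depth to a blur radius.

    Blur grows as the aperture opens (diameter ~ 1 / f_number) and as the
    pixel sits further from the focal plane. All constants are made up.
    """
    aperture_scale = 1.4 / f_number             # 1.0 at f/1.4, ~0.09 at f/16
    defocus = abs(depth_m - focus_m) / focus_m  # distance from the focal plane
    return min(max_blur, max_blur * aperture_scale * defocus)

# Slider at f/1.4 (widest) vs f/16 (narrowest), background at 3x focus distance:
print(blur_radius_px(1.4, depth_m=4.5, focus_m=1.5))   # 25.0 (capped: strong blur)
print(blur_radius_px(16.0, depth_m=4.5, focus_m=1.5))  # 4.375 (subtle blur)
```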


But aside from aiming to render the creamiest bokeh possible, the improved Portrait Mode is also supposed to deliver better skin tones and overall better-adjusted, professional-looking images. This is done thanks to the enhanced True Tone flash and the combined powers of the ISP and the CPU, which allow the iPhone XS to automatically adjust many settings and aspects of your photos on the fly, with a single tap of the shutter button.

Say you are taking a portrait of a friend. You frame them, press the shutter button, and get the final image. What you don't know is that in the fraction of a second between pressing the button and seeing the result, the A12 Bionic is crunching numbers like mad. It detects the face, uses local tone mapping to make it stand out, employs Smart HDR to ensure an even exposure across the frame, reduces the noise, tweaks the white balance, and then fuses everything together into one final image.
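As a mental model, you can think of that fraction of a second as a fixed pipeline the chip runs on every shot. The sketch below is purely illustrative; the stage names mirror the steps just described, not any actual Apple API:

```python
# Every stage here is a stub standing in for real processing; the names
# mirror the steps described above, not any real Apple interface.
def detect_faces(frame):          return [(10, 10, 40, 40)]  # dummy face box
def local_tone_map(frame, faces): return frame               # make faces pop
def smart_hdr_merge(frames):      return sum(frames) / len(frames)
def reduce_noise(frame):          return frame
def white_balance(frame):         return frame

def process_portrait(raw_frames):
    """The 'fraction of a second' between press and preview, as a pipeline."""
    faces = detect_faces(raw_frames[0])
    mapped = [local_tone_map(f, faces) for f in raw_frames]
    return white_balance(reduce_noise(smart_hdr_merge(mapped)))

photo = process_portrait([0.2, 0.5, 0.8])  # toy frames as plain numbers
```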

Of course, Portrait Lighting is making a return, and though Apple doesn't mention any new additions to this particular feature, it should still benefit from the overall improvement of Portrait Mode – more specifically, from the enhanced depth measurement.


Improved low-light performance for video and stills



Official low-light portrait samples from the iPhone XS

Low-light shooting was one of the weak points of smartphone photography for many, many years. This has changed in recent times, and Apple promises that the iPhone XS and XS Max will deliver when it comes to shooting stills and video in poor lighting conditions.

This is another area where the new A12 Bionic chip delivers in spades. Thanks to the upgraded, octa-core Neural Engine, the iPhone XS is supposed to deliver sharp, clean, and color-accurate results in very dimly lit scenes. Apple shared a number of portrait photos taken at night, and they look quite impressive, with great detail retention in the shadows. We're going to postpone our final judgment for when we get our hands on the iPhone XS and XS Max, but from what we've seen thus far, low-light performance seems to have improved quite a bit since the iPhone X.
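One well-understood reason multi-frame capture helps so much in the dark: averaging N aligned frames cuts random sensor noise by roughly the square root of N. A quick numerical demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)
true_scene = 0.1                       # a dim, uniform patch
frames = true_scene + rng.normal(0, 0.05, size=(8, 64, 64))  # 8 noisy exposures

single = frames[0]
stacked = frames.mean(axis=0)          # align + average (alignment omitted here)

print(f"noise, single frame: {single.std():.4f}")   # ~0.0500
print(f"noise, 8 stacked:    {stacked.std():.4f}")  # ~0.0177 (≈ 0.05 / sqrt(8))
```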

Apple also promises that many of these impressive features will be available when shooting video, including tone mapping, improved color rendering, and stereo recording. However, some of these bells and whistles, including the dynamic range enhancements, will only be available when shooting at 30fps. The good news is that they still apply to 4K video, and that if you'd rather have a higher frame rate, you'll be able to forgo them and still shoot 4K at 60fps.
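In other words, the trade-off boils down to a simple capture-configuration rule. A sketch of that decision logic (our own illustration, not Apple's API) could look like this:

```python
def choose_video_config(prefer_high_fps: bool) -> dict:
    """Illustrative trade-off from Apple's spec sheet: extended dynamic
    range is available up to 30fps, while 4K at 60fps forgoes it."""
    if prefer_high_fps:
        return {"resolution": "4K", "fps": 60, "extended_dynamic_range": False}
    return {"resolution": "4K", "fps": 30, "extended_dynamic_range": True}
```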

We don't have any video samples in hand right now, but Apple shared a "Shot on iPhone" video that showcases an interesting video experiment done with the iPhone XS.


From what we've seen so far, the iPhone XS camera is shaping up to be one of the phone's biggest stand-out features. It may not be too different from last year's model in terms of actual optics, but its performance is vastly improved through the combination of powerful hardware and software. The new A12 Bionic chip, with its upgraded Neural Engine, is capable of performing trillions of operations every second, which allows Apple to pull off tricks such as taking a dozen images with a single tap of the shutter button and fusing them together in an instant. Wider apertures would have been welcome on both the main wide-angle camera and the secondary telephoto shooter, but it seems we'll have to wait another year for that. Either way, if the official sample photos from the iPhone XS are anything to go by, then who cares about aperture and megapixels?

Of course, as is usually the case with smartphone cameras, we'd rather postpone any real judgment until after we've spent some time with the devices in question. We can't wait to thoroughly test out the new iPhones in all kinds of scenarios, but we're going to have to wait a little longer. Join us then for our definitive camera comparisons and review of the iPhone XS camera.

