iPhone 13 camera before iPhone 14 launch: Apple and Android keep failing to make real camera phones

This article may contain personal views and opinion from the author.
Photography, as your parents or grandparents used to know it, is a dying art.

If 20 years ago the idea of a photograph was to capture an important moment in one’s life as authentically as possible, today we’re living in a different world… Fair enough, not everyone owned a camera in 1980, and phones have managed to make this once unattainable thing very accessible, which is wonderful!

However, as it turns out in 2022, the world is less about authenticity and more about “making everything better” - whatever that’s supposed to mean. Nowadays, photos (amongst other things) are supposed to enhance our reality and make it “cool and fun”. Your baby can have bunny ears, and you can puke rainbows. Why not!

But there’s more to it than the Snapchat filters that change the way your photos look, and it all boils down to something called computational photography. It’s the “hidden filter” that makes photos taken with your phone look “ready to share online”.

This little experiment will try to show the pros and cons of modern, computational photography-driven phone cameras, and the phone I’ve chosen is Apple's iPhone 13 - one of the most popular phones of the past ten months.

Instagram’s effect (filters) on smartphone cameras today is seriously underestimated


Before I show you a bunch of “before and after” sample photos, let me establish something: I’m well aware that people like photos that are ready to be shared online. And while I might not be one of them, I think I might know what happened here…

In a nutshell, social media played a huge role in the demand for “Instagram-ready” photos (a term we actually use in the tech community). Speaking of The Gram, ever since it emerged in 2010, the photo and video sharing social network has encouraged the use of bold filters with exaggerated colors, which people simply couldn’t resist - and that, of course, meant Apple and Android makers would jump on board…

For instance, Instagram was the reason Apple felt the need to include a Square Photo mode in the iPhone 5S (2013), which was part of the iPhone's camera for nearly a decade. However, even more importantly, this was around the time when iPhone and Android started adding photo filters to their stock camera apps. Because the Instagram fever made it clear that people liked filters.


And then… we entered the era of what I call “filters on steroids” or “hardcore computational photography” - or “sophisticated filters”, if you’d like. The phone that, in my mind, represents the adoption of hardcore computational photography is Google’s Nexus 6P, where most of the computational photography came in the form of something called HDR+.



In essence, HDR+ was “advanced image stacking”: it ran in the post-processing stage of taking a photo with the Nexus 6P/Nexus 5X, and its role was to balance out the highlights and shadows in high-contrast scenes - one of the biggest challenges for phones back in 2014-2015 (alongside the sheer inability to produce usable night photos).
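To give a rough idea of what that kind of image stacking involves, here is a simplified sketch using OpenCV’s off-the-shelf alignment and exposure-fusion tools - it is not Google’s actual HDR+ pipeline, and the file names are placeholders:

```python
# A simplified burst-stacking sketch in the spirit of HDR+ (NOT Google's
# actual pipeline): align a handful of frames, then fuse them so the
# highlights are tamed and the shadows keep their detail.
import cv2
import numpy as np

# Hypothetical burst of frames shot in quick succession (placeholder names)
frames = [cv2.imread(f"burst_{i}.jpg") for i in range(4)]

# Align the frames to each other to compensate for hand shake
align = cv2.createAlignMTB()
align.process(frames, frames)

# Exposure fusion (Mertens) weighs each pixel by contrast, saturation and
# well-exposedness, producing a balanced result without a tone-mapping step
merge = cv2.createMergeMertens()
fused = merge.process(frames)  # float image, values roughly in [0, 1]

cv2.imwrite("stacked.jpg", np.clip(fused * 255, 0, 255).astype("uint8"))
```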

Anyway, to give a short verdict on HDR+: it made the Nexus 6P one of the best phones for taking photos. Sure, my bias plays a role in that statement (I never bought a Nexus 6P, but only because I couldn’t afford one), but there was no denying that the somewhat darker photos Google’s 2015 flagships took had something very appealing about them. Other tech enthusiasts loved them too.

Light, highlights and shadows: What photography really should be about


It wasn’t until about a year ago, when I watched a brilliant 24-minute video by David Imel, that I managed to verbalize what I felt about the time when the Nexus 6P’s and the original Google Pixel’s cameras ruled the phone camera industry.

To sum up 24 minutes of storytelling, David draws a parallel between modern computational photography and classical art in an attempt to explain the importance of light in both photography and painting.

What he's trying to explain is that in the early days of photography, the artistic element was founded entirely “on the intensity of the highlights and the deepness of the shadows” - just like in paintings. Those are what evoke feelings and create depth through tonality in our photos. This is especially evident in monochrome photography, where light, shadows, and highlights are pretty much the only elements that create nuance and perspective.

But, as he says, “computational speed was advancing a lot faster than physics was changing”, and it looks like this is why I don’t like many of the photos that my super-powerful iPhone 13 takes and wish they were more like the original Google Pixel’s images.



iPhone 13, Galaxy S22, Pixel 6 take photos that don’t represent reality and aren’t always more appealing than what the real scene looks like



What we see here are a bunch of photos I’ve taken with the iPhone 13 in full auto mode. It’s important to note that I didn’t start taking photos in order to make my point, but the photos the iPhone 13 gave me became the reason to write this story...

Anyway, the iPhone 13 photos taken in Auto Mode are on the left, and the same photos, which I’ve edited, are on the right. I’ve adjusted them not to my liking but to match the authenticity of the scene at the time (and to the best of my ability).

I chose to edit the photos using the iPhone’s photo editing abilities because that’s what most people have access to. Of course, Lightroom would’ve given me a lot more (and better) control over the different properties of the images (which weren’t taken in RAW format), but that’s not the idea here.

If you’re curious, what helped most in getting the iPhone 13’s photos to look closer to the real scene was dragging the Brightness and Exposure sliders way down - which tells you just how bright photos from modern phones really are. Some Brilliance, Highlight, and Shadow adjustments then got me an even more accurate result.
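If you want to reproduce that kind of correction outside the Photos app, here is a rough desktop analogue using Pillow - it is not the algorithm Apple’s editor uses, and the file names are placeholders; it simply illustrates dialing the output back down:

```python
# A rough desktop analogue of pulling the iPhone editor's sliders back:
# darken the overly bright output and restore a bit of natural contrast.
from PIL import Image, ImageEnhance

img = Image.open("iphone13_auto.jpg")  # placeholder file name

# Brightness: 1.0 leaves the image unchanged, values below 1.0 darken it
img = ImageEnhance.Brightness(img).enhance(0.85)

# A touch more contrast brings back some depth in the shadows
img = ImageEnhance.Contrast(img).enhance(1.10)

img.save("iphone13_adjusted.jpg")
```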

iPhone 13, Galaxy S22 and Pixel 6 showcase the problems of modern HDR and computational photography



The results tell me that computational photography on phones today is very much hit or miss.

On the one hand, some people will like the default output of the iPhone 13, Galaxy S22, and Pixel 6 (the Galaxy also takes photos that are too bright, while the Pixel’s are incredibly flat), because they are “sharable”.

But even if we leave authenticity aside, I’d argue the iPhone's processing doesn't actually make photos look “better” than what the scene looked like. Take another glance at the samples shown above. Which photos do you like more? The ones on the left or the ones on the right?


Apple, Samsung, Google & Co have made some staggering progress in all three stages of the photography pipeline: capture, thanks to large camera sensors; processing, thanks to fast chips, including dedicated image processors; and display, thanks to super-bright, color-accurate screens that let you view your photos. However, I’d argue that, as so often happens, we don’t know when to stop… As things stand, most phone makers are abusing the incredible software and hardware power that the modern phone camera offers.

Photos and even videos taken with the iPhone 13 and other modern phones often appear too bright, oversharpened, flat, and ultimately “lifeless”. Sure, they might capture both the highlights and the shadows incredibly well and even turn night into day thanks to Night Mode, but without the element of balance and natural contrast, photos taken with most phones won’t evoke any feelings...

But hey! They look fine on Instagram.

In the end: There’s light at the end of the tunnel of computational photography thanks to Sony and Xiaomi



To end on a positive note, there’s light (pun intended) at the end of the tunnel!

Unlike Apple and Samsung, companies like Sony have always tried to stick to the basics of photography, and that’s evident from the fact that the Sony Xperia 1 IV has incredible processing power but doesn’t even include a Night Mode in its camera. The phone also brings the first continuous optical zoom on a modern smartphone, which is as close to a “real camera zoom” as we’ve ever gotten.

And then, of course, we have the Xiaomi 12S Ultra, which uses a full 1-inch sensor and Leica’s magic to deliver some of the best photos I’ve ever seen come out of a phone camera (if not the very best). Xiaomi and Leica chose to let the shadows be shadows, avoid oversharpening, and rely on groundbreaking hardware, which (shocker!) results in photos with incredible depth and natural detail.



So, I call for Apple, Samsung, and even Google to go back and look at the original Pixel; go back and look at the iPhone 4S (as unimpressive as its camera might seem today), and bring back the realism in our photos. I’m sure that with the increasing power of hardware and software, a touch of authenticity can go a long way!

And you know - for those who want bright and saturated photos… Give them filters!