Dual cameras explained: how do they work and what are the differences?

As this year’s Mobile World Congress in Barcelona clearly proved, a new trend has officially started in the smartphone world, and it’s that of the dual camera. Granted, the idea itself isn’t exactly new: in fact, it goes back at least to 2011 when the HTC Evo 3D, the first notable phone capable of recording 3D video, was released. While this particular application of the idea never really panned out, manufacturers are still keen on further exploring alternatives, judging by many of this year's upcoming flagships.

Even if the general idea of using two sensors to take one photo is shared between these devices, this is as far as similarities go, as each phone maker has a slightly different opinion regarding the way this tech should work. This, unfortunately, can be a major source of confusion, but luckily for you, we’ve taken on the task of demystifying the hot new thing in tech, so you don’t have to. So whether you’re in the market for a new device, or are simply wondering what all the fuss is about, we’ve got you covered:

The Apple approach



Let’s start with the most obvious contender: the iPhone 7 Plus. Launched last year, Apple’s larger flagship offered a feature its sibling lacked: the dual-camera setup on its back. With it, the company chose to address a small yet important problem in smartphones: the lack of optical zoom. Sure, there’s always the option of zooming in digitally, but that’s a lossy process that noticeably degrades image quality.

The iPhone 7 Plus’ camera solves this in a rather inelegant, but still ingenious way: the second camera on its back uses a different, permanently zoomed-in telephoto lens, with the device switching between the two whenever necessary. While this sounds like a hacky solution, it works wonders, in large part thanks to Apple’s software, which makes the transitions as unobtrusive as possible.
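To make the mechanism a bit more concrete, here’s a minimal sketch in Python of how such a crossover between the two cameras might work. The function name and the 2x switch point are our own illustrative assumptions, not Apple’s actual implementation:

```python
# Hypothetical sketch: pick between two rear cameras based on the zoom
# level the user requests. The 2x crossover point is an illustrative
# assumption, not Apple's real logic.

def pick_camera(requested_zoom: float) -> tuple[str, float]:
    """Return which sensor to use and how much digital crop to apply on top."""
    TELEPHOTO_NATIVE_ZOOM = 2.0  # the second lens is "permanently zoomed in"

    if requested_zoom < TELEPHOTO_NATIVE_ZOOM:
        # Below the crossover, stay on the wide camera and crop digitally.
        return "wide", requested_zoom
    # At or above it, switch to the telephoto and crop from there.
    return "telephoto", requested_zoom / TELEPHOTO_NATIVE_ZOOM


if __name__ == "__main__":
    for zoom in (1.0, 1.5, 2.0, 3.5):
        sensor, digital_crop = pick_camera(zoom)
        print(f"{zoom}x -> {sensor} camera, {digital_crop:.2f}x digital crop")
```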

A side effect of this approach is that the two rear cameras also have different focal lengths, which Apple puts to good use in Portrait Mode. Even though it was added in an update some time after the phone’s release, it’s still become one of the device’s defining features. In short, it works like this: the phone uses the two cameras to estimate how far away everything in the frame is, then blurs the background so that the person or object in the foreground is the only thing in focus. While the effect can look a bit artificial at times, it’s still light years ahead of the purely software-based solutions we’ve had before.
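For those curious about the mechanics, here’s a rough sketch of the general depth-based blur idea in Python with OpenCV. It assumes two already aligned grayscale frames from the two cameras; the thresholds and file names are placeholders, and this is not Apple’s actual pipeline:

```python
# Rough sketch of depth-based background blur from a stereo pair.
# Assumes "wide.png" and "tele.png" are aligned frames of the same scene.
import cv2
import numpy as np

left = cv2.imread("wide.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("tele.png", cv2.IMREAD_GRAYSCALE)
color = cv2.imread("wide.png")  # full-color frame we actually blur

# Estimate per-pixel disparity (a stand-in for depth) from the stereo pair.
stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
disparity = stereo.compute(left, right).astype(np.float32)

# Treat high-disparity pixels (close to the camera) as the subject.
subject_mask = (disparity > np.percentile(disparity, 75)).astype(np.uint8)

# Blur the whole frame, then paste the sharp subject back on top.
blurred = cv2.GaussianBlur(color, (31, 31), 0)
mask_3ch = cv2.merge([subject_mask] * 3)
result = np.where(mask_3ch == 1, color, blurred)

cv2.imwrite("portrait.png", result)
```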

While the iPhone 7 Plus is the most well-known device using this approach, it’s not the only one. At this year’s CES, Asus showed off its upcoming Zenfone 3 Zoom, which does pretty much the same thing, but with a triple zoom and improved focusing capabilities. It also includes its own portrait mode functionality, though it remains to be seen how well it actually performs in real-life conditions.

The LG approach


Before Apple had the iPhone 7 Plus, however, LG had been doing something similar with its LG G5, though few people are aware of that fact, what with the phone being a rather spectacular market failure. The upcoming G6 hasn't abandoned the idea, and has in fact improved upon it with an upgraded second sensor.

Compared to Apple’s approach, the way LG handles the second camera is almost opposite: where the 7 Plus has a permanently zoomed-in camera, the G6’s appears zoomed out, due to its much wider field of view. This method isn’t really widely used, though the G6 could soon change that, as we expect it to sell pretty well. Some people, however, might not be pleased with the results, as the wide field of view also noticeably distorts the image, particularly along the edges.
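If you’re wondering why a shorter (wider) lens sees so much more of the scene, the relationship between focal length and field of view is simple enough to show in a few lines of Python. The numbers below are generic 35mm-equivalent figures, not the G6’s actual specs:

```python
# Illustrative only: horizontal field of view for a simple rectilinear lens,
# computed from the sensor width and focal length.
import math

def horizontal_fov(sensor_width_mm: float, focal_length_mm: float) -> float:
    """Horizontal field of view in degrees."""
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# 35mm-equivalent focal lengths on a 36 mm-wide frame.
print(horizontal_fov(36, 28))  # "normal" camera: roughly 65 degrees
print(horizontal_fov(36, 12))  # wide-angle camera: roughly 113 degrees
```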

The Huawei approach



This idea, as implemented by several of Huawei’s flagships, including the upcoming P10, is the most technically complicated of the bunch – it, again, involves two separate sensors, but the difference here is that one of them shoots exclusively in monochrome. Since that sensor has no color filter in front of it, it captures more light, which in practice means the camera setup is more sensitive in the dark and can therefore take better photos in low-light conditions.
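Here’s a simplified sketch of the core fusion idea in Python with OpenCV: keep the color information from the regular sensor, but take the brightness information from the cleaner monochrome frame. Real pipelines also align, denoise and tone-map the frames; the file names here are placeholders, and this is not Huawei’s actual algorithm:

```python
# Toy RGB + monochrome fusion: swap the color frame's luminance for the
# monochrome sensor's output. Assumes both frames are the same size and
# already aligned.
import cv2

color = cv2.imread("rgb_frame.png")                        # regular color sensor
mono = cv2.imread("mono_frame.png", cv2.IMREAD_GRAYSCALE)  # monochrome sensor

# Split the color frame into luminance (Y) and chroma (Cr, Cb) channels.
ycrcb = cv2.cvtColor(color, cv2.COLOR_BGR2YCrCb)
_, cr, cb = cv2.split(ycrcb)

# Swap in the monochrome sensor's luminance, which gathers more light.
fused = cv2.merge([mono, cr, cb])
result = cv2.cvtColor(fused, cv2.COLOR_YCrCb2BGR)

cv2.imwrite("fused.png", result)
```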

A bonus of this approach is that users can take monochrome pictures of much higher quality than color ones, which is a welcome addition for those who like the effect. Unfortunately, in our review of the Huawei P9, which uses almost the same tech as the P10, we found video quality to be quite lacking, though photos were still of adequate quality.

Even if Huawei is the company spearheading this particular approach, it’s far from the only one to use it. Just last September, Qualcomm unveiled its so-called Clear Sight technology, which is another implementation of the same idea, but more easily and cheaply accessible to manufacturers. Xiaomi has already used it in last year’s Mi 5s Plus, and other device makers are bound to jump on it, too, so we'll likely be seeing more of it in the future.

Past, present and future


While these are the popular ideas being used right now, they are certainly not the only ones available. For example, the aforementioned HTC Evo 3D was designed to use its dual-camera module for stereoscopic 3D, and even featured a glasses-free 3D display (the same idea as in the Nintendo 3DS). Unfortunately, 3D turned out to be far less popular with consumers than manufacturers had hoped, prompting the eventual abandonment of the idea.

A much less popular approach that is still used nowadays also warrants mentioning: the one found in the mid-range honor 6X, as well as some of HTC's older flagships. It relies on two very different sensors, one with high and one with low resolution, and uses the second one only for various special effects, including iPhone-like bokeh. Most of the time, these effects are a fun but cheap way of making an image more interesting, though for many users that's all they need.

As for the future, Oppo just recently showed off its newest camera tech, dubbed 5x, which also features a dual-sensor setup, but with a slight twist. One of the sensors is attached to a telephoto lens hidden inside the device’s chassis, letting users get up to 5x “lossless” optical zoom, which could prove to be far better than current tech. We haven’t seen this idea implemented in a real product just yet, but we expect Oppo won’t keep us waiting for too long.

Another possible use of the technology, augmented reality, is represented by Google’s Tango platform. Granted, we’re cheating a little here, as Tango requires three or more sensors to function properly, but the idea is generally the same: using a multi-camera module for a singular purpose. And the results themselves are promising, even if the platform doesn’t yet feel like a finished product.

So far there are only a couple of devices supporting Tango out in the wild, but it may turn out to be a Big Thing some time in the near future, provided Google polishes it well enough. The same goes for multi-camera systems, in fact – while current solutions have their own problems, those are diminishing with each subsequent release, and we’re excited to see what comes next.
