Lots of high-end smartphones now have multiple camera modules and lenses on the back. But why? The truth is that they do different things on different phones, so we’re here to break it all down for you.

Multiple rear cameras are a luxury feature at the moment only found on the most expensive phones—like the iPhone X—but the trickle-down nature of mobile tech means we’ll soon be seeing them on less expensive models, too, so it’s good to get familiar with how it all works.

The “Two Are Better Than One” Approach

Different camera modules and lenses are better at different tasks. A wide-aperture, wide-angle lens is great at gathering light and capturing sharp detail up close, but not so great with faraway subjects. A longer lens can magnify distant subjects, but lets in less light.

The Galaxy S9 uses identical sensors, but different lenses on its two cameras.

With a conventional camera, taking two photos with two different lenses isn’t all that useful; you’ll just end up with two mediocre images. But with specialized image processing software, a phone can combine the strengths of both lenses and sensors while minimizing their weaknesses. The result is a single image that’s brighter, sharper, and clearer than either camera could achieve on its own.

Combining multiple images isn’t a new technique. That’s how HDR (“High Dynamic Range”) photography works: photographers take multiple shots at different exposure levels to capture detail in both the shadows and the highlights, then merge them into a single image. Phone camera processing simply automates this sort of merging and applies it near-instantly, giving users better-looking photos, especially in low light.
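If you’re curious what that kind of merging looks like under the hood, here’s a toy sketch in Python with NumPy. It’s not any phone maker’s actual pipeline (real HDR also aligns the frames and tone-maps the result), and the `fuse_exposures` function and its weighting are illustrative assumptions:

```python
import numpy as np

def fuse_exposures(exposures):
    """Merge several exposures of the same scene into one image.

    Each pixel is weighted by how well exposed it is: values near the
    middle of the 0..1 range carry more weight than blown-out
    highlights or crushed shadows.
    """
    stack = np.stack([np.asarray(e, dtype=np.float64) for e in exposures])
    # Gaussian-style weight centered on mid-gray (0.5); sigma is a guess.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0)  # normalize weights across exposures
    return (weights * stack).sum(axis=0)

# A dark and a bright capture of the same two-pixel "scene":
dark = np.array([0.05, 0.45])    # shadows crushed, midtone fine
bright = np.array([0.50, 0.98])  # shadows lifted, highlight blown
fused = fuse_exposures([dark, bright])  # each pixel favors the usable frame
```

Each output pixel leans toward whichever frame exposed it best, which is the core idea behind the merging a phone does automatically.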

Now, image processing does a lot of other stuff, too, some of which isn’t so much helping the image as manipulating it. The “bokeh” portrait effect is a good example: most phone cameras simply blur part of the image artificially to mimic the shallow depth of field of a wide-aperture lens on a regular camera. But in general terms, high-end phones with dual lenses and advanced image processing can outperform their single-lens counterparts.
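That artificial blur is conceptually simple. Here’s a minimal NumPy sketch, assuming the phone has already figured out which pixels belong to the subject (the hypothetical `subject_mask`); real phones use depth data from the second camera and much fancier lens-blur kernels:

```python
import numpy as np

def fake_bokeh(image, subject_mask, radius=2):
    """Blur everything outside subject_mask to mimic shallow depth of field.

    A crude box blur stands in for the fancier lens-blur kernels real
    phones use; subject_mask marks the pixels to keep sharp.
    """
    padded = np.pad(image, radius, mode="edge")
    blurred = np.zeros(image.shape, dtype=np.float64)
    k = 2 * radius + 1
    for dy in range(k):          # average a k-by-k neighborhood
        for dx in range(k):
            blurred += padded[dy:dy + image.shape[0], dx:dx + image.shape[1]]
    blurred /= k * k
    return np.where(subject_mask, image, blurred)
```

The subject stays pixel-for-pixel identical while the background is smeared, which is why the effect can look convincing until the mask gets an edge wrong.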

The “Double Zoom Option” Approach

Phone cameras are gaining amazing capabilities, but one thing they’re still not very good at is zoom. Phone bodies are simply too small and thin to house the miniaturized electronics and optics needed for true zoom photography, short of outlandish designs like the Samsung Galaxy S4 Zoom. (You’ll notice that this brief design trend disappeared pretty quickly.)

But using multiple camera modules and lenses can alleviate these issues, at least to some degree. The secondary lens in high-end phones can be set to a longer focal length, generally marketed as “2x” zoom. The results won’t beat a DSLR or even a decent point-and-shoot with a true zoom lens, but if your phone is the only camera you use, it’s better than digital zoom (which merely crops and blows up part of the image). For example, the iPhone pairs what Apple calls a primary “wide angle” camera with a secondary “telephoto” camera, the latter at approximately twice the zoom of the former.
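The difference is easy to demonstrate: digital zoom just crops the center of the frame and scales it back up, inventing no new detail. This hypothetical NumPy sketch does exactly that:

```python
import numpy as np

def digital_zoom(image, factor=2):
    """'Zoom' by cropping the center of the frame and scaling it back up.

    Nearest-neighbor upscaling via np.repeat creates no new detail,
    which is why true telephoto optics look better.
    """
    h, w = image.shape[:2]
    ch, cw = h // factor, w // factor          # size of the center crop
    top, left = (h - ch) // 2, (w - cw) // 2   # crop offsets
    crop = image[top:top + ch, left:left + cw]
    return crop.repeat(factor, axis=0).repeat(factor, axis=1)
```

A 2x telephoto lens, by contrast, captures that same center region at the sensor’s full resolution instead of stretching existing pixels.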

The second lens is also usually set at a different f-stop: the ratio of the lens’s focal length to the diameter of its aperture. This is a physical property of the camera module, and it means the telephoto lens works with less light than the standard lens, so it takes darker, less sharp photos. Again, image processing that combines multiple images can help alleviate this. It also enables some more interesting software tricks, like Samsung’s ability to take two photos at once and “add in” the portions of the scene that the zoomed shot would otherwise miss.

The “Wizard Of Oz” Approach

Wizard of Oz isn’t a technical term. Rather, it’s a way to remember another dual camera setup: color and black and white. In some models, the two camera modules are assigned to take color and monochrome images, respectively. With default settings this doesn’t result in two photos; instead, the phone produces a single photo that uses the color information from one sensor to augment the sharp detail of the other.

Simultaneous images from the Essential Phone’s monochrome and color sensors.

Once again, this dual setup relies on the phone’s image processing software to work most of its magic, compensating for the size constraints that keep phones from housing bigger camera modules. The monochrome camera’s different properties can also help the phone focus faster or show a preview that more accurately reflects the final image.
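As a rough illustration of the color-plus-mono idea, here’s a hedged NumPy sketch that borrows per-pixel color ratios from the color frame and luminance detail from the monochrome frame. Real pipelines also have to align the two frames, since the cameras sit a small distance apart:

```python
import numpy as np

def fuse_mono_color(mono, color):
    """Combine a sharp monochrome frame with a color frame.

    Luminance (detail) comes from the mono sensor; per-pixel color
    ratios come from the color sensor. Purely illustrative.
    """
    # Luminance of the color frame, using Rec. 601 weights.
    luma = color @ np.array([0.299, 0.587, 0.114])
    luma = np.clip(luma, 1e-6, None)[..., None]  # avoid divide-by-zero
    chroma = color / luma                        # color ratios per channel
    return np.clip(chroma * mono[..., None], 0.0, 1.0)
```

Because only the ratios between channels come from the color sensor, the fused image keeps the mono sensor’s sharpness while preserving each pixel’s hue.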

There’s at least one new premium phone that combines all of the above techniques in a massive triple-camera setup: the Huawei P20 Pro. This phone includes three rear cameras: an 8-megapixel 3x telephoto camera for long-distance shooting, a primary 40-megapixel camera for color images and portraits, and a 20-megapixel monochrome camera for collecting sharper image detail. It probably won’t be the last phone to try this approach; there are already rumors of a forthcoming triple-camera iPhone.

Other Dual Camera Setups

There are other dual camera systems that don’t fit neatly into the categories above, though most of those designs have since been retired or abandoned. Examples include:

HTC’s “Ultrapixel” setup: one low-resolution sensor with extra-large pixels and a low f-stop lens, combined with a more conventional camera. HTC has since abandoned its dual camera designs in favor of a single, more flexible “Ultrapixel” module.

Older designs like the HTC Evo 3D used dual cameras for 3D video.

Older 3D camera phones: some Android models used two identical camera modules with a considerable gap in between to take photos and videos with a “3D” effect. These designs were usually paired with a 3D lenticular screen, and interest in this feature has died along with the brief 3D TV product category.

Augmented Reality: specialized phones like the Lenovo Phab 2 Pro use dual lenses and modules to accurately measure and map the physical space around them.

Michael Crider is a veteran technology journalist with a decade of experience. He spent five years writing for Android Police, and his work has appeared on Digital Trends and Lifehacker. He’s covered industry events like the Consumer Electronics Show (CES) and Mobile World Congress in person.