iPhone 16 Pro’s Ultra-Wide Explained: Understanding the Give and Take

Close-up of a smartphone's rear camera system featuring three lenses and a flash, set against a blurred gray background. The lenses are arranged in a triangular pattern.

As more people have gotten their hands on the iPhone 16 Pro and Pro Max smartphones, the change Apple made to the ultra-wide camera — namely the boost in resolution up to 48 megapixels — is getting more attention.

Since PetaPixel published its review last month, both Lee Morris of Fstoppers and Sebastiaan de With of Lux (the company behind Halide) have published their findings, and each came to similar conclusions regarding the ultra-wide camera.

“The ultra-wide lens is slightly sharper in bright light but falls apart when the light drops off,” Morris says.

That is an expected result given how Apple approached the sensor design of the ultra-wide. The sensor is the same size as the one in last year’s iPhone 15 Pro, but the pixels are smaller: the quad pixels shrank from 1.4 micrometers to 0.7 micrometers. That buys more resolution at the cost of detail in less-than-optimal conditions, a tradeoff Apple chose deliberately and one that is in line with its competitors in the smartphone space.

More pixels, with similar performance in optimal conditions at the expense of performance in very demanding low light, is the choice being made across the smartphone space right now.
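To put rough numbers on that tradeoff, here is a back-of-the-envelope Swift sketch using the pixel pitches quoted above. It is a purely geometric illustration that ignores microlenses, quantum efficiency, and Apple’s processing, not a measurement.

```swift
import Foundation

// Back-of-the-envelope comparison of per-pixel light gathering.
// Pixel pitches are the figures quoted above; this is a purely geometric
// model that ignores microlenses, quantum efficiency, and processing.
let oldPitch = 1.4   // micrometers, iPhone 15 Pro ultra-wide
let newPitch = 0.7   // micrometers, iPhone 16 Pro ultra-wide

// Light collected scales roughly with photosite area (pitch squared).
let areaRatio = (newPitch * newPitch) / (oldPitch * oldPitch)

print(String(format: "Each new photosite collects ~%.0f%% of the light", areaRatio * 100))
// Prints "~25%": each photosite gathers about a quarter of the light, which is
// why the extra resolution shows up in bright scenes but not in dim ones.
```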

Apple tells PetaPixel that it sees users loving higher-resolution photos, and that this is especially true of the ultra-wide. Big landscapes with grand vistas look better thanks to the added resolution. Apple was also able to improve macro photos — the iPhone’s closest focusing distance also uses the ultra-wide lens — because it has more pixels and more data to work with, even though those pixels are smaller.

A narrow, covered pathway with a stone floor leading through a shaded area framed by wooden doors and trellises. Green foliage lines parts of the path, and the wooden structures have a rustic, weathered appearance. Light filters through the overhead lattice.
Photo by Chris Niccolls

But there is no free lunch, as an economist would tell you, and that applies to tradeoffs in physics, too. Improvements in one area nearly always come with a cost somewhere else. What people may see in low light is reduced light-gathering efficiency because of the smaller pixels. Even though the total sensor surface area is unchanged, there may be some slight degradation when the iPhone drops to its binned 12-megapixel output in really low-light situations. That happens below roughly 100 lux, which covers dim indoor scenes and dark nighttime conditions.
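A similarly simple sketch covers that binned mode. It assumes a plain 2x2 bin, which is how quad-pixel sensors are commonly read out; Apple has not published the exact pipeline, so treat it as an illustration.

```swift
// Sketch of a 2x2 binned readout on a quad-pixel sensor.
// The plain 2x2 bin is an assumption; Apple's actual pipeline isn't public.
let fullResolutionMP = 48.0
let photositesPerBin = 4.0                                     // 2 x 2 photosites
let binnedResolutionMP = fullResolutionMP / photositesPerBin   // 12 MP

let photositePitch = 0.7                            // micrometers
let effectiveBinnedPitch = photositePitch * 2       // 1.4 µm, same as last year's pixels

print("Binned output: \(binnedResolutionMP) MP at ~\(effectiveBinnedPitch) µm effective pitch")
// Binning restores the effective pixel size, but four small photosites read
// together aren't a perfect substitute for one large one, which is where the
// slight low-light degradation can creep in.
```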

In his iPhone review, de With makes a really important point: “While the iPhone 14 Pro introduced a 48-megapixel sensor for its main camera, they almost doubled the physical size of the sensor compared to the iPhone 13 Pro. This year, the ultra-wide is the same physical size, but they crammed in more photo-sites. In ideal lighting, you can’t tell the difference. In low light, the expected noise reduction will result in some smudgier images you’d also get from the 15 Pro.”

This is precisely the situation here. In good light, the extra pixels are beneficial without their potential cost being visible. In low light, you don’t get the benefit and instead incur a cost.

Looking up through the branches of a large tree with lush green leaves. Sunlight filters through the leaves, casting dappled light and shadow patterns. The sky above is clear and blue.
Photo by Chris Niccolls

Beyond the hardware changes, Apple also updated the image pipeline on the iPhone 16 series, which powers the new Photographic Styles — a tool that’s a lot more than a filter. Photographers get far more versatility in editing and previewing, and Apple believes these features bring enormous benefits to image quality and editing capability.

Apple also mentions that third-party camera app developers can use different flavors of the iPhone camera API. Some go for faster capture, and that is likely where this delta will be most apparent: if a photographer uses a third-party app that forces shorter shutter speeds in low-light situations, the image quality issues will be more exaggerated.
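To make that concrete, here is a minimal Swift sketch of the kind of choice a third-party app makes through AVFoundation’s photo capture API, where photoQualityPrioritization tells the pipeline whether to favor speed or quality for each shot. Session setup, device selection, and permissions are assumed to be handled elsewhere, and the class name is illustrative.

```swift
import AVFoundation

// Sketch: how a third-party app can trade image quality for capture speed.
// Session configuration, device selection, and permission handling are
// assumed to exist elsewhere; only the speed-versus-quality knobs are shown.
final class QuickCapture: NSObject, AVCapturePhotoCaptureDelegate {
    let photoOutput = AVCapturePhotoOutput()

    func configure() {
        // The ceiling on how much processing the pipeline may spend per photo.
        photoOutput.maxPhotoQualityPrioritization = .quality
    }

    func snap() {
        let settings = AVCapturePhotoSettings()
        // A "fast" app picks .speed: less processing and shorter effective
        // exposures, which is exactly where the low-light delta shows up.
        settings.photoQualityPrioritization = .speed
        photoOutput.capturePhoto(with: settings, delegate: self)
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // Handle photo.fileDataRepresentation() here.
    }
}
```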

The Cost/Benefit Analysis

There’s an inherent pressure to keep up with the competition in terms of megapixel count and features in smartphones, so if Apple feels confident that for most users in most situations, the 48-megapixel sensor is better than the outgoing 12-megapixel one, the decision makes sense from every possible angle.

It’s tempting to think that each new camera model — whether that’s a smartphone or a full-fledged standalone camera — is nothing but improvements. That isn’t true for smartphones, and it isn’t true for full-frame flagship cameras either. For example, stacked image sensors deliver more speed, but almost invariably at the cost of peak dynamic range. For professionals and even many mid-level enthusiasts, that tradeoff often makes sense.

Close-up image of a green succulent plant with thick, fleshy leaves arranged in a rosette pattern. The leaves have smooth edges with subtle pinkish tips, creating a symmetrical and vibrant display.
Photo by Chris Niccolls

That’s why, for example, the Nikon Z6 III makes more sense for more people than the Zf, even though from a dynamic range standpoint, the Zf is superior. The benefits are worth the cost. The same can be said of Canon’s R5 II versus the R5, where Canon chose to take a hit to dynamic range in order to boost performance elsewhere. Sony did the same when it launched the a9 III.

With smartphones, the potential downsides of improvements are often even more obvious because sensors and lenses are just so small. Jamming more pixels onto a small sensor runs into the brute laws of physics. While computational photography and AI features often do an excellent job overcoming these limitations, as is the case on the iPhone 16 Pro, there are still situations where you feel the downside of more pixels.

There are other benefits to the upgraded pixel count for the ultra-wide camera that could tip the scales in favor of Apple’s choice. With the macro function, the extra pixels enable a higher magnification ratio at a higher megapixel count, all else equal. This benefit is most acutely felt in optimal conditions, of course, where the smaller pixels aren’t a limiting factor.
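One way to see the macro benefit is as cropping headroom: at the same close-focus distance, a 48-megapixel frame can be center-cropped to a 12-megapixel output with roughly twice the linear framing tightness, which the old 12-megapixel sensor could not do without dropping below 12 megapixels. A quick Swift sketch of that arithmetic, illustrative only and not Apple’s actual processing:

```swift
// Cropping headroom from the higher-resolution ultra-wide (illustrative only).
let newSensorMP = 48.0
let targetOutputMP = 12.0

// A centered crop containing targetOutputMP pixels spans this fraction
// of the frame's linear dimensions.
let linearCropFraction = (targetOutputMP / newSensorMP).squareRoot()   // 0.5
let framingGain = 1.0 / linearCropFraction                             // 2x

print("A \(targetOutputMP) MP crop frames the subject ~\(framingGain)x tighter")
// The outgoing 12 MP sensor had no such headroom: any crop dropped the output
// below 12 MP. The benefit only holds in good light, where the smaller
// photosites aren't the limiting factor.
```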

Apple knows this. The company tells PetaPixel that it weighs all of these factors when choosing which sensor modules to work with, and then makes those tradeoff decisions. Apple says that if there is image degradation, it happens in small, niche cases. It decided that the benefits of higher resolution for macros and landscapes outweighed the cost of slight degradation in low light, which it says represents a relatively small percentage of use cases.

Apple’s calculus makes a lot of sense for most people — but not for night owls.
