Apple is confident that the iPhone 13 Pro and 13 Pro Max can capture images that compete with the best the industry has to offer, and it built a new camera system to prove it. The result is a pair of devices that mark huge improvements over the phones most people will be upgrading from.
The company calls it “a dramatically more powerful camera system” and says the Pro models received the “biggest upgrade ever.” These are bold claims, even by Apple’s standards, but the stakes are certainly getting higher. Its usual competitors, like Samsung and Google, have pushed Apple hard over the years, and now Chinese brands are upping the ante with their own innovative features and output.
The iPhone 13 Pro and Pro Max don’t have any camera discrepancies. Unlike their predecessors, they’re each equipped with the same exact camera hardware and run the same software computations. It really only comes down to size and battery life as differentiators.
That means imaging results apply to both phones equally, which is great news for those who want to go “pro” with their iPhones. It’s just a question of whether the ends justify the claims.
Design and Camera Features
Those familiar with the iPhone 12 Pro models won’t find that a whole lot has changed on the outside with the 13 models. Apple retained the same dimensions, going with a matte finish on the back and stainless steel edges, the latter of which turn out to be major fingerprint magnets that few will notice because the phone will likely always be in a case.
Under the hood, Apple changed two key things. First is the newer A15 Bionic chip with an also-newer image signal processor. The second is the rear camera array, with a new image sensor on the primary wide camera that is the largest Apple has put inside an iPhone to date. It’s not clear exactly how big the sensor is, but it sticks to 12 megapixels (26mm equivalent), albeit with larger 1.9-micron pixels this time around. Apple’s sensor-shift optical image stabilization (OIS) also applies to both Pro models this time; last year, it was only in the iPhone 12 Pro Max.
While the ultra-wide and telephoto sensors remain the same, Apple did tweak both of them to work somewhat differently. The 12-megapixel ultra-wide (13mm equivalent) now doubles as a macro camera, though you don’t necessarily need to select that lens to see it in action. The company’s own software transitions from the primary to the ultra-wide once you move within macro range. For the 12-megapixel telephoto lens, Apple increased the focal length from 52mm to 77mm, effectively giving it a 3x optical zoom over the previous 2x.
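Those zoom figures follow directly from the 35mm-equivalent focal lengths, since Apple quotes zoom relative to the 26mm primary camera. A quick sanity check (the focal lengths come from the specs above; the rounding is mine):

```python
PRIMARY_MM = 26  # 35mm-equivalent focal length of the primary wide camera

def zoom_factor(tele_mm: float) -> float:
    """Optical zoom relative to the 26mm primary camera."""
    return tele_mm / PRIMARY_MM

print(round(zoom_factor(52), 1))  # iPhone 12 Pro telephoto -> 2.0
print(round(zoom_factor(77), 1))  # iPhone 13 Pro telephoto -> 3.0
```

The 77mm lens is really a 2.96x zoom, which Apple rounds up to 3x.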
Another feature worth noting is ProMotion, which gives both phones’ screens a 120Hz refresh rate for smoother navigation. That extends to navigating both Apple’s and third-party camera and editing apps. It’s also exclusive to the Pro models; neither the iPhone 13 nor the 13 mini has it.
Apple also increased the storage capacity on these devices, with the base model now sporting 128GB with options to increase that all the way up to 1TB. There is one caveat to the 128GB models, which is that ProRes video — coming in a later update — is limited to 1080p resolution. If you want 4K, you’ll have to go 256GB or higher. Storage figures more prominently here than perhaps any other time, given the focus on higher resolution video and shooting still images in RAW.
For Some Perspective
Apple exclaimed the leaps it believes these phones represent over their iPhone 12 predecessors, but I would argue the better comparison is with the iPhone 11 Pro devices. Only two years removed from when those launched in 2019, Apple’s progress on the mobile photography side may be best exemplified by that wider gap in time.
The metrics on paper certainly set the stage. Apple says the primary camera’s image sensor on the iPhone 13 Pro and Pro Max is 84% larger than its 11 Pro and Pro Max counterpart. The 13 Pros also have a wider aperture at f/1.5, which should bring in 2.8x more light. The ultra-wide camera also has a different sensor than what was in the 11 Pros, and it is supposed to draw in 92% more light. The telephoto camera doesn’t offer much of a difference in that regard, other than its 3x zoom.
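Those numbers roughly hang together under a simple back-of-the-envelope model, which is my assumption, not Apple’s math: if light gathered per pixel scales with pixel area and inversely with the square of the f-number, and the 11 Pro’s primary camera shot at f/1.8, the gains multiply like this:

```python
# Rough model: light per pixel ∝ pixel area × (1 / f-number)²
# Assumptions: an 84% larger sensor at the same 12MP resolution means
# roughly 1.84x pixel area; f/1.8 is the iPhone 11 Pro's primary aperture.
# Lens transmission and sensor efficiency are ignored.
sensor_area_gain = 1.84            # Apple: sensor 84% larger than the 11 Pro's
aperture_gain = (1.8 / 1.5) ** 2   # moving from f/1.8 to f/1.5

total_gain = sensor_area_gain * aperture_gain
print(round(total_gain, 2))  # 2.65
```

A 2.65x gain is in the same ballpark as Apple’s 2.8x claim; the remainder could come from factors this crude model ignores.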
Having never used either of the iPhone 12 Pro devices, I used my iPhone 11 Pro as a basis for comparison. Note that, aside from the telephoto camera, the other two retain the same focal lengths, so Apple’s numbers make it clear that you should expect a much easier time shooting in low-light when situated in the exact same spots with these newer iPhone 13 Pro devices.
All of this bears consequences for video recording, too, which is really where the iPhone has excelled over the years. By addressing the difficulty in making low-light images look better in all facets of the camera array, it would be reasonable to assume improvements for stills would also apply to video.
One of my concerns with Apple’s Camera app has been the linear guardrails it presents to mobile photographers who would prefer more flexibility. Shooting in RAW, or ProRAW, as Apple calls it, was a nice step in that direction. The iPhone 12 Pro and Pro Max introduced that feature, and Apple never pushed it to the iPhone 11 Pro and Pro Max, so right off the bat, I get to shoot at will in an uncompressed format (storage space permitting, of course) for the first time.
Night mode is no more or less accessible, though I do wish it offered more control. It’s a familiar refrain for me every time I try to command more out of the Camera app, but it’s at least inching in the right direction. It’s become easier to customize composition and output with simple sliders, finally allowing users to determine at least some of the fundamentals, like temperature and tone. Apple has long skewed toward the warmer and softer side for its standard output, which you no longer have to accept if you want more contrast or a cooler temperature.
Not surprisingly, Apple does try to automate it for users who aren’t interested or particularly knowledgeable. The best explanation comes by going to Settings -> Camera -> Photographic Styles. Here, you get a basic synopsis of what each style looks like and how you can adjust it. In all fairness, the differences are neither stark nor distinct when swiping through the example image Apple used to present them. If you’re so inclined, you can select one to become your preferred style whenever you launch the Camera app. From there, you get to adjust the tone and warmth to customize the composition further.
What is interesting is that the style you choose applies itself to an image in a localized fashion, meaning that it can dial it up for one aspect of an image, and not another. Rather than a blanket effect, kind of like a filter, you get congruency relative to what’s actually in the image.
None of this is possible with the iPhone 11 Pro and Pro Max — at least not with Apple’s own Camera app. You do get the same basic filters, but not the tools to decide on style. Slowly but surely, manual controls are leaking into Apple’s camera interface, even if software computation is working on the stylistic side — this is all good stuff.
Other than the new Cinematic mode, the options are still the same. All the photo modes are carryovers, except for macro, which works automatically and doesn’t appear as something you can turn on or off (until Apple releases an iOS update to add that toggle, which it says it will this fall).
A larger sensor is great, as is a wider aperture, but it’s the software that has to bring out the best in them, given that both are diminutive by photography standards. The A15 Bionic chipset comes with a newer Deep Fusion image signal processor and a faster Neural Engine. Apple initially unveiled Deep Fusion with the iPhone 11 Pro and Pro Max, and ever since, the idea has been to apply it in unison with Smart HDR to produce the best possible images. Smart HDR 4 stacks bracketed shots together to balance shadows and highlights, while Deep Fusion brings out greater detail. It’s even smart enough to render people differently within the same frame, though it can’t do the same for objects.
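Apple doesn’t document how Smart HDR merges its brackets, but the general idea behind stacking bracketed shots can be sketched in a few lines. This is a generic exposure-fusion sketch that weights pixels by how well-exposed they are, not Apple’s actual pipeline:

```python
import numpy as np

def fuse_exposures(frames: list) -> np.ndarray:
    """Blend bracketed exposures, favoring well-exposed pixels.

    frames: list of float arrays in [0, 1], all the same shape.
    A generic exposure-fusion sketch, not Apple's Smart HDR.
    """
    stack = np.stack(frames)  # (num_frames, H, W)
    # Gaussian "well-exposedness" weight: peaks at mid-gray (0.5),
    # so clipped shadows and blown highlights contribute little.
    weights = np.exp(-((stack - 0.5) ** 2) / (2 * 0.2 ** 2))
    weights /= weights.sum(axis=0, keepdims=True)  # normalize per pixel
    return (weights * stack).sum(axis=0)

# Toy example: one underexposed frame and one overexposed frame
dark = np.array([[0.05, 0.4]])
bright = np.array([[0.45, 0.95]])
fused = fuse_exposures([dark, bright])  # values pulled toward usable mid-tones
```

In each pixel, the result leans toward whichever frame captured that spot closest to mid-gray, which is why a merge like this can hold both shadows and highlights at once.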
Despite that, it’s supposed to be a killer combination, and at times, it can feel like it’s worked some magic. Just not all the time, and it’s a familiar bugaboo that creeps into certain shots. Whether it’s day or night, bright or dark, images are more than capable of coming out looking great. Apple has a knack for getting nice colors and realistic skin tones. White balance is stable, and images often capture nice details, regardless of distance.
The issue is with dynamic range, especially when the light source is bright, leading to an abundance of luminance in the shot. This could happen on a bright sunny day or with a street lamp in the frame. High contrast scenes present the biggest challenge. For instance, a foreground subject with some shadow will actually come out looking well-lit and detailed, but if there is a light source, like a bright sky or bulb, in the shot, it will likely be washed out. You’re left with a situation where you may need to shoot it again, only this time focusing on the light source to then try merging them later.
That’s not something the average iPhone shooter is going to do, especially when the software was designed to do it on its own. My only recourse was to lower the exposure, except that sometimes darkened the foreground subject too much, so I tried to find the happy medium by shooting in RAW and hoping for the best when editing in post afterward. Even the unique Photographic Styles Apple offers can’t rescue an image from this. Plus, when you shoot in RAW, the style algorithms lie dormant since you’re not saving HEIF or JPEG versions simultaneously.
This contrast issue doesn’t happen with every single image. In more moderate conditions, results come out looking much better, which is to say that the sun isn’t quite as high or the light source isn’t beaming right through the lens into the sensor. I came away impressed with results on a number of shots, even if I did notice subtle hints of luminance or noise.
Ultra-wide and Macro
Apple maintained the 120-degree field of view for the ultra-wide camera, so the perspective hasn’t changed. The output, however, definitely has, at least when I compared it to the iPhone 11 Pro. That phone’s ultra-wide camera was never available to shoot in Night mode to begin with — not that it would’ve been great with its f/2.4 aperture anyway. For the first time, Apple has made Night mode available on all three lenses.
A larger sensor and wider f/1.8 aperture make the iPhone 13 Pro and Pro Max’s ultra-wide more than ready to shoot at night. It’s not entirely an apples-to-apples (pardon the pun) comparison because the 11 Pro had a restricted ultra-wide lens in more ways than one. Still, this is what progress is all about, and in just about every respect, the iPhone 13 Pro trounced my iPhone 11 Pro. There were instances, particularly in bright sunny conditions, where I felt the 11 Pro’s ultra-wide did better, but these were often specific circumstances.
Then there was the Macro mode, which utilizes the ultra-wide lens. As on phones from other manufacturers that got there first, you can get into macro directly from the ultra-wide lens, though you do have to move in really close and then pull back a bit to see the subject come into focus. Doing so from the primary camera caused a jittery transition that Apple smoothed out with the iOS 15.0.1 update.
It’s nice to have a proper macro to work with, though I do see room for improvement. It lets you get as close as two centimeters from the subject but can be very finicky, with even the slightest movement, either from the subject or your hand, throwing focus off completely. I did manage pretty good results shooting with it; I would only like to see a little more stability applied to it.
With the telephoto lens’s longer focal length, the iPhone finally has a proper way to get closer to a scene from the same distance. It does come at the cost of gathering light, which explains the tighter f/2.8 aperture. That’s a big reason why it’s not all that good for night shots, though you could manage decent results if you shoot at dusk or in blue hour.
This lens is more about getting closer when lighting is abundant. Not only that, but you also have to consider the additional optical reach in relation to how Portrait mode works. The iPhone 13 Pro and Pro Max do necessitate the shooter standing farther back in most cases, as I noticed when I used the same mode with the 11 Pro. Apple still set the distance to 2.5 meters, but I would say it’s actually more like 3.5 meters to frame the exact same shot if you were using both phones’ telephoto lenses side-by-side. You do have the option to select the primary lens as an alternative (as with any other iPhone offering the same thing) in case framing becomes problematic.
Portraits should come out looking better for indoor shots, including when using the different light effects available. I’m just not sure I saw a dramatic difference for other shots. The iPhone 11 Pro may have done slightly better outdoors in brighter conditions.
Night and Low-Light
It’s about time Night mode came to all three lenses, even if results do vary when night falls and the lights dim. I found the most success shooting during blue hour, albeit with light sources that didn’t cause problems for any of the camera modules. The contrast and colors looked gorgeous in some shots, which was great to see. When I did capture a night scene with a well-lit subject, it was hard to argue with the results.
That said, some recurrent issues come up in night and low-light shots. I already mentioned the limits on dynamic range, and it’s when shooting at night that they become more overt. It’s one thing to shoot a building reflecting light, and another to shoot a night scene with bright signage or lamps. In the latter case, the mode simply can’t keep those light sources and the darker areas in check. Rampant luminance aside, you might also find noise creeping in, along with the odd image flare.
Moreover, these phones struggled with light sources that were farther away. When I shot a cityscape from afar on a tripod, I was surprised to see the lack of detail and color. Even if I attempted to override the mode by forcing a longer exposure and manually lowering the exposure brightness, I would end up with an overexposed shot anyway. Trying it in RAW also didn’t fix it. I suspect Apple’s Smart HDR 4 overcompensates by ramping up the ISO on too many of the bracketed shots it composites together. I am speculating, as I have no way to confirm it; it’s just a hunch based on what I saw while shooting in these conditions.
Whether I’m on the right track or not, Apple should address these shortcomings through future iOS updates because the troublesome elements are surely software-related. There’s no doubt the iPhone 13 Pro and Pro Max will perform better than the 11 Pro could in the same conditions. I’m just not sure the results were entirely commensurate with the improvements associated with Apple’s newer sensor.
Video is possibly the bigger story, though I won’t go too in-depth about it here since we at PetaPixel focus on still photography performance. I’ve always admired the iPhone’s ability to record solid video footage, and the new Cinematic mode brings another flavor with a mix of Hollywood to boot. I look at it like Portrait mode for video, which is kind of how it looks as a finished product anyway. This is hardly new, given that Samsung and LG offered manual and bokeh video effects in their phones going back years.
What Apple did differently is jazz up the feature to make it feel like directing a scene. It works quite well, as it can automatically focus on a subject entering the frame, including when said subject is a person who starts talking. After you’re done, you can go into the Photos app, select Edit, and make cool adjustments. Change focus or choose a different aperture, with the timeline showing you where you changed focus as you were initially recording the clip. An upcoming macOS update will let you edit Cinematic footage in iMovie and Final Cut.
It’s pretty neat, and it’s exclusive to the iPhone 13 lineup, including the regular 13 and 13 mini. It does have some walls you have to play within, like its 1080p at 30fps cap. It also only works with the primary and telephoto lenses, and since the regular 13 and 13 mini don’t have telephoto shooters, you’re limited to the primary camera on those two models. The lack of 4K will probably turn off seasoned videographers who want more pixels, and ProRes won’t be coming until a future iOS 15 update before the end of the year.
A Solid Camera That is Legitimately Apple’s Best
iPhone 13 Pro and Pro Max users may come away thinking these two produce outstanding results, and they would certainly not be wrong. These devices are certainly capable of that, though I wish they were more consistent. For every couple of great shots, I was left with one that I knew a competing handset could do better. The mobile photography arms race is tough, and as much as people like me clamor for bigger sensors and better optics, I recognize the real battle is in the coding arena.
That said, Apple says that the cameras in the iPhone 13 Pro and Pro Max are its best ever, and they certainly are.
Apple has made some big strides compared to where it was in 2019 and through most of 2020, before the iPhone 12 models launched. From that perspective, which is the one most buyers will bring to these devices, the technological leap is certainly noticeable and worth gravitating towards. Additionally, the iPhone 13 Pro retains its position as a dependable video recorder, and Cinematic mode holds some real promise, provided it can get to 4K to appeal to a wider subset of users.
Are There Alternatives?
Google will be launching the Pixel 6 soon, and that may signal a big leap forward for that line, though until we know more about what Google has in store, that is pure speculation. The Samsung Galaxy S21 Ultra can hold its own very well against all comers, and it includes a similar feature to Cinematic mode in the form of Portrait Video, although it lacks the same level of post-production sophistication.
Then there are Chinese brands that are pushing the envelope in mobile photography. The Vivo X70+ may be the camera phone of the year if it lives up to the company’s hype. Xiaomi has also impressed with its Mi 11 Ultra. The downside with those phones is that they’re not as readily available in North America, and you may run into issues with 5G connectivity because of the lack of band support.
Should You Buy It?
Yes, so long as you’re upgrading from an older iPhone that’s not the iPhone 12 Pro or Pro Max. In that case, the upgrades are noticeable and worthwhile. Another PetaPixel staff member upgraded to the iPhone 13 from the XS and reports the improvements are monumental from that perspective. Coming from the 12 Pro models, though, you would have to really want that buttery smooth 120Hz display, the ability to shoot in Night mode with all three lenses, access to Cinematic mode, and more zoom on the telephoto lens in order to see the draw to upgrade. If you can wait, then you may benefit more when Apple releases its next iPhones in 2022.