Another year means another trio of Samsung flagship smartphones. This time, the Galaxy S24 lineup is headlined by the AI-driven Galaxy S24 Ultra. The Galaxy S24+ and Galaxy S24 are also getting the AI treatment, but after spending some time with all three, it’s obvious where the primary focus lies, especially as it relates to mobile photography.
On paper, it would seem Samsung didn’t do a whole lot. Mind you, a 200-megapixel image sensor in the previous Galaxy S23 Ultra is a hard act to follow as far as sheer specs go, so the changes this time around are almost entirely software-driven in the Galaxy S24 series. We’ll be diving deeper into what the results look like in a full review at PetaPixel, but we also gleaned insights into what to expect from the various features.
The previous Galaxy S23 Ultra used the company’s 200-megapixel ISOCELL HP2 image sensor, and it’s not entirely clear whether the S24 Ultra switches to an ISOCELL HP2SX this time. Either way, the resolution and feature set around the sensor remain the same: you can capture JPEGs at full resolution, making it easier to crop down to a usable image without resorting to one of the telephoto lenses.
Samsung wants to change that by adding a new 50-megapixel telephoto lens with 5x optical zoom that can reach 10x at optical quality through an in-sensor crop of its larger sensor. This is also the lens that handles the 100x zoom, which Samsung claims will deliver better results in optimal conditions. There was no way to confirm that in the confines of a small ballroom, so only testing in the field will show whether that checks out. One thing Samsung reps did demo was the ability to read QR codes clearly using zoom at any distance close enough to see them.
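As a back-of-the-envelope sketch (assuming the 10x mode is a straight 2x center crop of the 50-megapixel telephoto sensor, which Samsung has not detailed), the arithmetic works out like this:

```python
# Illustrative math only: how a 2x in-sensor crop of a 50MP telephoto
# could yield "optical quality" 10x zoom without a second lens.

base_zoom = 5          # optical zoom of the telephoto lens
sensor_mp = 50         # full resolution of the telephoto sensor

crop_factor = 2        # assumed linear crop applied in-sensor
effective_zoom = base_zoom * crop_factor          # 5x * 2 = 10x
remaining_mp = sensor_mp / (crop_factor ** 2)     # 50 / 4 = 12.5 MP

print(f"{effective_zoom}x zoom from a {remaining_mp:g}MP center crop")
```

In other words, even after the assumed crop, there would still be a roughly 12.5-megapixel image left over, which is why the result can look closer to optical than digital zoom.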
Otherwise, the 10-megapixel telephoto with 3x optical zoom and 12-megapixel ultra-wide are holdovers from the previous S23 Ultra, so there aren’t likely to be any real surprises there.
Another thing Samsung addressed is a level of what I call “image certainty,” which is to say that its Super HDR will show a more accurate preview of what the final image will look like before taking it. “Nightography,” as Samsung calls it, gets a 60% boost in light gathering thanks to the larger effective pixels (measured in microns) produced by the Tetra²pixel binning process, and as before, Night mode is always available in low-light settings straight from the regular Photo mode.
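For context, the binning math is simple to sketch. The 0.6µm pitch below is the figure Samsung published for the ISOCELL HP2; whether the S24 Ultra’s sensor matches it exactly is an assumption here:

```python
# Illustration of Tetra(2)pixel binning: a 4x4 block of 16 physical
# pixels is combined into one larger effective pixel in low light.
# Figures are illustrative, based on the published ISOCELL HP2 specs.

full_res_mp = 200
pixel_pitch_um = 0.6              # published HP2 pixel pitch
bin_side = 4                      # 4x4 block -> 16 pixels per bin

binned_mp = full_res_mp / (bin_side ** 2)        # 12.5 MP output
effective_pitch_um = pixel_pitch_um * bin_side   # 2.4 um effective pixel
area_gain = bin_side ** 2                        # 16x light-gathering area

print(f"{binned_mp:g}MP at an effective {effective_pitch_um:g}um pitch "
      f"({area_gain}x the single-pixel area)")
```

That 16x jump in light-gathering area per output pixel is what makes the low-light gains plausible, even if the exact 60% figure is Samsung’s own.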
Samsung Throws in Extras for Expert RAW
Expert RAW stood out the last time around for a couple of reasons, not least of which was the ability to shoot RAW images at the full 50-megapixel resolution or 12 megapixels for low-light shots. Those options return, only this time Samsung has included a third, letting you shoot at 24 megapixels for something in between.
It also added a variable neutral density (ND) filter setting you can apply to any image you want to shoot within Expert RAW, letting you adjust the effect to emulate changes across about five stops. It works differently from simply adjusting exposure, offering an intriguing way to reduce the amount of light hitting the sensor and avoid overexposing images. Without a variable aperture on the phone, the filter could theoretically mitigate some of the challenges with high brightness or reflections. Unfortunately, the demo wasn’t in a setting where we could test that.
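A rough sketch of what “about five stops” means in practice: each stop halves the light reaching the sensor, so five stops is roughly a 32x reduction, the equivalent of a conventional ND32 filter:

```python
# Stop-to-transmission math for a variable ND filter. Each stop
# halves the light, so the rating follows powers of two.

def nd_transmission(stops: float) -> float:
    """Fraction of light passed after reducing exposure by `stops`."""
    return 2 ** -stops

for stops in range(1, 6):
    factor = 1 / nd_transmission(stops)   # conventional NDx rating
    print(f"{stops} stop(s): ND{factor:g}, "
          f"passes {nd_transmission(stops):.1%} of the light")
```

At the five-stop end, only about 3% of the light gets through, which is the kind of cut that makes long exposures or wide apertures workable in bright scenes.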
Interestingly, there is a setting to share images with tablets and PCs — provided they’re also Samsung products. The company says it makes for a smoother transfer process but also smacks of an intention to retain a walled garden. There’s an inherent advantage in moving RAW files over to a tablet or PC for a quicker editing workflow, though there were no Galaxy tablets or computers to try this out, so we’ll be digging into it further once we have the proper devices available.
Special modes like Multiple Exposure and Astrophoto are back again, but little has changed with them. Multiple Exposure lets you take two or more images and superimpose them onto each other, whereas Astrophoto is for taking long exposures to capture the cosmos when out in the darkest conditions. Neither of those shooting modes allows you to shoot at the full 50-megapixel resolution, so you’re limited to 12 megapixels in both cases. Along with the ND filter, you can turn any of these three features off in the Expert RAW settings so they don’t appear in the interface.
AI Editing Features Galore
AI is a buzzword that will be tossed around a lot in 2024, including by brands like Samsung, but it may be on to something with its Galaxy AI features. The editing suite within the Gallery app will have new modes like Edit Suggestion and Generative Edit that can be applied to images already shot on any of the Galaxy S24 phones, or even those pulled from Google Photos and other devices, including the iPhone.
Edit Suggestion uses an algorithm to analyze an image and then suggest what it thinks it needs to look better, be it removing a reflection from a window or shadows from a face, for example. Tap the suggestion you want and it goes to work applying the edit; once done, it lets you either go back or save the photo as a copy.
Generative Edit takes the same approach, only with manual control. You decide what and how to edit the photo, so if you want to cut someone or something out of an image to move them somewhere else in the frame, it’s possible to do that mostly non-destructively. In the examples I saw, the AI acquitted itself well in filling in missing shadows and textures after a subject was moved, but in another example, an erased lamp post still left a shadow on the ground to contend with. It’s not clear how well the feature will work with busier backgrounds or low-light settings where it’s harder to match pixels, but as is, it showed some real promise compared to what Google does with Magic Eraser.
Samsung will also add a watermark to any image touched by its AI editing features, including a disclaimer in the metadata to indicate the image was edited by AI, even if one were to crop out the visual watermark. There will likely be ways to skirt around that, but it’s a positive step toward transparency.
Other onboard AI features are also valuable, like the Interpreter app in the phone’s dropdown settings, which enables two-way translation during a live conversation. The screen splits in two so both parties can see the transcribed text in real time to keep the flow going. For photographers in unfamiliar places asking to take photos, it’s a feature that could prove valuable in crossing cultural and linguistic barriers, especially since it runs natively on the device, meaning you don’t need a cellular or Wi-Fi connection to make it work. It supports 13 languages to start, and Samsung says more will come. Live Translate performs a similar task for phone calls, making it possible to speak with someone who doesn’t know a word of your language during a call.
A Newer Way to Search
Whether you use the included S Pen (in the Galaxy S24 Ultra) or not, Circle to Search is a unique way to run a Google image search straight from the phone. All three Galaxy S24 phones support it, though only the Ultra comes with the pen. Either way, you can draw a lasso around anything in an image, be it a person, object, or landmark, and get information about it immediately. For example, if you’re unfamiliar with a particular landmark, you can take a snapshot of it and then look it up with Circle to Search.
It works with any photo, not just those captured with the phone, so for example, if someone were to send over a photo of them posing next to a landmark, you could circle it and find out both where it is and what it means. The same goes for things like apparel, food, and buildings. Circle an image of lasagna, and the search will find the closest places serving it. Do it with a pair of shoes, and you can see who sells them. It’s curiosity meeting consumerism in a very different way that cuts down on search time.
The examples I saw appeared pretty fluid, but some false positives or mystifying results are likely to come out of a feature like this. At the same time, it could be very informative, like circling a dog or cat to find out what breed it is. The same goes for plants or trees.
For the Galaxy S24 Ultra, the S Pen still acts as a remote shutter for photos, and you can also use a Galaxy Watch to control the shutter on any of these three phones. Vision Booster makes it easier to see their respective screens in brighter daylight — useful when shooting outside in the sun, thanks to 2,600 nits peak brightness on the Ultra, in particular.
It’s the same 6.8-inch Dynamic AMOLED 2X display (3200 x 1440) with a 120Hz refresh rate for the Ultra. All remnants of a curved display have disappeared, leaving a decidedly flat panel that is much easier to wield — and protect with a proper case. The other two Galaxy S24 devices get a slight 0.1-inch screen size bump from their predecessors and the same AI features driven by the same Snapdragon 8 Gen 3 processor the Ultra runs on.
Image credits: Photos by Ted Kritsonis