Apple released iOS 10.1 to public beta testers today, and it includes one of the most anticipated features shown off during the iPhone 7 keynote: the “Portrait” mode that fakes a depth-of-field effect.
Apple isn’t the first to leverage a dual camera to try to fake bokeh in portraits and other images, but the company seems determined to get it right. The software combines facial recognition with information from the iPhone 7 Plus’ two cameras to build a depth map, then applies a “blurring” effect based on that map.
Objects closer to the subject are blurred less, while objects farther away are blurred more, just as you would expect from a larger-sensor camera and a fast lens. MacRumors got its hands on the update yesterday, when it launched for developers, and demoed the effect in the video below:
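To make the idea concrete, here’s a toy sketch (purely illustrative, not Apple’s actual pipeline) of depth-dependent blurring: each pixel’s blur radius grows with how far its depth-map value sits from the subject’s depth, so the subject stays sharp while the background smears out. The function names and parameters are my own inventions.

```python
# Illustrative sketch of a "portrait mode" style effect: blur each pixel
# in proportion to its depth-map distance from the subject. This is a
# simplified stand-in for whatever Apple actually does on-device.

def blur_radius(depth, subject_depth, max_radius=8):
    """Map a pixel's depth to a blur radius: pixels at the subject's
    depth stay sharp; the radius grows with depth distance, capped."""
    return min(max_radius, round(abs(depth - subject_depth)))

def apply_fake_bokeh(image, depth_map, subject_depth):
    """Apply a per-pixel box blur whose radius comes from the depth map.
    `image` and `depth_map` are 2D lists of the same dimensions."""
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            r = blur_radius(depth_map[y][x], subject_depth)
            total, count = 0.0, 0
            # Average over a (2r+1) x (2r+1) window, clipped at edges.
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += image[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out
```

A real implementation would use a smoother kernel (e.g. Gaussian or disc-shaped, to mimic lens bokeh) and handle the depth-edge transitions carefully, which is exactly where the haloing artifacts discussed below come from.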
As you can see, the effect is rendered in real time on your screen, so you can adjust things and make sure your subject is positioned correctly.
Plus, each time you take one of these fake-bokeh portraits, the camera saves both the original and the depth-of-field version so you can compare them. Here are the before and after shots from MacRumors’ demo:
MacRumors later compares this portrait with one taken on the Sony a6300, and you can tell there’s still work to do. The difference is subtle, but you get some haloing around the main subject since the effect is rendered digitally. That obviously isn’t the case when the blur is an optical phenomenon.
Still, for a first release that’s only in beta, it seems to work pretty well, and it definitely gives photo lovers another reason to pick the larger iPhone 7 Plus over the iPhone 7. Because the effect relies on two cameras, Portrait mode won’t work on the smaller, single-camera model.
For more samples, check out this other, very quick demo by 9to5Mac: