Patch for iOS Uses Neural Networks to Blur Your Portrait Backgrounds

[Image: Patch's blur feature]

The iPhone 7 Plus has a new Portrait mode that artificially blurs backgrounds using depth information captured by its two rear cameras. If you want a similar look but don’t have an iPhone 7 Plus, check out the new Patch app for iOS. It uses neural networks to generate faux depth and blur.

Created by developer Hal Lee, the app uses a neural network to find the boundary between a photo’s subject and its background, the same technique behind the automatic portrait selection being developed by Adobe scientists.

[Screenshots: the Patch app]

After figuring out where the subject is in a photo, Patch automatically blurs out the rest of the frame, creating “an effect similar to that from an expensive professional camera and lens.”
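Patch’s actual pipeline isn’t public, but the idea described above — keep the subject sharp, blur everything else — can be sketched with a binary mask. In this illustrative example (not the app’s code), a tiny grayscale “image” is stored as a 2D list, the background is blurred with a naive box blur, and the subject pixels are composited back in:

```python
# Sketch of masked background blur: mask value 1 = subject (keep sharp),
# 0 = background (replace with blurred pixels). Purely illustrative.

def box_blur(img, radius):
    """Naive box blur: each pixel becomes the mean of its neighborhood."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            total, count = 0.0, 0
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w:
                        total += img[ny][nx]
                        count += 1
            out[y][x] = total / count
    return out

def apply_portrait_blur(img, mask, radius):
    """Keep masked (subject) pixels sharp; use blurred pixels elsewhere."""
    blurred = box_blur(img, radius)
    return [
        [img[y][x] if mask[y][x] else blurred[y][x] for x in range(len(img[0]))]
        for y in range(len(img))
    ]

# 4x4 image: bright subject block on a dark background
img = [[0, 0, 0, 0],
       [0, 255, 255, 0],
       [0, 255, 255, 0],
       [0, 0, 0, 0]]
mask = [[0, 0, 0, 0],
        [0, 1, 1, 0],
        [0, 1, 1, 0],
        [0, 0, 0, 0]]
result = apply_portrait_blur(img, mask, 1)
```

A real implementation would use a Gaussian blur and a soft-edged (feathered) mask so the subject doesn’t have a hard cutout line, but the compositing logic is the same.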

“Since Patch doesn’t use depth information from dual cameras, it works on pretty much any iOS device,” Lee says.

[Image: example photo processed with Patch]

To test Patch’s powers, we opened up this portrait:

[Image: the test portrait]

[Screenshot: Patch computing the selection]

The automatic selection can be a bit rough on more complicated photos, so there are also manual tools for refining it. A brush tool lets you paint in areas that should be sharp, and an eraser tool lets you take out areas that should be blurred. Pinching to zoom into the shot gives you greater precision.
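Conceptually, the brush and eraser are just two ways of editing the same subject mask. The sketch below is an assumption about how such tools typically work (circular strokes toggling mask pixels), not Patch’s actual implementation:

```python
# Sketch of mask refinement: a "brush" marks pixels as subject (1, kept
# sharp) and an "eraser" marks them as background (0, blurred).
# The circular-stroke shape is an assumed, typical design.

def stroke(mask, cx, cy, radius, value):
    """Set every mask pixel within `radius` of (cx, cy) to `value`."""
    for y in range(len(mask)):
        for x in range(len(mask[0])):
            if (x - cx) ** 2 + (y - cy) ** 2 <= radius ** 2:
                mask[y][x] = value
    return mask

def brush(mask, cx, cy, radius):   # paint area back to sharp
    return stroke(mask, cx, cy, radius, 1)

def eraser(mask, cx, cy, radius):  # push area back into the blur
    return stroke(mask, cx, cy, radius, 0)

mask = [[0] * 5 for _ in range(5)]
brush(mask, 2, 2, 1)    # paint a small subject region in the center
eraser(mask, 2, 2, 0)   # then erase just the center pixel
```

Zooming in for precision simply maps each finger position to a smaller region of the mask, so strokes land on fewer pixels.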

[Screenshots: refining the selection]

Once you’re happy with your selection, you can choose from five blur levels, ranging from a slight softening to throwing the entire background way out of focus.
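Under the hood, a stepped control like this usually just maps each level to a blur radius. The radii below are invented for illustration; Patch doesn’t document its actual values:

```python
# Assumed mapping from the app's five blur levels to a blur radius in
# pixels. The specific numbers are illustrative, not from Patch.
BLUR_RADII = {1: 2, 2: 4, 3: 8, 4: 12, 5: 20}

def radius_for_level(level):
    """Return the blur radius for a level from 1 (slight) to 5 (heavy)."""
    if level not in BLUR_RADII:
        raise ValueError("blur level must be 1-5")
    return BLUR_RADII[level]
```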

Here’s what the app produced after a few casual swipes of our finger — we could have achieved a better result by spending a little time refining the edges further:

[Image: the resulting photo]

If you’re interested in giving Patch a shot, you can download the app for free from the iTunes App Store. Saving watermark-free photos is a $1 in-app upgrade.
