Warping Reality: Adobe’s Neural Filters are Ripe for Mayhem

Face tuning apps have thrived for years in the mobile phone ecosystem, allowing users to make subtle (and sometimes not so subtle) changes to their appearance for a selfie-obsessed generation.

Some consumers use the tools to get closer to the generic celebface look that dominates the influencer world of the Kardashians. Others use them simply to erase a few years from their visage – perhaps turning a selfie into a headshot for professional use. We’ve even seen “fun” apps that age your face in a photo, built by questionable developers with unknown motives.

But whereas face tuning apps fall under the category of “fun,” there is something legitimizing about the incorporation of similar technologies into Adobe Photoshop – the powerful image editing software that has been so ubiquitous as to become a verb.

With the newly released version 22, Adobe introduced a set of Neural Filters – cloud-based tools, still in beta, that are driven by machine learning and GANs and controlled through a set of switches and sliders.
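Adobe has not published how the Neural Filters work internally, but GAN-based face editors of this kind typically map a photo to a latent vector, nudge that vector along a learned direction (one per slider, such as “age”), and decode the result back into pixels. The sketch below is purely illustrative – every name in it (encode, generate, age_direction) is a placeholder of my own, not Adobe’s API – but it shows the basic slider-to-latent-edit idea.

```python
import numpy as np

# Illustrative sketch only: Adobe has not disclosed Neural Filters' internals.
# Many GAN-based face editors work roughly like this: an encoder maps a photo
# to a latent vector, a slider moves that vector along a learned direction
# (e.g. "age"), and a generator decodes the edited vector back into an image.

rng = np.random.default_rng(0)
LATENT_DIM = 512

def encode(photo):
    """Stand-in encoder: returns a latent vector for the input photo."""
    return rng.standard_normal(LATENT_DIM)

def generate(latent):
    """Stand-in generator: would decode a latent vector back into pixels."""
    return latent  # placeholder for the synthesized image

# A hypothetical unit-length "age" direction, learned from labeled examples.
age_direction = rng.standard_normal(LATENT_DIM)
age_direction /= np.linalg.norm(age_direction)

def apply_age_slider(photo, slider_value):
    """Move the latent code along the age direction; negative values de-age."""
    z = encode(photo)
    z_edited = z + slider_value * age_direction
    return generate(z_edited)

younger = apply_age_slider(photo="biden.jpg", slider_value=-3.0)
```

The point is not the math but the interface: a single scalar slider quietly performs an edit that once required hours of skilled retouching.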

Whereas convincing image manipulation used to take a highly skilled operator hours of work, now anyone with an Adobe Creative Cloud subscription can make persuasive alterations to a photo. The implications are frightening. In this age of disinformation, one doesn’t need pristine output to move the masses. Indeed, a low-quality, intentionally slowed-down video of Nancy Pelosi was enough to convince some viewers that she was drunk and slurring her words. And even when viewers know a video is false, the power of visual imagery reinforces confirmation bias.

And thus any image can be easily altered and meme-ified, with the potential to go viral. Critical thinking skills are suspended. Damage done.

Fig 1: I started with a recent portrait of Joe Biden, age 77, and pulled the image into Adobe Photoshop’s Neural Filters

It is one thing to read about a technology, but when I tried it out for myself, I was filled with a sense of dread. Using a low-resolution, Creative Commons portrait of Joe Biden, I reverse-aged him about 40 years in two minutes.

Fig 2: Moving the age and hair sliders removes skin blemishes and wrinkles and increases the density of hair. The image is sent to the cloud for processing.

Adobe’s promotional materials use the example of a baby looking away from the camera. A few tweaks later, she gazes in the same direction as her mother. The presentation of such innocuous, anodyne use cases ignores the obvious potential for abuse.

Fig 3: The “gaze” and “head direction” sliders almost imperceptibly shift the direction he’s looking, creating a more engaged and alert appearance.

As Washington Post photo editor Olivier Laurent tweeted, “I think it’s time for the developers at Adobe to realize that not all ideas are good ideas, especially at a time when their ‘tools’ are making it easier for some to manipulate photos and for people to question whether what is shown is real or not.”

Pandora’s box is already open, and there is no turning back the availability of technologies like deepfakes, 100% synthetic faces, and other reality-bending tools like Adobe’s Neural Filters. It will only be minutes before the next Macedonian teen generates the fake news – now accompanied by photo-realistic, synthetic images – that sways an election somewhere in the world. Maybe even in your backyard.

Photography continues its slow march away from reality. Even with burst-mode enabled phones, we seem more intent than ever on capturing life as we want to remember it, not as it was.


About the author: Allen Murabayashi is the Chairman and co-founder of PhotoShelter, which regularly publishes resources for photographers. Allen is a graduate of Yale University, and flosses daily. This article was also published here.
