Users Complain That Lensa AI Selfie Generator is ‘Sexualizing’ Their Photos


People are criticizing viral photo editing app Lensa AI — claiming that its artificial intelligence (AI) selfie generator is sexualizing and changing their portraits beyond recognition.

Lensa soared in popularity this weekend to take the number one spot in the U.S. on Apple’s App Store charts — after its “Magic Avatars” AI selfie generator took social media by storm.

However, female users have condemned the AI photo app for sexualizing their portraits, making them slimmer, or anglicizing their features to an uncomfortable degree.

In some cases, users have detailed how the AI selfie generator created entirely nude portraits from their fully-clothed original photographs.

‘I Was Unrecognizable’

In an article for Insider, Laura Wheatman Hill says she was “shocked” when the app made an “unrecognizable, thin, and sexualized” version of herself.

Hill claims that the AI images Lensa created for her look “nothing” like her true self and she slams the app for “stretching” her face “until it was unrecognizable.”

She says Magic Avatars gave her a “tiny, hourglass waist” and “breasts that were nearly falling out of a red strapless top” — despite her only providing photos of herself from the shoulders up.

In another piece for Wired, Olivia Snow describes how the Lensa app generated nudes from her childhood photos.

Snow writes “many users—primarily women—have noticed that even when they upload modest photos, the app not only generates nudes but also ascribes cartoonishly sexualized features, like sultry poses and gigantic breasts, to their images.”

“I, for example, received several fully nude results despite uploading only headshots.”

Other users have accused “Magic Avatars” of whitening their skin and giving them more “Western” features. Several women of color criticized the lack of nuance in Lensa’s AI-generated portraits of them.

Bias in AI Image Generators

It is relatively common knowledge that AI systems can exhibit biases that stem from their programming and data sources.

For example, machine learning software could be trained on a vast, unfiltered dataset that underrepresents a particular gender or ethnic group. The sheer volume of images used to train AI can even cause the technology to create unintentional nudes.

However, Lensa’s Magic Avatars marks the first time AI image generators have truly hit the mainstream, and many social media users are only now discovering that these biases exist in the technology.

The sudden popularity of Lensa AI has also incited a wider discussion on social media around AI image generators being built using datasets trained on the work of uncompensated real artists.

In response to the online criticism, Prisma Labs, the team behind Lensa AI, says it is working to prevent the accidental generation of nudes.

“To enhance the work of Lensa, we are in the process of building the NSFW filter,” Prisma Labs’ CEO and co-founder Andrey Usoltsev tells TechCrunch.

“It will effectively blur any images detected as such. It will remain at the user’s sole discretion if they wish to open or save such imagery.”

“Stable Diffusion neural network is running behind the avatar generation process,” Usoltsev explains. “Stability AI, the creators of the model, trained it on a sizable set of unfiltered data from across the internet.”

“Neither us, nor Stability AI could consciously apply any representation biases; to be more precise, the man-made unfiltered data sourced online introduced the model to the existing biases of humankind. The creators acknowledge the possibility of societal biases. So do we.”