Users Complain That Lensa AI Selfie Generator is ‘Sexualizing’ Their Photos
People are criticizing the viral photo editing app Lensa AI, claiming that its artificial intelligence (AI) selfie generator sexualizes their portraits and alters them beyond recognition.
Lensa soared in popularity this weekend to take the number one spot in the U.S. on Apple’s App Store charts — after its “Magic Avatars” AI selfie generator took social media by storm.
However, female users have condemned the AI photo app for sexualizing their portraits, making them slimmer, or anglicizing their features to an uncomfortable degree.
Saw a friend try the Lensa app, so I caved and tried the 50 avatar pack for fun (10 styles, 5 variations).
Some of the results are borderline racist cause it just served me random Asian girls that looked like the source material wasn’t adjust at all as if we all look the same 🤣 pic.twitter.com/BaQDbbHexu
— Xandra van Wijk (@xndra) November 26, 2022
In some cases, users have detailed how the AI selfie generator created entirely nude portraits from their fully-clothed original photographs.
not sure how they got my tiddies so accurate, i swear i didn't upload my nudes to lensa pic.twitter.com/A2bNrWByVe
— It's Yerrr Girl, Alyssa (@_trashbaby13) December 7, 2022
‘I Was Unrecognizable’
In an article for Insider, Laura Wheatman Hill says she was “shocked” when the app made an “unrecognizable, thin, and sexualized” version of herself.
I tried the AI photo app everyone was using. As someone with a history of disordered body image, it was very triggering to see the results. https://t.co/jArToemPHP
— Insider Life (@InsiderLife) December 6, 2022
Hill claims that the AI images Lensa created for her look “nothing” like her true self and she slams the app for “stretching” her face “until it was unrecognizable.”
She says Magic Avatars gave her a “tiny, hourglass waist” and “breasts that were nearly falling out of a red strapless top,” despite her only providing photos of herself from the shoulders up.
Anyone else get loads boobs in their Lensa pictures or just me? (I promise I didn’t upload any naked photos)
— Becky Steeden (@Rusty_Steed) December 4, 2022
In another piece for Wired, Olivia Snow describes how the Lensa app generated nudes from her childhood photos.
Bro- Lensa had me naked WAY too much and I’m not sharing my AI-generated titties but here’s some others: pic.twitter.com/iwRErFkwwg
— Erika Donohue (@donohue_erika) December 3, 2022
Snow writes “many users—primarily women—have noticed that even when they upload modest photos, the app not only generates nudes but also ascribes cartoonishly sexualized features, like sultry poses and gigantic breasts, to their images.”
“I, for example, received several fully nude results despite uploading only headshots.”
Finally caved in and used #lensa , the results were wild, either I was a child, naked, or a K-pop idol 💁🏻♀️ pic.twitter.com/rKQsJ38bAH
— 🍒cheekycherry11🍒 (@cc11games) December 4, 2022
Other users have accused “Magic Avatars” of whitening their skin and giving them more “western” features. Several women of color criticized the lack of nuance in Lensa’s AI-generated portraits of them.
Tried out the Lensa AI app and fed 20 photos of myself, and I have to say it really struggles with Asian faces. My results were skewed to be more East Asian and I’m absolutely not impressed. pic.twitter.com/WnyLKXQT8K
— Anisa Sanusi (@studioanisa) December 3, 2022
Bias in AI Image Generators
It is well documented that AI systems can exhibit biases that stem from how they are designed and the data they are trained on.
For example, machine learning software may be trained on a huge, unfiltered dataset that underrepresents a particular gender or ethnic group. And because unfiltered, web-scraped datasets also contain explicit imagery, models trained on them can even generate unintentional nudes.
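The representation problem is simple to illustrate. Below is a minimal Python sketch of the kind of audit an unfiltered, scraped dataset would fail; the group labels and counts are invented for illustration.

```python
from collections import Counter

# Invented demographic labels for a hypothetical scraped training set,
# one label per image. Real web-scraped datasets are vastly larger and
# mostly unlabeled, which makes auditing them even harder.
labels = ["group_a"] * 900 + ["group_b"] * 80 + ["group_c"] * 20

counts = Counter(labels)
total = sum(counts.values())

# A model trained on this distribution sees "group_a" faces 45 times
# more often than "group_c" faces, and its outputs skew accordingly.
for group, n in counts.most_common():
    print(f"{group}: {n} images ({n / total:.1%})")
```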
Real time watching everyone I know give their data and likeness away (because #vanity and I'm with them) to a poorly trained ML model and finally starting to care about facial recognition + racist/sexist AI as they try out Lensa. #productinclusionandequity #mlfairness #technerd
— sydneycoleman (@sydneycolemanSF) December 5, 2022
However, Lensa’s Magic Avatars marks the first time AI image generators have truly hit the mainstream, and social media users are only now becoming aware that these biases exist in the technology.
I spent like five minutes and found dozens of AI generated Lensa images with fucked up "signatures" in the corners. So they're not even trying to hide that all this shit is just pulling from real artists real work, huh? https://t.co/c38KCvCHgT pic.twitter.com/3ucAb0CoJS
— Daniel Danger (@tinymediaempire) December 6, 2022
The sudden popularity of Lensa AI has also ignited a wider discussion on social media around AI image generators being trained on datasets built from the work of real artists who were never compensated.
people worried about Lensa generating nudes seem to have no clue what AI is truly capable of
— tzvi (@bestnewuser) December 8, 2022
In response to the online criticism, Prisma Labs, the team behind Lensa AI, says it is working to prevent the accidental generation of nudes.
“To enhance the work of Lensa, we are in the process of building the NSFW filter,” Prisma Labs’ CEO and co-founder Andrey Usoltsev tells TechCrunch.
“It will effectively blur any images detected as such. It will remain at the user’s sole discretion if they wish to open or save such imagery.”
“Stable Diffusion neural network is running behind the avatar generation process,” Usoltsev explains. “Stability AI, the creators of the model, trained it on a sizable set of unfiltered data from across the internet.”
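Lensa’s exact pipeline is not public, but Stable Diffusion itself is open source, and the generate-then-detect approach Usoltsev describes resembles the optional safety checker that ships with the public model. Here is a minimal sketch using Hugging Face’s diffusers library and the publicly released runwayml/stable-diffusion-v1-5 weights (a GPU is assumed, and the per-user fine-tuning step that turns uploaded selfies into avatars is omitted).

```python
import torch
from diffusers import StableDiffusionPipeline

# Load the public Stable Diffusion v1.5 weights. The pipeline ships with
# a CLIP-based safety checker enabled by default: images it classifies
# as NSFW are replaced with blank (black) frames rather than blurred,
# which differs from the blur-based filter Prisma Labs describes.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
).to("cuda")

result = pipe("a stylized portrait of a person, digital painting")
image = result.images[0]

# nsfw_content_detected reports, per generated image, whether the
# safety checker fired on it.
if result.nsfw_content_detected and result.nsfw_content_detected[0]:
    print("Output was flagged as NSFW and blanked by the safety checker.")
else:
    image.save("avatar.png")
```

The point of the sketch is that NSFW filtering here is a classifier bolted onto the output of the generator: a model trained on unfiltered data can still produce explicit imagery, and the filter only decides whether the user sees it.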
me importing my selfies into the Lensa AI app pic.twitter.com/BDXseZ1rq0
— erin ♡ (@efillysux) December 2, 2022
“Neither us, nor Stability AI could consciously apply any representation biases; To be more precise, the man-made unfiltered data sourced online introduced the model to the existing biases of humankind. The creators acknowledge the possibility of societal biases. So do we.”