AI Image Generator Dropped by Computing Provider Over Nonconsensual Nude Pictures


An investigation into a controversial AI image generator, allegedly used to create what has been described as "child pornography," has led to the platform being dropped by its computing provider.

OctoML, a machine learning acceleration platform, has severed ties with Civitai after an investigation by 404 Media accused Civitai of "profiting from nonconsensual AI porn."

In an article published December 5, 404 Media reported on internal communications at OctoML that raised concerns over Civitai users creating sexually explicit material.

Messages on OctoML's Slack expressed alarm over a specific text-to-image model, the third most downloaded on Civitai, which, despite being labeled a "pretty SFW model", generates "unethical/shocking content—read something could be categorized as child pornography."

In the wake of the report, OctoML announced it was blocking the generation of all NSFW content on Civitai before going one step further and cutting all ties.

What is Civitai?

Civitai harnesses Stable Diffusion's AI image generation technology to make custom models. For example, a curated text-to-image model can be made specifically for anime pictures, or for ones that imitate real-life photography.

Of course, this means that bad actors can create models for synthetic pornography, including nonconsensual deepfakes and sexual images of children.

Civitai, which has $5 million in funding from VC firm Andreessen Horowitz, has been accused of promoting this type of material with a "bounties" feature: challenges that reward users for generating realistic images of real people.

In November, 404 Media found realistic-looking images of celebrities, influencers, and even private citizens that were sexual in nature and primarily depicted women.

Civitai was using OctoML's services, which in turn run on Amazon Web Services' servers.

“We have decided to terminate our business relationship with Civitai,” OctoML says in a statement to 404 Media. “This decision aligns with our commitment to ensuring the safe and responsible use of AI.”

In an interview with VentureBeat last week, Civitai founder Justin Maier defended his users' creation of sexual content.

"People that are there to make these NSFW things are creating and pushing for these models in ways that kind of transcend that use case," Maier tells VentureBeat. "It's been valuable to have the community even if they're making things that I'm not interested in, or that I prefer not to have on the site."

Last week, a report revealed that artificial intelligence apps that generate nude images of women from photos of them fully clothed are on the rise.


Image credits: Header photo licensed via Depositphotos.
