Midjourney Bans Words About the Reproductive System to Fight AI Porn


Midjourney has banned words relating to the human reproductive system in a bid to stop users from creating porn on the artificial intelligence (AI) text-to-image generator.

MIT Technology Review reports that the ban is a temporary measure to stop people from using Midjourney to create shocking or gory images.

If a user types a prompt like “placenta,” “fallopian tubes,” “mammary glands,” “sperm,” “uterine,” “urethra,” “cervix,” “hymen,” or “vulva” into Midjourney, the AI image generator flags the term as a banned prompt and refuses to process it.

In some circumstances, a user who attempts to use one of these words in a prompt is temporarily blocked for trying to generate banned content.
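Midjourney has not published how its filter works, but the behavior described above amounts to a simple keyword check paired with a temporary block after repeated attempts. The sketch below is purely illustrative: the word list, strike threshold, and block duration are assumptions, not Midjourney's actual values or code.

```python
import time

# Illustrative sketch only: a keyword-based prompt filter with a temporary
# user block. Midjourney's real moderation system is not public; the terms,
# MAX_STRIKES, and BLOCK_SECONDS values here are assumptions.

BANNED_TERMS = {"placenta", "fallopian tubes", "sperm", "cervix"}  # example subset
MAX_STRIKES = 3        # assumed number of banned-prompt attempts before a block
BLOCK_SECONDS = 600    # assumed length of the temporary block

blocked_until: dict[str, float] = {}  # user_id -> timestamp when the block lifts
strikes: dict[str, int] = {}          # user_id -> count of banned-prompt attempts


def check_prompt(user_id: str, prompt: str) -> str:
    """Return 'accepted', 'rejected', or 'blocked' for a submitted prompt."""
    now = time.time()
    if blocked_until.get(user_id, 0) > now:
        return "blocked: temporary ban in effect"

    text = prompt.lower()
    if any(term in text for term in BANNED_TERMS):
        strikes[user_id] = strikes.get(user_id, 0) + 1
        if strikes[user_id] >= MAX_STRIKES:
            blocked_until[user_id] = now + BLOCK_SECONDS
            strikes[user_id] = 0
            return "blocked: too many banned prompts"
        return "rejected: prompt contains a banned term"

    return "accepted"


print(check_prompt("user-1", "a watercolor painting of a lighthouse"))  # accepted
print(check_prompt("user-1", "diagram of the placenta"))                # rejected
```

A filter like this is easy to adjust as moderators observe how words are being used, which matches the provisional, frequently revised approach the company describes.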

Midjourney founder David Holz tells MIT Technology Review that the AI image generator has banned these terms as a provisional measure to stop people from generating violent or sexualized images while the company “improves things on the AI side.”

Holz explains that Midjourney moderators track how words are being used and what kinds of images are being generated, and change the prompt bans from time to time.

Midjourney has a community guidelines page that lists the keywords automatically banned in the software. Other words relating to human biology are still allowed.

The huge amount of data that AI models such as Midjourney, DALL-E 2, and Stable Diffusion are trained on makes it extremely difficult to stop offensive images, whether sexual, violent, or biased in nature, from being generated. However, blocking certain prompts works as a stopgap measure against shocking, pornographic, and gory images being created with the software.

Midjourney’s ban on scientific words relating to the human reproductive system highlights how difficult it is to stop certain content from being created with AI generators.

AI-generated pornography is becoming increasingly common online. In December, a man was arrested for creating child porn images with AI software. The arrest was allegedly one of the first of its kind.

Then in January, a community of young Twitch livestreamers discovered that their images were on a deepfake porn site. AI had been used to depict them in sex acts without their consent or knowledge.

In the U.S., there is currently no federal legislation protecting people from having their images used without consent in deepfake porn or other AI-generated content. However, last year the U.K. announced plans to criminalize the sharing of any deepfake porn made without the subject’s consent, and those found violating the law can face jail time.
