Ideogram 2.0 Dangerously Lets You Generate Photorealistic AI Images of Anything

On the left, two women stand in front of a concert stage wearing "Swifties for Trump" shirts. On the right, a woman in a black suit holds hands with a man in a navy suit and red tie. The header text above the image reads "AI-GENERATED."
I was able to generate these images within seconds of landing on Ideogram’s website.

There is a widely held belief that AI images pose a danger to society. The technology can create fake images that pass as genuine, creating opportunities for bad actors to spread misinformation.

Step forward Ideogram, a startup that’s raised $16.5 million, and its latest model, Ideogram 2.0, which promises to “create images that can convincingly pass as real photos.”

However, unlike most AI image generators, which have safeguards in place, Ideogram 2.0 appears to have no restrictions on creating photorealistic images of real people in compromising positions.

Within a few seconds of signing up, PetaPixel was able to create photorealistic images of Kamala Harris and Donald Trump holding hands as well as Taylor Swift fans wearing “Swifties for Trump” t-shirts.

A group of people with their backs to the camera, wearing blue shirts that read "Swifties for Trump," stand in an audience at an indoor event. The stage is lit up with colorful lights and a "Trump" banner is visible in the background. The venue appears large and crowded.

A woman and a man, both in formal attire, are holding hands and smiling as they walk through a doorway. The man wears a red tie, and the woman wears a pearl necklace. An "AI-Generated" label is prominently displayed in the bottom right corner.

When given the exact same prompts, Midjourney refuses to generate these images. Even Elon Musk’s Grok image generator has been toned down after its initial chaotic launch.

Ideogram, a Canadian startup, makes no bones about its capabilities in its marketing literature.

“The Realistic style of Ideogram 2.0 enables you to create images that can convincingly pass as real photos,” reads the announcement for Ideogram 2.0. “Textures are significantly enhanced, and human skin and hair appear lifelike.”

A group of four young women stands at a rally holding up blue and red signs that read "Swifties for Trump." They are wearing matching blue T-shirts with the same slogan. The scene is in a crowded, dimly lit venue. A label in the corner reads "AI-GENERATED."

A man in a suit and red tie clasps hands with a woman in a blue blazer and necklace. They are smiling and standing in front of a blurred background featuring a U.S. flag. The image has an "AI-Generated" label in the bottom right corner.

As well as creating fake, photorealistic images, Ideogram also trumpets its ability to generate legible text. It pretty much nailed the “Swifties for Trump” t-shirts, which have been a hot topic of debate this week after Donald Trump shared very similar AI-generated images in a social media post that appeared to suggest Taylor Swift and her fans were endorsing him.

With the presidential election just around the corner, releasing a powerful tool like this with no safeguards, one that can so obviously be exploited by bad actors, is idiotic, reckless, and downright dangerous.

As of publication, Ideogram had not responded to PetaPixel’s request for comment.

Irresponsible AI

There’s no doubt that Ideogram 2.0 is an impressive tool. Most people who use it will presumably do so in good faith and create benign AI art.

However, it is glaringly obvious that an unrestrained AI image generator could be used for malicious means.

And it’s not just Ideogram: yesterday, PetaPixel editor-in-chief Jaron Schneider wrote about the AI tools on Google’s Pixel 9 Pro.

“Google doesn’t appear to be adding any level of transparency that AI was used to create these images,” he writes of the phone’s new Magic Editor, which allows people to manipulate real photos by using AI to generate new objects and elements.

There is a clear and urgent need for the nascent AI image industry to move responsibly and transparently. At the very least, these tools should have some guardrails.


Image credits: All images generated with Ideogram 2.0
