‘AI or Not’ is a Free Web App That Claims to Detect AI-Generated Photos

“AI or Not” is a free web-based app that claims to identify images generated by artificial intelligence (AI) when users upload them or provide a URL.

The app is powered by Optic, a company that bills its technology as the smartest content-recognition engine for Web3 and claims it can identify images made with Stable Diffusion, Midjourney, DALL-E, or a GAN.

“Optic AI or Not is a web service that helps users quickly and accurately determine whether an image has been generated by artificial intelligence (AI) or created by a human. If the image is AI-generated, our service identifies the AI model used (mid-journey, stable diffusion, or DALL-E),” Optic says.

“Our mission is to bring transparency to the media on blockchains so all communities can realize their creative and economic potential.”

The platform, spotted by DIY Photography, is very easy to use. Anyone can upload an image or provide a link to where it is hosted, and Optic AI or Not returns a verdict on whether the image is real or AI-generated in a matter of seconds.
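
For anyone who would rather script this kind of check than use the web form, the workflow boils down to posting an image (or an image URL) to a detection endpoint and reading back a verdict. The sketch below shows roughly what that could look like in Python; the endpoint URL, field names, and response format are placeholders for illustration only, not Optic’s documented API.

```python
# Hypothetical sketch of scripting an "is this AI?" check against a
# detection service. The endpoint URL, field names, and response shape
# are assumptions for illustration, not Optic's actual API.
import requests

DETECT_URL = "https://example.com/api/detect"  # placeholder endpoint


def check_image(path: str) -> dict:
    """Upload a local image file and return the service's JSON verdict."""
    with open(path, "rb") as f:
        response = requests.post(DETECT_URL, files={"image": f}, timeout=30)
    response.raise_for_status()
    return response.json()  # e.g. {"verdict": "ai", "generator": "stable_diffusion"}


def check_url(image_url: str) -> dict:
    """Ask the service to fetch and evaluate an image hosted at a URL."""
    response = requests.post(DETECT_URL, json={"url": image_url}, timeout=30)
    response.raise_for_status()
    return response.json()


if __name__ == "__main__":
    print(check_url("https://example.com/sample-photo.jpg"))
```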

The company says that AI or Not uses “advanced algorithms and machine learning techniques” to analyze images and then detect signs of AI generation.

“Our service compares the input image to known patterns, artifacts, and characteristics of various AI models and human-made images to determine the origin of the content,” Optic explains.
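
Optic does not publish the details of those algorithms, but one family of artifact cues that research in this area often examines is an image’s frequency spectrum, since some generators (GANs in particular) can leave telltale high-frequency patterns. The toy sketch below illustrates that general idea only; it is not Optic’s method, and the cutoff and threshold are arbitrary placeholders.

```python
# Toy illustration of a frequency-domain "artifact" cue, not Optic's method.
import numpy as np
from PIL import Image


def high_frequency_ratio(path: str, cutoff: float = 0.25) -> float:
    """Return the share of spectral energy above a radial frequency cutoff."""
    gray = np.asarray(Image.open(path).convert("L"), dtype=np.float64)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray))) ** 2

    h, w = gray.shape
    yy, xx = np.mgrid[0:h, 0:w]
    # Radial distance of each frequency bin from the spectrum's center,
    # normalized so the edge of the image sits at roughly radius 1.
    radius = np.hypot(yy - h / 2, xx - w / 2) / (min(h, w) / 2)

    high = spectrum[radius > cutoff].sum()
    return float(high / spectrum.sum())


score = high_frequency_ratio("photo.jpg")  # hypothetical input file
print("suspicious" if score > 0.05 else "probably fine")  # threshold is arbitrary
```

A production detector would combine many such hand-crafted and learned features in a classifier trained on large sets of labeled real and generated images; a single ratio against a hand-picked threshold is only a demonstration of the concept.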

Optic positions its service as a way to help users identify AI-generated images, especially in challenging cases, and avoid the problems that can come with their use, such as fraud or misinformation.

Does it Work?

PetaPixel decided to see if the platform could handle a range of images, from simple fakes that an astute human eye could spot to more challenging ones.

To start, Optic was given a photo Nikon recently released as part of its “Natural Intelligence” campaign, and AI or Not correctly recognized that it was indeed a real photo.

Next, PetaPixel gave it an AI-generated image a photographer created for a real estate client, and it again successfully recognized the picture as fake.

In February, an artist won a local photography competition with an aerial image of a surfer in an AI-generated ocean, and once again Optic recognized that the image wasn’t real.

The next set of results is less rosy for Optic. It wasn’t able to determine that an AI-generated photo of Tom Hardy as James Bond was not real; instead, it returned a rather unhelpful “I don’t know.”

When provided with a photo in which AI was used to change a person’s identity, Optic was unable to tell that the image had been altered.

But perhaps most damning, Optic could not tell that an image of former President Donald Trump kissing Anthony Fauci, created specifically to mislead audiences, was generated by AI.

It was also unable to identify an image of a wholly made-up social media influencer.

With some of these images, the platform’s inability to tell real from fake is understandable: Optic only promises to detect images created entirely by Stable Diffusion, Midjourney, DALL-E, or a GAN, which would explain the misses on the Tom Hardy image, the fake social media influencer, and the street photo with an altered face.

However, its inability to catch the fully AI-generated image of Trump and Fauci shows this platform still has a ways to go.
