AI image-generation tools have advanced to the point where people can mistake their output for authentic, non-AI-generated images, raising concerns about potential misuse.

OpenAI has already introduced watermarks for images generated with DALL-E 3 to maintain transparency about their source. The company is now developing a new tool that can distinguish real images from ones generated by DALL-E 3, its text-to-image model.

Generative AI tools from OpenAI will include C2PA metadata

In a post on its official blog, OpenAI announced that it is working on new methods to detect AI-generated content. The stated goals are to help researchers study content authenticity and to support the company joining the steering committee of the Coalition for Content Provenance and Authenticity (C2PA), which maintains a widely used standard for certifying digital content. The standard allows creators to tag and certify their content so its true origin can be verified.
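To make the provenance idea concrete, here is a simplified, hypothetical sketch of the kind of record C2PA attaches to a file. The real manifest is a signed binary structure embedded in the asset, not plain JSON, and the field names below (other than the `c2pa.actions` / `c2pa.created` assertion labels drawn from the C2PA vocabulary) are illustrative only.

```python
# Hypothetical, simplified C2PA-style provenance record.
# The real manifest is a cryptographically signed binary
# structure (JUMBF), not a plain Python dict.
manifest = {
    "claim_generator": "DALL-E 3",  # tool that produced the asset
    "assertions": [
        {
            "label": "c2pa.actions",
            "data": {"actions": [{"action": "c2pa.created"}]},
        },
    ],
    "signature": "<cryptographic signature over the claim>",
}

def was_ai_generated(m):
    """Illustrative check: does the manifest claim the asset was
    created outright (c2pa.created) rather than merely edited?"""
    for assertion in m.get("assertions", []):
        if assertion.get("label") == "c2pa.actions":
            actions = assertion["data"]["actions"]
            if any(a["action"] == "c2pa.created" for a in actions):
                return True
    return False

print(was_ai_generated(manifest))  # True
```

In the real standard, a verifier would also validate the signature chain before trusting any of these fields; the sketch skips that entirely.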

OpenAI says it will also integrate C2PA metadata into Sora, the company's video generation model, when it launches broadly. Like DALL-E 3, Sora is likely to be a premium tool, accessible only to paid subscribers. According to a previous report, Sora will be available to the public sometime in 2024.

OpenAI is building a new tool to detect DALL-E 3-generated content

As mentioned above, OpenAI is also working on a new tool that uses AI to detect DALL-E 3-generated images. More specifically, it predicts the probability that a given image was generated by the model. According to the company, an image can still be identified after it has been compressed, had its saturation adjusted, or been cropped, and the embedded signals are designed to resist attempts to strip information about the content's origin.
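OpenAI has not published the detector's internals, but the kinds of edits the company says it survives are easy to reproduce. The sketch below, using the Pillow imaging library, applies the three perturbations named above (JPEG re-compression, a saturation tweak, and a crop); the specific parameter values are arbitrary choices for illustration.

```python
from io import BytesIO
from PIL import Image, ImageEnhance

def perturb(img):
    """Apply the edits the article says the detector tolerates:
    heavy JPEG compression, a saturation boost, and a crop."""
    # Re-encode as a heavily compressed JPEG.
    buf = BytesIO()
    img.convert("RGB").save(buf, format="JPEG", quality=30)
    out = Image.open(buf)
    # Increase saturation by 50%.
    out = ImageEnhance.Color(out).enhance(1.5)
    # Crop away a 10% border on each side.
    w, h = out.size
    return out.crop((w // 10, h // 10, w - w // 10, h - h // 10))

demo = Image.new("RGB", (100, 100), "red")
print(perturb(demo).size)  # (80, 80)
```

A robustness evaluation would run the detector on both the original and the perturbed image and compare the predicted probabilities; the detector itself is not publicly available.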

The detection tool achieves roughly 98% accuracy on images created with DALL-E 3 and, just as importantly, rarely flags non-AI-generated images as AI-generated.

The company has already opened applications for access to the new image detection tool to a first group of testers, including research labs and research-oriented journalism nonprofits, and plans to collect feedback through its Researcher Access Program.
