May 7, 2024, 7:09 PM, Source: Engadget

OpenAI says it can detect images made by its own software… mostly

We all think we’re pretty good at identifying images made by AI: the weird alien text in the background, the bizarre inaccuracies that seem to break the laws of physics and, most of all, those gruesome hands and fingers. However, the technology is constantly evolving, and it won’t be long before we can no longer tell what’s real and what isn’t. Industry leader OpenAI is trying to get ahead of the problem by creating a toolset that detects images created by its own DALL-E 3 generator. The results are a mixed bag.

The company says it can accurately detect pictures whipped up by DALL-E 3 about 98 percent of the time, which is great. There are, though, some fairly big caveats. First of all, the image has to be created by DALL-E, and, well, it’s not the only image generator on the block; the internet overfloweth with them. According to data provided by OpenAI, the system only managed to successfully classify five to ten percent of images made by other AI models.

Read more at Engadget


JustMac.info © Thomas Lohner - Impressum - Datenschutz