蓝狐 | Oct 02, 2025 02:03
OpenAI's Sora2 generates video that is essentially indistinguishable from reality, even capable of 'creating something out of nothing.' AI now needs Crypto to help verify authenticity, or people will no longer be able to tell what is real. Unlike the earlier focus on simply verifying whether a video file is genuine or altered, the task now is verifying the authenticity of the video's content itself. Forgery has stepped up a level, and simple hash values and watermarks are no longer sufficient.

To combat deepfakes, the first requirement is a content credential standard such as C2PA. On that foundation, cryptographic signatures can be embedded in the video's metadata, recording not only the hash but also the creation tool, timestamp, location, and even the creator. If the video comes from a real device (such as a smartphone camera), the credential can attest that it is authentic and not AI-generated; if it comes from an AI tool like Sora2, it is recorded as 'AI-generated.' Crypto's role is to anchor these credentials on a blockchain, giving each video something like a 'digital passport.'

Metadata alone is not enough, though. AI detection tools must also analyze the content itself, for example using detection models to flag pixel-level anomalies, physical inconsistencies (facial expressions, lighting errors), or audio-sync issues that indicate synthesis. And if attackers strip invisible watermarks (difficult, but not impossible) or produce 'hybrid videos' that splice real footage with AI-generated segments, detection algorithms must be continuously iterated, while the cryptographic layer records the entire history of modifications.

From the creator's perspective, Ethereum NFTs can tag original videos, bind the creator's identity, prove ownership, strengthen credibility, and guard against being supplanted by AI-generated deepfakes. The sketches below make each of these steps concrete.
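To make the credential idea concrete, here is a minimal Python sketch of building and signing such a manifest, assuming the `eth_account` library for ECDSA signing. The field names are illustrative assumptions, not the actual C2PA manifest schema:

```python
import hashlib
import json
from datetime import datetime, timezone

from eth_account import Account
from eth_account.messages import encode_defunct

def build_credential(video_bytes: bytes, tool: str, creator: str,
                     private_key: str) -> dict:
    """Build and sign an illustrative content credential for a video.

    NOTE: these field names are hypothetical; a real C2PA manifest
    follows the C2PA specification, not this ad-hoc schema.
    """
    manifest = {
        "content_hash": hashlib.sha256(video_bytes).hexdigest(),
        "creation_tool": tool,  # e.g. "iPhone 15 camera" or "Sora2"
        "ai_generated": tool.lower() == "sora2",
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "creator": creator,
    }
    # Sign the canonical JSON form of the manifest, so any later edit
    # to the metadata invalidates the signature.
    message = encode_defunct(text=json.dumps(manifest, sort_keys=True))
    signed = Account.sign_message(message, private_key=private_key)
    manifest["signature"] = signed.signature.hex()
    return manifest
```

Any later change to the metadata breaks the signature check, which is what lets a verifier trust the 'AI-generated' label rather than the file alone.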
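Recording the full history of modifications can be sketched as a hash chain, where each revision record commits to the hash of the previous one; in practice the chain head would be anchored on-chain as part of the 'digital passport.' A pure-Python illustration:

```python
import hashlib
import json

def append_revision(history: list[dict], new_video_bytes: bytes,
                    note: str) -> list[dict]:
    """Append an edit record that commits to the previous record's hash.

    Tampering with any earlier record changes its hash and breaks every
    later link, so the latest record alone attests to the whole history.
    """
    prev_hash = (
        hashlib.sha256(
            json.dumps(history[-1], sort_keys=True).encode()
        ).hexdigest()
        if history else "genesis"
    )
    history.append({
        "content_hash": hashlib.sha256(new_video_bytes).hexdigest(),
        "note": note,  # e.g. "color grade", "AI upscale"
        "prev_record_hash": prev_hash,
    })
    return history
```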
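Content-level detection relies on trained models, but the mechanics of a 'pixel-level' statistic can be shown with a toy frequency-domain check on a single frame (generative video sometimes leaves atypical spectral energy). This heuristic and its cutoff radius are assumptions for illustration only, not a real deepfake detector:

```python
import numpy as np
from PIL import Image

def high_freq_ratio(frame_path: str) -> float:
    """Toy statistic: share of spectral energy in high frequencies.

    Real deepfake detectors are trained classifiers; this only
    illustrates what 'pixel-level analysis' means mechanically.
    """
    gray = np.asarray(Image.open(frame_path).convert("L"), dtype=np.float64)
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(gray)))
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    yy, xx = np.ogrid[:h, :w]
    # "High frequency" = outside a circle of radius min(h, w) // 8
    # around the DC component (an arbitrary illustrative cutoff).
    high = (yy - cy) ** 2 + (xx - cx) ** 2 > (min(h, w) // 8) ** 2
    return float(spectrum[high].sum() / spectrum.sum())
```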
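Finally, tagging an original video with an Ethereum NFT amounts to minting a token whose metadata points at the signed credential. A `web3.py` sketch, where the RPC URL, contract address, and the `mint(address,string)` ABI are hypothetical placeholders:

```python
from web3 import Web3

w3 = Web3(Web3.HTTPProvider("https://mainnet.example.rpc"))  # placeholder RPC

# Hypothetical ERC-721 contract exposing a mint(to, tokenURI) function.
MINT_ABI = [{
    "name": "mint", "type": "function", "stateMutability": "nonpayable",
    "inputs": [{"name": "to", "type": "address"},
               {"name": "tokenURI", "type": "string"}],
    "outputs": [],
}]

def mint_provenance_nft(contract_addr: str, creator_key: str,
                        token_uri: str) -> str:
    """Mint an NFT whose tokenURI points at the signed credential manifest."""
    acct = w3.eth.account.from_key(creator_key)
    contract = w3.eth.contract(
        address=Web3.to_checksum_address(contract_addr), abi=MINT_ABI)
    tx = contract.functions.mint(acct.address, token_uri).build_transaction({
        "from": acct.address,
        "nonce": w3.eth.get_transaction_count(acct.address),
    })
    signed = acct.sign_transaction(tx)
    # Attribute is .rawTransaction on older web3.py versions.
    return w3.eth.send_raw_transaction(signed.raw_transaction).hex()
```

The token binds the creator's address to the video hash at mint time, which is what makes later AI-generated look-alikes distinguishable from the original.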