

AI image generation is improving rapidly, and the next leap will be greater control over its outputs. The ability to make specific, granular changes will open new creative possibilities, but it will also make misinformation harder to spot.
This demo of FacePoke, making detailed edits to the Mona Lisa, shows what's coming:
This is insane. Detailed edits to the Mona Lisa using FacePoke. pic.twitter.com/Ry3eUXYCLo
— Victor Mustar (@victormustar) March 23, 2023
For creators and AI artists, more control is exciting. But with granular editing power, the potential for subtle manipulation grows. It's easier now to create content that looks real but is either entirely fabricated or partly altered. Seeing is no longer believing.
The impact is already being felt. Misinformation can hurt people or divert attention during critical moments: viral deepfakes related to Hurricane Helene could misdirect essential resources or waste valuable time.
AI-generated images are becoming increasingly difficult to detect. pic.twitter.com/ZM1rPjJcVP
— Mario Nawfal (@MarioNawfal) March 26, 2023
This is exactly where Trueshot is needed: reporting on real-world events with verifiable images. A timestamped NFT proves an image is authentic the moment it's captured. As image editing becomes easier, the need for authenticity grows stronger, especially in fast-moving news.
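Conceptually, such a capture-time proof can be as simple as a cryptographic hash of the raw image bytes paired with a timestamp, which can then be anchored on-chain (for example, minted as an NFT). The sketch below illustrates the idea only; the function names are hypothetical and Trueshot's actual pipeline is not described in this post:

```python
import hashlib
import time

def capture_proof(image_bytes: bytes) -> dict:
    """Fingerprint an image the moment it's captured (illustrative only).

    The hash binds the proof to these exact bytes; the timestamp records
    when the proof was made. In a real system this record would be
    anchored on-chain so it can't be backdated.
    """
    return {
        "sha256": hashlib.sha256(image_bytes).hexdigest(),
        "captured_at": int(time.time()),  # Unix time at capture
    }

def verify(image_bytes: bytes, proof: dict) -> bool:
    """An image matches its proof only if every byte is unchanged."""
    return hashlib.sha256(image_bytes).hexdigest() == proof["sha256"]

original = b"raw sensor data"
proof = capture_proof(original)
print(verify(original, proof))          # True: untouched image
print(verify(b"edited pixels", proof))  # False: any edit breaks the hash
```

Because any edit, however subtle, changes the hash, a verifier can tell whether a published image is byte-for-byte the one that was captured, without needing to judge its contents by eye.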
The future of digital content creation is exciting, but it comes with real challenges. As we gain more control over image output, we need tools to protect what's real. Trueshot provides that.