The case for content authenticity in an age of disinformation, deepfakes and NFTs


Did you hear about the award-winning documentary photographer from an esteemed photo agency whose faked images of North Macedonia’s industrial cityscapes tricked even the skilled eyes of the experts at a recent French photojournalism festival?

Despite the random computer-generated bears that Jonas Bendiksen added to his pictures, nobody noticed his photos had been intentionally altered with computer software. That is, until Bendiksen himself pointed out the fraud using a Twitter account he created using a pseudonym.

His point was clear: If fake images can dupe the pros, imagine how hard it is for the rest of us to know what’s authentic. He wanted to see how far he could get “before the guards woke up. They never woke up.”

A wake-up call

Our inability to distinguish fantasy from reality in digital images is a wake-up call. In an increasingly fragmented media landscape, we are witnessing extraordinary challenges to trust in media. Now, the same powerful, easy-to-use tools used to make and share legitimate content are also deployed to create and spread disinformation and misinformation.

This isn’t a problem that’s going away. There could be 100 times more visual content by 2027, according to one study. One expert quoted in Nina Schick’s book Deepfakes: The Coming Infocalypse estimates that synthetic video may account for as much as 90% of online video in just three to five years – meaning it will be generated partially or entirely by artificial intelligence (AI), not by humans. Already, deepfake videos like deeptomcruise, created with the help of AI and machine learning algorithms, are incredibly convincing (hint: it’s not actually Tom Cruise).

A study this year in the journal Nature on the rise of misinformation online found that people are more focused on sharing what they think will boost their social status than on sharing what is true. Governments and bad actors already know this and exploit it to spread misinformation (unintentional deception) or sow disinformation (deceit with intent).

Giving credit where it’s due

Getting at the truth isn’t the only thing at stake. Understanding the authenticity of content is also a big deal for the authors of the content – the creators, the creative professionals and the communicators.

Content creators often go uncredited and unpaid when their images are repurposed into an endless variety of re-edited viral memes that stray from their original intent or purpose. In the red-hot market for non-fungible tokens (NFTs), protecting the creators of these original pieces of “cryptoart” should be just as important as protecting the collectors who buy them.

From photos to deepfake videos to NFTs and other digital file types, true transparency into how a piece of digital content is created and changed is critical to determining whether we can trust the source. What’s real, what’s fake, what’s valuable, who created it and with which editing tools – as well as what changes were made to an original piece – are the questions we should all be asking.

How to know what is authentic

These questions arise as many of us wrestle with new terms to help us understand what is real and what is fake. Deepfakes, cheap/shallowfakes, NFTs, blockchains, synthetic media, content authenticity and provenance – knowing these buzzwords in the context of our increasingly digital lives is imperative. (Need help? Scroll for our content authenticity primer.)

What these words mean for content creators and society is the foundation of our efforts leading the Content Authenticity Initiative (CAI). Building on Adobe’s heritage of providing trustworthy and innovative digital solutions for our customers, the CAI is a collaborative community of leading media and technology companies as well as individual creators, technologists, journalists, activists, and leaders addressing misinformation and content authenticity at scale. What started in late 2019 as a venture between Adobe, Twitter and The New York Times has now grown to more than 375 members over the past two years – including AFP, Arm, the BBC, Getty Images, Microsoft, Nikon, Qualcomm, Truepic, The Washington Post, and many others.

As a former war photographer and media executive who has dedicated my career to getting at the truth, often at great personal risk, I strongly believe that having a shared understanding of objective facts can help us all make more thoughtful decisions when consuming media.

The CAI and C2PA

With the CAI, we’re busy raising awareness, advocating, and creating educational programs. In the Coalition for Content Provenance and Authenticity (C2PA), we are among many contributing to the creation of an open standard, a blueprint for a new technical specification (currently open for public feedback) for digital content “provenance.”

This is a digital chain of custody, like a manifest, that shows where content came from, who made it, and the journey it has been on since it was published, including edits and other manipulations. The new C2PA opt-in, open-source attribution standard is a powerful example of how a diverse group of companies can come together to collaborate on what it means to take responsibility for the entire media ecosystem.
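To make the chain-of-custody idea concrete, here is a minimal sketch of how provenance entries could be linked together, with each new entry recording a hash of the one before it. This is an illustrative toy, not the actual C2PA manifest format (which uses standardized claims, assertions, and cryptographic signatures); every field and function name below is hypothetical.

```python
import hashlib
import json

def entry_hash(entry):
    """Deterministic hash of a provenance entry (canonical JSON)."""
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def new_manifest(creator, asset_hash):
    """Start a provenance chain with a creation entry."""
    return [{"action": "created", "by": creator,
             "asset_sha256": asset_hash, "prev": None}]

def record_edit(manifest, editor, tool, new_asset_hash):
    """Append an edit entry linked to the hash of the previous entry."""
    manifest.append({"action": "edited", "by": editor, "tool": tool,
                     "asset_sha256": new_asset_hash,
                     "prev": entry_hash(manifest[-1])})

def verify_chain(manifest):
    """Check that every entry points to the hash of its predecessor."""
    return all(cur["prev"] == entry_hash(prev)
               for prev, cur in zip(manifest, manifest[1:]))

# Example: a photo is captured, then retouched with an editing tool.
original = hashlib.sha256(b"raw image bytes").hexdigest()
retouched = hashlib.sha256(b"retouched image bytes").hexdigest()

manifest = new_manifest("photographer", original)
record_edit(manifest, "photo desk", "image editor", retouched)
print(verify_chain(manifest))  # True: the chain of custody is intact

tampered = [dict(e) for e in manifest]
tampered[0]["by"] = "impostor"  # try to rewrite history...
print(verify_chain(tampered))   # False: the link to the first entry breaks
```

In the real specification, each claim is also cryptographically signed, so even the most recent entry cannot be silently altered; the toy above only detects tampering with entries that a later entry already points to.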

It’s going to take time and everyone’s participation to make true content authenticity a reality. I’m confident that together, we can begin to restore the collective trust and understanding that right now seems so challenging.

The C2PA is only one of the collaborative efforts the CAI has worked on since its inception just two years ago. We will have more updates to share during Adobe MAX 2021 (free and all virtual from Oct. 26-28) as we continue to help solve fundamental challenges of content authenticity.

It’s going to take true industry collaboration to fight misinformation with attribution and verifiable truth of content, but with the CAI, we’re committed to ensuring that creators get credit for their digital work and that consumers have a way to verify whether digital content has been altered and to what degree.

To stay up to date with the CAI and our open, extensible approach for providing media transparency to allow for better evaluation of content, sign up to become a CAI member organization.


The vocabulary of Content Authenticity

With the volume of digital content increasing, people want to know that what they see online is authentic. Currently there’s a lack of transparency. People can be misled when they don’t know who is altering content and what content has been changed. Content authenticity is when there is proper content attribution for creators and publishers, which helps ensure trust and transparency online.

Other important concepts and definitions around content authenticity: