Adobe publishes research study on disinformation

Deceptive content online is driving a loss of trust in the truth. A shared understanding of facts is critical to a functioning society. We can, should, and do argue about ideas, strategies and principles all day. But we will make no progress in solving problems if we cannot ground these debates in objective reality. Unfortunately, disinformation driven by deceptive content online is growing faster than ever. Every day we face split-second decisions about whether to believe what we are seeing. It’s a new world, where content can be easily edited to turn fact into fiction.

“Adobe has always been the leader in driving digital creativity. We’ve consistently evolved our technology to give creatives the world’s best tools and services they need to unleash their imagination. But we know that creative tools can be misused, and it is important to us to play a proactive role in mitigating the unintended and negative consequences of technology.”

Dana Rao, EVP and general counsel, Adobe

What is disinformation? Understanding consumer and creator perception

Disinformation is a multi-faceted problem, and there are many parts to the solution. Earlier this year we commissioned an opinion research study (memo and deck) among U.S. consumers and creative professionals to understand their perceptions about the scale and severity of manipulated content online, what needs to be done and who needs to do it.

A total of 1,200 people (800 consumers and 400 creative professionals) in the U.S. participated in a survey that was conducted from September 4 to September 18, 2020. We also conducted a similar study in the U.K. and the results can be found in this memo and deck.

In our research, we asked people what content they tend to trust online, how they distinguish between good and bad altered content, what tools they need to evaluate authentic content, and what solutions would help creative professionals uphold the integrity of their work.

Here’s what we heard:

Manipulated content is everywhere, and it’s eroding trust in everything

Theft and plagiarism online are rampant. The integrity of creative work is often compromised.

Consumers want to be armed with information to decide for themselves who and what to trust, and they know they can’t easily tell on their own whether an image has been altered

Context matters: whether an edit is good or bad depends on why it was made

Consumers draw distinctions between altering an image for art vs. altering an image in the news.

Roughly twice as many consumers worry about altering images to influence impressions of an issue or event (68 percent) or a politician (72 percent) as worry about altering an image for artistic expression (34 percent).

Consumers and creators believe everyone has a role to play and do not place sole responsibility for solutions on any one group

The data suggests that everyone has a role to play; in fact, focus group conversations raised concerns about placing too much responsibility on any one group. We agree that this is a problem to solve collectively. Many different groups play a part, including the media, governments, creators and technology companies, and consumers need to be empowered with information and solutions.

Adobe’s path forward

We know that disinformation is detrimental to consumers and disempowering to creators. We’re excited about the Content Authenticity Initiative (CAI) attribution tool we introduced in October. After a little more than a year of collaborating with CAI partners, we now have the first end-to-end system for image attribution, which will work on mobile devices, platforms and more. Through this system, creators can tag their work so that consumers are empowered to discover for themselves who created a photograph or video, along with details such as where it was created and how it was edited.
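To make the idea of attribution metadata concrete, here is a minimal sketch in Python of the kind of provenance record such a system might attach to an asset. The field names, helper function and use of a plain hash are illustrative assumptions for this post and do not reflect the CAI’s actual format or tooling.

```python
# Illustrative sketch only: a hypothetical attribution record of the kind the
# system described above could attach to an image. Field names and the hashing
# approach are assumptions for illustration, not the CAI specification.
import hashlib
import json

def build_attribution_record(image_bytes, creator, location, edits):
    """Bundle provenance details with a hash of the asset they describe."""
    record = {
        "creator": creator,                # who created the photograph or video
        "captured_at": location,           # where it was created
        "edit_history": edits,             # how it was edited, step by step
        "asset_sha256": hashlib.sha256(image_bytes).hexdigest(),
    }
    return json.dumps(record, indent=2)

# Example: a consumer-facing tool could recompute the hash of the asset it
# received and compare it against this record to check the details still match.
sample_image = b"...raw image bytes..."   # stand-in for real image data
print(build_attribution_record(
    sample_image,
    creator="Jane Photographer",
    location="Lisbon, Portugal",
    edits=["crop", "exposure +0.3"],
))
```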

Addressing problems created by technology is part of Adobe’s broader Digital Citizenship efforts, and we look forward to continuing to work with governments, platforms, publishers and tool providers to accelerate progress for society at large. As this research shows, the time is now for stakeholders to work together to address the problem of disinformation and restore trust in facts, truth and each other.