Adobe publishes research study on disinformation
Deceptive content online is driving a loss of trust in the truth. A shared understanding of facts is critical to a functioning society. We can, should, and do argue about ideas, strategies, and principles all day. But we will make no progress in solving problems if we cannot ground these debates in objective reality. Unfortunately, disinformation driven by deceptive content online is growing faster than ever. Every day we face split-second decisions about whether to believe what we are seeing. It’s a new world, where content can be easily edited to turn fact into fiction.
“Adobe has always been the leader in driving digital creativity. We’ve consistently evolved our technology to give creatives the world’s best tools and services they need to unleash their imagination. But we know that creative tools can be misused, and it is important to us to play a proactive role in mitigating the unintended and negative consequences of technology.”
Dana Rao, EVP and general counsel, Adobe
What is disinformation? Understanding consumer and creator perception
Disinformation is a multi-faceted problem, and there are many parts to the solution. Earlier this year we commissioned an opinion research study (memo and deck) among U.S. consumers and creative professionals to understand their perceptions about the scale and severity of manipulated content online, what needs to be done and who needs to do it.
A total of 1,200 people (800 consumers and 400 creative professionals) in the U.S. participated in a survey that was conducted from September 4 to September 18, 2020. We also conducted a similar study in the U.K. and the results can be found in this memo and deck.
In our research, we asked people what content they tend to trust online, how they distinguish between good and bad altered content, what tools they need to evaluate authentic content, and what solutions would help creative professionals uphold the integrity of their work.
Here’s what we heard:
Manipulated content is everywhere, and it’s eroding trust in everything
- 63 percent of consumers and 72 percent of creative professionals say they frequently come across fake images.
- In focus groups, people shared concerns that they lack the knowledge and training needed to identify images that have been altered to deceive.
- 74 percent of consumers and 80 percent of creative professionals worry altered images cause people to believe misleading information.
- 69 percent of consumers and 76 percent of creative professionals worry altered images drive distrust in the news.
- 63 percent of consumers and 74 percent of creative professionals worry that altered images cause people to tune out the news.
- 68 percent of consumers and 78 percent of creative professionals are concerned about altered images causing people to second-guess what they hear, even when talking to people they know. One consumer said, “If we can no longer trust the images we see online or in the news, you just get completely skeptical of everything, or people start checking out.”
Theft and plagiarism online are rampant. The integrity of creative work is often compromised.
- 75 percent of creative professionals say they have had their work stolen, plagiarized, or not properly credited on the internet, with 34 percent saying they’ve experienced this “many times.” One creator said, “I’ve had corporations steal my art. I have friends who were really hurt by having their photography stolen.”
- 86 percent of creative professionals worry about having their work stolen or plagiarized in the future.
Consumers want to be armed with information to decide for themselves who and what to trust, and they know they can’t easily tell on their own whether an image has been altered
- 70 percent of consumers and 79 percent of creative professionals back providing increased access to information about an image’s original source.
- 66 percent of consumers and 83 percent of creative professionals also back increasing access to information about how an image is made.
- 63 percent of consumers support placing a tag on digital images to inform people about their origins. 73 percent of creative professionals are in favor of this idea.
- 72 percent of consumers and 82 percent of creative professionals see a tag as a compelling way to empower people to decide for themselves whether they trust what they are seeing online. One consumer said, “I think with my lack of experience, I don’t know what to look for because I’ve never Photoshopped myself more than a filter.”
Context matters: whether an edit is good or bad depends on why it was made
Consumers draw distinctions between altering an image for art vs. altering an image in the news.
Roughly twice as many consumers worry about images being altered to influence impressions of an issue or event (68 percent) or a politician (72 percent) as worry about images being altered for artistic expression (34 percent).
Consumers and creators believe everyone has a role to play and do not place sole responsibility for solutions on any one group
- 35 percent of consumers and 39 percent of creative professionals put primary responsibility on the media, followed by companies that provide a platform to share images publicly (25 percent of consumers, 28 percent of creative professionals).
- 19 percent of consumers place responsibility on the government, and 24 percent of creative professionals do the same.
- A similar number – 19 percent of consumers and 22 percent of creative professionals – look to companies that provide image-editing tools to take responsibility.
- 19 percent of consumers place the onus on creative professionals, and an even greater number of creators hold themselves accountable (27 percent).
- 14 percent of consumers and 15 percent of creative professionals put the onus on consumers themselves.
Still, the data suggest that everyone has a role to play; in fact, focus group participants raised concerns about placing too much responsibility on any one group. We agree that this is a problem to solve collectively. Many different groups play a part here – the media, the government, creators, and technology companies – and consumers need to be empowered with information and solutions.
Adobe’s path forward
We know that disinformation is detrimental to consumers and disempowering to creators. We’re very excited about the Content Authenticity Initiative (CAI) attribution tool we introduced in October. After a little more than a year of collaborating with CAI partners, we now have the first end-to-end system for image attribution, one that will work on mobile devices, across platforms, and more. Through this system, creators can tag their work so that consumers are empowered to discover for themselves who created a photograph or video, along with details such as where it was created and how it was edited.
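To make the idea of an attribution tag concrete, here is a minimal, hypothetical sketch of provenance metadata that a creator might attach to an image. The field names and the attach_provenance helper are illustrative assumptions only; they do not represent the actual CAI specification or any Adobe API, and a real attribution system would embed and cryptographically sign this information rather than write a plain sidecar file.

```python
# Hypothetical sketch of attribution metadata for an image.
# Field names and the helper below are illustrative; they do not reflect the
# actual Content Authenticity Initiative (CAI) format or any Adobe API.

import json
from dataclasses import dataclass, field, asdict
from typing import List


@dataclass
class ProvenanceRecord:
    creator: str                                     # who created the photograph or video
    captured_at: str                                 # when it was created (ISO 8601)
    captured_location: str                           # where it was created
    edits: List[str] = field(default_factory=list)   # how it was edited


def attach_provenance(image_path: str, record: ProvenanceRecord) -> str:
    """Write the provenance record to a sidecar JSON file next to the image.

    A sidecar file keeps this sketch self-contained; a real system would embed
    and sign the data so it travels with the content.
    """
    sidecar_path = image_path + ".provenance.json"
    with open(sidecar_path, "w", encoding="utf-8") as f:
        json.dump(asdict(record), f, indent=2)
    return sidecar_path


if __name__ == "__main__":
    record = ProvenanceRecord(
        creator="Jane Doe",
        captured_at="2020-10-20T14:32:00Z",
        captured_location="San Jose, CA",
        edits=["cropped", "exposure adjusted"],
    )
    # Only a sidecar file is written, so photo.jpg need not exist for this demo.
    print(attach_provenance("photo.jpg", record))
```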
Addressing problems created by technology is part of Adobe’s broader Digital Citizenship efforts, and we look forward to continuing to work with governments, platforms, publishers, and tool providers to accelerate progress for society at large. As this research shows, the time is now for all stakeholders to work together to address the problem of disinformation and restore trust in facts, in truth, and in each other.