Adobe Publishes Research Study on Disinformation
By Dana Rao
Deceptive content online is driving a loss of trust in the truth. A shared understanding of facts is critical to a functioning society. We can, should, and do argue about ideas, strategies and principles all day; but we will make no progress in solving problems if we cannot ground these debates on objective reality. Unfortunately, disinformation driven by deceptive content online is growing faster than ever. Every day we are faced with split-second decisions to believe, or not believe, whether what we are seeing is true. It’s a new world, where content can be easily edited to turn fact into fiction.
Adobe has always been a leader in driving digital creativity. We've consistently evolved our technology to give creatives the world's best tools and services to unleash their imagination. But we know that creative tools can be misused, and it is important to us to play a proactive role in mitigating the unintended and negative consequences of technology.
What is Disinformation? Understanding Consumer & Creator Perception.
Disinformation is a multi-faceted problem, and there are many parts to the solution. Earlier this year we commissioned an opinion research study (memo and deck) among U.K. consumers and creative professionals to understand their perceptions about the scale and severity of manipulated content online, what needs to be done and who needs to do it.
A total of 900 people (600 consumers and 300 creative professionals) in the U.K. participated in a survey that was conducted from September 4 to September 18, 2020. We also conducted a similar study in the U.S. and the results can be found in this memo and deck.
In our research, we asked people what content they tend to trust online, how they distinguish between good and bad altered content, what tools they need to evaluate authentic content, and what solutions would help creative professionals uphold the integrity of their work.
Looking at the U.K. specifically, here's what we heard:
Manipulated content is everywhere, and it’s eroding trust in everything.
- 62% of consumers and 75% of creative professionals say they frequently come across fake images.
- In focus groups, people shared concerns that they lack the knowledge and training to identify images that have been altered to deceive.
- 74% of consumers and 73% of creative professionals worry altered images cause people to believe misleading information.
- 70% of consumers and 69% of creative professionals worry altered images drive distrust in the news.
- 63% of consumers and 72% of creative professionals worry that altered images cause people to tune out the news.
- 67% of consumers and 75% of creative professionals are concerned that altered images cause people to second-guess what they hear, _even when talking to people they know_.
One consumer said, “If we can no longer trust the images we see online or in the news, you just get completely sceptical of everything, or people start checking out.”
Theft and plagiarism online are rampant; the integrity of creative work is often compromised.
- 65% of creative professionals say they have had their work stolen, plagiarised, or not properly credited on the internet, with 19% saying they’ve experienced this “many times.”
One creator said, “I’ve had corporations steal my art. I have friends who were really hurt by having their photography stolen.”
- 81% of creative professionals worry about having their work stolen or plagiarised in the future.
Consumers want to be armed with information to decide for themselves who and what to trust, and they know they can't easily tell on their own whether an image has been altered.
- 73% of consumers and 73% of creative professionals back providing increased access to information about the original source.
- 73% of consumers and 73% of creative professionals also back increasing access to information about how an image is made.
- 70% of consumers support placing a tag on digital images to inform people about their origins. 69% of creative professionals are in favour of this idea.
- 76% of consumers and 79% of creative professionals see a tag as a compelling way to empower people to decide for themselves whether they trust what they are seeing online.
Another consumer said, “I think with my lack of experience, I don’t know what to look for because I’ve never Photoshopped myself more than a filter.”
Context matters: whether an edit is good or bad depends on why it was made.
Consumers draw distinctions between altering an image for art vs. altering an image in the news.
- More than twice as many consumers worry about images being altered to influence impressions of an issue or event (67%) or a politician (70%) as worry about images being altered for artistic expression (26%).
Consumers and creators believe everyone has a role to play and do not place sole responsibility for solutions on any one group.
- 42% of consumers and 35% of creative professionals put primary responsibility on the media, followed by **companies that provide a platform to share images publicly** (35% of consumers, 29% of creative professionals).
- 26% of consumers place responsibility on the government, and 20% of creative professionals do the same.
- 14% of consumers and 16% of creative professionals look to **companies that provide tools to alter images** to take responsibility.
- 16% of consumers place the onus on creative professionals, and an even greater number of creators hold themselves accountable (24%).
- 14% of consumers and 22% of creative professionals put the onus on consumers themselves.
However, the data suggests that everyone has a role to play; in fact, concerns arose in focus group conversations about placing too much responsibility on any one group. We agree that this is a problem to solve collectively. Many different groups play a role here – the media, the government, creators and technology companies – and consumers need to be empowered with information and solutions.
Adobe’s Path Forward
We know that disinformation is detrimental to consumers and disempowering to creators.
We're excited about the Content Authenticity Initiative (CAI) attribution tool we introduced in October. After a little more than a year of collaborating with CAI partners, we now have the first end-to-end system for image attribution, which will work across mobile devices, platforms and more. Through this system, creators can tag their work so that consumers are empowered to discover for themselves who created a photograph or video, along with details like where it was created and how it was edited.
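To make the idea concrete, the sketch below shows, purely as an illustration, how an attribution tag can be thought of: a small provenance record (creator, capture location, edit history) bound to the image's exact contents by a hash. The field names and functions here are hypothetical and simplified; this is not the CAI or C2PA format, and a real system would also cryptographically sign the record.

```python
# A minimal, hypothetical sketch of an attribution tag, assuming a simplified
# manifest bound to the image's contents by a hash. Field names are illustrative
# only; this is not the CAI/C2PA specification.
import hashlib
import json
from datetime import datetime, timezone


def content_hash(image_bytes: bytes) -> str:
    """Fingerprint the exact bytes the manifest is meant to describe."""
    return hashlib.sha256(image_bytes).hexdigest()


def build_manifest(image_bytes: bytes, creator: str, location: str, edits) -> dict:
    """Assemble an attribution record a creator could attach to their work."""
    return {
        "creator": creator,
        "captured_at": location,
        "edit_history": list(edits),  # e.g. ["cropped", "exposure adjusted"]
        "issued": datetime.now(timezone.utc).isoformat(),
        "content_hash": content_hash(image_bytes),
    }


def verify_manifest(image_bytes: bytes, manifest: dict) -> bool:
    """Check that the image a consumer received still matches its manifest."""
    return content_hash(image_bytes) == manifest["content_hash"]


if __name__ == "__main__":
    original = b"...image bytes..."  # stand-in for a real photograph
    manifest = build_manifest(original, "Jane Doe", "London, U.K.",
                              ["cropped", "exposure adjusted"])
    print(json.dumps(manifest, indent=2))
    print("unaltered:", verify_manifest(original, manifest))          # True
    print("altered:  ", verify_manifest(original + b"!", manifest))   # False
```

Binding the record to a content hash means any later alteration breaks verification, which is one way a tag could let consumers see that an image changed after its creator published it.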
Addressing problems created by technology is part of Adobe's broader Digital Citizenship efforts, and we look forward to continuing to work with governments, platforms, publishers and tool providers to accelerate progress for society at large. As this research shows, the time is now for stakeholders to work together to address the problem of disinformation and restore trust in facts, truth and each other.