Adobe’s AI and the Creative Frontier Study reveals creators' views on the opportunities and risks of generative AI
In today’s creative landscape, generative AI is increasingly being embraced by the creative community as a valuable tool for enhancing creativity and streamlining workflows. However, it also raises concerns about intellectual property, transparency, and potential misuse of creators' work in AI training. For our inaugural AI and the Creative Frontier Study, we surveyed over 2,000 creative professionals in the U.S. to explore their evolving relationship with generative AI. The findings reveal that while creators mostly see benefits like time savings and support in brainstorming with this transformative technology, they also demand greater transparency and control over their content. In addition, there’s overwhelming demand from creators for tools that can help with attribution for their work, as well as for government regulation to protect creators' rights in this rapidly changing environment.
Read on below to discover key insights from our study.
While creators welcome the time-saving benefits of generative AI in easing their workload and freeing them from menial tasks, they remain wary about its potential for misuse — particularly around their work being exploited to train AI models without consent.
- 90 percent of creators said they believe generative AI tools can help them save time and money by relieving them of menial tasks and supporting their creative brainstorming process.
- 90 percent of creators also said they believe generative AI tools can help them create new ideas.
- However, more than half (56 percent) believe generative AI can harm creators, primarily by training AI models on their work without consent.
- Furthermore, 44 percent say they have encountered work online that is similar to their own that they believe was created with generative AI.
Creators demand tools that provide transparency into when and how generative AI is used to create content, as well as more control over using their work for AI training. By overwhelming margins, they say such tools would help address many of their concerns about generative AI.
- 91 percent of creators said that if there were a tool to attach verifiable attribution to their work, so that people online can tell the work came from them, they would use it.
- 89 percent believe that AI-generated content should always be labeled as such in exhibitions and marketplaces.
- A strong majority of creators say each of the following would help address their concerns about generative AI:
  - The ability to signal to generative AI models that they don’t want their work used to train the model (83 percent)
  - Pledges from AI companies that the generative AI output will not violate copyrights or trademarks (82 percent)
Creators back government regulation around AI and insist it has a crucial role to play in shielding them from AI’s impact — an area where they see a lack of protective laws.
- More than half of creators (56 percent) do not think existing legislation sufficiently protects creators and their work from the potential impact of AI.
- Roughly three-in-four creators (74 percent) support government regulation of AI, and 84 percent agree that the government should play a role in ensuring creators can get attribution credit for their work.
- A strong majority (88 percent) would support a new law to protect creators’ work from being impersonated with AI tools and would also support protections that allow creators to take legal action against someone impersonating or replicating their art with AI tools.
- The same share (88 percent) would also support legislation that requires devices, editing tools, and online platforms to make attribution technology available to creators.
Supporting Creative Endeavors
At Adobe, we believe that generative AI is a tool for, not a replacement of, human creativity. We’re committed to providing creators with tools, resources, and solutions that help improve creator productivity and support creative careers. Adobe Firefly, our family of creative generative AI models, is only trained on content Adobe has permission to use, never on customer content. We’re continuing to offer bonus payments to Adobe Stock contributors whose work has been used to train Firefly, while creating new opportunities for contributors to potentially grow their earnings as members of the Stock community. In addition, Adobe operates Behance, the leading online platform for creators to showcase their work and discover new creative work to grow their careers.
This morning, we also announced the Adobe Content Authenticity web app, which is designed to help creators protect and gain attribution for their work with Content Credentials. And while Firefly is only trained on content that Adobe has permission to use, never on customer content, a feature in the web app will allow creators to use Content Credentials to signal that they do not want their content used to train generative AI models from other companies. You can read more about the Content Authenticity web app here.
Methodology
Adobe collaborated with Advanis to collect 2,002 responses from the creative community, including creative professionals primarily employed in a creative job function (e.g., digital artists, photographers, videographers, graphic designers, and other creatives), developers who focus on visual designs, and freelancers who earn a living by selling their work. All survey respondents were 18 or older, and data was collected from an opt-in, non-probability sample provider in September.