How smart public policy can supercharge AI innovation and fuel growth
Image source: Adobe Stock/emzee.
We are living through a time of considerable transition as AI revolutionizes the way we work, create and live. In this moment, policymakers everywhere are looking to help their countries stay competitive and fuel the next wave of economic growth for decades to come. In the U.S., in response to the White House Office of Science and Technology Policy (OSTP)’s Request for Information, Adobe recommended key policies that would ensure AI innovation can continue to thrive in the right way, for everyone.
As the White House prepares to release the AI Action Plan this week, here is Adobe’s view on the three most critical areas where the government and policymakers can make an immediate impact to accelerate innovation and drive economic opportunity for everyone.
Fast-tracking AI innovations
AI isn’t a one-size-fits-all technology, so governments should ensure that any regulatory approaches to AI accommodate a variety of different use cases. In our own product development cycle, Adobe prioritizes context-driven AI ethics, ensuring our safeguards match the real-world use cases of our technology.
If there’s a low-risk feature, like an AI-powered font generator, we accelerate its development and ship it as quickly as our engineers can build it. Features with the potential for higher risks go through an extensive process of review, training, testing and risk mitigation before we consider them for public release. Guided by this risk-based framework and our core AI Ethics principles of accountability, responsibility, and transparency, our ethical innovation and product teams can keep up the pace of innovation by fast-tracking low-risk features.
A similar risk-based approach can be applied by governments, which should bring a lighter touch to low-risk AI uses while ensuring higher-risk uses get appropriate oversight and guardrails. By not painting with a broad brush — for instance, tightly regulating all AI models over a certain size — governments can help new innovations get to market faster and create more opportunities for economic growth. This risk-based approach would also be simpler, making it easier for companies to comply and innovate.
Seeing is believing
We see firsthand how creative professionals are using generative AI tools — like Adobe’s own Firefly Text to Image, Generative Fill in Photoshop and the new Generative Extend in Premiere Pro — to push creative boundaries while streamlining tedious, time-consuming tasks. But as the company that has been the creative community’s technological backbone for more than 40 years, it is extremely important to us not just to make it easier to ideate and create with AI, but to do so in the right way: by respecting creators and their rights.
That’s why we intentionally designed Adobe Firefly to be commercially safe. Unique among commercial AI models, Firefly was trained exclusively on content that Adobe has permission to use: licensed content, such as Adobe Stock images, as well as public domain content. That means businesses and creators alike can use Firefly to generate production-ready content with confidence, knowing that no copyrights were violated in the creation of an image or video.
Adobe’s commitment to creators and their rights is also why we are leading an effort to universalize Content Credentials, a digital “nutrition label” with metadata that shows who made a piece of content, when and where it was made, and whether AI was used to make or edit it. Adobe’s vision is to help anyone who has something important to say — whether they’re an artist, a journalist, a government official or a citizen — establish attribution and authenticity of the digital content they produce. Already, we’re seeing significant growth for Content Credentials through the support of the Content Authenticity Initiative (CAI), the global, cross-industry coalition we founded in 2019 that now boasts over 5,000 dedicated members, including The Wall Street Journal, The New York Times, Nvidia, Microsoft, Nikon, Leica and Universal Music Group. We are also a founding member of the Coalition for Content Provenance and Authenticity, an open, global standards organization that developed the technical specifications for Content Credentials.
Building on this private-sector momentum, governments can take two critical steps to drive widespread availability of Content Credentials.
First, governments can help drive adoption of Content Credentials as an industry standard by requiring that platforms preserve Content Credentials attached to digital content, thereby protecting the authenticity of online speech. This would help ensure that neither governments nor private companies act as arbiters of truth in the digital age. Given the vital importance of free speech online, we need standardized tools that allow creators to be associated with their works and to disclose information they believe is important.
Second, government bodies can use Content Credentials themselves, offering citizens a way to verify that government content really came from the agency or policymaker it says it came from — helping to build trust between governments and citizens.
Growing the creative economy
The courts are currently deciding how copyright law applies to AI. Much of the current legal discourse revolves around inputs rather than outputs. Input is the data used to train AI models, while output is the text, image or video that the AI model generates. From an input perspective, AI can only be as good as the data it’s trained on. The more AI models can learn from varied, high-quality data, the more accurate and reliable they are as tools. Several court cases are likely to determine whether using images and content to train an AI model qualifies as fair use under U.S. copyright law. But with uncertainty continuing to swirl, we urge governments to clarify these fundamental questions and provide certainty in the marketplace for both creators and AI developers as this technology continues to evolve. Without new creative works and data to train on, AI models will not be able to keep improving accurately or responsibly.
On the output side, one of the most critical ways the government can have a positive impact is by giving creatives a way to protect the distinctive styles that they have spent years developing and that help them make a living. In the AI era, it is important to prevent bad actors from using AI to impersonate creators’ styles and undermine them in the marketplace — an area that is not currently covered by copyright protections. A federal anti-impersonation right protecting creators and combating bad actors would be an invaluable addition to our laws and would give creators more control over their style and work. This, in turn, would help protect and fuel the creative economy and ensure AI works for all creatives — protecting not just household-name artists but designers, marketers, everyday creators and small businesses who rely on creative tools to power their work.
In the U.S., enacting the Preventing Abuse of Digital Replicas Act (PADRA) would be a critical step towards ensuring the responsible use of AI-generated content because the act would protect artists’ voices and likenesses from being copied using AI and misused commercially. Establishing these rights will be crucial for the future development of AI. Without these assurances, creators won’t be motivated to keep producing the new works that AI itself depends on.
Supporting the creator economy also means recognizing how AI will reshape not just creative tools, but the future of work itself. Adobe is providing Adobe Express for Education for free to every K-12 student in the U.S. and is equipping K-12 educators with the training and resources they need to help children build essential AI skills that will fuel America’s growth.
Policies for prosperity
There is no doubt that we are living through a time of great change. New AI-powered technology is touching all of our lives, and as one of the leading companies moving this technology forward, Adobe takes seriously our responsibility to ensure that it empowers people and creates more opportunity.
Drawing on our practical experience with these issues, we have shared our perspective with governments as they look for the right balance between innovation and responsibility. The U.S. has the chance to develop smart policies that not only foster innovation but also create incentives for economic growth and prosperity for decades to come.