An artist’s style is precious. Let’s protect it
decotora, an illustration by Tokyo artist 夏 若林.
Nothing is more personal, mysterious, and valuable to a creative person than their individual artistic style. No matter their medium, the idiosyncratic way an artist expresses themselves is not only their currency in the creative market; it is fundamentally how their history and lived experience inform their unique point of view, and it is how they represent themselves.
But when the first generative AI models emerged, models trained on words, images and other data pillaged from all over the internet, artists saw their styles being commoditized. As the founder of Behance, I’d known some of these creatives for years, and I saw their shock when they discovered that their unique approach could be mimicked by anyone who entered “in the style of ______,” using their name, into a prompt box.
Having your style impersonated is not only upsetting. It threatens the livelihoods of creative people. Suddenly, an artist is essentially competing against themselves, in the form of an AI engine that can create a facsimile of their style in seconds, based on all of their past work. And that AI engine doesn’t have to pay the rent or feed a family, so it will always undercut the actual creator, who spent years or decades developing their style.
At Adobe, we believe that AI should advance creative careers, never threaten them. We know that when creative people are aided by AI-based tools, they can work more efficiently, explore new mediums and produce work that might never have been possible before. We committed to developing creator-friendly AI by training our Firefly generative AI models only on content we have permission to use — never on user content.
We also founded the Content Authenticity Initiative, a cross-industry group (4,000+ members strong) devoted to improving the transparency of digital content and to developing a new approach to creator attribution called Content Credentials. Content Credentials are built on an open standard that allows anyone to securely attach information to digital content, such as who created the content and how. With Content Credentials, artists can also signal to other AI models on the market that they don’t want those models training on their work.
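To make the idea concrete, here is a minimal sketch of what attaching provenance and a training preference to a piece of content looks like in principle. This is illustrative only: the real Content Credentials format is defined by the C2PA open standard, whose manifests are cryptographically signed and embedded in the file itself, and the field names below are simplified assumptions rather than the actual C2PA schema.

```python
import json

# Illustrative sketch, not the real C2PA schema: field names are
# simplified assumptions chosen for readability.
def build_provenance_manifest(creator, tool, allow_ai_training=False):
    """Assemble a simplified provenance record for a piece of content."""
    return {
        "creator": creator,                  # who created the content
        "created_with": tool,                # how it was created
        # The training preference the artist attaches to the work:
        "ai_training": "allowed" if allow_ai_training else "not_allowed",
    }

manifest = build_provenance_manifest(
    creator="Jane Artist",                   # hypothetical artist
    tool="Adobe Photoshop",
    allow_ai_training=False,                 # opt the work out of AI training
)
print(json.dumps(manifest, indent=2))
```

In the real standard, this metadata travels with the file and is tamper-evident, so anyone (including an AI model’s data pipeline) can verify who made the work and whether the creator has opted it out of training.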
But we also recognize that technology alone won’t solve this problem. Adobe has been advocating for a federal anti-impersonation right that helps protect creators against bad actors who use an AI model explicitly to impersonate their style for commercial gain. And so we’re encouraged by the momentum around thoughtful legislative solutions like Rep. Darrell Issa’s Preventing Abuse of Digital Replicas Act (PADRA), which was introduced in the House this week.
PADRA supports the innovation and development of AI while addressing the potential for commercial misuse of AI and safeguarding the intellectual property of the creative community. It would enact federal protections against the unauthorized commercial use of an individual’s voice, image and likeness. PADRA is also the first bill to protect digital artists from those who would pass off AI-generated work as the artists’ own creations.
We see PADRA as an important step toward ensuring creatives have the protections they need to thrive in the age of AI. The best solutions to this problem will come when the creative community weighs in, so I hope creators will share their ideas about how this transformational technology can better serve them, and where they think governments should step in.
Artists’ unique voices are precious not just to them, but to all of us. They provide beauty and originality and give us all new perspectives through which to see the world. For most artists, developing their style is a long, painstaking and delicate process. It’s in all our interests to safeguard that process and protect artists’ styles from commercial, AI-based exploitation.