Adobe’s approach to generative AI models & customer choice

Screenshot of an image being generated with Adobe Firefly.

Today, there is a lot of discussion about the race to build the “best” general-purpose AI model. At Adobe, though, we see a future in which thousands of specialized models emerge, each strong in its own niche. Some will pioneer specific creative effects that customers will leverage, much like they use plugins in our Creative Cloud applications today. Other models will power Acrobat AI Assistant, a new feature, now generally available, that instantly generates summaries and insights from long documents. And, as we announced at Summit, we use our own technology and third-party AI models in our Digital Experience applications to help customers analyze data, discover actionable insights and drive personalization at scale. Our customers will use different models for different use cases, and Adobe will ensure they can easily choose the best tool for the job.

When it comes to creativity, inspiration cannot be constrained. That’s true whether you’re creating an illustration, a video, or a marketing asset. And that’s why at Adobe we’ve always made our tools accessible across platforms and as open as possible to any format or raw material, whether it’s your own drawing, a photograph from any camera, or footage you bought from a stock service (our own service or any other).

Now, in the era of generative AI, there is an entirely new ecosystem of material to find and include in your workflow. That’s why we’ve integrated Firefly, our family of generative AI models, directly into our products. Firefly models were developed for the categories and use cases our customers are most focused on and were designed to be safe for commercial use. We also recognize that our customers use generative AI to achieve many different creative objectives and may want to choose a different, sometimes niche, model to suit their needs. For that reason, we are excited to keep our creative tools open by supporting generative AI models from other companies directly in our applications.

In many ways, generative AI gives a creative tool an entirely new source of footage, akin to a magic camera. It can produce raw imagery, snippets of video, or draft text, but for most of our customers, this content is just a starting point. Perhaps you use it as inspiration on a mood board as you explore creative directions. Or maybe you’re editing the AI-produced content and incorporating it into your project, or leveraging AI to make unlimited variations of what you’ve created. Whatever you’re making, our goal is to provide industry-standard tools, like Adobe Photoshop, Adobe Acrobat, and Adobe Illustrator, and seamless workflows that let you use any material, from any source, on any platform, to create what is in your mind’s eye. Whether you want to use Firefly or another specialized AI model, we’re working to make the process as seamless as possible from within Adobe applications.

A choice of AI models

Generative AI has already proven itself to be an incredible new source of content and inspiration, but a creator’s taste and tools still determine the outcome. We’ve seen this firsthand with the launch of Firefly. The majority of our customers use AI as a step in their process, not an end in itself. They’ll take what the AI model produces and tweak it, integrate it into their overall project, and keep working until they achieve what they imagine. The numbers we see in our products bear that out: the vast majority of content generation with Adobe Firefly has taken place within our flagship applications, whether that’s using Generative Fill in Photoshop, Text Effects in Adobe Express, Firefly Variation Generation in GenStudio, or other applications and features.

Screenshot of Adobe Firefly in use.

Adobe’s greatest value has always been our applications and services, which provide the power and control our customers need to produce their best work efficiently and collaboratively. By integrating many different models into our products, we’ll make it easy for customers to draw on the strengths of each model within the powerful workflows they use every day.

We already integrate third-party AI models into our Document Cloud and Digital Experience products. We’re now exploring the addition of non-Adobe AI models throughout Creative Cloud. This week, at the National Association of Broadcasters conference in Las Vegas, we showed some early “sneaks” of how professional video editors could, in the future, leverage the Runway or OpenAI Sora video generation models, integrated in Premiere Pro, to generate B-roll to edit into their projects, or how they could use either Firefly or third-party models like Pika with the Generative Extend tool to add a few seconds to the end of a shot. Suffice it to say, customers will enjoy choice and endless possibility as they create and edit the next generation of entertainment and media in Premiere Pro. By supporting the growing ecosystem of third-party models over time, much as we do with cameras and formats, we ensure that customers can incorporate or generate any footage within Adobe’s applications and are bound only by their imaginations.

Advancing the possibilities of generative AI with Firefly

As we partner with other generative AI providers, we will also continue to develop our own AI models, like Firefly, in the areas in which we have deep domain data and expertise, such as imaging, video, audio, and 3D, and for use cases that require specific workflows and integrations across our products. With our own models, we can provide benefits we know are important to you, including:

  1. Customization: Enterprises can now train a custom version of Firefly with their own copyrighted brand images, logos, characters and styles. That model, available only to the company that creates it, supercharges production of on-brand assets while ensuring that valuable IP remains firmly within the company’s control.

  2. Fine-grained Controls: Drawing on decades of understanding of how our customers work, we’ve incorporated features that provide unprecedented precision when generating content. One recent example is Structure Reference: you upload an image that has the composition you’re looking for, and every image generated will match that structure (see the sketch after this list for what that might look like through an API). We have an exciting roadmap of such capabilities coming to our products.

  3. Commercial Safety: When we build our own proprietary models, we can control what goes into the model. For example, today’s Firefly Image Generation model is designed to be commercially safe, by being trained on licensed content, such as Adobe Stock, and public domain content where copyright has expired, and customers have shown that this is a critical feature when they are building content for commercial consumption.
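
To make the Structure Reference idea above concrete, here is a minimal sketch of how a script might ask an image-generation service for images that follow a reference composition. The endpoint URL, authentication, and JSON field names below are illustrative assumptions for a generic REST API, not Adobe’s documented Firefly interface; consult the official Firefly Services documentation for the real contract.

```python
# Illustrative sketch only: request generated images that follow a reference composition.
# The endpoint URL, auth header, and JSON field names are assumptions, not Adobe's documented API.
import base64
import requests

API_URL = "https://example.com/v1/images/generate"  # placeholder endpoint
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"                   # placeholder credential

def generate_with_structure(prompt: str, reference_path: str) -> dict:
    """Ask a hypothetical image-generation API to match the composition of a reference image."""
    with open(reference_path, "rb") as f:
        reference_b64 = base64.b64encode(f.read()).decode("ascii")

    payload = {
        "prompt": prompt,
        "numVariations": 4,
        # Hypothetical structure-reference block: generated images should follow
        # the layout/composition of the uploaded reference image.
        "structure": {
            "strength": 80,                      # how closely to follow the reference (0-100)
            "image": {"base64": reference_b64},
        },
    }
    response = requests.post(
        API_URL,
        json=payload,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=60,
    )
    response.raise_for_status()
    return response.json()

# Example: product-shot variations that keep the composition of a rough layout sketch.
# results = generate_with_structure("studio photo of a ceramic mug", "layout-sketch.png")
```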

Responsible innovation

At Adobe, we have developed our own AI models with a commitment to responsible innovation, and we will apply what we’ve learned to ensure that the integration of third-party models within our applications is consistent with our safety standards and adds protections that improve our customers’ experiences.

Screenshot of Adobe Firefly in use.

And as one of the founders of the Content Authenticity Initiative, we’ll attach Content Credentials to assets produced within our applications so that people who work with or view the content can see how it was made, including which AI models were used to generate content created on our platform. This type of transparency will allow you to achieve the goals you want, in the way you want to. And finally, we’ll continue to connect with customers through multiple channels, from in-app feedback to community groups and social listening programs, to help identify and address issues quickly.
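
As an illustration of what that transparency can look like for someone receiving an asset, here is a minimal sketch of reading embedded Content Credentials with the open-source c2pa-python library from the Content Authenticity Initiative ecosystem. The exact module, class, and method names vary between library versions, so treat the calls below as assumptions to verify against the library’s current documentation.

```python
# Minimal sketch: inspect the Content Credentials (C2PA manifests) embedded in a file.
# Assumes the open-source c2pa-python package; class and method names may differ by version.
import json

from c2pa import Reader  # assumed import; verify against the installed library version

def summarize_content_credentials(path: str) -> None:
    """Print which tools (including AI models) each embedded manifest claims produced the asset."""
    try:
        reader = Reader.from_file(path)      # parse any embedded manifest store (assumed API)
    except Exception as err:                 # no credentials present, or unsupported format
        print(f"No readable Content Credentials in {path}: {err}")
        return

    store = json.loads(reader.json())        # manifests serialized as JSON
    for label, manifest in store.get("manifests", {}).items():
        generator = manifest.get("claim_generator", "unknown generator")
        print(f"{label}: produced with {generator}")

summarize_content_credentials("generated-image.jpg")  # hypothetical file path
```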

We’re living through the dawn of the generative AI era and are just starting to see how this technology will help us all work and create more efficiently, achieving results that would have been impossible before. No doubt, putting industry-leading tools and an ecosystem of generative AI models at the fingertips of the world’s greatest and most creative minds will truly change the world through digital experiences.