Behind the shorts at the Generative AI Film Festival at MAX 2025
Six directors. One shared toolbox. Six very different approaches. For our inaugural Generative AI Film Festival at Adobe MAX 2025, we invited six visionary filmmakers at the cutting edge of technology to each create a short film. The films, which premiered at MAX and here on our blog, span genres and aesthetics, ranging from the trailer for an animated series to an Afrofuturist sci-fi comedy.
Below, we break down each film’s making-of and showcase exclusive behind-the-scenes videos that explain just how the filmmakers achieved their vision with the help of generative AI. We also take a look at the specific Adobe Firefly and Adobe Creative Cloud workflows that helped bring each story to the screen.
“Beta Earth”
Director / Studio: Nik Kleverov / Native Foreign
Tools: Adobe Firefly (including partner models like Google’s Veo 3.1 and Luma AI’s Ray3), Firefly Boards, Generate Sound Effects, Premiere
Nik Kleverov relied on Firefly Boards to set the tone of the “Beta Earth” series trailer. The character designs were initially hand-drawn by illustrators, then brought into their final form with the help of Firefly.
“Once we generated still frames, we transferred them to an edit in Premiere and built out the animatic,” Kleverov explained. Those frames were then refined and put into motion through video generation using Firefly and a suite of partner models.
“Music, sound design, and finishing followed a process we’ve been using for a while now,” Kleverov said. “We also tried out some of Firefly’s audio-generating tools for this project and were pleasantly surprised.”
“Enter the BBL Drizzyverse: The Yams of Life”
Director / Studio: King Willonius
Tools: Adobe Firefly (including partner models like Google’s Veo 3.1 and Nano Banana, and Runway’s Gen-4), Firefly Boards, Generate Sound Effects
Generative AI enabled King Willonius to instantly envision and build the worlds he wanted to create for his film. “There are no limitations. I can just dream big,” he said. “These tools have made a significant impact on how I work and approach productions: They allow us to do more with less. We’re now a lot more efficient, more organized, and way more productive.”
Specifically, King Willonius used Firefly partner model Nano Banana to generate visuals, and Runway to create style transfers. The team then used Veo 3.1 for animation, while narration and score were sketched, developed, and iterated on with Generate Sound Effects in Firefly, allowing the story and sound to evolve together.
Firefly Boards also made it a lot easier for the team to stay organized. “Firefly Boards keeps everything contained, so the project is not all over the place,” King Willonius explained. “You can generate and stay in Boards, which is very helpful from a productivity standpoint for our workflow and collaboration.”
“My friend, Zeph”
Director / Studio: Dave Clark / Promise Studios
Tools: Adobe Firefly (including partner models like Google’s Veo 3.1 and Luma AI’s Ray3), Firefly Boards, Adobe Substance 3D Painter, Adobe Photoshop (specifically the Harmonize tool), Premiere
Dave Clark’s team deliberately blended classic filmmaking with AI workflows. “We started by writing detailed prompts, or ‘visual language’, to generate ideas, images, and visuals for the world,” Clark explained. “We then used Firefly Boards to lay out the story and plan shots, exploring different worlds, lighting, and character designs.”
A key decision was using the lead actress’s actual childhood photos (with her permission) as a reference in Firefly to create her younger self and ground the AI character in reality. The team also used Adobe Substance 3D Painter to hand-paint her childhood drawings onto the 3D model of the robot.
During production, the team combined live-action footage with virtual production, which allowed them to see the AI-generated backgrounds in real time and make instant creative changes, such as swapping environments or adjusting focal length, right on set.
“The tools we used were transformative,” Clark said. “The process required human artistry, and we could now fuse human performance with AI world-building. The most significant impact on our workflow was the ability to make major creative decisions in real time. For the story, these tools allowed us to dream as big as we wanted and stay true to the original script.”
“Nagori”
Director / Studio: Guillaume Hurbault / Promise Studios
Tools: Firefly Boards, Photoshop, Premiere
Guillaume Hurbault treated Firefly Boards as an all-in-one creative studio — generating, refining, and organizing every visual and motion test in the same place. “Alongside Adobe Photoshop and Premiere, it gave me total creative flow,” he explained. “No friction, just freedom to experiment and build an emotional story efficiently.”
This approach allowed Hurbault to preserve the aesthetic of a Japanese ink painting while leveraging generative capabilities. “What stood out to me was how seamless the AI-first workflow has already become — no technical barriers, just pure direction,” he said.
“Kyra”
Director / Studio: MetaPuppet / Promise Studios
Tools: Adobe Firefly (including partner models like Google’s Veo 3 and Luma AI’s Ray3), Firefly Boards, Generate Sound Effects
MetaPuppet captured his own sounds and photography for the film, and the process opened his traditional filmmaking background up to the new technology. Blending live photos and sound recordings with AI-generated imagery ended up being his favorite part of the project.
“Generative AI didn't just help me create new visuals,” MetaPuppet pointed out. “It became a kind of time machine that allowed me to revive images and memories I've held onto for two decades and finally tell a story through them.” Even though the building featured in the film — the 5 Pointz Building in Queens — doesn’t exist anymore, viewers of “Kyra” can now experience it as it once was.
MetaPuppet used Firefly Boards to experiment with the look of his characters and subway locations. “It was an incredible tool for generating multiple variations from my starting images and quickly refining the visual style,” he said. “The partner models Veo 3 and Ray3 then brought these images to life in ways that far exceeded my expectations.”
Sound played an equally important role. MetaPuppet used Firefly Generate Sound Effects to sync his voice to the video and automatically generate matching sounds. He also recorded himself playing a bucket drum rhythm for the opening scene, fusing live sound with AI-generated elements.
“Dreamer”
Director / Studio: Ryan Patterson / Queen One Studios
Tools: Adobe Firefly (including partner models like Veo 3), Firefly Boards, Generate Sound Effects, Premiere
Ryan Patterson created the decades-spanning look of “Dreamer” by bringing personal photographs into Firefly Boards and recreating scenes from his past. He stayed in Boards to create all of the film’s images and video, then used Generate Sound Effects to add sound to the footage.
Then, depending on the type of shot he needed, Patterson leveraged different partner models. “Having a variety of models all in one place really let me streamline my workflow,” he said. “I was surprised at how quickly I was able to bring my ideas to life. There has never been a more revolutionary time for creatives.”
Experiment with generative AI in your own filmmaking workflows
We asked the filmmakers from Adobe’s inaugural Generative AI Film Festival to offer some practical advice for other filmmakers starting to incorporate AI into their work. Here’s what they said:
Nik Kleverov and Ryan Patterson both recommended simply diving in. “Start small and explore a single idea,” Kleverov suggested. “Be really strict with yourself. Identify a core theme, and be diligent about creating something short and impactful. Deadlines are really helpful, even if it’s one you set for yourself.”
King Willonius agreed and advised filmmakers to stay as organized as possible. “Plan out your process, and have your ideas storyboarded out,” he recommended. “Pre-production is very important, and these tools help you work efficiently, so you don’t have to go back and redo a lot of material.”
MetaPuppet, meanwhile, encouraged filmmakers to draw from their own archive, whether that’s photos, sounds, or memories. He said: “When you combine your lived experience with generative tools, the results feel far more personal and authentic.”
“The magic happens when you move fast and experiment,” Guillaume Hurbault said. “Let intuition lead.”
Why generative AI belongs in the filmmaker’s toolbox
Across six very different films, a clear theme emerged: generative AI tools don’t replace the creative process; they boost it.
What used to take weeks of back-and-forth is now turning into an interactive, iterative, and more fluid process that allows filmmaking teams to quickly test looks, sound, and narrative beats. For teams with constrained budgets and schedules, that means more creative freedom and more time spent on storytelling.
Adobe’s integrated Firefly and Creative Cloud workflow is making this possible: a single playground for ideation, image/video generation, sound prototyping, and editorial finishing — and, crucially, a way to keep human creativity at the center of every choice.