Sundance Dispatch: Expanding creative expression in filmmaking with generative AI
For decades, the Sundance Film Festival has been a launchpad for filmmakers willing to take creative risks, redefining how stories are told and experienced. This year, Adobe collaborated with two independent directors rooted in traditional craft to commission two original short films.
The resulting case studies offer a close look at how generative AI can unlock new creative territory, enabling experimentation with formats once out of reach, allowing more flexibility during the production process, and optimizing budgets.
Both films — an animated dramedy by Momo Wang and a hybrid live-action and generative AI drama by Taryn O’Neill — were produced with a mix of traditional craft and generative workflows, all powered by Adobe products. They demonstrate how new technology can enhance every stage of filmmaking, from moodboards to final polish, and expand what a small, independent team can create.
The workflow behind the generative AI shorts
Intentionality and creative control guided every step of the production. For both films, the directors and their teams deliberately chose where and when generative AI entered the process, using the technology to unlock new visual possibilities and generate meaningful time savings.
This human-led approach — creativity amplified by thoughtfully mixing generative AI with traditional craft — allowed the teams to rapidly explore visual directions, build environments that would have been costly to consider, and preserve full control over characters, pacing, and emotional truth.
Building on the success of the Generative AI Film Festival at Adobe MAX 2025, filmmakers had access to Adobe Firefly, the all-in-one creative AI studio, including Generate Image, Generate Video, and Generate Audio. They used Firefly Boards — Adobe’s collaborative, AI-powered ideation surface — for moodboarding, storyboarding, and creative collaboration across the teams. And they leaned on Creative Cloud apps such as Photoshop, Premiere, After Effects, and Audition to refine assets and add the final polish to their work.
Watch the Sundance films and discover how they were made
The two films have only just premiered at the Adobe House during Sundance, and now you can watch them too!
Below, we spotlight the full films, the directors’ original ideas and creative choices, and the workflows that helped them realize their visions.
“Wink”
Director: Momo Wang
Concept: A one-eyed stray cat reinvents herself as a casino’s “lucky charm,” only to abandon her glamorous life to teach a shelter of “unadoptable” animals how to be loved.
Tools used: Adobe Firefly (including partner models Veo for video and lip sync, Nano Banana for images, and ElevenLabs for voice generation), Firefly Boards, Photoshop, After Effects, Audition, and Premiere.
On the surface, animated dramedy “Wink” is a story about a one-eyed cat searching for a home, but at its core, it’s a reflection of Wang’s own search for true love and self-worth. The film is based on a real rescue cat. “When I saw his photo on the SPCA website, I knew immediately I was taking him home,” Wang said. “He had lost an eye to abuse, but we went on to share ten happy years — mostly eating shrimp together.”
Wang originally imagined “Wink” as a larger animated feature with many characters and complex plots. “To make it work as a standalone short, we had to be decisive — we completely rewrote the ending to focus on the emotional arc,” Wang explained. The team tested three storyboard styles: “We found the ‘cinematic’ style was too slow-paced, and the ‘cartoon’ style felt a bit too childish for the message. Ultimately, we chose the ‘action’ style — it gave the film the dynamic energy it needed,” Wang said.
The film’s final beat — Wink leaving her collar behind and a young lady staying to say, “Let’s go home” — distills the journey of embracing one’s own imperfections. Wang voiced the lady herself, making the ending feel especially intimate. “It’s a film about a cat, but in her, we see ourselves. You don’t need to be ‘lucky’ to deserve a home. You just need to be you,” she said.
The making of “Wink”
Wang started with hand-drawn sketches, which she then rendered into a 3D style using Firefly. From there, Wang used Generative Fill in Photoshop to combine multiple variations into one unique character design.
“Due to our tight timeline, and since 3D is currently the most robust and efficient style for AI generation, we used it as our foundation,” Wang said. “However, to give the film a unique artistic identity, we blended in 2D aesthetics. This allowed us to leverage the speed of AI, while maintaining a distinct, hand-crafted look.”
Since the team worked remotely, Firefly Boards became their main collaboration tool. For video, they primarily used Firefly partner model Veo, specifically for the extensive lip syncing required. “To nail the timing, we leaned on ElevenLabs in Generate Speech to generate scratch vocals for our dynamic storyboards,” Wang explained.
Those AI-driven vocals served as placeholders until the production replaced them with performances from real voice actors for the final cut. “And whenever the AI video had glitches, we relied on traditional VFX techniques to fix them or flexibly adjusted our storyboards and editing.”
Being able to access top AI models and move seamlessly from Firefly to Premiere meant the team didn’t have to constantly switch contexts and could work fast without sacrificing quality. “Ultimately, we used technology to build the body, but our passion to breathe life into these characters,” Wang said.
“MythOS”
Director: Taryn O’Neill
Concept: Through the eyes of a young woman, a whale shares the story of our fractured world and the mythic secret that might help fix it. The film shines a light on the role myths have played in the growth of society throughout the ages.
Tools used: Adobe Firefly (including partner models Veo and Marey for video, and Nano Banana and Flux for image generation), Firefly Boards, Photoshop, Generate Sound Effects, and Premiere.
Written and directed by O’Neill, “MythOS” is rooted in her long-standing fascination with the power of storytelling and how it has served humanity for generations as a kind of operating system. “Stories have long helped us make sense of the world, navigate uncertainty, and understand ourselves as a community,” O’Neill said. “I feel much of that shared framework has been lost.”
From that idea, O’Neill wrote a short script exploring an individual hero’s journey evolving into a collective one. The film was partly inspired by new research suggesting that sperm whale clicks contain elements reminiscent of human language, as well as by O’Neill’s own interest in whales as mythic record-keepers and messengers of new stories.
The first major creative decision of the project was how much generative AI to use and how to use it. O’Neill initially envisioned the surreal film as an entirely AI-generated experiment, but her final film features actors and traditional filmmaking elements alongside AI, creating a truly hybrid experience.
The making of “MythOS”
O’Neill shot actors on a traditional green screen and composited them with AI backgrounds. This led to a key decision: The team needed to have the full AI video backplates ready before shooting on the green-screen stage, so the actors could be lit correctly and grounded in the environments.
The wide and over-the-shoulder shots of the actors were created entirely in AI. “With their consent, we built a likeness card for each actor,” O’Neill explained. “It was made up of full-body shots from different angles and close-up expression references, all in wardrobe, to help guide the models.”
The team used Firefly Boards as their central hub, leveraging generative video and image tools to move quickly. O’Neill said: “Firefly was key for early visual development, concepting, and world-building, and Photoshop then helped us refine and composite images, especially as we blended AI-generated visuals with live-action elements.”
The film was edited in Premiere, where the story really came into focus. Late in post-production, the team decided to simplify the story, and the tools enabled the team to adapt and quickly generate additional shots.
“In a traditional filmmaking and VFX pipeline, that kind of late adjustment would have been far more costly and time-intensive,” O’Neill explained. “Under tight time pressure, Boards became a shared visual language that let a small, fast-moving team stay aligned and the story emerge as we built it.”
Taryn O’Neill (left) and Momo Wang (right) at the Adobe House GenAI showcase and panel discussion moderated by Variety at the Sundance Film Festival. Credit: Jonathan Hickerson.
Lessons learned from adding AI tools to filmmaking
Wang and O’Neill agreed that even when you combine traditional filmmaking workflows with AI, creating high-quality work is complex and requires immense time and effort.
“The ideation phase can be incredibly fun in generative AI, and even hallucinations can become happy accidents,” O’Neill pointed out. “But execution takes work, and for me, the biggest lesson was to pivot without settling. Lean into what the tools do well, stay clear on intention, and use them as a new translation layer for collaboration.”
Wang agreed. The technology doesn’t replace the craft, she said; it sits alongside it and tests it. The core requirement is still professional filmmaking skill, and the magic still comes from creative people working together. “Technology will always evolve and iterate,” Wang said, “but that moment when a group of like-minded artists comes together to create something beautiful? That is irreplaceable.”