Adobe partners with Runway to bring AI video generation into Firefly
by Wayne Williams · BetaNews

Adobe has announced a multi-year partnership with Runway that pulls generative video directly into Adobe Firefly and, over time, deeper into Creative Cloud. The idea is to make AI-generated video part of the same tools people already use to edit, finish, and deliver professional projects.
Runway offers AI video generation tools that let users create clips from text prompts, control motion and pacing, and experiment with different visual ideas without shooting footage. It sits in the same general space as tools like OpenAI’s Sora, but is more often positioned as a practical production tool.
As part of the deal, Adobe becomes Runway’s preferred API creativity partner. That means Adobe customers get early access to Runway’s newest models inside Firefly, starting with the new Gen-4.5 model, which is available now for a limited time both in Firefly and on Runway’s own platform.
“As AI transforms video production, pros are turning to Adobe’s creative ecosystem -- from Firefly to Premiere to After Effects -- to imagine, craft and scale their stories across every screen,” said Ely Greenfield, chief technology officer and senior vice president, digital media, Adobe. “Runway’s generative video innovation combined with Adobe’s trusted pro workflows will help creators and brands expand their creative potential and meet the growing demands of modern content and media production.”
Runway Gen-4.5
The new Gen-4.5 upgrade offers better motion, stronger prompt control, and more consistent visuals from shot to shot. Inside Firefly, creators can generate clips from text, try different directions quickly, and then assemble those clips in Firefly’s video editor. From there, content can be taken into Premiere Pro, After Effects, and other Creative Cloud apps for more precise work.
“We’re building AI tools that are redefining creativity, storytelling and entertainment, with Gen-4.5 as the latest example,” said Cristóbal Valenzuela, co-founder and CEO, Runway. “This partnership puts our latest generative video technology in front of more storytellers, inside Adobe’s creative tools that are already the industry standard for many creators around the world.”
Adobe and Runway say they’ll work directly with filmmakers, studios, agencies, streaming platforms, and brands to bring new video features into Adobe’s apps.
If AI video generation is part of the same software used for final delivery, it will inevitably become part of everyday production. That raises questions about how it’s used, where it’s acceptable, and what it means for creative roles across film, TV, and advertising.
Adobe says Firefly users can mix different models depending on the job, including Adobe’s own and Runway’s, without their content being used to train AI systems.
Runway’s Gen-4.5 is available now in Adobe Firefly and on Runway’s platform, with Adobe Firefly Pro users getting unlimited generations until December 22.
What do you think this kind of deep AI video integration means for everyday creators and for Hollywood? Share your thoughts in the comments.