Adobe has entered a multi-year strategic partnership with AI video specialist Runway, giving Firefly users early and exclusive access to Runway's latest generative video model and setting up joint development of new tools for professional video production.
The deal makes Adobe Runway's preferred creative API partner. Adobe will receive early access to Runway's new models for integration into its products, starting with Gen-4.5, which is now live for a limited period inside the Adobe Firefly app and on Runway's own platform.
Under the agreement, the two companies plan to co-develop specialised AI features for video workflows that will appear only in Adobe applications. The roadmap starts with Firefly and extends across Adobe's wider suite, which includes Premiere Pro and After Effects.
Exclusive model access
Firefly customers can access Runway's Gen-4.5 model ahead of its broader public rollout. The model is Runway's latest for text-to-video and other generative video formats, and it sits alongside Adobe's own Firefly models and partner integrations in the Firefly app.
Gen-4.5 focuses on motion quality, prompt adherence and visual fidelity. It supports dynamic and controllable action with strong temporal consistency across different generation modes.
Adobe positions Firefly as an all-in-one creative AI studio. Users can generate short video clips from text prompts with Gen-4.5, then adjust visual style, pacing and motion before transferring clips into Firefly's video editor for assembly into finished pieces.
Editors and post-production teams can move AI-generated clips into Adobe Premiere Pro, After Effects and other Creative Cloud tools for further refinement, colour work, compositing and audio integration.
Adobe's Chief Technology Officer and Senior Vice President, Digital Media, Ely Greenfield, said the combination targets professionals already working across its ecosystem.
"As AI transforms video production, pros are turning to Adobe's creative ecosystem - from Firefly to Premiere to After Effects - to imagine, craft and scale their stories across every screen," Greenfield said. "Runway's generative video innovation combined with Adobe's trusted pro workflows will help creators and brands expand their creative potential and meet the growing demands of modern content and media production."
Runway co-founder and Chief Executive Cristóbal Valenzuela said the agreement extends the reach of its latest model through integration with widely used creative software.
"We're building AI tools that are redefining creativity, storytelling and entertainment, with Gen-4.5 as the latest example," Valenzuela said. "This partnership puts our latest generative video technology in front of more storytellers, inside Adobe's creative tools that are already the industry standard for many creators around the world."
Targeting professional workflows
Adobe and Runway plan to work directly with independent filmmakers, major studios, creative agencies, streaming platforms, large brands and enterprises on new AI video features. The companies intend to embed these tools into existing Adobe products that already sit in many production pipelines.
The focus is on integrating generative video into day-to-day workflows rather than offering stand-alone experimentation tools. The companies describe generative video as an element of the pre-production, production and post-production stages, from concept visualisation through to final editing.
Gen-4.5 supports complex, multi-element scenes, including precise compositions and simulated physics. It generates expressive characters whose gestures and facial performances remain consistent across multiple shots. These functions are designed for use in storyboarding, animatics, concept sequences and marketing content.
Users can generate many variations from text prompts. They can explore alternative visual treatments and then choose specific clips for refinement within Adobe's traditional video editing products.
Model choice and control
Adobe is building Firefly as a hub for a range of generative AI models across video, audio, imaging and design. The company presents model choice as important for creators who want different styles and outputs.
Within Firefly, users can select Adobe's own Firefly models, Runway models and other partner models from companies including Black Forest Labs, ElevenLabs, Google, Luma AI, OpenAI and Topaz Labs. Users can also work with Firefly Custom Models that are trained for a specific visual style.
Adobe states that content generated in the Firefly app does not feed back into training datasets for generative AI models. This policy applies regardless of which model the creator selects inside Firefly.
The company frames its approach as creator-first and emphasises a view of AI as a tool under human control rather than a replacement for human work in the creative process.
Commercial rollout
Runway's Gen-4.5 model is available now within the Adobe Firefly app and through Runway's own service. Adobe customers on a Firefly Pro plan receive unlimited generations for an initial promotional period.
Adobe and Runway plan further joint releases of video models and workflow integrations. Firefly customers will receive first access to new Runway models through Adobe's creative tools before wider public availability.