
Avid and Google Cloud demo agentic AI inside Media Composer at NAB
Avid and Google Cloud are jointly demonstrating agentic AI workflows at NAB 2026, integrating Google's Gemini models and Vertex AI directly into Media Composer and the Avid Content Core SaaS platform. The demo covers natural-language media search, intelligent metadata enhancement, automated B-roll suggestion with style matching, and autonomous orchestration of multi-step tasks: sequences that today require manual handoffs between tools or operators.
The demonstration runs at both Google Cloud Booth W2731 and Avid Booth N2226 when the exhibit hall opens April 19. The collaboration builds on Avid's AWS announcement earlier this week, positioning Avid Content Core as a platform that can run AI inference across multiple cloud providers rather than locking into a single vendor's stack.
"By embedding agentic AI directly into the tools video editors live in, we're moving beyond simple automation," said Anil Jain, Global Managing Director at Google Cloud. The framing, agentic rather than assistive, signals that both companies are pitching AI that initiates actions across workflows, not just responds to prompts.
The distinction matters for post-production buyers evaluating AI: assistive tools surface suggestions, while agentic systems execute multi-step sequences. The NAB demo tests whether that capability is production-ready in an editing environment, not just a controlled lab.