TwelveLabs ships Pegasus 1.5 model and Rodeo AI editing co-pilot
TwelveLabs unveiled three interconnected products at NAB 2026 that together mark a shift from developer API to production-workflow platform. Pegasus 1.5, an updated video foundation model, adds Time-Based Metadata Extraction, which automatically identifies temporal boundaries in footage and generates structured metadata without re-indexing or manual annotation. The capability is aimed at large media libraries, where re-tagging at scale has historically required significant editorial labor.
Alongside the model update, the company launched Rodeo, an application-layer creative co-pilot that lets editors find footage, assemble sequences, and make edits using natural-language commands, with no technical integration required. The tool targets editorial teams that want AI-assisted workflows without building custom API integrations. A third announcement pairs TwelveLabs with Autodesk Flow Capture, adding Smart Search and Smart Actions to the digital-dailies tool used by film and television productions.
The combination positions TwelveLabs as a direct competitor in the editorial-workflow market, not merely a vendor in the infrastructure-API layer. CEO Jae Lee described the direction plainly: "With Pegasus 1.5, Rodeo, and our integrations with industry leaders, we're transitioning from understanding video to operationalizing it at scale."