15:02 UTC · Runway

Runway demos real-time HD video generation on NVIDIA Vera Rubin with sub-100ms first frame

Source
Runway @runwayml

We've been working with NVIDIA on something we're calling a real-time video generation model. Running on Vera Rubin hardware, HD video generates instantly — time-to-first-frame under 100ms. This is a research preview and foundational to our General World Model, GWM-1.

March 18, 2026
What Happened

Runway demonstrated a real-time video generation model at NVIDIA GTC, developed in partnership with NVIDIA and running on Vera Rubin hardware. The system generates HD video with a time-to-first-frame under 100ms. Runway describes this as a research preview, not yet publicly available, and positions it as foundational to its General World Model, GWM-1.

Why It Matters

Sub-100ms time-to-first-frame moves video generation from an asynchronous render job into interactive territory. That changes the design space: real-time game worlds, video that responds to input, live creative tools that update as you work. Runway's framing of this as the foundation of GWM-1 is the bigger signal: the company is positioning itself as a builder of models of physical reality, not just video tools. The NVIDIA Vera Rubin co-design angle is also significant as a concrete demonstration of what hardware-model co-development produces at the frontier. OpenAI and Google are on the same path; Runway has a working demo.
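To make the "interactive territory" claim concrete, a back-of-envelope latency-budget check helps: a fast first frame only feels real-time if sustained per-frame generation also keeps pace with the playback rate. The sketch below is illustrative arithmetic with assumed numbers, not figures from the announcement; the 24 fps target and the `is_interactive` helper are hypothetical.

```python
def is_interactive(ttff_ms: float, frame_gen_ms: float,
                   target_fps: float = 24.0,
                   ttff_budget_ms: float = 100.0) -> bool:
    """True if the first frame arrives within the interactive budget
    AND each subsequent frame is generated faster than it is played back."""
    frame_budget_ms = 1000.0 / target_fps  # ~41.7 ms per frame at 24 fps
    return ttff_ms < ttff_budget_ms and frame_gen_ms <= frame_budget_ms

# Sub-100ms TTFF alone is not enough; throughput must also keep up:
print(is_interactive(ttff_ms=95, frame_gen_ms=40))  # True: sustains 24 fps
print(is_interactive(ttff_ms=95, frame_gen_ms=60))  # False: falls behind playback
```

The second call shows why time-to-first-frame and sustained generation rate are separate claims: 60 ms per frame exceeds the ~41.7 ms budget at 24 fps, so playback would stall even with a fast start.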

