By Michael Rubloff and Nitin Bhargava
From commercials to music videos to studio pipelines, radiance field representations like NeRFs and Gaussian Splatting have moved from research papers to high-profile productions in just a few years. These techniques take ordinary 2D images or video and reconstruct lifelike 3D scenes with accurate parallax and view-dependent lighting. For directors, this provides a virtual insurance policy for location shoots, cost-effective virtual sets, and the freedom to make creative decisions long after the shoot is over.
Here are three real-world examples showing how the technology has evolved and where it is heading.
For the McDonald’s “Lunar New Year” campaign, the crew shot the entire scene once inside a Chicago McDonald’s and then fed the frames into Luma AI’s NeRF pipeline. This was before Gaussian Splatting, so the flexibility came entirely from NeRFs. In post-production, they were able to fly a virtual camera anywhere, including through tables, out of a Happy Meal box, and even past reflections, all while maintaining the real-world parallax and lighting. This allowed the team to “film in post” and experiment with camera angles that would have been impossible on the day of the shoot.
Learn more about the McDonald’s Lunar New Year Campaign
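As a concrete illustration of what “filming in post” involves, below is a minimal, hypothetical sketch of scripting a virtual camera move to replay against an already-captured radiance field. It is not Luma AI’s actual pipeline: the coordinate values, keyframes, and look-at target are invented, and in practice the resulting poses would be handed to whatever NeRF or 3DGS renderer the production is using.

```python
# Hypothetical sketch: build a post-production camera path as a list of
# camera-to-world matrices that a radiance-field renderer could consume.
import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """4x4 camera-to-world matrix looking from `eye` toward `target`."""
    forward = target - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, up)
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)
    c2w = np.eye(4)
    c2w[:3, 0] = right
    c2w[:3, 1] = true_up
    c2w[:3, 2] = -forward  # OpenGL/NeRF convention: camera looks down -Z
    c2w[:3, 3] = eye
    return c2w

# Invented keyframes: start across the table, push in toward the tabletop.
keyframes = np.array([
    [2.0, 1.2, 3.0],
    [0.5, 0.8, 1.0],
    [0.0, 0.3, 0.1],
])
target = np.array([0.0, 0.3, 0.0])  # assumed point of interest in the scan

# Linearly blend 120 camera positions along the keyframes (5 s at 24 fps).
t = np.linspace(0, len(keyframes) - 1, 120)
idx = np.clip(t.astype(int), 0, len(keyframes) - 2)
frac = (t - idx)[:, None]
positions = keyframes[idx] * (1 - frac) + keyframes[idx + 1] * frac

camera_path = [look_at(p, target) for p in positions]
print(f"{len(camera_path)} poses ready for a NeRF/3DGS render pass")
```

In a real tool the keyframes would come from a 3D viewport rather than hard-coded arrays, but the underlying idea is the same: once the scene is reconstructed, the camera move is just data and can be revised indefinitely after the shoot.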
Director Jake Oleson has fully embraced radiance fields, using them as the primary medium for entire productions. He reconstructed over 700 individual NeRFs for RL Grime’s “Pour Your Heart Out” music video and created the entire Zayn Malik “Love Like This” video inside Luma AI; the latter received 10 million views in its first 24 hours. Oleson’s short film “Given Again,” shot with NeRFs, earned him runner-up at Runway ML’s AI Film Festival, and his work is now used as a reference by labels looking for new visual languages.
Learn more about Jake Oleson’s NeRF Music Videos
Taking the technology to the edge of what’s currently possible, Sydney’s Electric Lens Company (ELC) and bullet-time specialists The Splice Boys built a custom pipeline for high-end commercial VFX. They developed an end-to-end 3D Gaussian Splatting workflow to capture a series of “frozen time” food explosions for a national campaign called “We Know Good Food”. The Splice Boys’ 150-camera portable lightfield rig captured each moment in high-bit-depth RAW, preserving both diffuse and specular detail. Unlike photogrammetry’s flat, baked textures, 3DGS delivered the shimmer of wok-flung oil and the gloss of sauce droplets. A key breakthrough was the ability to crop and recombine splat models like volumetric Lego, allowing the team to remove a girder to reveal talent or composite fire and smoke from different takes into one seamless scene. The final spots ran in 4K at 60fps, retaining enough detail for multiple aspect ratios without compromise. The process proved that 3DGS can deliver photoreal commercial VFX with minimal cleanup and maximum creative freedom.
View “We Know Good Food”
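To make the “volumetric Lego” idea concrete, here is a minimal sketch of cropping and merging Gaussian splat models stored in the widely used 3DGS .ply layout, where each vertex record is one Gaussian (position, spherical-harmonic color, opacity, scale, rotation). This is not ELC’s proprietary tooling: the file names and bounding box are hypothetical, and a production pipeline would use far more careful masking than a single axis-aligned box.

```python
# Hypothetical sketch: treat 3DGS models as point sets that can be cropped
# and concatenated, assuming the standard 3DGS .ply export format.
import numpy as np
from plyfile import PlyData, PlyElement

def load_gaussians(path):
    """Read the per-Gaussian structured array from a 3DGS .ply file."""
    return PlyData.read(path)["vertex"].data

def crop_box(gaussians, mins, maxs):
    """Keep only Gaussians whose centers fall inside an axis-aligned box."""
    xyz = np.stack([gaussians["x"], gaussians["y"], gaussians["z"]], axis=1)
    keep = np.all((xyz >= np.asarray(mins)) & (xyz <= np.asarray(maxs)), axis=1)
    return gaussians[keep]

def save_gaussians(gaussians, path):
    """Write a cropped or merged set of Gaussians back out as .ply."""
    PlyData([PlyElement.describe(gaussians, "vertex")]).write(path)

# Invented example: keep only the food explosion from the hero take, then
# composite in fire-and-smoke splats captured on a separate pass.
hero = crop_box(load_gaussians("hero_take.ply"),
                mins=(-2.0, -1.0, -2.0), maxs=(2.0, 3.0, 2.0))
fire = load_gaussians("fire_element.ply")
save_gaussians(np.concatenate([hero, fire]), "composited_shot.ply")
```

Because every Gaussian is an independent record, subsets from different takes can be filtered and concatenated without re-optimizing the scene, which is what makes this style of volumetric compositing comparatively cheap.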
Radiance field capture is transitioning from a “special shot” to an everyday tool. The leap from the McDonald’s NeRFs to ELC’s custom-engineered 3DGS pipeline happened in just a couple of years. The next developments may include real-time lighting control, actor doubles, or sequential splat models that capture both space and time, allowing filmmakers to walk through moments as they unfold. As these pipelines become standardized, “fixing it in post” might one day simply mean loading the scan and re-shooting the camera path from your laptop.