New Products

Technicolor Genesis Virtual Production Platform

This advanced on-set previs system allows filmmakers to plan shots in real time with live and CG performers, saving expensive production hours.

Noah Kadner


I recently had the opportunity to visit the Technicolor Experience Center in Culver City, California, for a demo of the Genesis Virtual Production Platform, the ongoing product of years of collaboration between Technicolor and its visual effects subsidiary, Moving Picture Company (MPC). Genesis has some lofty ultimate goals, but at present it is used to give traditional production team members hands-on participation in the previsualization animation process via virtual reality. The evolving ecosystem of tools has already been used on visual effects-driven films, including Disney's recent live-action version of The Jungle Book (featured in AC May 2016).



The heart of Genesis' power is its real-time performance, achieved by leveraging the Unity game engine and high-powered NVIDIA GTX 1080 and RTX 2080 graphics processing units (GPUs). By combining real-time graphics with Vicon motion-capture cameras, real-time motion-control encoders for tripods and camera dollies, and HTC Vive virtual-reality headsets, Genesis allows a crew to step into previs animation and photograph, direct, production-design and act out scenes virtually. Genesis also tracks every camera move, prop/character/light position, scene and take in a secure cloud database. The system operates over Technicolor's own Technicolor Connectivity network, allowing crews in different locations to interact and collaborate simultaneously within the same virtual scene.
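To make the camera-encoder idea concrete, here is a minimal sketch (in Python, rather than the Unity scripting Genesis actually builds on) of how pan and tilt values from an encoded tripod head might drive a virtual camera once per rendered frame; the read_encoders() feed and its values are hypothetical.

```python
# A minimal sketch, assuming a hypothetical read_encoders() feed; this is not
# Technicolor's code. Pan/tilt values from an encoded tripod head drive a
# virtual camera's orientation once per rendered frame.
import math
import time
from dataclasses import dataclass

@dataclass
class VirtualCamera:
    pan_deg: float = 0.0   # rotation around the vertical axis
    tilt_deg: float = 0.0  # rotation around the horizontal axis

    def forward_vector(self):
        """Unit vector the camera looks along, derived from pan and tilt."""
        pan, tilt = math.radians(self.pan_deg), math.radians(self.tilt_deg)
        return (math.cos(tilt) * math.sin(pan),
                math.sin(tilt),
                math.cos(tilt) * math.cos(pan))

def read_encoders():
    """Stand-in for the tripod head's pan/tilt encoder readout, in degrees."""
    t = time.time()
    return 15.0 * math.sin(t), 5.0 * math.cos(t)

camera = VirtualCamera()
for _ in range(3):                        # stands in for the engine's render loop
    camera.pan_deg, camera.tilt_deg = read_encoders()
    print(camera.forward_vector())        # the engine would render from this view
    time.sleep(1 / 24)                    # roughly 24 frames per second
```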



According to MPC’s Head of Studio Kerry Shea, “We want to give directors, cinematographers and their crews the traditional production tools they’re accustomed to and involve them more creatively and directly in the pre-production process. Genesis shifts the major creative decisions much more forward into the production schedule.” 


The first portion of the Genesis demo showcased the pre-production workflow, featuring a virtual tech scout. Artists at MPC created an internal sequence to demonstrate Genesis' usage on a CG-intensive subject. The project, entitled Speakeasy, follows a sassy CG creature as he makes several rejected attempts to order a drink at a Prohibition-era bar. The entire set contained period props and characters, and could be navigated in real time using the Vive VR headsets.


Virtual Production Supervisor Joe Henderson showcases concept art developed by MPC's own Art Department: a vintage speakeasy bar.

According to Ryan Magid, MPC’s Virtual Art Department Supervisor, “My team had about a week to create all the needed 3D assets for the set and characters and then just a couple of hours to shoot the scene itself. We were able to create dozens of takes in that time all with the full, physical input of a director, cinematographer, mocap actor and an entire stage crew.”


Genesis captures not only the raw motion-capture and camera-support encoder data, but also the frame-by-frame positions of every prop, light, virtual character and camera within each take. The CG footage generated during live-action capture can then be continuously changed throughout the review. This gives the creative team a much higher degree of agency and interactivity in the previs process, which is usually the purview of CG animators.
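As a rough illustration of that per-take record, and not the actual Genesis data model, the following Python sketch logs a transform for every tracked object on every frame so a take can be replayed or adjusted afterward; the object names and values are invented for the example.

```python
# A minimal sketch of a frame-by-frame take record; names and values are invented
# and this is not the Genesis data model. Every tracked object's transform is
# logged on every frame so the take can be replayed or adjusted afterward.
from dataclasses import dataclass, field

@dataclass
class Transform:
    position: tuple   # (x, y, z) in scene units
    rotation: tuple   # (pan, tilt, roll) in degrees

@dataclass
class Take:
    name: str
    frames: list = field(default_factory=list)   # one dict of transforms per frame

    def record_frame(self, tracked: dict):
        """Snapshot every tracked object's transform for the current frame."""
        self.frames.append({obj: Transform(*state) for obj, state in tracked.items()})

    def object_path(self, obj: str):
        """Replay a single object's positions across the whole take."""
        return [f[obj].position for f in self.frames if obj in f]

take = Take("speakeasy_take_07")
take.record_frame({"camera_a": ((0.0, 1.6, -3.0), (0, 0, 0)),
                   "bar_stool": ((1.0, 0.0, 0.0), (0, 45, 0))})
take.record_frame({"camera_a": ((0.1, 1.6, -3.0), (2, 0, 0)),
                   "bar_stool": ((1.0, 0.0, 0.0), (0, 45, 0))})
print(take.object_path("camera_a"))   # [(0.0, 1.6, -3.0), (0.1, 1.6, -3.0)]
```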


With the pre-production demo complete, we moved on to a larger motion-capture stage ringed with Arri SkyPanel S30 LED fixtures and featuring a huge LED display, along with a Sony F55 camera tethered into the Genesis system and live-rendering the Speakeasy bar created in the previous setup. An actor stood by in a motion-capture suit tracked by Vicon sensors. The crew offered us the chance to operate a tripod that panned and tilted the taking camera within the virtual scene, or a set of hand wheels connected to a virtual dolly on track within the scene.


Roger van Helden utilizes a handheld virtual camera to capture CG character performance, puppeteered by motion capture performer Richard Dorton.

As the actor performed his lines of dialogue and mimed interactions with the other virtual patrons in the bar, we were able to operate the dolly and pan/tilt the camera. As we piloted the camera, we could see the results of our movements married with the live mocap actor in real-time animation on monitors throughout the set. In a matter of moments, we'd created a complete scene, ready for further development as previs animation. According to Shea, "Members of the crew can simultaneously and securely interact live with the scene from across town or the world. Even an executive can virtually participate from their office. All they need is the Genesis rack unit [which houses powerful PCs and the GPU hardware to drive the headset]."


Candace Nelson and Zilong Liu capture CG character performance along a virtual dolly using specially encoded film hardware.

Virtual Production Supervisor Joe Henderson demonstrates virtual tech scouting and production design with the use of VR hardware, such as Vive headsets and handheld controllers.

As part of the stage demo, we also witnessed a more in-development usage of Genesis. Joseph Henderson, MPC's Virtual Production Supervisor, explained, "One challenge we often have is how to get a CG character to interact with a live-action performer convincingly." In this setup, an actor was framed by the F55 with a live animation image of a forest rear-projected behind him onto the LED screen. As the camera operator panned, tilted and dollied the camera around, the image on the LED screen shifted perspective correspondingly.
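The perspective shift described above comes down to simple parallax geometry. The Python sketch below, using hypothetical distances and making no claim about the actual Genesis renderer, shows why elements at different virtual depths must be redrawn by different amounts across the flat LED screen as the taking camera moves.

```python
# A minimal sketch of the parallax geometry, with hypothetical distances; this is
# not the Genesis renderer. Elements "deeper" in the virtual scene must be redrawn
# farther across the flat LED screen as the taking camera moves sideways.

def redraw_offset(camera_shift: float, element_depth: float, screen_depth: float) -> float:
    """How far (in the same units as camera_shift) to redraw an element on the
    screen so that, from the camera's new position, it lines up with where it
    would sit if the scene were truly that deep (similar triangles along the
    camera-to-element ray; both depths measured from the camera's start line)."""
    return camera_shift * (1.0 - screen_depth / element_depth)

# The camera dollies 0.5 m to the right; the LED wall sits 3 m from the camera line.
for depth in (4.0, 20.0, 200.0):   # set pieces 4 m, 20 m and 200 m "into" the scene
    print(f"{depth:6.1f} m deep -> redraw {redraw_offset(0.5, depth, 3.0):.3f} m")
# Elements near the screen plane barely move, while distant ones follow nearly the
# full camera move; that shifting image is what sells the depth on a flat wall.
```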

Directors, cinematographers, and production designers have the unique ability to immerse themselves within a virtual scene and plan their shots ahead of time, placing virtual cameras and set dressing before stepping onto a real-world set.

The animation was editable in real-time as the team demonstrated changing props, weather and lighting on the screen, all on the fly. “The live image quality isn’t quite photoreal yet,” added Henderson. “As GPUs get more and more powerful, that divide is narrowing. We’re already working with NVIDIA and their RTX 2080 next-generation, real-time ray tracing graphics cards to see how much further we can advance the workflow.”


With the live motion capture completed, we were invited back to view completed takes in the Vive VR headsets. The headset presents high-resolution stereoscopic imagery and sound, creating a strong sense of immersion. I was able to re-experience the takes as they'd been initially acted and photographed on the stage. With the Vive handheld controllers, I could reset props and lights and check out the action from any camera angle. You feel like anything is possible in terms of creatively exploring the scene and how best to shoot it before burning through time and budget on set with a large crew.


"There are a lot of potential uses for Genesis," said Markus Ristich, Genesis Developer. "We recently did a show with a major sequence set on the London Stock Exchange. The director knew he couldn't close down the Exchange for long periods, not even for scouting. So we worked with the city government to get Lidar scans of the area and blueprints along with a lot of reference photos. Then, we recreated the entire Exchange in Unity and prepared it for VR previs in Genesis. The director was able to work out his shots in comfort and efficiently determine which camera mounts and logistics he'd need to get his shots in the real place. We were able to make a very challenging tech scout into something much more interactive."


As immersive technologies such as virtual reality, mixed reality and augmented reality continue to mature, we'll see new and innovative ways to integrate them into traditional filmmaking. Genesis demonstrates that VR and real-time game engines are empowering tools for directors, directors of photography, production designers and other key creative crew who are accustomed to more traditional filmmaking tools. And it's a lot of fun to play around on a movie set in virtual reality.






