The critical importance of adopting virtual production

Published in Miscellaneous

Nic Hatch, CEO, Ncam Technologies

As lockdown restrictions start to ease around the world, one key consequence of the Covid-19 pandemic is coming to light: its role as an accelerant. Certain trends within the industry were already well under way, such as the growth of remote, collaborative workflows in post production and remote contribution for live events. Covid-19 has moved these very rapidly from nice options to absolute necessities, and next on the list is virtual production.

While post production has been able to continue almost without pause during lockdown, production itself is only now starting to ramp up again. The rules and regulations surrounding it vary from country to country and change all the time in response to wider events, with concerns about a possible second wave never far from most people's thoughts. As a result, virtual production is very much a technology coming to fruition.

Some of the technology driving it is complex, but the concept is fairly simple: virtual production allows filmmakers to plan, imagine or complete any or all parts of their production in the digital arena. Whether that's blocking out scenes, pre-visualising shoots, motion capture, using LED screens for live compositing, or using realtime rendering to show a director and DOP how real and CG elements will interact, it's all virtual production. Some productions have effectively gone all in on the methodology, with the traditional film or episodic shoot now featuring very little in their workflows, while others use only certain elements of virtual production.

This is where Covid-19's role as an accelerant comes in. The driving forces behind the uptake of virtual production were always the powerful motors of flexibility and efficiency: effectively, you can do more, and do it cheaper, quicker and better, once you have committed to a virtual or semi-virtual production pipeline. Covid-19 has added further impetus as productions suddenly realise they need to keep their footprint low, reduce location work wherever they can, and ensure that stages run safely as well as cost-effectively.

And what is even more exciting is that virtual production is getting better all the time. Epic Games has already been demonstrating next year's Unreal Engine 5 release, which promises greater realtime capabilities and even better image quality, aiming to make the real and virtual worlds indistinguishable. Already we can work much faster than before: using Ncam, directors can see exactly how their actors are positioned in the virtual space and quickly iterate and adjust scenes.

We've been doing this for a while. On 'Solo: A Star Wars Story' (2018), Nviz (then Nvizage) used Ncam to visualise VFX shots through the lens and show actors where graphic elements were coming from, speeding up both production and post, and helping the animation team reposition objects on the fly as director Ron Howard iterated scenes. On 'The Nutcracker and the Four Realms' (2018), Ncam was used to place markers on the set to guide where in a complex VFX landscape an actor was pointing. "Without the Ncam system, we would have had to guesstimate where Philip needed to point and then retrofit our world to match," commented VFX supervisor Max Wood. "Not only could we see the shots in real time, but we saved a large amount of post-production time in not having to adjust our world for one shot."

This is just the start, though. Take the LED screen technique, used at its apogee by The Mandalorian, which filmed in a 21ft by 75ft set with specially constructed LED walls and ceiling displaying backdrops created in Unreal Engine. This offers a huge degree of flexibility: we are no longer looking through the lens or at a realtime composite monitor; we are shooting live actors against the CG backdrop itself. Very few have a Disney-level budget to scale things up like that, but smaller, more conventional LED setups can still deliver the same level of quality on smaller stages. That allows productions to eliminate a good percentage (in some cases probably even all) of their location work, bringing a host of benefits: saving transport costs, minimising environmental impact, speeding up rig/derig and shooting time, and even simply making things feasible during the pandemic.

The next generation of software will make the process even more instinctive and intuitive, and good enough that what is captured in camera need not go through a further rerendering pass in post.

Of course, the technique doesn't have to be reserved for genre programming either. While it's always tempting to assume CG means space or fantasy, the current crop of game engines such as Unreal can create convincing sets of pretty much anything, indoor or outdoor. Using photogrammetry, real-life locations can be captured and recreated in exquisite detail, and the light emitted by the LED screens can even be used to partially light the set once shooting starts.

From an apartment set for a sitcom to an Edwardian London street, this is a production technique that is going to dramatically change the economics of producing for film and television over the coming years.

In the live space, virtual sets are slowly starting to dominate live television, as the economic argument they make has been matched by the quality of the end product. We first showed Unreal Engine working with our data at NAB 2016, and now we see it used on a daily basis by Tier One broadcasters to create virtual sets around the world. We have one customer in Buenos Aires producing eight different programmes a day, all with different sets, from the same physical space. Switchover takes place in the advert break at the top of the hour. 

In the UK, Sky Sports uses our equipment to provide the same virtual studio environment at over 90 different venues for its Premier League coverage. Whatever the size and shape of the physical space at each venue, the virtual studio is always consistent, and the whole thing can be set up in under an hour.

Our technology was also used by Sony Innovation Studios to create a virtual version of the ‘Shark Tank’ set for Sony Pictures Television when pressure at the lot in Culver City meant that there wasn't enough space for its usual two stages. The virtual set was used for filming around 100 exit interviews and was indistinguishable from the real world one to the extent that crew members in the production truck often forgot they weren’t monitoring a real set.

The same arguments that have driven the adoption of virtual sets (flexibility, cost savings, creativity) are now being urgently examined for film and television production. They are compelling enough that the shift was always going to happen; as with remote post production workflows, Covid-19 has simply brought the necessity forward by a couple of years.
