M+E Connections

Smart Content Summit: The Not-So-Long Strange Trip of Virtual Production

Virtual production has come a long way in a relatively short amount of time, and there is more learning and work to be done, according to industry experts who spoke March 10 during “Two Years Later: Making Virtual a Reality,” one of a series of Media & Entertainment Data Center Alliance (MEDCA) presentations held in conjunction with the Smart Content Summit in Los Angeles.

Just before the pandemic started, the best and brightest minds in virtual production were spending weeks crafting just a few minutes of virtual production content. The complex hardware infrastructure required, combined with the advanced workflows needed to make the hardware and software communicate in real time, represented a mind-boggling leap into the future.

Fast-forward to where the industry is today from a cloud, data and infrastructure perspective, and efficient virtual productions that use remote tools and workforces are becoming commonplace, driving new tools for collaboration, connectivity and increased throughput regardless of the use case or environment.

The MEDCA session looked at where the industry is heading as challenges have turned into victories and virtual production continues to proliferate in the M&E sector.

Part of the key to success is “setting the expectations right,” according to Sinan Al-Rubaye, chief experience officer at Los Angeles software company ICVR, who said that is one of the lessons he has learned in the past two years.

“A lot of VFX houses used to create very basic-level assets to block scenes during previz but then, once they move to post, they just throw them away and start creating new assets from scratch,” he said, noting “that’s very inefficient – a waste of time and money.”

However, he said, “one of the things that have improved a lot in the past couple of years is [that] the assets that you create in pre-production to use in previz [now] actually travel all the way to post” production. The same asset is iterated on throughout, so no time is wasted creating assets that will be thrown away, he said. Only the final 25% of the work now needs to be done in post-production, he noted, stressing how much this has increased project efficiency.

Two years ago, shooting 4-5 minutes of content on an LED wall took 2-3 weeks, he said. In comparison, “last summer, we shot 51 minutes of content in one single day,” he noted, adding that this improvement came in a “span of less than two years.” The “pace of efficiency” is moving fast and people are adopting the technology, but they are still “learning and improving,” he added.

“The key is flexibility and openness” because we don’t have all the answers and [don’t] know how this will all pan out, according to Victoria Bousis, founder and creative director at technology studio UME, who is also a director and participated via video from Athens, Greece.

“My primary concern” as a director is to tell powerful stories with strong characters in fantastic worlds, she noted. The audience won’t care how many polygons are in a particular shot; they care about the story, the characters and their journey, she said, adding she embraces technology to help her tell stories that would otherwise be impossible to tell.

“We’re still pioneering this method [and], yes, there’s a huge learning curve,” she conceded. “But at the end of the day when you’re put in that situation where you’re seeing your movie before you’re shooting your movie, at some point in that cycle, there’s this confidence you’re building” as you realise it is going to work, she explained.

As a bonus, the same assets an organisation is using for a movie can now also be used for a game version of the film or a virtual reality video based on it, she said.

Many systems are engaging and interacting as part of the complex virtual production process, so one important question is: how do you sort through all of it?

“It comes down to workflows and pipelines and metadata,” replied Ryan L’Italien, director of solutions at software developer Perforce. “There’s so much data that’s being captured: 600 megabytes per second,” he said, noting that the data comes from many devices, including several computers and cameras capturing footage.

“How do you manage all of that data? You put it into one place because then you can iterate on it, work on it and share it amongst the teams” of your organisation, he explained.

“Hollywood is kind of embracing the software engineering mindset” now, according to L’Italien. In software engineering, the strategy is to “shift your test left, fail fast,” he said. Using that strategy with pre-visualisation and technical visualisation saves money and time, he noted.

One key is to complete as much as you can before production so that, “when you’re shooting, there’s no questions,” he added.

Guy Finley, MESA president and CEO, moderated the panel.


The 2022 Smart Content Summit event was held in conjunction with the EIDR Annual Participant Meeting (EIDR APM), and was presented by Whip Media. The event was produced by MESA, in association with the Smart Content Council and EIDR, with sponsorship by BeBanjo, Signiant, Qumulo, Adio, Alteon, Digital Nirvana, Slalom and Rightsline.