M+E Daily

How Technology is Helping the M&E Industry Resume Production Safely

As the media and entertainment industry continues to move forward cautiously with the resumption of production, revolutionary solutions for virtualizing environments, covering both onscreen talent and on-set crews and protocols, are playing a starring role.

Three technologists who developed a few such solutions appeared on the virtual main stage of the Oct. 20 Media & Entertainment Day event to give attendees an inside look at the groundbreaking Entertainment Technology Center at the University of Southern California (ETC @ USC) project “Ripple Effect,” an R&D short film testing virtual and remote production.

While the M&E industry reels from the impact of the COVID-19 pandemic across production workflows and environments, the “Ripple Effect” project should resonate with content creators who are maintaining the balance of safety, speed and security in production.

Discussing the project during the M&E Day session “The Ripple Effect of The Ripple Effect,” Erik Weaver, director of adaptive production and special projects at ETC @ USC, noted that major parts of it address COVID and SafetyViz.

The goal was to create a photorealistic, 3D environment that allowed a film to be shot safely, with those working on the project able to maintain social distancing from one another.

Some of the standards that needed to be implemented were complex, Weaver pointed out, explaining: “There’s roughly 309 control practices across 18 major organizations from states to the different unions to different countries in how you’re allowed to go back to work – what do you have to do to stay safe. So we started by compiling everything and comparing and contrasting against other people’s work… so that we could build something that best fit all the primary groups’ needs.”

Those working on the project partnered with Virtual Wonders and DigitalFilm Tree and created a full LiDAR scan of both sets being used on the project, he told viewers, adding: “We then imported [it] into Unreal Engine. So we were able to then create a visual that is accurate in distance to everything on the stage itself. And from there we created a game engine manifestation that walked a person through these environments.”
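As a rough illustration of that distance-accuracy point, the sketch below converts LiDAR measurements taken in meters into Unreal Engine’s default centimeter units so the virtual stage matches the physical one. The point data and function name are hypothetical, not taken from the project’s pipeline.

```python
# Minimal sketch: converting LiDAR stage measurements (meters) into
# Unreal Engine units (centimeters) so virtual distances match the
# physical stage. The sample points below are illustrative only.

METERS_TO_UNREAL_UNITS = 100.0  # Unreal's default world unit is 1 cm

def to_unreal_units(points_m):
    """Convert (x, y, z) points measured in meters to Unreal units."""
    return [(x * METERS_TO_UNREAL_UNITS,
             y * METERS_TO_UNREAL_UNITS,
             z * METERS_TO_UNREAL_UNITS) for x, y, z in points_m]

# Example: two marks on the stage floor 2 m apart stay 200 units apart
stage_marks_m = [(0.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(to_unreal_units(stage_marks_m))
```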

While “you’re planning out some of these things, you can either put individuals in or pods of people because you’ll find that if you have a camera group, they all need to work in a pod,” he said, adding: “You can’t really avoid that with certain different groups.”

The system being used showed when certain configurations wouldn’t work – where conflicts were being created as they tried to maintain safety, he noted, adding: “It helped to completely understand better what was possible and what wasn’t.” And people would have that “aha moment” when they realized something wasn’t going to work, he said.
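A hedged sketch of the kind of conflict check such a system might perform appears below: it flags any two people from different pods who stand closer than an assumed two-meter threshold. The crew names, positions and threshold are illustrative, not details from the production.

```python
import math
from itertools import combinations

# Hypothetical crew placements on the virtual stage, in meters.
# Members of the same pod (e.g., the camera group) are allowed to
# work closer together; people in different pods are not.
crew = [
    {"name": "camera_op", "pod": "camera", "pos": (0.0, 0.0)},
    {"name": "focus_puller", "pod": "camera", "pos": (0.8, 0.0)},
    {"name": "director", "pod": "directing", "pos": (2.5, 0.0)},
]

MIN_DISTANCE_M = 2.0  # assumed social-distancing threshold (~6 feet)

def distance(a, b):
    return math.dist(a["pos"], b["pos"])

def find_conflicts(crew):
    """Flag any two people from different pods standing too close."""
    return [(a["name"], b["name"], round(distance(a, b), 2))
            for a, b in combinations(crew, 2)
            if a["pod"] != b["pod"] and distance(a, b) < MIN_DISTANCE_M]

print(find_conflicts(crew))  # [('focus_puller', 'director', 1.7)]
```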

“It ended up being kind of eye-opening because, on the larger stage, we had a slightly larger team – well, quite a bit larger – and then when we went over to” the other stage, “we had to cut that down to 10 people,” and they could immediately see everything was not going to fit, he pointed out.

Meanwhile, the same previsualization techniques that Kathryn Brillhart, executive producer and director of virtual production at ETC @ USC, had been using for visual effects are being used on this project for safety, she noted.

In fact, several of the technologies being used on the project weren’t created overnight, Greg Ciaccio, executive producer and head of post and production technology at ETC @ USC, told viewers. “These things have kind of been happening over time. But we needed to take it a step further and ensure that nobody was too close” to each other, he explained.

A variety of hardware and software is helping, including Teradek’s suite of tools, Ciaccio said. A consumer 65-inch display, meanwhile, is being used in conjunction with a pro reference monitor that only a couple of people working on the project can see while shooting, he noted. The large size of the consumer display, combined with its high quality, has allowed all of the key people on the production to monitor what is happening on set without getting too close, he said.

When production on the project was delayed and one of the film’s two directors needed to attend a wedding out of town, she was still able to do her work. “We outfitted our director on the East Coast with a Teradek Core feed and she had basically about two and a half seconds – it’s since improved [to] frames now – where she can basically interact and see through the lens as well as the witness camera,” which provides an overall view of the set, Ciaccio noted. She was, therefore, “able to give direction” despite being many miles away, he said, adding: “It’s not exactly like she was there. But pretty much.”
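To put those latency figures in perspective, the short sketch below converts round-trip delay into frames at an assumed 24 fps; the frame rate and the sample values are illustrative, not production specifics.

```python
# Quick sketch: expressing remote-monitoring latency in frames,
# assuming a 24 fps production (the frame rate is an assumption here).
FPS = 24

def latency_in_frames(latency_seconds, fps=FPS):
    return latency_seconds * fps

print(latency_in_frames(2.5))    # 60.0 frames -- the early round-trip delay
print(latency_in_frames(0.125))  # 3.0 frames  -- a "few frames" of latency
```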

While “latency is super-important” and continues to be a challenge when working remotely, it will get a lot better, Ciaccio predicted, adding 5G and Wi-Fi 6 are “all going to help.”

Asked what film production will look like going forward as a result of these developments, he said, with a laugh: “We thought things changed when we went from standard-def to high-def, right? That’s a drop in the bucket now. It’s nothing.” Technology including Pixar’s Universal Scene Description and Nvidia’s Omniverse platform “will allow a whole crew to work virtually” now, he added.
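For readers unfamiliar with Universal Scene Description, the following is a minimal sketch of authoring a shared set description with USD’s Python bindings (pxr); the file and prim names are hypothetical, and the project’s actual pipeline may differ.

```python
# Minimal sketch of authoring a shared scene with Pixar's USD Python
# bindings (pxr); the stage path and prim names are hypothetical.
from pxr import Usd, UsdGeom

stage = Usd.Stage.CreateNew("virtual_set.usda")
set_root = UsdGeom.Xform.Define(stage, "/VirtualSet")
UsdGeom.Cube.Define(stage, "/VirtualSet/StageFloor")
stage.SetDefaultPrim(set_root.GetPrim())
stage.GetRootLayer().Save()
# Departments can layer their own edits over this file non-destructively,
# which is the collaboration model USD (and Omniverse) is built around.
```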

M&E Day was sponsored by IBM Security, Microsoft Azure, SHIFT, Akamai, Cartesian, Chesapeake Systems, ContentArmor, Convergent Risks, Deluxe, Digital Nirvana, edgescan, EIDR, PK, Richey May Technology Solutions, STEGA, Synamedia and Signiant and was produced by MESA, in cooperation with NAB Show New York, and in association with the Content Delivery & Security Association (CDSA) and the Hollywood IT Society (HITS).