Collaborating Remotely for VFX, Virtual Production: Stargate and Zoic

Virtual production has grown exponentially during the last two years, becoming a safe way to keep productions moving and on schedule while also keeping people apart. The travel costs saved have also been significant. No getting on planes, no being away from home for weeks at a time.

That’s just one of the new technology’s many upsides. While the volume is reinventing film and television workflows, the crews brought back together on set are much leaner than they were before. More creatives and executives can simply dial in via any number of stable apps and platforms. It’s a streamlined set, enabled by remote workflows, that lets every virtual production flex its intrinsic scalability: it can go anywhere, and go small or grand, depending on the budget.

So while virtual production does take a team, it also allows most of that team to work and collaborate from anywhere. And even with COVID restrictions loosening, not having to send an entire crew out to shoot on the Atlantic Ocean while the rest of the team works remotely is a win-win for most. It’s why the majority of studios will continue to invest heavily in technology and assets to support these shifts in workflow in the near term. The initial investment may seem steep, particularly for smaller shops, but it has become clear that the transition to virtual production will drive costs down dramatically over time.

For this piece, we spoke to two well-established VFX studios — Stargate Studios and Zoic Studios — that have both embraced virtual production.

Stargate Studios: At Sea in the Volume
Our Flag Means Death, Taika Waititi’s latest series for HBO Max, takes place largely at sea aboard a pirate ship, yet it was shot mostly in a virtual production volume on the Warner Bros. lot in Hollywood.

The series, created and written by David Jenkins and just greenlit for a second season, also stars Flight of the Conchords alum Rhys Darby as an eccentric pirate with a heart of gold. Waititi, who executive produces and directed an episode, gives his Blackbeard a gentler, unexpected backstory.

For Stargate Studios’ founder Sam Nicholson, ASC, bringing the ocean to the soundstage in real time was just another day in the studio. Stargate has built its VFX business on being an early adopter of new workflows and technology. “We have been pursuing virtual production for many years in our VFX work,” he says. Along the way, the company built what it calls the “Virtual Backlot,” an expansive library of 2D and 3D assets of locations from around the world. “This allows us to create photoreal virtual environments relatively easily, wherever we are.”

He traces the shift to virtual shoots back to the aftermath of 9/11, when actors couldn’t get on planes. “All of a sudden, ER was shooting in Chicago and CSI was shooting in Las Vegas. More shows started shooting in New York. It was the first great location challenge the industry faced.”

Stargate began investigating how it could scale its greenscreen work by capturing real locations and turning them into greenscreen assets at Warner Bros. “NBC gravitated to the concept, then ABC did. Sometimes they even said it looked better than what they were shooting on location.”

Flash forward to the COVID-19 pandemic, and the perfect storm produced another grand shift. “At the end of the day, I don’t think people really like shooting on greenscreen,” says Nicholson. “It’s way less realistic than shooting on location. Plus, greenscreen workflows are quite complex and slow, with a considerable cost per shot. But when shooting on an LED volume, the whole process suddenly becomes easier. We bring all of the traditional visual effects and 3D CGI into this new medium and achieve much more in real time.”

Nicholson has always been interested in real-time visual effects. “In the very early days of the LED process, we wrote our own programming for off-axis projection,” he says. “This was about a year before Epic Games released the Unreal Engine. We’ve now fully integrated Unreal into our ThruView process to control virtual environments as well as kinetic lighting and camera-tracking on-set.”
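
Stargate hasn’t published its ThruView code, but the off-axis projection Nicholson describes is a well-documented technique: given the wall’s corner positions and the tracked camera position, you build an asymmetric view frustum so the image on the screen reads correctly from the camera’s point of view. The sketch below is a minimal, generic Python/numpy illustration of that idea, following Kooima’s “generalized perspective projection”; the function name, corner layout and near/far values are illustrative assumptions, not Stargate’s implementation.

```python
# Illustrative off-axis ("asymmetric frustum") projection for a flat screen,
# following Kooima's generalized perspective projection. Hypothetical names;
# this is not Stargate's ThruView code.
import numpy as np

def normalize(v):
    return v / np.linalg.norm(v)

def off_axis_projection(pa, pb, pc, pe, near, far):
    """pa, pb, pc: lower-left, lower-right and upper-left screen corners (world space).
    pe: tracked camera position. Returns a 4x4 OpenGL-style projection * view matrix."""
    vr = normalize(pb - pa)              # screen-right axis
    vu = normalize(pc - pa)              # screen-up axis
    vn = normalize(np.cross(vr, vu))     # screen normal, pointing toward the camera

    va, vb, vc = pa - pe, pb - pe, pc - pe   # camera-to-corner vectors
    d = -np.dot(va, vn)                      # distance from camera to the screen plane

    # Frustum extents projected onto the near plane (this is the "off-axis" part)
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard glFrustum-style asymmetric projection
    P = np.array([
        [2 * near / (r - l), 0.0,                (r + l) / (r - l),            0.0],
        [0.0,                2 * near / (t - b), (t + b) / (t - b),            0.0],
        [0.0,                0.0,               -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0,                0.0,               -1.0,                          0.0],
    ])

    # Rotate the world into the screen's basis, then move the camera to the origin
    Mt = np.eye(4)
    Mt[0, :3], Mt[1, :3], Mt[2, :3] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -pe
    return P @ Mt @ T

# Example: a 10 m x 5 m wall section with the tracked camera 4 m back and off-center
pa, pb, pc = np.array([-5.0, 0.0, 0.0]), np.array([5.0, 0.0, 0.0]), np.array([-5.0, 5.0, 0.0])
M = off_axis_projection(pa, pb, pc, pe=np.array([1.0, 2.0, 4.0]), near=0.1, far=100.0)
```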

As VFX shot tallies grew exponentially over the years, he noticed a pattern. “When you look at traditional visual effects shows, with perhaps a thousand shots, and you are doing multiple shows, we were delivering almost 16,000 greenscreen shots per year throughout our global network,” he says. “That’s when I realized that about 50% to 75% of these shots are pretty simple, usually just two-layer composites for dialogue coverage. The other 10% to 20% are really difficult shots, and about 30% are somewhere in between. If you can finally track in-camera with a new tool like virtual production, which is what ThruView is all about, and bring in better lighting, you can now have an on-set, integrated system that allows you to look through the screen rather than at it.”

Mark Costa, HBO Max’s new VP of production, was all-in with Stargate and the Our Flag Means Death filmmaking team to create the topsy-turvy ship on the high seas that is central to the show’s storyline. “We’d worked with Mark on a greenscreen show before, and it worked great,” Nicholson recalls. “Mark came back to us and said, ‘I might have this thing that’s going to be really challenging. We haven’t greenlit it yet, but could you do dry for wet? Could you shoot on a stage with an ocean for 14 weeks for a series and make it work?’ So we went about solving that problem.”

Preproduction is an important part of any virtual production, and Nicholson’s team shot a series of tests in Stargate’s studios combining photographic water, 3D ships and LED screens. “Going into production, we knew we didn’t have time to create 3D ships or 3D water,” he says. “So it had to be the real world.” The results blew them away. Footage of ocean water captured in Los Angeles looked so compellingly realistic in their tests that they headed to Puerto Rico for a proper shoot.

“Because it’s a comedy, and because it’s Taika, we wanted it to be full-resolution on the 160-foot screen, edge to edge,” he says. “Taika creates a wonderful vibe on set, and things get funnier and funnier as the takes progress. The actors are working off each other, and we didn’t want it to feel like a traditional greenscreen visual effects show, which can sometimes be like shooting with a straitjacket.”

Using 360-degree rigs of Blackmagic URSA Mini Pro 12K cameras stabilized on a boat, the team captured camera-original RAW footage of the ocean that was later subsampled to 20K horizontal across the 160-foot LED wall. “That’s pretty extreme data capture,” says Nicholson. “We brought all that footage back here and started testing it on the wall.”

As the set on the soundstage was built out, Stargate added more greenscreen beyond the LED wall “so there is literally no place on set you couldn’t shoot,” he says. “We tied the lighting in with our ThruView system so the lights automatically adjust to the plate, which lets you move very, very fast.” Everything is DMX-controllable. “You’ve got five cameras or nine cameras all completely rock-steady on a boat that can go up and down. But the horizon has to stay in the same place.”
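
Stargate’s lighting integration is proprietary, but the general pattern Nicholson describes — sample the plate, then push fixture levels out over DMX — can be sketched with the open Art-Net protocol and little more than Python’s standard library (plus a numpy frame). The node address, universe and single-RGB-fixture patch below are purely illustrative assumptions, not the production’s actual setup.

```python
# Sketch of pushing plate-driven lighting levels over Art-Net (DMX-over-UDP).
# The node address, universe and single RGB-fixture patch are assumptions for
# illustration only; this is not Stargate's ThruView lighting code.
import socket
import struct

ARTNET_NODE = ("192.168.1.50", 6454)   # hypothetical Art-Net node on the lighting network
UNIVERSE = 0                           # hypothetical DMX universe

def artdmx_packet(dmx_values, universe=0, sequence=0):
    """Build an ArtDmx packet (opcode 0x5000) carrying up to 512 DMX channels."""
    data = bytes(dmx_values[:512])
    if len(data) % 2:                          # DMX payload length must be even
        data += b"\x00"
    pkt = b"Art-Net\x00"                       # 8-byte protocol ID
    pkt += struct.pack("<H", 0x5000)           # OpCode: ArtDmx, low byte first
    pkt += struct.pack(">H", 14)               # protocol version 14, high byte first
    pkt += struct.pack("BB", sequence, 0)      # sequence counter, physical port
    pkt += struct.pack("<H", universe)         # SubUni + Net (15-bit port-address)
    pkt += struct.pack(">H", len(data))        # payload length, high byte first
    return pkt + data

def average_plate_color(frame):
    """frame: HxWx3 uint8 array of the ocean plate; returns its mean (r, g, b)."""
    return [int(c) for c in frame.reshape(-1, 3).mean(axis=0)]

def push_lighting(sock, frame, sequence=1):
    """Map the plate's average color onto DMX channels 1-3 and send one ArtDmx frame."""
    r, g, b = average_plate_color(frame)
    dmx = [0] * 512
    dmx[0:3] = [r, g, b]                       # assumed patch: one RGB fixture on channels 1-3
    sock.sendto(artdmx_packet(dmx, UNIVERSE, sequence), ARTNET_NODE)

# Usage, given an already-decoded plate frame as a numpy array:
# sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# push_lighting(sock, frame)
```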

The final-composite ocean footage is run through Unreal Engine to give the filmmakers precise camera-tracking control and allow for off-axis projection on the curved LED wall. For Our Flag Means Death, the wall has to rock right along with the gimbal-controlled boat. “Everything is adjustable on the fly,” he says, and fed through multiple on-set DaVinci Resolve systems linked simultaneously to drive the 20K wall. “That’s really part of the ThruView code that lets you have full resolution on the wall with all the bells and whistles of any color-timing session,” Nicholson says. “If you want to put a grade on the wall, or you want to defocus it, you can literally be compositing live. It’s definitely not a fixed asset. This puts the control back into the hands of the DP and DIT; you’re doing what you would normally do in a great color-timing session, but you’re doing it really fast on a 160-foot wall with an entire crew.”
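
Conceptually, “putting a grade on the wall” amounts to applying a color transform to every frame on its way to the LED processors. As a rough, generic illustration, here is a CDL-style slope/offset/power grade with optional saturation in Python/numpy; it is only a sketch of the idea, not ThruView or DaVinci Resolve code, and the parameter values are hypothetical.

```python
# Generic sketch of a live, CDL-style grade applied to frames headed for the wall.
# Parameter values are hypothetical; this is not ThruView or DaVinci Resolve code.
import numpy as np

def apply_cdl(frame, slope=(1.0, 1.0, 1.0), offset=(0.0, 0.0, 0.0), power=(1.0, 1.0, 1.0)):
    """frame: float HxWx3 image in the 0-1 range.
    Per channel: out = clamp(in * slope + offset) ** power."""
    rgb = frame * np.asarray(slope) + np.asarray(offset)
    rgb = np.clip(rgb, 0.0, 1.0)
    return rgb ** np.asarray(power)

def apply_saturation(frame, sat=1.0):
    """Blend each pixel toward its Rec. 709 luma to raise or lower saturation."""
    luma = frame @ np.array([0.2126, 0.7152, 0.0722])
    return np.clip(luma[..., None] + sat * (frame - luma[..., None]), 0.0, 1.0)

# A per-take tweak applied to every frame on its way to the LED processors:
# graded = apply_saturation(apply_cdl(frame, slope=(1.05, 1.0, 0.95), offset=(0.0, 0.0, 0.02)), sat=0.9)
```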

That’s the goal, he adds. “We want to make this level of real-time technology invisible” and bring traditional VFX onto a live-action stage that easily flows into any editorial and post pipeline. “The more invisible the technology is, the more creative virtual production will become. I’m really happy to say we never had a single minute of downtime throughout the 14-week shoot on Our Flag Means Death. The filmmakers on-set can now be completely in the moment and do what they do best.”

Zoic Studios: A Front-Row Seat at the Revolution
Julien Brami, a VFX supervisor, creative director and Flame artist at Zoic Studios, has always been interested in real-time filmmaking. “I’m attracted to everything and anything real-time,” he says. After joining Zoic’s Culver City office 7 years ago, he began experimenting with new workflows in Flame. “Flame was one of the best places to start because you had the ability to do real-time comp timelines combined with 3D. But when I started seeing software like Unity and Epic’s Unreal, I knew I should look into it.” At the time, his best friend, Jérôme Platteaux, a VFX supervisor at ILM, added fuel to the fire. “He told me, ‘You know, we’ve got a secret project, so I can’t tell you more, but I can tell you that you should really start learning Unreal right now.’” Brami even created a game he could play on his phone so he could better understand how the engine scaled and what he needed to do to make it work.

Although Zoic’s core team was initially underwhelmed by Unreal’s image quality, that all changed when raytracing was added to the game engine’s feature set. “I knew it was the future, and I knew I needed to be fully in it,” he says. All became clear for Brami and his colleagues when Disney released The Mandalorian, the secret project Platteaux had been working on at ILM, which was shot largely in an LED volume. “Then they were like, ‘Oh wow, I get it. This is the future.’”

Launching its Real Time Group in 2021, Zoic joined the industrywide race to go all-in with real-time virtual production. “We knew we needed to embrace it, to recruit people, so that’s when we started discussing opening a real-time department to see where we could feed Unreal. We didn’t want to use it just for production or previz, either, but literally for everything.”

An Epic MegaGrant helped move the needle, giving a “ton of artists the time not just to learn how to use Unreal but to see how far we can push it, where it breaks, and learn from that.” Through trial and error, and with several borrowed LED screens that Zoic co-founder and executive creative director Chris Jones brokered from a tech partner, Brami and a handful of others tried to recreate what they saw on The Mandalorian. “I was lucky because I knew I could ask Jérôme questions as we went, and that helped us avoid a lot of unnecessary mistakes.” Their big discovery? It wasn’t that much different from existing VFX pipelines. “It was mind-blowing because we realized that even with not that much knowledge and only eight or nine months of using Unreal, we could still pull off a shoot that complex.”

Before the pandemic hit, Brami had already been working off an HP Teradici box, “essentially remotely, but still working together in the same office,” he says. It was an easy transition to go fully remote from home, thanks to Zoic CTO Saker Klippsten. “Kudos to Saker. In less than a week after lockdown, we were all working perfectly from home with zero lag.” What was initially challenging was finding a way to communicate effectively in Microsoft Teams. “You couldn’t just interact with someone normally or walk to their desk and discuss a shot,” says Brami. “So we started introducing dailies. We didn’t want to overwhelm them, so we also created a general chat room where people could just hang out.”

Going remote, says Brami, has leveled the playing field for many artists. Location is no longer an issue. “It’s brought way more people together, so to me, it opened us up to a world of talent,” he says. “When we collaborate with other companies or agencies, they feel like they belong to the same company. There are no more walls.”

When Zoic worked with the Carolina Panthers on a recent Unreal-driven, augmented reality in-stadium presentation, Brami says it was as if the entire team was in the same room. “There’s no more posting; we just share our screens,” he says. “It’s just so much more efficient.” Seen by millions on social media, the larger-than-life panther was brought to life by Zoic and Epic with help from Quince Imaging, Stype, Field Day Sound and Pixotope.

Although Zoic still has an indefinite work-from-home policy in place, Brami can tap into the company’s servers from home or head into Culver City for a session. “I still go on-set too,” he says. “And I like being in the Flame bay, which we still offer for the client as well. So if you want to do finishing in the bay, that’s an option. We’re all vaccinated.”

For him, virtual production is just another workflow revolution that makes that collaboration possible. “It’s like when we went from film to digital. That brought so much creative energy back into the process, and this is doing exactly the same thing. I’m coming up on my 25th year working in this industry,” Brami says. “It’s never been about the project. For me, what’s important is the people I work with. I love collaboration. It’s why I’m still crazy-passionate about this industry.”
