Virtual Production at ARRI Stage London

This text was produced through a conversation between ARRI's Business Development Director for Global Solutions, David Levy, and the editorial staff of this magazine. The conversation took the form of an interview in which David shared his experience, insights, and perspectives on virtual production.


How was the virtual production stage born at ARRI London? How did you develop it over time?

Before the pandemic, ARRI supported several shows using virtual production techniques and had identified the technology’s potential importance. The pause in production due to the pandemic proved to be an important catalyst for the adoption of the technology. As a result, ARRI was invited to provide the technical design, and supervise the construction and installation of a virtual production stage at Studio Babelsberg, for the production company Dark Bay.

In parallel, ARRI decided to convert an underutilised warehouse space at its UK site, with the ambition of supporting our partners globally who want to explore virtual production. We also designed the stage for product development and as a resource for our R&D. As a manufacturer, it is critical that our hardware and software complement this form of content production. The stage required significant investment, which meant it couldn’t be used solely as a proof of concept or for product development; it also had to pay for itself. So, when planning the stage, it was essential to ensure it could function as a commercially viable business in its own right. With the support of ARRI Rental and our operating partner Creative Technology, I’m pleased to say it’s going well.



What is the technology located in your studio?

The studio has an impressive technology stack, which includes an extensive inventory of hardware and software solutions from across the media and entertainment industries. From the gaming world, we use Epic Games’ flagship real-time 3D creation tool, Unreal Engine, with its nDisplay system as our principal 3D playback system. VYV’s Photon, which comes from the live events world, is used as our 2D playback system. Then we have the LED video walls, more commonly found in live events, broadcast, and digital signage. There are tracking systems from our friends in motion capture and broadcast; video synchronisation, processing, and signal distribution systems; stage automation systems; IP lighting data networks; and, of course, camera and lighting solutions from our own portfolio of products. The list goes on.

Because of this complexity, and our firm belief that collaboration is key to delivering such studios, we partnered with Creative Technology to design, deliver, and jointly operate the stage. Through this collaboration, leveraging our respective expertise and experience, we have been able to deliver a state-of-the-art virtual production studio.

The main ‘in-vision’ wall is 30 m x 5 m and constructed from ROE Ruby 2.3 panels, while the ceiling is 10 m x 10 m and uses ROE Black Quartz 4.6. Both are driven by Megapixel VR Helios 8K processing. The back wall is 18 m x 4.2 m, and there are two moveable, tiltable side walls, each 3 m x 4.2 m; all of these use ROE Carbon CB 5.7 panels with ROE Evision 4K processing. As mentioned, our playback system is either native nDisplay for 3D tracked projects or VYV’s Photon for 2D playback, the latter most commonly used for car shoots.
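As a rough illustration of scale: the number in each ROE panel name is its pixel pitch in millimetres, so the approximate pixel resolution of each surface can be estimated from its physical dimensions. A minimal back-of-envelope sketch (these are estimates derived from the dimensions above, not published specifications):

```python
# Approximate the pixel resolution of an LED surface from its physical
# size and the panel's pixel pitch (the number in each ROE product name,
# in millimetres).

def wall_resolution(width_m: float, height_m: float, pitch_mm: float) -> tuple[int, int]:
    """Estimate pixel dimensions of an LED surface."""
    pitch_m = pitch_mm / 1000.0
    return round(width_m / pitch_m), round(height_m / pitch_m)

surfaces = {
    "main wall (Ruby 2.3)":       wall_resolution(30.0, 5.0, 2.3),
    "ceiling (Black Quartz 4.6)": wall_resolution(10.0, 10.0, 4.6),
    "back wall (Carbon 5.7)":     wall_resolution(18.0, 4.2, 5.7),
}

for name, (w, h) in surfaces.items():
    print(f"{name}: ~{w} x {h} px ({w * h / 1e6:.1f} Mpx)")
```

At roughly 13,000 pixels across the main wall alone, it is easy to see why dedicated 8K processing and careful signal distribution are needed.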

To ensure the best possible rigging solution for the video walls, we suspend everything from a dedicated secondary steel structure. We prefer this to ground stacking because it simplifies redeployment of panels and improves the overall longevity of the hardware. Our main 10 m x 10 m ceiling element is suspended via a fully motorized, millimetre-accurate, automated rigging system. It can be tilted up to 30 degrees in any direction as a single piece, but it can also very easily be separated into four equal individual sections. This gives us incredible creative flexibility and speed in deciding where we place light and reflections within a scene.

In addition to all this, there is a dedicated IP-based lighting control network, where we’ve installed 50 of our latest ARRI Orbiter LED light fixtures on height-adjustable lighting trusses to form a 360-degree configuration around the inside of the volume. The volume produces a lot of light, and that can be very important for believably embedding the performers in the virtual environment, but it’s a very soft, homogenous light, and its CRI and color reproduction are not as good as those of professional film lights. So, we enhance the ambient stage light with the Orbiters, which have a very good spectral output and incredible color rendition. The Orbiter is an ultra-bright, tuneable, and directional LED fixture, designed with versatility in mind. This concept of versatility is something we have incorporated in all the systems that make up the stage.
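To give a flavour of what an IP lighting data network carries: fixtures on such networks are commonly driven by IP-based DMX transports such as Art-Net or sACN. The sketch below builds a standard Art-Net ArtDMX packet and sends it over UDP. The universe number, three-channel layout, and node address are hypothetical, chosen for illustration, and do not describe ARRI’s actual stage configuration.

```python
import socket
import struct

ARTNET_PORT = 6454  # standard UDP port for Art-Net

def artnet_dmx_packet(universe: int, channels: bytes) -> bytes:
    """Build an Art-Net ArtDMX packet carrying one universe of DMX levels."""
    if len(channels) % 2:              # DMX payload length must be even
        channels += b"\x00"
    return (
        b"Art-Net\x00"                 # packet ID
        + struct.pack("<H", 0x5000)    # OpCode: OpDmx (little-endian)
        + struct.pack(">H", 14)        # protocol version (big-endian)
        + bytes([0, 0])                # sequence, physical
        + struct.pack("<H", universe)  # 15-bit port address
        + struct.pack(">H", len(channels))
        + channels
    )

# Hypothetical 3-channel fixture patch: intensity, CCT coarse, CCT fine.
levels = bytes([255, 128, 0])
packet = artnet_dmx_packet(universe=0, channels=levels)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Replace 127.0.0.1 with the address of an Art-Net node on the network.
sock.sendto(packet, ("127.0.0.1", ARTNET_PORT))
```

The appeal of an IP network over classic five-pin DMX runs is exactly what the packet shows: many universes of control data share ordinary Ethernet infrastructure alongside everything else on the stage.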

And, finally, another important aspect is our camera system. We have an ALEXA Mini LF with a full set of Signature Prime lenses, plus ARRI Electronic Control System tools such as the Hi-5 hand unit and the UMC-4 lens motor controller, which transmit near real-time camera and lens metadata directly into the game engine environment. We have found this combination really gives us the most convincing results, whilst allowing for creative freedom and simplification of the process.
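That metadata link can be pictured as a stream of small per-frame packets decoded on the engine side. The packet layout below is purely hypothetical, and ARRI’s actual camera and lens metadata formats differ, but it illustrates the receive-decode-apply pattern that keeps the virtual camera in step with the physical lens.

```python
import struct

# Hypothetical per-frame lens metadata packet: frame counter (uint32),
# then focal length in mm, focus distance in m, and T-stop (float32 each).
# Illustrative only; real ARRI metadata formats are different.
LENS_PACKET = struct.Struct("<Ifff")

def decode_lens_packet(datagram: bytes) -> dict:
    """Unpack one metadata datagram into engine-friendly values."""
    frame, focal_mm, focus_m, t_stop = LENS_PACKET.unpack(
        datagram[:LENS_PACKET.size])
    return {"frame": frame, "focal_mm": focal_mm,
            "focus_m": focus_m, "t_stop": t_stop}

# Round-trip example: encode one frame's values, then decode them as the
# engine side would before updating the virtual camera's lens state.
datagram = LENS_PACKET.pack(1001, 47.0, 2.5, 1.8)
state = decode_lens_packet(datagram)
print(state)
```

In practice such values drive the virtual camera every frame, so focus pulls and zooms on the physical lens are mirrored in the rendered background with minimal latency.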



What is your pipeline?

We work with our clients from pre-production all the way to post. First, we identify whether the project is even suitable for virtual production. Realistically, this technology is not suitable for every scenario… at least not yet. We are very honest with our clients, and if the scene they want to film is not practical or cost-effective to shoot in this way, we tell them. What we love is when a particular scene can’t effectively be done any other way, or when it would be too expensive or too risky.

For example, we worked on a project that had the lead talent floating in outer space, staring back at earth. Traditionally, this would have been a green screen shoot, with an expensive and time-consuming post process. Shooting on green screen is also notorious for producing detached performances, and it makes it not only difficult but risky for the director and DP to faithfully deliver their vision.

By shooting at Uxbridge, the production worked in real-time with the artist, director, DP, and VFX house on delivering a cost-effective and, more importantly, creatively accurate representation of the director’s final vision. The only real post work required was the assembly of the footage and the wire removal, which is relatively easy to do.

I have to say, even we have been surprised at what can be achieved. Productions have asked us for set-ups that would be impossible on location. For example, we had a scene with a motocross bike traveling at 70 mph on a dirt track and the DP got a shot with the camera about six inches from the artist’s face. This would have been impossible to do in reality; it would have been too dangerous. But with virtual production, it was easy and achievable.


Is ARRI involved in manufacturing this kind of technology?

Yes and no. We do not manufacture video walls, and we’re not here to make game engines. What we want to ensure is that our equipment “shakes hands” with third-party technology. Future protocols will be IP-based, so it’s important to look at how we adopt and integrate those standards. Interoperability and simplification are very important to us. Our products must fit well within that ecosystem and make it easier for our mutual users to harness the creative power of the technology.



Does this technology have room for improvement?

There is a lot of room for improvement. This is the first generation. Admittedly, it’s not particularly new; it’s just the natural evolution of the in-camera VFX techniques we’ve been using for years. But yes, there are lots of opportunities for improvement.


What do you think of the On Set Virtual Production SMPTE RIS program?

A big part of SMPTE’s work is standardization. We rely on standards to help us evolve and grow. They should not hinder us, and I think that’s exactly what SMPTE’s RIS project is trying to address. It’s about giving us the rules and the tools our industry needs in order for us to work from a unified position.

We’re all part of the content creation industry. Our job is to provide creatives with the tools they need to create their art and tell their stories. It is still early days. Let’s hope that we continue in the spirit in which we started: collaboration and openness.


What differentiates your studio from others?

I think the main difference is that ours is a permanent installation that offers flexibility. While others are built for a single project and then torn down, ours remains in the space in which it was built, providing a great opportunity to learn a lot more; we can continually update, optimize, and improve it.

Also, I think the automation systems make us unique. The fact that our ceiling is movable, with millimetre precision, and can be reconfigured in multiple different shapes and angles, makes all the difference. We have a dedicated lighting IP network, which is also important.

Our studio wasn’t designed with a single production in mind, but to serve effectively and creatively as many variations as possible.



What are the main reasons to choose virtual production workflows?

The main reason should always be based on the creative. We do this to tell stories which would otherwise be untold. Other secondary reasons include control, speed, efficiency, risk mitigation, and cost.

Imagine a scene in a car where your two lead talents have to perform eight pages of dialogue over and over again to get the required coverage. This technology allows you to do it efficiently and effectively. You can move the camera and there’s little risk of missing the shot or making continuity mistakes. The same goes for magic hour. In a day there are only 20 minutes of magic hour, at best. In the LED volume you can make that brief, precious light last the whole day.


Does this technology change the job of a DoP or art director?

I don’t think the job is changing; I think it’s expanding to involve more parts of the process. It’s very important for those department heads to be part of the broader conversation about how to tell the story at every stage of production, and this technology requires all these departments to be involved.



From ARRI’s position, what projects related to virtual production would you highlight as the ones you are most proud of?

We are very proud to have supported the Dark Bay Virtual Production Stage. We provided services spanning planning and technical supervision of the build, and then rental services during the production of the Netflix series 1899.

Also, I would like to mention how proud we are of the first virtual production studio we built for these applications, for Epic Games at its Innovation Lab in London.


What challenges have you encountered and how have you solved them?

Based on our learnings and experiences we have made multiple phased upgrades and updates in our studio, and we will continue to do so. We expect to have announcements about further innovations soon. Automation, interoperability, simplification and standardization, cost reduction and efficiency: these are our priorities, just like the SMPTE initiative you mentioned earlier, and that’s what we want from our partners.


The most immediate and obvious challenge for us as a technology company is simplifying and automating the complexity of these systems, whilst also developing intuitive user workflows. The next challenge, I would say, is people and education: there is not enough of either to scale effectively. This is where innovation and collaboration come in. ARRI is working with a broad cross-section of our traditional and non-traditional industries, from education to enterprise. This is not a revolution but a continuous evolution, and that is something we understand from more than 100 years as part of one of the most diverse businesses in the world.
