This interview is the result of a conversation this editorial office had with Mark Pilborough-Skinner, Virtual Production Supervisor at Garden Studios. We spoke with him about the particularities of their infrastructure, the current state of the technology and its future.
This company has created a virtual production facility with all the necessary technology and the qualified personnel needed to make these techniques an affordable option for any type of company.
What is virtual production for you, and how does this technique differ from the classic green screen?
Virtual production is the natural progression of green screen, but at the same time, they are used in different ways. Put simply, this new technology consists of taking some of your post-production workflows and moving them on set.
With green screen, you can do a high-quality render afterwards. And if your production, like most productions, hasn’t done a lot of previs, you can decide after you’ve shot how it’s going to look. The only downside to this is that you are locked in with your physical lighting.
However, with virtual production, you can envision what it will actually look like in camera. This capability is impressive because you can get about 80% of your fill lighting from the LED volume itself, and you only have to use a couple of physical lights as your key lights or to replicate other sources. Not only that, you can actually previs how your subject is going to fit into the scene beforehand.
It is a powerful tool; however, it comes with some caveats because the system runs in real time. Here at Garden Studios, the first thing we do is check which shots are most appropriate for virtual production, which are location shots and which we can do on one of our sound stages.
What is your pipeline?
We’ve spent the last year developing and improving our system and then working and collaborating with 3D content creators, DOPs and gaffers to make the process as smooth as possible.
We use Unreal Engine because it offers unparalleled real-time graphics. The way this differs from matte paintings or traditional video walls is that we’ve got a tracking system which essentially keeps our virtual camera and our physical camera in the same location in virtual and physical space. This allows the shift of perspective, or parallax, in the digital content to match the physical props and talent.
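To make the tracking idea concrete, here is a minimal sketch of the per-frame step (the names and the deliberately simplified pose model are hypothetical; this is not Mo-Sys or Unreal code): the physical camera’s tracked pose, plus a calibration offset, becomes the virtual camera’s pose.

```python
from dataclasses import dataclass

@dataclass
class Pose:
    """A camera pose: position in metres, orientation in degrees."""
    x: float
    y: float
    z: float
    pan: float
    tilt: float
    roll: float

def update_virtual_camera(tracked: Pose, offset: Pose) -> Pose:
    """Apply the tracked physical-camera pose, plus a fixed
    calibration offset, to the virtual camera each frame.
    (A real system composes full transforms as matrices;
    component-wise addition stands in for that here.)"""
    return Pose(
        tracked.x + offset.x,
        tracked.y + offset.y,
        tracked.z + offset.z,
        tracked.pan + offset.pan,
        tracked.tilt + offset.tilt,
        tracked.roll + offset.roll,
    )
```

Because the two cameras share one viewpoint, nearby virtual objects shift more than distant ones as the physical camera moves, which is exactly the parallax that makes the wall read as a real location.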
We use this software to deploy our 3D content, and then a media server for video playback. We can also take a hybrid approach, sending video content into Unreal to play back a combination of real footage and CG content on our screens.
The image is sent from our render node to our Brompton image processors, which in turn map it onto our LED wall. We use the Mo-Sys StarTracker to run tracking on our camera.
We also have full DMX integration, so we have the option of using a control desk to operate both the physical and Unreal lights, and we can also link them together.
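As a sketch of how that linking can work at the wire level (the channel layout and fader mapping are made up for illustration; Art-Net is one common way to carry DMX over Ethernet): a single desk fader value is turned into a DMX level and wrapped in an ArtDmx packet that both a physical dimmer and a listener driving an Unreal light could consume.

```python
import struct

def artdmx_packet(universe: int, channels: bytes, sequence: int = 0) -> bytes:
    """Build a raw Art-Net ArtDmx packet. `channels` holds up to
    512 DMX slot values (one byte per channel)."""
    if len(channels) % 2:                   # DMX data length must be even
        channels += b"\x00"
    return (
        b"Art-Net\x00"                      # packet ID
        + struct.pack("<H", 0x5000)         # OpCode: ArtDmx (little-endian)
        + struct.pack(">H", 14)             # protocol version 14
        + bytes([sequence, 0])              # sequence, physical port
        + struct.pack("<H", universe)       # 15-bit port-address
        + struct.pack(">H", len(channels))  # data length (big-endian)
        + channels
    )

def fader_to_dmx(fader: float) -> int:
    """Map a 0.0-1.0 desk fader to an 8-bit DMX level; the same value
    can drive a physical fixture and the intensity of a virtual light."""
    return max(0, min(255, round(fader * 255)))
```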
Is this technology beneficial for the teams involved in production?
Creatively it is great because, usually, your director, your art department and your VFX supervisor probably won’t talk to each other until the day of the shoot or, sometimes, weeks after it’s over. Virtual production, by contrast, forces all the departments that would normally work independently to talk to each other before shooting.
Your VFX supervisor talks to your art department and knows that your physical and digital assets are going to match up. Your DOP will already be thinking about the lighting configuration and talking to the director. It means you have to do a little more work before you get to set, but I think the creative possibilities it offers are far greater than with the traditional way of working.
Has virtual production changed the way teams work? Do they need in-depth prior knowledge before working with this system?
I think a lot of people are apprehensive about using these techniques because it is really new technology.
However, as soon as cinematographers do a shoot with it and you give them a few little rules (one is not to focus directly on the screen, because that produces a moiré interference pattern), they become familiar with the technology. To be honest with you, none of the techniques they have to learn are new.
Same with gaffers. At first they would say: “No, I need to be controlling all the lighting of the setup,” but once they understand that we can control the light of the LED volume at their direction, or even hand them an iPad or a traditional desk to do it themselves, everything changes.
The same goes for the art department. I think it’s probably the department whose workflow improves the most, because the communication happens up front. You have to make sure the physical props match the digital props, and you can do that beforehand. It also helps a lot that if you want to change the colour of a bookshelf in the background, you no longer have to rush to find another bookshelf or repaint it quickly; you just adjust it on the screen.
What are the most appropriate situations to be shot in virtual production?
Sequences involving locations that are inaccessible, risky or very expensive. Car sequences, for example, are amazing to do on a virtual production stage. It is true that you could do them on a green screen, but the car would be full of green reflections. With this method you get realistic reflections all over the car from the surrounding LED screens.
Of course, it is not the right technology for all scenarios. If you want to do a big scene with a wide establishing shot of nature up a mountain, go shoot it on location, then bring us the video plates to do the close-ups. Or if you have a big stunt scene in your movie, where there’s choreography involving a lot of actors and then you need a lot of CG elements added, it becomes more difficult to do on a virtual production set.
What is the importance of colour in virtual production?
It is really important. Brands need to be sure their colours appear on camera exactly as intended. To achieve this, we use our displays in HDR mode, which gives us much more information. We have profiled all of our LED displays and our monitors to get a true reading of the colours they can reproduce. We then use the OpenColorIO plug-in, which transforms Unreal’s colour space into the colour space of our LED volume.
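The shape of that transform can be sketched as follows (the matrix and gamma here are illustrative placeholders; in practice the numbers come from measuring the wall, and OpenColorIO builds the real transform): a 3×3 matrix adapts the colour primaries, then a gamma curve encodes the signal for the display.

```python
def encode_for_wall(rgb, matrix, gamma=2.4):
    """Convert a scene-linear RGB triple into a display space:
    a 3x3 matrix adapts the primaries, then a gamma curve encodes
    the signal. (Illustrative only; a calibrated pipeline derives
    this from measurement rather than hand-picked numbers.)"""
    r, g, b = rgb
    lin = [
        matrix[0][0] * r + matrix[0][1] * g + matrix[0][2] * b,
        matrix[1][0] * r + matrix[1][1] * g + matrix[1][2] * b,
        matrix[2][0] * r + matrix[2][1] * g + matrix[2][2] * b,
    ]
    # clamp to the displayable range, then apply the display gamma
    return [max(0.0, min(1.0, c)) ** (1.0 / gamma) for c in lin]

# placeholder primaries matrix (no adaptation), for illustration
IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```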
We use Pomfort Livegrade as one of our colour-grading systems for the screen. With this solution you can grade the screen non-destructively while the content plays. You can create day, night and golden-hour grades of a scene and switch between them at the touch of a button.
A lot of the time, people doing virtual production work by eye and just make it look good in camera, but what we’ve found is that if you have a solid colour workflow that gets everything physically accurate from the beginning, you can then experiment on top of that baseline and save yourself a lot of work. It obviously helps in post-production as well.
Do you have a graphic design department?
Not at the beginning. Back then, we set out to be a turnkey solution for projects. Our rate card included the LED volume, Unreal technicians and UE4 scene optimisation. The client or a third party would create the 3D content, and we would modify it and make sure it ran properly on our infrastructure.
Having said that, we’ve realised that if we provide content creation from start to finish, creating the content, tweaking it and putting it on stage, then we’re accountable, and we’re confident it’s going to look good and work well. So we have now started to build a small content team, and we have offered our talented team to several community projects.
Which one would you highlight?
A few months ago we did a virtual production event where we partnered with Epic Games, The Mill and Quite Brilliant. We did an hour and a half live event on the VP stage, where The Mill delivered several scenes for us to use. We also leveraged UE4 Marketplace content to shoot a live commercial in an hour. What happens is that a lot of people say, “Virtual production isn’t ready for live events because it’s new and things can go wrong.”
At this event, we did a presentation, we switched live to scenes from The Mill, and we created scenes with lighting transitions. Then we shot six scenes in a row without having to reload any of them, streaming them in and out on the fly. It was fantastic: in the space of an hour we shot a fake commercial based on audience suggestions. We had no technical problems and delivered a great event that showed the promise of VP for advertising as well as live events.
I haven’t asked you about the characteristics of your LED volumes, what makes them special compared to other studios?
We’ve got a 12-meter by 4-meter curved wall. The reason we didn’t build a 20-meter by 8-meter volume is that our size keeps the stage democratised: it is still accessible to ad agencies, independent filmmakers and smaller productions while remaining suitable for film and TV. We also have the capability to extend the LED volume by 6 meters on request.
We are going to be expanding our virtual production offering this year as we’ve got a lot of demand for bigger stages and wonderful projects.
However, we want to keep that accessibility as we grow. I think it’s remarkable because, normally, tools like this stay in the film world for ten years before reaching other industries. The Mandalorian, the first example, is now only three and a half years old, and brands can already use the technology almost immediately after it arrives in the film industry.
What is Garden Studios’ experience in virtual production?
I think part of Garden Studios’ knowledge and experience comes from having done everything from music videos and movies to commercials, so we can look at scripts and storyboards and advise people to use virtual production when it makes sense and traditional shooting methods when those are the better fit.
The interesting thing is that this technology is very recent; we have only been developing its possibilities for two years. And the people who join us bring very different ideas depending on the world they come from. I come from a game-programming background, and we have people with experience in traditional film, VFX and even live events.
This characteristic pushes our technological capacities forward and we are able to deliver better productions at Garden Studios, but we also share everything we find with other VP stages because, at the end of the day, the UK is an incredible center of film and television production. We want to make sure it stays that way as these new technologies develop.
When did you implement this technology in your studio?
We did some initial tests around November 2020 and then moved onto the Garden Studios campus in January 2021. We spent two or three months building the stage, setting up our workflows, training staff and shooting internal test projects. We did a couple of music videos and reached out to DOPs to come in and experiment.
From March 2021 we were fully open and bookable for clients. In the beginning we were working on projects every week or two; now we are booked on an ongoing basis, doing multiple shoots a week or longer film and TV projects. The pandemic definitely accelerated virtual production, but I think even without it, adoption would have been very quick because it’s a fantastic tool.
Do you think this technology will be quickly implemented and assimilated by the industry?
I think so. A year ago, if you looked at the studio map, we were one of the only permanent virtual production studios in the UK. Now there are dozens appearing all over the country, from large studios building full virtual production volumes to modest mid-sized studios building small sets.
I think the technology is only going to improve. It’s at a very early stage now, but Epic is investing a lot to make Unreal more and more accessible. The limit right now is image quality, but the advent of Unreal Engine 5 will change that: you will be able to use higher-quality meshes through Nanite and get fantastic lighting from Lumen. I believe the gap between offline and real-time rendered graphics will steadily close over the coming years.
We are not far away and, as soon as we can have high quality graphics in real time, virtual production will become an indispensable tool.
What are your next steps regarding this technology?
One of the main values of our company, as we have already mentioned, is development and research. Therefore, it is very important for us to encourage technological innovation. We are developing several ideas right now that we believe will help advance this technology.
For example, we are working on a project to extract the Unreal scene layout and project it onto our studio floor via laser projection. It’s a simple idea that could really help the teams get a feel for the layout of props and scenery and thus speed up the process for everyone.
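A minimal sketch of that idea (the data shapes are hypothetical, not actual Unreal code; Unreal’s editor scripting reports an actor’s bounds as a centre and half-extents, which is the convention borrowed here): drop each prop’s height axis and hand the projector a set of named floor rectangles.

```python
def floor_footprint(origin, extent):
    """Project one prop's 3D bounding box (centre + half-extents)
    onto the floor plane: drop the height axis and keep a 2D
    rectangle as (min_x, min_y, max_x, max_y)."""
    (x, y, _z), (ex, ey, _ez) = origin, extent
    return (x - ex, y - ey, x + ex, y + ey)

def scene_to_floor_plan(actors):
    """Turn a list of (name, origin, extent) tuples, as read out of
    a scene, into named rectangles for a laser projector to trace
    on the studio floor."""
    return {name: floor_footprint(o, e) for name, o, e in actors}
```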
We are also working with photogrammetry techniques to scan and digitize as many props as we can. We believe that in addition to going to a prop house for physical objects, it would also be invaluable to have libraries of 3D objects.
Finally, one of the things we are investigating is video playback. At the moment, there is no media server or playback option that does everything we want. This is very common, because right now, a lot of the technology that is used in virtual production comes from other sectors. However, it is also true that we are starting to see LED walls specially designed for these techniques and specific tracking systems for VP. We are talking to existing media server manufacturers to see if we can develop something that meets all the requirements we have found.