Orca Studios: a decided commitment towards virtual production
Independent visual effects and virtual production company Orca Studios has been offering the latest technological solutions to cinema, television, and advertising since 2015. Deep in its DNA is a firm will to make the most of technology and state-of-the-art implementations in production. Innovation does not limit, but rather expands, the possibilities of an industry always characterized by progress. As this edition went to press, the company had participated in productions such as ‘Down a Dark Hall’ and ‘Paradise Hills’, as well as in an upcoming series for the video-on-demand platform Netflix.
Adrián Pueyo, Compositing & VFX/VPX Supervisor, who has worked on renowned productions such as ‘Captain Marvel’, ‘Star Wars: The Last Jedi’, and ‘Pirates of the Caribbean 5’, shares his vision of the present and future of virtual production.
What is the history of Orca Studios and what are the main areas of activity of this studio?
Orca Studios was born in 2015 in Las Palmas de Gran Canaria, in the Canary Islands. It was founded by Adrián Guerra (producer and partner of Nostromo Pictures) with the aim of developing the possibilities of the emerging field of real-time technologies for cinema. A year ago, Orca Studios reinvented itself as an independent visual effects and virtual production company.
At Orca Studios we cover the entire cinematography process: from previsualization with real-time techniques and motion capture, through concept art, illustration, modeling, and the creation of environments and assets, to complete visual effects post-production, including virtual production on an LED set. Filmmakers are at the core of our philosophy, and one of our top priorities is to guide them through this process with technologies that are constantly reinventing themselves, offering both efficiency and creative control.
You have your own facilities where you can carry out your work. What spaces do your studios include?
We currently carry out our face-to-face work mainly from two sites.
For the virtual production part, we have a dedicated set in Madrid. It is equipped to meet all production needs (dressing rooms, offices, hairdressing and makeup areas, etc.) and houses our fixed LED environment: a curved, 100 m², high-resolution HDR screen plus a further set of high-luminosity screens that create an immersive environment, together with all the technical means and human resources necessary to carry out the filming.
On the other hand, our visual effects studio is located in Las Palmas de Gran Canaria, where we have a team of professionals and an infrastructure that, in addition to supporting virtual production, covers a full pre-production and post-production visual effects pipeline. I would also highlight that our pipeline is largely built around remote work, which currently allows a large number of our artists to work from different countries.
What do clients of Orca Studios usually request? What area are they most interested in? Where is the trend moving to?
Interest stands out in the advertising area, since virtual production (especially in these times) provides much more flexibility than traveling to various locations, in addition to benefits in project turnaround times.
In large fiction productions, which are our main market, the learning curve is slower, and most of our demand comes from large international productions already familiar with the technology, many of which have been delayed by the Covid situation. Domestically, fear of this new technology and its possibilities is gradually dissipating, with interest growing not only on the production side but in virtual production as a vital part of a story’s visual style. This seems to be one of the biggest trends of the coming years: using virtual production not only as a technological means, but also as a narrative one.
One of your most interesting focuses is your virtual production system. What is the system you have implemented like? What are its uses?
Orca’s virtual production area (VPX) covers several different processes, including virtual scouting and previsualization (virtual cinematography), but perhaps the most representative is the process known as ‘in-camera VFX’: a filming system in which we use a large LED panel as the background, capable of displaying an image rendered in real time for the position and orientation of the real camera, creating an illusion of perspective and an immersive background.
This, beyond generating ‘live’ VFX, presents a wide range of advantages and new possibilities compared to traditional methods of integrating virtual sets, such as the chroma key:
– In an LED environment, the panels that surround us, in addition to acting as a background, illuminate both actors and physical props, greatly favoring the integration and credibility of the final result. This is especially important with reflective surfaces, where generating a convincing reflection of the environment on moving characters in post-production is a very complicated task.
– By having an image instead of a chroma, any spill (an unwanted flood of green on the characters) also disappears, replaced, as we have seen, by the colors and shapes of the virtual environment we want to be in at any given moment.
– It allows us to create, in a matter of seconds, a darkening effect on any part of the screens that generates unwanted reflection or lighting. Or move part of the background to get a different reflection. Or create a small square of green chroma (or any other color) behind the actors and props in case we want to use a conventional key, but without wrinkles and without generating any front spill, as it is placed only on the portion of the screen located just behind them. Or perform live color corrections on the background, the extra screens, or even the materials of the 3D stage we are seeing. The possibilities are really vast.
– On the other hand, the fact that everyone involved in the filming (from the actors, director, or director of photography to the lighting technicians or even the clients themselves) can see live the environment we are in, and not just an abstract background, creates a much more immersive and fun work environment, and means everyone understands the context of the final piece far better. Actors know where to look and lighting technicians where to light, seeing the final result; for everyone else, there is much less left to the imagination at a time when every wrong decision could be costly.
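As a minimal sketch of the kind of live screen manipulation described above (the function and values are hypothetical illustrations; real walls are driven by dedicated media servers, not a few lines of NumPy), darkening a region or dropping in a local green patch amounts to overwriting pixels in the frame sent to the panels:

```python
import numpy as np

def patch_region(frame, x0, y0, x1, y1, rgb):
    """Overwrite a rectangular region of the wall image, e.g. to kill an
    unwanted reflection (a black patch) or place a local green screen
    directly behind the actors, with no front spill."""
    out = frame.copy()
    out[y0:y1, x0:x1] = rgb
    return out

wall = np.full((1080, 1920, 3), 0.5)  # neutral grey test frame
# A wrinkle-free green patch only where it is needed:
wall = patch_region(wall, 800, 400, 1120, 700, (0.0, 1.0, 0.0))
```

Because the patch exists only in the displayed image, moving or resizing it between takes is instantaneous, which is what makes the "seconds, not hours" workflow possible.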
Also, when compared to filming in a real environment, this system offers clear advantages in certain situations such as the following:
– A situation of natural lighting that we cannot maintain over time, such as the golden hour, sunrises or any weather situation that is not very controllable or predictable.
– Environments where we have already shot and which we need to replicate perfectly, but cannot in reality. For these cases, we can generate a virtual environment that replicates the desired scenario. A clever approach is to capture any real set at the time of shooting, so that if we need retakes at a later stage we have the virtual set as a ‘backup copy’.
– Locations that are too expensive to shoot in, for any number of reasons.
– Scenarios where moving the entire team is impractical for legal or logistical reasons, or where transferring talent is simply not feasible.
– Scenarios where filming in the real environment would require a much larger team, whether for lighting, transport, catering, or any other reason. This is especially important in the current Covid-19 situation, where limits on the number of people attending a shoot play an important role.
– In general, any environment impossible to achieve in reality!
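The geometric core of the in-camera VFX system described earlier is simple to state: for every virtual point, the wall must display it where the line from the tracked camera through that point crosses the screen plane, so the lens records correct parallax. A minimal sketch of that idea follows (a flat wall at z = 0 is assumed for simplicity; real systems handle curved walls and full camera frusta inside the render engine):

```python
import numpy as np

def project_to_wall(camera_pos, virtual_point, wall_z=0.0):
    """Return the (x, y) point on the LED wall plane (z = wall_z) where a
    virtual point must be drawn so the tracked camera sees it in correct
    perspective."""
    c = np.asarray(camera_pos, dtype=float)
    p = np.asarray(virtual_point, dtype=float)
    d = p - c                    # ray from the camera through the virtual point
    t = (wall_z - c[2]) / d[2]   # ray parameter where it crosses the wall plane
    hit = c + t * d
    return hit[0], hit[1]

# Camera 4 m in front of the wall; virtual point 2 m behind it.
x, y = project_to_wall(camera_pos=(0.0, 1.5, 4.0),
                       virtual_point=(1.0, 1.5, -2.0))
# As the tracked camera moves, x and y shift, re-creating on the flat
# screen the parallax the camera would see in a real 3D environment.
```

Running this projection every frame, from live tracking data such as that of the Mo-Sys StarTracker, is what turns a flat panel into a convincing window.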
What specific technologies are part of your virtual production system?
Taking virtual production here to mean only the in-camera VFX (direct visual effects on camera) mentioned before, some pieces of the puzzle are the following:
– Real-time camera tracking, with systems like the Mo-Sys StarTracker.
– A real-time rendering engine, such as Unreal Engine.
– LED panels with different lighting features and color rendition according to their function, plus the electronics, hardware, and wiring necessary to power the set and deliver the images.
– Machines in charge of translating the camera position into the equivalent image and mapping it onto the necessary position on the relevant screens. Here we can use engine functionality directly (as in Unreal Engine), create our own tools, or combine them with external platforms such as disguise.
– A sync (genlock) signal generator keeping all the parts in step: camera, machines, and screens.
The above relate to hardware. On the digital-workflow side, the number of converging technologies increases significantly. We could categorize them as capturing environments (photogrammetry, projection mapping, 360º photography, LiDAR scanning, etc.), generating purely digital environments (modeling, texturing and shading, lighting, etc.), and color management (required in every step from capture to recording), in addition to those related to interaction and real-time rendering.
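Of the workflow pieces above, color management is the easiest to show in a few lines. As a generic illustration (not Orca's actual pipeline, which is not described here), converting a linear scene-referred value to a display encoding with the standard sRGB transfer function looks like this:

```python
def linear_to_srgb(l):
    """Encode a linear-light value in [0, 1] with the standard sRGB curve:
    a linear toe below 0.0031308, a 1/2.4 gamma segment above it.
    Every step from capture to the wall to the recording camera needs
    transforms like this applied consistently, or colors seen on set
    will not match the final grade."""
    if l <= 0.0031308:
        return 12.92 * l
    return 1.055 * l ** (1 / 2.4) - 0.055

mid_grey = linear_to_srgb(0.18)  # linear 18% grey encodes to roughly 0.46
```

In practice this work is done with calibrated transforms (OpenColorIO configs, camera and panel LUTs) rather than hand-written functions, but the principle of round-tripping between linear light and display encodings is the same.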
The LED screen is a key device. How is it integrated into the workflow?
It is part of the shooting process, once all assets have been created and the number of remaining variables in production is very low. Pre-production in a virtual production workflow is much more important than in a traditional one; that is the price to pay for all the aforementioned advantages, and for potentially leaving the shoot with a high percentage of takes finished in camera, ready to assemble and shape.
Likewise, graphics engines originally from the world of video games are becoming increasingly relevant, turning into genuine standards. The most illustrative example is Unreal Engine. How important are they, not only in virtual production but in current post-production workflows?
The number of doors that these real-time render engines are opening for the creation of audiovisual content is huge and goes far beyond virtual production. In visual effects, for example, rendering has traditionally been a very expensive and time-consuming part of production, where iterating on a 3D scene to make changes meant re-rendering, tying up many machines for long periods. In any context that requires feedback on the final result of a 3D scene seen through a camera (and there are many, as you can imagine), being able to render that scene in real time gives us an incredible advantage: we can iterate as much as necessary, at full speed, in pursuit of the result we want. These render engines have become truly powerful and realistic in recent years, relying on both visual tricks and GPU processing, accompanied by rapid progress in the processing capacity of GPUs themselves and in machine learning algorithms aimed at reducing noise, increasing resolution, or generally improving render results.
Could you tell us about any recent experience that you have undertaken in the field of ‘Virtual Production’?
Since we started with virtual production a year ago, we have been able to use the set for a variety of projects: some involving more traditional techniques, such as using the LED screens as a canvas to project previously shot material (with all the advantages of light and realism inherent to an LED environment), and others where real-time rendering of 3D environments was a central part. Recent projects have included the filming of a series for Netflix and several commercials. In one of them, for example, we did not rely only on static 3D backgrounds, but triggered real-time interactions, animations, and transitions on the screens to generate very specific, ad-hoc effects for the production, right when the director signaled for them.