The Creation of OpenVPCal

There are times when you start a project planning to complete it in a few days, and it takes a bit longer. Well, I’m here to talk about one where “a bit longer” turned into 2+ years, and for that I blame the amazing people I was able to develop it with. So here’s the story of OpenVPCal.

OpenVPCal is an open-source Screen-to-Camera color calibration tool for virtual production (In-Camera VFX) that I’ve been developing for the best part of the last 2 years alongside Francesco Luigi Giardiello, an Imaging Technologist from Netflix and an amazing color scientist, as well as his team, plus a whole lot of input from many other people.

OpenVPCal is now (writing this in November 2022) in beta and already being tested by 40+ partners around the world. The tool was presented at SIGGRAPH 2022 by Carol Payne and Francesco [1], and we also presented it at the Netflix Post Production Summit 2022 in Madrid last week.

So here’s a quick story about how all of this came together: Orca as a company, the collaboration with Netflix, and the creation of OpenVPCal, the tool I’m writing about.

This won’t be a technical post about the tool (I’d love to expand on that in other posts!) but rather a bit of an anecdotal story about why and how this happened.

Orca Studios

Orca Studios is a Spanish company that’s focused primarily on visual effects and virtual production for feature films.

Orca’s LED Stage in Madrid (picture taken in Oct 2020)

Our adventure with In-Camera VFX started in the first months of 2020. Orca was born under the umbrella of Nostromo Pictures, a Spanish production company that would guarantee a flow of projects, giving us plenty of opportunities to test and experiment with these innovative techniques. So a team was put together with people coming from the visual effects industry (such as myself) as well as from game development and environment art.

So the first real-time ICVFX LED stage in the country was built, and… yes, right when we were starting, a pandemic happened. But that kept us focused on development for the following months, mostly around capture and building environments. Additionally, the fact that virtual production allows controlled shoots with a reduced crew and COVID protocols made it an ideal filming solution for the situation we were living through. So by May 2020 we were already in a good position.

Disguise and first commercial projects

One thing we realised was that we still didn’t have the development manpower to support the full system we needed, and at the same time we were receiving more and more requests to shoot plates. That made us decide to switch from bare UE machines to a system based on Disguise, with their newly released RenderStream, which gave us a convenient media server and let us run UE directly inside it via their RX machines. This made the system super powerful for shooting plates AND running UE with a tracked camera at the same time.

So we shot our first TV commercial using ICVFX with plates in July 2020. Here’s the BTS of the shoot:

And two months later I was lucky enough to supervise the virtual production for the first TV commercial in Spain using live ICVFX with UE and a tracked camera.

Since then we’ve been one of the first studios to implement many things in production around ICVFX with UE, thanks to the amazing collaboration with Disguise: shooting scenes with ray tracing, NVIDIA DLSS, clustered rendering just for the camera frustum… we were even the first company to run UE5 for ICVFX!

In the last months of 2020 Orca established itself as a go-to studio for shooting ICVFX commercials with UE. In the following months we kept doing similar projects, and that allowed us to run and improve the system without the performance or color accuracy demands of film.

However, as most of us at Orca came from the film industry, and especially with Nostromo ready to give us film projects to work on, what really drives us is striving for the highest possible level of quality. So I started learning about color science, realising that we had never fully grasped how to make the content look in camera exactly as we intended. But soon enough this was going to change.

Collaboration with Netflix

(can be spelled almost like Calibration)

A proposal from Netflix came together in November 2020 that couldn’t have fitted better: they planned an exercise to explore the potential of In-Camera VFX in production and show it to others. Four companies would collaborate on a real case: scan an actual shooting location, reconstruct it digitally, and then replicate it with ICVFX, with actors, to see whether we could recreate shots from the original series without any visible differences, as if we were just shooting pickups, only this time on an LED stage. And the selected scene was the famous bank from La Casa de Papel.

For this endeavour, El Ranchito was chosen to work on the photogrammetry reconstruction and UE scene creation; Vancouver Media would bring the same team that shoots the real thing (with DP Migue Amoedo and his crew) and even some practical items such as weapons and the famous red suits from the series; and we at Orca would provide the stage, the technical means and the supervision of the actual virtual production. That was a fun mission!

And as fun missions tend to come with new challenges, this one wouldn’t be an exception. For this project we had to replicate shots from La Casa de Papel that had already been filmed, which meant that color accuracy would be a central topic: we couldn’t get away with not having it fully under control anymore.

Meet Frankie

As soon as we started preparing everything on our side, Netflix put us in touch with Francesco, an Imaging Technologist and part of their Production Innovation team in London.

This team is a special division that helps all Netflix productions plan and understand their workflows, especially when they revolve around new technologies and techniques.

Francesco would be helping us nail down the color science, the calibration and our understanding of the content-to-screen-to-camera ICVFX pipeline. After an initial conversation with him, we saw that we were talking about the same issues, and he explained some concepts to me in depth, such as metameric failure and linearity, which were precisely the topics I had no idea how to solve.

A little explanation of these concepts (feel free to skip it if you’re not interested):

  • Metameric failure: the breakdown of a metameric match, i.e. when two lights that should appear to be the same color stop matching (for example, they match for the human eye but not for a camera sensor).
  • Metamerism: the phenomenon in which two different lights (with two different spectral power distributions) trick an observer (human or device) into perceiving them as the exact same color. This is a fundamental aspect of color science in general, as all the screens and color reproduction devices we work with on a daily basis are based on this concept: they make us perceive the millions of different colours that we see in the real world, while the spectral power distribution of the light they emit is limited to that of the color emission components of the device (narrowband LEDs, OLEDs, etc.).
  • Linearity: in the context of In-Camera VFX, this refers to performing color operations in a linear way (the way nature and real lights behave). The content we show on the LED screens for virtual production isn’t generally meant to be pleasing to the human eye; it’s usually meant to be recorded by a camera, and therefore benefits a lot from displaying light in a linear fashion, like it does in the real world. So the creative tone mapping that is generally applied to images as a viewing transform usually introduces some unintended behaviours in the context of ICVFX (see the little sketch right after this list).
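
To give a rough flavour of that last point, here’s a minimal, purely illustrative Python sketch (not part of OpenVPCal, and the simple power-law gamma is just an assumption for illustration) showing how even a basic operation like mixing two lights only behaves like real light when it’s done on linear values:

```python
# Minimal sketch: mixing two lights physically corresponds to averaging
# their *linear* intensities, not their display-encoded (gamma) values.

GAMMA = 2.2  # simplified power-law display encoding, only for illustration

def encode(linear: float) -> float:
    """Linear light -> display-encoded value."""
    return linear ** (1.0 / GAMMA)

def decode(encoded: float) -> float:
    """Display-encoded value -> linear light."""
    return encoded ** GAMMA

a_lin, b_lin = 0.10, 0.80            # two patches, expressed as linear light
physical_mix = (a_lin + b_lin) / 2   # what the camera would actually see: 0.45

# Doing the same 50/50 mix on the encoded values and decoding afterwards
# gives a noticeably different (and physically wrong) result.
encoded_mix = decode((encode(a_lin) + encode(b_lin)) / 2)

print(f"mix in linear light:    {physical_mix:.3f}")  # 0.450
print(f"mix in encoded values:  {encoded_mix:.3f}")   # ~0.358
```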

Francesco was incredibly helpful and also proposed some next steps. So we started testing the different points, and soon we realised we had just entered a very very deep rabbit hole.

As you might have seen in the Tools section of my website, I have a background as a compositing tool developer, and Nuke (a piece of software I might spend more time with than with any human being) is an extremely powerful tool for color math operations. So this was a nice combination… Francesco opened up a world of knowledge about color science, and I couldn’t help turning these ideas into Nuke nodes and automations that would eventually become a color calibration tool for ICVFX stages.

By the time we did the test for La Casa de Papel we already had a working proof of concept, and this helped us approach the shoot with confidence about what the camera would see and about how to help Migue and the camera team make any changes they requested.

You can see the video of the shoot here:

I believe that, in order to succeed with a new system, it’s really important to first understand its limitations very clearly, so that you can then work around them and exploit its strengths in creative ways. Fortunately, on the shoot we were all on the same page about this. It went great, it was an awesome learning experience for everyone involved, and it was proof that ICVFX is a solution worth considering for film from now on.

Let’s make OpenVPCal!

We decided to formalise our intention to make a fully featured, working calibration tool that anyone could use without much technical knowledge, and that would support all possible combinations of cameras, lenses, screens, processors, LED panels and so on. That’s the moment the OpenVPCal project was born.

General LUT breakdown

The tool would need to follow these premises:

  • Reduce metameric failure between the camera and the LED wall/external lights.
  • Be easily deployable by anyone, without any additional equipment.
  • Be flexible enough to be completely agnostic of your pipeline.
  • Run fast enough between takes that it doesn’t hold up camera and crew.
  • Be reliable enough to hold up across different cameras/lenses/settings.
  • Take care of many of the complexities under the hood.

And we would later agree, thanks again to the amazing support from Netflix and Orca, that this joint project would be open sourced and made freely available to the world once ready.

We had a really long road ahead for learning, testing, error-proofing, and receiving lots of input from the team at Netflix and the amazing people at other companies such as Epic.

So, basically, through 2021 and 2022 we’ve pushed this project forward towards a production-ready state, even though during this time it could still be considered a prototype. But that didn’t prevent us from using it on many films already…

Films such as Through My Window, Dancing on Glass, and the soon-to-be-released sequel to Bird Box already have ICVFX scenes using Unreal Engine that I supervised, with the whole thing color-calibrated using OpenVPCal. And it’s definitely super useful and quick to use, so we can consider it production-proven.

So, what’s the current state of the tool? Well, after we finished a complete implementation in Nuke, we partnered with a few incredible Netflix engineers, Daniel Heckenberg and Francesco Giordana, to make an independent, extensible Python API possible. That API is now an integral part of the color operations in the Nuke tool that’s currently in beta testing. The next step will be developing a standalone app, starting in Q1 2023, and the final open-source tool is planned for release in Q4 2023.

I’ve personally learned so much from all of this, thanks to Francesco, Netflix and all the opportunities that arose from it, and I’m super grateful to now be able to contribute back in this way.

So, now you know where all this came from!

In future posts I might get into the details of the tool. If you’d like to stay updated with the coming posts and other news, feel free to subscribe to my newsletter. I haven’t sent anything yet, but I’m gathering the emails of interested people for whenever that starts 🙂

Francesco showcasing how to run the calibration patches, in NFX Post Production Summit Madrid 2022.
  1. “OpenVPCal: An Open Source In-Camera Visual Effects Calibration Framework”, Carol Payne and Francesco Luigi Giardiello, SIGGRAPH 2022.
