The LIE of Virtual Production
Less than LED Screens: A 101 for Producers
Virtual Production is a difficult concept to communicate, because as soon as that phrase is mentioned, all sorts of convoluted thoughts spring to mind: LED Screens, Virtual Reality, Point Clouds - all very technical, complicated, messy, and expensive ideas.
So in this article I'll endeavour to clarify the core concepts at the heart of Virtual Production, to the extent that everyone from your Producer to your Hair & MakeUp team will have a solid foundation of knowledge from which to navigate and implement Virtual Production in the future.
Table of Contents
What is Virtual Production, really?
The Significance Recap
Example Workflows
Next Steps
What is Virtual Production, really?
The first thing to understand is that Virtual Production is not one technology, one workflow, one idea - and this is why communication in this space has always been a bit muddy.
But worry not, we can understand Virtual Production quite simply with some metaphor work.
Virtual Production is not a single thing, it is a collection of possible things.
A Cloud of Possibility.
Rather than being the equivalent of the invention of the wheel, Virtual Production should be viewed as a sweeping term that refers to everything that “the wheel” (our as-yet unnamed breakthrough) now enables us to create.
For the wheel that would be everything from sewing machines, to windmills, to cars, to robotically controlled industrial machinery!
That’s a massive sweeping range of ideas that include every level of cost and complexity, and Virtual Production is no different.
While LED Volumes and In-Camera VFX are often understood to be synonymous with Virtual Production (and are indeed used synonymously by some studios), that is the equivalent of saying that “a car is what the wheel allowed us to create”...
It's a statement that's technically true, but so reductive of everything else the wheel enabled that it ends up more misleading than not.
If not "Virtual Production", what is the new breakthrough?
If there's only one takeaway to paraphrase into your group chat, it's this.
Virtual Production actually refers to a serendipitous combination of several technologies all reaching maturity at around the same time. Unfortunately, because they are quite esoteric and technical, it's difficult for anyone unfamiliar with their history to grasp why these advancements are important.
But you can trust me for now when I say that they are meaningful, and I’ll explain why later (in the Significance Recap) if you're really that interested!
So, let's take a look at our three basic breakthroughs:
Great Instant Graphics
Remote Instant Collaboration
Affordable Instant MoCap
As I said, it's quite a dry list, but what do they all have in common?
And what is the one thing that Post-Production, VFX, and all that computery stuff definitely is NOT?
Instant Virtual Production
Okay they're all instant, but it remains a list of very technical terms.
It all still sounds uninviting, so why should anyone care?
Especially your Producer, your Gaffer, your Set Designer?
Well, here's the great thing: being instant means that all these benefits that have, for decades, been only in the hands of the most technically... patient individuals... can now explode out into the hands of the impatient and practical.
These tools are now freely available to help anyone, instantly.
Think about how many tasks you do on a daily basis that need to happen immediately, that you would never accept adding any delay to: testing wardrobe under lighting, setting up camera angles, blocking out your stunt movements, even scouting locations.
For these tasks you have reliable workflows that you know, and you don't want them interrupted or delayed, because the way they currently work... works! Plus you know them well, so why complicate things?
But now, by reaching into this Cloud of (Instant Virtual Production) Possibilities, alternate workflows are possible, and newcomers who don't already have ingrained habits for how-to-solve-these-problems will find novel alternatives that are no slower... in fact they're quite a bit faster...
So don't be surprised in a few years when you find Gen Z and Alpha rolling their eyes and calling you a grandpa at 40 because you haven’t adapted.
And don't be too upset if you're a working Cinematographer who comes home for the holidays to find your 9-year-old niece laughing about how you need a crew of 20 to do what her friends are doing on their phones.
But what am I talking about, what are these novel opportunities?
What is changing?
I'm not looking to hype, mislead, or sell you on anything when I explain any of this.
You can scroll on to find some example workflows and crack into it, or keep reading here for a brief Significance Recap that better explains why Virtual Production is so revolutionary.
Either way, I urge you to linger a moment on this thought:
The biggest shift that's required is in the minds of creatives.
Because for as long as computers have existed, they've been tragically slow at anything super useful: realistic computer graphics have been slow, sharing files internationally has been slow, and just wrangling data from a day of filming is still awfully slow most of the time.
But now it doesn't have to be that way, and once you can flick that switch in the back of your brain and start intuiting that computers can now assist you and provide instant results for incredibly detailed, complex things... a whole new world of workflows awaits you.
In all the places computers could never fit into the pipeline before, they're suddenly rewiring what's possible.
Super Brief Beginner's VFX Context (The Significance Recap)
Even if you know nothing about VFX, you should have no trouble understanding that it takes a lot of work and a very long time.
After all, every film, every video you watch is just lots and lots of photos being played back very quickly. Normally you're seeing 24 of those photos every second, which means that if we want to use VFX, we need to add it to every one of those photos... 24 photos for every second.
And doing that quickly simply hasn't been possible: adding VFX to one single photo can take a computer anywhere from a few minutes to several weeks!
This is why VFX falls into the stage called "Post-Production": it needs to happen after filming (aka Production), when weeks, months, and years can be spent waiting for the VFX to create those 24 photos for every second of film. The quick sums below give a sense of the scale involved.
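Purely illustrative numbers - the runtime and the hours-per-frame are assumptions - but the scale of the problem is real:

```python
# Back-of-the-envelope sums: why traditional VFX lives in Post-Production.
frames_per_second = 24
runtime_minutes = 100                     # a hypothetical feature-length film
total_frames = frames_per_second * 60 * runtime_minutes

hours_per_frame = 2                       # an assumed average render time per frame
machine_hours = total_frames * hours_per_frame

print(f"{total_frames:,} frames to render")                        # 144,000 frames
print(f"{machine_hours:,} machine-hours of rendering")             # 288,000 hours
print(f"~{machine_hours / (24 * 365):.0f} years on one computer")  # ~33 years
```

Which is exactly why render farms, long schedules, and a lot of waiting became the norm.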
This time commitment has been a law of VFX/CGI ever since its inception, so every thought about where VFX can be used, what problems it can help solve, has forever been constrained by that limitation.
VFX takes a long time, and we need to wait to see what it looks like.
And that is the thinking that newcomers can completely ignore: high-quality, realistic VFX can now be an instant tool, accessible even on a mobile device.
To someone already experienced, already working in the industry, VFX becoming “instant” is often misunderstood as merely another technological improvement for Post-Production.
But by making VFX instant, it can now be used to solve other problems that demand instant feedback, that previously had no connection to VFX.
And it’s that previously unknown potential - all these possibilities hiding in a cloud, waiting to be identified and put to use - that we collectively call "Virtual Production".
Example Workflows
I use this term Cloud of Possibilities because there are so many new workflows and combinations of these new technologies to solve problems across so many different disciplines that people are still figuring out new ones every day and surprising the rest of us.
But here are some core basic repeatable Virtual Production workflows used across the world.
Workflow 1: The Location Scout Scan
Goal: To replicate the real-world location with Virtual Production to better communicate key aspects of the location.
Requirements: A Location Scout with a device capable of environment scanning (iPhone, Camera, LiDAR Scanner, Drone).
What's Involved: When visiting the location, the scout needs to budget extra time to scan their environment.
At the Indie/Free end this can be as simple as using a free iPhone app to walk around videoing every side and angle of the location in the span of 5-10 minutes (depending upon the size of the location). The iPhone solution works quite well because of the LiDAR sensor built into recent Pro models, which makes the scan accurate, quick, and uncomplicated - with instant feedback on the phone making it quite idiot-proof. (The scan used in “The LIE of Virtual Production” video at the top of this article was captured this way.)
As budgets step up, so too does the quality of the scan, but so does the complexity for the person capturing it. Typical DSLR/Mirrorless cameras can be used to capture a location with photogrammetry, but this requires practical working knowledge of the proper technique. Drones can similarly be used for larger areas, and for capturing areas beyond the reach of photographers on foot, but they also require a solid foundational knowledge of photogrammetry’s best practices.
The highest-budget solution is dedicated LiDAR Scanners, which return us to the simplicity of the iPhone, but they are typically quite cumbersome and still need to be positioned appropriately to ensure a full scan.
Benefits: Once a scan is captured, it can be sent as a 3D file to be viewed by anyone involved on their phone or laptop. The quality of the scan doesn't need to be amazing, as the most immediate benefit comes from providing an intuitive sense of space, scale, and relative positioning that is much more easily communicated through a 3D file than a set of photos and schematics.
This is a wonderful first step to get comfortable with, because so many opportunities branch out from capturing your location in this way. Everything from recreating the real-world seasonal Sun Position to predict shadows, to rehearsing blocking and experimenting with lenses and set decorations without ever needing to send anyone back to the location.
Downsides: The quality of the scan relies entirely upon the person taking it, and with the iPhone apps it can look quite blocky, as if hurriedly sculpted from clay. If this is too distracting, the most affordable solution (without stepping up to higher-quality equipment) is to have a 3D Generalist "clean up" the scan. It won't take too long or cost too much: a few hours to a day at most, depending upon the size of the location.
How to access the scan can also be a point of confusion. Simply opening the 3D file (.obj, .fbx...) works on most mobile devices and laptops, though depending on the model you may need to download a free app to do so. This method of viewing the scan isn't necessarily that intuitive, and doesn't include the correct lighting from the real sun position. Slightly more advanced software can be used to explore with more intuitive controls for someone familiar with video-games, but this requires a little more setup and understanding. The best method for exploring these environments is with a VR Headset, but again this is a step up requiring someone on your team with a working knowledge of these systems. However, once the VR Headset is on, the whole workflow is incredibly intuitive and requires no prior experience.
Workflow 2: The Virtual Tech Scout
Goal: To solve "Tech Scout" problems within a Virtual Environment, reducing the need to send people back to the real physical location.
Requirements: A 3D Scan / Recreation of the location; a device capable of viewing the Scan and gathering measurements (Laptop, iPad).
What's Involved: Building upon the Location Scout Scan, we have a location recreated in 3D, now we need a helpful way of interacting with it.
The industry go-to for the moment is Unreal Engine 5 (UE5), and if you're hiring a team or creating a department (VAD) to take care of this, this is what they will be using. Realistically, to start they're only going to need one powerful Windows computer (equivalent to a high-end gaming PC, if that means anything to you) and then a suite of other devices to let you interact with the scene. This could be a single consumer VR Headset (a la Quest 3), a few iPads (to be held up and moved as Virtual Cameras that display their view with lens data), or a large motion capture space that tracks the movement of people and equipment and places them accurately into the environment in real time.
However, the same can be achieved with much less on more accessible and familiar devices. UE5 can run on MacBooks, albeit lacking some functionality, but other free 3D software such as Blender can provide a much more reliable environment for technologically impatient individuals. Blender isn't set up for VR viewing, but it can be very intuitive for those who like to work on their laptops and navigate within the environment as if playing a video-game.
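To give a sense of how little is involved, here's a minimal sketch of pulling a shared scan into Blender from its scripting tab. The file path is a placeholder, and it assumes the scout exported an OBJ (glTF and FBX scans have equivalent importers):

```python
import bpy

# Placeholder path - point it at wherever the scan was shared with you.
scan_path = "/path/to/location_scan.obj"

# Blender 4.x has a built-in OBJ import operator; glTF (.glb) and FBX scans
# have equivalents (bpy.ops.import_scene.gltf / bpy.ops.import_scene.fbx).
bpy.ops.wm.obj_import(filepath=scan_path)

# Treat 1 Blender unit as 1 metre so later measurements read in metres.
bpy.context.scene.unit_settings.system = 'METRIC'
bpy.context.scene.unit_settings.scale_length = 1.0
```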
Benefits: Everything from measuring gaps and clearances, to predicting shadows at different times of day, to set dressing, even testing colour grades and wardrobe in the environment, can be managed here.
The biggest advantage of this technique is how accessible it is: if there's a question about the location that someone forgot to ask or forgot to check, it's all stored there in the scan, and can be reopened by anyone, anywhere in the world, in an instant to find their answers. Plus the files are very small, so even if you're in Australia you can download them more quickly than travelling back to the location.
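As a small example of answering one of those forgotten questions, the sketch below measures the distance between two hypothetical marker objects in the scan using Blender's Python console; it assumes the scan was imported at real-world scale:

```python
import bpy

# Assumes 1 Blender unit = 1 metre, and that your 3D Generalist has placed
# two hypothetical markers at the points in question - e.g. either side of
# a doorway the dolly needs to fit through.
marker_a = bpy.data.objects["Marker_Doorway_Left"]
marker_b = bpy.data.objects["Marker_Doorway_Right"]

clearance_m = (marker_a.matrix_world.translation
               - marker_b.matrix_world.translation).length
print(f"Clearance between markers: {clearance_m:.2f} m")
```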
Downsides: The different methods of viewing 3D models like this on different devices are not all equal: some have more features, some are more intuitive, and they don't all play nice together. For the highest-end, best-of-the-best features you will need a team, or a team member, with knowledge of Unreal Engine 5, and if you're accessing it together from different locations you'll also need someone with knowledge of networking / network programming. But most of the same features can be achieved with far less complexity, locally, on your own devices, and all that's required is sharing the Location Scan 3D Model.
My best advice here for beginners: after you get the initial 3D Scan, have a 3D Generalist clean up the location, add the correct Sun Positioning, and provide that updated 3D Scan to your team. Then you can each access it individually with whatever tools you find most useful. iPhone and iPad users have apps that allow them to view the model as if the iPhone/iPad were a real camera in the environment - this is great for Directors and Cinematographers. Laptop users can use Unreal Engine 5, but it is a bit demanding and complicated, so I recommend Blender unless you have someone managing UE5 for you.
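If your 3D Generalist is working in Blender, adding that Sun Positioning can be as simple as the sketch below. The azimuth and elevation values are placeholders you'd look up for your location, date, and time (Blender's bundled Sun Position add-on can also drive this for you):

```python
import bpy
import math

# Placeholder sky position - replace with values from any sun-position calculator.
sun_azimuth_deg = 135.0     # compass bearing, clockwise from North
sun_elevation_deg = 42.0    # angle above the horizon

sun_data = bpy.data.lights.new(name="ScoutSun", type='SUN')
sun_data.energy = 3.0       # strength - tweak until the shadows read naturally
sun_obj = bpy.data.objects.new(name="ScoutSun", object_data=sun_data)
bpy.context.collection.objects.link(sun_obj)

# A Sun at rotation (0, 0, 0) shines straight down; tilt and spin it to match
# the sky position. Axis conventions depend on how the scan was oriented, so
# sanity-check the result against a shadow you photographed on the day.
sun_obj.rotation_euler = (
    math.radians(90.0 - sun_elevation_deg),
    0.0,
    math.radians(-sun_azimuth_deg),
)
```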
Workflow 3: The Shotlist
Goal: Experiment with Camera Angles and prepare a detailed Shotlist using the 3D Scan / Recreation of our real location.
Requirements: (As above for Workflow 2) A 3D Scan / Recreation of the location; a device capable of viewing the Scan and gathering measurements (Laptop, iPad).
What's Involved: Building upon the Virtual Tech Scout, we have a location recreated in 3D, we're interacting with it through our personal devices, now we can do something particularly useful!
While our Tech Scout workflow was primarily focused on capturing data from that 3D Scan - measurements, shadows, positioning, planning - here we can also get creative and ADD more elements to the 3D scene. This may simply be Virtual Stand-ins for the Actors, or some basic set dressing and props, or even the exact lighting setups for each and every shot.
These new virtual elements we're adding are called "Assets" and can be downloaded for free from some online stores/libraries, or you can scan them with an iPhone or photogrammetry, or they can be created by a 3D Generalist on your team. This is all possible with the same software and tools we've already mentioned.
Furthermore, the way you've been viewing this 3D Scan on your device has always been through something called a "Virtual Camera", and as the name implies, we can control it with all the same settings that a real-world film camera demands.
In practice this means that the Director / Cinematographer can move "Virtual Cameras" around the environment and dial in the settings to test how they want to film each shot. Yes, that means down to the smallest details: Full Frame with a 40mm T/1.3 at 4:3 (so trendy), focused on the eye of the lead from 2.1 meters away, the camera tilted up by 6 degrees... all of this information is stored, and more! This approach already has everything you need hidden in the background, waiting and ready to export shotlists, plans, and maps with an overwhelming amount of precision.
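To make that concrete, here's a minimal Blender sketch of that exact example shot. The lens, sensor, focus, and tilt numbers come from the description above; the shot name, position, and height are made up, and the T-stop is approximated as an f-stop (Blender only models f-stops):

```python
import bpy
import math

# Build a Virtual Camera matching the example shot described in the text.
cam_data = bpy.data.cameras.new("Shot_012A")
cam_data.sensor_width = 36.0              # Full Frame sensor (mm)
cam_data.sensor_height = 24.0
cam_data.lens = 40.0                      # 40mm focal length
cam_data.dof.use_dof = True
cam_data.dof.focus_distance = 2.1         # focused 2.1 m away, on the lead's eye
cam_data.dof.aperture_fstop = 1.3         # stands in for T/1.3

cam_obj = bpy.data.objects.new("Shot_012A", cam_data)
bpy.context.collection.objects.link(cam_obj)
cam_obj.location = (0.0, -2.1, 1.6)                        # hypothetical placement
cam_obj.rotation_euler = (math.radians(90 + 6), 0.0, 0.0)  # 90° is level; +6° tilts up

# The 4:3 aspect is a render setting rather than a camera setting.
scene = bpy.context.scene
scene.render.resolution_x = 4096
scene.render.resolution_y = 3072
```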
How exactly to export these plans will vary between pieces of software: some have built-in PDF exporters, others need you to note things down yourself in a spreadsheet, but they all support the old reliable screenshot approach.
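And if Blender is your tool of choice and you'd rather skip the manual spreadsheet, a short script can do the noting-down for you. The columns and output path below are illustrative, not any standard shotlist format:

```python
import bpy
import csv
import math

# Walk every camera in the scene and write its key settings to a CSV shotlist.
rows = []
for obj in bpy.context.scene.objects:
    if obj.type != 'CAMERA':
        continue
    cam = obj.data
    rows.append({
        "shot": obj.name,
        "focal_length_mm": round(cam.lens, 1),
        "sensor_mm": f"{cam.sensor_width:g}x{cam.sensor_height:g}",
        "focus_distance_m": round(cam.dof.focus_distance, 2),
        "aperture_fstop": cam.dof.aperture_fstop,
        "position_m": tuple(round(v, 2) for v in obj.location),
        "tilt_deg": round(math.degrees(obj.rotation_euler.x) - 90.0, 1),
    })

if rows:
    with open("/path/to/shotlist.csv", "w", newline="") as f:  # placeholder path
        writer = csv.DictWriter(f, fieldnames=rows[0].keys())
        writer.writeheader()
        writer.writerows(rows)
```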
Benefits: The ability to test the framing of shots virtually in this way may at first not sound that helpful - after all, what are you framing? But depending on your scale this can get very, very detailed. Say it's just your camera and the environment - okay, not great - but what if we add in a mannequin and a couple of lights... lights that we can dial in to your preferred lighting ratios? Or what if we swap out that mannequin for some live motion capture of your mate, and use your iPad on a shoulder rig to dial in the camera movement?
All of this is possible with the same suite of tools; they're just layers added on top of each other as you get more familiar with UE5, Unity, Blender, or whatever you choose. And that in itself is a massive benefit of this approach - you can start super small with no expectations, build experience and confidence, and slowly find your way to a helpful implementation.
Downsides: As we increase in complexity, your options for which software / hardware to use become more limited. Realistically, in the world of realtime motion capture (as we would need for the actor stand-in) you'll need to use a Game Engine like UE5 or Unity, and for the iPad/iPhone tracking you obviously need to be in the Apple ecosystem. The more of these little pieces you assemble, the more restricted you become in how you can actually make all these elements play nice together, and the more certain it becomes that you need a team, or a team member, in charge of keeping it all working.
Next Steps
No matter who you are, the key to benefiting from Virtual Production is understanding what it actually means and why it is significant - which hopefully this article and video have helped you some way towards.
For Producers:
Virtual Production is harder to implement for established Producers than for those still emerging, as your network of contacts and frequent collaborators also need to go through a period of relearning and adapting to make use of it.
However, there are some clear Next Steps you can embrace to make more informed decisions going forward.
Separate Virtual Production and LED Screens / Volumes in your mind.
Encourage and incentivise your departments to do their own research and gain experience with Virtual Production, starting with your Location Scouts.
Have your VFX Supervisor reach out to learn what 3D systems your different departments are already using - you might be surprised how much is already there, just waiting to be put on the same page.
Next time your team is looking to PitchVis an idea, urge them to try UE5. PitchVis is an excellent pipeline for getting familiar and setting up the infrastructure required for Virtual Production, without the risk or investment required in the later stages of Production.
Be very careful about hiring "Virtual Production Studios": these are typically Location Studios specialising in In-Camera VFX or LED Panel / Volume setups. They are not all operating at the same level of quality and consistency, so do some extra due diligence, and ideally hire a Virtual Production Advisor / Consultant to help you navigate that relationship.
Depending upon your scale, consider hiring a Virtual Production Assistant to manage and maintain your own personal (or production studio's) collection of assets and locations, and to keep a small demo room / portable kit available for pitching and onboarding. This really is the most powerful untapped option right now. Because so much Virtual Production communication comes from a place of selling something - a big studio all-in-one solution - individual Producers remain distant from the process, seeing it instead as something you hire a studio specifically to do. But that's misleading and unnecessary: just as your production studio may own its own gear, art, trucks, or whatever is cost-effective in your part of the world, owning your own Virtual Production Assets is just as cost-effective.
For Individuals:
As an Individual getting into this space, it's a weird time: it's all new, everyone with prior knowledge and experience is heavily biased by the way problems used to be solved and the workflows they're already familiar with, and there’s a fair amount of fear coming from a place of job preservation in the face of these changes.
So my best advice is as follows:
Try to think of the Virtual world and the Physical as one and the same: for any problem you're having in one, you want to be able to intuitively and instantly switch over to the other to solve it if that's more appropriate.
Practice Physical and Virtual Filmmaking together. If you're an absolute beginner, the best way to learn the differences and what to pay attention to is to try to create the same shot in a Virtual environment and a Physical one.
Make something you want to see; don't just try to learn techniques and guidelines based upon the current industry. We're working from generations of assumptions and industry expectations that have nothing to do with what you want to make, and you don't need to abide by them.
Further Reading
Many of these links are targeted at professionals already working in Screen or VFX. If you're an individual looking to start creating on your own, simply download something from the Software section, or check out some of the Indie Educators and have fun.
I’ll keep updating this list so feel free to reach out with any additions.
Contacts
Software
Indie Educators
Reading / Resources
Virtual Production Field Guide: