The indie virtual production revolution with Unreal Engine

How Unreal Engine helps small teams and indie moviemakers boost their workflow and create realistic VFX shots in real time with virtual production.

Cory Strassburger in an extract from one of his videos.

In the last five years, an astonishing evolution has taken place in the virtual production industry all over the world, driven by more accessible technologies for indie producers, the need to comply with Covid restrictions while filming new movies, and the advances in the real-time VFX pipelines of Unreal Engine.

In this article, I want to share my personal experience of working in this field, provide an overview of this flourishing industry, discuss the revolutionary tools that everyone can now use to produce their own movies, and highlight the challenges behind the scenes.

What is Virtual Production?

Nowadays, virtual production covers a very broad range of definitions due to its rapid evolution. In an attempt to capture them all, we can define it as an innovative video production technique that seamlessly combines physical and digital elements using real-time software and 3D tracking devices specifically designed for this purpose.

Among all its applications, placing real actors in 3D virtual environments and bringing virtual actors to life are the most common use cases for which virtual production is being employed by movie companies and live broadcasting channels.

Example of placing real actors in 3D virtual environments.

Example of giving life to virtual actors.

To the joy of many producers, today these and similar use cases can be implemented basically for free using Unreal Engine and its virtual production tools. This represents, especially for indie producers, a real revolution and an opportunity to step into this market even with very limited resources.

The use of Unreal Engine for Virtual Production

In 2018, I started working with professional virtual production systems, and I was particularly surprised to find out that two of the main software packages used to create real-time VFX were Unity (which was used while shooting The Lion King) and Unreal Engine.

Back then, only a few tools were available within these engines to make virtual production simple enough for general use. My team and I therefore had to deal manually with many issues: synchronization, wrongly detected video formats causing dropped frames and visual artifacts, and wrong colours resulting from incompatible render passes. Solving these challenges and creating custom automated configurations required a lot of coding.

After all, both Unity and Unreal were initially designed for game development, and most of their rendering pipelines and mechanics did not fully support the features required for a professional virtual production workflow.

Credit: Disney.

However, Unreal Engine has evolved drastically in recent years, and most of those troubles have been solved with the latest tools offered by Epic Games. These have remarkably boosted many movie makers and broadcast producers, who are now free to use Unreal for their virtual production projects, availing themselves of much simpler systems and comprehensive workflows that have helped release content such as The Mandalorian and the Fortnite World Cup live show in very short times. This trend is expected to continue in the future.

Why an Indie Virtual Production Revolution?

People might think that only expensive LED walls or movie sets allow producers to create their movies and real-time VFX, but the reality is that a simple green screen, some affordable tracking devices, and a camera can be enough for a streamer or a hobbyist to start experimenting successfully with virtual production.

This was made possible by Unreal's new virtual production pipeline, which significantly reduces production times and costs compared to traditional approaches and makes room for new customized and simplified solutions that everyone can now benefit from.

The virtual production pipeline & Unreal Engine

Traditionally, filmmakers were forced to use a sequential production pipeline that cost a lot of effort, time, and especially money. It typically involved three steps:

  1. Shoot a video of the performance with the real actors;

  2. Apply to the recorded footage some Previz* manually designed by artists, serving as a guideline for the post-production team to apply the visual effects afterwards, following the producer's requirements;

  3. Add the VFX at final-pixel quality in post-production, replacing the Previz with higher-quality models and animations.


*Previz are visual aids that give the producer an idea of how the final shot will look once the visual effects are ultimately applied to the original footage. The name refers to the "pre-visualization" of a draft version of the VFX superimposed on the real footage of the performance.


From this perspective, a single mistake along the pipeline could mean unrecoverable losses of hundreds of thousands of dollars, and fixing it was difficult without incurring huge additional costs and production times.

Since Epic Games stepped into virtual production, a more flexible and iterative approach has been adopted, in which a director can not only change a final edit quickly and multiple times, but also visualize the live shooting of the actors and the visual effects simultaneously in real time, saving a considerable amount of money and time. As a result, indie producers can enter the market and produce their shots faster and cheaper than ever before.

More affordable camera and motion tracking solutions

In conjunction with the software advancements brought by Unreal Engine, plenty of new hardware has recently entered the market, providing motion capture and camera tracking capabilities at very reasonable costs.

Mocap Suits

A couple of years ago, I used an inertial mocap suit for some virtual production testing. Despite the fact that it is not particularly accurate in terms of room-scale body tracking (the character slowly drifted away due to small inaccuracies accumulating in the body sensors during use), I found it very easy to set up and quite responsive to arm, leg, head, and chest movements. As a starting point, it could be a valid investment for an indie studio for full-body tracking.

More professional but expensive suits use reflective markers, achieving more accurate room-scale body tracking, but they require additional special cameras to track those markers, which for small indie producers might be overkill.
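To make the drift problem more concrete, here is a minimal, purely illustrative Python sketch of how tiny per-frame sensor errors accumulate over a take, and why an occasional absolute reference (such as optical markers) keeps the error bounded. The numbers are made up for the example, not measured from any real suit.

```python
import random

# Purely illustrative: why inertial mocap drifts and why an absolute reference
# (optical markers) keeps the error bounded. Numbers are invented for the demo.

FPS, SECONDS = 60, 120
BIAS_M = 0.00005      # hypothetical tiny systematic error per frame (0.05 mm)
NOISE_M = 0.0005      # hypothetical random error per frame (0.5 mm)

def simulate(recalibrate_every=None):
    error = 0.0
    for frame in range(FPS * SECONDS):
        error += BIAS_M + random.uniform(-NOISE_M, NOISE_M)
        if recalibrate_every and frame % recalibrate_every == 0:
            error = 0.0   # absolute reference resets the accumulated drift
    return error

print(f"inertial only : {simulate():.3f} m of drift after {SECONDS}s")
print(f"with markers  : {simulate(recalibrate_every=FPS):.3f} m (reset every second)")
```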

Camera Tracking and Face Tracking Solutions

In the past, I have worked with a markerless camera tracking system that also provides a handy plugin to integrate the live tracking with custom rendering pipelines in the engine. With its camera bar, it is possible to accurately track the real camera movements both indoors and outdoors without markers, which represents a good deal for simple setups and small studios.

Another system that I have seen in showcases uses reflectors and mainly targets indoor spaces. Surprisingly though, in this context even just an iPhone can be a valid alternative for indie producers these days. Thanks to the Live Link VCAM and Live Link Face apps on the App Store, it is possible to move the virtual camera in Unreal simply by tilting and translating the phone, or to live capture a performer's face to drive that of a virtual character. This is possibly the cheapest and most convenient solution for newcomers to the field, and I recommend it for less demanding shooting requirements, especially when live distortion correction and lens tracking are not critical for the shots.
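To give an idea of what "streaming a phone's pose into the engine" looks like at the lowest level, here is a hedged sketch of a plain UDP listener. This is not the Live Link protocol; the JSON packet layout, the port, and the drive_virtual_camera() helper are all hypothetical, and in a real setup the official apps and plugins handle this for you.

```python
import json
import socket

# Hypothetical sketch: a phone app sends JSON packets like
# {"pos": [x, y, z], "rot": [pitch, yaw, roll]} over UDP, and we forward them
# to whatever drives the virtual camera. Not the actual Live Link protocol.

HOST, PORT = "0.0.0.0", 9000  # assumed port for the hypothetical app

def drive_virtual_camera(pos, rot):
    # Placeholder: in a real setup this would feed a Live Link source or a
    # custom camera controller inside the engine.
    print(f"camera -> pos={pos} rot={rot}")

with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
    sock.bind((HOST, PORT))
    while True:
        data, _addr = sock.recvfrom(1024)
        packet = json.loads(data.decode("utf-8"))
        drive_virtual_camera(packet["pos"], packet["rot"])
```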

It's also good to know that a few custom plugins for Android exist too, but they are not yet as well integrated into the engine as the iOS apps.

Homemade tracking solutions

It might not be obvious, but many custom-made solutions exist nowadays for home-based virtual production sets as well! Some of them use consumer devices like Vive trackers, Kinect, and Oculus trackers. Others get even more creative, using gaming steering wheels and pedal controllers to drive props and the virtual camera in Unreal Engine (see the sketch below). These are a remarkable alternative to professional solutions and are slowly becoming a trend among many indie producers and YouTubers. A comprehensive overview of some of these homemade setups has also been presented in community showcases.
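As a taste of how such a homemade rig can work, here is a minimal sketch that reads a consumer steering wheel or gamepad with pygame and maps its axes to a simple camera dolly. The axis numbers, the gains, and the way the values would reach Unreal (OSC, UDP, a custom plugin) are assumptions, not a reference implementation.

```python
import pygame

# Hedged sketch: read a steering wheel / gamepad with pygame and map its axes
# to a simple virtual camera dolly. How the values reach Unreal is up to you.

pygame.init()
pygame.joystick.init()
wheel = pygame.joystick.Joystick(0)  # first connected controller
wheel.init()

camera_x = 0.0    # lateral dolly position, arbitrary scene units
camera_yaw = 0.0  # degrees

clock = pygame.time.Clock()
while True:
    pygame.event.pump()
    steer = wheel.get_axis(0)      # assumed: axis 0 = steering
    throttle = wheel.get_axis(1)   # assumed: axis 1 = pedal
    camera_yaw += steer * 1.5            # arbitrary gain per frame
    camera_x += (1.0 - throttle) * 0.5   # pedals often rest at +1.0
    print(f"dolly x={camera_x:.1f} yaw={camera_yaw:.1f}")
    clock.tick(60)
```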

For convenience, you can check the following YouTube videos and tutorials for more examples of homemade tracking solutions and configurations:

  • Creating a Virtual Camera using HTC Vive in Unreal Engine (4.26)

  • iPhone Facial Capture with Unreal Engine | Unreal Fest Online 2020

  • New Virtual Camera 2.0 Setup in Unreal Engine using your iOS Device

  • How to Setup an iPad as a Virtual Camera using the Live Link VCAM app

  • Virtual Production in Unreal: Zoom, Focus, and Camera Tracking using 3 Vive Trackers

  • VR Mocap for Unreal Engine – Quick Start Video

  • Virtual Camera Overview

  • Virtual Production with a Projector & Unreal Engine

  • Camera Stage Controller | UE4 Virtual Production

  • My Virtual Production Setup At Home

Which tools are recommended for Indie Virtual Production in Unreal Engine?

To help indie producers start their journey in this fascinating industry, below I have listed some of the most important tools for producing your virtual production VFX shots in Unreal Engine.

Live motion capture with Live Link

Once you have a motion capture suit or a tracking system, you will use Live Link to let Unreal decode the position of your face, body, or camera in real time, automatically handling the synchronization of your movements with the engine's virtual character or camera. More info on how to configure Live Link and understand its dynamics can be found in the official documentation and tutorials.
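As an example of the kind of data Live Link Face streams (ARKit-style blendshape curves such as jawOpen or eyeBlinkLeft), here is a small Python sketch of one common post-processing step: lightly smoothing the incoming values before they drive a character. The apply_to_character() binding is a placeholder; inside Unreal this mapping is normally done with the Live Link Pose node in an Animation Blueprint rather than with code like this.

```python
# Sketch of light exponential smoothing on incoming facial capture curves.
# The ARKit-style curve names are real examples of what Live Link Face streams;
# apply_to_character() is a placeholder for the engine-side binding.

ALPHA = 0.35  # 0 = frozen, 1 = no smoothing; tune to taste
smoothed = {}

def smooth_frame(raw_curves: dict) -> dict:
    for name, value in raw_curves.items():
        prev = smoothed.get(name, value)
        smoothed[name] = prev + ALPHA * (value - prev)
    return dict(smoothed)

def apply_to_character(curves: dict) -> None:
    print(curves)  # placeholder: would set morph target weights in the engine

# Example frame as it might arrive from a face capture stream:
frame = {"jawOpen": 0.62, "eyeBlinkLeft": 0.05, "browInnerUp": 0.30}
apply_to_character(smooth_frame(frame))
```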

One of the most appealing features is combining Live Link with MetaHumans to easily create and control virtual characters in real time. For convenience, here are some tutorials from YouTube:

  • Unreal Engine Metahuman Live Face App Tutorial

  • Calibrate Metahuman and Live Link Face the right way

  • Unreal Engine Metahuman Face and Body Motion Capture Tutorial

Video capture your live performance with the Media Framework

All you need to know about capturing your live video and streaming it into Unreal Engine is covered by the Media Framework, a set of assets and tools within the engine that offers a wide range of supported formats and video sources. From a simple webcam stream, to a connected DSLR, to more professional cinematic cameras (which are usually more expensive and use SDI cables, requiring AJA or Blackmagic capture cards to bring the signal into the computer), you can define different video streams to be processed accordingly in real time.
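Inside Unreal, you normally create a Media Player and a Media Texture and pick the device from the editor UI, so no code is needed. Just to illustrate what a "live video source" is from a programmatic point of view, here is a standalone OpenCV sketch that grabs and previews the first capture device on the machine; the device index is an assumption about your setup.

```python
import cv2

# Standalone sketch of grabbing a live video source with OpenCV, simply to
# illustrate the kind of stream Unreal's Media Framework ingests natively
# (webcam, capture card, file, ...).

cap = cv2.VideoCapture(0)  # 0 = first capture device; a capture card appears similarly
if not cap.isOpened():
    raise RuntimeError("No capture device found")

while True:
    ok, frame = cap.read()
    if not ok:
        break
    cv2.imshow("live source preview", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press q to quit
        break

cap.release()
cv2.destroyAllWindows()
```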

Some useful tutorials can also be found on dedicated YouTube channels.

Compositing with the new Composure plugin

Once you have a moving character and your live video, it is time to combine the virtual animation with the real footage. This can be done with Composure, which allows you to blend virtual and real footage into a final shot in real time.

The workflow is similar to the one used in Nuke by professional compositors to combine VFX and pre-recorded footage, with the difference that in Unreal this can be done easily, in real time, and with a few commands.

Additionally, if you are already familiar with Nuke workflows, you can directly stream all the real-time rendered sources from Unreal and compose them in the Nuke graph using a dedicated bridge plugin. Note, however, that the Nuke license could require a considerable financial investment, and if you are starting your journey with a limited budget, you might want to use Composure instead, optionally combined with more affordable third-party software for post-production and final editing.
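Composure does its keying and layering in real time inside the engine, but the underlying principle is the same as in any compositor. Here is a small offline sketch of that principle using OpenCV and NumPy: key the green screen out of the live-action plate and layer the actor over a CG background. The file names and the HSV thresholds are placeholders to be tuned to your footage.

```python
import cv2
import numpy as np

# Not Composure itself: a minimal offline chroma-key composite showing the same
# principle, keying out the green screen and layering the plate over a CG frame.

plate = cv2.imread("actor_greenscreen.png")   # live-action plate (placeholder path)
cg_bg = cv2.imread("unreal_background.png")   # CG background (placeholder path)
if plate is None or cg_bg is None:
    raise SystemExit("Provide your own plate and background images")
cg_bg = cv2.resize(cg_bg, (plate.shape[1], plate.shape[0]))

# Build a matte from green dominance (a very naive keyer).
hsv = cv2.cvtColor(plate, cv2.COLOR_BGR2HSV)
lower_green = np.array([40, 60, 60])
upper_green = np.array([85, 255, 255])
green_mask = cv2.inRange(hsv, lower_green, upper_green)        # 255 where green
alpha = cv2.GaussianBlur(255 - green_mask, (5, 5), 0) / 255.0  # soft actor matte

# Composite: actor over CG background, per-pixel alpha blend.
alpha3 = alpha[..., None]
comp = (plate * alpha3 + cg_bg * (1.0 - alpha3)).astype(np.uint8)
cv2.imwrite("composite.png", comp)
```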

A good resource to learn Composure is one of the free Unreal Online Learning courses. There are also some useful YouTube videos for beginners:

  • Using Composure with a Backplate and HDRI

  • Getting Started with Composure | Live Training | Unreal Engine

  • Unreal Engine 4 Green Screen Tutorial

  • Indie Virtual Production is here!

  • Unreal Engine Virtual Production Composure Output to Viewport

Prepare your virtual sets with VR Scouting

To considerably improve your perception of the space and size of your virtual set, you can use virtual reality within the engine and visualize your entire scene in 3D from a headset. It is very difficult for filmmakers to design a shoot while looking at a 2D window into a 3D world. Putting on a VR headset unlocks the extremely powerful spatialization abilities of the brain to understand how the pieces of the environment fit together, and it allows filmmakers to design a shoot using the storytelling instincts they have trained throughout their professional lives. With VR scouting it is possible to easily change the lighting and camera setups, add markers, and measure your props using your hands and a simple UI. Especially for VR enthusiasts, this represents a much better alternative to 2D displays when designing the space of your virtual scene.

Let your crew work remotely with VR & Unreal

If you have a small team for your indie project, Multi-User Editing will let you meet them remotely, directly inside your virtual scene in Unreal Engine, using either PCs or VR headsets. This is extremely useful when working with team members across the globe, breaking the boundaries of the physical world. Just like on a real set, this tool helps the director better control the scene setup while shooting in real time, and it helps the actors improve their performances by being immersed in the virtual stage.

For a better understanding of how the Multi-User Editing tool works, I advise you to watch a webinar from Epic Games released on their YouTube channel and a live presentation from SIGGRAPH:

  • Explore Collaboration with Unreal Engine's Multi-User Editor | Webinar

  • Generations – SIGGRAPH 2018: Real-Time Live!

More information on the related SIGGRAPH 2018 presentation can be found online.



What's new in 2022 for indie producers

The Virtual Production Week 2022 by Epic Games

Recently, Epic Games organized Virtual Production Week 2022, a comprehensive event gathering different talks and showcases about the opportunities that Unreal Engine has created for the movie industry, and especially for indie producers, to realize impressive virtual production projects even with small teams.

Among the different sessions, one of the most relevant to the topic of this article focuses on how small teams are empowered by the engine's tools to produce impressive shots and streams.

Believable virtual actors and performances

Epic Games is working on a powerful, easy-to-use tool, still in early access, that will help creators produce believable virtual characters for their cinematic shots and video games.


The Matrix Awakens and the MetaHuman Creator experience

As an unexpected surprise at the end of 2021, Epic Games released an unbelievable new experience made with Unreal Engine and MetaHumans: "The Matrix Awakens". Even though it was only released for consoles and is not currently available for the Editor, this project demonstrated how powerful the engine has become in terms of real-time procedural rendering.

Creators can now experiment with the MetaHuman Creator early access directly from a web browser to produce their own virtual humans and customize them from a list of templates. These are then cloud-saved to the Epic account used to log into the tool, and can be imported into any scene through Quixel Bridge.

This free tool is definitely worth checking out, especially if your team is small and might not have enough artists to model realistic 3D humans from scratch.

The era of Virtual YouTubers

People like Cory Strassburger have managed to create amazing content for YouTube, showcasing how simple it is nowadays to create virtual characters and animate them through innovative virtual production pipelines in Unreal. Specifically, for his works Cory used the iPhone face capture app combined with an Xsens body capture suit to give life to characters like Xanadu, probably his most popular one so far.

In the following videos, he shows how he managed to create his characters from his homemade virtual production room. Check them out!

Where to find more learning resources?

To help you learn more about virtual production and its workflow in Unreal Engine, I wanted to write down a list of useful learning materials available online for beginners and more experienced indie creators:

Unreal Online Learning courses (Epic Games account required):

YouTube channels:


More information on the roadmap planned by Epic Games for Unreal Engine can be found in one of their latest webinars.

I hope you will be successful in your journey and that my contribution will speed up your learning!
