The purpose of this blog is to document the research and development of my senior thesis project. I hope to post updates as often, and in as much detail, as possible, so that anyone can recreate and use my method of virtual production. Here’s a little introduction to what I’m hoping to accomplish over the next 8 months:
For my senior thesis, I intend to develop a virtual production pipeline for use in low-budget or indie video and film creation. The project will adapt emerging techniques from high-budget productions for use with more accessible technologies, and it will take the form of a suite of tools that facilitate virtual production.
Virtual Production
Virtual Production is a method of film/video production that combines live footage and computer graphics in real time. There are three major aspects to a virtual production pipeline:
1. 3D scene creation,
2. Real-time camera tracking,
3. Real-time visualization:
a. real-time compositing from a green or blue screen (results displayed on a monitor, i.e., hybrid virtual production);
b. real-time display of 3D scenes on set using projectors or LED screens (results displayed in camera).
Problem
Projecting or compositing live imagery is not new - in fact, it dates back to the 1930s, when rear projection was regularly used in film production for in-vehicle driving scenes. However, rear projection required a fixed perspective that had to be carefully planned: the camera and background imagery needed to align perfectly to maintain visual continuity, and the live-action camera had to remain stationary. New technologies have solved these problems, but current methods of virtual production are extremely expensive and technically complicated, and in some cases they require proprietary technology and software that is unavailable to most creators.
Solution
To make Virtual Production more accessible and affordable, I will adapt industry techniques for use with more widely available technology - for example, using cellphones and projectors in lieu of 8K LED screens and Blackmagic cameras. It is important that my virtual production setup be replicable, so the project and its components must be documented in detail. This may take the form of video documentation, tutorials, a written guide, or some combination of these. Furthermore, all of the software used in this project will be free/open-source, ensuring that anyone can access the required tools.
Objectives
1. Create a real-time camera tracking system that uses an iPhone to control a virtual camera in Blender. Using Apple’s ARKit API, tracked feature points will be used to calculate the real-world position and rotation of the iPhone. That pose will drive what is projected or composited into the live-action footage, simulating a physical camera moving through real space. This is the basis for a projected/composited backdrop that looks realistic, since it reproduces real-world camera effects such as parallax. This may involve the development of an app (a sketch of the Blender side follows this list).
2. Build compositing tools for TouchDesigner for use in hybrid virtual production: keying, matting, and despill (see the second sketch below).
3. Build a prototype combining these tools for testing and documentation. This will include designing a studio space with projectors and/or green screens.
4. Release the tools and content I have created via video tutorials and/or written documentation.
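
To give a sense of the first objective, here is a minimal sketch of what the Blender side of the tracking system could look like. It assumes a hypothetical iPhone app that streams ARKit’s camera pose over UDP as small JSON packets; the port number, packet format, and camera object name are placeholders of mine, not a finished protocol.

```python
# Minimal sketch: drive Blender's camera from poses streamed over UDP.
# Assumes a (hypothetical) iPhone app sends one JSON object per packet:
#   {"pos": [x, y, z], "rot": [w, x, y, z]}   # meters + quaternion
# Run from Blender's Text Editor; a timer polls the socket each tick.

import bpy
import json
import socket

HOST, PORT = "0.0.0.0", 9000  # assumed port; must match the phone app

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind((HOST, PORT))
sock.setblocking(False)  # never stall Blender's UI waiting on a packet

camera = bpy.data.objects["Camera"]  # default camera name assumed
camera.rotation_mode = "QUATERNION"

def apply_latest_pose():
    """Drain the socket and apply the most recent pose to the camera."""
    latest = None
    while True:
        try:
            data, _ = sock.recvfrom(1024)
            latest = json.loads(data)
        except BlockingIOError:
            break  # no more packets queued
    if latest is not None:
        camera.location = latest["pos"]
        camera.rotation_quaternion = latest["rot"]
    return 1 / 60  # re-run the timer at roughly 60 Hz

bpy.app.timers.register(apply_latest_pose)
```

One wrinkle I’ve glossed over: ARKit reports poses in a y-up coordinate system while Blender is z-up, so a real implementation would need to remap axes (and handle dropped packets and smoothing) before applying the pose.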
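
For the second objective, the sketch below shows the per-pixel math behind a basic green-screen key and despill. In TouchDesigner this logic would live in a network of TOPs or a GLSL TOP; the NumPy version here is only meant to illustrate the arithmetic, and the threshold/softness parameters are illustrative defaults, not tuned values.

```python
# Illustrative per-pixel math for a simple green-screen key and despill.
import numpy as np

def green_key(rgb, threshold=0.1, softness=0.1):
    """Return an alpha matte: 0 over the green screen, 1 on the subject.

    rgb: float array of shape (H, W, 3), values in [0, 1].
    A pixel reads as "screen" when green exceeds the larger of red
    and blue; threshold and softness shape the matte's falloff.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    excess = g - np.maximum(r, b)  # how much greener than anything else
    alpha = 1.0 - (excess - threshold) / softness
    return np.clip(alpha, 0.0, 1.0)

def despill(rgb):
    """Suppress green spill by clamping green to the red/blue average."""
    out = rgb.copy()
    limit = (rgb[..., 0] + rgb[..., 2]) / 2.0
    out[..., 1] = np.minimum(rgb[..., 1], limit)
    return out

# Usage: key a frame, despill it, premultiply for compositing.
frame = np.random.rand(1080, 1920, 3).astype(np.float32)  # stand-in footage
alpha = green_key(frame)
foreground = despill(frame) * alpha[..., None]
```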
Virtual Production is an exciting, emerging production method. Bringing virtual production capabilities to a larger group of creators opens up possibilities for content creation that are currently out of reach. Let’s get started!