>>Matt: Hello, and
welcome to this Unreal Fest presentation. Thank you so much for tuning in. My name is Matt Workman, and
the name of my presentation is Work From Home
Virtual Production. And I'm going to be talking
about how I built my own DIY virtual production studio in
my house and the type of work that I'm doing today. So a little background about me. For 10 years, I worked in New
York City as a live action cinematographer primarily
working on commercials and music videos. And what we can see
here is that I'm on the bottom left
of the screen there, and I'm operating a
camera remote head. And I'm viewing the
image right there. This is a very typical shoot
that I would be part of. And my specialty was that I
was doing a lot of previs work, specifically in Maya. And I was working on
really technical jobs with VFX companies,
and they were already working with previs. So I really wanted to be
able to communicate that way myself, so I started to get into
the fray, so to speak. This next image
here is from a shoot that I did that was using
the TechnoDolly, which is a motion control crane
here on a [INAUDIBLE] studio. And this is the
technical previs. And I was working with
The Mill, and they were kind enough to give me
their TechnoDolly IK rig. And that's how I was planning
for this particular shoot. After the planning, the
fun part-- for me, anyway-- is executing it
in the real world. So we have to program
the actual crane. And it loosely
followed the previs, just making sure
that we're getting the right boards across,
and the right frames. But on a real shoot, we
always have a few things changing. And so this is the primary
type of work I did as a DP, and this is how I got introduced
to working in 3D, as well. So that was how I
got introduced to 3D and combining it with live
action cinematography. And I actually went on to start
a company called Cinematography Database, where I
made products that did just this in
Maya, in Cinema 4D, and eventually in Unreal Engine. And what I've done
is I've basically combined all of my knowledge
and assets and tools from the other
programs, and I've brought them to Unreal
Engine and put them together in one standalone package that
is called Cine Tracer that's available now on Steam. Cine Tracer has
become really popular in the film industry,
which I'm very proud of, and was recently featured in
American Cinematographer, which is one of the biggest trade
journals for cinematography. So that's how I got
started in Unreal Engine. And for SIGGRAPH
2019, Epic Games called me to be the director
and DP for a pretty awesome demo here that you can see that
was working with LED walls, and was quite
literally the preview and basically a mini demo of
the tech used for The Mandalorian. So I spent about a
month on that stage, working with Lux Machina
and Profile Studios and a bunch of partners
that you can see here, and my mind was really blown. I'd never heard of doing
this sort of thing. And ever since working
on this project, I have been taking the
steps to be able to recreate this myself at an indie level. One, to see if it was
possible, and two, just because I really like
working in this kind of hybrid live action and
virtual system here, where you can actually see
final pixels through the camera. That is just amazing to me, and
what really started my journey into documenting building
an indie virtual production studio. So earlier this
year, 2020, I started to convert my basement,
which luckily is pretty big, into an indie virtual
production studio. And this kind of led to this whole work from home concept-- this was all pre-shutdown, but it turned out to be pretty good timing. You can see the very beginnings
of the studio are pretty small. It's one computer, one camera,
and a very small green screen. But it kind of
balloons by the end, and we're able to do
some pretty cool stuff. So the first component to
building a virtual production studio at home is some sort
of camera tracking technology. And the most available
and the most, I think, robust that I would
recommend most people to have is the HTC Vive Pro
and Vive Tracker. This is your standard
Vive Pro that you would use to play video games. But using the controller
and the tracker, we're able to do really
high-precision tracking for camera movement, which is
the first step that you really need. As I found out later
on, I would eventually add more base stations. And these are the
sensors that basically allow you to track
the controller or track its position. And the typical Vive
Pro comes with two, and I got two more for a
total of four, you guessed it. And this just gives
you better coverage, and you can make
a bigger volume.
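For a rough idea of what that tracking looks like in code, here is a minimal sketch-- not the setup used in these demos-- that polls a Vive Tracker's pose directly through the OpenVR API. Filtering by device class and converting the pose into your engine's coordinate system are left as assumptions.

#include <openvr.h>
#include <cstdio>

int main()
{
    vr::EVRInitError initError = vr::VRInitError_None;
    // Background app: we only want tracking data, not the HMD render loop.
    vr::IVRSystem* vrSystem = vr::VR_Init(&initError, vr::VRApplication_Other);
    if (initError != vr::VRInitError_None) return 1;

    vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
    vrSystem->GetDeviceToAbsoluteTrackingPose(
        vr::TrackingUniverseStanding, 0.0f, poses, vr::k_unMaxTrackedDeviceCount);

    for (uint32_t i = 0; i < vr::k_unMaxTrackedDeviceCount; ++i)
    {
        // Pick out Vive Trackers, as opposed to the headset or controllers.
        if (vrSystem->GetTrackedDeviceClass(i) != vr::TrackedDeviceClass_GenericTracker)
            continue;
        if (!poses[i].bPoseIsValid)
            continue;

        const vr::HmdMatrix34_t& m = poses[i].mDeviceToAbsoluteTracking;
        // Translation lives in the last column of the 3x4 row-major matrix.
        printf("Tracker %u position: %.3f %.3f %.3f (meters)\n",
               i, m.m[0][3], m.m[1][3], m.m[2][3]);
    }

    vr::VR_Shutdown();
    return 0;
}

Inside Unreal you would normally let a plugin such as Live Link handle this, but seeing the raw pose makes it clear what the base stations are actually providing.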
So this is one of my first demos, where I was just testing
out the Vive Tracker here. And I'm not really trying to
make the image look amazing, though I went for an
overcast look, which is really easy to make
look good in Unreal Engine. What I'm testing here is the
latency and the responsiveness of the tracker when you
have four base stations up. And it is really responsive
once you set it up right. You do have to be worried
about things like windows or very large, shiny surfaces
that are causing reflections. This technology is
based on laser tracking. And if you've ever worked
with LiDAR or other laser technologies, things like
reflective surfaces and mirrors and glass can really
throw off the tracking. So for me, I'm in a drop
ceiling location with a carpet, so it's pretty perfect. I have almost no latency, and
it's very, very responsive. So after getting the virtual
camera going with the Vive Tracker, my real goal
is to combine live action and virtual worlds together. So that means putting the Vive
Tracker on my camera, which, at the time, was a Canon C300. And the way that you get video
footage into Unreal Engine-- this is the first big
step for a lot of people-- is that you need a capture card. And the one that I'm using is
the Blackmagic Design DeckLink 8K Pro. And this allows me through SDI
to get video in real-time-- live video-- into Unreal Engine
and make it usable.
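As a rough sketch of the Unreal side, once the capture-card Media Source and a Media Player with its Media Texture exist as assets, starting the live feed can be a single call; the asset setup and the Blackmagic source type are assumed to be configured in the editor, and this is only one way to wire it up.

#include "MediaPlayer.h"
#include "MediaSource.h"

// Call this (for example from an actor's BeginPlay) with the Media Player
// and the capture-card Media Source created in the Content Browser.
bool StartLiveVideoFeed(UMediaPlayer* MediaPlayer, UMediaSource* CaptureSource)
{
    if (!MediaPlayer || !CaptureSource)
    {
        return false;
    }
    // OpenSource starts playback; the Media Texture bound to this player
    // then updates every frame and can be sampled in any material, which
    // is what puts the live SDI image onto that background plane.
    return MediaPlayer->OpenSource(CaptureSource);
}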
And this is one of my very first tests here. No camera tracking yet. But you can see on the texture
in the background of the plane that I actually have
live video captured, and it's now usable
in Unreal Engine. So that's really the first step. And this clip
continues, and it's just a little preview of what
my YouTube channel has, which is much longer versions
of what I'm showing here. These are just clips from them. So next, after
that, we're really looking for camera tracking. So we want to be able
to match the real world camera with the virtual camera
and put those images really right on top of each other. So in this case,
you can see that I'm using a Vive controller, and
that works perfectly well. This is one of the
very first tests with the C300, which turned
out to be not the best camera for this. There's some cameras that
are better than others. But this is the
first time that I'm doing camera tracking and a
live composite in Unreal Engine. Now, this is still very shaky,
and I'm moving very slowly to make the demo look good. But it's the first
time that I was really able to get this sort of
virtual set setup going here, and it was very encouraging. And this is the first step,
I think, for a lot of people. And then I'm also showing
how this is a live set. You don't bake any of
these environments. Everything is dynamic. I mean, I guess you
could bake the lighting, but I tend to keep it
all dynamic, and showing that I can now actually
change and alter the 3D background, which is
half the fun, in my opinion. So after a little bit of
work with the Canon C300 and some more research into it,
I decided to switch cameras. I wanted to be able to do 4K. I still needed SDI. And very specifically, I needed
time code and the ability to genlock the Engine to the
camera's time code or frames, essentially. And really, the best camera
after doing the research is the Blackmagic Design
URSA Mini Pro 4.6K G2. A lot of studios are using this,
and it is pretty affordable, and pretty much the most
affordable entry level professional camera
with all the features we need to do mixed
reality in Unreal Engine. That's the camera, and I still
very much recommend this one. Higher end cameras work as well. So this is one of
the first tests that I did, switching to
the Blackmagic camera. And we've changed a
lot of things now. We have better sync. We're doing genlock
to the frame.
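One small way to sanity-check that sync, as a sketch: once a timecode provider (for example the capture card's) is configured in Project Settings, the engine-wide timecode should advance in lockstep with the camera, so logging it from any actor's Tick lets you compare it against the camera's own timecode display.

#include "CoreMinimal.h"
#include "Misc/App.h"
#include "Misc/Timecode.h"
#include "Misc/FrameRate.h"

void LogEngineTimecode()
{
    // FApp::GetTimecode() reflects whichever timecode provider the engine is
    // currently using (the system clock by default, or the camera/capture
    // card once one is configured).
    const FTimecode Timecode = FApp::GetTimecode();
    const FFrameRate Rate = FApp::GetTimecodeFrameRate();
    UE_LOG(LogTemp, Log, TEXT("Engine timecode: %s @ %d/%d fps"),
           *Timecode.ToString(), Rate.Numerator, Rate.Denominator);
}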
And in this little clip here, I'm basically explaining
how this all works, at least to the best
of my knowledge back then. And I have switched
to the Vive Tracker. And you can see I'm
zooming in on the SDI and talking about the
different features that are needed for a camera
to really work well in Unreal Engine. And we have a much better
camera track and composite happening in real-time here. So after getting
the camera tracking, we want to start to
look at the keying, actually cutting out the person
or the talent, the live action part, and doing that
in Composure, which is built into Unreal Engine. And this is just me learning it. You can see here in
this camera setup I continue to change
my camera around. But if you look at the
little monitor there, you can actually see that I'm
viewing the live composite now as I'm shooting it. And that's what you really want. You want to be able to see
what the virtual set is so that you can
operate and change the lighting in an informed way. So this is one of
my very first tests where I'm shooting my
wife, Diane Levine, here. Shout-out. And you can see I have
the camera tracking, and it's going OK, but I
have a pretty bad chroma key. And that's not actually
Unreal Engine's fault. That's me just
learning, and I'm still happy to share my
work in progress. I think I'm still not executing
at full quality even today, but this was the first test. And for me, it was
quite promising. But what I'll show you here now
is that in Unreal Engine 4.25, they came out with
a new chroma key, or I believe a color difference
key is what it's called. And that allows me to
do keying like this and compositing in Unreal Engine
at a much, much higher quality. And I've just
learned how to use it in a more accurate
way, especially with the despill
and the edge matte. There's a lot to live keying. I'm still learning it. But just using the
color difference keyer made a huge difference,
even with the transparent and the reflective objects. And this is how I test it,
with my llama and myself there.
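To make the idea of a color difference key concrete, here is a toy illustration of the math-- not Unreal's Composure code: the matte comes from how much the green channel exceeds the other two, and despill clamps green back down so edges and spill don't glow. The gain value is an arbitrary stand-in for the keyer's sliders.

#include <algorithm>
#include <cstdio>

struct RGB { float r, g, b; };

// Returns alpha in [0,1]: 0 = pure green screen, 1 = fully foreground.
float ColorDifferenceAlpha(const RGB& c, float strength = 2.0f)
{
    const float keyDiff = c.g - std::max(c.r, c.b);  // how "green" the pixel is
    return std::clamp(1.0f - keyDiff * strength, 0.0f, 1.0f);
}

// Despill: limit green to the max of red/blue so green spill on skin and
// hair is pulled back toward a neutral color.
RGB Despill(const RGB& c)
{
    return { c.r, std::min(c.g, std::max(c.r, c.b)), c.b };
}

int main()
{
    const RGB screen   = {0.10f, 0.85f, 0.12f};  // green screen pixel
    const RGB skinTone = {0.80f, 0.60f, 0.50f};  // foreground pixel
    printf("screen alpha = %.2f\n", ColorDifferenceAlpha(screen));    // ~0.0
    printf("skin alpha   = %.2f\n", ColorDifferenceAlpha(skinTone));  // ~1.0
    return 0;
}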
So moving on from the live key and the tracking, one of the next things that I really needed to get into the system-- and it's still really hard at an indie level-- is encoding the focus. So if I'm going to focus
on the real world object, the background should
go out of focus. And that's in Unreal Engine. And then, likewise, as I
focus to the background, I need to have this all
tracked at the same time. The high-end tracking systems
have solutions for this, and they require
custom hardware. And they're a little
pricey, but I figured out a pretty lo-fi way
of using two Vive trackers to track the focus,
and this is that demo. I still do not recommend
going this way. It's really not reliable. But as far as learning
how to program myself a really lo-fi tracker for
focus, I learned a lot, and it did technically work. But again, in the end,
I really would not recommend something like this. You can see that I'm
using two follow focuses, and it's quite DIY. But I was still happy to
learn this entire process.
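As a rough sketch of where that lo-fi focus data ends up, here is what mapping a follow focus angle onto a CineCamera's manual focus distance could look like; the wheel throw and the focus range are made-up numbers, and reading the second tracker's rotation is assumed to happen elsewhere.

#include "CineCameraComponent.h"
#include "Math/UnrealMathUtility.h"

void ApplyFollowFocus(UCineCameraComponent* CineCamera, float FocusWheelDegrees)
{
    if (!CineCamera) return;

    // Assume the follow focus wheel sweeps 0..300 degrees across its throw.
    const float Normalized = FMath::Clamp(FocusWheelDegrees / 300.0f, 0.0f, 1.0f);

    // Map the wheel onto a 50 cm .. 10 m focus range (Unreal units are cm).
    const float FocusDistanceCm = FMath::Lerp(50.0f, 1000.0f, Normalized);

    CineCamera->FocusSettings.FocusMethod = ECameraFocusMethod::Manual;
    CineCamera->FocusSettings.ManualFocusDistance = FocusDistanceCm;
}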
Moving on for me: after I started to get more comfortable with the virtual camera, combining the VCam with mixed reality is what I moved on to next. So you can't quite
see it in this one, but I'm actually in
the frame with her, and I'm filming it, which
I'll show you in this demo. This is, I think,
one of the first ones I did that was like this. So I'm filming the Countess,
who is a Paragon character that you can get for free on
the Unreal Engine Marketplace. And this set is also
from the Marketplace. And I'm testing my handheld
V camera, virtual camera, with a flashlight or a spotlight also mounted to it. So I could do an on-camera effect.
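The on-camera light trick itself is simple to sketch: parent a spotlight to the tracked camera so it follows wherever the VCam points. The component name, offset, and light values below are placeholders, not the settings used in this demo.

#include "CineCameraComponent.h"
#include "Components/SpotLightComponent.h"

void AddOnCameraLight(UCineCameraComponent* CineCamera)
{
    if (!CineCamera) return;

    USpotLightComponent* OnCameraLight =
        NewObject<USpotLightComponent>(CineCamera->GetOwner(), TEXT("OnCameraLight"));

    // Attach to the camera so the light inherits the tracker-driven movement.
    OnCameraLight->SetupAttachment(CineCamera);
    OnCameraLight->SetRelativeLocation(FVector(20.0f, 0.0f, 10.0f));  // just ahead of the lens
    OnCameraLight->SetIntensity(8000.0f);
    OnCameraLight->SetOuterConeAngle(25.0f);
    OnCameraLight->RegisterComponent();
}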
And at this point, I figured I had to do mixed reality, so I thought I might as well
just do both at the same time. So these are both live
feeds out of Unreal Engine. And if you look
closely, my mouse is actually still on the screen. I forgot to move my mouse
during the video capture. But that was a great test, and
there was a lot of response to that on the
internet when I posted it, which was very
encouraging to keep going. After that, I started to
build different controls so that I could move around
and change focal length and do different things. And I essentially map them
to an Xbox controller. And what you'll
see in this demo is that I'm actually
piloting and moving around the camera stage
with the Xbox controller to find where I
want to shoot from. And then I go handheld
with the V cam and do what I normally do. And what I really enjoy about
this as a cinematographer is that I can see
the reflections and I can see the different
tones from the lighting while I'm filming it. And so that's really what makes
this quite engaging to do. It's really fun compared
to working with something that's all gray box or your
standard, low-quality OpenGL viewport. Doing this all in Unreal
Engine is really fun. This is my next test. As I'm continuing
to mix real world cameras with
virtual cameras, I'm also very interested as
a DP to mix real world lights with the virtual lights. So you'll see that I've parented
a virtual light to a real light and roughly matched their
lighting quality so that I could potentially shoot
with a real person and a virtual person and match
the lighting between the two. That was what that
demo was about for me. Moving on to remote
collaboration, this is getting to, I think,
the point of this talk, is really about what can
you do remotely from home with virtual production. So this is one of my
first collaborations that you can see here. And I'll talk about
that a little bit more. This is a virtual
reality experience created by MacInnes Studios. And we got connected through
Epic Games, I believe, on LinkedIn. And he sent me their project,
their Unreal Engine project. It was originally a
virtual reality experience. And I wanted to make a cinematic
with it using my new techniques and tools. And this is a demo from that. So you'll see here-- it's a little small, but we'll see a bigger picture of it-- that I'm
controlling the camera stage with my feet using pedals,
which we'll see a picture of. And then the camera itself
is also motion tracked. But this case, it's on a tripod. So I'm trying to move myself
around in a smooth way and do pan and tilt.
So it's a little bit of a juggle to do this. I think, ideally, this is
really run by two people. But this is my very first
remote virtual production collaboration where
someone authored the scene. I actually did all the
lighting and changed that up. And then I'm also
doing the camerawork and making a little short
film edit out of it now. And this is actually part
of the MacInnes competition, their real-time
competition, now where other people were
given this scene to do the same exact thing. So that's been a pretty
exciting collaboration, and I'm excited to see what
people make with these assets. So this is a picture
of that hardware. I'm still continuing
to iterate on hardware. So this is a farm
simulator hardware kit. So there's a wheel,
which is great, that spins 270 degrees in both directions and then self-centers. And it comes with a joystick
and a lot of buttons. And most importantly for
me are the foot pedals, which I had mapped to moving
forward and back based on where the camera was looking.
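A minimal sketch of that pedal mapping might look like the following; the pedal value is assumed to come from an input axis binding, and the speed is an arbitrary pick.

#include "GameFramework/Actor.h"

// Call this every frame with the pedal axis value (forward pedal positive,
// back pedal negative) and the rotation the camera is currently looking in.
void ApplyPedalDolly(AActor* CameraStage, const FRotator& CameraLookRotation,
                     float PedalAxis, float DeltaSeconds)
{
    if (!CameraStage) return;

    // Only use yaw so the stage slides along the floor instead of climbing.
    const FVector Forward = FRotator(0.0f, CameraLookRotation.Yaw, 0.0f).Vector();

    const float SpeedCmPerSec = 300.0f;  // roughly 3 m/s, tune to taste
    CameraStage->AddActorWorldOffset(Forward * PedalAxis * SpeedCmPerSec * DeltaSeconds);
}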
And it just allows me to experiment with different live control systems, which is really what I'm looking for, and I'll show you why live is so important for me. Before I get into
the next demo, this is the next piece of
hardware that has really allowed me to work like I
used to on a live action set in Unreal Engine. I can really work how I'm
very familiar with working. And so I upgraded to an
INOVATIV camera cart-- that's the first one--
so that I can move it around and potentially
bring this on set when that's allowed to happen. And the other piece of kit
are the NODO Inertia Wheels, and those are the two wheels
that you'll see there. I'm using those to
essentially control the pan and tilt of a camera. The rotations for the
camera are actually set up like a real gearhead
would be in the real world. So it's slightly offset
from just a nodal camera move. They have real world offsets
built into them, which I really appreciate. And I like the way that
cameras look that way. Another piece of kit that's
on here is a MIDI controller. And I'm mapping that to
playing back mocap and mapping lights to it. And it just, essentially,
without even having to look, I can reach down and
do things to the scene without having to go
back to the editor.
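The mapping side of that is easy to sketch: Unreal's MIDI Device Support plugin delivers control-change values from 0 to 127, and a function like the one below could push one of those values onto a light, the way a fader on a console would. The intensity range is an assumption.

#include "Components/PointLightComponent.h"
#include "Math/UnrealMathUtility.h"

void ApplyMidiFaderToLight(UPointLightComponent* Light, int32 MidiValue /* 0..127 */)
{
    if (!Light) return;

    const float Normalized = FMath::Clamp(MidiValue / 127.0f, 0.0f, 1.0f);

    // Map the fader onto a 0..5000 intensity range; what the number means
    // depends on the light's intensity units, so tune it for your scene.
    Light->SetIntensity(Normalized * 5000.0f);
}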
And that's all happening very quickly. And the very last piece
of kit that you see here is the Atomos Sumo monitor. It's 1080p and it's
HDR, so it looks great. And I am live operating
off of that monitor, and I'm able to record
with it, as well. So for my workflows,
I am very much looking to stay in real-time. I don't go back to Sequencer. I'm not doing your
typical VFX workflow. I hit Record on the monitor. I live operate
almost everything. And at the end of a session,
I might have an hour or two of just pro res footage
that I go and I edit and I color grade,
and I move on. That's very much the type of
workflow that I like to do. And I'll share the very first
demos of that type of workflow here. So this is a collaboration
that is still going on with SuperAlloy Interactive. And forgive some of the
messy mocap placements. This was really just
testing the whole thing. This isn't the final mocap from
the project we're working on. But what you can see
here that is pretty final is that I have the camera head
now being remotely controlled by the inertia wheels here. And I have a simple A
to B key frame system that you set up while
playing or while shooting. We don't do this using Sequencer or anything like that. And it allows me to do camera moves that I like to do, like I would in the real world. So in this case, I'm kind
of emulating a dolly. And the earlier ones, I was
emulating a technocrane, where we end up above the actors. And this combination of
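A bare-bones version of that A-to-B idea could look like the sketch below: store two rig transforms while shooting, then each frame evaluate a smoothed blend between them and apply it to the camera rig. The duration and ease exponent here are made up, and this is not the actual tool.

#include "GameFramework/Actor.h"
#include "Math/UnrealMathUtility.h"

void EvaluateABMove(AActor* CameraRig, const FTransform& PoseA, const FTransform& PoseB,
                    float ElapsedSeconds, float DurationSeconds)
{
    if (!CameraRig || DurationSeconds <= 0.0f) return;

    const float Alpha = FMath::Clamp(ElapsedSeconds / DurationSeconds, 0.0f, 1.0f);
    // Ease in and out so the move ramps up and settles like a dolly grip would.
    const float Smooth = FMath::InterpEaseInOut(0.0f, 1.0f, Alpha, 2.0f);

    FTransform Blended;
    Blended.Blend(PoseA, PoseB, Smooth);  // lerps translation, slerps rotation
    CameraRig->SetActorTransform(Blended);
}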
live camera operating while I'm watching
the action happening gives a really organic
feel to this and is, again, how I prefer
to do camera work. I like to do these takes
over and over again. And I like to
physically get better at performing the move live. And it allows me to
move really quickly. And, as you'll see
in just a little bit, it allows me to work
with the live director as well in a very
fast, organic way. So the way that this
collaboration happened was that, first, we came up
with a little script, mostly led by SuperAlloy on the creative. And then we did a remote mocap
session, which I don't think is anything new, but
it was new for me. So we got on a
Zoom call, and you can see two witness cameras, and I can see the Xsens view of this. And we went through
the shot list, and we did some different
mocap takes of that. And I was able to
consult and just give my feedback from a
cinematography point of view what I thought would be needed. Really, I was just kind
of learning, in that case. So once the mocap first pass was cleaned up-- nothing too crazy-- then
all of those mocap files were sent to me. And I put them onto
these two characters, and I built a system
that would allow me to film it in real-time
using the setup you see here. So it's very much the same
tools, just on my desk. And what we did was
actually a live Zoom call where I could film
with the director live. So what I'm going
to play for you now is a clip from a
four-hour livestream that we did to YouTube
where we actually went and filmed this scene together. And we went through the
mocap clips, picked angles, and the idea was to make this
as much as possible to feel like a live action set. And I think there were
real moments in there where this felt like
doing things live. And doing it completely
remotely, on Zoom, I'm in Boston in my
basement, and SuperAlloy is in Las Vegas in their studio. Actually, I think he's
in his house in this one. And it was a great time,
and I learned a lot from it. And I think it was really
productive, actually. I think that if we had
done this independently, it would have been a much
different product than doing it and finding the
coverage together, even remotely over Zoom. I think it was
pretty successful. [VIDEO PLAYBACK] >> And rolling, and let's try it. >> There we go. Yeah, that's cool. >> Ah, that's OK. [GROANS] >> A little bit off. Wait a minute, let
him get into position. Let's see what happens. >> Yeah, I'm not sure he lands. This is rehearsal. >> That's OK. Let's just go over
the shoulder on him, and I think we can
buy it, you know? Yeah. And let's go higher up on his
shoulder just a little bit. There we go. >> And then-- so it's
going to be the punch, and then he's going
to walk over there. >> There you go. And just tilt down a little bit. There we go. Nice. That looks good. Keep it here, keep it here. Oh, that one missed. But that's all right. We're going to shoot a
different angle for that. >> And action. >> [VOCALIZES] Yeah, cut. We need to do slow-mo. [VOCALIZES] See,
that's beautiful. [INAUDIBLE] >> Yes, OK. And rolling, and speed. >> That's fine. >> Yeah. >> We got it, yep. [CHUCKLES] Love this. [LAUGHS] >> You like it. [CHUCKLES] >> Does that not work? >> You know, I'm operating, so
you have a better eye on it. I'm keeping them in the frame. It seems like it worked,
based on your reaction. >> Am I the only one
laughing out there? I might be. [END PLAYBACK] >>Matt: So that was
genuinely a good time. There were a lot of fun
moments and a lot of discovery, and live discovery and
live collaboration there. And for me, I think that
really is the future and what I'm going to be
working towards, for me, with virtual production
from my house. It's harder to do green screen
or LED virtual production in my studio. It requires real people to
be on set together, usually. But for something that's
all in-Engine, it's CG, I think the tools are already
there in Unreal Engine. And then combined
with Zoom, [LAUGHS] we can do these type of shoots. And I'm getting
a lot of interest from different companies
to do stuff like this. And this was really
our first test of it. I thought this
first pass was going to be about an hour and a
half, something like that. Ended up being four hours, and
then an additional hour session to finish up this
little short film. And we're going to go back
and do another pass of this. This was temporary
work-in-progress mocap. A temporary set. It's not finished, but we're
looking to go back and do another session-- maybe
we'll livestream it-- with the final
assets and actually make the final product. The rough cut of the
film is pretty funny. I can't wait to be able to
share that with everyone. So what I've been
working on lately is cleaning up
this whole process. And now I'm working with
HP, so I have a Z8 there as my new workstation, dedicated
for virtual production. And inside is the
NVIDIA Quadro RTX 8000. So, because my workflow
is so real-time, I'm not going to Sequencer, nothing
is rendered or comped later, I am capturing it live there. So to get the most
performance out of Unreal Engine, the
Z8, the Quadro RTX 8000, that is allowing me to do ray tracing and 1080p or 4K live capture, and get
the best quality possible. And the primary interest
that I have with this is to do either like an
animated show, something, like, for Netflix, or entirely
CG music videos. And these are demos
where I'm kind of testing that workflow here
and showing different studios and record labels this quality
that we can get, really, just quite affordably and simply. And in this case, I'm
showing off my system where I'm recording
to the monitor, I'm handheld with
the camera, and I can switch to the
wheels, all while getting this in real-time. And, like you saw before,
I can do this live with a director or
creative director. And we can do live iteration
and filming of these scenes like, again, we were on set. In this next demo here,
this is actually a demo that I'm showing as a
music video, which you'll see after this clip, and
really trying to push what's possible with this workflow. So we see a car driving,
and we see this actor getting out and walking around. I'm, of course,
filming it handheld. I'm filming it with
the Inertia Wheels. But the thing I'm trying to show here is that I didn't hand animate any of this stuff. This is all basically done
using Take Recorder and Sequence Recorder in Unreal Engine,
which I'll show in just a minute. And so this allows for
really a fast iteration. I could do lots of
different takes, and it allows me to
film it really quickly. So you'll see here I'm
just controlling this car like it was Grand Theft Auto
with an Xbox controller. The actor gets out again just
like your standard video game because we're working
in Unreal Engine. And I'm able to record all
of this, essentially, as lo-fi mocap data,
where I wouldn't have a mocap volume
easily accessible that would be this size. I'm able to do this
just like a video game. And anything that's
a little funny, we can just essentially
film around it. And so you're going to have
to forgive the mocap here. I'm still testing out
different live mocap workflows and post-editing
mocap workflows here. But this is me taking
that same scene and turning it into a
proof of concept music video for a couple of people. So that's me in a mocap suit. Again, no hands, no fingers,
and a little bit stuttery. Again, that's my fault. But I think that
this works really well as a proof of
concept, that if we picture that this was for a
real artist and that was the B-roll in the
different locations, that we could pull off a
pretty compelling digital music video or all-virtual music video
with pretty modest resources, and it really wouldn't
take much time. Especially if we're using
things like Marketplace assets, like I am in this case. So this is starting
to wrap it up. What I'm developing now
on top of Cine Tracer-- really, a lot of the tech
that I figure out and develop ends up in Cine Tracer. But I'm also now building tools
that are for Unreal Engine Editor because there's
just a lot more power there and flexibility to
import assets, et cetera. So I'm really working on
something loosely called Virtual Production
Tools right now. And it's designed
for directors and DPs to be able to work
in any of these three categories of virtual
production using the same framework, the same
hardware, and the same set of skills. So one working in-Engine
where it's basically like a game framework,
like Cine Tracer, for previs in
virtual filmmaking. That's really what we saw
a lot in this presentation. Virtual sets. So working on a green
screen, primarily. Interfacing with
different camera trackers, because there's a lot. And controlling real
world lights with the DMX plugin through Unreal Engine. And having a
standard orientation of how those would be set up. I'll be showing more of that
on YouTube later this year. And finally, LED walls where we
can control the virtual world, again, using my hardware
interfaces that I like to do. It's really a tool
that I want if I'm going to start shooting on
LED walls again as a DP. And again, interfacing
with real world lighting, and having the interaction be
somewhat automated and more fluid, and less having to
do really heavy R&D. Just have it be more turnkey
and a standard setup when it comes to LED walls
and real world lights. And, again, a big focus is
on standardizing hardware. So you'll see that this
cart continues to grow. That's the Tangent Wave2,
usually used for color grading. We're working on mapping
that into Unreal Engine to make the color adjustments
much faster on the fly. And I'm working with camera manufacturers, lens manufacturers, lighting, grip, hardware, and software companies-- many different industries are all coming together in Unreal Engine for virtual production for the film industry. And I'm helping to just
facilitate communication, and then also with marketing
and just doing demos of it online just so people
can see how it works. Because this is
a very new field, and there's not a lot of people
producing content around it. So that wraps it up for this. Thanks so much for coming
to this presentation. If you want to
follow up with me, my handle on Twitter, Instagram, and especially YouTube is cinematographydb. That's where I'm posting
my very long vlogs on virtual production
and eventually doing tutorials, and more demos
of high-end software, and also DIY solutions
I'll put together. That is where I'm putting
most of the content regarding virtual production. So thank you so much again. See you out there.