Dual Follow Focus + Dual Vive Tracker

Captions
One common type of virtual production is called mixed reality: we're filming a real actor in a virtual set. The main thing that has to happen to make this work is that we take a real-world camera and a virtual camera and match them together as closely as possible. We've covered this before; if you want to match a camera's position or its orientation, which way it's looking, you can simply use a Vive Tracker. But what do we do to track a real-world lens?

Hello and welcome to this virtual production vlog. My name is Matt Workman, and that's what we're going to be covering today. We're trying to solve the problem of matching a real-world lens in Unreal Engine. As you can imagine, there are already some pretty high-end solutions for this that work really well, but they are expensive; custom hardware and custom software are expensive to make. So what can we do at the indie level? Behind the scenes I'm working on a couple of different hardware partnerships and developments to try to make this possible, but I actually came up with an uber-indie way to do it today, and it works, kind of. The solution I came up with I'm calling the dual follow focus, dual Vive Tracker setup. This is probably the most lo-fi way of going about it, and possibly the least expensive, but it is a little bit wonky. Spoiler alert: I don't expect a lot of people to do this, because I think we're going to have better solutions in the future. But if you wanted to do it today, right now, with a system that actually works, that's what we're going to be showing. First I'll go over the backstory and how this all originated, and then the five steps I took to get this system to actually work.

The backstory: I've been trying to figure out a lens encoding solution at the indie level for quite a while. Shout out to Kyel Natali, and I hope I'm saying his name properly. He is a member of the virtual production Facebook group, which is linked down below. He suggested, I think not entirely jokingly, that we put a Vive Tracker onto a follow focus wheel: the follow focus is already coupled to the lens, so we could track that. I immediately wrote back that it was a really good idea and I didn't see why it wouldn't work. I thought about it for a while, and then it finally clicked: what if we used two follow focuses? I already had one, so I went on Amazon and bought a roughly $50 follow focus; there are a lot of really inexpensive ones out there right now. I bought two of those, because I knew I would probably break one opening it up, and I kind of did.

Step one was assembling the hardware and getting this to work mechanically. The idea was to take a Manfrotto follow focus I already had and put it on my Canon zoom lens. For the record, that Canon EF stills zoom lens is hands-down the worst lens you could possibly do this with, but I still got it to work, so I'm pretty happy about that. We had to put a lens gear on the lens itself, and the Manfrotto follow focus has mechanical hard stops, so I can set one hard stop for the minimum focus distance and one for the maximum. That matters because if the ring could spin forever we would lose calibration, so now we effectively have fake hard stops on the lens. Then, because I have a side plate for my Blackmagic URSA Mini Pro G2, I was able to put a rod on the side, like a Panavision camera (at least the old ones had rods on the side), and mount the second follow focus on top. Now when I turn the main follow focus, it turns the second follow focus, and I just put another Vive Tracker on top of that. That's the mechanics, and that's step one; if you can pull that off with your camera rig, you're ready to move on.

Step two was to get both Vive Trackers paired with SteamVR and tracking in Unreal Engine. If you haven't worked with Vive Trackers yet: don't throw away the dongle that's in the box, like I did. You're going to need two, because you need one dongle, which is basically an RF receiver, per Vive Tracker. So I have two dongles plugged into my computer, and all you have to do is pair them in SteamVR; I had to update both of them as well. It's really easy, consumer video-game kind of stuff. When it all works, you turn them on and both Vive Trackers show up in SteamVR.

The next part is getting this up and running in Unreal Engine. I'm building a Blueprint; actually I'm building a whole framework for virtual production at the indie level, and I'll talk about that at the end. I started a new Actor Blueprint, downloaded a 3D model of a Blackmagic camera, and I have the Vive Tracker 3D models from the HTC developer site, which I can cover in another video about prepping this yourself. So now both Vive Trackers are connected in Unreal Engine, tracked and placed, and you can see them in 3D exactly the way they sit in the real world. That's really helpful for me; I'm a very visual person, and I like to see whether that's where the trackers are relative to the camera. I can enter the offset of the Vive Tracker from the sensor, and that all works out pretty well. So: SteamVR connected, Unreal Engine connected.
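That tracker-to-sensor offset step can be sketched with basic vector math. This is a hedged illustration, not the actual Blueprint: the function names and all the numbers are invented for the example, and it uses a yaw-only rotation for brevity, where a real rig would use the tracker's full 3D rotation from the engine.

```python
import math

def yaw_matrix(yaw_deg):
    """Rotation about the vertical axis only; a real rig would use full 3D rotation."""
    c, s = math.cos(math.radians(yaw_deg)), math.sin(math.radians(yaw_deg))
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def rotate(m, v):
    """Apply a 3x3 rotation matrix to a 3-vector."""
    return [sum(m[i][k] * v[k] for k in range(3)) for i in range(3)]

def camera_sensor_position(tracker_pos, tracker_yaw_deg, sensor_to_tracker):
    """Recover the camera sensor position from the tracker's world pose.

    sensor_to_tracker is the offset you measure on the rig, expressed in the
    camera's local frame (e.g. the tracker sits 10 cm straight above the sensor).
    """
    world_offset = rotate(yaw_matrix(tracker_yaw_deg), sensor_to_tracker)
    return [t - o for t, o in zip(tracker_pos, world_offset)]

# Tracker at (100, 50, 120) cm with the camera panned 90 degrees;
# the tracker is mounted 10 cm straight above the sensor.
print(camera_sensor_position([100.0, 50.0, 120.0], 90.0, [0.0, 0.0, 10.0]))
# -> [100.0, 50.0, 110.0]
```

The point of the rotation is that a sideways-mounted tracker sweeps around the sensor as the camera pans, so the offset has to be rotated into world space before it is subtracted.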
Step three is to get the rotation data from the focus Vive Tracker in a consistent way. In Unreal Engine you can basically just ask a Vive Tracker for its rotation, which is really easy, but that doesn't work 100% of the time in this case, because if you pan or tilt the camera, you also change the orientation of the spinning Vive Tracker: the wheel is spinning, but so is the camera, and so is the tracker it carries. That's messy and won't work in this system. What we actually have to do is take the focus Vive Tracker's rotation and remove the camera tracker's rotation from it. We cancel the camera's rotation out of the focus tracker so that only the rotation we care about remains, which in Unreal Engine in this case is about the Z axis. Once I got that running, I could pan and tilt the camera and still ask: what is the rotation of the Vive Tracker? In my Blueprint I call that the rotation value. Eventually I want to map that value to a 0-1 range, but at this stage I kept the raw data while spinning the wheel by hand.
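The cancellation step above can be illustrated with plain rotation matrices. This is a minimal sketch, assuming for simplicity that both the camera pan and the wheel spin happen about the same vertical axis; the real setup computes the relative rotation in full 3D, but the idea, removing the camera tracker's rotation so only the wheel's spin remains, is the same.

```python
import math

def rot_z(deg):
    """3x3 rotation about the Z axis."""
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def transpose(m):
    return [list(row) for row in zip(*m)]

def mat_mul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def wheel_rotation(camera_rot, focus_tracker_rot):
    """Remove the camera body's rotation from the focus tracker's rotation.

    relative = camera_rot^-1 * focus_tracker_rot; for a rotation matrix the
    inverse is its transpose. What remains is only the spin of the wheel,
    read off here as an angle about Z.
    """
    rel = mat_mul(transpose(camera_rot), focus_tracker_rot)
    return math.degrees(math.atan2(rel[1][0], rel[0][0]))

# The camera pans 30 degrees, which drags the wheel tracker to 75 degrees
# in world space; the wheel itself has only turned 45 degrees.
angle = wheel_rotation(rot_z(30.0), rot_z(75.0))
print(round(angle, 3))  # 45.0, independent of the pan
```

Whatever the camera does, the returned angle only changes when the wheel actually turns, which is exactly the property the focus encoding needs.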
Step four was to map the rotation data to the lens data, in this case the focus distance; this is called lens mapping. On a professional set, on a really big virtual production shoot, you thoroughly map all the lenses, and it's a complicated, drawn-out process. You map things like field of view, aperture, focus distance, distortion, chromatic aberration, and entrance pupil, among others. And you don't map each of those as a single value; you build a big matrix. For a zoom lens you map at every focal length, at every aperture, at every focus distance, so you multiply all of those together to get how many measurements you need, put them in a huge data table, and then interpolate to the right data depending on where the lens is. It's a mouthful, and it's why the high-end systems charge a lot of money: they've figured this out, they've mapped a lot of lenses, they have a system for it, it works, it's reliable, and it's hard to do.

Me, at this mega-indie level, I'm not going to attempt anything like that, and like I said, with the Canon stills lenses you can't really do it anyway; it's not worth trying. The one thing I can do is a single map, for one focal length, at a couple of distance marks, and that's all I did. This lens, again, is a stills lens with essentially no witness marks; on a cinema lens you would map every single witness mark on the side. A very similar lens I have over there has five witness marks: 0.7 meters, 1 meter, 1.5, 3, 5, and then infinity, so I guess six. So I spin the lens to 0.7, look at the rotation value, and record it; then I go to 1 meter and record that, then 1.5, and so on. I build the map as an Unreal Engine curve and interpolate back and forth, so whatever rotation value the lens is at, I tell the camera to use the looked-up focus distance. Those are the very basics of lens mapping. It gets far more complicated if you want to map distortion, UV distortion, and a whole bunch of other things, but to start we're staying very indie. This is entry level, but it's a basic look at virtual production lens encoding; I was able to do it, and it worked pretty well.
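The witness-mark procedure above, record the wheel's rotation value at each marked distance and interpolate between marks, amounts to a piecewise-linear lookup curve, roughly what evaluating an Unreal Engine float curve between keys does. A sketch in Python; only the distance marks come from the lens described above, while the rotation values are invented for illustration.

```python
# (wheel rotation in degrees, focus distance in meters), one pair per witness
# mark. The rotation values here are made up; on a real rig you would read
# them off the tracker at each mark. The infinity mark is left out to keep
# the example numeric.
LENS_MAP = [
    (0.0, 0.7),
    (80.0, 1.0),
    (150.0, 1.5),
    (220.0, 3.0),
    (270.0, 5.0),
]

def focus_distance(rotation_deg):
    """Piecewise-linear interpolation between marks, clamped at both ends."""
    if rotation_deg <= LENS_MAP[0][0]:
        return LENS_MAP[0][1]
    if rotation_deg >= LENS_MAP[-1][0]:
        return LENS_MAP[-1][1]
    for (r0, d0), (r1, d1) in zip(LENS_MAP, LENS_MAP[1:]):
        if rotation_deg <= r1:
            t = (rotation_deg - r0) / (r1 - r0)
            return d0 + t * (d1 - d0)

print(focus_distance(80.0))   # 1.0, exactly on the 1 m mark
print(focus_distance(115.0))  # 1.25, halfway between the 1 m and 1.5 m marks
```

The clamping at both ends plays the same role as the mechanical hard stops: the lookup never runs off the calibrated range.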
The fifth and final step was, of course, to test it. I took my new camera tracker class, with the double Vive Trackers and all the data calculated and exposed in an accessible way, and brought it into my virtual production set, which is kind of a spaceship right now. It already has Composure set up and I'm doing the live key, so I just re-rigged everything: the camera tracker now sends its data to my CG camera layer (if you saw the Composure tutorial, you'll know what that means). So the camera tracker, driven by two Vives, handles camera position and rotation, and now it also feeds focus data. Things like sensor size and focal length I just enter manually, directly into the camera; I haven't encoded those, and for what I'm doing I don't really need to.

The test was to put the llama up (he's still back there), focus on him at one meter, and then check: are we focused at one meter in Unreal Engine? And we were. The whole reason we do this is that controlling focus, racking focus, changing focus within a shot, is a very important cinematic tool, one of the most important. Unreal Engine looks really good out of focus, which sounds funny, but older game engines, and even older versions of Unreal Engine, didn't; they didn't look the way real lenses look, and that's what we're usually trying to achieve. So we can rack focus from the llama to the background, and it looks like we're really doing it. I can focus on, say, a CG character, then focus back, and the lens in Unreal Engine behaves very much like the lens in the real world.

This is the whole point: the DP and the first AC, or a DP pulling focus themselves, can just operate the camera the way they normally would. They make focus decisions like they would on any set, do focus pulls, do racks, and never have to think about anything extra or worry about what's happening in Unreal Engine. The DP frames up, moves on a dolly or a crane, uses a wireless follow focus, works like they always do, and uses focus to set the tone and tell the story, and we're doing it in mixed reality. It's really cool to pull off a rack focus this way; the first time I did it in my studio, racking from the llama to the CG character and back, it was a wow moment. When you look through the monitor you feel like you're almost there, and when you're shooting it, you feel immersed.
You feel like it's there, and I think that gets you more into it: you get more creative, you can think of more interesting shots. You're not just racking to a green screen and back with no idea what you're doing; you feel like you're there, and the audience will feel like they're there too, because you're going to treat it that way and make decisions that look good cinematically, for the story, to sell the product, and so on. The llama is my de facto test subject because I have it and it doesn't require any people, but I also wanted to test shooting an actual person: racking focus, checking the key, making sure it all looks good on a real person, which is the most important part. So thank you again to my wife, Diana Levine, for helping us out with that. Everything is looking pretty good. There's still a little bit of delay matching and some more tweaking that could be done to this system, but honestly, as I'll explain next, I'm not that concerned about it.

So everything is up and running, and it's conclusion time: is this a good system, and should you try it? I would say no. This has been a really fun experiment for me, and I'm anxious to get into lens encoding; like I said, I think much better solutions are coming. You'll still have to go through the mapping, a very similar process, but with more consistent hardware. This is the definition of a hack: it's really wonky and, most importantly, it's really fragile. Every time you change a lens, you need to recalibrate the entire system. And honestly, getting the follow focus to sit over the lens in a way that keeps the tracker on top, where the base stations can see it, is really dicey; you're going to need to MacGyver a lot of camera rigging to make it work at all, and it's going to slow things down enormously.
Like I said, I have some solutions coming; I can't give dates or name partners yet, but I'm working on it. I want this to work, and I'm trying to get other people to help us build solutions. When they arrive, if you've watched this video, you'll know exactly why it's such a relief to be able to do this better. So I don't fully recommend this system, because better things are coming. But if, like me, you're anxious to get focus working at all, you don't want to spend serious money on the high-end systems just yet, and you're okay buying another Vive Tracker, this does work today. It works in certain situations; you can't go crazy with it, but it works. I just don't fully recommend it, because I think we'll have something better in the future.

That wraps it up for this virtual production vlog. I hope you enjoyed the journey of building this wonky dual follow focus encoding system. I'm personally happy that it works, I had fun building it, and I learned an awful lot as a developer. That brings me to my final subject: I'm starting to build a framework around all of this, because you don't want to set this up from scratch every time you go into Unreal Engine. I now have a Blueprint that handles the rig and the tracking and has the offset built in, and as I add more features it's becoming like its own mini Cine Tracer, but for virtual production inside the engine. There are things I need to do with the Game Mode and the Player Controller, and there are ways I personally like to control a virtual production set, so I'm starting to package this in a more serious way. For now I'm calling it Virtual Production Tools for Unreal Engine. It's a long name, but it's going to be a framework, like I said: a Player Controller, a Game Mode, actors, characters, and UI, little camera UIs and widgets to troubleshoot everything and get it up and running.
For now it's going to be just for me, and you'll see it develop over time. In the future, if people are interested, working with Vives and indie setups and some of the other systems we're going to integrate, I'll probably release it somehow; I'm not sure about that, no guarantees, but I have officially started the project. For me to keep doing these tests, talking to lights and devices like that, there's just a lot of programming and little pieces to put together, and I don't want to redo that every time; I just want to load up Virtual Production Tools and be good to go.

The last thing I want to mention, as always: if you want to continue this conversation and learn the specifics, join the Unreal Engine Virtual Production Facebook group. I run it, and there are a lot of people in there now. There are people setting up their own LED walls, if you want to see that, lots of Vive Trackers, lots of high-end trackers too, people thinking about building hardware for this, and manufacturers who already build it. We like trading information, and the group is growing well. Thank you to everyone in there for giving me feedback. I post constantly; I'm probably the most spammy person in the group, but it's my group, so I can do what I want. I'm not looking for internet fame or clout or likes; what I actually want is feedback, real feedback and suggestions. Even "that's crappy" hurts my feelings at first, but "that's crappy, you should do this" makes me go: oh, that's a good idea, that's a really good idea. That's how this follow focus happened, from feedback. I put this stuff everywhere because I want to either get the attention of someone who can help us build it better, or get a comment that sparks an idea, and the more ideas the better, up to a point. I want as much feedback as possible, because solutions come from that data. I use the internet as sonar: I send something out, it comes back, and the more good data that comes back, the better the solutions I can build from it. So that's the virtual production group. Come be my data, and get data from other people. That's about it; I'll see you in the next video.
Info
Channel: Cinematography Database
Views: 40,503
Keywords: cine tracer, cinetracer, unreal engine, ue4, previz, previs, virtual production, virtual cinematography, virtual camera, focus encoding, mixed reality, lens calibration, vive tracker, htc vive, virtual reality
Id: QSVueDjI86g
Length: 17min 43sec (1063 seconds)
Published: Thu Mar 26 2020