INSANE Filmmaking Tech Is NEARLY HERE

Video Statistics and Information

Captions
Let me tell you about five cutting-edge virtual production technologies that are so close to changing the industry as we know it. Stick around, because you're also going to hear from the people behind the technology and what they think the future holds.

Number one is a long-awaited piece of software from OctaneRender called Brigade. This is basically their real-time path tracing solution. From what I understand, what makes Brigade possible is Octane's temporal denoiser for path-traced rays. You can see in this demo video that the left-hand side is the real-time, temporally cleaned path-traced scene, whereas the right-hand side is a normal path-traced scene. When they actually let the path-traced scene resolve, it ends up looking exactly the same as the temporally denoised version, which is bonkers, really. The implications of Brigade are massive, because Octane already has connectors and plugins for a whole host of 3D packages, including Unreal Engine and Nvidia Omniverse, so when it finally does release, it'll only be a matter of time before you find it in your 3D package of choice. It looks like it's going to be bundled in with the 2024 release. Next year you could have unbiased spectral path tracing, in real time, in your virtual production scene. That is massive for photorealism; I'm really excited about this one.

Get a look at this video. That's not just me, is it? That lighting match is incredible. Apparently, what's going on here is they have about nine light panels, all positioned around the talent to create an even distribution of light, which are then calibrated by software to match the color from the virtual scene. That's obviously a very oversimplified version of what's going on here, so I reached out to Peter from Antilatency to tell me a bit more.

Peter: We had a location where we were surrounded by green.

Me: What inspired the need to create CyberGaffer?

Peter: All the other parts of the process you have to install just once, like the camera, camera tracking, the background
rendering pipeline: all of that you just need to turn on, and that's it, it just works. But that is not true for lighting. Every time, for every new scene, you have to move fixtures and adjust brightnesses and colors, and there is no way to do that automatically. I think this is the final piece of the puzzle for virtual production. With this technology we will be able to install lighting just once. You can use almost any RGBW light, and then this lighting will always be synchronized with the Unreal Engine. Of course, this is not yet a finished product.

Me: What is your roadmap?

Peter: Our first goal is to deliver a closed beta in January. We will use that feedback to prepare a version one. We really, really want to build a lot on top of it.

Me: What is the horizon from where we are now?

Peter: In some aspects of virtual production, people keep trying different approaches, and finally it will come together with a more scientific approach, I believe.

I think the proof is in the pudding here; those results are so good. The challenge will be when they start to incorporate multiple people, movement and blocking, but those are challenges for down the road. At the moment, this is just really exciting.

Let's talk briefly about Nvidia Omniverse. Now, I know that's not new tech, but this is more of a prediction, because I think Nvidia Omniverse has untapped potential for real-time virtual production. There are a lot of great creators using it for CGI short films, like JSFILMZ, who makes amazing stuff, but I'm yet to see it used as the primary engine for VP. Omniverse has some amazing strengths when it comes to 3D content creation. For instance, it has applications in tons of industries, like design and engineering. It also benefits from all of the RTX ray tracing technology, DLSS, and all of the AI research that Nvidia does, of which there is a lot (that's a topic for another video). Plus, their Nucleus server allows you to create large libraries of assets and share them with people across the
world. What it lacks is a virtual production toolset. We need video input and output, camera tracking, lens tracking, lens calibration, and some grading and cinematography tools. With a few of these implemented, you could start to see Nvidia Omniverse rivaling Unreal Engine in the virtual production content creation space. So my prediction is that Nvidia is going to shake up the game.

If you haven't heard about Gaussian splatting, it's a way of creating a 3D representation of an object or an environment using static images, so it's very similar to photogrammetry in that sense, but these retain reflections, which are notoriously difficult to reconstruct. They also run really fast, so they're great for use in real-time applications like Unreal Engine. So if all of this already exists, why am I talking about it now? Well, take a look at this: a perfect 3D recreation of a person's upper body. It's a Gaussian splat created by Infinite Realities, who in my opinion are at the forefront of using this technology. Watch what happens next. Yes, a moving, fully 3D person. I reached out to Lee from Infinite Realities to tell me more about how this all works.

Lee: This year has been really transformative, like really transformative, mainly due to the release of NeRF research, so Neural Radiance Fields, NeRFs in general, and then, this year, Gaussian splatting. I mean, even in the last few weeks, the last month, the amount of code releases and research papers that have come out is just insane.

Me: The kind of quality that you're capable of: what does it take to make that?

Lee: Synchronizing DSLRs at one instant in time is very difficult, especially with mechanical shutters, but doing it with video cameras is even harder, and you have to do it at a much faster rate, and that's sometimes 30 to 60 times a second. You don't really know how good the data is until you inspect it after the fact, so you have to be really well trained and disciplined and prepared in the setup, so that you know the results are going to be good.

Me: The setup that you've got at Infinite Realities: can you describe what that is, how you capture your data?

Lee: We've got two separate systems. We've got a sphere-shaped head capture system that has 300-plus LEDs on it; it depends what we're doing in the studio, but it will have between 48 and 56 cameras on it. Then we have another capture system, a body capture system, that has close to 500 LEDs on it, and that rig, when it's fully operational, has about 160 cameras on it. It's pretty mad.

I think there's misinformation generally online about what NeRFs and Gaussian splats can do, in that you only need a couple of cameras. It's just not true. But, as people have shown, you can do some pretty cool stuff with, like, 16 cameras or 32 cameras. We posted an example on our X (Twitter) page that had 20 cameras, and they're quite spread out around somebody in a kitchen, and it works: if you're in that camera frustum you can move around, but as soon as you go even 20 cm or so outside it, it starts to break down, because that data just doesn't exist. It's not going to build something that it never saw. Who knows where it's going to be in 10, 15, 20 years' time? It might be insane; it might be a full-on Star Trek holodeck sort of thing, you know?

Now, I covered SwitchLight on the channel before, and the amount of development they've gone through in the last five months is frankly astounding. They were, at the time, a web application that ran in the browser that would estimate the normals and albedo for any image you fed it, and then allow you to either relight that image or download the maps for use in another application, which I did; the results are in the video above. They were one of only three applications that did this and, make no mistake, they were the best; I defy anyone to find me a better normal map. I'm currently beta testing their desktop application, which is so much more detailed, and it can take video and image sequences as input and estimate the normal, albedo, roughness and specular maps,
plus, if you're doing your relighting within their application, they can do an extra AI pass which gives you high-frequency detail and subsurface scattering: quite literally the best relighting I've ever seen. They also now have connectors for Blender and Unreal Engine, meaning you can export your results out to those pieces of software and it will set up the materials for you. That's really incredible. I had some further questions for their CEO, H. Kim.

Me: What is the biggest challenge when it comes to video relighting at the moment?

Kim: Every single AI product out there that is related to computer vision has some problem regarding temporal consistency. We are working really hard to solve this problem, and I think we're almost there. I'm kind of confident that when we actually launch our product, we are going to be one of the most production-ready AI tools out there.

Me: What do you imagine SwitchLight looks like, say, a year from now?

Kim: That's a hard question. What I want SwitchLight to be is, as I told you before, not just the relighter. We actually want to expand to other territories of VFX and virtual production. We are talking to VFX artists, and when we talk to virtual production guys like you, you keep telling us there are more problems to solve. We have great potential to solve each of those problems with our great AI team here at Beeble.

Me: I was curious whether you thought there was any future for SwitchLight to incorporate 3D. I'm sure you don't have any solid ideas yet, but how do you think that might come about?

Kim: Actually, I have a more solid idea than you know, because we've actually been brainstorming about 3D here at Beeble.

Me: Okay, so only say things that you're happy to go in the video.

Kim: Okay. If you took footage with your single camera, maybe you don't have to only do relighting; maybe you can change the view after the shot was taken. But it's not going to work for every type of scene; it's only going to work on
humans. That is something we are working on. I'm not saying that we have solved this, or that we can solve this, but if everything goes perfectly, maybe later you won't even need to do the camera tracking, because you can change the camera after the shot was taken.

Wow. I think you just blew my mind a bit, actually; I wasn't expecting that. That's incredible.

Each of these pieces of tech really deserves its own video, so that's why I plan to cover each of them on the channel as they develop. If that sounds like something for you, hit that subscribe button just below, and the bell notification, to stay informed about new videos, and then head over to watch this video to find out why I'm so addicted to filmmaking in Unreal Engine.
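The temporal denoising that makes real-time path tracing like Brigade viable can be sketched in its simplest form: each frame's noisy one-sample-per-pixel estimate is blended into a running history, so the noise averages out over time while the image stays interactive. Octane's actual denoiser is proprietary; this is only the core accumulation idea, with all names and numbers my own.

```python
import numpy as np

def temporal_accumulate(history, noisy_frame, alpha=0.1):
    """Exponential moving average of per-pixel radiance.

    history     -- previously accumulated frame
    noisy_frame -- current noisy path-traced frame
    alpha       -- blend factor; lower is smoother but ghosts more
    """
    return (1.0 - alpha) * history + alpha * noisy_frame

# Simulate many noisy renders of a constant ground-truth pixel color.
rng = np.random.default_rng(0)
truth = np.array([0.5, 0.3, 0.8])
acc = np.zeros(3)
for _ in range(500):
    noisy = truth + rng.normal(0.0, 0.2, size=3)  # Monte Carlo noise
    acc = temporal_accumulate(acc, noisy)

print(np.abs(acc - truth).max())  # accumulated color converges toward truth
```

Real denoisers add reprojection (so the history follows moving objects and cameras) and reject stale samples, which is where most of the engineering lives.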
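The problem a system like CyberGaffer solves can also be sketched: calibrate how each channel of an RGBW fixture contributes to the light measured on the talent, then, per frame, choose channel levels that best reproduce the light arriving from the virtual scene. The response matrix and the plain least-squares solve below are illustrative assumptions, not Antilatency's actual algorithm.

```python
import numpy as np

# Calibration: measured RGB response per unit drive of each fixture
# channel (columns R, G, B, W), found by flashing one channel at a time.
response = np.array([
    [1.00, 0.05, 0.02, 0.30],   # measured red
    [0.04, 1.00, 0.05, 0.35],   # measured green
    [0.02, 0.06, 1.00, 0.30],   # measured blue
])

def drive_levels(target_rgb):
    """Least-squares channel levels that reproduce target_rgb, clamped to [0, 1]."""
    x, *_ = np.linalg.lstsq(response, np.asarray(target_rgb, float), rcond=None)
    return np.clip(x, 0.0, 1.0)

# Per frame: sample the virtual scene's light at the talent's position
# (assumed given) and drive the fixture to match it.
levels = drive_levels([0.9, 0.7, 0.5])
print(np.round(response @ levels, 3))  # close to the requested color
```

Because the system is underdetermined (four channels, three constraints), least squares picks the minimum-energy drive, which is a reasonable default; a production system would also handle out-of-gamut targets and fixture nonlinearity.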
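The core math of Gaussian splatting is also worth a sketch: each splat is an anisotropic Gaussian with a mean, covariance, color and opacity, and a pixel's color comes from alpha-compositing the splats front to back. The 2D simplification and the toy splats here are mine; real 3DGS projects 3D Gaussians into screen space and stores view-dependent color as spherical harmonics.

```python
import numpy as np

def gaussian_weight(pixel, mean, cov):
    """Density of a 2D Gaussian at a pixel, used to attenuate opacity."""
    d = np.asarray(pixel, float) - np.asarray(mean, float)
    return float(np.exp(-0.5 * d @ np.linalg.inv(cov) @ d))

def composite(pixel, splats):
    """Front-to-back alpha compositing of (mean, cov, color, opacity) splats."""
    color, transmittance = np.zeros(3), 1.0
    for mean, cov, rgb, opacity in splats:       # assumed depth-sorted
        a = opacity * gaussian_weight(pixel, mean, cov)
        color += transmittance * a * np.asarray(rgb, float)
        transmittance *= 1.0 - a
    return color

splats = [
    ((5, 5), np.diag([4.0, 1.0]), (1, 0, 0), 0.8),  # elongated red splat in front
    ((5, 5), np.diag([9.0, 9.0]), (0, 0, 1), 1.0),  # round blue splat behind it
]
print(composite((5, 5), splats))  # mostly red, some blue showing through
```

This is also why the "couple of cameras" claim Lee pushes back on fails: the optimizer only places and shapes Gaussians where input views constrain them, so geometry never seen by any camera simply is not there.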
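Finally, the principle behind relighting from estimated maps, as tools like SwitchLight do, can be shown with the simplest possible shading model: once you have per-pixel normals and albedo, you can light the image from any new direction. This Lambertian sketch is my own assumption for illustration; SwitchLight's real pipeline is far more sophisticated (roughness, specular, subsurface and high-frequency AI passes).

```python
import numpy as np

def relight(albedo, normals, light_dir, light_color=(1.0, 1.0, 1.0)):
    """albedo: (H, W, 3) in [0, 1]; normals: (H, W, 3) unit vectors;
    light_dir: direction toward the light, shape (3,)."""
    l = np.asarray(light_dir, dtype=float)
    l /= np.linalg.norm(l)
    ndotl = np.clip(normals @ l, 0.0, None)          # Lambert's cosine law
    return albedo * ndotl[..., None] * np.asarray(light_color)

# Tiny example: a flat surface facing +Z, lit head-on vs. edge-on.
albedo = np.full((2, 2, 3), 0.8)
normals = np.zeros((2, 2, 3))
normals[..., 2] = 1.0
head_on = relight(albedo, normals, (0, 0, 1))   # full brightness
edge_on = relight(albedo, normals, (1, 0, 0))   # no light reaches the surface
print(head_on[0, 0], edge_on[0, 0])
```

The hard part, as Kim says, is temporal consistency: the estimated normal and albedo maps must not flicker frame to frame, or any relighting built on them flickers too.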
Info
Channel: Joshua M Kerr
Views: 15,699
Keywords: virtual production, virtual production technology, octane, octane brigade, nvidia omniverse, gaussian splatting, omniverse, switch light, future of filmmaking
Id: hLjJJ8LU5iI
Length: 11min 1sec (661 seconds)
Published: Wed Dec 13 2023