GAME ENGINE DEVELOPER Reacts to UNREAL ENGINE 5 Demo

Video Statistics and Information

Reddit Comments

Best explanation!

๐Ÿ‘๏ธŽ︎ 8 ๐Ÿ‘ค๏ธŽ︎ u/No-1HoloLensFan ๐Ÿ“…๏ธŽ︎ May 16 2020 ๐Ÿ—ซ︎ replies

Around the 10:15 mark he breaks down the billion-triangles statement. Quite interesting: even if the number of triangles = the number of pixels, 1080p is circa 2 million pixels and 4K circa 8 million; by that calculation, 1440p is around 3.7 million triangles. Interesting.

๐Ÿ‘๏ธŽ︎ 10 ๐Ÿ‘ค๏ธŽ︎ u/tandeh786 ๐Ÿ“…๏ธŽ︎ May 16 2020 ๐Ÿ—ซ︎ replies

You can really see that he is passionate about it.

By far the best video with some sort of "insider" perspective so far.

(BTW, if you have a better one, please send me the link; I want to see it. This was a real treat)

๐Ÿ‘๏ธŽ︎ 9 ๐Ÿ‘ค๏ธŽ︎ u/Dorjcal ๐Ÿ“…๏ธŽ︎ May 16 2020 ๐Ÿ—ซ︎ replies

So the part that sticks out to me the most is when they walk into the room with the hundreds of statues. He talks about how amazing it is, but also how the demo's file size must be HUNDREDS of GB. So what I'm getting from that is that a full-fledged game using the entirety of the technology in this demo is just not feasible. I get that there's compression, blah blah blah, but that would have also been used in this tech demo to a certain degree; probably very little, though, to fully show off the tech.

Ok, so now the "XBot" in me starts to consider the advancements in compression technology that Xbox has strived for this gen, rather than just going for raw throughput and I/O speeds like the PS5 seemingly has. So if a game of this fidelity requires that much of a hard-drive footprint, and Xbox focused more on compression tech (like BCPack), then wouldn't the Xbox be more capable of running an ACTUAL game built on UE5? I'm sure the PS5 would probably fare better in a head-to-head comparison on the demo (especially that flight scene), with a straight download of the entire demo running from its uncompressed data. But in an ACTUAL real-world application, where assets are heavily compressed to fit within a game's file size, perhaps the Xbox will perform better?

Two scenarios here:

1- Load the 500GB (just a guess) tech demo on the two consoles and run it. PS5 runs it better??

2- Compress that same demo into a 50GB file and then run it. Xbox runs it better??

Just a thought...

๐Ÿ‘๏ธŽ︎ 6 ๐Ÿ‘ค๏ธŽ︎ u/Carsickness ๐Ÿ“…๏ธŽ︎ May 17 2020 ๐Ÿ—ซ︎ replies
Captions
Hello guys, my name is TheCherno, and today we're going to be taking a look at Unreal Engine 5's amazing new demo. So yesterday I woke up to a lot of messages on every platform telling me to check out this new demo that Unreal Engine has released, and I thought it would be fun to give you guys my thoughts on it. I have actually seen it once before, so this isn't going to be a completely new reaction video or anything like that; it's more going to be my thoughts on this video. But the time that I did see it was quite brief, so this is going to be a little more of me taking my time, enjoying the video, and letting you know what my thoughts are. I'm actually really excited to look at this more closely.

Now, for those of you who don't know who I am: I like to make game engines. I worked as a software engineer at EA on various game engines. First I worked on EA's primary mobile game engine, called Osiris, and then later Frostbite; in fact, see, I have the Frostbite pen, this is all I have now. And then I quit all of that because I wanted to work on my own game engine, called Hazel, as well as make YouTube videos for you guys, so that is what I am focusing on now. Okay, so without further ado, let's dive in and take a look at Unreal Engine 5. I'll have a link to this video in the description below, of course.

"Hello, I'm Brian Karis, technical director of graphics here at Epic Games." "Hi, I'm Jerome Platteaux, art director of special projects." I just want to say, straight away, that this kind of engineer-artist pairing completely reminds me of my life. Back at EA I did a bunch of various media things, because of course I have a YouTube channel, I can edit videos and all of that, and I had a friend (I'm still friends with him, actually), the technical artist we had on the team, and the two of us would always do things together. We made various silly videos for internal EA use; we even went to GDC together in San Francisco. It was a good time, and this is just so nostalgic.

"A few years ago, we got together as a team and brainstormed where we thought we could push forward the state of the art in real-time graphics. There are two key areas that stood out. The first: dynamic global illumination." Yeah, dynamic global illumination, GI. This is definitely one of the areas that probably needs the most improvement. In fact, doing GI dynamically is something the industry has been trying to do for a very long time. It's actually one of the big reasons why we're separated, so to speak, fidelity-wise, from the film industry and offline graphics. Global illumination, for those of you who don't know, is more or less a simulation of how light bounces in the real world. Light doesn't just hit an object and then completely disappear; it bounces off an object and then hits another object, and this culmination of light bounces is what gives every object you see in the real world its color. Now, in offline graphics, in visual effects for movies and all that, they're able to use ray tracers, which is basically a very realistic way of rendering something, but it's very, very slow, and using a ray tracer enables you to calculate the light bounces perfectly. In real-time graphics, that's not really possible to do at the same scale. That's why there's so much buzz about all this real-time ray tracing stuff, the RTX stuff: it's amazing because it lets us do certain ray tracing tasks in real time.
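For illustration, here is the bounce idea he's describing, in the shape of a toy recursive ray tracer. All types here are placeholders (the Scene interface is illustrative, not Unreal's API):

```cpp
// Minimal color type for the sketch.
struct Color {
    float r, g, b;
    Color operator+(Color o) const { return {r + o.r, g + o.g, b + o.b}; }
    Color operator*(Color o) const { return {r * o.r, g * o.g, b * o.b}; }
};

// Placeholder scene interface; a real renderer would intersect geometry.
struct Ray {};
struct Hit { Color albedo; };
struct Scene {
    bool  intersect(const Ray&, Hit& hit) const;    // find nearest surface
    Color skyLight(const Ray&) const;               // light from the sky
    Color directLight(const Hit&) const;            // light straight from sources
    Ray   scatterOff(const Hit&, const Ray&) const; // pick a bounce direction
};

// One light path: direct light at the hit point, plus whatever arrives via
// a recursive bounce. Averaged over many random paths, this converges to
// full global illumination -- the thing offline ray tracers compute exactly.
Color trace(const Scene& scene, const Ray& ray, int bouncesLeft) {
    if (bouncesLeft == 0) return {0, 0, 0};         // stop bouncing
    Hit hit;
    if (!scene.intersect(ray, hit)) return scene.skyLight(ray);
    Color direct   = scene.directLight(hit);
    Color indirect = trace(scene, scene.scatterOff(hit, ray), bouncesLeft - 1);
    return hit.albedo * (direct + indirect);
}
```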
It can give us perfect reflections, or, possibly more importantly, imperfect reflections, meaning blurry reflections, which are very difficult to do otherwise. It can give us beautiful ambient occlusion; it can give us absolutely perfect shadows. All of these things we can now do in real time, but nowhere near at the same scale. In fact, we rely on a lot of denoising algorithms to try and make the scene look good, because of the low number of samples, or rays, that we're actually able to calculate. So because of all this, global illumination, which essentially requires ray tracing, is very difficult to do, and we have a lot of approximations that try to simulate this light influence, but ultimately it's very difficult.

Now, doing all of this offline is still something the industry does quite a lot, meaning we take our time to calculate absolutely perfect global illumination for a scene and then bake it into some textures, which we can then just sample at run time. It's very, very fast, but it means that everything has to be static. We can of course have multiple versions of these, so we could have a morning scene, an afternoon scene, and an evening scene's worth of global illumination, but if the scene changes dynamically, that data is useless. Maybe we have a cave, and a rock moves away and reveals all this light entering the cave, and that light has to bounce around the cave and give it that ambient lighting; we can't really do that, because it has to be dynamic. And that's what they're talking about here. "Beautiful bounce lighting, instantaneously. I don't have to be constrained to do a game where the world has to be static, and I'm able to iterate a lot faster." Yeah, that's important as well. It's not even about making stuff dynamic: even for a static world, if you're able to change the lighting as you're working on that world and see it change instantly, that means quicker iteration times. It makes development of the game a lot faster, because you don't have to wait hours for your computer to figure out where on earth the light bounces are going.
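A minimal sketch of why baked GI is fast but static: the bounce solve happens offline and is written into a lightmap texture, and run-time shading just samples it. Placeholder types throughout, not engine API:

```cpp
// Sketch of the baked-GI runtime path.
struct Color { float r, g, b; };
struct Vec2  { float u, v; };

struct Texture2D {
    Color sample(Vec2 uv) const;   // bilinear fetch of the baked texel
};

struct SurfacePoint {
    Color albedo;
    Vec2  lightmapUV;              // where this point's GI was baked
};

Color directLight(const SurfacePoint&);  // still computed dynamically

Color shade(const SurfacePoint& p, const Texture2D& lightmap) {
    Color direct   = directLight(p);
    Color indirect = lightmap.sample(p.lightmapUV);  // frozen bounce light
    // If a rock moves and sunlight floods the cave, every baked texel is
    // now stale; nothing short of a re-bake fixes it.
    return { p.albedo.r * (direct.r + indirect.r),
             p.albedo.g * (direct.g + indirect.g),
             p.albedo.b * (direct.b + indirect.b) };
}
```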
"All of this is our new system, Lumen. We're about to show you what it is capable of, but first, there was another area that we thought we could push forward: truly virtualized geometry. Artists won't have to be concerned over poly counts, draw calls, or memory; they can directly use film-quality assets and bring them straight into the engine." "And that's a big deal for artists. I just want to be able to import my ZBrush model, my photogrammetry scan, my CAD data, without wasting any time optimizing, creating LODs, or lowering the quality to make it hit framerate. In the end, that's what it's all about."

That's what I like about this as well, by the way. Everything I've heard so far is not just an end result in terms of visual fidelity; these are actual workflow improvements. That's a big deal, because if artists can spend less time on manual labor tasks, such as reducing the poly count of a very high-quality model, and can just put it in and let the engine take care of it automatically, that's honestly amazing. They mentioned photogrammetry scans, and that's actually a really good example of this. If I go out with my camera and take an amazing, high-quality photogrammetry scan of a rock in the real world, that's great, it's going to look amazing, but I can't really use it in the context of a real-time application or a game, because it's just too high quality. So I have to go in and remove everything extra. Doing this automatically can sometimes lead to a lot of visual artifacts, so it's good to try and clean it up yourself, but the fact that you have a rock with millions of triangles and you have to spend time cleaning it up takes the fun out of everything. Who cares about this rock? I'll just build a rock from scratch, you know? Sorry, amazingly realistic rock, I can't use you, man; you're too good. So this is good: "It just works. And we call this new technology Nanite." What? That's brilliant. "Here it is in Unreal Engine 5, running live on a PlayStation 5."

All right, so I don't know the specs of a PlayStation 5, not off the top of my head. I know they're supposed to be quite powerful, and I do want to emphasize that it is new hardware, which means that, on one hand, they probably couldn't do work as amazing with it as they'll be able to five years from now, because at the introduction of any new platform it takes a little bit of time to get used to it. But on the other hand, since it is a dedicated piece of hardware, and they were presumably provided with very good specifications, and they probably collaborated with Sony on this and spent a lot of time on it, potentially they could extract a lot more performance from it than from just a PC. The reason is that if you're running on dedicated hardware, which I imagine has ray-tracing cores and other dedicated silicon, it's actually easier to optimize: if you know exactly what hardware you're running on, you can optimize specifically for that hardware. So I'm kind of torn. On one hand, PlayStation 5, okay, that's cool, it's new hardware; on the other hand, since it is a specific piece of hardware, there's a lot of optimization potential there as well. [Music]

This, by the way... I have so many things to say that at this rate I feel like this is going to be an hour long, so I might keep it a little less detailed. Wow, right off the bat this reminds me of the photogrammetry-style tests that Embark were doing in Unreal Engine, maybe just because of the aesthetic. But looking at this scene, the detail is immediately, absolutely astounding. It looks like it definitely could be an offline render. And the sound, the sound is really nice as well.

"Let's stop a moment and take a look at some of the key features of this demo. Much of what you see was built with Quixel Megascans assets, but instead of using the game versions, we used the cinematic versions, which would typically only be used in film. They're around a million triangles each, and thanks to virtual texturing, they all use 8K textures as well. Nanite can render an insane number of triangles very quickly. There are over a billion triangles of source geometry in each frame, billion with a B, that Nanite crunches down losslessly to around 20 million drawn triangles." Okay, 20 million: that alone is quite a lot. What does that many triangles look like? I mean, maybe a PS5 can handle that, but wow. Okay, this is insane. These are the triangles, each a different color? Yeah, so what they're showing here is essentially a color ID for each triangle: each one of these colors represents a different triangle. It's probably not enough colors, because to represent all 20 million of them there aren't even enough individual colors available to actually visualize all of these triangles, so they're probably having to reuse colors. Yeah, that's a lot of detail.
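That triangle-ID view is a classic debug visualization: hash each triangle's index into a pseudo-random color. An 8-bit-per-channel display offers only about 16.7 million distinct colors, so with 20 million triangles some colors must repeat. A sketch of the idea, not Nanite's actual code:

```cpp
#include <cstdint>

// Hash a triangle index into an RGB debug color (one of ~16.7 million).
// With 20 million visible triangles, color collisions are unavoidable.
uint32_t triangleDebugColor(uint32_t triangleId) {
    uint32_t h = triangleId;
    h ^= h >> 16; h *= 0x7feb352dU;   // cheap integer hash ("lowbias32")
    h ^= h >> 15; h *= 0x846ca68bU;
    h ^= h >> 16;
    return h & 0x00FFFFFFu;           // pack as 0xRRGGBB
}
```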
The thing is, with triangles in general: they mentioned that it's over a billion triangles, and that's all great, but the reality is that if you're rendering to a full HD display, for example, that's only about 2 million pixels, and a 4K display would be four times that, so around 8 million pixels. If you have that many pixels and a billion triangles, then even if each triangle takes up one pixel, that's still too much data. So what they're having to do here, I imagine, is some kind of dynamic level-of-detail algorithm that's actually able to work out which triangles of this 3D model are going to have the most influence on the outcome of the scene, and those are probably the triangles they end up picking. But everything ends up so small in this scene; these triangles are tiny. Some of them you can see: these triangles here obviously are quite large, you can see they're in the shape of triangles, here and over here as well, but most of them, since they're far away, are just one pixel large. They've got to have a good way to cull those lists of triangles down, and obviously a really good data structure that is very cache-friendly, so they can iterate through those triangles and get the right data for each triangle as fast as possible. Then they have to work out which triangles to keep, which to discard, and essentially what data from that set of vertices they actually want to keep, and then use that in the different calculations that would be done in a shader, such as sampling from textures, lighting equations, all of that. So there's a lot of work going on.
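One plausible shape of the "right amount of detail" logic he's describing is picking a precomputed LOD so triangles project to about one pixel. This is a sketch with a placeholder Mesh type; Nanite's real scheme works on hierarchical clusters of triangles and is far more sophisticated:

```cpp
#include <cmath>

// Placeholder mesh with precomputed LODs, finest (0) to coarsest.
struct Mesh {
    int   lodCount() const;
    float averageEdgeLength(int lod) const;  // triangle edge size, world units
};

// Pick the cheapest LOD whose triangles still land at or below roughly
// one pixel on screen; anything finer than a pixel is wasted work.
int selectLod(const Mesh& mesh, float distance, float fovY, int screenHeight) {
    // Projected size in pixels of one world-space unit at this distance.
    float pixelsPerUnit =
        screenHeight / (2.0f * distance * std::tan(fovY * 0.5f));
    for (int lod = mesh.lodCount() - 1; lod > 0; --lod)  // coarsest first
        if (mesh.averageEdgeLength(lod) * pixelsPerUnit <= 1.0f)
            return lod;
    return 0;  // close up: even the finest LOD has triangles over a pixel
}
```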
"Nanite achieves detail down to the pixel, which means triangles are often the size of pixels." Yeah, exactly. "This amount of geometric detail requires shadows to be pixel-accurate as well, and Nanite can do that too." Mm-hmm. I do really wonder how they're doing shadows specifically. They mentioned that Nanite does that, which is the virtualized geometry thing, but I guess that since a lot of this is done on a per-pixel basis, they're able to use those billion triangles when working out what occludes the light source and what casts a shadow.

"Speaking of lighting: all of the lighting in this demo is completely dynamic, with the power of Lumen. That even includes multi-bounce global illumination. No lightmaps, no baking. Here is the demo without GI." Okay, so this is without GI: this is just a directional light with a shadow caster, no bounce lighting at all, meaning that once the light hits a surface, that's it, the light disappears. And you can see that it looks, well, it looks terrible; it's completely dark, you can't see anything. "And here it is with Lumen enabled." The rest of what you're seeing, which is beautiful, is all of that light bouncing around this entire environment. All this ambient light is actually calculated by, I guess, in some way simulating or tracing how the light bounces off the surfaces that it does directly hit. "You can move the light, and the bounce changes instantly."

Okay, this is interesting. They're talking about the bounce changing instantly as they move the light, but I can see here, if I go frame by frame, that this area, as the light changes, doesn't immediately change; it kind of fades out. It's difficult to say whether this is a deliberate visual effect to make the light transition seem smoother, which it very well could be, but what they might also be doing is taking a couple of frames to actually recalculate the new bounce lighting. The bounce lighting can still be dynamically calculated, baked into some kind of lightmap, and then applied that way: it's cached, and you can keep reusing it without having to recalculate it every frame, and I imagine they would probably be doing some form of that. Then, if that lighting calculation takes a couple of hundred milliseconds, or a couple of frames, they can keep reusing the old results while the new ones are being generated; that way it doesn't change instantly, but you get a transition. Then again, they could be doing all of this in real time, I don't know; that fade just seems a little bit interesting. But they are saying it's instantaneous, so maybe it is. "You move the light and the bounce changes instantly." Okay, let's keep going.
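His caching hypothesis, amortizing the bounce solve over a few frames and fading between results, might look roughly like this. Purely speculative; the demo doesn't show how Lumen actually schedules its updates:

```cpp
#include <cstddef>
#include <vector>

struct Color { float r, g, b; };

// Exponentially blend a freshly computed (possibly several-frames-old)
// bounce-lighting solution into the cached one. The cache keeps lighting
// the scene every frame while the next solve is in flight, and the new
// result fades in over a handful of frames instead of popping.
void blendGiCache(std::vector<Color>& cached,
                  const std::vector<Color>& fresh, float alpha) {
    for (std::size_t i = 0; i < cached.size(); ++i) {
        cached[i].r += (fresh[i].r - cached[i].r) * alpha;
        cached[i].g += (fresh[i].g - cached[i].g) * alpha;
        cached[i].b += (fresh[i].b - cached[i].b) * alpha;
    }
}
// With alpha = 0.25f the cache covers ~95% of the change in about ten
// frames: a short fade much like the one visible when stepping frame by frame.
```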
"We've made some great additions to our audio system as well. Convolution reverb allows us to measure the reverberation characteristics of real spaces, like actual caves that we sampled, and reproduce them in virtual spaces." So this is really cool. First of all, I just want to point out that audio is a huge deal: if you watch this demo muted, it will feel completely different than if you watch it with headphones. It adds so much immersion; audio is incredibly important, and I do not doubt for a second that if they had not gone with high-quality audio and all this fancy audio technology, this just would not feel anywhere near as good. They also mention sampling real-world reverberation, and that's really cool and really useful. With physically based rendering, what people did was go out and sample how light reacts to different surfaces in the real world, how it bounces off them, so that we could use that in our physically based materials; that's part of the reason rendering looks so good now. The fact that they're doing essentially that same process for audio, actually measuring real reverb characteristics in real-world caves and seeing how sound reacts there, is really cool, and it's definitely going to bring a whole new level of realism. "Sound field rendering allows us to record and play back spatialized audio." So good. "All of this adds up to a more immersive experience." Yeah, I don't doubt that.
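Convolution reverb itself is a well-known technique: record an impulse response in a real space, then convolve the dry signal with it. A textbook direct-form sketch (real-time engines would use FFT-based partitioned convolution instead, since the naive form is far too slow):

```cpp
#include <vector>

// Direct-form convolution of a dry signal with a measured impulse
// response (the recorded reverberation of a real space, e.g. a cave).
std::vector<float> convolve(const std::vector<float>& dry,
                            const std::vector<float>& impulse) {
    std::vector<float> wet(dry.size() + impulse.size() - 1, 0.0f);
    for (std::size_t n = 0; n < dry.size(); ++n)
        for (std::size_t k = 0; k < impulse.size(); ++k)
            wet[n + k] += dry[n] * impulse[k];  // O(N*M): fine offline,
    return wet;                                 // FFT needed in real time
}
```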
The animations are really on point as well, by the way; all of this stuff feels really good. And I really like this: you can see that this shadow here is a little bit red as well. It looks like it's blending in with the subsurface color, which is again really nice and adds a whole extra level of realism. You can see the light going through her fingertips, and the subsurface scattering here is making her hands look a little bit red; that goes a long way toward really good, realistic-looking skin. So they've done a really good job on that as well.

"This swarm of bats was created with the Niagara effects system. Particles in Niagara can now talk to one another and understand their environment like never before." This sounds amazing, first of all, but there's also the functionality to run fluid simulations, like you see in the water below. I have to say that so far, I think the worst part of this demo is the water. I mean, it doesn't look terrible, but I don't know; characters wading through water, that's been done better before. Technically, though, they could be running this on a completely different level of simulation, so it's less static, and maybe that's why. [Applause]

"The demo runs on our Chaos physics system. Here we are using it to accurately simulate the rigid bodies of the falling rocks and the cloth of her scarf." Yeah, I imagine each one of those rocks is a complete physics body and reacts to the environment like it should, which is really cool. And of course the character has some kind of scarf, or a cape or something, to get that cloth physics in there; characters always do. I love the ambient particles floating around too, the ones that don't really collide with anything; that, as a whole, goes a long way for visual fidelity as well.

"Now that the environment is so complex, we've needed to greatly improve our animation systems to adapt. We've added predictive foot placement and motion warping, which dynamically modifies IK and body position to look more natural." That's so true as well; I'm glad he mentioned that. They've come a long way in other areas, so they now need to come a long way with their animation too: if everything you're rendering looks realistic but the animation doesn't, it's going to completely destroy your whole scene. So this animation looks very good, and I think that's very important. "To allow the character to more realistically interact with the environment, we've added the ability to trigger seamless contextual animation events, like her hand on the door." Mm-hmm. Yeah, so she's able to put her hand on the door, and that's obviously not a premade animation; that's a contextual animation that the animation system has probably figured out based on the context of where she's moving and how she's moving through the scene. I think that's incredible, and that kind of procedural animation really is the future, I think. For example, if your character is wounded, maybe they've been shot or something, and they're staggering and grabbing onto nearby objects because they can barely walk, that's going to go a long way, and I think that's really cool.

"Dynamic GI is amazing, not just for speeding up iteration but also for its impact on gameplay. Any light source can move while still..." Yeah, like the fact that this scene is completely dark: there are no light sources, there's no ambient light in the scene at all, it's enclosed, and the only light entering from outside is quite far away. But she's got this particle that's emitting light, and as she shines it around you can see that the areas around her are very dark but you can still make them out, and that, I imagine, is because of that dynamic global illumination, the light bouncing. "...while still having beautiful bounce lighting and dynamic illumination." Yeah, and you can see that specular reflection he's talking about as well. "...and all these Niagara-powered bugs reacting to the light." Yeah, and the particles are reacting to the light too. "Lumen not only reacts to moving light, but also to changes in geometry." Yeah, well, as I mentioned, of course: it's dynamic. All the rock sounds are really good, by the way. But also, you can see here what I was talking about: you can bake lighting for a scene and it can look great, but not if you need to remove part of the scene like this, and not in one go. You can see these rocks gradually falling away; if we rewind a little bit, the rocks are falling slowly away and revealing this, and as that's happening, the scene is getting brighter. I love the lighting on the actual falling rocks as well; it looks really, really good. And yeah, there's that hole up there, really cool, and you can see it's completely changed the lighting of the whole scene.

"Remember what we mentioned about the assets: this statue is imported directly from ZBrush and is more than 33 million triangles. No baking of normal maps, no authored LODs." Yeah, so that's great: no normal maps on this at all, and no LODs, no level-of-detail models, which are two things worth talking about. Essentially the point of this, I think, is that you can take this amazing high-quality model and artists don't even have to worry about poly counts or triangle counts. I mean, they probably still will, on one hand, because if you had this in Maya or Blender or something and it was that many triangles, it's going to be very difficult to work with; even if Unreal Engine can handle it, maybe the other tools can't necessarily. But anyway, having the ability to sculpt and model with seemingly infinite detail is really cool, because it means you don't then have to go through and create all these LODs, these level-of-detail models, which are basically lower-poly-count, lower-quality versions of the original model that you need for different rendering scenarios, like when the camera is very far away from the model. Being able to take your source asset, just drop it in, and have Unreal Engine take care of the rest in real time, right in the runtime, is amazing. And it's the same with normal maps: normal maps are something we use to basically simulate more triangles than there actually are. You apply a texture that has per-pixel data describing where fake triangles, fake surface detail, would be if it were really there. Obviously, if we have 33 million triangles, we don't need normal maps, because we don't need that cheap way of getting more detail: we have the detail.
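What a normal map does, sketched in shader-style C++ with placeholder types (Vec3, Fragment, and the helpers are illustrative, not engine API):

```cpp
// A normal map stores a per-pixel direction that perturbs the surface
// normal, faking geometric detail the mesh doesn't actually have.
struct Vec3 { float x, y, z; };

Vec3 normalize(Vec3 v);                   // assumed helper
Vec3 rotateToSurface(Vec3 tangentSpace);  // tangent-to-world transform

struct Fragment {
    Vec3  vertexNormal;  // interpolated from the mesh's real geometry
    float u, v;          // texture coordinates
};

struct Texture2D { Vec3 sample(float u, float v) const; };

Vec3 shadingNormal(const Fragment& frag, const Texture2D* normalMap) {
    if (!normalMap)
        // 33 million real triangles: the interpolated mesh normal is
        // already per-pixel accurate, so no faked detail is needed.
        return normalize(frag.vertexNormal);
    Vec3 t = normalMap->sample(frag.u, frag.v);
    // Unpack from the [0,1] texel range to a [-1,1] direction.
    t = { t.x * 2 - 1, t.y * 2 - 1, t.z * 2 - 1 };
    return normalize(rotateToSurface(t));
}
```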
"And we can do more than a single statue. There are nearly 500 of that exact statue, at the same detail level, placed in this room, for a total of over 16 billion triangles." Yeah, that's amazing. That is amazing. Just having to sort through all this data... I mean, these are instanced and everything, but I'm sure that data is probably over 100 gigabytes, if not hundreds of gigabytes, because there's so much texturing, so many triangles, so much data. Just reading that data: that's why it's interesting to look at the PS5 architecture as well. I'm sure a lot of this is being streamed in from disk every frame, and that disk, that SSD, has to be blazing fast. Whatever technology they're using for this demo, that relationship between disk and VRAM and all that stuff has to be really tight, and I think it might even be specific to this architecture.
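Instancing is what keeps ~500 identical statues from costing ~500 copies of the geometry: the mesh lives in memory once, and each copy adds only a small transform. A sketch over a hypothetical renderer interface:

```cpp
#include <vector>

// Placeholder renderer types; the point is the data layout, not the API.
struct Mat4 { float m[16]; };          // one 64-byte transform per statue
struct Mesh { int indexCount() const; };

struct Renderer {
    void bindMesh(const Mesh&);                      // heavy data, bound once
    void drawInstanced(int indexCount, int instances);
};

// One 33M-triangle mesh, many placements.
void drawStatues(Renderer& r, const Mesh& statue,
                 const std::vector<Mat4>& transforms) {
    r.bindMesh(statue);
    // A real renderer would upload `transforms` to a GPU buffer that the
    // vertex stage indexes by instance ID.
    r.drawInstanced(statue.indexCount(), static_cast<int>(transforms.size()));
}
```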
"There are hundreds of billions of triangles." Yeah, really cool. [Applause] [Music] "So with Nanite you have limitless geometry, and with Lumen you have fully dynamic lighting and global illumination." Yeah. "And it's all running on a PlayStation 5." I should probably have looked at the PlayStation 5 specs and architecture before this video; I'm assuming they're probably released by now. But yeah, it looks absolutely amazing. I also wonder what resolution this is: I'm watching it in 4K, so I assume maybe they're rendering it at that. Just look at all that detail; that's really, really amazing. Look how everything's falling apart around her, because of course that makes a technical demo seem better. She's really not having a good day. "This doesn't need to be constrained to small rooms; it can stretch all the way to the horizon." I was waiting for this. I mean, I've already seen this demo, so I knew it was coming, but I was still waiting for it. Just look at that: there's so much stuff going on. I feel like we should go through this frame by frame. That motion blur is pretty nice; look, it follows those rocks. This is really cool.

Also, it's kind of hard to tell, because obviously this video is compressed and put on YouTube, but some of the artifacts I'm seeing are probably video compression. It's difficult to say whether they're actually noise within the renderer or just video compression; I'm assuming video compression, though some of them are probably noise. But yeah, this looks phenomenal. This looks like a movie; this looks like it's been offline rendered, for sure. And the thing that strikes me as well is that you have to remember this is not just rendering. This is not a scene that's been completely scripted and rendered out; a lot of this is interactive. The fact that it's interactive, the fact that all of this has to be simulated and computed in real time, rendering aside, that's also amazing. And obviously, with the billions of triangles of source geometry they're talking about, there's more to it than getting it on the screen: it has to interact with the environment around it and with the player, and that's a whole other problem, dealing with models and assets of that fidelity and quality and making them interactive as well. It's not just the rendering that is impressive here, for sure. [Music]

The depth of field is pretty nice as well; they've always had good depth of field, and I think that's really important. [Music] That's probably all in real time as well. Yeah, that's amazing. 2021? That's a while away. All right, so my final thoughts on this: it's incredible. It's really good. There's a level of detail here that I don't think we've ever seen before in any game engine. I think they've taken all of that Fortnite money and actually put it to very good use, because this stuff is for sure industry-leading technology. Being able to deal with that amount of source data in real time, work out what you need and what you don't, how to put it on the screen, and how to make it interactive: that's a huge problem, and they're solving it seemingly very well. I can't wait to actually dig through this and see how it all works; I think they did mention that a technical breakdown would be coming, so I'm really excited for that. If you guys want to see me talk about that as well, drop a comment below. And in general, what do you guys think about all this tech? Leave a comment below with what you think the most impressive part was; there are so many, but I'm interested to see what you liked most of all. If you did like this video and you want to see more of this stuff, don't forget to hit the like button and subscribe to this channel. Thank you guys for watching; I will see you next time. Goodbye! [Music]
Info
Channel: The Cherno
Views: 1,342,707
Keywords: cherno, ue5, unreal engine v, unreal engine 5, epic, unreal engine, reaction, demo, game engine, game engine dev, rendering, graphics, amazing
Id: 9PmjQvowfAI
Length: 30min 56sec (1856 seconds)
Published: Sat May 16 2020