Rendering Architecture | Game Engine series

Captions
Hey what's up guys, my name is the Cherno, welcome back to my game engine series. Last time we took a look at an introduction to rendering - definitely check that video out if you haven't already, because today we're going to continue where we left off and talk about the architecture of this whole renderer: we need to render graphics on the screen, so how do we actually do that in an engine? As promised, I've prepared a little PowerPoint presentation, so we'll jump into that in just a second.

First, though, a huge thank you to all the patrons who make this series possible - patreon.com/thecherno. Your support means the world to me, especially right now, because I'm so busy and I still want to be able to pump out these videos for you, and seeing that people are so interested in supporting this series is absolutely amazing to me. If you're interested in becoming a patron, definitely check out that link. You'll receive a bunch of rewards, such as videos early (although that's kind of on hold at the moment, since my life is so crazy), and access to things like source code for Hazel that has already been written in advance - the stuff we're covering now has largely been implemented already and is usable in a branch. It's also a way to get more say in where Hazel is actually going and to steer it in the right direction. I commonly post things on Discord as well, in the patrons channel, about Hazel's development as I write the code - it's almost like a development blog at times. I try to do that to keep you updated with what I'm actually working on, because I work on Hazel independently, and what I implement eventually - way later - makes its way into videos such as this one, where I explain what I'm doing and talk about how we're going to structure things. Anyway: rendering architecture, let's dive in.

Okay, so: rendering architecture in the Hazel engine. First of all, that beautiful background is not rendered in Hazel - it's actually a photo I took, pretty much in the south of Victoria here in Australia. If you want to see more of my photos, there will be a link in the description, because I do love photography as well. Maybe one day we can render something like this in Hazel - that would be quite a good goal to aim for.

Anyway, there are two sides to this whole thing. In the introduction to rendering video I covered a lot, and I don't want to overlap too much, so we're going to take a more practical approach now. The way I see it, there are two sides to a rendering system in a game engine. One side is what I call the render API - this terminology is something I've essentially made up; it makes sense, and other people have used it, but it isn't from some textbook, so don't expect that. The render API is the platform- and graphics-API-specific part of the whole system we're building. That means it's specific to OpenGL, specific to Vulkan, specific to DirectX - this side of the line is the stuff we actually tie in to those platforms and APIs. The other side, which we'll get into once we get through all of these points, is the platform-independent side of the system.

So let's take a look at what we've got. The render context, first: again, this side is an implementation in OpenGL, or in Vulkan, or in whatever the actual API is - stuff we have to implement per API, and there's no way around that. So what do we build? We build a render context, a swap chain, a framebuffer, a vertex buffer - it might be easier to just list all of these at once - an index buffer, a texture API, shaders, API state, pipelines, render passes, and then we'll move on to the other side.

Let's talk about this for a bit. These are essentially what you could call render API primitives - and I've probably missed a bunch of things, this is just a loose, off-the-top-of-my-head list. The way we're going to start, as you'll see in the next slide, is by implementing OpenGL. Using that as an example, what we need to do is make - you could say - a class for each of these things: an implementation for OpenGL of the render context, the swap chain, the framebuffer, the vertex buffer, the index buffer, because these are the render primitives through which we tell our graphics card what we want it to do. Take a vertex buffer: we have a buffer of data, we want to send it to the GPU and then actually be able to use it - maybe statically, maybe dynamically, however we want - and we need an API to do that. The cool thing is that we implement this once per graphics API: we'd have an implementation for OpenGL, we'd have an implementation for Vulkan, and once we have that, we can use it from the renderer side of things agnostically.
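As a rough illustration of how one of these per-API primitives might look, here's a hypothetical `VertexBuffer` with a static `Create` factory that picks an implementation based on which API the engine was started with. None of this is Hazel's actual code - the names are made up, and the OpenGL class stubs the GPU upload with a CPU-side copy (the real thing would call `glCreateBuffers`/`glNamedBufferData`) so the sketch stays self-contained:

```cpp
#include <cassert>
#include <cstdint>
#include <cstring>
#include <memory>
#include <vector>

// Which render API the engine was told to use at startup.
enum class RendererAPIType { None, OpenGL, Vulkan };

// Hypothetical global selector; a real engine would set this once
// during engine initialization.
inline RendererAPIType g_RendererAPI = RendererAPIType::OpenGL;

// The platform-agnostic interface the renderer codes against.
class VertexBuffer
{
public:
    virtual ~VertexBuffer() = default;
    virtual void SetData(const void* data, uint32_t size) = 0;
    virtual uint32_t GetSize() const = 0;

    // Factory: the renderer calls this and never names a concrete API.
    static std::unique_ptr<VertexBuffer> Create(const void* data, uint32_t size);
};

// OpenGL-side implementation; the GPU upload is stubbed with a CPU copy
// here so the example compiles without any graphics context.
class OpenGLVertexBuffer : public VertexBuffer
{
public:
    OpenGLVertexBuffer(const void* data, uint32_t size) { SetData(data, size); }

    void SetData(const void* data, uint32_t size) override
    {
        m_Storage.resize(size);
        std::memcpy(m_Storage.data(), data, size);
    }

    uint32_t GetSize() const override { return (uint32_t)m_Storage.size(); }

private:
    std::vector<uint8_t> m_Storage; // stand-in for the GPU-side buffer
};

std::unique_ptr<VertexBuffer> VertexBuffer::Create(const void* data, uint32_t size)
{
    switch (g_RendererAPI)
    {
        case RendererAPIType::OpenGL:
            return std::make_unique<OpenGLVertexBuffer>(data, size);
        // case RendererAPIType::Vulkan: return std::make_unique<VulkanVertexBuffer>(...);
        default:
            return nullptr;
    }
}
```

The renderer only ever calls `VertexBuffer::Create(...)`; switching to Vulkan would mean adding a `VulkanVertexBuffer` case and flipping `g_RendererAPI`, with no renderer-side changes.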
In other words, it doesn't matter which API is underneath: we have one implementation of the renderer which just says, okay, first create a render context and a swap chain, then create any framebuffers you might need for actual rendering; then I want to render a triangle, so upload some vertex buffer data and some index buffer data, create a shader of course, create a render pass or a pipeline or anything else we need, and then kick off rendering that triangle. That's it. To render that triangle we use this render API from the renderer, and the cool thing is that if we suddenly decide - actually, let's render using Vulkan or DirectX 12 or whatever - this part, the part containing the actual implementation of those APIs, changes, but the other part does not. We just hit a switch that says "now do it in DirectX 12", and our logic for drawing a triangle - the actual render commands we issue - stays the same.

In the introduction to rendering video I talked about how hard it was to draw the line, and this is the line I was talking about. I've drawn it and said: this is on this side, that is on that side - and now you can actually see what belongs to each side. If I want to add a texture to the triangle, what do I do? I just say: okay, load a texture, bind it, and we'll draw using that texture - and we'll update the shader code to actually sample from it and draw those pixels. But how do we use that texture? We just say, "we want to make a texture". The implementation details - what does it even mean to make a texture, am I going to read it from a file, what am I going to do with that data, is that implemented by DirectX 12 or Vulkan or OpenGL - all of that lives on the other side of the line. This is just an API that gives us access to that lower-level code, and we use this API in our renderer to render what we want. The benefit, obviously, is that it doesn't matter which API is under the hood - we're creating another API on top of all those other APIs, one that does exactly what we want in a much, much simpler way, with all the hard work done behind the scenes. That's the whole idea.

So let's look at what the renderer would be responsible for. First, the actual 2D and 3D rendering - whether that's forward or deferred - is going to be implemented largely using the primitives from the render API. Now, I do have to mention that there are differences between the APIs. I didn't just say last episode that all of this is going to be so hard, only to now list a bunch of bullet points like everything's going to be sweet. Implementing a deferred renderer in OpenGL versus something like Vulkan is genuinely different, because Vulkan simply has more of these primitives than OpenGL does. In OpenGL you just create a bunch of framebuffers and that's kind of it, whereas Vulkan has things like pipelines and descriptor sets - just more actual content here - which is why OpenGL is completely different to something like Vulkan or DirectX 12: the approach to some of these advanced rendering techniques really differs. So it's going to be interesting when we get into things like deferred rendering or even tile-based rendering, because the way we set this up might differ, and we might have to extend this render API whenever we need to write more Vulkan code than is otherwise possible with these primitives - because maybe you can't achieve an efficient deferred renderer with just these primitives, but Vulkan has a better way of doing it using more control. Vulkan obviously has things like synchronization primitives - fences and semaphores and that kind of stuff - which OpenGL doesn't, so there are going to be differences. It's not going to be as straightforward as this might look. I don't want you to think that I made it out to be so hard last episode and now suddenly it's easy - it's not going to be easy, I'm just simplifying it so we can actually start, and so you don't feel too overwhelmed. Maybe feel a little bit overwhelmed, because I don't want to make this seem simpler than it actually is.

Because of that, we might have to say: you know what, it doesn't make sense to implement a deferred renderer with just these platform- and API-agnostic primitives. That's totally fine, because then we'll move that part of the code into the render API zone. The drawback is that when you make that decision, things usually get more efficient but you suddenly have more code, because anything you move into the render API zone needs to be implemented per API - suddenly we need a deferred rendering implementation for OpenGL, for Vulkan, for whatever APIs we choose to support. We obviously want to avoid that: what we actually want is as much code as possible in the platform- and API-agnostic section, because that means we don't have to write a separate implementation every time.

Why do we have this line at all? This might seem like a silly question, but I know some people might be a little confused. Think back to rendering a triangle: do we want to write that code four or five times, once per API - for DirectX 12, for Vulkan, for OpenGL, for Metal? No. What we want is to create all these primitives, and then, as I described, we create a vertex buffer, the texture, the shader, whatever we need, and the code to draw a triangle is handled once by Hazel. We write it once, and it branches off into all those APIs - it gets lower and lower level as it goes down, until it eventually hits whatever render API we're using, such as OpenGL or DirectX 12.

Anyway, back to the renderer. Things like a scene graph, a scene manager - that's not something you want to write raw OpenGL code for; it doesn't have anything to do with the actual rendering API. Then you have sorting: deciding what to render when. Some things might need to be rendered after certain other things because of blending, but we also typically want to sort things that use the same material, the same shader or the same texture together, so that we don't need to switch state as often. And culling: what fits into our frustum, maybe some kind of occlusion culling as well. Those systems again have nothing to do with any specific render API.
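That sorting idea - grouping draws that share a shader or material so state changes are minimized - is often done with packed sort keys. A rough sketch, with entirely hypothetical names (this is one common technique, not necessarily how Hazel will do it): the most expensive state changes go into the highest bits, so a plain sort groups draws by shader first, then by material, then by depth.

```cpp
#include <algorithm>
#include <cassert>
#include <cstdint>
#include <vector>

// One queued draw. The IDs are small integers handed out by the renderer.
struct DrawCommand
{
    uint32_t shaderID;
    uint32_t materialID;
    uint32_t depth; // quantized view-space depth
};

// Pack the most expensive state changes into the highest bits so that
// sorting groups draws by shader first, then material, then depth.
inline uint64_t MakeSortKey(const DrawCommand& cmd)
{
    return (uint64_t(cmd.shaderID   & 0xFFFF) << 48)
         | (uint64_t(cmd.materialID & 0xFFFF) << 32)
         |  uint64_t(cmd.depth);
}

// Sort the frame's draw list so equal-state draws end up adjacent.
inline void SortDrawList(std::vector<DrawCommand>& draws)
{
    std::sort(draws.begin(), draws.end(),
              [](const DrawCommand& a, const DrawCommand& b)
              { return MakeSortKey(a) < MakeSortKey(b); });
}
```

Note that none of this touches OpenGL or Vulkan - it lives entirely on the renderer side of the line.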
We don't want to have to write those systems for every single render API, so why not just have the renderer handle them? Alright - materials. Materials are very closely tied to shaders: essentially, a material is just a shader plus some uniform data. From there we'll probably have things like material instances, which are based on materials - that's typically how most renderers and most engines work. Again, that's something implemented on this side: client-side uniform storage, a linked shader, all of that applies to every single render API and isn't different per API, whereas something like a texture or a vertex buffer might be different per render API, which is why those need to be on the other side. Hopefully this is making sense. LOD - level of detail: meshes with lower quality, fewer polygons and triangles, the further away we render something, so we don't draw as many triangles. Systems like that don't belong in the render API either. Animation as well: it's purely a renderer-side implementation that doesn't need any render API support. Cameras too: maybe a camera is tied to a framebuffer, maybe a camera renders to a render target - that kind of stuff is implemented on the renderer side. In fact, render passes almost belong here as well; the reason I put them on the render API side is that they can be seen as a primitive and are implemented differently in different render APIs - but there will be an API for them in the renderer as well. And visual effects, such as particle systems.
Now, don't get me wrong - that stuff may require support from the render API. Things I didn't list here include batching and instancing, for example: batching is something that can usually be controlled from the renderer side, whereas instancing requires render API support. So if you suddenly decide you want to batch something using instancing, that requires instancing to be set up in the render API here, but it can then be used from the renderer. Post effects as well - and by that I mean specifically a post-effects system. If you break down what a post effect is: we render our scene to a framebuffer, and then we use a different shader to render a quad on the screen, using a vertex buffer, with our post effect applied. But for post effects it's also common that we need additional data we wouldn't otherwise need. A very simple post effect might be color correction - that's easy, because all we do is render our scene to a framebuffer, then render that framebuffer onto another quad using a different shader, display that on screen, and that shader does the color correction. Whereas if we want to do something like screen-space ambient occlusion - maybe not the best example, but let's go with it - we need other data from our original render pass, such as position or normal data. We want those normals in a separate buffer so we can then read them in our post-effect shader; we need extra information that we didn't need before we ran that post effect. So specifically I do mean a post-effects system: the way it handles those render passes and the additional data we need is something that would typically be implemented by the renderer.
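One way a post-effects system might deal with that "extra data" problem: each effect declares which scene-pass outputs it reads, so the renderer knows up front whether it needs to write, say, normals or positions into separate buffers. This is a speculative sketch - the names and shader paths are made up for illustration:

```cpp
#include <cassert>
#include <cstdint>
#include <initializer_list>
#include <vector>

// Data a post effect can ask for from the scene render pass.
enum class Attachment : uint32_t { Color = 1, Depth = 2, Normals = 4, Positions = 8 };

// A post effect declares which shader to run and which attachments it reads.
struct PostEffect
{
    const char* shaderPath; // hypothetical asset path
    uint32_t    inputs = 0;

    PostEffect(const char* path, std::initializer_list<Attachment> in)
        : shaderPath(path)
    {
        for (Attachment a : in)
            inputs |= uint32_t(a);
    }
};

// OR together every effect's inputs, so the scene pass knows up front
// which extra buffers it has to write this frame.
inline uint32_t RequiredSceneOutputs(const std::vector<PostEffect>& chain)
{
    uint32_t required = 0;
    for (const PostEffect& fx : chain)
        required |= fx.inputs;
    return required;
}
```

Color correction would only request `Color`, while an SSAO-style effect would add `Depth`, `Normals` and `Positions` - and the scene pass would allocate those attachments only when something actually asks for them.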
The renderer just uses this API that we've created, so that it doesn't have to use OpenGL - or any render API - directly. I hope you know which "render API" I'm talking about when I say render API: sometimes I mean this list, sometimes I mean OpenGL or Vulkan, that kind of thing, but hopefully it makes sense from context. Other things: render techniques like reflections and ambient occlusion - setting up reflection probes, rendering to cube maps, doing ambient occlusion whether as a post effect or something more advanced like HBAO, whatever other techniques we have available - those are typically implemented in the renderer. Sometimes we might need extra API support, but if you think about it, we should usually be able to get this render API strong enough that we don't need to implement finer-grained things inside it every time we want a different technique.

Now, I do want to say that this is where it becomes difficult. As nice as this might seem - we have a render API, we make it really strong, it handles everything, and every time I want to do anything in my renderer I write it once and it just works on every render API we support - that's a great dream to have, but in practice it's usually not a very optimized way of doing things. If you try to make a generalist render API, you have the same problems as if you try to make a generalist game engine: an engine that can be used for any game in the world, anything you can think of - is that going to be better than an engine I write specifically for my specific game? No, it's not. That's where this starts to become dangerous: when you get into advanced techniques that actually require more performance, are you going to be able to write a generalist API and get the same level of performance and optimization as if you just used raw DirectX 12 directly? Of course not. So we have to be really careful with this, and as we go along we'll run into stuff like this and probably have some serious discussions - we might even implement things one way and then say, actually, this isn't as good as this other test I did over the weekend, which turned out better, and go back on it. I don't know - it should generally be pretty fun, but I'm just warning you about some potential problems that will inevitably arise.

Okay, with that in mind, one thing I didn't mention, which we'll also have to design, is a render command queue. This ties in between the renderer and the renderer telling the render API what to do. A render command queue is basically a way for us to encode every single render command into a bigger buffer, so that once per frame, when we actually decide to render everything, we go through that command queue and execute all those commands - and typically the data is all inline. We can run it on a different thread if we want, like a dedicated render thread, so that while it's going through the render queue we can be doing other stuff on our main thread: submitting more render commands for the next frame, doing game logic, AI, anything else. That's a render command queue. An API like Vulkan has command queues - that's what you do in Vulkan: you have a bunch of commands, like "bind this", "draw this triangle", "this is my scissor state", "this is my pipeline state", and you put all of that into a queue, and eventually you tell Vulkan to actually submit that queue - that's when it goes through the queue and does the rendering. So Vulkan has that, but OpenGL doesn't, so we need to create it for OpenGL. And in general, if you have a choice - okay, I want to implement three APIs, what do I model mine on? - usually the choice would be to make it like DirectX, because that family seems to be the most sensibly designed API. We might follow that kind of design, but because I like making new things, and because Hazel is a new project for me, I might just try to make my own thing and we'll see what happens - that's where we as a community work together and try to build something new and cool. Just giving you a little bit of insight there.

So where do we start? Talking is great, but we need to actually start this thing. First of all, we're going to start with OpenGL. Would I start with OpenGL if I were making my own engine, full-on, without making a video series out of it? Probably not - I probably wouldn't support OpenGL at all, to be honest, and would just start with Vulkan, maybe. But OpenGL is the simplest and easiest API. It requires a somewhat stronger design on our side to make it actually good, but it is the simplest and easiest thing, and we want that, because this is a game engine series, not necessarily a graphics series. I'm obviously heavily interested in graphics - I love graphics, it's my favourite part of this whole thing, to be honest - but I do recognize this is a game engine, and we have two goals. First of all, we want to see stuff on the screen as soon as possible - and beautiful things on the screen as soon as possible.
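The render command queue described a moment ago could be sketched like this. It's a heavily simplified stand-in: a real queue (and whatever Hazel ends up with) would likely encode function pointers plus inline payload data into one big byte buffer, and execution could happen on a dedicated render thread; `std::function` is used here purely to keep the sketch short:

```cpp
#include <cassert>
#include <cstddef>
#include <functional>
#include <vector>

// Commands are recorded during the frame and replayed in one go once per
// frame - which is also the point where a render thread could take over.
class RenderCommandQueue
{
public:
    using Command = std::function<void()>;

    // Record a command; nothing is executed yet.
    void Submit(Command cmd) { m_Commands.push_back(std::move(cmd)); }

    // Replay every recorded command in submission order, then clear.
    void Execute()
    {
        for (Command& cmd : m_Commands)
            cmd();
        m_Commands.clear();
    }

    size_t Count() const { return m_Commands.size(); }

private:
    std::vector<Command> m_Commands;
};
```

Usage mirrors the Vulkan-style flow described above: the renderer submits "bind this", "draw that" lambdas throughout the frame, and one `Execute()` call flushes the whole queue.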
That's going to be great, to actually see results - and obviously a lot of you want results, because this is a YouTube series: it has to be fun, and it has to be marketable from my point of view as well. I have to be able to make pretty thumbnails - "hey look, Hazel looks great". If I started with something like Vulkan, the architecture might be amazing and we might have done a lot of code work, but would we have anything pretty to show? Not really. So there's that side. The other side is that we don't want to spend two years implementing rendering APIs - we want to get something up and running and then move on to other things: a scene system, a level editor and tool set, serialization and data, actual logic and scripting, an entity component system. We have so many other systems to deal with. So using OpenGL is not ideal, but it will get the job done - not in the best way possible, not in the fastest way possible, but it will get the job done.

So step one is to build that render API part - all of those primitives I talked about, that's what we begin with. Once we've done that, we build the other side, which is the renderer, of course. These do overlap - they're not dependent on each other in a strict sequence. We don't need the render API to be complete before we can begin the renderer: once we have enough in the render API to draw a triangle, guess what we're going to do - we're going to write a renderer that can draw a triangle. We get that up and running and then iterate, adding more things as we need them. If we suddenly want to render to a texture, we'll add the framebuffer and render-target-texture stuff to the render API at that point. After that, we have a base to render anything we want - and by "build the renderer" I'm including things like a material system and maybe a shader library; that will be part of step two. Then we can render anything: lights, cool PBR materials, 3D models, all of that stuff, we should be able to do.

Once I can render a full, beautiful 3D scene in OpenGL - probably with post effects and all of that - that's when we'll implement other APIs. I did write "to be decided", because as I said, we probably want to do things like the scene system, the level editor, serialization, the entity component system, scripting - all that other stuff - before we start diving into other APIs. But this is a series for you guys: if the majority of you decide "we've got enough stuff, let's just implement Vulkan, I really want to learn about that", then I'll do that - I'd love to, to be honest - but I'm thinking about what the majority of people want, and trying to make this series something you enjoy, and something that especially the patrons actually want to see. Not just patrons, though - don't think I only care about people who support me financially, I obviously care about all of my viewers; one of the perks of supporting me is just that you get to steer this series a little more toward where you want it to go. So Vulkan, DirectX 11 and 12, Metal, that kind of stuff - I will get to that eventually, but this is the plan: start with the simplest API possible, make something that looks pretty, and then move on to other stuff if we have the time and resources.

So that's pretty much it - thanks for watching, guys. Hope you enjoyed this video; if you did, hit that like button, and you can also support the series on Patreon, as I've mentioned throughout this video. I do want to say that this is a big journey, something that's going to take us a while. OpenGL is very simple in a lot of cases - and because it's so simple, it can also be annoying in a lot of cases - but I think we'll have a generally easy time with it, so I'm not too worried about those details; we should get something up and running pretty quickly, I would think.

This is a good opportunity to check out my OpenGL series if you haven't already - there's a playlist of quite a few videos that covers a lot of these primitives. I don't think I got into things like framebuffers - not that they're particularly advanced - or render targets, and a lot of that kind of stuff, but I probably will eventually in that series. I think what I'll do is show an implementation here - "hey, we need to write a framebuffer class in Hazel, let's do that" - and then link to an OpenGL series video I'll make about framebuffers, which talks about how framebuffers actually work in OpenGL. So I'll probably be referencing the OpenGL series a lot as we move through this.

Now, next weekend - which is usually when I make videos - is my bachelor party, because as I mentioned in a previous video, I'm getting married in less than a month. So I'm not going to be able to make a video next weekend - I hope you understand; trying to balance my life, a full-time job and everything else going on, my head's exploding. This is, again, a good time to go back and watch that OpenGL series, and maybe even try to implement a renderer yourself: try to implement the things I've outlined in the design, and then use that API you've created - OpenGL-specific underneath, but platform-agnostic on top - to actually draw a triangle. That's a really good place to start, and you will definitely be able to draw a triangle if you watch that OpenGL series. So that's my homework for you while I have this break next week, and then we'll get back into it - to be honest, I'm really looking forward to it. Hope you guys enjoyed this video, and I'll see you next time. Goodbye!
Info
Channel: The Cherno
Views: 38,516
Keywords: thecherno, thechernoproject, cherno, c++, programming, gamedev, game development, learn c++, c++ tutorial, game engine, how to make a game engine, game engine series, rendering architecture
Id: YPWNNmlIUIo
Length: 27min 53sec (1673 seconds)
Published: Sun Mar 17 2019