Learn GODOT 4 Compute Shaders with RAYTRACING!!

Captions
Godot 4 beta dropped! You know what that means: we get automatic mesh LOD! Or maybe the much better physics system? What about the brand new Vulkan renderer? What? No. Who cares about any of that nonsense, we got compute shaders, baby! [Music]

I was looking for a way to explain compute shaders to my viewers, but then I thought: ray tracing. So I did it. I made it. I posted it on Twitter so I could pop my brain with some of that juicy dopamine. It does better than any tweet I've ever posted. Have a panic attack. But it's fine, just hop onto Fabian's stream and let's relax a bit, right? "Yes, Nekoto, you're gonna go back to ray tracing now, because I'm looking forward to that video too. You should feel pressure. You should feel pressure." God damn it, Fabian. Huh, well, you can call me Ace Roller, because my ray tracing is going to give you a pop.

You see, ray tracing is actually a pretty good way to learn how compute shaders work and where to use them. A compute shader computes stuff. I... I know, I know, amazing explanation. That computational task could be anything, from procedural map generation and erosion to ray tracing algorithms, as we will later find out. The main difference with a compute shader is that it actually has no purpose unless you give it one. For example, a vertex shader is responsible for transforming vertices between multiple coordinate spaces and modifying them as needed, while a fragment shader serves to texture meshes and calculate their output color on screen. This is why vertex shaders and fragment shaders are part of the rendering pipeline, but compute shaders sit outside the rendering pipeline and only do things you tell them to do. Don't forget, however, that they are shaders, meaning they can leverage the entire power of the GPU.

But we're getting ahead of ourselves. Why bother using a compute shader in the first place, when we can already calculate anything we want on our CPU? Well, I'll tell you. Imagine you have a screen with 1920x1080 pixels, which is around 2 million pixels in total, and let's say you have to calculate a color for every pixel on screen. Since most CPUs have around 8 threads, best case scenario you can only process the colors for 8 pixels on screen at any given point in time. And we know how many cores the GPU has: thousands. GPUs can crunch through any task that can benefit from parallelism, say, for example, oh I don't know, maybe tracing a ray for every single pixel on screen.

Oh, before we get into ray tracing theory, let's run over how compute shaders work. They execute in work groups; you can imagine these as a 3D stack, where each work group is defined by its X, Y, and Z coordinates in the stack. Very, very important note: none of these work groups can communicate with each other, and they are executed in a random order, so don't rely on sharing information between them. Now, inside each work group is another 3D stack of threads called invocations. These can communicate with each other, and they are all processed simultaneously. Again, the invocations are identified by their 3D coordinate in the grid, which is called the local invocation index. However, they can also be identified globally, by their position within the work group combined with the position of the work group itself, which is called the global invocation index. Keep in mind that you ideally want your number of invocations to be a multiple of 64.
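To make the work group and invocation picture concrete, here is a minimal sketch of what a Godot 4 compute shader can look like; the header lines and buffer setup are explained in the next part, and the buffer contents and sizing are illustrative assumptions, not code from the video:

```glsl
#[compute]
#version 450

// 8 x 8 x 1 = 64 invocations per work group, a multiple of 64 as recommended.
layout(local_size_x = 8, local_size_y = 8, local_size_z = 1) in;

// One float per invocation; the dispatcher is expected to size this (illustrative).
layout(set = 0, binding = 0, std430) restrict writeonly buffer OutputBuffer {
    float data[];
} output_buffer;

void main() {
    // An invocation's global position is its work group's position times the
    // work group size, plus its local position inside the work group.
    uvec3 global_id = gl_WorkGroupID * gl_WorkGroupSize + gl_LocalInvocationID;
    // gl_GlobalInvocationID is that exact value, provided as a built-in.

    // Flatten the 2D position into a 1D index into the buffer.
    uint width = gl_NumWorkGroups.x * gl_WorkGroupSize.x;
    uint index = global_id.y * width + global_id.x;
    output_buffer.data[index] = float(index);
}
```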
This is because of the way invocations are dispatched by the GPU. Okay, just ignore whatever I'm saying here, the text on screen is actually what's correct. That... that idiot who recorded the audio incorrectly swapped around the numbers I said, just ignore him. NVIDIA GPUs like multiples of 32, while AMD GPUs like multiples of 64, and 64 is also a multiple of 32, so that works pretty well. So, for example, a good invocation size could be 8x8x1 for 2D thread groups, or 4x4x4 for 3D thread groups.

To start out a compute shader, create a new .glsl file and place these two lines on top: the first indicates to Godot that this is a compute shader, and the second defines the GLSL version we're using. Much like my will to live, inputs and outputs for compute shaders are non-existent; you have to define the inputs and outputs yourself. You can receive data into the compute shader and write out data from the compute shader using shader storage buffer objects, or SSBOs for short. If you've used vertex and fragment shaders before, this is basically like a uniform that you can also write data to.

The layout qualifier is what we use to define these buffers. set = 0 means that this is part of the first uniform set, and binding = 0 is basically the index of the buffer within the set, since you'll most likely have multiple buffers in your shader. Don't panic, std430 is not a sexually transmitted disease; it just tells the compiler to follow the memory layout standard that was introduced in GLSL version 4.3. restrict, readonly, and writeonly are memory qualifiers that allow the compiler to optimize the usage of this data across invocations; there's a link in the description if you want to learn about all of them and when to use each of them. Inside, we define all the data that this storage buffer will provide, similar to a struct. There is one rule you must follow, though: each storage buffer object can only have one variable-size array, and that must be the last variable inside the storage buffer. If you need another variable-size array, you need to define another storage buffer.

Finally, create a new void function called main. This is going to be where the entire logic of your compute shader goes. Of course, what goes here is entirely up to you; you can read and write data to the storage buffers as needed. Right above the main function, we will write a new layout qualifier that defines the number of invocations this shader should be run with. Remember that the invocations are decided by the shader, and the work groups are decided by the dispatcher, which is GDScript in our case. I would also recommend reading up on the atomic functions that are provided to you, since you will be working in parallel. Of course, as the saying goes for parallel computing: here be dragons.

Cool, you now have a bunch of storage buffers in your shader, but you still need to actually pass in the data that you require from GDScript into your shader. First, let's set up the compute pipeline, which is done using the RenderingDevice class. This may look complicated, but all we're doing is loading up the GLSL file, compiling it into SPIR-V code, which is what Vulkan uses, and then attaching it to a RenderingDevice pipeline. At the end, we run a bit of code that dispatches the compute shader, specifying the number of work groups to be run, and then we wait for the GPU to finish executing the compute shader before we start reading back information. In between, we define our shader storage buffer objects, which we will write data to and receive data from; these are the same as what we defined in our shader.
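As a rough GDScript sketch of that whole flow, pairing with the shader sketch above; the file path, buffer contents, and work group counts are illustrative assumptions, not the exact code from the video:

```gdscript
# Create a local rendering device dedicated to compute work.
var rd := RenderingServer.create_local_rendering_device()

# Load the .glsl file, compile it to SPIR-V (what Vulkan consumes),
# and turn it into a shader and a compute pipeline.
var shader_file := load("res://compute.glsl")  # illustrative path
var spirv: RDShaderSPIRV = shader_file.get_spirv()
var shader := rd.shader_create_from_spirv(spirv)
var pipeline := rd.compute_pipeline_create(shader)

# One float per invocation: 2 x 1 x 1 work groups of 8 x 8 x 1 = 128 floats.
var input := PackedFloat32Array()
input.resize(128)
var input_bytes := input.to_byte_array()

# Create the storage buffer; size() on a PackedByteArray is already in bytes.
var buffer := rd.storage_buffer_create(input_bytes.size(), input_bytes)

# Describe the buffer as a uniform and bind it as set 0, binding 0,
# matching the layout qualifier in the shader.
var uniform := RDUniform.new()
uniform.uniform_type = RenderingDevice.UNIFORM_TYPE_STORAGE_BUFFER
uniform.binding = 0
uniform.add_id(buffer)
var uniform_set := rd.uniform_set_create([uniform], shader, 0)

# Record the compute work and dispatch 2 x 1 x 1 work groups.
var compute_list := rd.compute_list_begin()
rd.compute_list_bind_compute_pipeline(compute_list, pipeline)
rd.compute_list_bind_uniform_set(compute_list, uniform_set, 0)
rd.compute_list_dispatch(compute_list, 2, 1, 1)
rd.compute_list_end()

# Submit to the GPU and wait for it to finish before reading data back.
rd.submit()
rd.sync()
var output := rd.buffer_get_data(buffer).to_float32_array()
```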
The GDScript code defines a new shader storage buffer object, and then we use the add_id function to attach it to a uniform; suppose this is an input. All the data must be passed as bytes, so you'll need to manually encode it into PackedByteArrays yourself. Also, a useful trick is that the PackedByteArray size method actually returns its size in bytes, so you don't need to calculate the number of bytes yourself when defining the storage buffer. However, if you do need to calculate the bytes yourself, just remember that by default all the values in the shader are 32 bits, which is 4 bytes, and that includes both floating point values and integer values. All other data types in the shader are just a composition of these base types; a vec4, for example, is just 4 floats, which is 4 times 4, so 16 bytes.

Now you know how to set up, run, pass data to, and read data from a compute shader. You know what that means. [Music]

Let's get to the juicy part of this video: the ray tracing. Now, I cannot take credit for any of these algorithms. Frankly, I'm too stupid to come up with this stuff myself; I just learned them from the wonderful ray tracing series I have linked in the description. I've just done the Godot plus GLSL implementation of them.

Now, a ray is basically an object that has two components: an origin and a direction. In order for us to draw objects on the screen, we need to shoot out a ray for every single pixel, in the direction of the view vector. To figure out the ray direction, we need the camera's global transform and the camera's projection matrix, which we then use to construct a perspective ray. Godot has this cool feature where it does not allow you to retrieve the projection matrix. There is a built-in Projection class in Godot 4 that you can pass camera data to in order to build the projection matrix, but this wonderful class does not show up in any Google search for some reason, so I literally did not find out about it until I was in voice chat in the Godot Shaders and VFX server. But of course, being a shader guy, I reconstructed the projection matrix myself using the camera's FOV, near plane, and far plane distances. I did not have to Google that, I swear. What? Who even walks around every day remembering how to construct a projection matrix from scratch?

If the ray collides with a surface, we'll get the color of the surface and draw it to the screen, and if not, we'll just draw the background color. To keep things simple and avoid our brains exploding, let's just start out with a sphere. Now that we have a ray and a sphere, we need to do a simple line intersection test. Here's my A-grade math explanation: take the equations for a sphere and a ray. X is common between both, so we substitute the values, and oh look, this turns out to be a quadratic equation, so we can use the quadratic formula, which has this property known as the discriminant. If the discriminant is greater than zero, then the line intersects at two points, meaning it goes straight through the sphere, which counts as a collision. Cool. Math. Now turn it into code. Spend several hours making a GLSL implementation. Bang your head on the table. Find code online. It's in HLSL, of course. Spend even more time translating it into GLSL. Click play. [Music] And it looks like... well, that's because effectively all we're doing is cheating: we just collide with the sphere, get its color, and place it on the screen. We're gonna have to do better than that if we plan on impressing Twitter.

Well, we have a Ray struct and a RayHit struct that give us information about the current ray and the best collision we have so far. Now, that collision information also includes the hit normal, which is actually all we need to do some simple Lambertian diffuse lighting, which is basically: take the dot product of the normal and the light vector, clamp it, and multiply it with our unshaded color. Why stop there, though? We can practically make a mini fragment shader in here to color up the screen, so let's throw in some specular highlighting using this formula, so we get some shiny spheres. To do reflections, we just put our ray tracing code into a for loop so that it traces a bunch more rays, and then decrease the opacity of each reflection iteration by how specular the object is. Finally, we can wrap up the ray tracer with some shadow casting, which is just: trace a ray from the collision point towards the light source, and if it hits anything, we darken the color of the pixels within its shadow.
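Here is a rough GLSL sketch of the sphere hit test and the Lambertian diffuse step described above; the struct layout, names, and light convention are illustrative assumptions, not the exact code from the video (which follows the ray tracing series linked in the description):

```glsl
struct Ray {
    vec3 origin;
    vec3 direction; // assumed normalized
};

struct RayHit {
    bool hit;
    float dist;    // distance along the ray to the hit point
    vec3 position;
    vec3 normal;
};

// Ray/sphere test: substitute the ray P = O + t*D into the sphere equation
// |P - C|^2 = r^2 and you get a quadratic a*t^2 + b*t + c = 0. A positive
// discriminant means the line passes through the sphere at two points.
RayHit intersect_sphere(Ray ray, vec3 center, float radius) {
    RayHit result;
    result.hit = false;

    vec3 oc = ray.origin - center;
    float a = dot(ray.direction, ray.direction);
    float b = 2.0 * dot(oc, ray.direction);
    float c = dot(oc, oc) - radius * radius;
    float discriminant = b * b - 4.0 * a * c;

    if (discriminant > 0.0) {
        // Take the nearer of the two intersection points, in front of the origin.
        float t = (-b - sqrt(discriminant)) / (2.0 * a);
        if (t > 0.0) {
            result.hit = true;
            result.dist = t;
            result.position = ray.origin + t * ray.direction;
            result.normal = normalize(result.position - center);
        }
    }
    return result;
}

// Lambertian diffuse: dot the hit normal with the direction towards the
// light, clamp it, and multiply with the unshaded surface color.
// A shadow ray is just another intersection test fired from hit.position
// towards the light; if it hits something, darken this result.
vec3 shade_diffuse(RayHit hit, vec3 surface_color, vec3 dir_to_light) {
    float diffuse = clamp(dot(hit.normal, dir_to_light), 0.0, 1.0);
    return surface_color * diffuse;
}
```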
But Nekoto, this doesn't actually ray trace any of the meshes in my actual game! First of all: you. Secondly, in order to ray trace a mesh, you have to pass every triangle in the mesh into the compute shader and do an even more difficult intersection algorithm, made by Möller and Trumbore in 1997, on every single triangle that makes up the mesh, for every single pixel. Plus, if you want colors and textures, you'll have to pass UVs and material data yourself and practically write your own renderer in a compute shader. It's not impossible, but it's stupidly difficult. Although a bit dated, in 2014 Scratchapixel made a simple, naive implementation of this in C++, and it took them 15 seconds to render one frame of a cow. There is a reason why AMD and NVIDIA are manufacturing GPUs that are specialized in ray tracing, so I definitely wouldn't recommend doing this for an actual game, unless Godot brings out its own native ray tracing support.

Besides, the purpose of this video is really to teach you about creating and dispatching your own compute shaders. Take a look at this Conway's Game of Life made in Godot 4 with compute shaders. You could even do stuff like GPU cloud simulations and so much more. Anyways, if you learned something new from this video or just liked it, consider subscribing to my channel. I'm a crazy shader man who loves Godot. [Music] Thank you.
Info
Channel: NekotoArts
Views: 28,010
Keywords: Nekoto, Nekoto Arts, Godot, Shaders, VFX, Compute Shader, Compute, Godot 4, beta, NekotoArts, Raytracing, Ray tracing, RTX
Id: ueUMr92GQJc
Length: 12min 39sec (759 seconds)
Published: Thu Oct 13 2022