The Graphics Pipeline and You | Writing Unity URP Code Shaders Tutorial [1/9] ✔️ 2021.3

Captions
Hi, I'm Ned, and I make games! Would you like to start writing shaders but don't know where to begin? Or are you having issues with URP's shader graph and would like an alternative? In this tutorial series, I'll walk you through writing a fully featured, general purpose shader for Unity's Universal Render Pipeline, exclusively in code. This topic is pretty big, so I'll break it into several videos; check the video description for links to each tutorial, and a playlist too. We'll start with understanding the structure of shaders, then move on to lighting and shadows, before finishing with more advanced topics like physically based materials and custom lighting models. Also, if you prefer reading tutorials, I've created a written version of the entire series, also linked in the video description. Before I move on, I want to thank all my patrons for helping make this series possible, and give a big shout out to my next-gen patron, Cubeydoobydoo. Thank you all so much!

I'd like to quickly show off a wonderful game from some long-time viewers: Kitty Cat Squash! It's an adorable casual puzzle game where you match cute kitties and try to beat your high scores. It starts simple but can quickly become a brain teaser. Try out a bunch of different game modes (Gravity was my favorite) and unlock 16 different cats. I love seeing Juliet on my board, so precious! Kitty Cat Squash is available on Steam, the Google Play Store, and the Amazon App Store. Go get your cute kitty fix! Thank you, Inspired by Madness Games, for sponsoring this video!

Let's lay out some goals of this tutorial series. My aim is to teach how to write shaders. I will show several examples, writing each step by step and explaining as we go. Do not view these examples as ready-for-production shaders, but rather as blueprints that you can customize to your game's needs. After completing this series, you'll know how to write your own version of URP's standard lit shader, as well as customize it with your own lighting algorithm and more. You'll also know several optimization techniques and how to leverage Unity's debugging tools to diagnose and fix rendering bugs. The examples in this tutorial were tested in Unity 2020.3 and 2021.3. As you follow the tutorial, you will come across many features that are only possible in 2021; however, I have written the shaders so that the same code runs in each version.

There's quite a lot to set up and understand before anything will display on screen, which can make learning shaders pretty difficult. To speed up the process, I'll explain information as needed, and I might gloss over some edge cases or technicalities. Don't worry, I'll address them if they become important later on. To keep this series from becoming even longer, I will assume that you have some general game development knowledge. You should be comfortable using Unity and its 3D game tools, such as models, textures, materials, and mesh renderers. And although you need no prior experience writing shaders, you should know how to program; C# experience will definitely be useful. With all that out of the way, let's get started!

In this series, we'll use Unity's Universal Render Pipeline. It's one of several rendering frameworks that Unity provides, but I chose URP for this project due to its recency and its ability to support a wide variety of platforms, all while remaining pretty lightweight. I would highly suggest creating a fresh project for this tutorial. You can either select the URP project template, or use a blank template, add URP manually through the package manager, and activate it in graphics settings. Either way, create a new standard scene to work with. Large shaders, like the one we'll write, are made of several files. To help keep things organized, create a new folder called "MyLit" to contain them all. Create a shader file by selecting "Unlit Shader" from the create dialog, and name it "MyLit.shader". If you create a material, you'll already see your shader in the shader selection menu of the material inspector; it will be under the "Unlit" submenu. Open your shader file in your code editor of choice. If you're using Visual Studio: sometimes Unity doesn't generate a Visual Studio project correctly when there are no C# scripts present. First create an empty C# script, so your shader will appear in the solution explorer.

Regardless, inside MyLit.shader is a lot of automatically generated code. Delete it all for now. This part of a shader is written in a language called ShaderLab, and it defines meta-information about the actual drawing code. The first line opens a Shader block and defines the shader's name in the material inspector; any slashes will create subsections in the selection menu, useful for organization. The block is bound by curly braces, like classes in C#. A shader is more than just code that draws. A single shader is actually made of many, sometimes thousands, of smaller functions, and Unity can choose to run any of them depending on the situation. They're organized into several subdivisions, the topmost of which are known as subshaders. Subshaders allow you to write different code for different render pipelines, and Unity will automatically pick the correct subshader to use at any time. Define a subshader with a SubShader block, and then add a Tags block to set the render pipeline. Tags blocks hold user-defined metadata in a format sort of like a C# dictionary. Tell Unity to use this subshader when the Universal Render Pipeline is active by setting the "RenderPipeline" tag to "UniversalPipeline". That's the only subshader we'll ever need in this tutorial.

Subshaders are just the first subdivision of a shader; below them are passes. A pass's purpose is a bit more abstract. Each pass has a specific job to help draw the entire scene, like calculating lighting, casting shadows, or creating special data for post-processing effects. Unity expects all shaders to have specific passes to enable all of URP's features, but for now, let's only focus on the most important pass: the one that draws a material's color. To signal that this pass will draw color, add a Tags block inside the pass. The tag key is "LightMode", and the value for a lit color pass is "UniversalForward". You can also name passes, which will help a lot in debugging.

OK, we're almost ready to write some code. URP shader code is written in a language called HLSL, which is similar to a streamlined C++. To mark a section of a shader file as containing HLSL, surround it with the HLSLPROGRAM and ENDHLSL keywords. To keep things organized, I like to keep HLSL code in a separate file from the .shader metadata. Thankfully, this is easy to do. You can't create an HLSL file directly in Unity, but you can in Visual Studio or inside your file system. Either way, name this new file "MyLitForwardLitPass.hlsl" and open it in your code editor. I just want to mention that many code editors don't work well with URP shaders, so while working through this tutorial, ignore any errors that you see in your code editor and just rely on Unity's console.

When writing shaders, you need a different mindset than you would when working with C#. For one, there's no heap, meaning most variables work like numeric primitives or structs. You can't define classes or methods, or use inheritance; structs and functions are still available to help you organize your code. If you've ever worked with data-driven design, you'll also feel at home writing shaders: the focus is gathering data and then transforming it from one form to another. In the broadest sense, shaders transform meshes, materials, and orientation data into colors on the screen. There are two standard functions which the system calls, kind of like the Start and Update functions in MonoBehaviours. These functions are called the vertex and fragment functions, and every pass must have one of each. Both focus on transforming data from one form to another: the vertex function takes mesh and world position data and transforms it into positions on the screen; the fragment function takes those positions, as well as material settings, and produces pixel colors.

Unity's rendering system employs something called the graphics pipeline to link together these functions and handle low-level logic. The pipeline gathers and prepares your data, calls the vertex and fragment functions, and displays the final colors on the screen. It's made of several stages running one after another, kind of like an assembly line. Each stage has a specific job, transforming data for later down the assembly line. There are two special stages, the vertex and fragment stages, which are programmable: they run the vertex and fragment functions that you write. The other stages are not programmable and run the same code for all shaders, although you can influence them with various settings.

Let's take a look at each stage, starting at the beginning: the input assembler. It prepares data for the vertex stage, gathering data from meshes and packaging it in a neat struct. Structs in HLSL are very similar to C#: a pass-by-value variable containing various data fields. This struct is customizable, and you can choose what data the input assembler gathers by adding fields to the struct. But just what kind of data can the input assembler access? The input assembler works with meshes, specifically mesh vertices. Each vertex has a bunch of data assigned to it, such as position, normal vector, texture UVs, etc. Each data type is known as a vertex data stream. To access any stream in your input structure, you simply need to tag it, and the assembler will automatically set it for you. For example, here's an input struct for our forward pass's vertex function. It defines a struct called Attributes, which has a position field with a type called float3. float3 is the HLSL term for C#'s Vector3, or a vector containing three float numbers. We can use semantics to tag variables, and the input assembler will automatically set them to a particular data stream; for example, the POSITION semantic corresponds to vertex position. Keep in mind that only the semantic determines what data the input assembler will place in a variable; feel free to name variables however you wish. We'll see more semantics later on; HLSL uses them pretty often to help the graphics pipeline along.

With that, let's move on to the vertex stage. As a programmable stage, you get to define the code that runs here. Defining a function in HLSL is very similar to C#, with a return type (void at the moment), a function name, and a list of arguments; an argument's type precedes the variable name. This function only needs a single argument of Attributes type. The vertex stage's primary objective is to compute where mesh vertices appear on the screen. However, notice that the Attributes struct only contains a single position: data for a single vertex. The render pipeline actually calls the vertex function multiple times, passing it data for each vertex, until all are placed on the screen. In fact, many calls will run in parallel. If you've ever programmed multi-threaded systems, you know that parallel processing can introduce a lot of complexity. Shaders bypass a lot of this by forbidding storage of state information; each vertex function call is effectively isolated from all the others. You cannot pass the result of one vertex function, or any data computed inside, to another. Each call can only depend on the data in the input struct, as well as other global data (but more on that later). In addition, each vertex function call only knows about data of a single vertex. This is mostly for efficiency; the GPU doesn't want to load the entire mesh at once.

We need to compute the screen position for the vertex described in the Attributes struct. When talking about positions, it's important to determine the coordinate system it's defined in: its "space". The position vertex data stream gives values in object space, which is another name for the local space you're accustomed to in Unity's scene editor. If you view a mesh in a 3D modeling program, these positions are also displayed there. Another space you'll often use is world space, a common space that all objects exist in, relative to one another. To get world space from object space, just apply an object's transform component; Unity provides this data to shaders, as we'll see soon. However, a vertex's position on screen is described using a space called clip space. An explanation of clip space would fill an entire tutorial, but luckily we don't have to work with it directly: URP provides a nice function to convert an object space position to clip space. To access it, we first need to access the URP shader library. In HLSL, we can reference any other HLSL file with the #include directive. These commands tell the shader processor to read the file at a given location and copy its contents into this line. If you're curious what's inside the Lighting.hlsl file, or any other URP source file, you can read them yourself in your Packages folder. Included files can themselves include many other files, leading to a kind of tree structure; for instance, Lighting.hlsl will pull in many helpful functions from across the entire URP library. One such function, GetVertexPositionInputs, is located in the ShaderVariablesFunctions.hlsl file. Its source code isn't really important, but it returns a structure containing the passed object space position converted into various other spaces. Clip space is one of them — just what we need. Note that the clip space position is a float4 type. If you tried to store it in a float3, Unity would give you a warning that data will be truncated or lost. This is a common source of bugs, so always heed these warnings and use the correct vector size. Keeping track of which space a position is in can get tricky pretty fast. Standard URP code adds a suffix to all position variables indicating the space: "OS" denotes object space, "CS" clip space, etc. We should follow this pattern as well.

Next, we must fulfill the vertex stage's job and output the clip space position for the input vertex. To do that, let's define another struct, called Interpolators, to serve as the vertex stage's return type. Write a float4 positionCS field inside of it, with the SV_POSITION semantic. This semantic signals that the field contains clip space vertex positions. Have the vertex function return an Interpolators struct: declare a variable of Interpolators type, set the positionCS field, and return the structure. With that, the vertex stage is complete.

The next stage in the rendering pipeline is called the rasterizer. The rasterizer takes all vertex screen positions and calculates which of the mesh's triangles appear over which pixels on the screen. If a triangle is entirely off screen, the rasterizer is smart enough to ignore it. The rasterizer then gathers all data for the next stage in the pipeline: the fragment stage. The fragment stage is also programmable, and the fragment function runs once for every pixel the rasterizer determines contains a triangle. The fragment function calculates and outputs the final color each pixel should display — but of course, each call only handles one pixel. The fragment function takes a struct as an argument, which contains data output from the vertex function. Naturally, the types should match, so both should be Interpolators. The values inside the input have been modified by the rasterizer: for instance, positionCS no longer contains the clip space position, but rather this fragment's pixel position. You can also pass other data from the vertex function into the fragment function through the Interpolators struct, a technique we'll use later on. The fragment function outputs a float4: the color of the pixel. It might be strange to think about, but colors are just vectors. They contain a red, green, blue, and alpha value, each ranging from 0 to 1.
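As a sketch, the HLSL described so far might look like this (following the file and struct names mentioned above; the include path assumes URP's standard package layout):

```hlsl
// MyLitForwardLitPass.hlsl — the pass so far
#include "Packages/com.unity.render-pipelines.universal/ShaderLibrary/Lighting.hlsl"

// Data the input assembler gathers from the mesh, one vertex at a time
struct Attributes {
    float3 positionOS : POSITION; // vertex position in object space
};

// Data output by the vertex stage, later modified by the rasterizer
struct Interpolators {
    float4 positionCS : SV_POSITION; // vertex position in clip space
};

Interpolators Vertex(Attributes input) {
    Interpolators output;
    // Convert the object space position into other spaces, including clip space
    VertexPositionInputs posnInputs = GetVertexPositionInputs(input.positionOS);
    output.positionCS = posnInputs.positionCS;
    return output;
}
```

Remember that variable names like positionOS are free choices; only the semantics after the colons tell the pipeline what the fields mean.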
To let the pipeline know that we're returning the final pixel color, tag the entire fragment function with the SV_TARGET semantic. When tagging a function with a semantic, the compiler interprets the return value as having that semantic. So, we can finally display something on screen! For now, let's just color all pixels white: return a float4 with all components set to 1. See here that you don't need to write "new" in HLSL when constructing vectors; just the type name is fine. The last stage in the graphics pipeline is the presentation stage. It takes the output of the fragment function and, together with information from the rasterizer, colors all pixels accordingly.

There's one last thing to do: we need to register our vertex and fragment functions to a shader pass. Return to the MyLit.shader file. Tell the compiler to read the code inside your MyLitForwardLitPass.hlsl file using a #include command. Next, register the vertex and fragment functions using #pragma commands. #pragma has a variety of uses relating to shader metadata, but the vertex and fragment subcommands register the corresponding functions to the containing pass. Make sure the function names match those in your HLSL file. And, we're now finally ready to view your shader! Make sure the material has MyLit selected, create a sphere in your scene, and give it your material. It should now appear as a flat white circle. If it's colored magenta, or there's some other issue, check Unity's console and the shader asset to see if there are any errors.

OK, so we have a flat white circle. That's a good starting place! Now let's make the color adjustable from the material inspector. This is possible through a system Unity calls material properties. Material properties are essentially HLSL variables that can be set and edited through the material inspector in Unity. They are specified per material, and allow objects using the same shader to look different. If you're wondering what the difference between a shader and a material is, this is basically it: a material is a shader with specific property settings. We can define properties inside the .shader file with a Properties block. The syntax for these is inconsistent, but I'll explain it. To define a color property, first decide on a reference name — this is how you'll access the property in HLSL. By convention, property names start with an underscore. Follow that with a parentheses pair, like you're writing a function. The first argument is a string: a label, how the property will display in the material inspector. The next argument is the property type; there are various, but "Color" defines a color property. Close the parentheses and then set a default value. The syntax is different for each property type, but for colors, start with an equals sign and then the red, green, blue, and alpha values inside parentheses. As mentioned before, colors are just four float values, each corresponding to a color channel: red, green, blue, and alpha. These combine to create any color that a computer can display. Each number ranges from zero to one, where white is all ones and black is all zeros; for alpha, one is opaque and zero is invisible. If you'd like more info on how these numbers combine to create a color, I have a link in the video description.

Now you should be able to see your property in the material inspector. Later, this shader will have many properties, so we should organize them. Add a header label denoting "Surface Options". To do that, use the Header command; strangely, the label is not enclosed in quotation marks here. There's one last thing we should do. Properties can be tagged with attributes, like classes in C#; these give properties special features. Tag the color tint with the MainColor attribute. Now it's possible to easily set this property from C# using material.color.

The property is all set up, but the value is not reflected on screen. Open MyLitForwardLitPass.hlsl. Although we defined a property in ShaderLab, we must also define it here in HLSL. Make sure the reference name matches exactly; Unity will automatically synchronize this variable with the material inspector. Earlier, I did say that vertex and fragment functions could only access data from a single vertex or fragment, and while this is true, they can also access any material properties. These variables are "uniform", meaning they don't change while the pipeline is running. Unity sets them before the pipeline begins, and you cannot modify them from a vertex or fragment function. With this in mind, have the fragment function return the color tint as the final color. Return to the scene editor, select your material, and change the color tint property. The shader should immediately reflect your choice!

Flat colors are great, but it'd be really cool to vary the color across the sphere. We can do this with textures. Shaders love working with textures. They're just image files, but shaders think of them as 2D arrays of color data; when you zoom into a texture, this becomes pretty obvious. To add a texture to a shader, first add a texture material property. Defining a texture property is much the same as a color property; instead of listing the type as Color, set it to 2D. The syntax for default textures is kind of strange: following the equals sign, type "white", with quotes, followed by a pair of curly braces. If no texture is set in the material inspector, Unity will fill this property with a small white texture by default. You can also set the default color to black, gray, or red. Similarly to the MainColor attribute, there is also a MainTexture attribute; tagging this property makes it easily assignable from C# using material.mainTexture. Your property should now show up in the material inspector. Notice the four numbers beside it: they allow you to set an offset and scale for this texture, which is useful for tiling.

Now let's take a look at the HLSL side of things. Defining a texture variable is a bit more complicated than a color variable. You first must use a special syntax to define a 2D texture; once again, the name must match the property reference exactly. TEXTURE2D here is not a type, but something called a macro. You can think of macros similarly to functions, except they run on the text making up the code. You can create macros yourself using the #define command. Before compiling, the system will search for any text matching a defined macro and replace the macro name with the text that you specify. Macros can also have arguments: the system will replace any occurrences of argument names in the macro definition with whatever text you pass in. This is a really simple overview of macros, but they can be pretty useful in shader code. HLSL doesn't have inheritance or polymorphism, so if you want to work with any structure that has a positionOS field, but you don't necessarily know the structure type, a macro can do the trick. They're also great at handling platform differences, which is what Unity has done with TEXTURE2D. See, different graphics APIs — DirectX, OpenGL, etc. — use different type names for textures. To make shader code platform independent, Unity provides a variety of macros to deal with textures. That's just one less thing we have to worry about!

Moving on, there are a couple more variables that Unity automatically sets when you define a texture property. Textures have a companion structure called a sampler, which defines how to read a texture. Options include the sampling and clamping modes you're familiar with from the texture importer: point, bilinear, etc. Unity stores samplers in a second variable, which you define with the SAMPLER macro. The name here is important: it must always have the pattern "sampler" followed by the texture reference name. Finally, remember the tiling and offset values from the material inspector? Unity stores those inside a float4 variable. The name must follow the pattern of the texture name followed by "_ST". Inside, the x and y components hold the x and y scales, while the z and w components hold the x and y offsets. And yes, the fourth component of a float4 vector is referred to as "w" in HLSL.

Now we'd like to sample the texture, or read color data from it. We should do that in the fragment stage, to apply texture colors to each pixel. Use the SAMPLE_TEXTURE2D macro to get the color out of a texture at a specific location. It takes three arguments: the texture, the sampler, and the UV coordinate to sample. Well, what are UVs? UVs are texture coordinates assigned to all vertices of a mesh, which define how a texture wraps around a model. Think of how cartographers try to unwrap a globe to fit on a flat map; they're basically assigning UVs to positions on the globe. UVs are float2 variables, where the x and y coordinates define a 2D position on a texture. UVs are normalized, or independent of the texture's dimensions: they always range from 0 to 1, and by convention, (0, 0) is the bottom left corner of a texture. Unfortunately, we can't grab UVs out of thin air in the fragment stage. They're another vertex data stream that the input assembler needs to gather. Add a float2 uv field to the Attributes struct with a TEXCOORD0 semantic. TEXCOORD0 is short for "texture coordinate set number 0".
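Putting the pieces from this section together, the property definitions and matching HLSL declarations might be sketched like this. The reference names _ColorTint and _ColorMap are one possible choice following the underscore convention above; treat them as assumptions, not requirements:

```hlsl
// In MyLit.shader, inside the Shader block:
Properties {
    [Header(Surface Options)] // note: no quotation marks around the label
    [MainColor] _ColorTint("Color Tint", Color) = (1, 1, 1, 1)
    [MainTexture] _ColorMap("Color Map", 2D) = "white" {}
}

// In MyLitForwardLitPass.hlsl:
float4 _ColorTint;             // synchronized with the material inspector
TEXTURE2D(_ColorMap);          // platform-independence macro, not a type
SAMPLER(sampler_ColorMap);     // must be "sampler" + the texture reference name
float4 _ColorMap_ST;           // tiling in xy, offset in zw; name must end in "_ST"

struct Attributes {
    float3 positionOS : POSITION;
    float2 uv : TEXCOORD0;     // texture coordinate set number 0
};
```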
Models can have many sets of UVs, or texture coordinates; for instance, Unity uses TEXCOORD1 for lightmap UVs, but we'll get to that later. The Attributes struct is not available in the fragment stage either. However, we can store data in the Interpolators struct, which will eventually make its way to the fragment stage. Add another float2 uv field there; have it also use the TEXCOORD0 semantic. In the vertex function, pass the UV from the Attributes struct to the Interpolators struct. We can also apply UV scaling and offset there, which we might as well — this way, we'll only have to compute it once per vertex instead of once per pixel. It's a good idea to do as much as possible in the vertex function, since it generally runs fewer times than the fragment function. Unity provides the TRANSFORM_TEX macro to apply tiling. There are two interesting things about it. First, the double hash tells the precompiler to append text to whatever is passed in as the macro argument; when the macro runs, you can see how it replaces "name" with "_ColorMap", correctly referencing _ColorMap_ST. Second, the xy and zw suffixes give easy access to a pair of components as a float2.
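The updated Interpolators struct and vertex function might look like the following sketch (again assuming the _ColorMap texture name used above):

```hlsl
struct Interpolators {
    float4 positionCS : SV_POSITION;
    float2 uv : TEXCOORD0; // interpolated by the rasterizer before the fragment stage
};

Interpolators Vertex(Attributes input) {
    Interpolators output;
    VertexPositionInputs posnInputs = GetVertexPositionInputs(input.positionOS);
    output.positionCS = posnInputs.positionCS;
    // TRANSFORM_TEX applies tiling and offset; it expands to roughly
    // input.uv * _ColorMap_ST.xy + _ColorMap_ST.zw
    output.uv = TRANSFORM_TEX(input.uv, _ColorMap);
    return output;
}
```

Doing the tiling math here, once per vertex, is cheaper than doing it once per fragment.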
This mechanism is called "swizzling". You can ask for any combination of the x, y, z, and w components, in any order, and the compiler will construct an appropriately sized float vector for you. You can also use r, g, b, and a the same way, which is more intuitive for colors. It's even possible to assign values with a swizzle operator.

Anyway, now we have the UV data in the fragment stage. But let's take a moment to really think about what's happening here. The vertex function outputs data for each vertex. The rasterizer takes those values, places the vertices on the screen, and figures out which pixels the formed triangle covers. It finally generates an input structure for each fragment function call. But what value will input.uv have for each fragment? The rasterizer will interpolate any field tagged with a TEXCOORD semantic using an algorithm called barycentric interpolation. You're probably familiar with linear interpolation, where a value on a number line is expressed as the weighted average of the values at each endpoint. Barycentric interpolation uses the same idea, except it extends it to a triangle: the value at any point inside the triangle is a weighted average of the values at each corner. Luckily, the rasterizer handles all of this for us, so the algorithm is not really important. To recap: the values in Interpolators are a combination of values returned by the vertex function. Specifically, for any fragment, they are a combination of values from the three vertices forming the triangle that covers this fragment's pixel.

Well, all that for some UVs! But we finally have all we need to call SAMPLE_TEXTURE2D. It returns a float4: the color of the texture at the specified UV position. Depending on the sampler's sample mode (point, bilinear, etc.), this color may be a combination of adjacent pixels, to help smooth things out. Regardless, multiply the sample with the color tint property, and then return it. In HLSL, multiplying two vectors is a component-wise operation, meaning the x-components of each vector are multiplied together, then the y-components, etc. All arithmetic operators work like this. In the scene editor, set a texture on your material, and marvel at what you've accomplished! Note how the texture and color tint properties interact. Also notice that, if your texture has an alpha component, the shader doesn't really handle transparency yet; the sphere will always be opaque. We'll fix that in a future tutorial, so stay tuned!

I hope this whet your appetite for shader programming, because we're really just getting started! In the next part of this tutorial series, we'll add simple lighting, to finally give objects dimensionality, and then we'll delve into shadows, learning how to add additional passes to a shader. You've gotten past a lot of the theory and can now focus on the fun stuff. Please stay tuned, and subscribe and press the bell to be notified when my next tutorial video goes live. If you enjoyed this, consider liking the video as well; it really helps out with the YouTube algorithm. If you have any issues or questions, please feel free to leave a comment or contact me on any of my social media accounts. I want to take another moment to thank all of my patrons for helping make this video possible, and give a big shout out to my next-gen patron, Croobiedoobydoo. Again, thank you all for your support, it really does mean a lot! And if you want to download an example shader from this tutorial, or any of my others, you can join my Patreon as well. And finally, don't forget to check out Kitty Cat Squash! Thanks again, Inspired by Madness Games, for sponsoring this video! Everyone, thanks so much again for watching, and make games! [Music]
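For reference, here is a sketch of the complete shader described in this video, combining the ShaderLab file and the fragment function. The shader name and the _ColorTint/_ColorMap property names are illustrative choices consistent with the conventions above:

```hlsl
// MyLit.shader
Shader "NedMakesGames/MyLit" { // slashes create submenus in the shader picker
    Properties {
        [Header(Surface Options)]
        [MainColor] _ColorTint("Color Tint", Color) = (1, 1, 1, 1)
        [MainTexture] _ColorMap("Color Map", 2D) = "white" {}
    }
    SubShader {
        Tags { "RenderPipeline" = "UniversalPipeline" }
        Pass {
            Name "ForwardLit" // naming passes helps with debugging
            Tags { "LightMode" = "UniversalForward" }
            HLSLPROGRAM
            #include "MyLitForwardLitPass.hlsl"
            #pragma vertex Vertex
            #pragma fragment Fragment
            ENDHLSL
        }
    }
}

// MyLitForwardLitPass.hlsl — the fragment function from the end of this video
float4 Fragment(Interpolators input) : SV_TARGET {
    // Sample the color map at the interpolated UV, then apply the tint
    float4 colorSample = SAMPLE_TEXTURE2D(_ColorMap, sampler_ColorMap, input.uv);
    return colorSample * _ColorTint; // component-wise multiply
}
```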
Info
Channel: Ned Makes Games
Views: 24,270
Keywords: gamedev, game development, development, unity, unity3d, madewithunity, programming, game design, csharp, nedmakesgames, nedmakesgames dev log, indiedev, indie game, dev log, shaders, 3d modeling, blender, tutorial, walkthrough, shader, universal render pipeline, urp, unitytip, unitytips, shaderlab, hlsl, beginner, starting, shader graph, hdrp, graphics, graphics programming, tech, tech art, lighting, shadows, texture, textures, interpolation, struct, swizzling, macros, material, materials, model, modeling, uvs, unreal
Id: KVWsAL37NGw
Length: 32min 54sec (1974 seconds)
Published: Mon Jun 06 2022