How To Use All 200+ Nodes in Unity Shader Graph

Captions
shader graph has a lot of nodes, over 200 of them in fact as of shader graph 10.2, if i've got my numbers correct. it's not always clear exactly what each of them does, and unity's documentation sometimes falls short of giving examples, so in this video i am going to describe the functionality of every single node: what the inputs and outputs are, what special modes you can use, and whether it has any bonus options. there are several nodes that are exclusive to the high definition render pipeline which i won't be covering here; i'm hoping to spend some time becoming familiar with hdrp in the future so that i can do that section of shader graph justice in a separate video. before we start, remember to subscribe if you like this video, it took a lot of work to put together. it's going to be a long one, so grab yourself some popcorn and make yourself a coffee. that's exactly what i did while making this video, with the help of my ko-fi supporters. also check out my patreon, there's loads of bonus goodies up for grabs on each tier, ranging from ad-free versions of my articles, to early access videos and your name in the credits, to complimentary copies of my shader packs. all supporters get an exclusive member role on my discord. now let's tackle the mammoth task of exploring every single node in shader graph.

we're going to start off with the block nodes, which were added in shader graph version 9 along with the master stack, as a modular replacement for the old master nodes. the stack has two sets of blocks, vertex blocks and fragment blocks, corresponding to the vertex shader, which operates on each vertex of a mesh, and the fragment shader, which operates on every pixel of the object, respectively. some nodes throughout your graph only work on the vertex or fragment stage.

the position block defines the vertex position of the mesh. when left unchanged, the vertex positions will look exactly the same as they do in your modeling program, but we can modify this vector3 to physically change the location of those vertices in the world, which can be used for effects like ocean waves.
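here's a quick python sketch of the kind of displacement you might feed into the position block for a wave effect. this is just an illustration of the idea, not unity's code, and the function name and parameters are my own:

```python
import math

def displace_vertex(position, time, amplitude=0.5, frequency=2.0, speed=1.0):
    """offset a vertex's y component with a scrolling sine wave,
    mimicking the sort of thing you'd wire into the position block."""
    x, y, z = position
    y += amplitude * math.sin(frequency * x + speed * time)
    return (x, y, z)

# a vertex at the origin has no offset at t=0, since sin(0) = 0.
print(displace_vertex((0.0, 0.0, 0.0), time=0.0))  # (0.0, 0.0, 0.0)
```

in a real graph you'd build this from a time node, a sine node and the position node, then plug the result into the position block.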
the normal block defines which direction the vertex normal points in. if we change this, then the lighting on your object may change too. the vertex tangent vector is usually perpendicular to the vertex normal vector, and for a flat surface it usually rests on the surface. if you modify the normal vector, it's best to make sure that your tangent vector still points in a sensible direction.

the base color used to be called the albedo color. this would be the color of the object if lighting, transparency and other effects were taken out of the equation. after the vertex stage, we have access to the normal vector per fragment, and we can change it, which changes the lighting on the object. tangent space is defined relative to the surface or triangle being drawn, so the normal tangent space block expects an input in tangent space, relative to the individual triangle this pixel is part of. the normal object space block is similar, but it expects an input in object space; in object space, a mesh is defined relative to the other parts of the mesh, without taking its positioning in the world into account. the normal world space block is also similar to the normal tangent space block, except it expects a world space vector input; in world space, everything is defined relative to a world origin point, where multiple objects may exist. you can swap between these three blocks using the fragment normal space option in the graph settings.

the smoother your object is, the more lighting highlights appear. when smoothness is zero, the surface acts as if it's extremely rough, and when it's one, the surface is polished to a mirror sheen. emissive light can be used to create bloom around objects, and it's best used for bright objects such as neon lights. by default this block uses hdr color, so you'll have the option to bump the intensity higher to get brighter emission. ambient occlusion is a measure of how much the pixel is obscured from light sources by other parts of the scene, such as walls.
we can modify the amount of ambient occlusion by changing this float. when it's zero, the pixel should be fully lit, i.e. there is no occlusion, and when it's one, there shouldn't be a lot of light on the object, because it's being fully occluded from light sources.

the metallic block expects a float; set it to zero for a completely non-metallic object and one for a totally metallic object. the lighting on your object will look more matte the lower this value is. this only has an effect if you set your graph up in the graph settings to use a metallic workflow; if it's set to specular, changing this does nothing. the specular block, and by extension the specular workflow, is an alternative to metallic. make sure the workflow of your graph is set to specular, not metallic, or else this block has no effect. unlike metallic, this block expects a color, because specular highlights can be tinted different colors; the brighter the color, the larger the highlight.

alpha means how transparent the pixel is, as a float between zero and one. rendering transparent objects is usually more computationally expensive than opaque objects, so you need to make sure the surface option is set to transparent in the graph settings if you need semi-transparent objects; this block does nothing if the surface is set to opaque. alpha clip is a technique where pixels with an alpha below a certain threshold are not rendered, they're culled. if you tick the alpha clip option in the graph settings, then both alpha and alpha clip threshold become active; if the alpha value is below alpha clip threshold, the pixel won't be rendered. this is useful for fake transparency effects, where you use opaque rendering but cull certain pixels to give the illusion of transparency.

we can use properties to provide an interface between the shader and the unity editor's inspector. the properties list gives us an easy way to expose variables to the user, and it's a tidy place to store these variables for use in your graph.
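the alpha clip comparison described above boils down to one test per pixel. here's a minimal python sketch of that idea (my own naming, not unity's implementation):

```python
def alpha_clip(alpha, threshold):
    """return whether a pixel survives alpha clipping: pixels whose
    alpha falls below the threshold are culled (not rendered)."""
    return alpha >= threshold

print(alpha_clip(0.2, 0.5))  # False: this pixel gets culled
print(alpha_clip(0.8, 0.5))  # True: this pixel is rendered
```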
you can search for any of them by name in the graph. to add a new property, use the plus button on the blackboard and set the type of variable you want.

a float, or a vector 1 as it's called in earlier versions of shader graph, is a single floating point number. we can give it a name, which will show up on the graph, and a reference string, which is the name we can use to refer to this property via scripting. changing the default value here will change the value a material with this shader uses by default. we can also change the mode between default, which just lets us set the float directly; slider, which lets us define minimum and maximum values to bound the value between; integer, which locks the value to a whole number; and enum, which i'm not sure what to do with, because it's totally undocumented on unity's site. we can set the precision of the property to inherit from the graph's global settings, or override it to use single or half precision; single generally uses 32 bits, while half typically uses 16 bits, although this might differ by hardware. this setting can be found on almost every node in shader graph in the node settings, but i'm not going to mention it every time. we can toggle whether the property appears in the inspector using the exposed checkbox, and override property declaration lets us decide whether this property should be declared per material or globally.

vector 2 is like float, except we define two floats for the x and y components of the vector. we don't have any alternative modes with this one, but we do have the same reference, default, precision, exposed and override property declaration options. vector 3 properties are the same as vector 2 properties, except we now have an extra z component to work with, and vector 4 adds an extra w component. color adds a mode toggle between default and hdr; if we pick hdr, we get extra options in the color window which allow us to increase the intensity of the color beyond the usual range.
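to see what the single versus half precision setting mentioned above actually costs you, we can round-trip a value through a 16-bit half float in python (this is a generic illustration of half precision, not shader graph itself; exact behavior on the gpu can differ by hardware):

```python
import struct

def to_half(value):
    """round-trip a float through 16-bit half precision, losing
    whatever detail half floats can't represent."""
    return struct.unpack('<e', struct.pack('<e', value))[0]

print(to_half(0.5))  # 0.5, exactly representable in half precision
print(to_half(0.1))  # slightly off: half has roughly 3 decimal digits of precision
```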
i'll talk about hdr colors more when we discuss the color node later.

a boolean property can be either true or false. this will be useful in nodes which use boolean logic; i'll cover booleans in more detail at that point. gradients work much like they do anywhere else in the unity editor; we can add handles to the gradient window to set the color and alpha at each part of the gradient. although the exposed checkbox appears, i am not able to click it, so it can't be exposed to the inspector.

the texture 2d property type lets us declare a texture 2d asset we want to use in the graph. the mode dropdown on this one lets us set the default color if no texture is supplied: white, gray or black. the fourth option, bump, makes this default to a flat normal map, which appears blue. a texture 2d array is a set of 2d textures with the same size and format that have been packaged together so that the gpu can read them as if they're a single texture, for increased efficiency; we can sample these in special ways, as we'll find out later. a texture 3d is like a texture 2d but with an added dimension, and we don't have access to the modes like we do with texture 2d either. a cube map is a special type of texture that is conceptually like the net of a cube, six textures that have been stitched together; they're useful for something like a skybox. i won't go into too much detail here, but a virtual texture is a way to reduce memory usage if you have several high-res textures. they are only supported by hdrp, so on urp, attempting to sample a virtual texture will act like a normal texture sample and doesn't yield any performance benefits. we can declare a stack of textures in our property and add or remove texture layers from the stack, and then it acts as if every texture layer is a separate texture 2d. i seem to be able to add up to 4, but i don't know if this varies by hardware or other settings.

a matrix 2 is a 2x2 grid of floating point numbers. the default value when you create a new property of this type is the 2x2 identity matrix, which has ones down the leading diagonal and zeros everywhere else.
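a tiny python sketch of why the identity matrix is a sensible default: multiplying a vector by it leaves the vector unchanged (row-major lists here, just for illustration):

```python
def mat2_mul_vec2(m, v):
    """multiply a 2x2 matrix (row-major nested lists) by a vector 2."""
    return (m[0][0] * v[0] + m[0][1] * v[1],
            m[1][0] * v[0] + m[1][1] * v[1])

identity = [[1.0, 0.0],
            [0.0, 1.0]]  # ones down the leading diagonal, zeros elsewhere

print(mat2_mul_vec2(identity, (3.0, 4.0)))  # (3.0, 4.0): the vector is unchanged
```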
similarly, a matrix 3 is a 3x3 matrix of floats, with 3 rows and 3 columns, and a matrix 4 is a 4x4 matrix of floats, with 4 rows and 4 columns. matrices are useful for transforming vectors in a graph in interesting ways, although none of these property types can be exposed in the inspector.

the final property type is sampler state. a sampler state can be used to determine how a texture 2d is sampled. the filter determines how much smoothing is applied to the texture: point means there's no smoothing, linear smooths between nearby pixels, and trilinear also smooths between mipmaps. the wrap mode controls what happens if uvs outside the texture's bounds are used: repeat copies the texture past the bounds, clamp pins the uvs to the edge of the image, mirror is like repeat but the texture reflects each time the boundary is crossed, and mirror once is like mirror but gets clamped past the first reflection. sampler states can't be exposed to the inspector either.

we also have keywords to use in our graphs, in order to split one graph into multiple variants based on the keyword value. a boolean keyword is either true or false, so using one will result in two shader variants; depending on the definition, the shader acts differently. shader feature will strip any unused shader variants at compile time, thus removing them; multi compile always builds all variants; and predefined can be used when the current render pipeline has already defined the keyword, so that it doesn't get redefined in the generated shader code, which otherwise may cause a shader error. the enum keyword type lets us add a list of strings, which are the values the enum can take, and then set one of them as the default. we can choose to make our graph change behavior based on the value of this enum, and we have the same definition options as before. the material quality is a relatively new built-in keyword, which is just a built-in enum based on the quality level settings of your project.
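the wrap modes described above are easy to picture as functions of a single uv coordinate. here's an illustrative python sketch (my own naming; mirror once is omitted for brevity):

```python
import math

def wrap_uv(u, mode):
    """apply a sampler state wrap mode to a single uv coordinate."""
    if mode == "repeat":        # tile the texture past its bounds
        return u - math.floor(u)
    if mode == "clamp":         # pin uvs to the edge of the image
        return min(max(u, 0.0), 1.0)
    if mode == "mirror":        # reflect each time a boundary is crossed
        t = u - 2.0 * math.floor(u / 2.0)   # period-2 sawtooth
        return 2.0 - t if t > 1.0 else t
    raise ValueError(mode)

print(wrap_uv(1.25, "repeat"))  # 0.25
print(wrap_uv(1.25, "clamp"))   # 1.0
print(wrap_uv(1.25, "mirror"))  # 0.75
```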
the material quality keyword allows you to change the behavior of your graph based on the quality level of the game's graphics; for example, you might choose to use a lower lod level on certain nodes based on the material quality.

now we will talk about nodes that you can place on the main graph surface. by right-clicking on the graph outside the master stack, unity will display a list of every node available in shader graph. i'm going to go through each subheading one by one, and try to mention the most useful nodes within a heading first, although by no means will this entire list be totally ordered in that manner.

the input family of nodes covers basic primitive types, sampling textures, and getting information about the input mesh, amongst other things. the color node comes with a rectangle which we can click to define a primitive color. as with most color picker windows in unity, we can switch between red-green-blue and hue-saturation-value color spaces, set the alpha, use an existing swatch, or use the color picker to select any color within the unity window. by changing the mode dropdown to hdr, we gain access to hdr colors, which let us raise the intensity beyond the usual range, which is especially useful for emissive materials. not every node which accepts a color input will take hdr into account, however. it has a single output, which is just the color you defined.

the vector 1 node, or float as it's called in later versions of shader graph, lets us define a constant floating point value. it takes one float input, which we can change at will, and a single output, which is just the same as the input. vector 2 is similar to vector 1, but we can define two floats as inputs; the output is a single vector 2, with the first input in the x component and the second input in the y component. vector 3 follows the same pattern, with three inputs labeled x, y and z, and one output which combines the three. and unsurprisingly, the vector 4 node has four inputs, x, y, z and w, and one output which combines all four into a single vector 4.
the integer node is slightly different to the float node in that you use it to define integers, but it doesn't take any inputs; we just write the integer directly inside the node. the single output, of course, is that integer. the boolean node is like the integer node insofar as it doesn't take any inputs: if the box is ticked, the output is true, and if it's unticked, the output is false. the slider node is useful if you want to use a float inside your graph, but you need some extra ease of use for testing purposes. we can define a minimum and maximum value, and then, using the slider, we can output a value between those min and max values.

the time node gives us access to several floats, all of which change over time. the time output gives us the time in seconds since the scene started. sine time is the same as passing time into a sine function, and cosine time is like passing time into a cosine function. delta time is the time elapsed in seconds since the previous frame, and smooth delta is like delta time, but it attempts to smooth out the values by averaging the delta time over a few frames. the constant node gives you access to widely used mathematical constants using the dropdown menu, with a single output. those constants are pi; tau, which is equal to two times pi; phi, which is the golden ratio; e, which is also known as euler's number; and the square root of two.

the texture family of nodes is all about texture sampling. the sample texture 2d node is one of the nodes i use the most, in almost every shader i build. it takes in three inputs: one is the texture to sample, the second is the uv coordinate to sample the texture at, and the third is a sampler state, which determines how to sample the texture. the node provides two extra options: when the type is default, the node samples the texture's colors, and when it's set to normal, we can use the node to sample normal maps. the space is only relevant when sampling in normal mode, to determine which space the normal information is output in.
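when sampling in normal mode, the stored color channels (in the 0 to 1 range) get remapped into a direction vector in the -1 to 1 range. here's an illustrative python sketch of that remapping for an uncompressed map (unity handles compressed normal map formats differently, so treat this as the concept only):

```python
def unpack_normal(rgb):
    """remap a normal map texel from the stored [0, 1] range back to a
    [-1, 1] tangent-space direction: n = rgb * 2 - 1."""
    return tuple(2.0 * c - 1.0 for c in rgb)

# the flat 'bump' color (0.5, 0.5, 1.0), which looks blue,
# unpacks to a normal pointing straight out of the surface.
print(unpack_normal((0.5, 0.5, 1.0)))  # (0.0, 0.0, 1.0)
```

this is also why the bump default for texture properties appears blue: it stores the straight-out normal.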
we have several outputs, which looks intimidating at first, but the first output is the red-green-blue-alpha color of the texture, and the next four outputs are just those individual components. this node, as with most texture sampling nodes, can only be used in the fragment stage of a shader.

the sample texture 2d array node acts much like the sample texture 2d node, but now we don't have the type or space options; instead, we have an index input to determine which texture in the array to sample. the sample texture 2d lod node is the same as sample texture 2d, except we now have an added lod input, which we can use to set the mipmap level with which to sample the texture. because we manually set the mipmap level, we can actually use this node in the vertex stage of a shader. sample texture 3d is conceptually the same as sample texture 2d, except we provide a texture 3d, and the uv coordinate needs to be in three dimensions instead of just two. we can still supply a sampler state, but we don't have the extra dropdown options, and for some reason, we only have a single vector 4 output, without the split channel outputs found on sample texture 2d.

the sample cube map node takes in a cube map, a sampler state and an lod level, all of which we've seen before, and a direction, which is used instead of uvs to determine where on the cube map we should sample. think of a cube map conceptually as being a textured cube, but inflated into a sphere shape; the dir input, a vector in world space, points from the center of the sphere outwards to a point on the sphere. the only output is the color. since we specified the mipmap level through the lod input, we can use this in both the fragment and vertex stages of a shader, but beware that you might encounter issues if nothing is connected to the direction input. this would be great for use on a skybox.

the sample reflected cube map node is like the sample cube map node, except we have an extra normal input, and both that and the view direction need to be in object space.
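the reflection it performs uses the standard reflect formula, r = d - 2·dot(d, n)·n, the same one hlsl's reflect intrinsic uses. a quick python sketch (my own helper, for illustration):

```python
def reflect(direction, normal):
    """reflect an incoming direction about a unit surface normal:
    r = d - 2 * dot(d, n) * n."""
    d = sum(di * ni for di, ni in zip(direction, normal))
    return tuple(di - 2.0 * d * ni for di, ni in zip(direction, normal))

# a view ray travelling straight down bounces straight back up off a floor.
print(reflect((0.0, -1.0, 0.0), (0.0, 1.0, 0.0)))  # (0.0, 1.0, 0.0)
```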
conceptually, this node acts as if we are viewing an object in the world and reflecting the view direction vector off the object using its surface normal vector, then using the reflected vector to sample the cube map. in contrast to sample cube map, the sample reflected cube map node is great for adding reflected light from a skybox to an object in the scene.

the sample virtual texture node has two inputs by default: the uvs with which to sample the texture, and a virtual texture slot. once you connect a virtual texture, the number of outputs from the node changes to match the number of layers on the virtual texture object, and we can use any of those outputs we wish. it's worth noting that this node has extra options in the node settings window too. we can change the address mode to wrap or clamp the texture when we use uvs below zero or above one, and we can change the lod mode here. automatic will use lods however you've set your project up to use them; lod level adds an lod input and lets us set the mipmap level manually; lod bias lets us control whether to favor the more or less detailed texture when blending between lod levels automatically; and derivative adds dx and dy options, although unity doesn't document what these do anywhere. we can swap the quality between low and high, and we can choose whether to use automatic streaming. if we turn off automatic streaming and set the lod mode to lod level, we can even use this node in the vertex shader stage. as far as i can tell, this replaced an earlier node called sample vt stack, and is only available on recent versions of shader graph, and as mentioned, outside of hdrp this node acts like a regular sample texture 2d node.

the sampler state node works just like a sampler state property: it lets us define the filter mode and wrap mode to sample a texture with, and we can attach one to most of the texture sampling nodes we've seen so far. the texture 2d asset node lets us find any texture 2d asset defined in the assets folder and use it in our graph.
this is useful if the shader always uses the same texture no matter which material instance is used, but we don't want to use a property. the texture 2d array asset node is the same as texture 2d asset, except we grab hold of a texture 2d array asset instead, and as you may expect, the texture 3d asset node can be used to access a texture 3d asset within your graph without using a property. and to finish off the set, we can use a cubemap asset node to access a cubemap texture in the graph. the texel size node takes in a texture 2d as input and outputs the width and height of the texture in pixels. texel, in this context, is short for texture element, and can be thought of as analogous to pixel, which itself is short for picture element.

the scene family of nodes gives us access to several pieces of key information about the scene, including the state of rendering up to this point and properties of the camera used for rendering. the screen node gets the width and height of the screen in pixels and returns those two as its outputs. the scene color node lets us access the framebuffer before rendering has finished this frame, and it can only be used in the fragment shader stage. in urp, we can only use this on transparent materials, and it will only show opaque objects; the behavior of the node can change between render pipelines. the uv input takes in the screen position you'd like to sample, and by default it uses the same screen position uv as the pixel being rendered; i'll talk about the other options on the dropdown when we get to the screen position node. the output is the color sampled at this position. in urp, you will also need to find your forward renderer asset and make sure that the opaque texture checkbox is ticked, or else unity won't even generate the texture and you'll only see black. this node is great for something like glass or ice, where you need to slightly distort the view behind the mesh.
similar to the scene color node, the scene depth node can be used to access the depth buffer, which is a measure of how far away a rendered pixel is from the camera. again, in urp, this can only be used by transparent materials. the input it expects is a uv coordinate. this node also contains a sampling option with three settings. linear 01 will return a depth value normalized between 0 and 1, where a pixel with value 0 rests on the camera's near clip plane and 1 is at the far clip plane, although this might be reversed in some cases, and an object halfway between both planes is at depth 0.5. the raw option will return the raw depth value without normalizing between 0 and 1, so a pixel halfway between the near and far clip planes may actually have a depth value higher than 0.5. and finally, the eye option gives us the depth converted to eye space, which just means the number of units the pixel is away from the camera, relative to the camera's view direction.

the camera node is only supported by the universal render pipeline. it gives you access to a range of properties related to the camera that's currently being used for rendering, such as the position in world space, the forward direction vector, and whether the camera is orthographic; if so, one is output, otherwise zero is output. we have access to the near and far clip planes as floats, as well as the z buffer sign, which returns one or minus one depending on whether we are using the standard or reversed depth buffer. you might want to use this node if you are making depth-based effects, for example using the scene depth node. finally, the width and height outputs get you the width and height of the screen in world space units, but only if your camera is orthographic.

the fog node is also not supported by hdrp. it returns information about the fog you've defined in the lighting tab's environment settings. we need to pass in the position in object space, and we get the color of the fog and its density at that position. we can use this node in both the vertex and fragment stages of a shader.
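the relationship between the scene depth node's linear 01 and eye outputs can be sketched like this, assuming the simple convention described above (0 at the near plane, 1 at the far plane). this is only a conceptual illustration; unity's own helpers derive these from the raw depth buffer and _ZBufferParams:

```python
def linear01_to_eye(depth01, near, far):
    """convert a normalized linear depth (0 at the near plane, 1 at the
    far plane) into eye-space units from the camera."""
    return near + depth01 * (far - near)

# with near = 0.5 and far = 100.5, a halfway pixel is 50.5 units away.
print(linear01_to_eye(0.5, 0.5, 100.5))  # 50.5
```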
the object node returns two outputs, the position and scale of your object in world space, as vector 3s.

the lighting family of nodes gives us access to different types of lighting impacting a given vertex or fragment. the ambient node returns three color values, each of which is a different type of ambient light from the scene, but it's only supported by urp. these values depend on the values in the environment lighting section of the lighting tab. the node's equator and ground outputs always return the environment lighting equator and ground values, regardless of which source type is picked, even though they only exist when gradient is picked. the node's color/sky output returns the sky color when the source is set to gradient, or the ambient color when the source is set to color.

the reflection probe node is only defined for the universal render pipeline. we can use this to access the nearest reflection probe to the object by passing in the surface normal of the mesh and the view direction of the camera; if you remember the way i described the sample reflected cube map node, it works in a similar way. we can also specify the lod to sample at lower qualities if we want blurry reflections. the single output, just named out, is the color of the reflection from the reflection probe, as a vector 3.
the baked gi node can be used to retrieve lighting created by unity's baked lightmapper. we need to provide a position and normal vector in world space so that unity knows where to access the lightmap information, and then we need to provide a set of uvs so unity knows how to apply the lightmap to the mesh. lightmap uvs come in two forms: the static uvs, which usually occupy the uv1 slot, are for mapping lights which stay stationary for the entire game, and the dynamic uvs, which are found in the uv2 slot by default, are used for lights that might turn on or off, or even move, during runtime. both sets of uvs can be generated automatically by unity during the lightmapping process, or created manually, but if you don't know how to do that, then it's nothing to worry about. there's an extra tickbox on the node to apply lightmap scaling, which will automatically transform the lightmap texture if ticked; it's usually best to keep it ticked. the sole output is the color of the lighting or shadow at this location.

the matrix family of nodes can be used to create new matrices, or to access some of unity's built-in matrices. matrices can be used for operations such as multiplying vectors. i won't go into too much detail about matrices here because it's a very dense topic, but all you need to know here is that we can define our own matrix constants inside the shader. the matrix 2x2 node lets us define a square matrix with 2 rows and 2 columns; similarly, the matrix 3x3 node lets us define matrices with 3 rows and 3 columns. the largest type of matrix supported in shaders is the 4x4 square matrix, which we can create with a matrix 4x4 node.

matrices are super useful for transformations, and unity defines many of the matrices involved in transforming from one space to another. sometimes these matrices are used in the background, but we can access them all here. using the dropdown, we can pick between the following matrices. the model matrix converts from object space to world space, whereas inverse model converts the opposite way.
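as a concrete example of a matrix multiplying a vector, here's a hypothetical model matrix that just translates by (5, 0, 0), applied to a point in python (row-major lists, homogeneous coordinates with w = 1; an illustration, not unity's code):

```python
def transform_point(matrix, point):
    """multiply a 4x4 row-major matrix by a position (w = 1), the way a
    model matrix moves a point from object space to world space."""
    x, y, z = point
    v = (x, y, z, 1.0)
    out = [sum(matrix[r][c] * v[c] for c in range(4)) for r in range(4)]
    return tuple(out[:3])

# a model matrix that translates by (5, 0, 0).
model = [[1.0, 0.0, 0.0, 5.0],
         [0.0, 1.0, 0.0, 0.0],
         [0.0, 0.0, 1.0, 0.0],
         [0.0, 0.0, 0.0, 1.0]]

print(transform_point(model, (1.0, 2.0, 3.0)))  # (6.0, 2.0, 3.0)
```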
the view matrix transforms from world space to view space, which is relative to our camera, and inverse view does the opposite. the projection matrix transforms from view space to clip space, where parts of objects out of the camera's view can be clipped, and the inverse projection matrix does the opposite. finally, the view projection matrix takes us straight from world space to clip space, and inverse view projection does the opposite. the only output of the node is the selected matrix.

the geometry node family provides positions, uvs, directions — basically different kinds of vectors. the position node will grab the position of the vertex or fragment, whichever stage you're using. only one vector 3 output exists, and that will be for the position, but there's a dropdown that lets us pick which space the position will be in. we've talked about the object, view and tangent spaces previously, and absolute world is the world space position of the vertex or fragment, as we've described world space before. the world option differs by render pipeline, and it uses the pipeline's default world space: in urp, it's the same as absolute world, but hdrp uses camera-relative rendering by default, so the world space becomes relative to the camera position.

the screen position node gets the position of the pixel on the screen, with a single vector 4 output representing the screen position. the mode influences exactly which screen position is used. by default, we use the clip space position after dividing by the w component; this is called the perspective divide. raw mode, however, returns the screen position before the perspective divide, which is useful if you want to perform a projection of your own. center will return the screen position such that (0, 0) is now in the center of the screen instead of the bottom-left corner, and tiled also puts (0, 0) in the center of the screen, but takes only the fractional part of the position, the numbers past the decimal point, so you end up with tiling.
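the perspective divide itself is just a division of the position's components by w, which is exactly what separates the default and raw modes. a minimal python sketch:

```python
def perspective_divide(clip_pos):
    """divide a clip space position's components by w, which is what the
    screen position node's default mode does and raw mode skips."""
    x, y, z, w = clip_pos
    return (x / w, y / w, z / w)

print(perspective_divide((2.0, 4.0, 8.0, 2.0)))  # (1.0, 2.0, 4.0)
```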
the uv node can be used to get the uv coordinates of a vertex or fragment. unity allows you to bake more than one texture coordinate into your mesh's data, so we can use the channel dropdown to retrieve one of four sets of uv coordinates. most meshes will only use uv0, but you can use the other channels to hide more data; you will need to bake the uv data into the mesh yourself using external means. one unfortunate limitation of shader graph is that we can only access uv0 to uv3, although shader code can access uv4 to uv7.

the vertex color node can be used to get the color attached to the mesh's vertex data. despite the name, this can be used in both the vertex and fragment shader stages, but you'll have to set up your mesh beforehand to have vertex color data baked into it. in the fragment stage, the colors between vertices get blended together. the view direction node gets the vector between the vertex or fragment and the camera; the dropdown lets us change the space between world, view, object or tangent, all of which we've talked about before.

the normal vector node gets the vector perpendicular to the surface, pointing outwards away from it. like view direction, it gives us the option to pick different spaces, and it only outputs the single vector. the tangent vector node gets a vector that lies on the surface; this vector is perpendicular to the normal vector, and like the normal vector node, we get four space options. the bitangent vector node gets another vector that is parallel with the surface. if you take the cross product between the tangent vector and the normal vector, you will get the same result as the bitangent vector node; we will talk about the cross product in a little while.

there's three nodes under the gradient tab, and i'm sure you can guess that they involve creating and reading color gradients. the gradient node lets us define a gradient of our own to use inside the shader. by clicking on the rectangle on the node, we get access to the gradient editor window, which is the same one used elsewhere in the unity editor.
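the bitangent relationship mentioned above is easy to check numerically. here's a python sketch of the cross product (note that the order of the operands, and the mesh's handedness convention, only flips the sign of the result):

```python
def cross(a, b):
    """standard 3d cross product."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

# with a tangent along x and an upward normal, the bitangent lies along z
# (up to sign, depending on the handedness convention).
tangent = (1.0, 0.0, 0.0)
normal = (0.0, 1.0, 0.0)
print(cross(tangent, normal))  # (0.0, 0.0, 1.0)
```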
we can modify the top row of handles to change the alpha, and use the bottom row to tweak colors. the only output is the gradient itself, which brings us to the sample gradient node, the only node that currently takes a gradient as an input. it also takes an input called time, a float between 0 and 1, which determines which position to sample the gradient at. the output is the color sampled at that point.

the blackbody node is interesting: it takes in a temperature in kelvin as input and outputs the color of a blackbody at that temperature. don't know what a blackbody is? then, like me, you're probably not a physicist. a blackbody is an idealized, completely opaque, non-reflective object, so the thermal radiation it emits is a function of its temperature.

the two pbr nodes involve reflection highlights for physically based rendering. the dielectric specular node requires a bit of explanation. dielectric materials are electrical insulators, so in this context, think of them as non-metals. this node outputs the strength of specular highlights on certain types of material, based on its refractive index. we can switch the material type, and values are defined for rusted metal, water, ice and glass. there's an option for common, which you would use for common materials like fabric, plastic or maybe wood, which gives us a range to pick between, and a custom option, where the output is based on the index of refraction. if using the custom option, look up the refractive index of the material you want to use online; for example, the index of refraction for ice is 1.3098, which gives the same strength as the preset for ice. the metal reflectance node is similar to dielectric specular, but now it outputs the color of the specular highlights on certain metals. the key difference is that the specular highlights of metals are colored, rather than grayscale as they are for dielectric materials. unity provides values for iron, silver, aluminium, gold, copper, chromium, nickel, titanium, cobalt and platinum, with no further options for custom metals.
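the custom option's conversion from index of refraction to specular strength follows the standard normal-incidence fresnel reflectance formula, f0 = ((n - 1) / (n + 1))², which we can verify against the ice value quoted above (an illustration of the formula, not unity's source):

```python
def dielectric_f0(ior):
    """specular reflectance at normal incidence for a dielectric,
    from its index of refraction: f0 = ((n - 1) / (n + 1))^2."""
    return ((ior - 1.0) / (ior + 1.0)) ** 2

# ice has an index of refraction of about 1.3098.
print(round(dielectric_f0(1.3098), 3))  # 0.018
```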
copper chromium nickel titanium cobalt and platinum with no further options for custom metals the following three nodes are in the high definition render pipeline group but they're included in the base shader graph package so i'll still mention them here like all nodes under the high definition render pipeline group the diffusion profile node is of course not available on universal render pipeline this node is used to sample a diffusion profile asset which is exclusive to hdrp and contains settings related to subsurface scattering the output is a float which is an id used to pick the correct diffusion profile the exposure node is an hdrp exclusive node that you can use to get the camera's exposure level on the current or previous frame the only output from the node is the vector 3 representing that exposure level there are four exposure types that you can pick from using the type drop down the two labeled current get exposure from this frame while the previous ones get the exposure from last frame the two called inverse return the inverse of the exposure on a given frame the hd scene color is the hdrp exclusive counterpart to the regular scene color node unlike scene color hd scene color has an extra lod input which lets us pick the mipmap level we use to access the color buffer this node always uses trilinear filtering to smooth between mipmaps we also have an exposure checkbox to choose whether to apply exposure it's disabled by default to avoid double exposure the only output from the node is the colour that gets sampled the next two nodes are used for the dots hybrid renderer the compute deformation node is exclusive to the dots hybrid renderer and can be used to send deformed vertex data to this shader you'll need some knowledge of dots to get this working and i certainly don't the three outputs are deformed vertex position normal and tangent which usually get output to the vertex stage's three blocks the linear blend skinning node is also exclusive to the dots
hybrid renderer we can use the three inputs for position normal and tangent vectors and this node will apply vertex skinning to each and give us the corresponding results as three output vectors the channel node family is all about messing with the order and value of each component of a vector the split node takes in a vector 4 as input and outputs the 4 channels of the vector as separate floats if you supplied a vector with fewer than 4 components then the extra outputs will be 0. swizzling is when you take the components of a vector and output them in a different order the swizzle node takes in a vector of up to four elements as input and provides four options on the node to determine how to swizzle the input this node always outputs a vector four and each option lets us choose an input channel to use for the corresponding output for example changing the green output drop down to blue means that the second output component takes the third input component the flip node takes a vector of up to four elements as input and for each input component the node provides a checkbox to decide whether to flip that input flipping means that positive values become negative and vice versa the output vector has as many components as the input the combine node lets us feed up to four values into the r g b and a inputs and the node will combine those individual elements into vectors the node provides three outputs with four three and two components respectively depending on the size of the vector you want to create the uv family of nodes can all be used to transform the uvs we use to sample textures tiling and offset is another node you'll see me use quite often as the name suggests you can use this node to tile and offset your uvs which is especially useful for texturing the tiling input is a vector 2 which controls how many times the texture is copied across an object and the offset vector 2 input can be used to scroll the texture in whichever direction you want the other input is the set of
uvs which the tiling and offset is applied to the output is a new set of uvs after the tiling and offset have been applied the rotate node takes in a uv as input and will rotate around the center point which is another input vector 2 by the rotation amount which is a float input this node also has a unit drop down which determines whether the rotation is applied in radians or degrees the single output is a new set of uv coordinates after the rotation has been applied the spherize node distorts the uvs as if they're being applied to a sphere instead of a flat surface the unity documentation describes it like a fisheye lens the uv input gives us the base uvs before the transformation and like rotate the center input gives us the origin point of the effect the strength determines how strongly the effect is applied and the offset is used to scroll the uvs before the transformation has been applied the only output is the uvs after being spherized the twirl node has the same four inputs as spherize except now the transformation is that the uvs spiral from the outer edge the single output is the new set of uvs after the twirling the radial shear node also takes those same four inputs as twirl and spherize but now the transformation is a wave effect from whatever the center point is the output is a new set of uvs after the transformation has been applied the triplanar node is a bit more complicated to explain the idea is that we sample the texture three times along the world space x y and z axes which ends up with three mappings that look great applied from those three directions for that we supply a texture and a sampler as input then one of those mappings is planar projected onto the mesh based on the normal vector of the surface the one that results in the least amount of distortion is picked with some amount of blending we supply the position and normal vectors for the mapping as inputs too as well as a blend parameter which controls how much we smooth between the three
samples at edges the higher this parameter is the sharper the mapping is finally we supply a tile float parameter to tile the uvs before the mapping is applied to the mesh the output is the colour after blending has taken place we can use the type setting in the middle of the node to switch between default and normal which tells unity which type of texture we're expecting to sample the polar coordinates node is used to transform a set of uvs from a cartesian coordinate system which is the coordinate system you're most likely familiar with to a polar coordinate system where each point is described by the distance and an angle away from some center point the uvs and center point are both inputs and we can set how much to scale the angle and length using the radial scale and length scale float inputs respectively the output is a new set of uvs in this polar coordinate system certain kinds of panoramic images can be decoded using polar coordinates which means we can use them for sky boxes or reflection maps the flipbook node is very useful if you're trying to make a flipbook animation especially for sprites the uv input is the same as the uv input on any of these nodes so far and we can also supply the width and height as floats which should be the number of flip book tiles on your texture in the x and y directions the tile input will determine which tile you want to sample and unity will calculate new uvs which pick only that part of the texture which becomes the output the direction of the uvs in other words the order in which the tile input picks tiles is determined by the invert x and invert y options by default invert y is ticked and tiles are picked starting from the top left and moving horizontally first typically you would use the output uvs in a sample texture 2d node to sample whatever texture you had in mind the parallax mapping node can be used to fake depth inside your material by displacing the uvs we can supply a height map which is a grayscale texture
controlling how high or low each part of the surface should be together with that we can add a sampler state the amplitude float is a multiplier in centimeters for the height spread from the height map and the uvs are used for sampling the height map the output parallax uvs are the modified uvs which can be used to sample another texture with parallax applied the parallax occlusion mapping node acts the same way as the parallax mapping node except the latter doesn't take occlusion into account this is where higher parts of the height map can obscure lighting on lower parts now we have three added parameters the steps parameter controls how many times the internal algorithm runs in order to detect occlusion higher values mean more accuracy but slower runtime we now also have an lod parameter to sample the height map at different mipmap levels and an lod threshold parameter mipmap levels below this will not apply the parallax effect for efficiency which is useful for building an lod system for your materials the parallax uvs are a similar output and now we have an extra pixel depth offset output which can be used for screen space ambient occlusion you might need to add that as an output to your master stack math nodes as you can imagine are all about basic math operations ranging from basic arithmetic to vector algebra now we can take a rest with some super simple nodes bet you can't guess what the add node does it takes two float inputs and the output is those two added together the subtract node on the other hand takes the a input and subtracts the b input the multiply node takes your two inputs and multiplies them together although this is more in depth than other basic maths nodes if both inputs are floats then they're multiplied together and if they're both vectors it multiplies them element wise so that they return a new vector the same size as the smaller input if both inputs are matrices the node will truncate them so they're the same size and performs
matrix multiplication between the two outputting a new matrix the same size as the smaller inputs and if a vector and a matrix are inputs the node will add elements to the vector until it's large enough then it multiplies the two the divide node also takes in two floats and returns the a input divided by the b input the power node takes in two floats and returns the first input raised to the power of the second input and finally the square root node takes in a single float and returns its square root the interpolation family of nodes are all about smoothing between two values to get a new value the lerp node is extremely versatile lerp is short for linear interpolation we take two inputs a and b which can be vectors of up to four components if you supply vectors of different sizes unity will discard the extra channels from the larger one we also take a t input which can be the same size as those input vectors or it can be a single float t is clamped between zero and one interpolation draws a straight line between the a and b inputs and picks a point on the line based on t if t is 0.25 the point is 25 percent of the way between a and b for example if t has more than one component the interpolation is applied per component but if it's a single float then the same value is used for each of a and b's components the output is the value that got picked the inverse lerp node does the opposite thing to lerp given input values a b and t inverse lerp will work out what interpolation factor between 0 and 1 would have been required in a lerp node to output t i hope that makes sense smooth step is a special sigmoid function which can be used for creating a smooth but swift gradient when an input value crosses some threshold the in parameter is your input value the node takes in two edge parameters which determine the lower and higher threshold values for the curve when in is lower than edge one the output is zero and when in is above edge two the output is one between those
thresholds the output is a smooth curve between 0 and 1. the range node family contains several nodes for modifying or working within the range of two values the clamp node takes in an input vector of up to four elements and will clamp the values element wise so that they never fall below the min input and they never go above the max input the output is the vector after clamping the saturate node is like a clamp node except the min and max values are always zero and one respectively the minimum node takes in two vector inputs and outputs a vector the same size where each element takes the lowest value from the corresponding elements of the two inputs if you input two floats it just takes the lower one and the maximum node does a similar thing except it returns the higher number for each component of the input vectors the one minus node takes each component of the input vector and returns one minus that value the remap node is a special type of interpolation we take an input vector of up to four elements then we take two vector two inputs one is the in min max vector which specifies the minimum and maximum values that the input should have the out min max vector specifies the minimum and maximum values the output should have so this node ends up essentially performing an inverse lerp with the input value and in min max to determine the interpolation factor then does a lerp using that interpolation factor between the out min max values those values are then output the random range node can be used to generate pseudo-random numbers between the minimum and maximum input floats we specify a vector 2 to use as the input seed value and then a single float is output this node is great for generating random noise but since we specify the seed you can use for example the position of fragments in object space so that your output values stay consistent between frames or you could use time as an input to randomize values between frames the fraction node takes in an input
vector and for each component returns a new vector where each value takes the portion after the decimal point the output therefore is always between 0 and 1. the round node family is all about snapping values to some other value the floor node takes a vector as input and for each component returns the largest whole number lower than or equal to that value the ceiling node is similar except it takes the next whole number greater than or equal to the input and the round node is also similar except it rounds up or down to the nearest whole number the sign node not to be confused with the sine node takes in a vector and for each component returns one if the value is greater than zero and zero if it is zero and minus one if it's below zero the step node is a very useful function that takes in an input called in and if that is below the edge input the output is zero else if in is above the edge input the output becomes one if a vector input is used it operates per element the truncate node takes an input float and removes the fractional part it seemingly works the same as floor except it works differently on negative numbers for instance minus 0.3 would floor to -1 but it truncates to 0. the wave node family is a very handy group of nodes used for generating different kinds of waves which are great for creating different patterns on materials the noise sine wave node will return the sine of the input value but will apply a small pseudo random noise to the value the size of the noise is random between the min and max values specified in the min max vector 2.
the output is just the sine wave value a square wave is one that constantly switches between the values -1 and 1 at a regular interval the square wave node takes in an input value and returns a square wave using that as the time parameter if you connect a time node then it will complete a cycle each second a triangle wave rises from -1 to 1 linearly then falls back to -1 linearly the curve looks like a series of triangular peaks hence the name this node goes from -1 to 1 to -1 again over an interval of one second a sawtooth wave rises from -1 to 1 linearly then instantaneously drops back down to -1 the curve looks like a series of sharp peaks like a saw this node completes one cycle of going from -1 to 1 within a second the trigonometry node family invokes fear in the hearts of school students everywhere if you ever wondered when you'll ever use trig in later life this is where the sine function is a basic trigonometric function this node takes a float input and just applies the basic sine function to it the input should be in radians the cosine function is of course also a basic trig function and the cosine node applies the cosine function and the tangent node does the same thing but with the tangent function not very interesting i guess trig functions can be great for smoothing out values and in particular they're fantastic when working with angles between vectors we'll talk about vector operations shortly but for now i'll say that under the hood the dot and cross product between vectors can be defined using cos and sine respectively arc sine also known as the inverse sine function can be used to get back the angle that would have resulted in the input if that angle were passed into a sine function the output of this arc sine node will be an angle in radians in the range between minus pi over 2 and pi over 2 and the input should be between -1 and 1.
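as a quick sanity check on those ranges here's a python sketch of the same maths the sine and arc sine nodes perform this is python's math module rather than the hlsl the nodes actually generate so it's purely for intuition

```python
import math

# the sine/cosine/tangent nodes take an angle in radians
angle = math.pi / 6  # 30 degrees

# arc sine node: recovers the angle from sin(angle),
# with output in the range [-pi/2, pi/2]
recovered = math.asin(math.sin(angle))
assert math.isclose(recovered, angle)
assert -math.pi / 2 <= recovered <= math.pi / 2

# inputs outside [-1, 1] are out of arc sine's domain
# (python raises here, while a shader would produce nan)
try:
    math.asin(2.0)
except ValueError:
    pass
```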
the arc cosine node does a similar thing except this time it inverts the cosine function it accepts inputs between -1 and 1 but this time the output is between 0 and pi and the arctangent node is also similar but for inverting the tangent function its output is between minus pi over 2 and pi over 2 and this time the input can be any float value arctangent 2 is the two argument version of the arctangent function given inputs a and b it gives the angle between the x-axis of a two-dimensional plane and the point with coordinates b and a the degrees to radians node takes whatever the input float is assumes it's in degrees and multiplies it by a constant such that the output is the same angle in radians the radians to degrees node does the opposite of degrees to radians the hyperbolic sine node takes an input angle in radians and returns the hyperbolic sine of the angle the hyperbolic cosine node does the same thing but instead of hyperbolic sine it's hyperbolic cosine and finally we can stick an angle in radians into the hyperbolic tangent node to apply the hyperbolic tangent function to the angle sorry for sounding like a broken record there's not much you can say about some nodes the following vector nodes can do several basic linear algebra operations for us the distance node takes in two vectors and returns as a float the euclidean distance between the two vectors that's the straight line distance between the two the dot product is a measure of the angle between two vectors when two vectors are perpendicular the dot product is zero and when they are parallel it's either one or minus one for unit vectors depending on whether they point in the same or opposite direction respectively the dot product node takes in two vectors and returns the dot product between them as a float the cross product between two vectors returns a third vector which is perpendicular to both you will probably use the cross product to get directions so magnitude doesn't matter as much but for clarity the magnitude of the
third vector is equal to the magnitude of the two inputs multiplied by the sine of the angle between them the cross product node performs the cross product on the two inputs which must be vector threes and outputs a new vector three the direction is based on the left hand rule for vectors in other words if vector a points up and vector b points right the output vector points forward the transform node can be used to convert from one space to another the input is a vector 3 and the output is another vector 3 after the transform has taken place the node has two controls on its body which you can use to pick the space you want to convert from and to you can pick between many of the spaces we've mentioned before such as object view world tangent and absolute world you can also choose the type with a third control option which lets you pick between position and direction the fresnel effect node is another great node which can be used for adding extra lighting to objects at a grazing angle specifically it calculates the angle between the surface normal and the view direction if applied to a sphere you'll see light applied to the edge which is easy to see on the node preview the inputs to the node are the surface normal and view direction both of which are assumed to be in world space and a float called power which can be used to sharpen the fresnel effect the output is a single float which represents the overall strength of the fresnel the reflection node takes in an incident direction vector and a surface normal as the two inputs and outputs a new vector which is the reflection of the incident vector using the normal vector as the mirror line the projection node takes two vectors a and b and projects a onto b to create the output vector what this means is that we end up with a vector parallel to b but possibly longer or shorter depending on the length of a the rejection node also takes two vectors a and b and returns a new vector pointing from the point
on b closest to the endpoint of a to the endpoint of a itself the rejection vector is perpendicular to b in fact the rejection vector is equal to a minus the projection of a onto b the rotate about axis node takes a vector3 input and a second vector3 representing the axis to rotate around as well as an angle to rotate by as a float we also have a control on the node that lets us choose between degrees and radians for the rotation input the node outputs the original vector rotated around the rotation axis by that amount the sphere mask takes a coordinate a position in any arbitrary space and a sphere represented by a center point and a radius if the original position is within the sphere the output is one else it is zero although there's another hardness parameter which is designed to be between 0 and 1 which you can use to smooth the falloff between 0 and 1 outputs the higher the hardness parameter the sharper the transition if you want it to be a hard border set it to 1. these derivative nodes evaluate a set of nodes on adjacent pixels and provide a measure of how different the results are between those pixels the ddx node can be used to take a derivative in the x direction this works by calculating the input to the node for this pixel and the adjacent horizontal pixel and taking the difference between them the output is that difference you can do this without sacrificing efficiency because during the rasterization process fragments get processed in 2x2 tiles so it's very easy for a shader to calculate values on adjacent pixels in this group of tiles the ddy node does a similar derivative except vertically it takes this pixel and the adjacent pixel vertically and returns the difference between their inputs to this node and finally the ddxy node takes the derivative diagonally by returning the sum of the two derivatives horizontally and vertically in effect it's like adding ddx and ddy on the same input all three derivative nodes are only available in the fragment
shader stage you might use them for something like edge detection by reading values from a scene color or scene depth node and detecting where there's a massive difference between adjacent pixels use the matrix node family to create matrices or carry out basic matrix operations the matrix construction node can be used to create new matrices using vectors the node has four inputs each of which is a vector four corresponding to the maximum matrix size of four by four the node has a setting to determine whether the inputs are row or column vectors and three outputs of varying size so you can use this node to construct a two by two three by three or four by four matrix the matrix split node on the other hand takes in a matrix and lets us split the matrix into several vectors the input matrix can be between two by two and four by four and the output vector fours will be partially filled with zeros if the matrix is smaller than four by four as with the matrix construction node we can choose whether the output vectors are row or column vectors the determinant of a matrix is a common operation in maths and this node calculates it for you the input is a matrix of any size between two by two and four by four and the output is its determinant this can be a bit costly for large matrices so use it sparingly the matrix transpose node reflects the elements of the matrix in its leading diagonal such that the rows become columns and vice versa the input and output are both matrices of the same size this group might be called advanced but many of these nodes are basic maths operations the absolute node returns the absolute value of the input in other words if the input value is negative the sign becomes positive the input can be a vector and if so the operation is performed on each element that applies to a lot of these nodes so sometimes i'll just mention a float input even if it can take a vector the length node takes a vector as input and returns its length which is calculated using
pythagoras' theorem modular arithmetic works by counting up until you reach some value at which point you start counting from zero again in other words the modulo node gives the remainder after dividing input a by input b the negate node flips the sign of the input float the normalize node takes in a vector and returns a new vector pointing in the same direction but with a length of 1. the posterize node takes in an input value and a step value this node will clamp the range of the input between 0 and 1 and quantize its value so that it can only take a number of values equal to the number of steps supplied plus 1. for example if the number of steps is 4 then the output is rounded down to the values 0 0.25 0.5 0.75 or 1. the reciprocal node divides 1 by the input float we have an option to pick the algorithm used for the calculation either default or fast which is less accurate but good if you're using reciprocal a lot the reciprocal square root node is similar to reciprocal except it calculates 1 divided by the square root of the input unlike reciprocal there's no extra option to choose different methods if you're interested in a bit of history the fast inverse square root method is the famous piece of code popularized by john carmack's quake iii but discovered earlier for calculating the reciprocal square root of a number it's no longer necessary because this functionality is provided at the instruction set level but it's an interesting footnote the exponential node raises a particular number to the power of the float input we can pick what the base number is by using the base drop down which lets us choose between 2 and e e is euler's number which is approximately 2.72 the log node does the opposite process as the exponential node if 2 to the power of 4 equals 16 then the log base 2 of 16 equals 4. we take in a float and return its log under a particular base we can choose the base using the base drop down except now we have the choice of 2 e or 10.
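the posterize behaviour is simple enough to mimic outside a shader here's a rough python sketch assuming the usual floor based formula the node itself generates hlsl so treat this as intuition rather than unity's exact implementation

```python
def posterize(value, steps):
    # rough sketch of the posterize node's quantization, assuming
    # the usual formula floor(in / (1 / steps)) * (1 / steps)
    value = min(max(value, 0.0), 1.0)  # clamp to the 0..1 range first
    step_size = 1.0 / steps
    return (value // step_size) * step_size

# with 4 steps the output snaps down to 0, 0.25, 0.5, 0.75 or 1
print(posterize(0.3, 4))   # -> 0.25
print(posterize(0.99, 4))  # -> 0.75
print(posterize(1.0, 4))   # -> 1.0
```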
artistic nodes usually operate on colours or individual color channels or textures the blend node is normally used to blend one color into another in this case we pass the base color and blend color into the node and we blend the blend input onto the base input according to a third input which is a float called opacity when opacity is zero the base is unaltered and when opacity is one the blending is at its strongest there is also a mode drop down which lets us choose the method used for blending there's a lot of options so i won't go over every one the only output is the color after the blending has been completed dither is another of my favorite nodes we use it in screen space to apply intentional noise in some way internally the node defines a neat pattern of noise values which are used as thresholds the input is a vector of values and for each element if its value is below the threshold defined by the dithering pattern the output is zero otherwise it's one we also require the screen position as input and we can multiply this to scale the dithering effect the color mask node takes in an input color a mask color and a range float if the input color is equal to the mask color or within the range specified then the output of the node is one else it is zero however there's also a fuzziness input if we raise this above 0 there will be a soft transition between 1 and 0 for values on the edge of the range the output is a single float representing the mask value the channel mask node takes in a color as input the channels option on the node lets us pick any combination of channels for each one that is selected this node keeps colors in that channel but discards color channels that are not picked by setting their values to zero the output is the masked color the following adjustment nodes are used to tweak the properties of colors the hue node can be used to offset the hue of whatever color is passed as input by the amount specified by the offset input this node
comes with a toggle for different modes for some reason the documentation lists the options as degrees and radians but on the node the options seem to be degrees and normalized when degrees is picked you cycle through the entire range of hues between 0 and 360 and when normalized is picked the hue range is covered between an offset of 0 and 1. the saturation node adjusts the amount of saturation in the input color by whatever amount is passed into the saturation float input when the saturation value is 1 the original color saturation is left alone and when it's zero the output color has no saturation at all the contrast node does a similar thing except it adjusts the amount of contrast the input color has by whatever amount is used for the contrast float input the white balance node is used for modifying the tint and temperature of an input color temperature is a bit hard to pin down but generally speaking cold colors are more blue and warm colors are more red so reducing the temperature below zero makes the color more blue and raising it above 0 makes things redder tint on the other hand tends to offset a color towards pink or green when it's increased the replace color node takes a color input and we can define a color to replace called from and a color to replace it with called to whenever the from color appears it's replaced with the to color we also define a float called range which means that if any input color is within that range of from it's also replaced and finally increasing the fuzziness input means that there will be a smooth falloff between the original colors and the to color the invert colors node takes an input color and for each channel returns 1 minus the channel colors in shader graph are assumed to be between 0 and 1 for each channel so this won't work for hdr colors with high intensity the channel mixer node takes in a color input and for each of the red green and blue color channels we can remap the amount they contribute to the output color's
red green and blue channels we do this by clicking one of the three buttons labeled r g and b when one is selected modifying the sliders which can run between -2 and 2 changes how much that input channel contributes to the three output channels for example if we select r and then make the sliders 0 0 and 2 that means the input red contributes 200 percent to the output blue the normal node family is irreplaceable when working with normal mapping whether you're reading from a texture or creating the normals within shader graph the normal unpack node takes a color as input and unpacks it into a normal vector that said for textures you usually sample it as a normal map anyway so this node is more useful if you've generated a normal texture within the graph somehow and you need to convert from colours to normal vectors you can choose the space of the input between tangent or object space using the drop down the output normal vector is a vector 3. the normal strength node takes a set of normals as input as a vector 3 and scales their strength via the strength float input a strength of 1 leaves the normals unaltered while 0 will return a completely flat normal map with all the normals pointing upwards the normal from texture node takes a texture a sampler and a set of uvs as input and uses that as a height map from which it generates normals the offset float input defines how far away the normal details extend from the surface and the strength float input multiplies the size of the result the output is a vector 3 representing the calculated normal vector the normal from height node is similar except it takes in a single height value and generates a normal vector based on that and the input float strength we can change the space used for the output normals between tangent and world tangent is useful when working with textures whereas world is great for working with lighting the normal blend node takes in two normals adds them together normalizes them and returns the result this
is great for combining a base normal texture, A, and a detail normal texture, B, together. We have the choice of two modes here: Default is just what I described, and Reoriented will rotate the normal by the angle between the first and second map. By doing that, the detail normal texture isn't just layered on top of the base normal texture; it acts as if the detail normal texture is mapped onto the surface described by the base normal. The Normal Reconstruct Z node takes in a generated normal vector as a Vector 2 and calculates what the Z component should be for the output Vector 3. This lets you package your normal data into the red and green channels of a texture, so long as you know the normals always point in the positive direction, freeing up the blue and alpha channels for other uses to reduce the number of texture samples and the texture memory your shader requires. For example, you could include a smoothness map in the blue channel, since it only requires grayscale data, but you'll need to create these packed textures manually. The Color Space Conversion node can be used to convert an input color between the RGB, HSV and Linear color spaces; we have two dropdown options to pick the input and output color spaces. The Checkerboard node creates an alternating pattern of tiles colored according to the Color A and Color B inputs. The UV is used for mapping the pattern onto objects, and the Frequency Vector 2 is used for scaling the checkerboard in those respective axes. The output is the checkerboard color as a Vector 3, although as of this video, the documentation accidentally lists the output as a UV Vector 2.
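As a rough illustration of the normal math above, here is a Python sketch of Normal Reconstruct Z, plus Normal Blend's Default mode as I understand it from the documented behavior (sum the XY components, multiply the Z components, renormalize). Treat both as assumptions rather than Unity's exact generated code:

```python
import math

def normal_reconstruct_z(x, y):
    # Rebuild the Z component of a unit-length normal stored in only two
    # channels, assuming the normal always points in the positive Z direction:
    # z = sqrt(1 - (x^2 + y^2)), clamped so we never sqrt a negative number.
    z = math.sqrt(max(0.0, 1.0 - (x * x + y * y)))
    return (x, y, z)

def normal_blend_default(a, b):
    # Hedged sketch of Normal Blend (Default mode): add the XY components,
    # multiply the Z components, then renormalize the result.
    x, y, z = a[0] + b[0], a[1] + b[1], a[2] * b[2]
    length = math.sqrt(x * x + y * y + z * z)
    return (x / length, y / length, z / length)

print(normal_reconstruct_z(0.6, 0.0))  # roughly (0.6, 0.0, 0.8)
```

Blending two flat normals (0, 0, 1) returns a flat normal again, which matches the intuition that a detail map with no detail shouldn't disturb the base.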
Noise is one of the best tools to use within shaders if you want to create procedural content, or if you want highly customizable properties on your materials. The Simple Noise node generates a basic type of noise pattern called value noise, using a UV input to map the noise onto your mesh and a Scale input float to rescale the noise texture in both directions. The output is a single float representing a noise value between 0 and 1. The Gradient Noise node generates a slightly more sophisticated type of noise called Perlin noise, using the same UV and Scale inputs as Simple Noise, and a single float output once again. Perlin noise is a very common type of noise used in random number generation, particularly for textures and terrains. The Voronoi node is a very pretty and versatile type of noise. It works by generating points on a grid, repositioning them in random directions, then coloring each pixel in the grid based on its distance from a point; the closer to a point we are, the darker a pixel is. We supply a UV for mapping the texture, plus an Angle Offset float for randomly moving the points and a Cell Density float to decide the number of points that are added. The Out output just gives the distance from the closest point as a float, which is usually used as the Voronoi pattern. The Cells output gives us what Unity calls the raw cell data, although reading the auto-generated code in the documentation, it seems to be colored based on the random X offset for each cell. The Rectangle node takes an input UV and a Width and Height float, then generates a rectangle with that width and height. The width and height should be between 0 and 1, and if you use the same value for both, you should get a square texture. The output of the node is 1 if the pixel is within the rectangle and 0 otherwise. These shapes can only be generated in the fragment stage. The Rounded Rectangle node is exactly the same as Rectangle, except it adds a Radius float option to specify how much the corners of the rectangle
shape should be rounded. The Ellipse node similarly takes a Width and Height float and a UV Vector 2 and will generate an ellipse; if you give it an equal width and height, you'll end up with a circle. The Polygon node uses the same Width, Height and UV inputs, and also adds a Sides input, which defines how many edges the shape has; the result will be a regular polygon that's been stretched if the width and height are different. And finally, the Rounded Polygon node has the same inputs as Polygon, plus a Roundness float option which acts like the Radius option on Rounded Rectangle. These four utility nodes are for miscellaneous things, but as it turns out, three of them are extremely powerful nodes which can transform the way your graph fundamentally works... plus the Preview node. The Preview node takes in a vector input and outputs precisely the same thing. The reason for using this node is that it displays what your shader looks like at this point, so it's extremely useful for visually debugging your shaders. In previous versions of Shader Graph, which didn't feature redirect nodes (which you can add by double-clicking an edge), Preview nodes had a secondary use for redirecting edges in particularly messy graphs. Whenever you drag a Keyword node onto the graph (these are based on whatever keyword properties you've added), it will have a number of inputs and a single output. Depending on the value of the keyword defined on the material in the Inspector, a Keyword node will pick whatever was input to the corresponding keyword option. For example, if we use a Boolean keyword, we can connect a range of nodes to both the On and Off inputs, and the output is chosen based on the value of the keyword. A Sub Graph is a separate kind of shader graph that we can create. They have their own output nodes, which we can add outputs to, and when we add properties to a sub graph, they become the inputs to the graph; then we can create nodes in the usual way on this graph. Once we've created the sub graph, we can
search for it in our main graph and use it like any other node. The properties of the sub graph appear as the inputs on the left, and the outputs from inside the sub graph appear as the outputs on the right of the node. A Custom Function node lets us write custom shader code to run inside the node. I won't go into too much detail here, because this node is probably one of the most complicated and bespoke of them all, but if we click on the node settings, we can define a list of inputs and outputs of whatever types we like, and then we can attach shader code files or write code directly in the settings window. That custom code is written in HLSL, and we can write the name of the specific function from the file to use for this node. And as a palate cleanser, we can deal with some Boolean logic nodes. The And node takes two Boolean values, which can be true or false, one or zero; if they are both true, or one, then this node returns true, else the node returns false. The Or node also takes two Boolean inputs; if either or both of them is true, then the node outputs true, else it outputs false. The Not node takes a single input and returns the opposite value; in other words, if true is input, false is output. The Nand node is equivalent to doing And, then passing the result into a Not node: if both inputs are true, the output is false, else the output is true. The All node takes in a vector of values; if every element is non-zero, the output of the node is true. On the other hand, the Any node also takes in a vector and returns true if any of the input elements are non-zero. The Comparison node is used to compare the values of two float inputs; based on the comparison operator chosen from the dropdown in the middle of the node, a Boolean value is output. Those operations are Equal, Not Equal, Less, Less Or Equal, Greater, and Greater Or Equal. For instance, if the two inputs are 7 and 5 and your operation is Greater, then the output is true. The Branch node can be used to make decisions in your shader, similar to an if
statement in C#. If the input predicate is true, this node takes the value of whatever is plugged into the True input; otherwise, it outputs whatever's in the False input. Beware that both sides will be fully calculated and the invalid branch is discarded, so it's not a good idea to have a huge node tree plugged into both True and False; if possible, move this check as early on in the graph as you can. The Is NaN node: NaN is short for "not a number". In floating-point arithmetic, NaN is a special value representing an invalid number, and this node returns true if the input float is NaN and false otherwise. Similarly, infinity is a special value that floating points can take; the Is Infinite node returns true if the input is infinite. A mesh defines whether faces are front-facing or back-facing based on the winding order of its vertices, meaning the order the vertices are listed in the mesh data. The Is Front Face node will always return true unless the Two Sided option is ticked in the Graph Settings, but when it is ticked, we can decide to change the behavior of the shader based on the facing direction of the mesh. And that's every node covered! Shader Graph is an amazing visual tool for building shaders, and while it doesn't yet cover every use case for shaders (most notably, it's missing support for tessellation and geometry shaders, as well as stencils), the sheer number of nodes included out of the box makes it a fantastic inclusion in Unity. This video took a long time to put together, so thanks for watching. If you enjoyed this or learned something, don't forget to subscribe and hit the bell for notifications, and all the standard YouTube stuff, and check out my Patreon; there's a bunch of goodies up for grabs for subscribers, and my biggest supporters are on screen right now. Until next time, have fun making shaders!
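To round off the Boolean and branching nodes described above, here is a small Python sketch of Branch, Is NaN and Is Infinite. Python is used purely for illustration (real Shader Graph output is HLSL), and remember that on the GPU both Branch inputs are evaluated even though only one is kept:

```python
import math

def branch(predicate, true_value, false_value):
    # Branch node: select between two inputs based on a Boolean predicate.
    # In the generated shader BOTH subtrees are computed and the unused
    # one is discarded, so keep both sides cheap.
    return true_value if predicate else false_value

def is_nan(x):
    # Is NaN node: true only for the special "not a number" value
    return math.isnan(x)

def is_infinite(x):
    # Is Infinite node: true for positive or negative infinity
    return math.isinf(x)

print(branch(7 > 5, 1.0, 0.0))    # 1.0 (Comparison: 7 Greater 5 is true)
print(is_nan(float("nan")))       # True
print(is_infinite(float("inf")))  # True
```

Note that NaN compares unequal to everything, including itself, which is why a dedicated Is NaN check exists instead of an equality comparison.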
Info
Channel: Daniel Ilett
Views: 128,041
Keywords: unity, shader graph, unity3d, daniel ilett, unity shader graph, every shader graph node, unity shader graph tutorial, unity shader graph all nodes, unity 3d, unity 3d tutorial, game development, shader, all nodes explained, all shader graph nodes, daniel ilett shader, shader graph nodes, unity shader tutorial, shader tutorial, unity shader, how to make shaders, unity beginner tutorial, urp, game dev, shader graph tutorial, universal render pipeline, shadergraph, unity urp
Id: 84A1FcQt9v4
Length: 81min 31sec (4891 seconds)
Published: Tue Apr 20 2021