Shaders in UE4 | Live Training | Unreal Engine Livestream

Video Statistics and Information

Captions
Hi, and welcome to the Unreal Engine news and community spotlight. Last Thursday we had the 2017 Game Awards, and we were thrilled with the showing from Unreal Engine developers and developers across the board; a number of amazing games and titles were announced and took home some nice prizes. We want to give a huge shout-out to the Ninja Theory team for Hellblade: Senua's Sacrifice — they took home Best Audio, Best Performance, and Games for Impact, and we're super proud of that team. We also had a number of other fantastic nominees, including Farpoint and our very own Fortnite. We're thrilled for the PUBG team, who took home Best Multiplayer, and Rocket League was nominated in the esports category. We have a whole recap up on the blog, so if you're interested in the presence of Unreal Engine at the Game Awards, I recommend checking that out.

Additionally, this past weekend the PlayStation Experience happened in Anaheim, and there were a number of incredible games in attendance there as well. Days Gone had an amazing booth with a giant bear — there's a lovely picture on the blog if you want to take a look. Concrete Genie was present with ten minutes of gameplay you could check out, Tiny Metal was showing, and there were a number of other really fantastic games, so I recommend checking out that recap too.

This week we also have our weekly karma earners. Again, these folks are answering questions on AnswerHub and helping out their fellow community members, and we want to give a huge shout-out to 16-bit Gen, Everynone, BR Marco, ShadowRiver, The Hiva, Ninjin, Northstar, Syrious, and Curse Zero — you're amazing and you do so much for your community.

Our first community spotlight this week is a project called Sky Market. This was done by a student as a nine-week project for their environment class, based on a concept called Tent City by Kevin Jick, and I thought he did a really wonderful job on the scene. I especially love showcasing student artwork, so a huge shout-out to Michael Manfredy — great job on your project.

Our second spotlight this week is The Long Way Home, in development by Nifty Llama Games, which is just the best name for a game studio. It's a heartfelt story about Frank, an old man who's suffering from amnesia; you go through the game as Frank, rediscovering your identity and your past, and it's very much about the storytelling and the experience of going through that. It's a small team and I really love their art style — it's light-hearted visually, with a very heavy character- and story-driven core. This is just early alpha footage, so I can't wait to see the game further along in development, but I wanted to give them a huge shout-out; it's something I want to keep tabs on, and I think you should too.

And our third community spotlight is a game called Small. It's a survival game where you're shrunk down to the size of tiny little animals, so you have to deal with the effects of being this shrunken character: an ecosystem with floods and winds, a different take on the survival genre. You have to deal with all these giant animals, protect yourself from raindrops, and it has up to eight-player co-op support, so you can play with your friends and take on the world together.
I really love its wonderful, playful style, but it's going to be a little harrowing because you're dealing with the world at large as this itty-bitty person. I'm really excited to see how it goes; they've got an Indiegogo campaign going on right now if you want to help support them. Great job, and thank you for joining us for our news and community spotlight.

Hi, and welcome back to the Unreal Engine livestream. I'm your host, Amanda Bott, and today I have the lovely wizard with me — a lot of folks are really excited to have you back to talk about materials, so thank you for joining us. My pleasure; it's always nice getting that last-minute "can you do this week's stream?" Sure, sure — tap into the knowledge that you've stored away, or the thing I was able to come up with in twenty minutes right before. Yeah, I can do that.

So what I'm going to talk about today: when Amanda asked me if I would do the stream, I looked at what I'd been playing around with lately, and really for the last year or two I've been doing a lot of stuff with shaders and how they relate to blueprints, particles, and things like that. What I created is just something to play with — an effect I'd wanted to do, and this gave me the opportunity to play around with it. For those of you familiar with the Lord of the Rings movies and books, there's the palantír: this marble sphere, kind of onyx, dark colors, with flames inside, where the Eye of Sauron opens and all that. Rather than going quite that complex and giving it all that state and behavior, I just wanted an eyeball that you could put inside this kind of flaming sphere and customize a bit.

So what I've got is this very reflective, glassy, wet-looking sphere with an internal effect that, no matter what orientation I look at it from, always tries to move toward the top of the sphere. I'll break down how I built the shader, but it has broken-out properties for things like fading it down so you get that deep Fresnel appearance, and changing the size of the iris and the pupil so you can give it that looking-at-you-from-any-position behavior. Then there are things like the amount of distortion applied to the fire itself, so it starts to warp around and take on an odd, chaotic behavior, along with changing the power and so on. One nice thing: in playing with the eye, I realized I wanted the pupil to distort the burning around it, so when the pupil opens and the iris thins a little, it actually expands the flame pattern around it and you get some consistency in the behavior. And of course I can change the amount of normal that gets applied, both to the individual elements — the surface is glass, and it's modifying that pupil and iris — or zero it out completely, so that no matter what the surface is doing, my eye is always very clean.

So I'm going to jump into the shader itself and break it down a little bit. By default, until I apply a lot of those material instance settings, the parameters are at their defaults — you can kind of see where it's going, but it doesn't really look like what it's going to end up as. I'm going to maximize this; I've commented quickly on how the material is laid out, and a lot of it relies on
different methods of creating a falloff shape — whether that's doing a dot product to get the shape of the iris, or doing the falloff for the color and the interior heat. I find there are two different ways I typically do this. One is by manually doing the dot product, which I'm doing in a couple of different places, and the other is to use a sphere mask. The sphere mask — and I've talked about this in other streams — essentially, given two vectors, gives you a falloff based on where those vectors overlap. We use it a lot with simple texture coordinates, so you can say: at the halfway point of the texture coordinates, the center of the texture, I want a round falloff from that center point. That's roughly what you're seeing here: I'm taking the camera vector — the direction I'm looking at the surface from — doing the dot product, and getting a sphere mask that relies on the halfway point right in the center of the sphere. So wherever I'm looking at the sphere (and this can bring the normals in a bit as well), the sphere is essentially looking right back, and that's how I define the iris. To get the pupil, I'm doing the exact same math but with a smaller radius, and then subtracting that from the iris, so the combination of those two things gives me my iris with the pupil in the center.

I've parameterized some of this, as you saw, so I can control the final look. I find that in building complex materials, it's pretty rare that the material in the material editor looks like what it will be on the surface, because a lot of the tuning — to look right under lighting conditions, in the right place in a level, and especially for special-effect things like irises that can open and close — is something you want to drive after you're done building the core material, and that's what I did here.
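For reference, here is a rough C++ sketch of the falloff math being described — not the engine's actual SphereMask node, just an approximation of the same idea: the dot product between the camera vector and the surface normal pushed through a radius/hardness falloff, with the pupil built from the same math at a smaller radius and subtracted from the iris. All parameter names are made up for illustration.

```cpp
#include <algorithm>
#include <cmath>

struct Vec3 { float X, Y, Z; };

static float Dot(const Vec3& A, const Vec3& B)
{
    return A.X * B.X + A.Y * B.Y + A.Z * B.Z;
}

// Approximation of a sphere-mask style falloff: 1 near the mask centre,
// fading to 0 at Radius, with Hardness (0..1) tightening the edge.
static float SphereMaskApprox(float Distance, float Radius, float Hardness)
{
    const float Norm = Distance / std::max(Radius, 1e-5f);
    const float Inv  = 1.0f / std::max(1.0f - Hardness, 1e-5f);
    return std::clamp((1.0f - Norm) * Inv, 0.0f, 1.0f);
}

// Iris/pupil masks from the camera vector and the surface normal:
// wherever the surface faces the camera, the eye "looks back".
struct EyeMasks { float Iris; float Pupil; float Ring; };

EyeMasks BuildEyeMasks(const Vec3& CameraVector, const Vec3& SurfaceNormal,
                       float IrisRadius, float PupilRatio, float Hardness)
{
    // Dot product of the two (assumed normalised) vectors: 1 where the
    // surface points straight at the camera, falling away from there.
    const float Facing   = std::clamp(Dot(CameraVector, SurfaceNormal), 0.0f, 1.0f);
    const float Distance = 1.0f - Facing; // "distance" from the look-at centre

    EyeMasks Out;
    Out.Iris  = SphereMaskApprox(Distance, IrisRadius, Hardness);
    Out.Pupil = SphereMaskApprox(Distance, IrisRadius * PupilRatio, Hardness);
    Out.Ring  = Out.Iris - Out.Pupil; // iris with the pupil cut out
    return Out;
}
```

The same two calls with different radii is all the "same math, smaller radius, then subtract" step amounts to.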
For the fire itself, I'm taking the camera vector — just one of its channels, and it's parameterized so I can play around with it — and this is what gives me the fire effect. Essentially I'm projecting the texture into camera space, so it always moves up toward the top of the camera view, and then I add a little bit of panning to get it to move along that vector. This may not look like much — you can't really see what's going on here — but it's basically projecting the texture to tile this way and also move vertically, which is why there's a line right down the center. In parallel with that, I'm also displacing it so I start getting that warping flame effect; that's partly based on the camera vector as well, but it's also being modified by the normals themselves, and this is where the pupil displacement comes in. This sphere mask gives me a radius that's slightly larger than the regular pupil, but I'm using the same parameters as inputs, so as I increase or decrease the pupil size you see a little increase to that radius — this displacement pupil is always slightly bigger than the other one. I could do this as a ratio, but it was easier to just do it as a flat 0.1 add. That then gets added into the distortion mapping, and you can start to see it here: as the distortion happens, this is the shape that changes the UV coordinates.

I've got a texture object called the flame texture — just a texture that was in the engine content folder — and that's being fed into a function called Motion_4WayChaos. This is actually included in the engine, but it isn't flagged to show in the function library, so if you want to find it, the easiest way is to go into your content browser, go to engine content, and type "chaos" — it'll show up a little quicker if you have your filter set to material functions. If you drag Motion_4WayChaos in, it'll add that function to your shader graph. What it does — if I preview it and get rid of the incoming coordinates so you can see — is take the texture object that's been applied and chaotically move it in four diagonal directions. By "chaotically" I mean it adds a random offset to each layer so the tiling doesn't line up. If you do this without the chaos, then when the loop time comes around and all the UV coordinates happen to line back up, you'll actually see it: the texture suddenly converges and becomes really strong in one spot, then drifts apart again. To eliminate that occasional tiling overlap, the function just adds some value to the texture coordinates for each layer. Feeding this in does all of that to the incoming texture coordinates, and that's where I start to get this 3D feel where it projects with some depth — it looks nice on a sphere, like it's always following me. The distortion you can see comes from the UV coordinates of the sphere not storing data in a way that lets the camera projection work quite correctly; that's a totally normal thing. I could fix it by doing a lerp between two separate coordinate spaces to find the top and the bottom of the sphere, but it wasn't worth it for this particular thing.

Once I've got that chaotic but always-upward motion done, I have some math to let me adjust the contrast and brightness, because I'm using it as a lerp between these two colors. This gives me my primary and secondary colors, which you can see here, and then there's a little more shader math to adjust that.

Now, the darkening here — I mentioned there are two ways I typically do a falloff, and the sphere mask is one of them. The sphere mask works really well for a procedural falloff where you can control the core and the falloff. This one is much more of an "I have a single point in space and I want to fall off from it." If you take the camera vector, transform it back into tangent space (because it starts off in world space), and then mask just the blue channel, you get this nice Fresnel falloff without actually using the Fresnel node — it typically saves an instruction, and it's just one of those things I've found works better for me in the way I work. I have controls for changing the power of this, so if I increase or decrease it you can see my falloff change. This is used just to modulate the fire so it doesn't reach all the way to the edge, which gives that glassy, curved look. It gives me the control I need inside the shader to set this falloff, and then it just gets added on top of all the iris and pupil behavior I created before.
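The "Fresnel without the Fresnel node" trick described above boils down to reading how much the surface faces the camera and shaping that with a power. A minimal sketch of the idea — the graph takes the camera vector into tangent space and masks the blue/Z channel, which for an unmodified normal is equivalent to this dot product; `EdgePower` is an illustrative parameter name, not one from the stream.

```cpp
#include <algorithm>
#include <cmath>

// Cheap Fresnel-style edge falloff: FacingAmount is how much the surface
// faces the camera (the Z/blue channel of the camera vector in tangent
// space, i.e. dot(CameraVector, SurfaceNormal) for an unmodified normal).
// Result is 1 facing the camera, falling toward 0 at grazing angles.
float CheapFresnelFalloff(float FacingAmount, float EdgePower)
{
    const float Clamped = std::clamp(FacingAmount, 0.0f, 1.0f);
    return std::pow(Clamped, EdgePower);
}

// As used in the stream's material, this value multiplies the fire mask so
// the effect fades out before the silhouette and the sphere reads as glassy.
```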
One other thing I did was give myself control over the strength of the surface normals — if I didn't, it would be a really strong normal surface. I do that by lerping between (0, 0, 1) as a three-vector, which is basically just the unmodified surface normal of the polygon, and the normal map I'm using, which happens to be something along the lines of a moon unwrap, some kind of planet texture. This one was made for skies, but it worked really well wrapped around the default sphere. This parameter lets me control how strongly those normals are applied to the final surface, which comes in really handy when you want to tune something to look nice instead of excessive. Knowing that I'm feeding this into the normal input means that anywhere else I want to invoke those normals, I can use the PixelNormalWS node: after the normals have been calculated for the surface, it outputs that result for use elsewhere in the shader.

So once I compile, we can go into the instance I have applied and start playing around with these values. If I want to increase or decrease the amount of normal being applied to that flame — let me increase the visibility of the fire again — by increasing the flame normals I'm controlling how much of that is passed into the distortion, so I start getting more of the surface itself distorting the flames, not just the displacement. Any questions so far?

They're wondering whether a shader like this would be reasonable in a game. It really depends on how you use it. There's a lot you can turn on here, because the way I built the shader, I knew I wanted to play around with what the iris could do, so I made a master version of it. Going translucent on a shape like this is not going to be the cheapest thing in the world. The shader itself isn't the kind of thing you'd want to put on characters all the time, but for a special effect where there are only one or two of them in the scene and it's not filling the screen all the time, you don't have to worry much about the pixel cost — it's not that bad. It's 185 instructions in the base pass, 202 with surface lightmaps, and 268 with volumetric lighting, which I'm clearly not using here. There are a few things I could do to optimize it: I'm doing the same shader math two or three times for different falloff effects, and I could probably take that down to one and do a little math to push the results around. I could lock the iris and the pupil to some kind of ratio so you didn't have to expand the iris and then expand the pupil to catch up — things like that. But it does mean my ability to do things like this is maintained: now I'm actually looking right through into the sphere. If I turn off two-sided and the shader recompiles, once it's done I can look right through this eye. So I could end up using this as something like a portal that opens based on my progress through the game, or as some kind of special-case weapon impact. It lets you play around with the utility of the shader long after you've created it. I could easily say, well, the ones that are translucent — sorry, masked — like this, I want those to be a completely different color: set the interior color to be a little more green, and do the same with the iris color. Now I've changed the state on that one sub-instance; it still inherits most of its behavior from the original instance because it's a child of it, but it does let me do interesting things to change the behavior.
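Those same exposed parameters can also be driven from game code rather than the instance editor. Here is a hedged UE4 C++ sketch of creating a dynamic material instance and setting a few scalar/vector parameters at runtime; the parameter names (`IrisSize`, `PupilRatio`, `FlameNormals`, `InteriorColor`) are stand-ins for whatever the material actually exposes.

```cpp
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"

// Create a dynamic instance of the eye material on a mesh component and
// drive its exposed parameters at runtime. Parameter names are illustrative
// and must match the parameters defined in the material.
void SetupEyeMaterial(UStaticMeshComponent* SphereMesh)
{
    if (!SphereMesh)
    {
        return;
    }

    UMaterialInterface* BaseMaterial = SphereMesh->GetMaterial(0);
    UMaterialInstanceDynamic* EyeMID =
        UMaterialInstanceDynamic::Create(BaseMaterial, SphereMesh);
    SphereMesh->SetMaterial(0, EyeMID);

    // Equivalent to dragging the sliders in the material instance editor.
    EyeMID->SetScalarParameterValue(TEXT("IrisSize"), 0.6f);
    EyeMID->SetScalarParameterValue(TEXT("PupilRatio"), 0.9f);
    EyeMID->SetScalarParameterValue(TEXT("FlameNormals"), 0.25f);
    EyeMID->SetVectorParameterValue(TEXT("InteriorColor"),
                                    FLinearColor(0.1f, 0.8f, 0.2f));
}
```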
For the performance question: it's not something I would do on every surface, but it is definitely performant enough to use for special events, special-case things.

They also want to know what to do if they don't want to use camera space, since I'm using that. There are a couple of things you can do other than camera space: you could use the regular Fresnel node instead of the camera-vector transform I showed, and pretty much anywhere I use a specific vector, you can play around with other vectors — the reflection vector, or the surface normals transformed into world space. There's a lot you can play with there. You could also — and I've got this off to the side because I wanted the full shader to be working rather than special-cased — use something like material parameter collections to pass in a point in space that you want this to look at. So instead of just doing my sphere mask, this looks at where the object — the sphere — is in the world and at this other tracked location, and generates a vector between them: given these two points in space, this is the direction I have to move along to get from point A to point B. I can then use that vector to generate a sphere mask, and you end up with something like this, where the vector is just some arbitrary point in space that this is going to look at. If I make it strongly positive in Z with a little bit along X, I'm basically saying: point at this location in space. I could easily take that and feed it with a blueprint — say, drive this parameter to wherever the player is — so now this thing always looks at the player, but it doesn't use camera space to do it. You don't end up with the oddities of camera space, where if I move over here you can see the iris is still looking basically at me even though it should be looking past me: because I'm using the point of view of the currently rendered camera and the surface, it's always looking at the center of my screen, not at where my player's body is. Using the object-tracking logic I've got orphaned off up here, I could update the parameter collection with the player's location, maybe moved up a bit vertically so it's looking roughly at the player's eyes, or if you had some other event you wanted the eye to track, you could do it that way as well. You can play around with this: feed it with a blueprint, have a timeline that adds noise to it so the eye twitches and doesn't look directly at anything, or do something like pupil dilation, where you check how close the player is and the closer you get, the bigger (or smaller) the pupil gets. There are a lot of different ways you can push this once you've got it doing something interesting.
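Driving that tracked look-at point (and something like the distance-based pupil dilation mentioned above) from the game side could look roughly like this in UE4 C++. It's a sketch only, assuming a material parameter collection the shader reads from; the collection and parameter names (`LookAtTarget`, `PupilSize`) and the distance ranges are made up for illustration.

```cpp
#include "Kismet/GameplayStatics.h"
#include "Kismet/KismetMaterialLibrary.h"
#include "Materials/MaterialParameterCollection.h"

// Push the player's location into a material parameter collection (e.g. on
// tick) so the eye material can build its look-at vector from it, and dilate
// the pupil as the player gets closer.
void UpdateEyeTracking(UObject* WorldContext,
                       UMaterialParameterCollection* EyeParams,
                       const FVector& EyeLocation)
{
    APawn* Player = UGameplayStatics::GetPlayerPawn(WorldContext, 0);
    if (!Player || !EyeParams)
    {
        return;
    }

    const FVector PlayerLocation = Player->GetActorLocation();

    // The shader subtracts the sphere's position from this point to get the
    // direction it should look along (the "tracked location" in the stream).
    UKismetMaterialLibrary::SetVectorParameterValue(
        WorldContext, EyeParams, TEXT("LookAtTarget"),
        FLinearColor(PlayerLocation.X, PlayerLocation.Y, PlayerLocation.Z));

    // Pupil-dilation idea: bigger pupil the closer the player gets.
    const float Distance = FVector::Dist(PlayerLocation, EyeLocation);
    const float Dilation = FMath::GetMappedRangeValueClamped(
        FVector2D(100.0f, 1000.0f), FVector2D(1.0f, 0.2f), Distance);
    UKismetMaterialLibrary::SetScalarParameterValue(
        WorldContext, EyeParams, TEXT("PupilSize"), Dilation);
}
```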
I was just playing around with the masked version, and I noticed that if I leave two-sided on — let me see if I can turn this back on — and set the pupil normals to one, I now have the ability to move through the surface. It's still two-sided, but the mask result is succeeding on the inside of the sphere as well as the outside, so now I have this kind of portal hole through the object. I'm inside the sphere right now, looking around, so you can see I could use this for any number of things: I'm inside the sphere and looking out through it, or I'm teleporting through some kind of space portal, whatever. There are a lot of ways you can take that.

Somebody asks what I'd call the standard polygon limit for average devices to hit 60 frames per second, not counting textures. If you're just pushing triangles, you can push tens of millions; the cost depends on how the triangles are used. Are they lit and shadowed? How complex is the material applied? Are we doing anything in real time, like updating transforms? If it's a skeletal asset, the CPU burden is higher because the CPU has to recalculate new positions for every vert based on bone influences and so on. So unfortunately it's not a question you can answer with a single number, because are you talking about Xbox One, PlayStation 4, mobile devices, Nintendo Switch, PC — and what level of PC? Older or newer CPUs, older or newer GPUs? There's a lot that goes into it. We have a large amount of documentation online about how to look at performance and see where your time is going, because it ends up being the sum of a large number of things: lighting, shadowing, game-thread work like AI and the decisions it makes, and so on. Any other questions?

Could I expand on the pixel normal? Okay — PixelNormalWS is the result of everything that goes into the normals modifying the actual normal value for the surface. In the material editor it doesn't show the exact result on the node, because it has to feed everything through again for the pixel normal to be valid. So if I make my normals really strong here, so I've got a really crunchy surface — if I just preview this lerp node, that's what I would expect to see; that's the current state of the normals on the surface. But most of the time I'm going to see this instead, because the material editor doesn't know all of that is done until the final shader is compiled. When I right-click and preview this node, I'm previewing it without the rest of the shader being compiled, so no normals are being fed in. You need the shader fully compiled to see the result, but essentially it allows me to bring the current normals for the surface into other places in my shader.

Somebody asks how you would combine this with a regular diffuse texture — as in, just have this effect on a certain part of a character. That wouldn't be too hard. You could use material layering — you'd have to reconfigure this a little to go through material attributes. Really, for most of it, it's just going into the emissive; I've got a simple black going into the base color, and it's set to non-metallic just so I don't have any unusual lighting conflicting with my emissive channel. You could use this as a layer, or just lerp between textures. Let me find a good example — there we go, just a texture grid, black and white squares, nothing fancy. I could easily use this to lerp between zero and my normal emissive, feed it into the emissive input, and now I'll have half and half: half an eye and half just a black texture. So you could easily use this to lerp, or even additively modify an existing shader, or something like that. Other questions? I can catch the ones at the bottom, but a lot of them
don't show up there. They're wondering if I can show the shader complexity view in the editor viewport, so you can see where the mask is removing pixels to allow the surface to render the other side. It's a little more expensive — it's not the world's cheapest shader, but it's not horrible, and we're not talking about real translucency or anything like that. One of the reasons I did it as an opaque surface, with that Fresnel to darken the edges, was so I could get the look of a translucent glass sphere without actually having to render translucency. If I modify the material instance and start playing with the center-out falloff, you can see I can get a pretty glassy-looking sphere just by reducing the distance it can extend from the center, so it starts to look more like there's a curvature falloff to the effect instead of just where it lands on the screen.

Is it possible to make shaders with GLSL, and is the performance as good as the node-based shaders? If you switch to OpenGL because of the platform you're on, then yes — the shader compiler handles that for you. There's a different set of trade-offs, because OpenGL uses a different shading language than Direct3D, so there are different strengths and weaknesses in both; I don't know enough about the heart of that to tell you which is going to be better. What you'll see on projects like Unreal Tournament that straddle that quite a bit is what are called feature-level switches, and that looks like this: I know that if I'm running on OpenGL it'll be ES2 or ES3.1, and I can say the shader should be compiled using this set of nodes or this set of features based on the platform — mobile, PC, OpenGL, Direct3D, things like that. If you use this intelligently, your shaders will just work on most of the platforms you use them with. Any other questions?

What is the gain of speed compared to matrices? I have no idea — I'm not even sure where to start with that.

Were you going to walk through setting these up? So I'm going to — it's not going to get to this point by the time I'm done, but if I were going to recreate it, just starting off, what I did was build the eye — the pupil and iris — first. I know I'm going to use a sphere mask and the surface normal. This is an inevitable thing: there are three different ways of doing something, so how do you start? It's like a lot of other things where there's a lot of complexity to the system and I could come at it in a number of different ways — just picking one and going with it seems to be the best approach, then refining later if I need to. For example, when I was building this shader it started off as basically just a couple of fragments, and then over time it grew; none of it was documented, so I went through and did successive passes to reduce it as much as possible. I originally wasn't using the pixel normal for anything, for instance, so all of this slowly became a little different. So let's say we'll do a sphere mask between the surface normal and the camera vector — I'm transforming the surface normal into world space so that both things match — and there's the beginning of our eye. In setting the pupil size, I'm basically saying what percentage of the sphere I want this dot in the center to consume. Right at one, it takes up a lot of the sphere;
it doesn't fill it completely, because there is some falloff. If I push it up, you'll see that where I'm looking almost directly at the surface it fills completely, but since the sphere curves away, some of it isn't pointing close enough to me — which is totally fine. So I could then do this as a ratio — we'll call it the iris ratio and say it's 0.9, so the pupil will be nine-tenths the size of the iris (this is the iris size, not the pupil). We'll do the same thing — I'm literally recreating the exact same math here — but now my pupil is always a percentage of the iris. We subtract, make this emissive, and compile it. This is where I start using material instances to make sure my assumptions about how things are going to work are correct: I can say the iris is this big, and I can see my ratio is working correctly, so if I want to maximize it and have an extremely dilated eyeball I can do that, or I can set it to a small percentage and get a very fine ring. You'll notice that because the math results are procedural, everything is extremely crisp — I don't have any compression artifacts or anything like that. That's one of the reasons I like using sphere masks to build a lot of the round gradients we tend to use pretty frequently: it gives me that extremely clean result. Now, I don't want the iris to have a completely sharp edge like that, so I'll bring the hardness down to 75%, and now I start getting that falloff. If I do the same thing inside the pupil, I get that softer falloff ring, and of course I can change my ratio. So now I've got this nice adjustable eye that I can play around with.

This is where we start doing the fire stuff. Find a good texture for this — we'll use something a little different, this low-res blurry one — and we'll call it the fire texture. Now again I need to find that chaos motion function; I could rebuild it, but why bother. So here we go, and now I can connect these up and preview. I'm not sure how well it's coming across on the screen, but there's a gentle flow of smoky noise going on here, and this again is where I start adding shader math that lets me adjust things on the fly. So now we have these two things that will eventually become my mask for the fire. Increase the power significantly, and you can start to see that kind of plasma-like flow. I could keep these in lockstep and basically just increase the contrast by doing this, but I prefer to have independent control: there are plenty of times where you want to bring the power way up so you start to get just little dots, and then use the multiply to bring it back, which gives you more control to do things that wouldn't be terribly easy right off the bat. Now that we've got this, I can use it as my lerp. A lot of times I'll clamp this, because you can get some odd results out of lerps that go above or below the normally expected range, but for this I'm pretty sure I'm not going to go too far.
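The contrast/power adjustment and the two-color blend being built here are just a pow, a multiply, a clamp, and a lerp. A small sketch of that math, with the parameter names invented for illustration:

```cpp
#include <algorithm>
#include <cmath>

struct Color { float R, G, B; };

static Color Lerp(const Color& A, const Color& B, float T)
{
    return { A.R + (B.R - A.R) * T,
             A.G + (B.G - A.G) * T,
             A.B + (B.B - A.B) * T };
}

// Shape the scrolling noise into a fire mask, then blend two colours with it.
// FirePower pushes the noise toward isolated hot spots; FireBoost multiplies
// the result back up; the clamp keeps the lerp in its expected 0..1 range.
Color ShadeFire(float NoiseSample, float FirePower, float FireBoost,
                const Color& PrimaryColor, const Color& SecondaryColor)
{
    float Mask = std::pow(std::max(NoiseSample, 0.0f), FirePower) * FireBoost;
    Mask = std::clamp(Mask, 0.0f, 1.0f);
    return Lerp(PrimaryColor, SecondaryColor, Mask);
}
```

Keeping the power and multiply as separate controls mirrors the "independent control" point above: a high power isolates hot spots, and the multiply decides how bright they end up.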
There have been a few questions about the chaos node — sure, I'll open it up in a second, but basically what it does is take four copies of the same input texture, offset them randomly in some direction, and pan each of them in a different diagonal direction. It then reduces the contribution of each one by a quarter, so together they sum to a possible value of one — that's where the divisor comes from. Opening it up, it looks like this: I've got my input coordinates and my input speed; here's the texture input, and here are the texture samplers being fed from that input. So you give it a texture, it feeds it into these four nodes, your coordinates get fed in, and an offset gets applied to three out of the four (the fourth one doesn't need it). Each then goes into a panner, and each of these panners goes in a different diagonal direction at the same rate: this one is going 0.1 in X and Y, this one negative 0.1 and positive 0.1, then positive 0.1 and negative 0.1, then negative 0.1 in both — so you basically get all four diagonal directions. Add them all together, multiply by the divisor — 0.25 — which takes what could possibly be a total value of four and reduces it to a potential maximum of one, and then we output the result. If the divisor weren't there it would be very blown out; with it, you end up with something that looks like it has the same value range as the original texture in RGB, but with that motion to it.
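As a reference for what the function is doing internally, here is a rough CPU-side sketch of the same idea: four copies of one texture, each given a fixed offset so their tiling never lines up, each panned along a different diagonal, then averaged back to the original value range. It's an approximation of Motion_4WayChaos as described in the stream, not the engine function itself, and the texture fetch is a procedural stand-in so the sketch is self-contained.

```cpp
#include <array>
#include <cmath>

struct UV { float U, V; };

// Stand-in for a tiling greyscale texture fetch; here just a cheap
// procedural pattern so the sketch compiles and runs on its own.
static float SampleTexture(const UV& C)
{
    return 0.5f + 0.5f * std::sin(C.U * 6.2831853f) * std::cos(C.V * 6.2831853f);
}

// Four copies of the same texture, each offset so their tiling phases never
// converge, each panned along a different diagonal at the same speed, then
// summed and scaled by 0.25 so the result stays in the source value range.
float FourWayChaos(UV Coord, float Time, float Speed)
{
    const std::array<UV, 4> Directions = {{
        { 0.1f,  0.1f}, {-0.1f,  0.1f}, { 0.1f, -0.1f}, {-0.1f, -0.1f}
    }};
    // Arbitrary fixed offsets per layer (the "chaos") so the layers don't
    // line back up at the loop point.
    const std::array<UV, 4> Offsets = {{
        {0.0f, 0.0f}, {0.37f, 0.11f}, {0.73f, 0.59f}, {0.19f, 0.83f}
    }};

    float Sum = 0.0f;
    for (int i = 0; i < 4; ++i)
    {
        const UV Sample = {
            Coord.U + Offsets[i].U + Directions[i].U * Speed * Time,
            Coord.V + Offsets[i].V + Directions[i].V * Speed * Time };
        Sum += SampleTexture(Sample);
    }
    return Sum * 0.25f; // the divisor: maximum possible 4.0 brought back to 1.0
}
```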
So then we just put it in here, feed in the appropriate values, set up our fire colors, and lerp between them, and then I can use the result of this to lerp between the fire and the iris. Right now I've got one for the iris — I can make this a color — sorry, pupil, not iris; I promise I actually do know the difference between the two. What I think I'm going to do is use just the iris and then subtract this — maybe — there we go. Now I've got an iris that's contractible, I can expand it like that, and my fire is there, though adjusting the tiling would help a lot, as would using two different colors. Back in the shader, if I wanted to, I could take the texture coordinate, multiply it by another scalar for a fire tile parameter, and pass that into my material function; back in my instance I can reduce that tiling, and now I can actually see my flame effect a little bit. Then, if I really wanted to, I could do that camera vector transform and mask and call it the fall-off amount — it would help if I typed even remotely correctly — and punt. Good, I was hoping I'd fail at English and typing again today. This should give me a falloff I can use to modulate my fire before it goes into the iris calculation — just the blue channel, not the red and green. And we'll go back here — we've got something wrong. Oh, I was transforming it from world space to tangent space, which doesn't work if you leave the other input in the wrong space. There we go. So now I have a falloff I can apply to get that glassy sphere look; if I set my roughness to the inverse of that, it should just work. There we go — now I have the glassy sphere, and I can modify that fieriness and play around with it. Again, it's not exactly the same as the other one I built — I haven't done all of the complex stuff I fed into the four-way chaos you see in the original, which was just there to reproject the fire texture into camera space instead of leaving it wrapped around the sphere like this one is doing. But you'll notice that my roughness — my surface reflection — dies around the iris, so you're not distracted by it. If I wanted to, I could leave it just like this, and you'd end up with more of a cow-eye look. Well, a flaming cow eye — the only really good kind. So there we go; and look, there's my office, actually, up there somewhere.

Questions, since I've been looking at this... Let's see. They're wondering whether there's a way to have an emissive material on a skeletal mesh cast regular shadows. A regular emissive material, as long as it isn't set to translucent, shouldn't be a problem there. Now, on skeletal meshes it's not going to emit light; static geometry can cast emissive light if you turn it on in Lightmass and bake it that way, but skeletal geometry doesn't do that. But there's no reason you couldn't have this — I already have it casting a shadow, and this would work exactly the same on a skeletal surface.

If you have multiple UVs, is there a way to create a parameter to control which UV set the texture maps to? Sure. Which UV set you use is defined inside the texture coordinate node, and there are a couple of ways to approach this. Most of the time I would use a static switch parameter and set something up like this, where one texture coordinate node is set to UV channel one, the other to zero, and I have a switch that says "use alternate UVs" — the state of that switch then determines which UV channel is used. I prefer static switches because in a lot of cases different UV sets don't behave at all the same if you try to lerp between them. If I look at UV channel 2 on this sphere, that's what it looks like; channel 1 looks like this. Lerping between these in real time would smear pixels and move them all over the place. We very frequently have a second UV set that's planar-mapped so we can pan things across a character or wrap them in a shield belt or something like that, so we do store multiple UV sets for a lot of our characters for specific uses, but usually the material explicitly knows which UV set a given part of it is going to use, and we don't end up having to parameterize that too much. We sometimes do for more complex projects, because you may not know how your material will be used in a special case, so we may add switches like this. Paragon is a good example: there are a ton of different characters, a lot of them use the same behavior, and we just turn certain things on and off to accommodate how it works on a particular character. Just sit here and dance, yeah, okay.

How would you go about distorting the sphere mask to look more oval, say for a cat eye? It depends on where you feed it in; there are a couple of different things you could do. Just using the sphere normals like this wouldn't work very easily, because everything is based off the world projection of vectors. However, there are ways you can adjust things. For example, I have a three-vector coordinate here; if I feed this in, I'm adjusting the sphere mask inputs beforehand, and I can then start modifying these and getting some strange results. Again, you can see that because this changes the entire basis for the vectors coming off the surface, it doesn't quite work right — it no longer tracks me correctly and does strange things when I move up and down — so this is probably not the best solution.
If you want that kind of cat-eye look, I would more likely have a clamped pupil mask that I scale to and from its center point and use explicitly, maybe using the camera vector as the initial texture coordinates for placing it correctly, rather than trying to use shader math to adjust this one on the fly. That being said, there's plenty you can do to add some chaos and behavior. For example, if I find another texture — it doesn't really matter what it is — and use it to lerp between, say, two iris ratios, I'm taking that ratio and making another ratio out of it. Now, a little less subtle, and looking at the right thing: by adding that bit of lerp noise, I'm not changing the calculation, I'm just changing the ratio between the iris and the pupil based on wherever this texture falls. The darker areas are more likely to get the original iris ratio, the brighter areas are more likely to get the further-reduced ratio. It's one of those things that lets you add some chaos and interesting behavior to what would otherwise be a very uniform result. And of course I could do the same thing with my original sphere mask — exactly the same setup — so now it's not quite the same ratio, and there's a little difference in how the distortion is applied. But since I'm not doing anything unusual with the texture coordinates, the proportions are different while the distortion is the same: anywhere it starts to buckle on the pupil, it's also doing that on the iris.

Any other — I guess last — questions, depending on how we're doing on time? Let's see: is there a way to encode normal maps as bump maps and then transform them? There are a couple of different things you can do. If you look at, oddly enough, the same chaos filter results, there's also a normal-from-heightmap variant which, given a texture, will generate normals from the bump of that texture — which is what's being done here — and outputs them as a normal map. There are a couple of other material functions included for doing heightmap-type work; if I filter for material functions, yeah, here's NormalFromHeightmap, given the default texture, which is not very high-res. This works much better with grayscale height maps than with RGB, because RGB doesn't actually encode the height that well — it typically has lighting and things baked in — but with a real bump map you can extract normal data from it and pass it along. Hopefully that's what you were asking and that's useful information.
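For context, deriving a normal from a greyscale height map generally comes down to central differences on the height plus a strength factor, then normalising. The sketch below shows that generic technique; it is not the engine's NormalFromHeightmap function, and it assumes a simple float height array for illustration.

```cpp
#include <cmath>
#include <vector>

struct Normal { float X, Y, Z; };

// Derive a tangent-space normal from a tiling greyscale height map using
// central differences. Strength scales how pronounced the bumps appear.
Normal NormalFromHeight(const std::vector<float>& Height,
                        int Width, int Index, float Strength)
{
    const int X = Index % Width;
    const int Y = Index / Width;
    const int H = static_cast<int>(Height.size()) / Width;

    auto At = [&](int PX, int PY) {
        PX = (PX + Width) % Width;   // wrap so the map tiles
        PY = (PY + H) % H;
        return Height[PY * Width + PX];
    };

    // Slope along X and Y from neighbouring texels.
    const float DX = (At(X + 1, Y) - At(X - 1, Y)) * Strength;
    const float DY = (At(X, Y + 1) - At(X, Y - 1)) * Strength;

    // Normalised normal built from the two slopes.
    const float Len = std::sqrt(DX * DX + DY * DY + 1.0f);
    return { -DX / Len, -DY / Len, 1.0f / Len };
}
```

This is also why grayscale height data works better than an RGB photo: the differences here only mean something if the channel really encodes height.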
Maybe this isn't so much related to shaders, but they're asking whether the tessellation of a material can be taken into account as collision. There's no way: all of the tessellation and displacement is done on the GPU, so we no longer have that data on the physics side — or at least we're not updating the physics data to reflect what you do on the GPU. Currently, to the best of my knowledge, there's no way for us to displace triangles and have them collide, unless something like splines is specifically designed to use a vertex shader and approximate the collision — and that's because the feature has been implemented inside that particular system. By default, vertex shaders and GPU-based tessellation are completely render-side only. I was going to say, it really doesn't affect me one way or the other whether there's collision on vertex offset, but it is actually important for some people; for my demo purposes here it's not being used. I can't actually make this thing movable and simulate physics — there goes my palantír, rolling away.

There were some questions in the forums — and we're getting a little low on time — about bevel shaders in general, beveling within the material. I haven't done anything there. Over the last couple of years a lot more attention has been paid to how surface normals are encoded and stored on meshes to help with edge and face weighting, but most of that is done to the normals in the mesh, in the modeling package, before it gets into the engine, so the little I've done along those lines has been outside of UE4.

For the last questions: someone is asking — they've heard that having a texture sample with an alpha channel is equivalent to having another texture sample — when is it appropriate to have a combined texture with an alpha? What we end up doing a lot is packing textures. Let me see if I can find a good example... this is a packed texture: if I look at the individual channels, they're all storing something different from each other. For some reason the alpha isn't used here, but there are plenty of times where we will use the alpha as well and pack four different masks into a single texture. One way to see your approximate cost: for a texture like this that has an alpha channel, up here it shows you information about the texture — what size it is, what it's currently being displayed at, what the maximum size is, and the resource size as you change compression schemes. If I go to grayscale — ah, there we go — different compression schemes understand how to use the channels differently; some of them discard certain channels, some store them but don't always use them, because that's how the compression scheme works. It's pretty easy to see that if you import a texture with or without an alpha that isn't doing anything, the resource size will change — it's not just about which channels you have, it's also about how you compress them and how you use them. A lot of it is up to the needs of the surface. For a lot of our mask textures, if it's for a single asset — like this chair — and you need another channel, you pack it into the alpha and that's fine. If it's something where — and this happens a lot in effects — we have fire in one channel, lightning in another, a water splash in the third, and then we realize we need one more, it may be worth putting one more mask in the alpha. Each channel has a fixed amount of data that will be stored; adding one more channel adds that much data, not the same amount as another whole RGB texture. It may change how the texture can be compressed depending on the texture settings, but it's not true that adding an alpha is like adding an entire other texture — it's like adding one more channel.
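The channel packing being described is simply writing four independent greyscale masks into the R, G, B, and A channels of one image before import. A trivial offline sketch of the idea, assuming four equally sized 8-bit mask buffers (names are illustrative):

```cpp
#include <cstdint>
#include <vector>

// Pack four greyscale masks (e.g. fire, lightning, splash, extra) into one
// interleaved RGBA buffer, one mask per channel. Each channel adds its own
// fixed amount of data; how the packed texture compresses afterwards depends
// on the compression settings chosen in the engine.
std::vector<uint8_t> PackMasksRGBA(const std::vector<uint8_t>& MaskR,
                                   const std::vector<uint8_t>& MaskG,
                                   const std::vector<uint8_t>& MaskB,
                                   const std::vector<uint8_t>& MaskA)
{
    std::vector<uint8_t> Packed(MaskR.size() * 4);
    for (size_t i = 0; i < MaskR.size(); ++i)
    {
        Packed[i * 4 + 0] = MaskR[i];
        Packed[i * 4 + 1] = MaskG[i];
        Packed[i * 4 + 2] = MaskB[i];
        Packed[i * 4 + 3] = MaskA[i];
    }
    return Packed;
}
```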
Do we foresee the material editor getting spruced up with organizational options, reroute nodes, and so on? It seems to be lagging behind Blueprint improvements significantly. It does have reroute nodes. The core of the material editor is a different feature implementation than Blueprint, and while there are a lot of things we'd like to bring over from Blueprint, it's not as simple as just dropping the Blueprint editor's feature into the material editor, because they have different core concepts in how they display nodes, how they display lines, things like that. We do have a wish list of things that, given time and priority, we'd love to add — the tools team has a much better idea of what's on their plate, and I don't know whether it's on their active schedule or when they'd approach it — but there's an internal list of things we'd like to be able to do, like collapsing groups of nodes into aggregate nodes the way Blueprint does, or defining variables and then using them in multiple places in the graph more easily.

Thank you so much for all the questions you were willing to answer. Absolutely, my pleasure. I'm going to be dropping a survey link into each of the chats — as always, let us know how we're doing and what you'd like to see on future streams; it's really helpful to get your feedback. Keep submitting your NVIDIA Edge projects — we love to see those, we'll be doing another round here at the end of the month, and it's going to be the six-month anniversary of the program, so we're really excited; keep posting those. And next week is our year in review, so we're going to have Tim Sweeney and Joe Kriner coming on to talk about all the amazing things that have happened in 2017 — I can't believe it's just about over. Thank you so much again, Alan; it's always a pleasure having you. We'll see you all next week. Take care.
Info
Channel: Unreal Engine
Views: 102,962
Keywords: Game Development, Shaders, UE4
Id: mig6EF17mR8
Length: 68min 17sec (4097 seconds)
Published: Wed Dec 27 2017