Hair at a Strand Level | Grooming your creations in Unreal Engine

Video Statistics and Information

Captions
MARIO PALMERO: Hi, and welcome to this video featuring the Hair and Fur System, ready for you to experiment with in version 4.26 of Unreal Engine. My name is Mario Palmero, and I will walk you through the incredibly easy process of bringing your groom into Unreal and setting it up. Instead of using the card-based approach that we have been using in the video game industry for many years, this new system renders physically based strands in real time. So what you are looking at are thousands and thousands of hair strands being rendered in real time. This, of course, is a huge step toward closing the gap between real time and film quality for applications, tools, or games.

The first of the main pillars of this new system is to render the hair following a physically accurate approach, with proper lighting and shadowing models, including transmission and multiple scattering of the light. As you can see here, the rays of light bounce inside the mass of hair before being absorbed by the hair or getting out of it. Another of the main pillars of the system is to simulate motion on our strands. For that, the hair is bound to the skeletal mesh and then simulated on top of it. Let's have a brief look at that. Let me set this actor as hidden in game, hit Play, and go into full screen to really appreciate it. As I've mentioned, the strands follow the animation that we are playing on the character, in this case through the Animation Blueprint. So as this character moves, the hair follows. But it goes beyond that, because we have a layer of simulation on top, as you can see on the hair at the back of the head and in the nice swinging of the fur and hair on the tail. Really cool, right? And there is a third pillar for our system: to be able to scale down in a transparent way depending on the capabilities of the different platforms we are going to run our application or game on. An easy-to-use level of detail system will handle the reduction of complexity for our grooms.

Before I start, I would like to credit some people, mainly the Unreal Engine engineering team, and in particular Charles de Rousiers, who is leading the implementation of the hair system, and Michael Forot, responsible for its physics simulation. And for the meshes of the two characters I will be using in the video, thanks to Veronica Rubio.

Jumping now to the content, as a brief introduction to what we are going to see in this video, I will start by demonstrating how easy it is to have your groom imported and added to your character. For that, I will be working with this little friend. Then we will go through the settings for the strands, the creation of a material, the physics simulation that we just saw, and finally the level of detail system that lets the hair scale down when platform constraints demand it. To create the groom for this character I used Blender, but you can use your preferred software to create the groom as you desire. The idea is that we will export it into an Alembic file, so our curves will be stored in that Alembic file and later imported into Unreal. So let's visit Blender. As you can see, here is my character, and here is my particle system created as hair. I have already edited it, and this is what I have to export to the Alembic file. I have it here already exported, so I jump into Unreal.
From my folder, I can drop this file into the Content Browser, and then the groom import options will pop up. I used these settings and didn't have to touch anything else. I'm not importing it now because I already did, but that is the whole process. So I already have my groom; for this groom, it took a couple of minutes to import. Then what needs to be done is to attach this groom to the character, to our little rabbit. We create a new component, a Groom component of course. We have several references here; we just have to drop our groom into the Groom Asset reference, and voila, we already have the groom on our character. Really nice, really simple, and really cool (for C++ users, there is a short sketch of the equivalent setup at the end of this section).

In order to have groom assets inside Unreal we first have to enable two plugins, Groom and Alembic Groom Importer, and in the project settings, Support Compute Skin Cache. This will be required when we bind our hair to the skeletal mesh of our character. So just go to your Project Settings and search for "skin" and enable that option, then jump into your Plugins, search for "groom", enable the two plugins that appear, restart your editor, and we are ready to go.

Now, for the hair strand settings, we open the groom asset. Here in the groom asset we can see several tabs: LOD, interpolation, physics, strands, material, meshes, and cards. Interpolation holds the import options, and Strands contains the basic settings. Here we can change the thickness of the hair, for example. I can go all the way to, let's say, 3, and then instead of having nice, thin hair, we will have something very thick, like spaghetti or pasta. Who knows? That could be good for certain situations, right? We can also change the thickness of the roots and the tips of the hair. Usually the tip is thinner and the root is thicker, like in this case, but you can play around with that. We can also clip the length of the hair. For example, if we want to use the same asset twice, but one of them with shorter hair, we can say that we want the hair to be, say, 20% of the total length. As you can see, we can trim the strands; the value is always relative to the full length of each hair.

We can also tweak, for example, the hair shadow density. For that I'm going to go back to the level and save this. In the groom component, here, I can set these values per instance instead of per asset, and I can tweak this shadow density value to decrease or increase the amount of shadowing that each strand casts. We also have Use Stable Rasterization. This should be used when we have very thin and short hairs, usually fur: small hairs that let you see through them and reveal the surface underneath, like the skin. For example, if I jump back to the face of my other character, we can see a large amount of small hairs that let us see the skin. I activated that on this groom component, because it is really useful when the hair is very thin on screen, so it is rendered properly and we don't miss some of those pixels. That is all for the general settings.
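For anyone who prefers to do this setup in C++ rather than in the editor, here is a minimal sketch of attaching a groom to a character. It assumes the Groom plugin's UGroomComponent and UGroomAsset types from the HairStrandsCore module and that the component exposes a SetGroomAsset() setter; the class and helper names are hypothetical, and in the video the same setup is done entirely through the Details panel.

```cpp
// Minimal sketch, assuming the "HairStrandsCore" module is added to the
// project's Build.cs dependencies. Class and helper names are hypothetical.
#include "GameFramework/Character.h"
#include "GroomComponent.h"
#include "GroomAsset.h"
#include "FurryCharacter.generated.h" // hypothetical file/class name

UCLASS()
class AFurryCharacter : public ACharacter
{
	GENERATED_BODY()

public:
	AFurryCharacter()
	{
		// Create the groom component and attach it to the skeletal mesh
		// so the strands follow the character's animation.
		TailGroom = CreateDefaultSubobject<UGroomComponent>(TEXT("TailGroom"));
		TailGroom->SetupAttachment(GetMesh());
	}

	// Assign an imported groom asset at runtime (equivalent to dropping the
	// asset into the Groom Asset reference in the editor).
	UFUNCTION(BlueprintCallable, Category = "Groom")
	void ApplyGroom(UGroomAsset* InGroom)
	{
		if (TailGroom && InGroom)
		{
			// SetGroomAsset is assumed here; if your engine version does not
			// expose it, assign the asset on the component in the editor instead.
			TailGroom->SetGroomAsset(InGroom);
		}
	}

private:
	UPROPERTY(VisibleAnywhere, Category = "Groom")
	UGroomComponent* TailGroom = nullptr;
};
```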
Now we can step into the material section. Creating a nice material is going to be as easy as the import process we just saw, and the system provides us with a lot of different inputs. We will have UVs per strand, but also root UVs, so we can vary the look depending on the location on the mesh, and we will be able to read a per-strand seed value that will randomize our hair appearance, for example. We will also be able to read the length and width of each strand. And we will have a hair color node that generates a realistic color depending on the amount of melanin, a substance that we have in our hair, so we can work with values from the real world.

Let's go again to the editor. Inside the groom asset, in the different visualization modes, we can show, for example, the UVs. As you can appreciate, we have two dimensions here: one that progresses along the strand and one that goes around it. Those values are generated for us. The root UVs, on the other hand, are imported from our Alembic file and will usually be correlated to the UVs of the mesh. If we go to the documentation page for one moment, we can see a list of the different attributes that we can inject into the Alembic file, and one of those is the root UV value. Then we also have the seed. Each one of the strands has a different seed value, so we can randomize any output attribute of the material, like roughness or color, per individual hair strand.

Let's now visit a material. Let's go to the asset editor; for example, I'm going to open these two materials. We have several groom components building these hair groups on the character, so it is quite complex: I have 11 different groom components. I created several for the hair at the front, for this lock that you can see here with this knot, and another one for the hair at the back of the head. Then there are four different fur types here on the head and neck, and two more fur components here. I have two different assets for this part because I wanted one to cover the skin and the other one to create the outline, the silhouette that you can see. And last, one hair asset for the tip of the tail.

Jumping into the materials of the mongoose, or the "meloncillo" as we call it in Spain, what we have is this mongoose material set to the Hair shading model. As I mentioned, we have a Hair Attributes node, so we can read from it the UVs, the length, and the seed value to randomize things. I'm mixing these two together to alter and add this secondary color that I have here on top. So, for example, if I get closer to the fur on the neck here, you can see that there are blond strands every now and then. That is because of this secondary color. The other input I'm reading is the root UV, and here what I have is this nice texture that provides the fur with a nice-looking gradient. And that's it. It's a pretty simple setup, just a few nodes for the base color and a couple of values to play with, like specular and roughness, and with nothing else we have this beautifully rendered fur.

For the hair, I wanted to experiment with the other node provided by the system. This hair melanin material uses the Hair Color node which, as I said, generates a color based on three inputs: melanin, redness, and dye color. So if we want to dye our grooms, we can also do that by setting this input. Here I'm just creating some nice variation, mixing again, using the length this time, and I'm using the seed to create a very small variation of the melanin amount on a per-strand basis by feeding these two nodes. The result is that if I go to the material instance and modify the amount of melanin a bit, you can see that as I lower it, the hair goes toward white, losing coloration, and as I increase it, it goes toward really dark hair. That is accurate to reality, so you can pick or find your color using the amount of melanin in the hair, as I showed you.
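To make the base-color logic described above a bit more concrete, here is a conceptual C++ sketch of what the fur material graph computes per strand: a gradient sampled through the root UVs, with the per-strand seed occasionally swapping in the secondary blond color. Every name here is hypothetical; the real version is built from the Hair Attributes, texture sample, and Hair Color nodes in the material editor, not in code.

```cpp
// Conceptual sketch only: it mirrors the node graph, it is not shader code.
#include "Math/Color.h"
#include "Math/UnrealMathUtility.h"

FLinearColor ComputeStrandBaseColor(
	const FLinearColor& GradientSample,  // sampled from the root-UV gradient texture
	const FLinearColor& SecondaryColor,  // the occasional blond highlight color
	float Seed)                          // per-strand random value in [0, 1]
{
	// Only a small fraction of strands pick up the secondary color,
	// which gives the scattered blond strands visible on the neck fur.
	const float SecondaryAmount = (Seed > 0.9f) ? 1.0f : 0.0f;
	return FMath::Lerp(GradientSample, SecondaryColor, SecondaryAmount);
}
```

The 0.9 threshold is arbitrary; in the material, the fraction of highlighted strands is simply whatever looks right.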
Now, let's dive into the skinning and simulation process. This process is really, really simple. With my groom here, it is just a matter of right clicking one of our groom assets to create a binding. This gives me the basic setup. The one that I use here is the target skeletal mesh: just select my mongoose asset, and that is all. Hit Create, it will take less than one second, and the binding will be ready to be referenced in the binding asset section of the groom component. The groom will then follow every animation played by the skeletal mesh. Pretty easy. But we can go further. If, for example, I want to attach this groom to the mannequin instead of to the mesh I built this hair for, I can select the mannequin here and then say that the source is the mongoose. The engine will try to resolve the differences between one skeletal mesh and the other and find a good way to bind the groom to our mannequin.

Once we have done that, we will probably want, in this case, to have physics simulation. But before doing so, let me show you a command that will allow you to see how the skinning process is being done. I go to the output log and activate the debug mode using the hair strands console commands. There are a lot of different r.HairStrands commands you can check; they are pretty useful, and we will talk about some more a bit later in this video, but you should check which ones are the most suited for you. Here I set the hair strands debug mode to 12 and the r.HairStrands.MeshProjection variable for the simulated, deformed triangles to 1 (see the short snippet after this section). What we can see here is how the triangles are being used by the different grooms and how they are bound: the colored triangles show which strands are attached and with what influence. I had a bit of a rigging problem with this character when I first attached this hair to it, and I wanted to debug the influences that this hair was following and reading from. For that I just removed the rest of the groom assets and used this command to find and filter the error. So I wanted you to know that there is a way for you to do the same if you run into the same problem.
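As a small companion to the debug commands mentioned above, this sketch sets the same kind of console variables from C++ instead of typing them into the output log. The exact r.HairStrands.* names follow the transcript and should be treated as assumptions; the console autocomplete lists the full set for your engine version.

```cpp
#include "HAL/IConsoleManager.h"

// Toggle the hair-strands binding/skinning debug views (variable names assumed
// from the video; verify them with console autocomplete on "r.HairStrands.").
static void EnableHairBindingDebug()
{
	if (IConsoleVariable* DebugMode =
			IConsoleManager::Get().FindConsoleVariable(TEXT("r.HairStrands.DebugMode")))
	{
		DebugMode->Set(12); // mesh-projection visualization mode used in the video
	}

	if (IConsoleVariable* SimTriangles = IConsoleManager::Get().FindConsoleVariable(
			TEXT("r.HairStrands.MeshProjection.Sim.DeformedTriangles")))
	{
		SimTriangles->Set(1); // show the skinned triangles the guides are bound to
	}
}
```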
Let's undo all of this and go for the simulation. For the simulation, you first have to know that the engine is not simulating every strand of hair. What it simulates are the guides, and those guides are a subset of the final number of strands that we are actually rendering. For example, in this asset, if I ask it to show the guides, we can see that it displays a colored subset of the hair strands that are actually being rendered. The rest of the strands are interpolated from the nearest guides. Once we know how the simulation works, we go to the physics tab, where the first thing is to enable the simulation. As you can see, the solver runs on Niagara, so the new particle system, Niagara, is in charge of simulating our guides, and the result is then interpolated to the strands. We can choose different solvers; we have two already built in, but you can also create your own solver, implementing your own behavior.

Then there are several variables we can tweak to, for example, improve convergence, which is the capability of approximating the right solution. We have sub steps and iteration counts, and the strand size also plays into that process: the higher the strand size, the more you need to raise the iterations or sub steps to reach convergence. For a game, a value of 8 or 16 should be enough. We can also set up the hair constraints. I have them enabled and tweaked the bend constraint here on the hair, raising the stiffness and keeping the damping relatively low so the hair preserves its silhouette; I wanted this spiky shape to stay consistent. I also wanted to experiment with different settings: at the tip of the tail, for example, I wanted the hair to be more fluid and loose, and on the head more consistent. One thing we want to add as a feature in future versions is forces or wind applied to the simulation, because right now the simulation only considers collisions against the physics asset, so nothing from the surrounding environment will actually affect the physics. In our meshes, the physics asset contains the only colliders that the hair simulation will acknowledge, and as you can see, because I didn't want simulated hair outside the head and the tail, I'm only using body capsules for the shoulders, neck, head, and tail.

And now, after seeing how easy it is to bring our groom into the engine, create a material for it, bind it to our skeletal mesh, enable simulation, and enjoy experimenting with it, let's have a look at some numbers. For the 11 groom components I created for this character I have 400,000 curves, and from those, 40,000 guides are being simulated, for a total of 8 million vertices being processed for our hair in real time. Pretty amazing, considering that the rendering of hair is quite unique and carries certain implications. My takeaway from this character would be that the fur needs the vast majority of the curve budget: as you can see here, 93% of the curves are dedicated to fur, but on the other hand only 65% of those 8 million vertices are dedicated to fur. We can infer from that that the fur needs to cover the skin, which means a high number of curves, while the long hair strands have a higher ratio of vertices per curve for a more realistic simulation and rendering (see the quick arithmetic after this section).

Also, these are some of the CVar values I've been using for this project. First, regarding the voxel resolution, I tried to increase it as much as possible; for several rendering features the engine uses a voxel representation, for example for shadowing, so maxing it out increases the quality. Regarding transmission and shadowing, I raised these values to get a more exaggerated look that fits the cartoony, stylized look I wanted to achieve. I disabled the depth of field, because it gave me better definition with cinematic cameras. And by enabling super sampling on deep shadows, a better shadow quality was achieved.
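As a quick sanity check of the numbers quoted above, here is the arithmetic spelled out as a tiny standalone program. The figures are the ones given in the video; the ratios are only illustrative of how the curve and vertex budgets split between fur and long hair.

```cpp
#include <cstdio>

int main()
{
	const double Curves   = 400000.0;   // rendered strands across the 11 groom components
	const double Guides   = 40000.0;    // guide curves actually simulated by Niagara
	const double Vertices = 8000000.0;  // control vertices processed for the hair per frame

	std::printf("simulated guides       : %.0f%% of all curves\n", 100.0 * Guides / Curves); // ~10%
	std::printf("avg vertices per curve : %.0f\n", Vertices / Curves);                       // ~20

	// Fur takes ~93% of the curves but only ~65% of the vertices: fur strands are
	// short and dense to cover the skin, while long hair spends more vertices per
	// strand for smoother simulation and rendering.
	return 0;
}
```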
And finally, on to the scalability system, which will allow us to keep our performance at a desirable level. The level of detail system will auto-generate different strand representations with different qualities. It will take your Alembic file data and modify it depending on the distance, reducing the number of curves. When we are far away and we don't want to render strands, because it's quite expensive, the system can switch to a mesh card representation of our hair. And if we get even farther away, we can swap to a solid mesh representation, a kind of helmet for our hair. Regarding the cards, the engine will skin them in the same way as the strands, they will follow our animations just as the strands do right now, and we will have similar material attributes, so it's really easy to jump from one material to the other and use similar values. There is also a feature that, although it is quite experimental at the moment, the quality can vary a bit depending on the type of groom we are working with, and we are not sure how it will evolve in the future, can procedurally generate the card mesh from our groom asset for us, so we don't have to author those in an external package. It's up to you to try it and see if the results are good enough at a longer distance.

So follow me back into the editor to see how easy it is to set up your levels of detail with a couple of clicks. Let's visit our little friend again; here we have the rabbit. Let's open the groom and see that in the level of detail tab everything is set up as we imported it from the Alembic file: no curve decimation or vertex decimation, everything displays the groom as imported. But the moment I click on this plus icon, the engine generates a new level of detail with those values halved, so it will reduce the number of curves and vertices with distance. If we get closer and select the LOD we just created, it may not be very noticeable, but if I go to the extreme with these values, something like this, it becomes more obvious that the number of strands is being reduced, and the resolution too. At the same time, the thickness per strand is being increased to compensate, and that's why, overall, it looks like there is the same amount of hair (see the small sketch after this section). If I hit this plus button again, the next level of detail that is generated has these settings halved again relative to the previous one.

At some point, instead of rendering all these strands, what I want is to use some mesh cards to reduce the overall cost of the system. For that I can create a new LOD level, and instead of leaving the geometry type as Strands, which is the default, we can select Cards. At that moment, this LOD level will read from the Cards settings section here. I first have to set this LOD index to the same value as the LOD level the card settings belong to. Then we can import our card mesh and reference it here, or we can ask the engine to generate the cards for us. I'm not going to do that in this video, because for this groom asset it would take some minutes.
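To illustrate why the groom keeps looking roughly as dense while the LODs drop strands, here is a small conceptual sketch of the halving behaviour described above, with a thickness scale that compensates so the product of strand count and width stays roughly constant. The struct and function are hypothetical, not engine code, and the factor of 2 is only the value that keeps that product constant.

```cpp
// Conceptual sketch of the per-LOD decimation described above (not engine code).
struct FHairLodSettings
{
	float CurveDecimation;   // fraction of the imported curves that are kept
	float VertexDecimation;  // fraction of the control vertices kept per curve
	float ThicknessScale;    // width multiplier applied to the remaining strands
};

FHairLodSettings MakeNextLod(const FHairLodSettings& Previous)
{
	FHairLodSettings Next = Previous;
	Next.CurveDecimation  *= 0.5f; // half the strands...
	Next.VertexDecimation *= 0.5f; // ...and half the points per strand...
	Next.ThicknessScale   *= 2.0f; // ...but twice as thick, so coverage stays similar
	return Next;
}
```

Starting from the imported groom at (1.0, 1.0, 1.0), each press of the plus icon corresponds roughly to one application of this step.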
Let's go back to the mongoose and see how that looks. If I unhide this other version here, the one on the right has its LOD bias set to always show cards, and the one on the left has the system working depending on distance. As you can see, those cards have been generated by the engine. Let's have a look at the different settings we have for that. In the mongoose asset, that will be this one, where the cards look like this. When asked to do it procedurally, the engine generates a mesh, but also some textures, and in those textures we have depth information, coverage, tangents, and also some attributes. Those attributes can be read in the material in the same way we were able to read them from strands, through the Hair Attributes node. Here I'm reading the seed value from the textures to apply some variation to the color. So the system not only handles reducing our strands and switching to cards that are skinned to the skeletal mesh, but also generates the mesh and provides us with a transparent way of using materials.

But we can go even farther, and the engine does the heavy lifting again in this process. If I want to use this mesh for the hair at a really, really far distance, instead of having more complex geometry being skinned and simulated, I just want to attach this to the head and keep it simple. The system will not only handle that change, but will also generate this texture for us. If I disable the alpha channel here, you can see these are actually the tangent values, which will help us get a better lighting and rendering result. To generate those four textures I have here, we only have to right click on the hair asset and create the strand textures, and in here choose the texture resolution, how we want the tracing process to happen, the tracing distance, and what type of mesh we will be using. So, for example, I would select my static mesh here, choose the level of detail index, and hit Create, and the textures, voila, would again be generated by the editor for us.

And that will be all. Let me just hit Play once more. Summarizing, this system not only renders hair with unprecedented quality, but it also makes the process really easy for everyone to import an asset, bind it to a character, and apply a physics simulation on top. And it allows us to handle, in a transparent way, the different platform requirements using the level of detail system. I'm really, really eager to see what all of you are capable of with this new Hair and Fur System. I hope you enjoyed it, and thank you so much for watching.
Info
Channel: Unreal Engine
Views: 86,987
Keywords: Unreal Engine, Epic Games, UE4, Unreal, Game Engine, Game Dev, Game Development
Id: __BScFPJy3E
Length: 29min 51sec (1791 seconds)
Published: Fri Jan 22 2021