Advanced Niagara Effects | Inside Unreal

Captions
Hey everyone, and welcome to Inside Unreal, a weekly show where we learn, explore, and celebrate everything Unreal. I'm your host, Victor Brodin, and my guest today is Senior Technical Artist Jonathan Lindquist. Welcome to the stream.

Hi, thanks for having me.

Of course. Today we're going to talk a little bit about advanced Niagara effects, but first off, would you mind telling the viewers who you are and what you do at Epic?

Yeah, no problem. I'm Jonathan Lindquist. I've worked in games for the last 13 years or so, ten of those at Epic. I started off on Fortnite and moved over to Niagara, and since then I've been building the module library and working on interesting problems for GDC and other demos. I guess we can just jump right into the presentation.

Let's go, the floor is yours.

Okay. Actually, could we play the first video?

Yes, let's go ahead and do that. One moment.

(The opening video plays.)

All right, I think that got everyone excited.

Great, that's awesome. So what you just saw was a combination of techniques and new technology that have been packaged together within Niagara for users like yourself. Cracking open that package and understanding how everything works together can be made easy, so let's go into how that works.

First, we'll talk about how particles can interact with the world and learn about their environment's geometry. Then we can dive into the specific examples you saw: the flocking bats and the swarms of insects, and finally the position based dynamics demo, which features a very unfortunate dude made out of corn who gets popped into kernels. A popcorn dude, yeah.

So first, world interaction. We want to look at the methods Niagara can use to gain information about the world. First, Niagara can read a model's triangles. This is a great tool for very specific one-off effects: say you wanted to place particles on a character's skin, you'd read the character's triangle positions and place particles on them. Next, we can trace against physics volumes on the CPU for CPU particle collisions. GPU particles can trace against the scene depth: from the point of view of the camera, a particle can trace outward and find the closest opaque surface, which is super helpful for certain types of effects. And finally, we can query distance fields and volume textures.

Given the problems we're facing in the demos we just showed, we can start ruling out methods of analyzing the world based on our requirements. Triangles only work for certain use cases: Niagara would have to be pointed toward a specific mesh to query, and that wouldn't work well for the insects, which are supposed to crawl across the entire world. Then there are physics volumes; as you can see in this picture, the collision for a given mesh can be fairly coarse, so that wouldn't be good enough for the insect swarms, they would clip through the visible geometry. Another option is the scene depth, but that has the limitation of being two-dimensional: if an insect were to crawl along the scene depth and then crawl behind a surface, it would no longer have any information to pull about the world. So that leaves us with distance fields.
As you can see, distance fields provide a great combination of accuracy and volumetric representation, so in all of the demos you'll see today, we're referencing the global distance field inside of Unreal Engine.

A global distance field is a volume texture, black and white, and each of its pixels, or voxels, says how close that location in space is to the nearest solid surface. Inside of Niagara we might want to query a specific location in world space; we'll call that P, and P will be this red dot. With the global distance field we can ask for the distance to the nearest surface at this location, and it might tell us the nearest surface is five centimeters away. That's great, but if we really want to place particles on that surface, we need to know which direction the surface is from our current location. So we sample the global distance field several times in a cross pattern around our current location, and comparing all of those samples gives us a gradient that points us toward the surface. If we take the distance from the first sample and the gradient we've just calculated, we can move the particle to the surface.

That's one of the two ways we typically use distance fields. The second is something called sphere casting. Sphere casting operates on similar principles, but it allows us to trace in a given direction. Say our particle is at P0 and we want to trace along this line to find the nearest opaque geometry. We can say: at this position the closest surface is 10 centimeters away, so we can safely move 10 centimeters along this arrow without hitting a surface. We haven't found the surface yet, so we do it again: at P1 we sample again, find the distance, move along that vector once more, and then again and again until we reach a solid surface. That lets us basically ray trace within the world and find surfaces, and it also lets us find the nearest surface easily. You'll see how useful those two operations are later.
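To make those two queries concrete, here is a minimal HLSL sketch of the math just described. SampleDistanceField is a hypothetical stand-in for Niagara's actual global distance field query (in the engine you'd go through a collision query or data interface node), so treat this as an illustration of the technique rather than the engine API.

```hlsl
// Assumed stand-in for Niagara's global distance field query; the real call
// goes through a collision query / data interface node.
float SampleDistanceField(float3 P);

// Central-difference gradient: sample in a cross pattern around P. The
// normalized result points away from the nearest surface.
float3 DistanceFieldGradient(float3 P, float H)
{
    float3 G;
    G.x = SampleDistanceField(P + float3(H, 0, 0)) - SampleDistanceField(P - float3(H, 0, 0));
    G.y = SampleDistanceField(P + float3(0, H, 0)) - SampleDistanceField(P - float3(0, H, 0));
    G.z = SampleDistanceField(P + float3(0, 0, H)) - SampleDistanceField(P - float3(0, 0, H));
    return normalize(G);
}

// Snap a particle to the nearest surface: step against the gradient by the
// sampled distance.
float3 MoveToNearestSurface(float3 P)
{
    float Dist = SampleDistanceField(P);
    return P - DistanceFieldGradient(P, 1.0f /*cm*/) * Dist;
}

// Sphere cast: repeatedly step along Dir by the distance to the nearest
// surface; each step is guaranteed not to pass through geometry.
bool SphereCast(float3 Origin, float3 Dir, float MaxDist, out float3 HitPos)
{
    float3 P = Origin;
    float Traveled = 0.0f;
    for (int i = 0; i < 32; ++i)  // fixed cap keeps the GPU loop bounded
    {
        float Dist = SampleDistanceField(P);
        if (Dist < 1.0f) { HitPos = P; return true; }  // close enough: hit
        P += Dir * Dist;
        Traveled += Dist;
        if (Traveled > MaxDist) break;  // nothing found along the ray
    }
    HitPos = P;
    return false;
}
```

The step size of the sphere cast adapts automatically: in open space it takes large strides, and it slows down as it approaches geometry, which is why the fixed iteration cap works in practice.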
Before we go any further: you'll see a lot of debug visualizations that I've made, and those are really helpful, especially when you're working with complex effects. I made a module in the toolset called Sprite Based Line; basically, you give the module a point to start at and a point to stop at, and it produces variables you can use to draw lines. Here's a video showing that in action: these birds are drawing lines to the nearest surface they can find, and they're also raycasting forward using the other technique we talked about, sphere casting.

Additionally, sometimes you might want numeric information. If you go to the Material Editor you can find a series of material functions called DebugFloat, which take any numeric input and display it as a number. In this example you can see that the birds, while they flock around, are actually showing me information about what's being computed internally. What I've done is taken information from inside the particle effect, fed it into a dynamic parameter, and placed a sprite in the emitter with a material that draws the numbers out in space. In the future we're going to have debug functions that don't require additional work on your part, but as of right now this can help you in a number of ways.

There are a lot of high level functions we've already produced for users, which are great, but if you want to get into the nitty gritty details, you can also go inside a module and query the distance field directly using a collision query: you just feed one of these functions a location and the query, and you can learn more about the world.

So now we can get into flocks. Flocks are interesting because they don't only need to know about and avoid the world, they also have to avoid and interact with each other. Each bird needs to know about its immediate surroundings, and doing that naively would be really expensive, but luckily Unreal includes tools to make it less so. We'll go into those as well.

When we talk about flocking mechanics, we have to think about each bird as an individual actor. Each of these actors has its own view of the world, it only knows certain things, and we can model that through different equations. The steps we'll take today: first we make each bird particle avoid the environment; then the birds look around their immediate surroundings and find birds they'd like to flock together with, or be attracted toward; then, when two birds get too close, we make sure they don't interpenetrate by making them avoid each other; then, to mimic real behavior when several birds flock together, we have them match each other's velocity; and finally we add a little extra code to handle special cases, like two birds flying head-on toward each other, where we specifically write something to make them turn away from each other.

Using the functions we talked about earlier, we can start to see how they're useful inside a flocking system. In this case I've spawned a grid of particles and I'm using the Avoid Distance Field Surfaces GPU module to push the particles away from any nearby surface; most birds wouldn't want to fly into a wall, so that's the first behavior we mimic. Additionally, we know that birds look straight ahead and avoid obstacles they see coming toward them, so we use a sphere trace to mimic that.

There are certain issues one finds while writing a boid simulator that aren't immediately apparent; as the simulation gets closer and closer to reality, you start to pick up on little behaviors you wouldn't notice initially. One of those issues: say you have a particle moving forward and you want it to avoid an obstacle in front of it. You could just apply a force to slow it down and push it in the opposite direction, but that would be very unnatural for a bird; it would fall out of the sky. Instead, we can redirect the bird: we reflect its velocity off of the wall, which forces birds to turn away from obstacles rather than slowing down and stopping. That will be a bit of a theme during today's presentation: not only do we want to mimic certain behaviors, we also want to be very careful with how they're exhibited.
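As a small illustration of that redirect, here is a hedged HLSL sketch: rather than braking, the bird keeps its speed and mirrors its heading about the obstacle's surface plane (the surface normal can come from the distance field gradient shown earlier). This is the standard reflection formula standing in for whatever the shipped module does internally.

```hlsl
// Turn away from an obstacle instead of braking: keep the bird's speed, but
// mirror its direction about the wall's surface plane.
float3 RedirectVelocity(float3 Velocity, float3 SurfaceNormal /*normalized*/)
{
    float Speed = length(Velocity);
    // reflect() mirrors the direction about the plane defined by the normal,
    // so the bird banks away rather than stopping mid-air and dropping.
    float3 NewDir = normalize(reflect(normalize(Velocity), SurfaceNormal));
    return NewDir * Speed;
}
```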
Our particles right now, using the example just shown, would avoid the world, but they're not avoiding each other; they don't see each other. Within the Content Examples you can find easily understood examples showing how this is done, but for this presentation I'd just like to cover the theory behind it.

Say this first group of dots is your particles in 2D space. If we're going to mimic a bird's avoidance system, we might say that only the particles nearby really affect it, so we want to collect the nearby particles and react to them in some way. In a naive solution you might query every particle in the system, ask how far away it is and which direction it's facing, and react accordingly, but that would be really expensive. Instead we can use something called a spatial hash, or, inside of Niagara, a neighbor grid. Basically, we take all of those birds and throw them into a volumetric, three-dimensional grid, and once they're in the grid we can ask for certain grid cells and locate any particles within them. That narrows the problem down: instead of having to pull, say, ten particle locations, we can now pull four. This is really important when it comes to efficiency.

Now that we know, at a very coarse level, which other birds we have to interact with, we want to narrow it down to what the bird can actually see. In this case we make a cone of vision: we take the particle's forward velocity and use a dot product to locate any particles right in front of this bird's path, and we also measure the distance between this bird and all the others, so we can ignore the birds that don't matter and focus on the ones that do. Luckily, all of this is done for you inside our module sets, so you can just drop down an avoidance force module and much of it is taken care of.

I should note that some of the modules I'm pointing out here aren't part of the official Niagara module set yet. In 4.26 we have a Content Examples map, and you can find all of this content there; eventually, as additional iterations are made, the modules will get incorporated into the official library.

I should mention that the Content Examples for 4.26 will be available once 4.26 is out in full release, which is pretty soon. So if you're excited to open this up and take a look, you'll be able to in just a couple of weeks.

Cool. So these are the parameters of the actual model used to mimic bird flight. You can see we have an avoidance force; a cohesion strength, which means a bird looks in front of itself, finds all of the birds it cares about, averages their positions together to find a center location, and attempts to become part of the group by aiming toward that center (there's a force associated with that, which you can tweak as needed); and a certain distance each bird wants to keep from all of the neighbors it can see, so we've exposed that value too. Then, as a little tweak, we've exposed a movement preference vector that biases the birds toward moving on the X and Y axes rather than straight up or down, because it looked pretty unnatural when a bird's legs were flying directly upward.
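Putting the cone of vision and those exposed forces together, here is a hedged HLSL sketch of the per-bird steering logic described above. The candidate arrays stand in for particles already gathered from the neighbor grid, and the parameter names are illustrative rather than the shipped module's actual inputs.

```hlsl
#define MAX_CANDIDATES 16

// Combine the cone-of-vision filter with cohesion, separation, and velocity
// matching. Candidates are assumed to come from a coarse neighbor grid query.
float3 ComputeFlockSteering(
    float3 MyPos, float3 MyVel,
    float3 CandidatePos[MAX_CANDIDATES], float3 CandidateVel[MAX_CANDIDATES],
    int NumCandidates,
    float ViewConeCos,       // e.g. cos of the half-angle the bird can "see"
    float SenseRadius,       // ignore birds beyond this distance
    float SeparationDist,    // preferred spacing between neighbors
    float CohesionStrength, float SeparationStrength, float MatchStrength)
{
    float3 Fwd = normalize(MyVel);
    float3 CohesionCenter = 0;
    float3 Separation = 0;
    float3 AvgVel = 0;
    int Visible = 0;

    for (int i = 0; i < NumCandidates; ++i)
    {
        float3 ToOther = CandidatePos[i] - MyPos;
        float Dist = length(ToOther);
        if (Dist < 1e-4f || Dist > SenseRadius) continue;       // self / too far
        if (dot(Fwd, ToOther / Dist) < ViewConeCos) continue;   // outside the cone

        CohesionCenter += CandidatePos[i];
        AvgVel += CandidateVel[i];
        if (Dist < SeparationDist)   // too close: push apart, harder when closer
            Separation -= (ToOther / Dist) * (SeparationDist - Dist);
        ++Visible;
    }

    if (Visible == 0) return 0;
    CohesionCenter /= Visible;
    AvgVel /= Visible;

    return (CohesionCenter - MyPos) * CohesionStrength   // aim at the group center
         + Separation * SeparationStrength               // keep personal space
         + (AvgVel - MyVel) * MatchStrength;             // match neighbors' speed
}
```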
And the last force is one that makes them match each other's speed.

Okay, back to the presentation. Say we've got the particles moving the way we want; that's great, but we want to represent those particles as meshes, and orientation is really important in that case. We've released a module called Flight Orientation. It tracks the velocity of each particle and uses that forward velocity as a look-at direction, but it doesn't stop there: it also banks the models as they turn, which feels fairly nice.

You don't have to know how it works, but in case you're curious, this is a little bit of the math behind it. We take the previous frame's forward vector and the current frame's forward vector and find the delta between them, so if the particle is turning this way, we have a small arrow pointing in that direction. Then we take the cross product with the up vector, and we use the result to rotate an alignment quaternion. We can also limit the amount of rotation each bird can take, decay that rotation over time, and do a number of other things to make sure each bird stays upright as much as it can, while still naturally leaning into the curve when it turns quickly.
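For the curious, here is a loose HLSL sketch of that banking idea, under stated assumptions: Z is up, the roll is derived from the frame-to-frame heading delta via a cross product, and the quaternion helper is written out by hand since Niagara custom HLSL has no built-in quaternion type. It illustrates the description, not the shipped Flight Orientation module.

```hlsl
// Build a quaternion from a normalized axis and an angle in radians.
float4 QuatFromAxisAngle(float3 Axis, float Angle)
{
    return float4(Axis * sin(Angle * 0.5f), cos(Angle * 0.5f));
}

// Roll the mesh into the turn. PrevForward/CurrForward are last frame's and
// this frame's normalized headings; the signed turn amount comes from
// projecting the heading delta onto the sideways axis (cross of up and
// forward). BankAngle persists per particle so the roll can ease and decay.
float4 ComputeBankQuat(float3 PrevForward, float3 CurrForward,
                       float BankScale, float MaxBank,
                       inout float BankAngle, float DecayRate, float Dt)
{
    float3 TurnDelta = CurrForward - PrevForward;       // small arrow into the turn
    float3 Side = cross(float3(0, 0, 1), CurrForward);  // sideways axis from the up vector
    float Turn = dot(Side, TurnDelta);                  // signed turn amount this frame
    float Target = clamp(Turn * BankScale, -MaxBank, MaxBank);     // limit the roll
    BankAngle = lerp(BankAngle, Target, saturate(DecayRate * Dt)); // ease toward it
    return QuatFromAxisAngle(CurrForward, BankAngle);   // roll around the flight axis
}
```

The resulting quaternion would be composed with the look-at alignment, so a bird that stops turning gradually rights itself as BankAngle decays back toward zero.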
Say you have your models now and they're rotating the way you want; next you'd want to animate them. There's a tool out there for vertex animation textures, and it allows you to take a fully rigged skeletal mesh and bake out the morph targets for that mesh into a texture, which can then be used inside Unreal as a world position offset for a static mesh's vertex positions, giving you very natural motion. There are additional plugins out there that can also be used, but this one works best when a mesh deforms in a very organic manner. If you'd like to know more, you can search for "vertex animation tools" online and find the link, but it's also on screen right now.

So now we can talk more about the swarms, the insects. A lot of what you've seen so far also applies to the swarms: insects crawling along surfaces have to orient themselves to those surfaces, so we actually use the same set of modules for the swarms that you saw for the flocks.

First we need to place the particles on the surface, so we reference the global distance field again. We first spawn particles randomly within a box, and then any particle that's too far from a surface we kill right away; that's called rejection sampling. Then we take the remaining particles and move them to the surface, as you've seen before. And whenever we move the particles or update their locations in the future, we perform this operation again: we don't kill them again, but we always snap them back to the surface.

As you learn more about advanced particle effects, you may want to accomplish multiple goals at once, and sometimes that requires a bit of a forceful hand. In this case we want the insects to move around randomly, but we also want them to adhere to the surface. After we let them move around randomly, they often become detached from the surface, so as a post-process we simply take their final position and pull it back. You'll see that type of approach time and again throughout these effects.

In this case we did something a little differently. The insect legs are rather small and we had hundreds of insects running around, so modeling the legs out would have been too costly and wouldn't have yielded the look we wanted. There's an approach called spline thickening that one can do inside the Material Editor, and it became quite helpful here: the legs of each insect were actually modeled as a strip of polygons, and within the Material Editor we stretched those polygons out and thickened them, so that regardless of which direction you look at the insects from, their legs always seem to have volume.

That was great, but it wouldn't have worked well with the animation approach we mentioned earlier for the bats. So there's an alternate animation system we created, which basically attempts to make a static mesh move around as if it were a skeletal mesh. You can check out the post about it on my Twitter account, and you can also look in the Unreal Engine Extras folder, where you'll find the script along with tutorials. This shows one skeletal mesh animation being applied to multiple static meshes, which is an additional benefit of mimicking skeletal mesh animations: if you tracked the movement of individual verts instead, you wouldn't be able to apply that animation to other assets easily. And since we're tracking bone rotations and location shifts, we can layer other animations on top; when you see the insects' wings flap, that's a second captured animation layered onto the walking cycle.

In the video you saw earlier, you might have noticed that the insects crawl around slowly, and then as the player approaches them or throws a light in their direction, they scatter quickly or search out an exit. We might call that a state machine: the insects do one thing, and then they do another when provoked. What I've done here is create a struct with a number of properties, and created multiple instances of that struct, and as the insects move from one state to another, we slowly lerp between all of the values you see here. Each of those values later gets used in the simulation to change the insects' behavior patterns, so it becomes a really nice way to consolidate all of your behavior states in one location: with this one lerp function, I'm actually driving, say, ten different modules.
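A minimal HLSL sketch of that state-blending pattern, assuming a hypothetical struct: each state is a bundle of tuning values, and one blend factor lerps every field at once, so all of the downstream modules can read from the same blended state. The field names are invented for illustration, not the demo's actual struct.

```hlsl
// One behavior state = one bundle of tuning values. The emitter keeps one of
// these per named state (e.g. Calm, Scared) and blends between them.
struct FInsectState
{
    float WanderSpeed;
    float TurnChaos;      // how erratically it changes direction
    float PauseChance;    // probability of stopping to "chill out"
    float FleeSpeed;
};

// Blend every field with a single 0..1 factor; downstream modules read the
// blended result, so one lerp drives many behaviors at once.
FInsectState LerpState(FInsectState A, FInsectState B, float T)
{
    FInsectState Out;
    Out.WanderSpeed = lerp(A.WanderSpeed, B.WanderSpeed, T);
    Out.TurnChaos   = lerp(A.TurnChaos,   B.TurnChaos,   T);
    Out.PauseChance = lerp(A.PauseChance, B.PauseChance, T);
    Out.FleeSpeed   = lerp(A.FleeSpeed,   B.FleeSpeed,   T);
    return Out;
}
```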
Time to talk about the kernel, I guess. Before we get into the specific collision technique used for the kernel demo, we should talk about alternative collision methods. There's a method called continuous collision detection, which takes an object at point A, finds out where it will be at point B, and attempts to find any collisions that happen in the traversal from A to B. In this diagram you can see that this particle has moved toward the red location, but it's now penetrating the ground. To solve that, you can move the particle back to the moment before it first penetrated, reflect its velocity, apply the remainder of the delta time for that update, and send the particle on its way. That's actually how the collision module inside of Niagara works, but this demo required collisions that were far more complex.

So we used another method, called position based dynamics. In this example we can see that in one frame, two particles are not colliding with each other, so everything's fine, but on the next frame, after we've done all of our updates, the blue particle and the red particle are penetrating each other. To resolve that, we move the two particles in opposite directions, and based on the mass of each particle we move them more or less: if a really heavy object hits a really light object, we move the heavy object a small distance and the light object a much larger distance.

This can happen many, many times. Say you had a hundred particles all bouncing against each other; to find all of these little issues and correct them, you might have to cycle through every particle several times. Then what we can do is take the position the particle initially wanted to be at and the position the particle ended up at, and derive the velocity from that delta: if it moved this far over this slice of time, the velocity is X. That's basically how the collisions work inside the kernel demo.

That was good, thanks, man.

Yeah. So this is just an overview of the overall implementation inside of Niagara. It's not meant to be a bible or anything; it's just meant to give you an idea of the flow of data and how these different constraints can be applied to a particle system. Here, when we solve for Newtonian motion, we're just saying this particle was located right here, it was moving at 10 centimeters per second, and it traveled for one second, so given the laws of physics the particle should be over here. Great, now we know where the particle wants to be. Then we want this particle to know about its neighbors, so we feed it and all of its neighbors into one of those grids we talked about earlier. This is where position based dynamics comes into play: with the grid in place, each particle looks around, finds all of the other particles it could collide with, and corrects those interpenetrations. After that process is finished, we check whether the particle has collided with the world, and if it has, we pull the particle out of the surface, update the velocity based on that trajectory, and animate it for the rest of the frame.

If you look very carefully at the output, you may notice that the particles get fixed up correctly here, so they aren't penetrating each other, but if a particle then penetrated the world, it could be moved out of that collision surface and could potentially be penetrating other particles again. You don't actually see this in practice for the most part. One could iterate through this whole cycle again and again, but that would make it too expensive for games, so this is actually a decent trade-off.
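Here is a hedged HLSL sketch of the two pieces just described: a mass-weighted pairwise position correction, and deriving velocity from the total positional change at the end of the update. It's a minimal illustration of the PBD idea from the talk, not the demo's actual solver code.

```hlsl
// Separate two penetrating spheres in proportion to inverse mass: the heavy
// particle moves a little, the light particle moves a lot.
void ResolvePairPenetration(inout float3 PosA, inout float3 PosB,
                            float RadiusA, float RadiusB,
                            float MassA, float MassB)
{
    float3 Delta = PosB - PosA;
    float Dist = length(Delta);
    float Penetration = (RadiusA + RadiusB) - Dist;
    if (Penetration <= 0.0f || Dist < 1e-6f) return;  // not touching / degenerate

    float3 Dir = Delta / Dist;
    float InvA = 1.0f / MassA;
    float InvB = 1.0f / MassB;
    float W = InvA + InvB;
    PosA -= Dir * Penetration * (InvA / W);
    PosB += Dir * Penetration * (InvB / W);
}

// After all correction iterations, velocity is just the total positional
// change over the timestep.
float3 DeriveVelocity(float3 StartPos, float3 EndPos, float Dt)
{
    return (EndPos - StartPos) / Dt;
}
```

In a full solver, ResolvePairPenetration would run against every neighbor-grid candidate, and the whole pass would repeat for however many iterations the simulation stage is configured to loop.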
The physics behind moving the particles around is one part of the problem; another part is that we want to represent a character. Typically with effects, one might just place particles on a mesh's surface and be done with it, but in this case, since we're working with kernels of corn that fill up an entire volume, we have to use a volumetric representation of the character. For that reason (and this is something you do when you make demos, you go off the deep end a little), I cut the character up into multiple parts. Each limb was separated from the rest of the body, and I baked out a distance field for each limb. I could then use that distance field to pack the limb with particles, and you can see that process here.

In this case I'm only using a box: I spawn a bunch of particles within a box, give them volume, and allow position based dynamics to pull them out of each other so they're no longer interpenetrating. After that process has run, the particles no longer take the shape of a box, so I reference that signed distance field I mentioned and kill any particle that's outside the surface. You can watch the process happening in this video: the particles spawn, pull themselves out of each other, and then all of the extra elements are killed off.

Having particles that merely look like a character isn't going to get you all the way, though. What we wanted was to have a laser cut through the character and detach the particles that would have been supported by the particles that were cut. Luckily, the human skeleton does provide support structures, and it makes it easy to create a support system from the alignment of bones. For every signed distance field, say the hand or the arm, I associated that distance field with a limb. As you can see, all the particles in this arm are associated with a shoulder, and the shoulder can provide me with an arrow pointed along the bone direction; the hand can do the same. The idea is to find an arrow that travels across the body down to the feet, because the feet are what supports the rest of the character.

To take a step back and look at the whole process again: I spawn particles inside an SDF, use PBD to push them out of each other, cut off all of the excess, copy those particles over to another emitter, and then run this process on it. This process injects all of the particles into a grid, finds the nearest neighbors, looks along the arrow I defined earlier, finds nearby particles that are in alignment with that arrow, and assigns each particle its parent. So if this particle is located at the top of the character and this other particle is located just below it, the top particle looks downward toward the feet, and the first particle it finds, the one that's closest and most in alignment with that vector, will be the second particle. The top particle then records that it relies on support from the second particle, which is just to say that if something happens to the bottom particle, something should happen to the top particle.

This video demonstrates that in action. This emitter takes the particles that have been given parents, with a support system in place, and randomly detaches them. You might notice a single particle turning red and then falling off: we're telling it to detach from the rest of the emitter and just become a position based dynamics particle. You can see that happening again and again, randomly, and each time one of those particles is detached, all of its children get detached too.
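As an illustration of that support-assignment pass, here is a hedged HLSL sketch: each particle scores its gathered neighbors by how close and how well aligned they are with the limb's "toward the feet" arrow, and adopts the best one as its parent. The neighbor array and thresholds are assumptions for the sketch, not the demo's code.

```hlsl
#define MAX_NEIGHBORS 16

// Pick the neighbor that best continues the support chain toward the feet.
// BoneDownDir is the normalized per-limb arrow described in the talk.
int FindSupportParent(float3 MyPos, float3 BoneDownDir,
                      float3 NeighborPos[MAX_NEIGHBORS], int NumNeighbors,
                      float MaxDist, float MinAlignment /* e.g. 0.7 */)
{
    int BestIndex = -1;
    float BestScore = -1.0f;
    for (int i = 0; i < NumNeighbors; ++i)
    {
        float3 ToN = NeighborPos[i] - MyPos;
        float Dist = length(ToN);
        if (Dist < 1e-4f || Dist > MaxDist) continue;   // self / too far away
        float Alignment = dot(ToN / Dist, BoneDownDir); // 1 = exactly "below"
        if (Alignment < MinAlignment) continue;         // not toward the feet
        float Score = Alignment / Dist;                 // close and aligned wins
        if (Score > BestScore) { BestScore = Score; BestIndex = i; }
    }
    return BestIndex; // -1 = no parent found, i.e. unsupported
}
```

When a particle is detached, anything whose parent chain leads to it can be detached too, which is what produces the cascading breakage in the video.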
So one might ask: how do we attach something, or detach it? Attachment is an interesting area and there are a lot of ways to go about it, but the simplest way, and the way used in this demo, is to just lerp between two sets of values. If these particles are attached to a character, they should simply remain in place, their velocity should be zero, and they shouldn't be affected or moved by any other particles. That's what we're doing here at the very end, after everything else is done: we tell these particles, if you are supported (which is just a bool that I wrote), stay in place and remove your velocity, and that's it. If they are not supported, then we allow gravity and other forces to act on them. It becomes a fairly simple process when you break it down that way.

This is an example of how you might completely override a physics simulation to apply a constraint, but you could also apply other physics instead. Maybe you'd want to apply a spring constraint to a particle, and as soon as that constraint breaks, or the particle is no longer supported, you remove the spring force and the particle can fly off into the distance.

I guess we could just watch this over and over again and enjoy his pain.

I mean, I already did on Twitter the first time you posted it.

That's funny. So this became an interesting problem as things progressed. I got to the point where I was cutting the model apart with a laser, and since this is all just math, the question becomes: what is a laser? You could use a dot product to find particles that are on a line from one point in space to another; that'll work, but what if that line moves quickly? You could do the check multiple times: say that over this frame the line moved from point A to point B, so check here, check here, check here. But that's kind of lossy, and you never get great results that way; there's always a particle or two that slips through the cracks. You can see that in this diagram: say this is our popcorn dude, and the laser just happens to miss the center point of all those kernels. None of the kernels get popped. It's really unrewarding.

Another method is a swept collision: basically, we want to find every possible collision that occurs as the laser moves from point A to point B. In this case I'm using a distance-to-triangle function we've written and added to Niagara: you can take any point in space and find out how far it is from a triangle in space. That's super useful for a number of reasons, but here it allows us to create a triangle from where the laser starts to where it ends on that frame, and then find the distance from every particle in the kernel demo to the surface of that laser triangle as it sweeps through space. Once we know the distance between the laser's path and an individual particle, we can find penetrations by simply subtracting the radius of the particle: if the laser passes two units from the center of a particle whose radius is five units, the particle was hit by the laser.
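Here is a hedged HLSL sketch of that swept test: a textbook closest-point-on-triangle routine (the barycentric-region approach from Ericson's Real-Time Collision Detection) standing in for the distance-to-triangle function mentioned in the talk, plus the radius check against the triangle swept out by the laser over the frame.

```hlsl
// Closest point on triangle ABC to point P (standard region-by-region test).
float3 ClosestPointOnTriangle(float3 P, float3 A, float3 B, float3 C)
{
    float3 AB = B - A, AC = C - A, AP = P - A;
    float d1 = dot(AB, AP), d2 = dot(AC, AP);
    if (d1 <= 0 && d2 <= 0) return A;                       // vertex A region

    float3 BP = P - B;
    float d3 = dot(AB, BP), d4 = dot(AC, BP);
    if (d3 >= 0 && d4 <= d3) return B;                      // vertex B region

    float vc = d1 * d4 - d3 * d2;
    if (vc <= 0 && d1 >= 0 && d3 <= 0)                      // edge AB region
    { float v = d1 / (d1 - d3); return A + v * AB; }

    float3 CP = P - C;
    float d5 = dot(AB, CP), d6 = dot(AC, CP);
    if (d6 >= 0 && d5 <= d6) return C;                      // vertex C region

    float vb = d5 * d2 - d1 * d6;
    if (vb <= 0 && d2 >= 0 && d6 <= 0)                      // edge AC region
    { float w = d2 / (d2 - d6); return A + w * AC; }

    float va = d3 * d6 - d5 * d4;
    if (va <= 0 && (d4 - d3) >= 0 && (d5 - d6) >= 0)        // edge BC region
    { float w = (d4 - d3) / ((d4 - d3) + (d5 - d6)); return B + w * (C - B); }

    float denom = 1.0f / (va + vb + vc);                    // interior region
    return A + AB * (vb * denom) + AC * (vc * denom);
}

// A kernel is hit if the triangle swept by the laser this frame passes within
// the kernel's radius. The triangle spans the beam's start and end positions.
bool LaserHitsKernel(float3 KernelPos, float KernelRadius,
                     float3 LaserOrigin, float3 HitStart, float3 HitEnd)
{
    float3 Closest = ClosestPointOnTriangle(KernelPos, LaserOrigin, HitStart, HitEnd);
    return length(KernelPos - Closest) <= KernelRadius;
}
```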
Okay, one last thing I wanted to talk about: methods you can use to cheaply lay particles out in an organic way without interpenetrations in the resulting particles. Given all of these fancy tools we have now, one might say, and this would be totally valid: I want to place a bunch of particles around, check each particle's position against all of its nearest particles, and delete any particle that interpenetrates. That's one way to do it. Another way: even though you want these particles to look very organic in the end, you can spawn them in a regular grid, randomly kill some of them, and then randomize the location of each surviving particle within its own grid cell, so no particle ever jumps outside of its cell and there can never be any cohabitation of particles. Then you can trace the distance field, find the nearest penetration point, kill the particles that never penetrated anything, and retain the particles that did. That becomes a very efficient way of organically distributing your particles through the world; for instance, this was used on the bats in the cave in the UE5 demo.
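A quick HLSL sketch of that grid-jitter trick, under stated assumptions: Rand01 is a placeholder for whatever per-particle random source the emitter provides, and the margin keeps jittered particles away from cell boundaries so neighbors with some radius can never overlap. The distance field keep/kill pass afterward would reuse the sampling shown earlier.

```hlsl
// Assumed: a deterministic 0..1 random value per seed, supplied by the emitter.
float Rand01(uint Seed);

// Spawn a particle jittered inside its own grid cell. Because each particle
// stays in its cell, two particles can never land on top of each other.
float3 GridJitterSpawn(uint3 CellIndex, float3 GridOrigin, float CellSize, uint Seed)
{
    float Margin = 0.1f;  // keep away from cell walls so adjacent cells can't touch
    float3 Jitter = float3(Rand01(Seed), Rand01(Seed + 1), Rand01(Seed + 2));
    Jitter = Margin + Jitter * (1.0f - 2.0f * Margin);
    return GridOrigin + (float3(CellIndex) + Jitter) * CellSize;
}
```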
So, that's it.

That's it? No big deal. Thanks a bunch, John. We've received quite a few questions, and we definitely have some time left, so if you're cool with that, we can go into questions.

Yeah, that sounds good.

All right, I'll start with the kernels, where we ended, and then we'll move back toward the swarms and the flocks. Wyeth has been trending in chat, and he was curious: he asked me to ask John whether he's doing more than one iteration to fix up the collisions, how that works in Niagara using the new features, and he wanted you to talk a little bit about simulation stages.

Oh yeah, simulation stages are great. In this stack you can see our standard Emitter Spawn, Emitter Update, Particle Spawn, Particle Update, and Event Handler, and in this specific emitter, which is a GPU emitter, I've enabled simulation stages, which allow you to add extra stacks to the overall effect. In this case I've added an extra simulation stage I've called Populate Grid, which takes a particle and injects it into a neighbor grid. Just to clarify why we have multiple simulation stages: they delineate operations. When we're populating a neighbor grid, we want every single particle to go through the process of injecting itself into the grid; you want the neighbor grid to know where every single particle is by the end of that operation, so that the next operation knows for sure it's working with a fully populated grid. So: populate the grid, and then I've created another simulation stage that handles the collisions.

Each time you add a simulation stage, you can tell it how many times you want it to operate, how many loops to perform. In this case, with the position based dynamics, I only have it looping three times; the number of iterations you need varies greatly based on your particular setup, and the kernel demo, for instance, loops 12 times or so. If I created that simulation and only had it loop once, you might see a large pile of popcorn kernels lying on top of each other, and each particle check would find a penetration and correct it. With a single iteration cycle, it would find one penetration, correct it, and be done, but that wouldn't yield the results we want, because after one penetration is fixed up, another has probably started. Just imagine a bunch of spheres all penetrating each other: when one corrects itself, that probably means another problem has occurred. If we up the iteration count, we can correct one penetration, and then as another forms, correct that as well, and so on. You obviously want to limit the number of iterations as much as possible for performance reasons, but you'll get more accurate results with larger counts.

It's always a balance, right?

Yeah, definitely.

Chat was curious whether we could make the slide deck available for download after the stream.

Yeah, I can help with the videos and such and make sure they get uploaded so they play for everyone else as well; I just have it sitting in my Google Drive.

It might take a little while, so not immediately today, but perhaps early next week we'll try to get it out for all of you.

Let's keep going with the position based dynamics questions. Suduken asked: are the collisions frame rate dependent, or, better question, how much are they frame rate dependent?

They would be frame rate dependent. I wouldn't say that's true of all position based dynamics implementations, but with this specific one we've written, we take the particles, fully update them, find out where each particle would like to be at the end of that update, and then run our position based dynamics sim off of that new location. So if a particle were moving really quickly, it could potentially jump from point A to point B and miss a number of collisions along that path. For that reason, this particular implementation can be frame rate dependent. Other than that, the math still works out, because we take two particles, pull them out of each other, and derive the velocity from the distance they traveled over a given amount of time, so you shouldn't really see a large amount of popping for that reason.

Krusty Kum Comb was wondering: is position based dynamics doable on the CPU in any way? Alternatively, is a ribbon renderer on the GPU on the to-do list?

I can't speak to the ribbon renderer; I know it's been brought up once or twice before. Position based dynamics would be possible on the CPU, it would just be overly costly, because we don't currently support neighbor grids on the CPU; it's a GPU-only feature. You'd have to rely on the event system to communicate the position of each particle to every other particle, and that would probably be too costly to actually run, but it would be a possibility.

Particular Studio is wondering: was the baked distance field only used for the initial packing, or did it drive the animations as well?

That's actually a really good question; I think I might have glossed over that. In this instance we used the signed distance field to initially pack the limbs.
After that, we store off the local space location of every particle relative to its root, or anchor point. From then on, as the character moves around in space, all we're doing is storing the bone ID inside the particle, and on update we read the bone's transform and transform the particle into bone space; we're basically attaching the particle to that local space position. In the end, the runtime emitter only needs access to the particle's local space position in relation to the limb, and the limb ID it should be following. But in the actual simulation, the particle isn't just locked to that position: it uses it as an attraction location. Each particle attempts to reach its local space position, but it isn't forced to. You might have noticed in one of the videos that as the character gets up and starts moving around, his feet, where they penetrate the ground, kind of dissolve and spread out a little, because the position they're attempting to reach is beneath the ground plane. And, I don't know, it looks kind of appealing, I guess; I think these types of loose constraints can be more organic and interesting than some of the rigid ones.
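A minimal HLSL sketch of that soft attachment, assuming a hypothetical bone-to-world matrix supplied per limb: the stored local anchor is transformed by the bone's current pose each frame and used as an attraction target rather than a hard snap, which is what lets the feet squash through the floor instead of popping.

```hlsl
// Pull a particle toward where it "belongs" given the bone's current pose.
// LocalAnchor is the position captured in bone space at setup time;
// BoneToWorld is the limb's current transform (row-vector convention assumed).
float3 AttractToBoneTarget(float3 ParticlePos, float3 LocalAnchor,
                           float4x4 BoneToWorld, float Stiffness, float Dt)
{
    float3 Target = mul(float4(LocalAnchor, 1.0f), BoneToWorld).xyz;
    // Soft pull instead of a hard constraint: if the target dips below the
    // floor, the PBD collision pass still wins and the feet spread out.
    return lerp(ParticlePos, Target, saturate(Stiffness * Dt));
}
```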
We had a follow-up question from Particular Studio: do the stages run sequentially, meaning will one stage finish all of its loops before moving to the next? And can you chain two together as loop buddies?

You can't chain them together as loop buddies at the moment, but each stage does completely finish before passing the baton to the next simulation stage. That's really important, and it's why a lot of this functions the way it does: every stage is known to be completely finished before the next stage starts.

Converter asked: can the kernel collisions be accurate to the mesh, or are they generalized as spheres?

There would be additional work to make that a possibility, and it's something I've been interested in doing at some point, but the simulation becomes a little heavier at that point, and some have questioned whether that's the right thing to do inside a particle system; it starts to move into physics simulation territory. But I think there would be value in trying to do soft body sims or something like that in the future.

All right, another question: can we fracture geometry based on those sphere locations?

The question at that point becomes what exactly you're moving around. If each particle were represented by a piece of geometry, you could technically break apart a model or something, but you'd have to assign a different mesh to every particle if you wanted each piece to be unique. Theoretically, the particles will move around as spheres and it will look fairly realistic, so it's a possibility.

Looking at another question; a lot of good questions today. Suduken: can particles be separated based on strain between them?

Yeah, actually. You're able to query all of your neighbors every frame, so if I looked at my parent on a regular basis and asked how far away it is and what kinds of forces are operating on it, I might want to break the connection between us if the strain grows too large. You could come up with a very compelling physical simulation using that method.

All right, let's go back a little bit to the flocks and the swarms. Bobs Not My Uncle asked: how do you get distance field GPU particles to collide with a transparent material or mesh, such as water?

Sorry, could you repeat that?

How do you get distance field GPU particles to collide with a transparent material or mesh, such as water?

So, how would we get flocks to interact with water, is that the gist of it? There are certain approaches you could use. When you're querying the global distance field, that is the information the particles have access to, basically their understanding of the world, so if a given mesh doesn't contribute to that information, you have to rely on another means to emulate its influence. You could do that through several different processes, but in this case, what I've done: as you can see, these particles are turning red, detaching from the rest of the body, and moving around, but they're bouncing against this invisible surface. I've actually placed meshes around this display that contribute to the global distance field but aren't visible. So it can be done just by adding something that contributes to the global distance field without being visible.

Okay, so a hidden mesh with collision turned on?

Actually, collision doesn't need to be turned on. There are just settings to be aware of: I think it has to cast some type of shadow, because the global distance field ignores meshes that do not cast shadows. That's one of the little caveats I learned about while making this. And I used the vertex shader to offset the location of the mesh.

Dragonheart079 asked: how do you know when to use CPU or GPU simulation for your particles?

I guess it depends on your needs. In most cases I would rely on GPU particles, because they're very efficient and can push a large number of particles rather easily, and we have a great number of features available for GPU particles. There are a few things missing at the moment, which I'm sure we'll get to eventually; I believe ribbons is one of them, and events are another. But events can be emulated through the Particle Attribute Reader, which is what we've been using a lot here. Say I had a particle and I wanted it to know about its neighbors: I would create a Particle Attribute Reader data interface, which allows me, via particle ID or index, to read information from the payload of another particle. So, long answer short: I would generally steer toward GPU emitters unless there's something a GPU emitter doesn't provide.

Another question from Suduken, which might be related: can we trigger sound events from the particle timeline or from events?

You can, actually. I believe there's a renderer now, or a data interface, that plays sound, so you can do it within Niagara itself.

Let's see, somewhere around here... here's a quick fly-through of the Content Examples. This is what it looks like, by the way, if you're watching and haven't downloaded our Content Examples yet: it has levels covering everything from Blueprints to particles. And I'm sure that Wyeth, who's watching right now and worked on this example, is screaming my name; he's probably like, "it's the other way, go the other direction."
He's typing... It's a data interface, and there are new Content Examples for playing audio; it's in the Niagara hallway. Maybe this is it... no, sorry, I must have gone past it. He's typing again... Oh, he says it's not in Niagara_Advanced, which is where John is; it's in the Niagara hallway map. Okay, it must have moved, I guess.

For those of you wondering about the Content Examples: you can find them under the Learn tab in the launcher. They're available for each and every one of our engine versions, and this particular one that John is in right now will ship when 4.26 comes out in full release, but some of the Niagara examples already exist in 4.25.

Yeah. I think the goal of this presentation wasn't to connect all of the dots for everyone so they could replicate any of the effects we've talked about; the Content Examples go a lot further in that direction. Cracking open any one of these particle effects will really teach you a lot; to my mind it's probably the preferable way to learn.

And on that topic, we're going to have Wyeth on the stream later, probably at the beginning of next year, and we'll actually go through some of the Content Examples specifically. Let's continue with some more questions. Suduken had another good question: is the spatial hash part of Niagara, or just an overlay built on top of it? As an example, what controls grid cell spacing?

The neighbor grid is a data interface inside of Niagara. Basically, you can create a grid, you specify the number of cells in that grid, and then you have the job of translating world space positions into grid space. We've made that fairly simple for you through a number of modules. This is a bare-bones example of how that's done, and I can bring up this emitter; this one just visualizes the neighbor grid in 3D space. (I have a 4K monitor and a 1080p monitor right next to each other, and I'm trying to figure out how to get this window over there.)

Here I'm using the Initialize Neighbor Grid module, which defines the neighbor grid. It asks you for the maximum number of neighbors per individual cell (15 is fairly high, but it's just a demo, so it doesn't matter too much) and the number of cells on X, Y, and Z. Then we have this function here, which lets you give the grid a location, a rotation, and a scale, and that pretty much defines the transform you need to move particles from world space into your grid space. This particular module isn't part of the library just yet; it's going to go through some additional iteration before it gets contributed, but it is available within the 4.26 project, and you could potentially move it out of there into another branch.

Just to close that loop: this module creates a number of transforms you can reference to convert world space into grid space, and it also creates the grid itself. The grid object is something you interact with: you add particles to it, and the grid stores off particle IDs that you can reference later.
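To illustrate "the job of translating world space positions into grid space", here is a small HLSL sketch of the indexing math. The real NeighborGrid3D data interface handles storage and the per-cell particle ID lists itself; only the origin/cell-size mapping below is the part those transform modules set up, and the function names are mine.

```hlsl
// Map a world position to the cell that contains it, clamped to the grid.
int3 WorldToCell(float3 WorldPos, float3 GridOrigin, float CellSize, int3 NumCells)
{
    float3 Local = (WorldPos - GridOrigin) / CellSize;
    return clamp(int3(floor(Local)), int3(0, 0, 0), NumCells - 1);
}

// Flatten a 3D cell coordinate to a linear index for buffer storage.
int CellToLinear(int3 Cell, int3 NumCells)
{
    return Cell.x + Cell.y * NumCells.x + Cell.z * NumCells.x * NumCells.y;
}
```

A neighbor query then just visits the cell containing the particle plus its adjacent cells, which is what reduces the naive all-pairs check to a handful of candidates.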
Right. Last time I looked at the NeighborGrid3D stuff and the notes that came with it, it had to be used with custom HLSL scripts. Are these HLSL blocks going to become proper Niagara nodes?

There's still a lot of work being done in that area. The scripts we've written so far aren't being incorporated into the package yet; we'd like to refine a few UX rough edges first, and then I think we will bring them over into the library. For the moment you mostly interact with it inside of HLSL, but there are nodes; you can do things inside the node graph if you like. I guess it comes down to preference.

Jumping between subjects a little, since there's a lot to cover and we've received questions throughout the stream: what's the rough cost of a Niagara boid simulation, and how does the cost increase as the particle count goes up?

I haven't run a cost analysis on that yet, but the PBD sim was much less than a millisecond, and that had many more actors and was doing a lot more work, so I imagine the boid sim would actually be cheaper.

In regards to the vertex offsets being baked into textures, a question from A Legend Forever: when you have a lerp driving that many animations, wouldn't that be resource intensive? And if not, how do you keep the cost down?

Inside the material graph, it's referencing a single texture, and each line within that texture is a frame of animation. The material graph just reads one line, reads the next, lerps between them over time, and as soon as one frame finishes, it jumps down to the next line and lerps again; it keeps doing that. The overall runtime cost of performing that operation is very small. The cost of the textures, I guess, wouldn't be insignificant, because you basically need HDR textures to store vertex positions, but it depends on your project and your needs. I've shipped many projects using them and it wasn't really an issue.

Are you good to take a couple more questions before we wrap up?

Yeah, sure.

All right, let's keep going. Helios Flame was wondering: how is the interaction between the light and the bugs built?

Oh, that's cool. There's a module I created; let me see if I can find the video. Basically, I mathematically modeled a cone and made it into a module within the library. I believe it's called Avoid Cone. Yeah, you can see there's a module called Avoid Cone, and it asks you for the apex of the cone (the pivot point, essentially), an axis, and the angle of the cone. It creates a zero-to-one falloff describing whether a particle is within that cone or not, and it also creates a few additional vectors for you: it finds the closest vector to the axis from your particle's position, so you can use that to push particles out of the cone in the most efficient way possible.

So in that video you saw within UE5, as the character walks through the dark hallway and the flashlight shines on the bugs, we're just mathematically recreating that flashlight. Each bug checks whether it's inside the cone, and if it is, it finds the most efficient route out of the cone and uses that as a force to push itself away. It also changes the insect's state.
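Here is a hedged HLSL sketch of that cone-avoidance math: test whether a point lies inside a cone defined by an apex, an axis, and an angle, produce a 0..1 falloff, and find the shortest push direction out of the cone (straight away from the axis). It mirrors the Avoid Cone description, not the module's exact code.

```hlsl
// Falloff is 1 at the center of the beam and 0 at (and beyond) the cone edge.
// PushDir is the most efficient escape direction: perpendicular to the axis.
void ConeAvoidance(float3 P, float3 Apex, float3 Axis /*normalized*/,
                   float HalfAngle, out float Falloff, out float3 PushDir)
{
    float3 ToP = P - Apex;
    float AlongAxis = dot(ToP, Axis);
    float3 Radial = ToP - Axis * AlongAxis;       // component perpendicular to the axis
    float RadialDist = length(Radial);

    float Angle = atan2(RadialDist, AlongAxis);   // angle off the cone's axis
    Falloff = saturate(1.0f - Angle / HalfAngle);
    if (AlongAxis <= 0.0f) Falloff = 0.0f;        // behind the flashlight: unlit

    PushDir = RadialDist > 1e-4f ? Radial / RadialDist : float3(1, 0, 0);
}
```

Scaling an avoidance force by Falloff is what makes bugs near the beam's center flee hardest while bugs at the edge barely react.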
So instead of just wandering around, the insect enters a scared state and moves quickly. When an insect is wandering, it usually walks a little bit randomly, then stops and chills out, over and over. When it's scared, it doesn't stop: it runs faster than usual, its movement is more chaotic, and it looks for an exit too, but that's another story, I guess.

I just wanted to bring up one more thing, because it leads into an area we didn't discuss much: the use of constraints throughout your effect. I mentioned a constraint that could be applied at the very end; with the PBD sim, I was saying you can run the physics sim and at the very end choose to either lock the particles to the mesh or let the physics run. In this case I wanted to make sure the bugs behaved in a very realistic way, and I found that if I relied on a single constraint at the very end, you'd see insects that would jump out of geometry, or clear small walls, or something like that. So with the flashlight, for instance: as I find the most efficient route out of the cone, I also look up the nearest distance field normal. Say this is the insect, this is the ground plane, and this is the cone; the light is telling the insect to move this way, which would push it through the ground. I actually read the distance field normal, which is right here, and the distance field position, and I say: instead of pushing down through the floor, snap to the ground and push along the surface. That will be another demo included in the Content Examples.

Let's see. One Bulletproof was asking: how performance heavy is this? Is it really possible to use something like the kernels example in a game?

Like I was saying, the kernel example was less than a millisecond on my machine, and the final rendered version was using ray tracing, which was actually the majority of the cost. So if you had a millisecond of time available to you, you would be able to do that.

Yeah, that absolutely seems doable. It also depends; if your game is all about blasting lasers at popcorn dudes, you can probably go pretty heavy, right?

Yeah, twice as big, ten times as big.

Jace JSAFX asked: in Cascade there was an option to stop rotation upon colliding with the ground; the Niagara collision module doesn't seem to have it.

Yeah, I'd have to look into that. I guess we'll follow up on that one, folks.

Let's get into some of the last questions that came in while we were answering the others. Contiv asked: regarding attribute reading, do you use a tick in another actor to query that?

No, it all happens within Niagara. On every frame of execution you can perform operations in Particle Update or anywhere else, and attribute reading is another one of those operations. Niagara attempts to make the most intelligent decision possible: if you have two emitters and one reads from another, it finds that dependency and makes the emitter being read from execute first; the reader is executed second.

Particular Studio asked: are there any examples of emulated events using the Particle Attribute Reader?

Oh yeah, actually. I wonder if Wyeth can answer that one; I know he worked on that.

Cool, we'll let him write that up, and then we'll get to the next one. Let me scroll up and make sure I didn't miss anything
Sudoku was wondering: is Chaos using distance field collisions as well? Yeah, it's using it, but it's different than... actually, I don't think I should speak to that; it's not really my area. We'll leave that one for the Chaos stream, which is coming up. And you talked about cone casting. Let's see: in regards to position based dynamics, does PBD check each particle's predicted position against the stationary position of other particles? What if two particles move towards each other, would they intersect for a single frame? That's actually taken care of by the iteration loop we were talking about a little bit earlier. Say one particle is stationary and another particle is moving, and the two would penetrate each other at the end of the position update. Then we start looking for intersections, and we attempt to correct any of those intersections multiple times, in order to make sure that as one intersection is corrected, another doesn't form. So it should operate correctly. In one of my videos you might have seen one particle going through another, and that was because one emitter is siloed off from another emitter. I actually had multiple emitters sitting next to each other, and one was using a PBD sim while the others were not. So this is one emitter, this is another emitter, and that's another emitter; if you see any penetrations there, it's because they're not looking for each other.

All right, one brief check over... I think that's it for now. We're also at the one and a half hour mark, which is actually pretty short; some of the last streams I've done have been almost two and a half hours, running up on three. One last question for you, John. Mug17 asked: are there any resources that you would recommend to learn about particles or Niagara, or any personal recommendations? Oh, there's The Coding Train that I like, and processing.org, I believe, is a good reference that I sometimes rely on for information.

Awesome. Well, John, it's been a pleasure to have you on the stream. I hope everyone out there had a good time checking out some of the Niagara systems that were made for the UE5 demo. It's exciting, especially since you can already work with all of these tools in UE4; if you're interested in this, there's no reason to wait for UE5, just download the 4.26 preview and start digging into it.

All right, going to do my little outro spiel here. If you've been watching from the start of the stream, thanks so much for hanging out with us; we do this every Thursday at 2 p.m. Eastern Time. Next week I have the team behind What Remains of Edith Finch (not the entire team, but a large percentage of it): Ian Dallas and Brenda Martinovic are coming on to talk about the art behind building the game. If you haven't played it, I highly recommend you do so before next Thursday if you're interested in watching the stream. It's one of my top 10 games of all time; it's such an original game that I barely want to call it a game. Anyway, they're going to be on the stream next week,
so if you're interested in that, make sure you tune in. We also have a little surprise for all of you, something that's going to be announced very close to the stream next week.

If you're curious about some of the terminology, or you don't remember when we said something during the stream, we have Courtney behind the scenes writing a transcript of the entire stream. That means you can download the transcript, and there are timestamps next to all of the sentences. So if you remember that we spoke about something and want to look it up (even if the stream was only an hour and a half, it can still be difficult to find where that was), you can download the transcript, hit Ctrl+F, and search for the terminology you're looking for to find every occasion when either John or I mentioned it.

We do a survey every week; if it hasn't been pasted in the chat yet, it will be pretty much as soon as I say this. Please let us know what you thought about today's topic, what you'd like to see in the future, and how we did. It's very important for us to know what you'd like to see, and we take that to heart.

Virtual meetups are still happening around the world. If you go to communities.unrealengine.com, you can find a meetup group that is either close to you, or perhaps somewhere you would like to move to, so you can check out the area. They are usually holding their meetups on Discord, since we're not able to see each other in person at the moment. If there is no meetup community in your area, there's a little button that allows you to request to become a leader; fill that form out and we will hopefully get in touch with you as soon as possible.

Also make sure you check out our forums, and there's a great community Discord called Unreal Slackers, at unrealslackers.org: plenty of people to chat with about UE4, so go ahead and jump into the voice chat. It's a good time just to hang out with some people. And of course Facebook, Reddit, Twitter, LinkedIn: you name it, we're in all the places, and that's where you'll find all the latest and greatest news from Unreal Engine.

If you would like to be featured as one of our community spotlights at the beginning of the stream, make sure you let us know what you're working on. The forums are a great place, the Discord is another good place, and ArtStation has a whole Unreal Engine category right now; all fantastic places. Or you can just tag us on Twitter. They're all good ways to let us know, and we want to see what you're working on, because it's usually very exciting. You're all so great.

We also have countdown videos that we make for every stream. They are generally 30 minutes of development footage fast-forwarded to five minutes. Send that video to us and you might be featured as one of our countdowns; don't put anything on it (your logo, etc.), just send it to us and we will edit it together.

If you stream on Twitch, make sure that you use the Unreal Engine tag as well as the Game Development tag; those are the two best ways to be featured, or for people to find you, if you're specifically streaming Unreal Engine development on Twitch.

If you're watching this on YouTube after the stream was live, make sure you hit that notification bell so you get a notification when our content is uploaded to YouTube. Sometimes
there are big drops, like Unreal Fest Online, where I think we pushed almost 50 videos from all of the talks, so there's a lot of good content on our YouTube channel if you're looking for it. I already mentioned that next week we have the team from Giant Sparrow, who developed What Remains of Edith Finch; they're going to talk a bit about the environment art and how they put it together. The forum post is up on the forums, so you can go check that out if you're interested, or already start asking questions for the team if you would like to. And as always, thanks to our guest today, John; it's been a pleasure. I hope we get to see you on the stream again some time, maybe next time with Wyeth. Yeah, we're going through the content examples. Anything else you want to leave the audience with before we tune out? Oh yeah: in addition to processing.org, I think that The Coding Train on YouTube is great. So that's it. Awesome. Well, with those words, we're going to say goodbye to all of you. We hope you're staying safe out there, and we'll see you again next week. Bye, everyone.
Info
Channel: Unreal Engine KR
Views: 5,461
Id: xMfJP2HMXaI
Length: 96min 42sec (5802 seconds)
Published: Fri Feb 19 2021