Lights, Cameras, FX: Exploring What's Possible with Bifrost

Captions
Hi everyone, and welcome to Lights, Cameras, FX, where we will explore some of the things you can do with Bifrost. My name is Marcus Nordenstam, and I'm the product manager for Bifrost and the Bifrost extension in Maya. I'll give you a quick overview of what Bifrost is and present some of the work being done with it. I'm also going to give you a sneak peek at some exciting upcoming features. Then I'll turn the mic over to my fellow Bifrosters — Jonah, Karen, Ian, Feyi, and Costa — who will give you deeper dives into some of the things you can do with Bifrost right now, as well as a glimpse of upcoming features.

So what is Bifrost, and what can you do with it? In a nutshell, Bifrost gives artists and TDs the power of C++ developers without needing to code. By using visual programming, you can break past the barriers of MEL and Python scripting and use nodes to construct anything from custom deformers, constraints, or scattering tools to production-ready effects. You do this by creating nodes to define the logic of your tool or asset and packaging it into a compound that can easily be shared or reused by other artists. More on this later.

Bifrost in Maya, also known as the Bifrost Extension for Maya, was originally released in August 2019 during SIGGRAPH — a version we informally called Episode 1, as it kicked off the initial development cycle. Since then we've released nine versions, the current one being Bifrost 2.1.0.0. During this first development cycle, most of the team's focus has been on eliminating capability gaps, such as missing point-cloud lookups, raycasting, and native Maya spline curve support, while continuing to improve the general performance and usability of the visual programming environment. We're now nearing the end of this first development cycle, which we call Season 1, with the final release being Episode 2. In the deep dives you'll hear more about what you can expect with that version. And with the release of Episode 2, a new cycle begins, which we'll call Season 2.
I'll give you a few glimpses of what you can expect during Season 2 in a bit. By the way, in case you're unaware, you don't need to buy a special Bifrost license to get going: your current subscription to Maya or a collection covers your Bifrost usage.

Since the release of Episode 1 there has been an exciting amount of user activity, and we're now beginning to see significant adoption of Bifrost around the world — and not just in fluid simulation. In fact, I'd like to take a minute to showcase a few examples of Bifrost usage that have no fluid simulation going on at all. In this example, FX TD Bruce Lee has cooked up a Bifrost compound that you can use in your own Maya scenes to add blistering lightning effects. His compound makes use of Bifrost particles and strands. Bruce posted this compound and videos on the Bifrost forum, and since then many users have downloaded it and tried it out. Another example is artist Jason Labbe at One Animation, who recently created a skin collider using Bifrost. This is exactly the kind of tool that would normally require a C++ plug-in developer, but with Bifrost, motivated technical artists can make it themselves using the interactive creation environment that Bifrost offers. Finally, Phil Radford, also known as Maya Guy, has been getting fancy with Bifrost strands, creating procedural imagery that would have been difficult in Maya in the age before Bifrost.

I mentioned a sneak peek at what's coming in Season 2. Here's one upcoming feature: knit generation. You input a Maya mesh, and Bifrost generates a renderable knit structure composed of strands. These structures can also be simulated using the MPM fiber solver. We think this will be seriously cool for character effects artists. Another work in progress is USD in Bifrost. This means you'll be able to build procedural graphs that generate USD stages, which can be explored in the Outliner directly through the (also in progress) Maya USD plugin. Here's an example of our working prototype, which uses the new Bifrost USD nodes to procedurally generate point instance data, rendered with Arnold in the Maya viewport through a USD render delegate. Thanks to Guillaume Laforge for this graph and image. I should also mention, for those of you interested in participating in or testing out procedural USD in Bifrost, that we'll be kicking off a beta test of it in the Bifrost beta soon — just reach out to us and we can get you onto that beta.

Finally, it's my honor to hand the mic over to Jonah and Karen, who will take us through the first of the deep dives. I hope you've enjoyed this overview, and I'll see you soon in the Bifrost community. Thank you.

Thank you, Marcus. This is Bifrost scattering and visual programming. My name is Jonah Friedman; I'm the product owner of Bifrost, and this presentation is all about the creation of this image of a mountain forest and how we used Bifrost to achieve it. We used some third-party software here as well — Bifrost is a great tool for production, but it is of course not the entire production. The mountain geometry comes from Gaea and the tree geometry comes from SpeedTree, and as we'll see, both of these are processed and brought together in Bifrost and then rendered using Arnold. One thing worth noting is that this is a one-tree forest: every tree's geometry is identical. That makes it a bit of a stress test to see how much we can get away with, but in production you'd of course want to use a library of tree variations, not just one.
This image uses a bunch of Bifrost features — mesh processing for the mountain and trees, fields and volumes for the mist — but the most obvious feature we're using here is scattering and instancing. Instancing is a way to get massive complexity by generating copies of geometry in the renderer. It's very scalable, with fast-loading scenes and a small footprint at render time. The instances of the trees are generated procedurally by scattering them onto the landscape. There are a couple of things I want to call attention to here: the mountain has snowy and non-snowy areas, and the trees are varied in the same way, where some are more snow-covered than others.

Before we dive into the mountain, we're going to switch gears a little and look more closely at the creation of our tree. So we're going to go over to our look dev scene, where we're preparing the tree asset. The tree is on the left and is surrounded by the usual look dev stuff, including an 18% gray sphere and multiple lighting conditions, including backlighting, which is very important for trees. This scene has something a bit non-standard as well: a miniature forest in the background, because I want to be able to see how my tree works in the context of a forest, not just one tree on its own. In that forest I want to see a gradient from snowy to not snowy, seen here from right to left, and I also want to see that certain trees are dead — I want certain ones to be brown. In a way, this little forest is a microcosm of the mountain forest, since it needs to do all the same things, including creating data to drive snowiness and deadness.

The small forest is pretty simple, and here is a sped-up video showing how to create it; I'm using nodes here that are currently in beta testing. The first thing I do is bring in the geometry we want to scatter on, by dragging it to the graph — that's the triangle shape here. I create a node called scatter_points_blue_noise (more on this node in a minute, but it creates very nice point distributions based on a scattering radius), and I expose that scattering radius to the outside of the graph so I can dial it in the Maya Channel Box. If you look carefully, you can see the point cloud as a bunch of white dots when I select it. We want to instance some geometry onto those points, so I create a set_instance_geometry and a create_mesh_sphere, and together these instance a sphere onto each point. Now I'm adding a scale_points node and setting the scale to be the same as the scattering radius; this lets us see the packing size of these points, to visualize what blue noise is doing.

I also want a little more out of my spheres — in particular, I want to be able to see their orientations, and for that I need some colors on them. So I take the sphere I'm instancing, get the point positions, get the bounds, and rescale those values into a zero-to-one range; then I set my colors on the sphere and plug it in. This will now be my diagnostic geometry that lets me understand my scattering a little better, and I can collapse it down into a compound called diagnostic_instance_shape. So now I've made a new compound, which is a reusable building block for graph programming. I'm going to expose some more parameters on this compound by connecting them to the inputs, so I can control the resolution of my sphere from the outside. And I can go back inside and add an exponent (power) node to apply a gamma on top of the colors, just so I can see them a little more clearly.
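As an aside, that bounds-to-color trick is just a linear remap. A minimal Python sketch of the idea, with illustrative names (in the graph this is done with ordinary math nodes):

```python
def remap01(values):
    """Rescale a list of scalars into the 0..1 range using their min/max,
    the same remap used to turn point positions into debug colors."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0  # avoid division by zero on degenerate bounds
    return [(v - lo) / span for v in values]

# e.g. the x positions of a sphere's points become the red channel
xs = [-1.0, -0.25, 0.5, 1.0]
print(remap01(xs))  # [0.0, 0.375, 0.75, 1.0]
```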
Now I'm changing the size again to match the trees a little better — those would have been tiny trees just now — and I'm adding a randomize_point_orientation node; thanks to my diagnostic colors I can see those orientations. I'm also adding a randomize_point_scale node so I can adjust the min and max sizes, and I can adjust the f-curve to change the distribution of little ones to big ones. In this case the curve means there are more small ones than big ones. Now I'm going to bring in my tree preview geometry, and once I plug it in we'll see that it's too big. So I'm going to decouple the scaling, so that the packing size no longer controls the scaling of my trees, and I'm going to decrease the packing radius so we get denser trees. I don't mind if they overlap a little; I only want them to be some distance apart. Now I'm adjusting the scale again, now that I see it on the actual trees of my forest. Finally I bring in the full tree geometry and plug it in, and we essentially have a renderable forest. All that's left to do here is unhide the light, hide the source tree geometry, and we have our look dev forest going.

That scattering node we were just using is called blue noise. So what is blue noise? Blue noise is the name for a point distribution where the points are packed as tightly as they can be without touching, but without forming patterns like hex grids. This kind of scattering is really good for plants especially, because they like to grow some distance apart and they don't form geometric patterns. Our blue noise node is programmed in the graph and is the combination of the three papers seen up here. There's parallel Poisson disk sampling, which is a way to parallelize the search for colliding points to make it fast. There's dart throwing on surfaces, which is a way of continually subdividing a mesh into triangles — and keeping subdividing and removing them — in order to iteratively find every spot where you could possibly pack in a point. And there's sample elimination, which is a clever sorting algorithm that removes samples based on weights. The two famous paintings over in the corner are blue noise driven by weights; you can see how well the blue noise point packing recreates those images.

Last year I showed how you could implement some simple algorithms in the graph, like walking up and down strands. These algorithms are in a different league entirely, and they're also computationally intensive — these are expensive algorithms to execute. This setup can do the scattering for my mountain forest in about one second, and it can do a million non-overlapping points in four seconds, which is fast for this algorithm. So it's possible to create things in the graph that are very non-trivial and perform just as well as the C++ implementations. Visual programming is programming, and that's something we're very serious about.

Just to show what this looks like in the graph: I can double-click on my blue noise node and go inside, and there we see a few nodes like parallel_poisson_disk and dart_throwing, which match the names of the papers — these actually are implementations of at least parts of those papers. For example, this is the one that does all the triangle sorting and iteration, continually re-subdividing and removing the triangles, and this is the actual implementation right here in the graph. So if you need it for something else, you can go in there, grab it, and modify it for your own purposes. It's yours.
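To make the core acceptance rule concrete, here is a deliberately naive dart-throwing sketch in Python. It is not the parallel Poisson disk algorithm from the papers — no spatial acceleration, no surface subdivision, no sample elimination — just the "keep a candidate only if it's at least a radius away from everything accepted so far" idea that the fast variants build on:

```python
import math
import random

def dart_throw(width, height, radius, attempts=20000):
    """Naive blue-noise-style dart throwing on a 2D rectangle.
    Each random candidate is kept only if it is at least `radius`
    away from every previously accepted point (O(n) per test)."""
    points = []
    for _ in range(attempts):
        candidate = (random.uniform(0, width), random.uniform(0, height))
        if all(math.dist(candidate, p) >= radius for p in points):
            points.append(candidate)
    return points

samples = dart_throw(10.0, 10.0, 0.5)
print(f"packed {len(samples)} points with minimum spacing 0.5")
```

The papers referenced above are mostly about making this search fast and making it work on mesh surfaces; the acceptance rule itself stays the same.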
Going back to the scene: we also want variation in the trees from more snow to less snow, and some of them are dead, so I'm going to show how that's set up — and we're going to set it up in the graph again. The first thing I do is create a randomize_geo_property node, creating a property called point_dead, and I hook up a point scope to visualize the result. I made a mistake here, which is why it's all black: I forgot to set the max value. That's fixed now. I'm also going to adjust this f-curve to be very steep, because what I really want is for almost all the trees to be alive, with a few outliers — that's the distribution I want. The next thing I do is create the snowiness gradient from left to right: a gradient across the bounding box. This is much like what we did before when remapping the colors onto the spheres; we're remapping values from the min and max of the bounding box to zero and one. If I wire this all up properly, I can visualize those values here. Once again I can grab all of this and collapse it down into a compound; I'll call it set_x_gradient_property, because that's what it does. I'm building in an f-curve to allow a little more control over the distribution, and I can now take this compound, put it in my toolbox, and use it for any other purpose I need.

Point properties are very useful for controlling instances. Our instances are actually point clouds, where one scattered point is one instance, and we can use this data to control our look dev as if it were a texture in Arnold. This is done using the aiUserDataFloat nodes and similar. In the case of deadness, it drives a color correction that makes the trees brown, by simply color-correcting the color map. The snowiness, on the other hand, is slightly more complicated; we need a little more information to achieve that result. What we need to know is how much snow lands on each part of the tree. I call this data "snow prone", meaning how much snow is likely to collect on any given surface. On the left is what it looks like: branches near the periphery or near the top collect more snow than the undersides or the interior areas, which are dark.

To generate this, we have yet another Bifrost graph to process the tree, and it uses a feature called raycasting, which is one of Bifrost's geometry queries. What I'm doing here is very similar to ambient occlusion: we spray a bunch of rays upward from each vertex, and if a ray hits nothing, it found the sky. The fraction of rays that can see the sky is how snow-prone that part of the tree is. In this visualization, the colorful rays are the ones that found other geometry, while the blue ones escaped to the sky.

This next example has nothing to do with this project, but I built a ray tracer that visualizes refraction, and I couldn't resist throwing it in just to show how good our raycasting actually is. As I increase the depth and the resolution, it starts to turn into a volumetric caustic and not just a visualization of ray tracing — each of those lines is a strand.
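A minimal sketch of that sky-visibility estimate in Python. The `scene.raycast(origin, direction)` call and the `FlatRoofScene` class are hypothetical stand-ins for Bifrost's raycast geometry query, and the cone jitter is illustrative:

```python
import math
import random

class FlatRoofScene:
    """Toy stand-in for a raycast geometry query: reports a hit for
    any ray whose origin sits under a slab covering x < 0."""
    def raycast(self, origin, direction):
        return origin[0] < 0.0

def snow_prone(vertex, scene, n_rays=64, cone=0.35):
    """AO-style estimate: fraction of near-vertical rays that escape
    to the sky. 1.0 = fully open sky, collects the most snow."""
    hits = 0
    for _ in range(n_rays):
        dx, dz = random.uniform(-cone, cone), random.uniform(-cone, cone)
        n = math.sqrt(dx * dx + 1.0 + dz * dz)  # normalize the jittered up vector
        if scene.raycast(vertex, (dx / n, 1.0 / n, dz / n)):
            hits += 1
    return 1.0 - hits / n_rays

scene = FlatRoofScene()
print(snow_prone((-1.0, 0.0, 0.0), scene))  # 0.0: under cover, no snow
print(snow_prone((1.0, 0.0, 0.0), scene))   # 1.0: open to the sky
```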
Here's what the snow matte on my shader looks like. Snow is a somewhat binary thing: for any given spot, there's either snow or there isn't, because snow isn't transparent. So we're going to use the snow matte we just generated as if it's a texture, and we're going to offset it using the point_snow parameter. We have data coming from the tree itself, and data coming from the point cloud that's instancing it. When we offset the snow matte by the point_snow parameter and then increase the contrast a lot, we get the equivalent of a reveal in compositing with a mask. And this is the result: from left to right, that's what our snow mask looks like as we feed it into the shader. This is also an AOV that the renderer generates — compositors love this kind of stuff.
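That offset-then-contrast step is simple enough to write down. A sketch with illustrative names and constants, not the actual shader network:

```python
def snow_mask(snow_prone, point_snow, contrast=8.0):
    """Offset the baked snow-prone value by the per-instance snow amount,
    then push the contrast hard so the mask is nearly binary -- the
    'reveal with a mask' trick from compositing."""
    v = (snow_prone + point_snow - 0.5) * contrast + 0.5
    return max(0.0, min(1.0, v))

for point_snow in (0.0, 0.3, 0.6):  # less to more snowy instances
    print([round(snow_mask(s, point_snow), 2) for s in (0.2, 0.4, 0.6)])
```

Raising `point_snow` slides the whole matte toward "snow", so the same baked tree data yields a continuum of snowier and less snowy instances.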
Here is another view of the trees. These trees have leaves that are cards with opacity maps, and we can color-correct the opacity map to grow the leaves that have snow, which creates the illusion that the snowy leaves are bigger than the ones without — like a hunk of snow sitting on top of the leaf. Another thing worth pointing out is that this is all procedural: I could put a whole library of tree assets through this and have all of them respond to the deadness and snow settings in the same way.

So, back to the mountain. This scene's forest is actually a lot like the other one, but with a few key differences. Here's the Maya scene for the forest, and this graph uses a lot of the same pieces as the look dev scene's graph, but it's a bit more complex. From left to right: in the red, we've got the mesh processing that deals with the point properties on the mountain itself, much like the point properties we created in the other scene. We've also got the scattering using blue noise — again the same as the other scene, except this time we're using weighting — and we're transferring the data over to the instances using the sample_property nodes. So the blue is the scattering, the blue at the bottom transfers the properties over to the instances, and this is how we set up our snowiness and deadness. Then we do scales and rotations, in the green, which is very similar to the other graph. Finally we set the instanced geometries, and the difference here is that we're instancing an Arnold render archive, known as an .ass file. The tree, with the raycast data on it, is all baked down into one .ass file, so none of that has to happen in this scene — this one can stay nice and light.

Here's an aerial view of our mountain. Our scattering is driven by drainage data, seen at the bottom right. Gaea produced our landscape here and also generated this really nice drainage map, which shows where the ground sees water. If you use that to drive your scattering, you get something pretty natural-looking. After that, I drive the scale of the trees using the same data, but processed with an f-curve, so that where the ground gets drier, the trees start to get a bit smaller. The snowiness of the mountain is of course driven by elevation — higher elevations have more snow — but it's also broken up by raycasting to figure out where there are areas of shadow in the mountains. In areas that can never see the sun, the snow lingers longer, because the sun won't melt it away, and that helps break up the snow line with some pretty natural-looking patterns.

Then there was another mini-project in here: creating morning mist on the mountains. I wanted my mist to have a raggedy, torn-up feeling, so I created a ragged_clouds compound for this project. It uses another feature called fields. Fields let you create a function that varies over space, like the swirling noise field you see on the right. These kinds of noise fields are endlessly customizable, and fields really are an expressive tool for that. I then use the field to distort the volume of the torus along the flow of the field, and that's what gives me my ragged cloud result. Here's the cloud I generated for the mist, but with the density turned way up. It doesn't quite hold up at this density, but I wanted to show it so you can see its exact 3D form, because the 3D form is still important and apparent in the more subtle version, which is this one here. If I go back and forth, you can see that it's not flat at all — there are trees poking through it a little — and the differences in density really help sell the effect.
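The "function over space" idea is easy to sketch. Here is a toy field and an advection step in Python, standing in for the graph's field nodes; the real noise field is much richer than this simple swirl:

```python
import math

def swirl_field(p):
    """A field is just a function over space: point in, vector out.
    This toy field swirls around the y axis."""
    x, y, z = p
    r = math.hypot(x, z) + 1e-6
    return (-z / r, 0.0, x / r)

def advect(points, field, dt=0.1, steps=20):
    """March points along the field's flow -- the same idea as
    distorting the torus volume along the swirling noise."""
    for _ in range(steps):
        points = [tuple(c + v * dt for c, v in zip(p, field(p)))
                  for p in points]
    return points

print(advect([(1.0, 0.0, 0.0)], swirl_field)[0])  # begins orbiting the y axis
```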
Now we're going to shift gears and show an unreleased feature that's still a work in progress: terminals, which are render outputs. One thing you may have noticed in this presentation is that I've very often shown diagnostic views — visualizations that aren't the final output, just there to demonstrate what the graph is doing. These visualizations are generated in the graph and are a very important part of the graph authoring process, but installing them in your graphs all the time while authoring is a bit cumbersome, so we would like to build them into the compounds themselves. For that, I'll hand it over to Karen, senior software developer on the Bifrost UX team.

One of the features of terminals is the D flag, which is short for "diagnostic". The diagnostic flag, when activated, allows a node to explain itself visually. Here, a node that randomizes rotations is showing those rotations as a circle around the axis of rotation, with a curved arrow for the amount. That was an example of how terminals can be used as a consumer of the graph; as a compound author, you can use terminals to make your own compounds explain themselves. By placing a terminal node inside a compound, you can hook up any kind of geometry you want. In this example, we have a compound that does a closest-location check to generate weights, which are used to drive instance selection. As the compound's author, you set up the visualization of what it's doing in the graph and add the terminal node, which causes the flag to appear one level up. The result: a compound that explains itself.

There are two other flags as well, final and proxy, and these are even more transformative to the graph programming experience. With these flags, a graph with only one compound in it can generate renderable geometry. We separate final from proxy so that lightweight geometry can serve as a viewport proxy while heavier geometry is used only for the final offline render. For example, we can collapse the entire generator for our ragged cloud into one compound and wire the outputs into terminals. Now we can create these ragged clouds from a single node, complete with visualizations of the noise field and a fast, low-resolution preview of our wisps for the sake of interactivity; the full volume is only computed when it's time to render.

Now let's go over to Ian to talk about features and improvements to the Bifrost user experience.

Hello, my name is Ian Hooper, user experience architect on Bifrost. Together with senior software developer Karen Tam and senior experience designer Feyi Egbadje, I want to share some of the exciting things we're working on to improve the user experience in Bifrost. Computer-supported human creativity represents a new way of working, but to unlock this power for everyone, we need to make the process of creating and directing these systems understandable and aligned with how people like to work. Typical artist tools favor direct manipulation and real-time feedback, but they lack scalability. Larger projects work by coordinating the collaborative efforts of many artists working toward a common goal; pipeline efficiencies and communication improvements have helped, but there are diminishing returns as the number of people increases. The next step is to use simulation and proceduralism to provide a massively accelerated starting point to build upon. Creativity also means flexibility, which is why we believe the system should not be a black box. Starting with individual artists, Bifrost graphs can integrate into existing Maya workflows in ways that build on existing knowledge and experience. Explore a little more, and you see in a clear, visual way how the tools and the data flow at each stage; new knowledge builds upon previous learning, creating a scaffold to mastery. Today we're going to talk about a small number of recent improvements and future plans that give a taste of how we are improving the user experience in Bifrost.

Watch points in Bifrost are the mechanism used to interrogate the data coming out of a port. They are handy and useful, but they are simultaneously too large and too small. They are too large because of how they can block your view of the graph; the new design we're working on will have detachable watch points that can easily be toggled open and closed and pulled away from the wire for easier viewing. The current watch points are also too small, because they can often show only a fraction of the full data. For times when you need access to all of the data, you will be able to open the data browser window. This spreadsheet-like view will include all of the data passing through the node, while also offering filters and other tools to make it easier to find what you want.

One truism of experience design is that small details can have an outsized effect on the overall workflow. Karen Tam will now share work she has done to add little shortcuts and conveniences that will greatly improve the graph-building experience.

The power of visual programming is that the data flow is made literal, in the form of wires connecting different operators in a sequential format. At individual nodes, the data is processed in a specific way, and knowing how can help in understanding the graph as a whole; knowing that data will be processed in a programming loop, and how it will be iterated on, is an example of this. In Bifrost, when combining arrays of data with other arrays or single values, you do not need to explicitly add a looping node. Ports that have been auto-looped are now identified with a special icon to assist with the readability of the graph.
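In text-programming terms, auto-looping behaves like implicit broadcasting. A rough Python analogy, not the actual Bifrost semantics in full:

```python
def auto_loop(op, a, b):
    """Rough analogy for auto-looped ports: if either input is an array,
    apply the operation per element, broadcasting any scalar input."""
    if isinstance(a, list) or isinstance(b, list):
        a = a if isinstance(a, list) else [a] * len(b)
        b = b if isinstance(b, list) else [b] * len(a)
        return [op(x, y) for x, y in zip(a, b)]
    return op(a, b)

print(auto_loop(lambda x, y: x * y, [0.3, 0.5, 0.8], 2.0))  # [0.6, 1.0, 1.6]
```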
Another recently added feature is the ability to display the node type as well as the name, which is especially valuable when an author has given a node a custom name. We also now have the ability to show the data value of some nodes right in the title of the node, making it easier to see at a glance what is happening in the graph.

Now I'll look at some of the things we've done to make graph-building workflows more efficient. One way to streamline user workflows is a port feature called type wrangling. Wrangling a type allows the user to conveniently set any immediate value they want directly on the port, without losing any of the type-inference advantages the compiler provides if the user later connects the port. Type wrangling can be performed either through the standard set-value-type dialog or through a handy context menu providing port-specific suggestions of types. Managing the data flow in a graph can be complicated and tedious; features such as overloaded compounds and auto-type ports allow the user to be less concerned with selecting the correct type. In cases where the data type needs to be explicitly set, the new option of showing suggested, supported, and recent types speeds up the workflow by reducing the need to go into the full set-value-type dialog. In a similar way, another port feature, which provides a list of suggested nodes to create and connect to the port, can also accelerate the graph-building process. In both cases, suggestions have the added benefit of providing a way for people to learn while doing.

I'll now hand it back to Ian to talk about some future functionality that will further streamline the process of creating and using graphs in a typical Maya pipeline.

Thank you, Karen. A typical Bifrost workflow is part of a continuum between the host application and the node graph. Changes that you make in the graph, such as the ones Karen just described, will affect the functionality and appearance of a port in the parameter editor, and some of those inputs will be exposed in the host application. Let's look at an example of a port that specifies the base color on some geometry. From the perspective of the data, this is a float3 or float4, but from the user's perspective, those number values should be represented as a color. Applying type interpretation — either automatically or by the user explicitly setting it — determines what sort of UI options are available at that port. If that port is promoted to Maya's Attribute Editor, the UI options set in Bifrost are respected but rendered in a format appropriate to that context. In this way, the Bifrost tool builder's design intent is preserved in a way that also makes the experience familiar and efficient for Maya artists.

Building on prior knowledge is a core design principle for us. A corollary is that new knowledge must be discoverable and incremental. Bifrost is not meant to be just a back end for building custom tools for your digital content creation application; we want to erase the arbitrary line that separates creators from consumers. The way to do that is to let people build upon prior experience, learn from others, and feel confident to explore. It might start with using a custom tool someone else built, but with the underlying mechanisms available at every stage, curiosity or necessity will lead many to discover more.

The next phase in the adoption of Bifrost requires creating and sharing purpose-built tools. Here is the forest-building scene that Jonah recently presented. We can imagine that Jonah, as an expert Bifrost user, has shared this forest tool with me.
Even though Jonah has taken the time to carefully organize the graph, the whole experience of wiring together nodes might not appeal to me. In that case, Jonah could expose some controls up to Maya, so I could use this Bifrost-powered tool as I might use any other Maya plug-in today.

We will continue to make improvements in the look and feel of the graph, but the gap between traditional workflows and computational creativity, as it exists today, is still too great. There needs to be a more gradual step between standard tools and visual programming. Looking at the graph that Jonah has made, you can see that much of what is presented is a linear sequence of data flow. There are many scenarios where the data processing can be understood as a simple stack of operations — a mental model that will be familiar to many users who have experimented with existing procedural tools like MASH or 3ds Max's modifier stack. In Bifrost, there will be areas where nodes can be constructed in this way: specialized backdrops will introduce people to contexts with more restrictions on what can be added and how the data flows, which will also make them easier to use and learn. Imagine that forest tool again: now I have a bridging experience that is more familiar to me, but that at the same time opens up the opportunity for deeper understanding. To make the integration of Bifrost with Maya, or any DCC, as fluid and seamless as possible, the node stack will have analogs in the host application that reinforce the connection to the graph and build on prior learning.

Now, to bring us back to a very pragmatic solution for making graphs more accessible, Feyi Egbadje will tell you how presets and defaults will make compounds and graphs safer to explore and easier to learn and use.

Thank you, Ian. Everyone understands the importance of undo: not only does it protect you from accidents, but it gives you the courage to experiment. Of course, Bifrost is already undo-safe, and as a system that is primarily about using reference nodes, it is easy enough to reset things. We now want to extend that basic foundation with a system of safe default values and customizable preset overrides. Compounds have many parameters, and it can be hard to get the best settings. By allowing compound authors to ship a set of default values with each node, they can ensure that their compounds always work as expected, and users consuming the compounds will have the security of knowing they can reset a compound back to its original values at any point. Indications on the changed values help users learn which attributes have changed, and help with graph readability by letting them see what they need to focus on. After learning how a node works, a user may want to change the defaults to meet their own requirements. They can then publish the node with new defaults, or save the new set of values as a custom preset.

Presets are a way of saving parameter values on an individual compound, or even on the whole graph, letting you keep a collection of settings for quickly switching between one look and another. Compound presets are useful because they allow users to assign safe sets of parameter values to compounds, which makes for quicker, easier graph authoring. Here, you're able to modify your parameters as desired and then save them as a new compound preset, and you can do this for as many presets as you want for that compound. Then, in the presets tab, you can easily view, edit, and compare all existing compound presets for the compound in question.
Graph presets are different from compound presets because they store a complete record of the top-level compound: its nodes, their connections, and their parameters. Here we have the graph preset dropdown, similar to the one for compound presets, where you can easily select from the list of graph presets or create a new one. In the presets tab you can see all existing graph presets for the graph, and each group here represents a graph preset. Expanding a group reveals the parameters for each compound that is part of the preset. You can also compare the difference between your current graph preset and the original graph: here we have a deleted compound, dimmed with a red outline — you can see there's no connection, because it has been deleted — while a newly added compound is displayed with a green outline. You can also see which compounds are part of the preset, because they are outlined in yellow.

Now we're going to hand over to Costa, who will walk us through some improvements to character effects and volume tools. Thank you.

Hello. For this part of the Autodesk Bifrost vision series, we'll be focusing on the effects- and simulation-specific features and updates available in Episode 2. I'm Constantinos Stamatellos, Bifrost FX product designer and product owner, and I'll be your host. So let's jump right into it. Today we'll be covering three broad topics, starting with improvements and new features for our simulation systems, followed by updates to the Bifrost volume toolkit, and ending with a brief overview of what to look forward to after Episode 2. Let's explore each of these categories more thoroughly.

In case some of you are not yet familiar with simulation in the Bifrost graph, I'll take a moment to mention that there are three main simulation systems available: the material point method, also known as the MPM solver, for character effects like cloth and shells, along with granular simulations like sand and snow; the aerodynamic (Aero) solver for everything smoke, fire, and explosions; and finally a particle solver for generic artistic effects. For Episode 2 we focused a lot of our efforts on important improvements and new features for character effects via MPM cloth and shells, for Aero in general, and for the influences system, which affects all simulations. I'm also extremely excited to announce that we've implemented Escape termination for aborting any simulation mid-frame.

One of the biggest challenges we've had with the MPM solver, in both granular and cloth simulations, revolves around providing accurate surface collisions and good self-collision behavior when working at low resolutions. We've invested a significant amount of research and work to rethink the collision distance calculations in the MPM solver and implement changes that provide accurate surface and self-collisions without requiring the very high solver resolutions that make artistic iteration long and difficult. Thanks to these important changes to surface collisions, it is now easier and faster than ever to leverage the MPM cloth solver for advanced effects like detailed clothing on fast-moving characters without any interpenetrations, along with tackling other complex scenarios like confined layered solves. These improvements also benefit plastic shell-type simulations, where accurate collisions with the ground and with other animated collision objects are crucial, even at the low resolutions where most of the physical parameter tweaking occurs.
Granular simulations also profit, by allowing fine collision objects to have the intended effect even at coarse particle sizes. As for the specific behavior of self-collisions, we had noticed that at coarse solver resolutions there was trouble detecting cloth discontinuity boundaries, such as thin slits, and this was preventing the expected soft collisions at the separations. This clip demonstrates the issue on the left, where the cloth behaves like one continuous piece even though there are clear cuts in the fabric, and shows the updated version on the right, where the slits are detected and behave as expected. This also gives better collision interaction between fine pieces of cloth.

On the Aero side of things, our focus for Episode 2 has been two-fold: first, to continue bringing important performance enhancements, with emphasis this time on parallel scaling and memory allocation; and second, to continue working on how to obtain the maximum amount of non-diffusive fluid detail while eliminating any possibility of unwanted voxel artifacts. Our Aero solver already boasts enhanced accuracy, tracking velocity and acceleration over time for automatically physical results, and it's also equipped with a bespoke revised method for cubic transport interpolation designed to minimize voxel artifacts. This, combined with vorticity influences, can produce stunning, detailed, realistic results. However, depending on the type and scale of the simulation, obtaining the desired amount of fine detail is not always easy.

To improve on this, we've added two new features to the Aero system. The first is a type of detail refinement, or up-resing, done during the course of the simulation. It works by taking the temperature and fog density properties and solving them a number of resolution levels higher than the velocity solve, where the user specifies the number of levels; the results are then fed back into the simulation for the computation of the next frame. This refines the finer details while maintaining the overall shape of the simulation. It's less expensive than a total resolution increase, and it can also be applied to situations where voxel artifacts might otherwise be encountered. The second detail-enhancing feature we implemented for Aero is the ability to emit and advect UVW texture coordinates along with the Aero fluid, which can later be used at render time to bolster detail via noise shaders applied as color, transparency, displacement, and so on — and also to achieve other fun artistic effects. To handle the texture coordinates stretching and twisting during the Aero simulation, this feature comes equipped with the option to source multiple sets of UVW coordinates, which can be offset from one another and periodically reset while fading in and out, so as to give the impression of continuity and mask the moment the reset occurs. Combining the detail refinement and the advected texture coordinates opens the door to an array of exciting artistic possibilities.
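That staggered reset-and-fade scheme is essentially the classic flow-map cross-fade. A minimal sketch of the blend weights for two coordinate sets, with illustrative parameters:

```python
def flow_uv_weights(t, period=2.0):
    """Blend weights for two UVW sets advected with staggered resets.
    Each set fades out as it approaches its own reset, hiding the jump;
    the two weights always sum to 1."""
    phase = (t % period) / period        # set A resets at phase 0, B at 0.5
    w_a = 1.0 - abs(2.0 * phase - 1.0)   # triangle wave: 0 at A's reset
    return w_a, 1.0 - w_a

for t in (0.0, 0.5, 1.0, 1.5):
    print(t, flow_uv_weights(t))  # whichever set just reset has weight 0
```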
We've also added support for physical viscosity to the Aero solver, enabling simulations like cigarette smoke and von Kármán vortices.

The next simulation component to get a major upgrade for Episode 2 is the influences system. It has been completely rebuilt from the ground up using a system of fields, where fields are essentially just mathematical functions that vary through space and can be sampled at any point. We can now use fields to procedurally define forces, and masks for those forces; to create drag; to modify the values of any simulation property; and to do anything else required to influence the simulation in a multitude of artistic fashions, with the full ability to use mathematical operations. The best part is that it can all now be accomplished at the top level of the graph, without ever having to dive into the solver and without using any black-box compounds. All of the simulation influences that have been available up to this point are still available in Episode 2 — except that now you can dive into them to see and learn how they were built. You can even explode them, modify them with added field operations, and ultimately build your own custom field influence from scratch, right at the top level of the Bifrost graph. For example, this array of field operations with rotations can be packaged up into a spin influence. Similarly, we can use fields to make an align-to-velocity influence, and another to drive color with orientation. In this final example, the highly sought-after pyroclastic Aero detail is accomplished by using fields to add fractal noise to the voxel velocities.
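As a rough picture of "force plus mask, both as fields", here is a Python sketch; the falloff and names are illustrative, not the actual influence compounds:

```python
def sphere_mask(p, center=(0.0, 0.0, 0.0), radius=2.0):
    """Mask field: 1 at the sphere's center, fading to 0 at its surface."""
    d = sum((a - b) ** 2 for a, b in zip(p, center)) ** 0.5
    return max(0.0, 1.0 - d / radius)

def drag_influence(p, velocity, strength=0.8):
    """A drag influence as a field: a force opposing the local velocity,
    scaled by a spatial mask, evaluable at any point in space."""
    m = sphere_mask(p) * strength
    return tuple(-m * v for v in velocity)

# a particle inside the mask gets slowed; one far outside is untouched
print(drag_influence((0.5, 0.0, 0.0), (0.0, 3.0, 0.0)))  # (0.0, -1.8, 0.0)
print(drag_influence((9.0, 0.0, 0.0), (0.0, 3.0, 0.0)))  # (0.0, -0.0, 0.0)
```

Because the mask and the force are both just functions of position, they compose with any other field operation, which is what lets the rebuilt influences live at the top level of the graph.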
Last but not least, I'm very happy to announce that Episode 2 brings Escape termination for aborting any simulation mid-frame and quickly regaining control of the application. This applies generally to all compute-intensive graph computations, such as iterate and for-each loops.

With that, we've wrapped up the section on simulation-specific updates; let's now look at what's new with volume tools. Episode 2 sees some major updates to existing volume tools and also introduces some new ones. Let's look at the four major Swiss-army-style multi-function volume compounds. convert_to_volume, which can process arrays of meshes and points as inputs, has been there since the beginning, but we've improved the default point-conversion level set quality. We're also working hard on new functionality that enables the creation of fully adaptive, multi-resolution volumes right off the bat, which can't be found anywhere else. The volume_to_mesh compound has also seen its share of enhancements, most notably the ability to create a mesh from any volume property, not only from a level set. This lets you do things like create a mesh from an Aero simulation volume — which doesn't contain a level set — by instead using the fog density values for the meshing, opening up possibilities for creating anything from crispy chicken to shiny raspberries. This compound is also fully adaptive, offering several options for multi-resolution meshing. The new merge_volumes compound provides functionality for combining level sets using the customary boolean operations, and for mixing fog densities with a variety of modes; used in tandem with the previous compounds, it enables important end-to-end artistic geometry manipulations. merge_volumes can also be used to produce controlled sourcing for simulations, for example only at geometry intersections. Finally, if we throw in the new field_to_volume compound, which lets you use the same fields we saw earlier for simulation influences — but now as volume geometries — the possibilities are endless.

Before I go, I'd like to quickly mention what we can look forward to after Episode 2. Keep an eye out for potential particle bonding workflows in or around Episode 2, but beyond that, we have our sights set on things like initial states for all simulation systems, hair and knit simulations through MPM fibers, rigid bodies, and physical fracturing of materials. Thank you so much for watching; I can't wait to see the amazing things that you create.

All right, I think we're in the Q&A session. I'm Marcus Nordenstam, and here's the Bifrost team, live with you. I'm just going to start going through these questions, and we'll get through as many as we have time for; we'll make an effort to answer the other ones in some other way.

The first question: would you recommend cloth simulation with Bifrost or nCloth? Costa, go ahead and take that one. Yes — I would absolutely recommend Bifrost MPM cloth. As you saw in my video, we've been working really hard on bringing MPM cloth simulation up to par. As a matter of fact, we did look to Nucleus to see what it was doing better than us, and things like the accurate collisions we've been focusing on, and the better self-collisions, were heavily influenced by the way Nucleus works, along with the detection of thin slits. The idea is to bring people into the Bifrost graph with familiar things like cloth simulation, but offer a much more powerful version, because you're now in a fully procedural system. So we've been working really hard to make MPM cloth on par with nCloth, and even take it beyond that. Thank you, Costa.

From Hernan we have a question: does Bifrost provide the whole VDB library? I'll quickly answer that one. It does not currently do everything the whole VDB library does, but our goal is certainly to provide the equivalent operations — keeping in mind that we can do things in a fundamentally adaptive, multi-resolution way, as Costa has been showing. So that's the caveat: we don't currently do everything, but we do quite a lot, so do check out how it works today.

What about rigid body effects? I'll let Costa speak to that one. Yes — as you probably saw very briefly at the end of my presentation, it's something we plan to do, slated for post-Episode 2, so keep an eye out for that.

Great. From Akil Hotra — sorry if I mispronounced your name — how does Bifrost compare with MASH and XGen for scattering and instancing? Great question; Jonah, maybe you're the right one for that. Sure. In the viewport, display of instancing in Bifrost is a lot faster, and I think it's also a lot more flexible: with full visual programming, and instancing render archives and so forth, you can really get far with it. Of course, we don't have the super-high-level workflows in the Attribute Editor and things like that — you will be doing a bit of programming in the graph — but I think it compares very favorably. It's a bit of an apples-to-oranges comparison, but overall very favorable. And just to add to that: Jonah didn't mention the visual programming, procedural difference, and I think that one is quite big, because only with Bifrost can you put together a custom scattering compound or setup that you can then reuse. That reusability — the ability to customize a scattering setup, package it up, distribute it, and reuse it — can't easily be done in MASH, or in Maya in any standard way right now.
Hernan is asking: is the rendering SDK coming soon? We want to render with other engines such as Redshift and V-Ray; are you working to make this available? Basically, this comes down to what we are looking at during Season 2, which is the SDK. We're not announcing a public SDK yet, but we're certainly working on one. So the answer is that as soon as we have the public SDK available, or are further along with it, we plan on continuing — as we already do — to work with our rendering partners, including companies like Chaos Group, to make sure the SDK and API work for them, and to make really great Bifrost integrations with these other vendors.

From Maxime — hi Maxime — we have a question: in the scattering demo, can you talk about the text area at the top left of the graph that is displaying some information? Jonah, do you know what he's referring to? I think so — I think it's in the mountain forest, where I had the mesh processing stuff all the way on the left, in backdrops. I did hand-wave over a bunch of detail there. That was the part of the graph I talked about a little later, doing all of the raycasting and processing of the wetness data: processing that data through some f-curves, and also raycasting the mountain against itself to see which areas were in shadow, in order to build all the data I needed on the mesh to drive the scattering — and also creating a diagnostic of that to show in the viewport if you needed it.

Great. Another question from Akil, also related to instancing — I think for you, Jonah, as well: can Bifrost instances be painted on geometry? Yes, and if you don't mind, I'll answer the next one at the same time, which is: is it possible to mask instances with textures? Because the answer is much the same. We can bring color sets from Maya into Bifrost, so if you paint a color set and bring it in, you can absolutely use that; and if you bake a texture onto your mesh as a color set, you can use that as well. As far as actually reading the textures themselves in the graph, we don't have that yet, but it's something we're definitely looking at closely.
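One simple way painted data can drive scattering, sketched in Python: treat the painted weight at each candidate point as a keep probability. The helper below is illustrative, not a Bifrost API:

```python
import random

def mask_instances(points, weights):
    """Keep each scattered point with probability equal to its painted
    weight (0..1), e.g. a Maya color set or a texture baked into one."""
    return [p for p, w in zip(points, weights) if random.random() < w]

pts = [(float(i), 0.0) for i in range(10)]
painted = [i / 9.0 for i in range(10)]  # dark to bright across the mesh
print(mask_instances(pts, painted))     # mostly points from the bright end
```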
Thank you. From Phil Radford — Phil, how are you doing? — are there any plans for destruction solvers? Yes: as you saw at the end of Costa's demo, we do have that planned during Season 2, so stay tuned.

Another question, from Manny Wong: can Bifrost support NURBS surfaces now? The answer is no. If you're asking whether you can drag and drop a NURBS surface into the Bifrost graph, that is not currently possible. We didn't show this, but one of the other items currently in development, and slated for Episode 2, is the ability to drag and drop Maya NURBS curves into the Bifrost graph; at some point we will look at porting over NURBS surfaces as well, but that's not currently scheduled. So for the moment, the answer is no.

From Akil we have a question: will compound presets be shareable and exportable for others on a team? Ian, do you want to tackle that one? Yes — this is still in the ideas and future-vision stage, but that's part of the plan. We already have the ability to publish compounds and share them, so having presets as something that can be shared within your team is definitely part of the thinking. Thanks, Ian.

Hernan is asking: are the MASH nodes already ported into Bifrost, and if not, when can we expect them? The answer is no, the MASH nodes have not been ported into Bifrost — not by us, anyway. We don't have concrete plans to do exactly that, but we have had many discussions about it internally. The nuance is that MASH uses a lot of Maya's other features, such as viewport selection, manipulation, stacks, Outliner commands — all kinds of things like that. So, as Ian was showing in his presentation, we need a deeper integration of Bifrost with the DCC — Maya, in this case — for it to make sense to build compounds in Bifrost that are that easy to use, that interactive, and that integrated into Maya. Ian, any more nuance to add to my answer? Yeah, that's the idea. The other thing is that it would be part of the Bifrost ecosystem, so it wouldn't be just a black box: you'd have that ease of use combined with being able, if you so choose, to open it up and make modifications directly in the graph.

From David we have a question: hi there, do you plan to support fuller procedural scene building and lighting with USD in Bifrost, similar to Katana? I guess the answer is that we don't know exactly yet what the final form or design will be around how we expose USD in Bifrost. What we have done, I can simply say, is take the USD API and express it as nodes in the visual programming environment of Bifrost. That means that anything you could do in C++ through the USD API as a programmer, you could likewise do with Bifrost — using Bifrost to program USD that way. The idea is to make it as flexible, powerful, deep, and limitless as we can as a starting point, and then from there figure out what the best scene assembly, layout, or lighting workflows could look like, starting from the ground truth of USD itself, deeply embedded in the visual programming system. That's where we are right now; building a higher-level procedural USD workflow on top of that is something we'll be working out, but that's the direction we're heading in.
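For a feel of what "the USD API as nodes" covers, here is the same kind of point-instancing setup written against the open-source USD Python bindings (pxr) — a sketch of the equivalent API surface, not the Bifrost nodes themselves, with a cone standing in for a tree prototype:

```python
from pxr import Usd, UsdGeom, Gf, Vt

# Build a stage with a PointInstancer, the API-level equivalent of the
# procedural point-instance graph shown earlier.
stage = Usd.Stage.CreateInMemory()
proto = UsdGeom.Cone.Define(stage, "/World/Prototypes/Tree")

instancer = UsdGeom.PointInstancer.Define(stage, "/World/Forest")
instancer.CreatePrototypesRel().AddTarget(proto.GetPath())

# A row of five instances, all using prototype index 0.
positions = Vt.Vec3fArray([Gf.Vec3f(x * 2.0, 0.0, 0.0) for x in range(5)])
instancer.CreatePositionsAttr(positions)
instancer.CreateProtoIndicesAttr(Vt.IntArray([0] * len(positions)))

print(stage.GetRootLayer().ExportToString())
```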
From Akil: does Bifrost have any plans to delve into crowd simulation? That is also something we've been talking about — I'll let Jonah or Costa jump in if they have an opinion — but basically we don't have anything on the schedule or roadmap specifically around crowd simulation. Very briefly, from my point of view, you could begin to roll your own at this point, at least basic ones: you have particle systems, and you can do instancing, as you saw. In terms of higher-level, professional, ready-to-go systems, we don't have it on the schedule yet, though it's certainly something we discuss.

Just one comment there: we do have some key pieces that would let you build a certain kind of crowd system yourself right now. You could build a stadium crowd pretty well, I think, because we do render archive instancing, and we allow time-offset render archive instancing. So if you had a sequence of your character cheering, you could very easily instance that into a stadium, then time-offset it and select between a bunch of different variations, and so forth — and that's what you need for a stadium crowd. It's not, of course, a full crowd system with interacting actors and uniquely deforming geometries, which we don't have any solid plans to make at the moment, but a certain kind of crowd is totally viable right now.

From Akil: does Bifrost plan to support simulating on the GPU in the future? We would love to do that. We have projects like that which, at least in terms of testing and prototyping, may happen in the near future, but that's where we are: beginning to think about it, beginning to do technical explorations around it, with nothing firm to say beyond that yet.

From Nitin Vadday: can area lights, or shape-based lights, be populated using Bifrost? So basically, can you create lights from instancing or procedural geometry — if you want to populate buildings with lighting, say. I think that's what they're asking, and Jonah, I think you're the right person for it. Sure. You can't generate lights directly, but you can generate meshes, and so you can generate a mesh light. If you wanted to generate a mesh with data on it for what the intensity and color of the light should be, you could hook that up to a mesh, export it to Maya, and then make a mesh light, and you'd have the equivalent of a procedural light. Feel free to ask that question on the forums and we'll provide an example, because it is a bit of a tricky setup.

Great. Do you see Paint Effects being rewritten in Bifrost? From David. That is also something that has been discussed. We're about out of time here, so I'll simply say: once Bifrost has matured to the point where somebody could basically recreate Paint Effects exactly as it is today — more or less the same interface and everything — that would be a good test, a good way of validating where we are. I think it would be a great achievement to be able to say that someone could do that, including of course Duncan, who created it in the first place. But again, it's not something we specifically have on the schedule, at least on our team, so far.

We have about five pages of other questions here — it's great to see all these questions, and I'm very happy to see there's interest in our project here at Autodesk — but we are out of time for our Q&A. Like I said at the beginning, we will attempt to answer the remaining questions one way or another and give those answers back to you in the community. So stay tuned — maybe keep an eye on the Autodesk AREA site or the blogs; we'll see where we can put the answers. Thank you very much, and have a great rest of your day.
Info
Channel: Autodesk
Views: 15,648
Id: 4D02lB-Tq4Y
Length: 61min 44sec (3704 seconds)
Published: Mon Sep 14 2020