>>Amanda: Hey folks, welcome to this week’s news
and community spotlight. The release of Unreal Engine
4.23 is coming right up. In the meantime you
can try new features in the Previews, now available
on the Epic Games launcher. Try out the new Chaos physics
and destruction system, the Skin Weight Profile system, the new Unreal Insights Tool, and loads of virtual
production upgrades! You’ll also see
significant improvements to the Niagara workflow,
and many Editor updates. Check the forum thread for a
full list of upcoming changes and we’d love to hear
your feedback there. Do keep in mind that
Previews are intended only to provide a sample
of what is going to be released in the update, and
they are not production-ready. Make a copy of your
project for now! Recently, we caught up with the
creators of Last Oasis, Donkey Crew. Starting as a
modest modding team, they’ve grown into
a 30-person studio and are leveraging their
experience to reinvent the survival genre with
an MMO experience. They share how they
came up with the nomadic survival MMO concept, how they’re building over
100-square-kilometer worlds, and how they came up with
such a unique setting by both looking at the
past and the future. Read more about Last Oasis and watch for its
release in September! Starting today, we’re making
changes to how we label the status of new UE features. Moving forward, the term
“Early Access” will be retired and replaced with
“Beta” to help avoid confusion about their meanings. Briefly, features in Beta
give you the opportunity to test upcoming functionality, and we also
support backwards compatibility; the APIs for
these are stable. Experimental features are
exactly that: experimental. We’d love feedback on them,
but the APIs are subject to change, we don’t guarantee
backwards compatibility, and functionality (or the
entire feature) may be removed. Operating with the motto “By
enthusiasts, for enthusiasts,” Dovetail Games dives into
the subtleties and details of topics ranging from
fishing to trains in order to create digital
hobbies that are enjoyed by hundreds of thousands of
fans across the globe. In the midst of
explosive studio growth, with the newly-announced
Train Sim World 2020 on the horizon and Fishing
Sim World: Pro Tour launching, we visited the team at
its new headquarters where we were able to learn more
about its principles and products while discovering the
role Unreal Engine plays in bringing them both to life. Read our full interview
to hear all about it! In the latest Visual
Disruptors podcast, part cinematographer, part game
developer, part previs expert and one-hundred-percent
creative Matt Workman talks about his innovative
new Unreal Engine-based product Cine Tracer. He discusses how he’s designed
the realistic cinematography simulator to be accessible
to non-3D users, what it brings to virtual
production workflows, and how he sees it
developing in the future as hardware and engine
technology evolve. Listen to the full podcast, or
read our overview on the blog. Just a reminder: our "Cinematic Summer"
is still going strong! We encourage you to
create a cinematic short inspired by summer. We’ve partnered with
DXRacer once again to offer their snazzy
UE-branded chairs. Submissions are
due by July 26th. On to this week’s top karma
earners. Thank you to: dptd, Shadowriver, T_Sumisaki, Przemek2222, GarnerP57,
DanielOrchard, Bariudol, Everynone,
YourDownfall and ShdwKnghtLT. Want to see your name up here? Hop over to AnswerHub and
help out your fellow devs! First up in our community
spotlight is a beautiful Office Archviz project
made by Gökhan Çalışkan. It clearly visualizes what the
environment would look like when in use by the
hypothetical tenants. It's a beautiful
scene, great work! Next up is The Sitter,
an upcoming new horror game being developed by
Sijawa Production. We don’t know much about
the story, but be advised: the trailer and the game are
not for the faint of heart. Our last spotlight this
week is Sacred Siren, a psychedelic VR experience
created by Nanoshrine. Produced for Swim Souls's
debut album, "Goddess," it can be downloaded
from itch.io right now! Thanks for joining us
for this week's news and community spotlight. >>Victor: Hey
everyone, and welcome to the Unreal Engine Livestream. I'm your host, Victor Brodin,
and with me today I have Ben Mears, Games Community
Manager at SideFX, as well as Paul Ambrosiussen,
Technical Artist. We're here to talk a little
bit about all the work that you, as well as your suite
of tools known as Houdini, did to help shape some of the world,
as well as a couple of other pipeline and workflow
improvements that you did for Quixel
and us when Quixel produced Rebirth. So welcome to the stream! I think Ben Mears is
going to kick this off. >>Ben: Hey Victor.
Thanks for having us. Always great to be on the
Unreal Engine Livestream. Always a pleasure. I'm going to get through
some stuff real quick and let Paul get to
the cool presentation. Game jam stuff first. SideFX is sponsoring
a couple game jams. I'm sure a lot of the viewers
probably already know about UE4 Jam and SideFX is
sponsoring Summer UE4 Jam. I think that'll be
announced pretty soon. Another game jam
we're sponsoring-- This is the first time
we're sponsoring this one-- is Extra Credits' game jam. That, I think,
will be announced on Monday. For both of those game
jams, game jammers can get a temporary Houdini Indie license to use for the game jams. Find the details for that
and request your license. A couple other things coming up: We have SIGGRAPH here in LA. SideFX has a big presence there. We do what we call Houdini
Hive, which is three days of Houdini presentations. A lot of cool stuff being covered,
some of it's game related. A lot of it's film and
TV because that's more what SIGGRAPH is about. But really cool
stuff going on there. LA Houdini user group is
throwing a party at SIGGRAPH. Anybody is welcome to come,
so find LA Houdini user group on Eventbrite and
register there. Also, Unreal Engine is
having a user group Meetup at SIGGRAPH that I will be
attending, so if anybody is going to be going to the
Unreal Meetup at SIGGRAPH, find me. I'll be wearing
lots of Houdini stuff. One other event that is
coming up in about a month is Gamescom. I will be at Gamescom.
Anybody who is going to Gamescom and wants to meet
up, talk about Houdini, let me know and we can
arrange a meeting there. As always, I want to see
Houdini in Unreal projects; if you're making games
using Houdini, get a hold of me,
show me your stuff. No matter if it's a game jam
game, prototype, whatever. I want to check it out. With that, we can kick it over
to Paul. I should also note that we have a few SideFX
tech artists and developers in the chat. We've got Mike,
Luiz, and Damien in there. They're going to be answering
some of your questions, so put questions in chat. I think now we can kick
it over to Paul for the Rebirth presentation. >>Victor: Nice. Thank you, Ben.
To let everyone know, Ben mentioned the game jam.
We'll be kicking that off on August 8th.
That announcement will happen next week,
but the date is locked down, so if you're planning to
do it-- We will open up access for you to be able to
use Houdini for the game jam. That gives you a week or
two to get familiar with the
tool, if you'd like to use it during the jam. Cool, thanks a lot, Ben.
>>Ben: Yeah, thank you. >>Victor: Alright, over to Paul! >>Paul: Yeah, cool! Thank you to everyone
that's joining, listening in on the livestream or
watching it after the fact. Thank you as well, to Epic
and the Unreal Engine team for having us today,
so we can talk about how Houdini was used in the
awesome Rebirth cinematic produced by Quixel. Like Ben said,
in the chat we have Mike, Luiz, Damien, and Ben,
so they can answer any Houdini questions that come
up, and later on, at the end of the presentation,
we also have time to cover lots of questions that you might have. So let's start. To begin with, not directly
jumping into Rebirth, I want to briefly mention
that this is not the first time we did a project
for GDC together with Quixel. Last year we built a
game called Brimstone. This was a Quixel and SideFX
project; it was led by SideFX and assisted by Quixel,
Victor in this case, with all the art and all
the building of the Levels. The link you see on the slides
is a link to all the talks that came out of
this presentation. Which is why I wanted to mention
this. It shows how you can use Megascans Assets together
with Houdini to build awesome games or cinematics,
or anything else you want for Unreal Engine. This year,
Quixel led and we helped. That's what we're
talking about today. This project, Rebirth,
once again, huge collaboration between Quixel, Beauty and The
Bit for visual development, Ember Lab for music, audio,
and narrative, and SideFX. Let's quickly take a look
at the cinematic, so that we all know what it is
and we've all seen it. I'm sure you've seen it before,
but we'll play it right now. ♫ Mysterious music ♫ >>Narrator: Adaptation. The ability to learn
from past experience. The use of knowledge to
alter their environment. These virtues defined
our creators... and drove them to the brink... of destruction. [Music tempo quickens] But we cannot
exist without them. We must save her. [Music swells] [Vehicle traveling
quickly] [Light crashing] [Music slows] What of our creators
exists within us? Humanity has always had the
potential to recognize its flaws and choose a better way. Can we save humanity? [Music swells, stops abruptly] Was bringing her here... the right choice? >>Victor: I mentioned in chat,
I can watch that every stream. With or without audio. The
audio is really good too, but-- We're back! >>Paul: Cool.
That was the cinematic. Last week you had Quixel
on the stream. Galen Davis, talking about all the cool stuff
on the production side of things and pre-production and so forth. Today we'll talk about
the contribution that SideFX and Houdini made. I have
the honor to present the work the SideFX Games Team did
together on this project. First, let's quickly talk
about what Houdini is, so we're all on the same page. Houdini is a procedural
modeling, animation, effects, simulation, rendering,
and compositing package. But there's a wide array of
things you can do with it. The most important thing is
that Houdini's power is based on procedural workflows. Working in Houdini involves
creating networks of nodes, similar to working with
Blueprints inside of Unreal, that you connect together to
describe what the computer, or in this
case, Houdini, needs to do to complete a task. These operations or
procedures, if you will, together give you a
very powerful toolkit. At any given time,
you can go back in history and modify any of the nodes
before the final node. As you can see here, after
the pipe has been laid out, I can always go back and
modify the initial curve that was generated as sort
of the skeleton for the pipe. This is something you can
do not just inside of Houdini; you can also do this
inside of Unreal with the Houdini Engine,
which we'll talk about later. What it comes down to,
Houdini is all node-based. It's self-contained,
modular, great for pipelines. You can author cool tools
and workflows in Houdini, make nodes of them, and
share them with other people. Basically like Functions
inside of Unreal.
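To make that concrete, here is a minimal sketch of what such a node network looks like when built through Houdini's Python API (the hou module). The nodes are standard SOPs, but treat the specific names and values as illustrative assumptions, not anything used on Rebirth.

```python
# Illustrative only: a tiny live node network via Houdini's Python API.
import hou

geo = hou.node("/obj").createNode("geo", "procedural_pipe")
circle = geo.createNode("circle")         # the "skeleton" curve
extrude = geo.createNode("polyextrude")   # a downstream operation
extrude.setFirstInput(circle)             # wire the nodes together

# The network stays live: change an upstream parameter and everything
# downstream re-cooks, like editing the pipe's initial curve above.
circle.parm("radx").set(2.5)
```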
Okay, now we've seen more of a programmer-art version of Houdini Assets.
Next up I'm going to show you some examples made by Rush VFX. As you can see here, all these
different Assets you have, they're not just modeled once
and then they're static, as is. These are all live Smart Assets.
Houdini Digital Assets. Meaning that all of the
properties that define what an object is, for example, the
width of it, the height of it, and all the base shapes,
you can always go back and modify these with sliders,
basically letting you build an infinite amount of variations
of that prop that you have. Shout out to Rush
VFX for this video, if you want to see the full
breakdown, you can go to the Vimeo link shown at
the bottom of the slides. Like mentioned before,
the artist team that was assembling these
scenes inside Unreal was really small. Three people. But the Houdini team
was also very small. At the top we see
Richard from Quixel. Richard is Quixel's
internal Houdini Champion working on all the internal
Houdini pipelines they have. Then we have Luiz, Mike,
and myself on the SideFX team, who did all of the breakdown
we're showing you today. Besides that, the person
not credited here is Damien, who is a developer building the
Houdini Engine Plugin for Unreal. I also want to thank him
for the support he gave us during the project, and just
a general shout out to him. As for the cinematic team,
like I mentioned before, three artists from Quixel. SideFX provided the three tool
developers; we did not build any of the shots you saw
during the cinematic. That was all the work of the
fantastic Quixel art team on the project. Without these awesome artists,
tools are pretty useless. They don't do anything,
so you need awesome artists to build cool content for
you, right? Without them, it would not
have been as successful. The things we'll be
talking about today are FX, Mesh Processing, Terrain,
World Building, and the Megastructure Detailing.
For the effects, I took some of the
questions asked last week about the fog, so I made some
videos to show how that works inside of Unreal. The FX-- People said that
it used lots of magic tricks and also some weird things, but it was actually pretty
straightforward what we did with Houdini and Unreal. Most of the fog you see
is really just cards with Texture sheets on them. In a couple of slides,
we'll show how those texture sheets got made, and of course,
also some of these stock Actors that Unreal ships for fog.
For example, Exponential Fog. Here we have a video of
fog, an example of fog. To generate the fog before any of
the rendering inside of Unreal, we decided not to generate
these kinds of cards with simulations inside of Houdini. Not because we couldn't do
that, of course not, because Houdini is
awesome at doing that. But because the artists
I mentioned before didn't really have any
effects experience, especially not doing
effects inside of Houdini. So if we were to have to
teach them how to use these-- for example Smoke Solver or
Pyro Solver, inside of Houdini, that would have taken a
couple of weeks maybe, which is time that we really
did not have for this project. It would also have been really hard for them
to art direct the simulation. They could get it working and
get something cool, but not something that would look
exactly the way the art direction of this
project required. So really, all the fog
in this project is just evolving fog on a flipbook. How we got there
didn't really matter, but the approach we
took is really cool. Here is an actual render
of what was used in-game. Once again, SideFX has built a
tool that allows you to really easily tweak all these things,
and the artists at Quixel took those tools, ran off
with them, and were able to create these cool
looking results without having any
effects experience. But they still achieved the look
they wanted for the project. How did that work? We came
up with a solution where we have multiple levels of
noise at a lower resolution to very quickly get your
main shapes and define an overall look you see here, and then we can blend between
those at a coarse level. Which means that you wouldn't
get any small islands of fog floating around if you
were to multiply more of a high resolution fog, which
you would get if you did that in a shader in the Engine. Then what we did is we figured
out the volume around the bounds of the frame that's
being rendered, so you can see the square in the screen. Lastly, we affected it with
higher-frequency noise, giving us a sharper
look of the fog, which is more towards the
direction that they wanted.
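As a rough, hedged illustration of that layering idea (not the production setup), here is what blending coarse low-resolution noise for the main shapes and then multiplying in a high-frequency layer could look like in Python; the resolutions and weights are made-up values.

```python
# Illustrative sketch: coarse layers shape the fog, fine noise adds detail.
import numpy as np
from scipy.ndimage import zoom

rng = np.random.default_rng(7)
SIZE = 256  # resolution of one fog frame

def noise_layer(cells: int) -> np.ndarray:
    """Random values on a coarse grid, smoothly upsampled to SIZE x SIZE."""
    return zoom(rng.random((cells, cells)), SIZE / cells, order=3)

# Blending at a coarse level avoids small detached islands of fog.
base = 0.6 * noise_layer(4) + 0.4 * noise_layer(8)

# Multiplying in a high-frequency layer sharpens the look without
# breaking up the main shapes.
fog = np.clip(base * (0.5 + 0.5 * noise_layer(64)), 0.0, 1.0)
```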
Since this is just noise that we're multiplying inside of Houdini,
this was really fast to tweak. Almost real-time. Which means
the artists were able to get more iterations out and
therefore more variation and therefore, in the end,
a higher quality result. That's that, and the Texture
Sheets that I mentioned, I think we used a maximum
of five or six unique ones for the entire cinematic.
Here we see four of them. How does this work? You can
see there are basically eight times eight unique images
on one of these sheets, and what it does in the shader
is it takes one of these squares, and then for every frame
it moves to the next one, giving you the illusion
that it's evolving, that it's a live video, even
though it's just frames going by.
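For clarity, here is the per-pixel arithmetic an 8x8 sheet like that implies, written out in plain Python rather than in the Material; a hypothetical sketch, not the actual shader from the project.

```python
# Map a mesh UV into the sub-square for the current flipbook frame.
ROWS = COLS = 8  # eight by eight unique images on one sheet

def flipbook_uv(u: float, v: float, frame: int) -> tuple[float, float]:
    cell = frame % (ROWS * COLS)           # wrap so the animation loops
    col, row = cell % COLS, cell // COLS
    return (u + col) / COLS, (v + row) / ROWS

print(flipbook_uv(0.5, 0.5, 0))   # top-left square: (0.0625, 0.0625)
print(flipbook_uv(0.5, 0.5, 9))   # second row, second column
```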
Then together with cool blending tricks inside the shader and some scattering,
you get something that looks really great, which we
can see here on this slide. This is stuttering,
as you can see, but this is just a raw Texture
Sheet, so moving from one frame to the next,
without any blending. We've also made the fog
somewhat more dense than it is in the real deal, so it's easier
to see for this breakdown. Then here is the actual final
video used in the cinematic. You can see it's nice and
smooth, it's less dense, and it looks a lot
better overall. It's proof that it's cards and
not some stuff I'm making up. I made this video inside
the Unreal Editor. In the World Outliner
I typed "steam" because we call these
cards steam cards. You can see it's literally just
a plane with a Material on there with lots of parameters to
tweak, for example the emission, which controls how emissive the fog is. It
becomes a lot brighter when you turn up the emission
and the speed and so forth. As you see, I moved towards the
card to show you that yes, indeed it is a flat 2D thing. Now I'll show you how
that looked in the end. Here we have Sequencer and
I'll lock into the camera that we used for rendering. There we go. We can see it is
those cards I just showed you. That was the biggest part of the
effects that we did for Quixel. Next up we'll talk
about Asset Processing. I think this is one
of the most important things we helped with. Because if everything comes
from Megascans, there's lots of geometry to process before
it can go into Unreal. So that's what I'll show you. The first type of processing
we did is generating levels of detail (LODs). You might
ask why LODs for cinematics? Well, the
final renders in a cinematic, they are using
cinematic quality LODs. That's the 300,000 polygons
per rock kind of thing. But if you have an entire scene
with hundreds or thousands of rocks even assembled
together, if you multiply that by 300,000 polygons,
it's going to get really slow, meaning it would cost more
time, and time is something we did not have for the project. So
with these LODs-- Let's say the one you see at the
bottom right at 10k, that's the proxy geometry
that we used to assemble the scene and get an overall
look of how we want it to look. Which meant that it's nice
and light, allowing for fast iteration speeds.
Once everything was assembled the way we wanted,
it looked great, then you can switch the toggle and
say for the rendering, use the cinematic LODs,
which is really easy. Another important
piece is the car. This was a concept rendering
from Fausto De Martini. It's a car that was modeled in
an extremely high-resolution Mesh, which had to be
decimated before we could bring it into Unreal
for real-time rendering. Here we see the same
car from another angle. It has lots of detail,
it's a really high-res Mesh, which we decimated
using Houdini. Here we see the car
inside of the Editor and the Mesh in
the bottom right. And, of course,
the UVs that come with it. Since the car just needed
to be decimated and was not really the centerpiece of
what we wanted to show-- Still an important piece
but not the centerpiece-- we decided not to spend too much
time manually re-topoing the car to quad geometry. Because for simple poly
reduction and a bake, you don't need that if it's
not a centerpiece that's going to get animated or rigged. Since Houdini makes
these things really easy, the decimation, UVing,
and the baking, all procedural, we decided this was a
really good way to go. What you see here on the
right is seven nodes, even though you
really only need five. It's all you really need
in Houdini to get a clean, decimated geometry with UVs. At the top we can see a File
Node. All it really does is just import the file from disk.
So an FBX, an OBJ file, and lots of other
formats that we support. Then we have this axis align SOP.
What it does is automatically center the Object
at the origin, and it scales it to the right size
just by ticking a check box. Then what it does is,
we have a Voxel Mesh Node, which takes any type of geometry,
as long as it's closed, and converts it to a Volume-- a
really high resolution Volume-- and then converts
it back to geometry. The reason we do this is
to clean up any garbage, bad geometry that you might
have in your geometry from a CAD Mesh or anywhere
else, so that the decimation process can create
a clean Triangle Mesh. That's what the next node
does, the Poly Reduce Node. On the Poly Reduce Node,
you can literally tell Houdini, take this Mesh from any
polycount and bring it down to, let's say, 30,000 or 10,000
or whatever you want. Or give me one percent of the
number of polygons you have. You can see it says
reduce to .20 percent, which is a huge reduction. You set the normals,
make sure that the normals look great again, and then
plug it into the Auto UV Node, which automatically unwraps
it to the UVs we saw before, and then we stash it,
which is just the Mesh we can output to Unreal.
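For reference, here is a hedged Python (hou) sketch of that node chain. File, PolyReduce, Normal, and Stash are stock SOPs; the axis-align, Voxel Mesh, and Auto UV nodes come from the Game Development Toolset, so they are left as comments because their type names vary by version. The path and percentage below are assumptions.

```python
# A sketch of the decimation chain described above, via Houdini's Python API.
import hou

geo = hou.node("/obj").createNode("geo", "car_decimate")

file_in = geo.createNode("file")              # import the high-res mesh
file_in.parm("file").set("$HIP/geo/car_highres.fbx")  # hypothetical path

# (Axis-align and Voxel Mesh from the Game Development Toolset would go
# here: center/scale the object, then voxelize to clean up bad geometry.)

reduce = geo.createNode("polyreduce::2.0")    # decimate to a target budget
reduce.setFirstInput(file_in)
reduce.parm("percentage").set(0.20)           # keep 0.20% of the polygons

normal = geo.createNode("normal")             # rebuild clean normals
normal.setFirstInput(reduce)

# (Auto UV from the toolset would unwrap here.)

stash = geo.createNode("stash", "to_unreal")  # freeze the result for export
stash.setFirstInput(normal)
```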
I also quickly wanted to show you how the baking would look in Houdini. Just a disclaimer, we did not do any baking for this
Just a disclaimer, we did not do any baking for this
car, for this project. But to show you how easy it
is to do this kind of baking, I dropped down a Baking Node to
show you, you just plug in your low poly on the left side and
high poly on the right side, it bakes and you
have your Textures. To show that in action--
I just wanted to show you a baking process happening. On the left side we
see our low poly Mesh, which is 11,000 polygons, then we have one with
2.5 million polygons. And with baking you're
essentially taking the high resolution Mesh, taking
all the details, its normals, and baking that down to a
Texture on a low poly Mesh. How fast is that in Houdini?
I'm going to show you. I'm going to bake a 4k
Normal Map of the Meshes you saw before. I'm going to click the bake
button now and you will see a pop-up that shows a seconds
timer. Three...Four...Five... It will take approximately
nine seconds for this Mesh, and in nine seconds
we already rendered a 4k Normal Map,
which is super awesome. Once again, the faster
your processing speeds are, the faster you can iterate. Meaning more time for
creativity, really unlocking the capabilities that
artists might have. Another cool tool that we
built, or pipelined, rather, is something we called
Poly Cruncher, I believe, for this project. You can tell it to bring in
a Mesh or a library of files. It would also import a
Texture which had a Fuzz Map which showed where there
was moss on the geometry. It hardened all the normals
and smoothed the regions where there was moss. Then it automatically spit
out LODs. LOD 0, LOD 1, LOD 2, the cinematic level,
the game-res model, and the blockout proxy model. Also, channel packing
these Textures together. Every single Asset had
a Displacement Map, a Roughness Map, and a couple
of other Maps you can see here, which you just combine together.
For example, the Roughness Map went into the red channel,
something else went to the blue channel, and something
else went to the green channel.
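As a small sketch of that channel-packing step, here is one way to do it with Pillow and NumPy. The file names and the exact channel assignments are illustrative assumptions, not the project's actual layout.

```python
# Pack three grayscale maps into one RGB texture: one disk read and one
# texture sampler in the engine instead of three.
import numpy as np
from PIL import Image

def load_gray(path: str) -> np.ndarray:
    """Load a single-channel map as an 8-bit array."""
    return np.asarray(Image.open(path).convert("L"))

packed = np.dstack([
    load_gray("rock_roughness.png"),     # -> red channel
    load_gray("rock_displacement.png"),  # -> green channel
    load_gray("rock_fuzz.png"),          # -> blue channel
])
Image.fromarray(packed, mode="RGB").save("rock_packed.png")
```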
If you have to do that for over 100 Meshes, which is what this project had,
which each have three to five Textures per Mesh,
that becomes something that is not fun to do. Imagine being an
artist and someone tells you, "Okay, starting tomorrow,
you will have to create LODs by hand and channel pack
Textures for over 100 Meshes," you're going to become
really demotivated and bored with what you're doing. So
all the Assets in the Megascans pipeline for this project
went through this tool. How long did it take to
build a tool like this? I talked with Owen, the artist
that was supposed to use this to process all the Assets, and we
spent around an hour and a half for the final tool.
That included brainstorming with the artists. Initially,
we did a call with Owen, we talked about what is it
you need, what do you want to be able to tweak, what do
you want to be automated, and then we built it. Within an
hour and a half approximately, we had this tool we could
give to Quixel and say, "There you go, you can
process your entire library." Instead of it taking a week
or more to process everything, it took a matter of hours, which
meant that as soon as something was done, they ran it through
the pipeline in 30 seconds and it was done,
ready for use in Unreal. This meant the artists were
more motivated and had more time for creativity. Once again,
letting the artists do their thing. Being artsy inside of
Unreal, unlocking their full artistic capabilities. Next up, let's talk about
Worldbuilding and Terrain. Since we had a lot of
questions there as well. An important tool allowed
us to merge these cliff assemblies from Megascans,
these rocks that were all packed together in a
cliff-looking shape, with the landscape; we wanted
to remove any hard intersections between them. If you have,
let's say, a flat plain and you intersect it with
a rock, you might get a really hard intersection between
them, which doesn't look good because you want your
Normals to be nice and smooth and in real life you would
have dirt building up there. What you see in this video is
a tool that takes a landscape, crops a small region out of it so it's
faster to work with inside of Unreal. It takes in the cliff assembly
that the artists placed and then it automatically
generates this buildup you see and once it's done that,
it runs an erosion simulation inside of Unreal using Houdini
Engine, to create nice flow lines along the rocks, so that in
the end, once you apply all the shaders and all the Textures,
you see the moss growing where it should grow instead
of being placed by a noise mask or something else like that. This is all happening
inside of Unreal. In the end we ended up
not using this approach because we decided to
go with Static Meshes for the landscape because it
allowed us to do more things like vertex painting
and so forth. And a higher resolution
Mesh overall. But this is still a really
useful tool that we're publishing for other
uses, so if you already have Houdini and you downloaded
the Game Development Toolset, you can get this tool,
it's called Dirt Skirt. The thing I just showed you,
what are the steps it's doing, you have your initial
terrain and your Mesh in the red part here,
in the top left is your Mesh. You then project your
height field or landscape up to the rocks, so you
can see now we have a shape that somewhat looks
like the rocks. We mask the region of where
the rocks are, in the part that we just projected up. Then we smoothed it so we
have these nice, round, soft blends happening between
the rock and the terrain. Then we generate the Flow Mask. The Flow Masks are an
erosion simulation we run and the only thing we really
grab from that is these flow lines, which are then
multiplied with our mask that we had before, giving
us this nicely eroded look that has no hardened
sections with the rocks and then together with the
rocks, we can see this looks really smooth compared to here.
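Here is an illustrative NumPy sketch of those Dirt Skirt steps: project the terrain up to the rocks, mask that region, smooth it, then multiply in the erosion flow mask. Array layouts and the smoothing radius are assumptions; the real tool runs on Houdini height fields.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def dirt_skirt(terrain: np.ndarray,
               rock_bottom: np.ndarray,
               flow_lines: np.ndarray) -> np.ndarray:
    """All inputs are 2D arrays over the same grid; rock_bottom is NaN
    where there is no rock overhead."""
    covered = ~np.isnan(rock_bottom)
    # 1. Project the height field up to the underside of the rocks.
    projected = np.where(covered, rock_bottom, terrain)
    # 2/3. Smooth the projected region for soft, round blends.
    smoothed = gaussian_filter(projected, sigma=8.0)
    blend = gaussian_filter(covered.astype(float), sigma=8.0)
    # 4. Multiply in the flow-line mask so the buildup looks eroded.
    blend *= np.clip(flow_lines, 0.0, 1.0)
    return terrain * (1.0 - blend) + smoothed * blend
```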
How did it look inside the Engine? This is the comparison. It's
really hard to see which is which because we're quite far away,
so I highlighted the region where the things have changed. For this project, like I said,
we didn't end up using it because we went with
Meshes for landscapes and we didn't really have any shots
that had these cliff assemblies really close to the camera. But if you have a game
where, for example, people are walking all
the way up to these rocks in a real-time environment
with a place to walk around, this really makes a big difference
in the quality of your game. As for the actual terrain--
What you saw before was just the assembly or combination
of rocks on the terrain. We also needed to create
the actual terrain itself. Icelandic terrain is very
unusual. If you look at it for a while, it has lots of unique
elements to it.
the generic erosion solvers out there on the market because they were simply
not up to the task. Same goes with Houdini. If we
just used the default erosion solver that it came with, without
any tweaking or customization, it didn't really achieve
the art directed look that the art directors
required for this project. What we started with is
going to Open Topography to download a digital
elevation map from there. We sourced a real world location
for realistic mountains and hills that were assembled through
the process inside of Houdini with erosion and so forth. These digital elevation maps,
essentially these height maps, are accurate to around 50 cm. It's also completely
open-source, if you want to find the locations we
used for the project. It's actually in Alaska,
not in Iceland. Spoiler. You can find them at this
location if you want to try and replicate this yourself. Like I said, after doing some
basic erosion tests with the stock Houdini nodes,
we ended up with this. It looked okay, but it did
not really match the reference that we received from the
Quixel art team for the toolset that we wanted to build for them. It was pretty difficult for
us to precisely pinpoint what exactly made
Icelandic terrain look like Icelandic terrain. Initially, you can see it
and you can make some sculpts with noise and so forth in
Houdini and anywhere else and you'll get something
like this, but it's not exactly Icelandic terrain. When we looked closer at
Icelandic terrain, we saw that the ridges and
valleys of all these hills and mountains
actually had lots of erosion happening on just those two
parts and not a lot on the body, so the parts in between the two.
Kind of like lava that's flowing through there. Eroding the tops and
the bottom parts. The artists would need to
have control over where this erosion needed to happen. So we built a tool that
automatically masks these regions for them, so they
would just need to play with some sliders and masking some
things by hand if they wanted to, to have the erosion happen
where they wanted it to. Because once again, the erosion
happening on the terrain was definitely not in
a uniform distribution throughout the landscape.
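As a hedged sketch of that kind of ridge/valley masking, one common approach is to use the curvature (Laplacian) of the height field: ridges are convex, valleys are concave, and the slope "body" in between stays mostly untouched. The thresholds and blur radius here are illustrative, not the tool's actual math.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def erosion_masks(height: np.ndarray):
    """Ridge/valley masks from the curvature of a height field."""
    curv = laplace(gaussian_filter(height, sigma=4.0))
    curv = curv / (np.abs(curv).max() + 1e-9)   # normalize
    ridges = np.clip(-curv, 0.0, 1.0)    # convex areas: ridge lines
    valleys = np.clip(curv, 0.0, 1.0)    # concave areas: valley floors
    # The erosion solver then runs harder where these masks are high.
    return ridges, valleys
```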
Once again, building this with stock nodes in Houdini didn't quite do it for
us, but since in Houdini everything is built with
nodes, and you can dive inside, you are able to modify
the erosion solver to get it somewhat
closer to what we wanted, which is a very powerful thing. Here are some more work
in progress shots where we kept getting closer and closer
to the target we had in mind, up to the point where
we felt comfortable enough bringing these Meshes-- As I
said, we used Meshes in Unreal-- to bring this inside
Unreal for assembly. We already knew these were
all the kinds of masks that we could use inside
the landscape Material. We have the moss layer, we
have the debris layer, we have the Flow Map layer, and all that
kind of stuff, which we then were able to multiply or
assemble with all the Megascans Textures that the Quixel
team gave the art team, which gave them what we see
here on the bottom right, which then later was
refined more and more. It was really important to do
this as soon as possible because the terrain and landscape was
in almost every single shot of the cinematic,
if not every shot. So the sooner we were able
to test this workflow, and get the artists something
in their hands to play with, the better the final
quality would be. Once we had that R&D
part figured out, we built the tools for
the Quixel artists to play in Houdini and
build these erosions, and we gave them to the art team.
built for him, the workflows, disassembled it and reassembled
it to work the way he wanted, which then allowed him to build these
really cool initial tests that he did, to test whether or
not it would work. This was really cool. You can see the Meshes inside of Unreal,
which haven't been shown before. As you can see, it's literally just
a Mesh, not a Landscape Actor. Every single Mountain or hill
or whatever else in the project that wasn't very unique, was
just one of these hero pieces. Inside of Houdini we built a
couple of these hero pieces which then were reassembled
inside of Unreal to create these nice
assemblies of a mountain range. As you can see, if we
rotate the terrain around, all the Textures are actually
using World Space UVs. Meaning that if you were
to move a Mesh around and intersect it with something
else, you wouldn't get any disconnect between,
let's say, the moss Texture from one Mesh to another. Again, all the mountains
were made with this approach except for the unique ones. The unique ones are the
thing I will show next, one of which was this shot. The shot looks pretty
straightforward, we thought we were able to build
this using the tools we gave the Quixel artists. But we quickly found out that
this was a really tricky shot. We had our concept art,
which you see here, and it had a very distinct look.
You have these three hills here and if you look closer at it,
it's harder to do than it looks. Try controlling noise
to try and do that. Initially, Owen, the artist
who started with this shot, he tried starting with
the slope we see here, applying noise to it to
get these three hills. You can get something that looks
close, but we wanted something that looks exactly like
what the art director created. Once again, we took the
procedural tools in Houdini and gave them some controls
that allowed for more art directable
work to be applied. How did that work?
This is the actual Houdini file used for this particular scene
and all it literally is, is just three cones, which the
artists can move around, project that onto a height
field, which is what we call it in Houdini,
and then some Transform Nodes, which allows us to play God
and move these mountains around to wherever the artist
wants them to be.
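To illustrate the "three cones projected onto a height field" idea, here is a toy Python version: each cone raises the terrain where it stands, and moving a cone moves the mountain. Positions and sizes are made up; the real setup used Houdini SOPs.

```python
import numpy as np

size = 512
ys, xs = np.mgrid[0:size, 0:size]
terrain = np.zeros((size, size))

def add_cone(height_field, cx, cy, radius, peak):
    """Raise the terrain into a cone shape centered at (cx, cy)."""
    dist = np.hypot(xs - cx, ys - cy)
    cone = np.clip(peak * (1.0 - dist / radius), 0.0, None)
    np.maximum(height_field, cone, out=height_field)  # project up

# The artist can "move the mountains around" just by changing these numbers.
add_cone(terrain, 150, 260, 120, 80.0)
add_cone(terrain, 290, 200, 160, 110.0)
add_cone(terrain, 400, 330, 100, 60.0)
```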
This makes it really easy to composite these mountains together to get it to look one to
one with the concept art that Beauty and The Bit created. This is all Owen doing his
cool, magic work. Here we can see the actual final
thing used in the cinematic. Once again,
it's just exported as a Mesh. We exported out all the masks
for the Moss and Flow Maps and all the debris and rock
and so forth, which is what gave us this, the final
shot inside of Unreal, with everything applied.
So the geometry, shaders sections, and so forth. Then if we compare it to
the initial concept art, we can see it's really close.
The top left is the concept art and in the bottom
right we see the final shot. People say, yes, Houdini
is a very procedural tool, you can do lots of things with
noise, math, and other stuff, but you can also art direct
anything, as much as you want. Houdini allows you to make
it as procedural as you want or not as procedural,
as manual as you would like. That makes Houdini a really
powerful piece of software to do these really cool
landscapes or geometry or or anything else
for Unreal Engine. Next for worldbuilding
are things like foliage. For foliage, we used the Houdini
tool called Pivot Painter. To explain what Pivot Painter
is, I got these GIFs from Taizyd. Thank you to him.
You can follow him on Twitter at @DeepSpaceBanana. These just show what it is. Essentially, what you're doing,
is for every single polygon, or every single
vertex in your Mesh, you're encoding what a pivot
point has to be in World Space. For example,
if you have a plant or a tree, instead of exporting
out the tree as a whole, or separating everything,
like a unique Mesh per branch or per leaf and so forth,
you can just export one tree where every single vertex knows
where its sub-object has its pivot located. A branch,
for example, every single vertex inside of a branch knows
where its pivot has to be. Meaning you can rotate or modify
or anything else you want to do in shader, to make it look
organic or create a cool sci-fi looking effect you see here. All you need to do in Houdini
is have your geometry, create something called a Name
Attribute, have some pivot points, and you'll automatically
encode all of that information into the geometry. Which means inside of Unreal,
inside the Material Editor, you can grab that data using
the Pivot Painter Nodes and tell it, for every
single vertex, give me the x axis,
the direction it's facing, or the pivot point location,
meaning that it can rotate or do anything
else, like you see here, by just modifying the
world position offset. A really lightweight, cool
shader trick, which is pretty old.
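Here is a simplified sketch of that Pivot Painter idea: every vertex stores the pivot of the sub-object (branch, leaf, grass blade) it belongs to, so the shader can rotate each piece independently. The data layout below is an assumption for illustration; the real tool writes this into textures/UV channels.

```python
import numpy as np

# positions: (N, 3) vertex positions; branch_id: (N,) sub-object id per vertex
def encode_pivots(positions: np.ndarray, branch_id: np.ndarray) -> np.ndarray:
    pivots = np.zeros_like(positions)
    for bid in np.unique(branch_id):
        verts = branch_id == bid
        # Use each branch's lowest point as its pivot (where it attaches).
        pivot = positions[verts][positions[verts, 1].argmin()]
        pivots[verts] = pivot        # every vertex "knows" its pivot
    return pivots  # shipped with the mesh; the shader rotates around it
```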
What did we use it for on the project? We used it for all the
foliage, initially. We can see here that we
built a tool for Quixel that automatically figures out
where the pivot points are for every single strand of
grass or leaf of a plant, and then be able to mask it
with a motion mask so we could very easily control which parts
of the Mesh we want to have lots of motion, and which parts of the Mesh
did we not want to have a lot of motion. That's the black and white
color you see on the Mesh when I'm playing
with these sliders. As you can see, the core of
the plant is really dark, you don't expect any movement
to happen there from wind, but the outside or the
parts lying on the floor, you might or might
not want to have wind. What you see here is
initial tests that I did for them before handing
it over to them, and then once you hand over
a tool that works for you, the artist comes back and says,
"Hey, it doesn't work for me." Which is what produced
these weird looking effects. Something went wrong there,
we did debugging, lots of back and forth to
get the tool working. Here we have an even more
aggressive version of that. In the end, it results in
something that looks closer and closer to what they wanted. Here we see a shot
closer to the deadline. This shot is actually not the
shot used in the final cinematic. I picked this shot because it
has more of the grass shown. In the end the final shot
maybe has one or two blades of grass moving.
It might not seem important but it's still an important
part. It makes the scene look a lot more
natural and alive. Next up is the
Megastructure Detailing. The Megastructure is a really
big piece of the Quixel Project. Because it simply has so
much detail packed into it and is such an important
piece in the final shot of the cinematic. The artist Victor, from Quixel,
was in charge of both building this Megastructure
and art directing it. He told us it would take a
really long time to build this. Which meant it would take
a large chunk of his time on the project. Meaning he could
spend less time on other shots, which is something
you don't want. You want to have as much time as
possible on every single shot, to get a really high
quality looking cinematic. Instead of building
all of this by hand, we decided to give Houdini
a try on building these cool tools that would give you
this greeble, to prevent any labor-intensive work from
happening for the artists. We didn't initially start
with building this greeble or this sci-fi
paneling in Houdini. What you see here is what
Victor already made before we built this tool for Houdini and
he said it just takes way too long, and asked if we could develop a
tool that does the rest of the paneling for all
these other grids, which would take a
really long time. We said yes, we can do
that, we'll work on a tool. What we started out
with was just using the default noise algorithms
we have in Houdini. I think this noise algorithm
is called Chebyshev. I'll probably be
corrected afterward. It gave us something
that looked like this. But Victor stepped in and said,
that's not what I'm looking for. This is not the thing that
I want. I would like it to be more like this, I like these
parts of what you've created so far. I don't quite like
this, can you do this instead. Which was really helpful for
us because that allowed us to really iterate on the
tool that built this, giving us something that looked
exactly the way Victor wanted. How did it work? Instead of using noises,
we decided to build a solution from scratch, which is something
we call LOD Subdivision. Which essentially is taking the
main primitive that you have, so the big panel which
hasn't been cut up yet, and you just cut it
straight through the middle on its longest side. So if
you have a long thin piece, you would cut it into
two less long pieces or change the way you want to cut
it to create some variation, to give you variations of
thicker and thinner pieces inside of these polygons. Then you reiterate that process
for every single piece you cut. Imagine cutting a piece of
paper in two and then cutting the pieces in two again
and two again and again. That expense exponentially
grows with the number of pieces, giving you lots and lots
of tiny small pieces, which is not what we wanted. After you've cut them
down into tiny pieces, we did a loop over all of
these and clustered them together based on metrics like
position, connectivity, etc., giving you these nice,
irregular sci-fi-looking panels.
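Here is a compact toy version of that subdivision idea: recursively cut each piece through the middle of its longest side, with a little jitter for thick/thin variation. This sketch works on axis-aligned rectangles only; the real SOP works on arbitrary polygons and then clusters neighbors back together.

```python
import random

def subdivide(rect, depth):
    """rect = (x, y, w, h); returns a list of small rectangles."""
    x, y, w, h = rect
    if depth == 0:
        return [rect]
    t = random.uniform(0.35, 0.65)            # off-center cut -> variation
    if w >= h:                                # cut the longest side
        a, b = (x, y, w * t, h), (x + w * t, y, w * (1 - t), h)
    else:
        a, b = (x, y, w, h * t), (x, y + h * t, w, h * (1 - t))
    return subdivide(a, depth - 1) + subdivide(b, depth - 1)

random.seed(5)                                # the "seed" slider
pieces = subdivide((0, 0, 100, 40), depth=6)  # 64 small pieces
# A clustering pass (by position/adjacency) would then merge some
# neighbors back into larger, irregular panels.
print(len(pieces))
```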
It's great for doing sci-fi kind of things, but it's also really useful for
city generation, where this could be the
building profile, giving you a really great starting point. This tool is not hidden. We've
published this tool for free. You can grab it from the
Game Development Toolset inside of Houdini.
It's called LOD Subdivision. It might look a bit different
here because it's an earlier stage, but the final
thing works really great. We took the tool we made
on the previous slide and we applied that to the
actual building, which gave us this version here, which we
sent to Victor, and he said now is the time to start playing
around with it, so he did that and here we see a
comparison between the thing Victor made by hand and the
parts we created using Houdini. You might spot some differences
there, one of which is this guy here has all
sorts of pipes in between and this is just black. We did not build that yet,
so we were focusing on the panels. This tool or workflow is
a really good example of how you can take something
an artist has made, they've crafted a certain style
or a certain look, and then you convert it into
a more procedural solution, meaning you can now take
that handcrafted look and apply that to a
larger set of data. So instead of having
one handcrafted piece, we can now create hundreds or
thousands of them using Houdini. Since it's all procedurally
driven using parameters, you have something
called a seed. A seed is literally just a value
that you can play around with, you can change it to two,
or five or five million and every time you
change that number, it gives you a
different variation. So by just playing
around with the slider, you can create hundreds of
thousands of these variations and then hand pick
the one you like most, and just say, this is the
one I'm going to go with. Meaning that you,
as an artist using Houdini, you're not really building
these kind of things anymore. You're becoming more of an
art director, telling Houdini, I like this, I don't like
that, change this, change that. Can we do this instead? That's what an artist's role
is becoming inside of Houdini or inside of Unreal if
you use Houdini Engine. Victor took the tool and he
started playing around with it. He sent a screenshot to us
where he said, I'm happy with this output that
the tool gave us. Once you have a happy
artist with a tool, quality will
automatically follow. To show you the process it
takes to build these panels, we made this quick video. First, you import your Mesh
into Houdini from Unreal. You then isolate a section,
so say we want to do this panel, you plug it into the tool and
then you can play with these parameters. For example, I want
to have a rotation in this angle or in that angle. You can
specify how thick you want the border to be between
the panels, giving you a more or less sci-fi look. The overall seed, giving you
variations of thickness of it. Also, controlling
vertex colors procedurally. Your vertex colors are
used for creating variation of materials inside of Unreal. Meaning that if the vertex
color had red it would use this Texture, if it had green
it would use this Texture, and so forth. Meaning we didn't have to export
out a unique Mesh per object or with multiple Material slots. We could just use vertex colors
to create variations with it. The tool you see here, again,
is called Solve Sci-Fi Panel. It's basically just a wrapper
around what we saw before, the LOD Subdivision,
exposing all of the things that Victor
needed to do this. The vertex colors and some other
things you've noticed here, that weren't part of
the LOD Subdivision. You can get this for free,
play with it, create cool stuff. Let us and Quixel know that
you like it or don't like it and any requests that you have. Then, you can see here,
inside the project that we can create light
variations between them. For example, slightly offset the
roughness or anything like that, just to make it play
nicely with the lighting. Here we have that inside
of the Editor in Unreal. Then the final rendered shot. As you can see, lots and lots
and lots of paneling here on this shot. And this is just one
shot; there are other shots. This would have taken a really
long time to do manually, which we were now able
to do procedurally. Those were all the
big things that we did for the project that
were used in the project. Now I'm going to show you
a couple of experiments. We consider them experiments.
We planned to use them in the shots, but they might
have ended up not being used because we couldn't quite
figure it out or because shots that were using
them got cut and so forth. Even though they were not used,
they are still very valuable learning experiences or tools
that we've still published, which you can use
in other projects. Because that's another
great thing about Houdini. You can take any pipeline
or workflow that you used for one project in
Unreal, for example, you can grab the tool you built
and use it in another project. It's not locked down,
it's procedural. You can always go back and
modify what you've built. One of which is piping or
greebling that I showed you. Remember when I showed you
the comparison between one of the shots where
it was handcrafted? That had greeble and pipes
in them and the right one had just black
squares behind them. We tried to automate the process
of generating that greeble, which seemed pretty
straightforward, but very quickly we ended
up with a pathfinding algorithm that would avoid
obstacles the artists placed, giving it these pipes,
but we showed it to Victor and the artists and they
said, I don't think we need a procedural tool to do this,
because I already made the greeble and these
presets or kits that I can batch together, so I can
just take those and put them behind the things
that Houdini made. What we learned from this is not
everything needs to be procedural. This tool, for example,
it didn't have as many use cases as, for example, the paneling. The paneling would have
been hundreds or thousands of pieces you'd have to
do manually, even though this was only maybe 10 or 15. That's the trade off
you have to find between when am I going to build a
procedural solution for this and when am I going to
just do it manually. Here we see these pipes and this
greeble that Victor modeled. We could just take those,
cut them out, and put them behind the Houdini panels and it
would give you the same thing. As you can see here, we have a
dozen panels that have been cut up but only three or four of
these greeble patterns. It's not as important
that this is procedural, but this part saved
us a lot of time. Another cool thing we did
is try and reverse engineer mudslides, or the sliding
of rocks over terrain. Initially we had quite a few
shots in the cinematic plan where lots of these rocks--
in this case it's just roughly represented with these spheres
scattered on the terrain-- that we wanted to look
embedded in the terrain, give them sort of a
backstory, how did this rock get here? Why is it
exactly there and so forth. What we wanted to do was
try and reverse engineer the history of a rock. Rocks don't just
magically spawn somewhere, they start up at a hill, for
example, and then slide down. That gives you the path that they
carve in, where you would have more moss growing or more
debris, or it would look deeper,
a different height there. So what we tried to do was
generate this automatically. How did that work? We started with these
initial white positions and then inside of a solver in
Houdini, we did a simulation that for every single step,
we generated these flow lines which I've shown before,
and calculated the slope at that particular position of
the rock at that point, then figured out, what is the
best path upwards of the hill. Instead of it doing a physics
simulation rolling down, for every single step,
we calculated or engineered a way for the rock to crawl up. Which gives us these
pipes or worms or snakes or whatever you want to call
it, that we can then use as a way to stamp it
into the landscape, giving us what we want.
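As an illustrative sketch of "crawling the rock uphill," one way to reconstruct a plausible slide path is gradient ascent on the height field from the rock's resting spot. The step size and iteration count below are arbitrary demo choices, not the solver's actual parameters.

```python
import numpy as np

def uphill_path(height: np.ndarray, start, steps: int = 200, step: float = 1.0):
    """Trace from a rock's resting spot toward the steepest ascent."""
    gy, gx = np.gradient(height.astype(float))
    bounds = np.array(height.shape) - 1
    pos = np.array(start, dtype=float)
    path = [pos.copy()]
    for _ in range(steps):
        y, x = int(round(pos[0])), int(round(pos[1]))
        grad = np.array([gy[y, x], gx[y, x]])
        norm = np.linalg.norm(grad)
        if norm < 1e-6:                    # flat spot or local peak: stop
            break
        pos = np.clip(pos + step * grad / norm, 0, bounds)
        path.append(pos.copy())
    return np.array(path)                  # stamp this trail into the terrain
```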
These are the main experiments and cool things that I wanted to show you. We've seen lots of Houdini
use here, which you can also use inside of Unreal
Engine directly. So as an artist, you don't need
to work directly in Houdini. You can have something authored
in Houdini, for example, the sci-fi paneling tool
that we saw or erosion or something else like that, and
you can give that to a person who is just working inside
of Unreal, in the Editor, who is using something
called Houdini Engine. Houdini Engine takes these
digital assets, these tools, and allows you, in real-time,
during Editor time, so not during runtime--
During Editor time, to play with these parameters,
giving you the exact same controls you have in Houdini,
with some limitations, like, painting and so forth. But
there are ways to mitigate that. In Houdini, author a tool once, and in Houdini
Engine, deploy many. For example, you can
have one artist who is-- or artist or technical
director or whoever else wants to use Houdini-- build
these procedural solutions. He makes these tools, he creates
this parameter interface you saw with all these sliders
the artists can play with, stores that file and
imports it into Unreal, and then all the users
inside of Unreal, they can take this and utilize
it as much as they want. Once again, author
once, deploy many. Here's a video of that working.
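As a quick illustration of that author-once, deploy-many flow, here is roughly what loading a digital asset and driving its sliders looks like through Houdini's Python API; Houdini Engine exposes the same asset and parameter interface inside Unreal. The file path, operator name, and parameter names below are hypothetical.

```python
# Hypothetical example: install a digital asset and drive its sliders.
import hou

# The .hda file is the "author once" artifact shared with the team.
hou.hda.installFile("/shared/otls/fence_generator.hda")

fence = hou.node("/obj").createNode("fence_generator")
fence.parm("post_spacing").set(3.5)   # the same sliders artists see
fence.parm("height").set(1.8)         # in Houdini Engine inside Unreal
```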
Here's a cool example, courtesy of Magnus Larsson.
Shout out to him. He created a tool that would
import these concrete pillars, it could place a sphere that
could cut out or break loose concrete pieces,
and then he had another tool that would
automatically generate chains hanging between them. As you can see this is
a really powerful tool, which doesn't take long to
build, where you can just place these pillars and
automatically have these chains generated between them. As soon as you move any of these
pillars, the chains automatically recook and reiterate
the thing you've made, updating it to the final look. What can you do
with Houdini Engine and what are the limitations? For output, Houdini Engine can
output procedural geometry: Static Meshes. It can also
export out instancers, so instanced Static Meshes or hierarchical instanced
Static Meshes, or overrides for Actors and geometry. It can export out landscapes,
so you can have the native Unreal Engine landscape be
written by Houdini Engine or modified by Houdini Engine. It can also generate collision,
so if you have this concrete pillar we see here,
it could automatically generate collision around it so you don't
just have rendered geometry, but you also have collision
geometry, so if a player walks into it, they wouldn't
just walk through it, they would walk up against
it like you'd expect from a pillar like this. It can also automatically
generate LODs, which means that you can have
a fully functional Asset-- A smart Asset, essentially-- be generated inside of Unreal,
which you built in Houdini. As for input Meshes, you can
import Meshes or, since recently, Skeletal Meshes,
from the Content Browser. You can also do it from
Actors that are instantiated in the World already.
In this video you see here, he just fed this already
existing pillar or cube or whatever else it started
with, and as soon as he moved it, Houdini Engine would
know something has changed, I need to recook this thing,
giving him this updated look. You can also use
Spline Components. For example,
if you're building a racetrack game inside of Unreal, you can
have that be driven by a spline. Right away, you just place a
spline and then Houdini Engine inside of Unreal will
automatically generate the roads and all the fences around
it, even people if you want to, like you saw in the
video at the beginning. You can also import landscapes
and landscape layers. The example that we showed
before, where we had a rock assembly be input into
the landscape, embedded, that is also something you can
import, so here is my landscape, Houdini Engine or
Houdini, please modify it and give me an updated look. That's Houdini Engine. What does Houdini
Engine offer artists? Artists can very quickly
rebuild things you've made, recook things,
change the parameters. You have the Houdini Debugger,
so if something doesn't quite work or you want to see
how it works or why it isn't working, you can just open
up Houdini as a live session and you can see how Houdini and
Unreal Engine work together. You can pause cooking,
continue cooking, so it doesn't freeze your Editor while you're
processing an entire scene at once or an entire game
at once, if you wanted to. It gives you a tool shelf,
so if you build any tools using Houdini yourself,
you can just grab those tools and put them on the shelf,
so artists can drag and drop those into the scene, just like
you could with, for example, a light Actor. If you want the
light, you drag in the light. If you want a fence
generator from Houdini, you can drag in a fence
generator from Houdini. It also ships with a couple
of out of the box tools like scatter tools, spline tools
and few other convenient ones. It's also really flexible. As soon as you've made
your procedural Asset, you can also say,
bake it down to a native Unreal Engine Static Mesh, or even modify any
of the UProperties. If, for example, by default,
you always want to have the Static Mesh not simulate
physics, there's a check box inside of the Static
Mesh Editor, tell Houdini to tick that box or untick
it, it can do that for you. It can also automatically drive
things like Material Instances. For example, set the roughness
to this, or assign these Textures, or change
this, change that. That's all stuff you can do. What does it have for
developers or engineers? Houdini Engine is really
accessible. The entire Plugin is open source, it's on GitHub. Meaning it has a high
level of customizability. Anyone who knows C++
can go in and modify how something works, for example,
if they have a custom build of Unreal, or they want to
add additional features, they can do that themselves
if they want to do so. Houdini Engine is
multi-platform, so we support Windows, Mac, and Linux. And we update these
in daily releases. Houdini has a daily release
cycle, so as soon as we fix something or
add something or a user has a request, most of the time,
you have that the next day. You don't need to wait
until SideFX says, here's the new
version of Houdini. You can get that the next
day, or you can get it as soon as it's
submitted on GitHub. Super convenient, super easy. We also have an announcement to
make, regarding Houdini Engine. We're announcing
Version 2 of the Plugin. Which is a complete rewrite
of the core architecture of the Plugin, because a lot
of people don't know this, but Houdini Engine was built
on Unreal Engine 4.5. So that's a really
long time ago. The details on the things
that changed or were added, you can find at this link. I just highlighted a couple I think
people will be excited about. Oh, there we go, really excited. We will have Blueprint support, something people
have asked a lot for. What you will be able to do is have a Blueprint Actor with a Houdini Digital Asset component on it, which means that through Blueprint, you can drive presets, parameters, and anything else, through construction script or Blutility. We will have a PDG Asset Link. PDG Asset Link is something
that will be able to manage complex dependencies and allow
you to distribute async cooking. Say, for example, you have a massive landscape, and you have everything be procedural, or only parts of it. Let's say you generate
a procedural landscape, then on top of that you have a
procedural road, on top of that you have procedural
foliage around the road, and on top of that
you have other things. As soon as you move
the road a tiny bit, Houdini wouldn't really know
what the connection is between the road and the landscape
unless you explicitly told it, if I change this, modify that. With the PDG Asset Link,
it is going to be able to know, if I move the road in this
region of my landscape, I don't need to regenerate
the entire Level, I only need to regenerate
this particular region around the part that I modified, and it doesn't need to do that on the local machine you're working on. Let's say you're on a network which has a compute farm somewhere else in your building, or off-site, or on Amazon AWS or something, you can have Houdini Engine cook
everything on a remote server. Which means it's async,
allowing you to keep working in the Editor, while in the
background Houdini is cooking and as soon as it's done,
it will automatically import everything,
making it really easy for you to keep tweaking things.
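(A pure-Python sketch of the regional-recook idea described above: when the road moves, only the landscape tiles whose bounds overlap the edit are dirtied and recooked. This illustrates the concept only; it is not the PDG API.)

```python
# Toy dependency logic: return only the landscape tiles whose bounds
# overlap the edited region, so everything else can stay cooked.
def dirty_tiles(tiles, change_bounds):
    def overlaps(a, b):
        return not (a["max_x"] < b["min_x"] or a["min_x"] > b["max_x"] or
                    a["max_y"] < b["min_y"] or a["min_y"] > b["max_y"])
    return [t for t in tiles if overlaps(t, change_bounds)]

tiles = [{"id": i, "min_x": i * 100, "max_x": (i + 1) * 100,
          "min_y": 0, "max_y": 100} for i in range(10)]
road_edit = {"min_x": 220, "max_x": 260, "min_y": 10, "max_y": 40}
print([t["id"] for t in dirty_tiles(tiles, road_edit)])  # -> [2]
```
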
We'll also make lots of changes and improvements to the UI and the UX based on the feedback we've received. Since everything was built in Unreal Engine 4.5, lots of it isn't the native UI code that Unreal has in its current release. So we're rewriting that, giving us a much improved parameter UI. We'll be able to have
default values, something very important that we will support, and you will be able to set and create expressions in parameters as well.
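(A quick sketch of what defaults and expressions look like through Houdini's Python API; the node path and parameter names are invented for illustration.)

```python
# Houdini Python sketch: set a default value and an expression on an
# HDA's parameters. "/obj/geo1/fence_gen" and the parm names are
# hypothetical examples.
hda = hou.node("/obj/geo1/fence_gen")
hda.parm("post_count").set(12)              # plain default value
hda.parm("fence_height").setExpression(
    'ch("post_count") * 0.25')              # height driven by post count
```
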
Another thing people have been asking for is the ability to have Houdini Engine be Level independent. Instead of having a Houdini Actor in every single scene, we'll extract that, and the HDA component inside of the Editor will just hold data: the parameters you saw
on the digital Asset. All the cooking, all the
processing, geometry and data, happens externally. Meaning that you can tell Houdini Engine, generate a tree in
that Level, generate a tree in that Level, generate a tree
in that Level, instead of generating a tree in this Level,
going to a different Level, generating a tree there, and
we'll be able to do that with World Composition. Meaning that you can generate an infinitely large world with Houdini Engine and Unreal through World Composition. Another thing a lot of
people have mentioned is the Mesh creation currently
with Houdini Engine has a bit of a delay. What we've done is, instead of going through the default Static Mesh generation, which is a slow process because the mesh gets optimized (a Static Mesh isn't meant to be modified), we're going to be looking at
the editable Mesh component, which as the name
says, is editable. Meaning that you will have
a really fast update rate. As soon as you change a slider,
you will have the result almost immediately,
or even immediately depending on what you're doing. That's something people
have asked for a lot, and yes, we are working on
that and you will get that. The timeline for this Plugin,
we don't have a date for you yet but I can tell you this
will be somewhere-- Expect something around
the end of the year. That's announcing V2. We're also looking for an Unreal Engine Developer who can help us build this new version of Houdini Engine. If you're an engineer and
you have C++ experience in Unreal-- You don't necessarily
need to know Houdini, that's just a plus
that you could have. If you know Unreal and you're
a developer, contact us. Go to sidefx.com/careers, find the Unreal Engine 4 developer opening, apply, and we'll be happy to hear from you. All this cool stuff that
we've shown you today, I'm sure that lots of people who
haven't used Houdini are like, Houdini is probably
expensive and so forth. You can go online to
sidefx.com/download and you can download a
free version of Houdini called the Apprentice version,
which is a completely free license that you'll get
as soon as you install it, allowing you to essentially
rebuild anything you want. There are no feature limitations inside of Houdini stopping you from doing anything. You can render, you can do simulations, you can do lots of other stuff. The only thing that we
really limit you with is file formats you can export
with. For example, you can't export FBX and you can only
render to a certain resolution. But it doesn't really matter
because it's meant for learning purposes. Later on,
when you feel more comfortable, you can look at other
types of licenses. That's all the things
that I have to mention. One other thing-- Houdini
Apprentice cannot be used with Houdini Engine inside of Unreal. But if you want to do that,
SideFX is always sponsoring the Unreal Engine Game Jam, which
is when you'll be able to get a free two-month
Houdini Indie license. That does allow you to use
Houdini Engine inside of Unreal. Use the Apprentice
version to learn Houdini, and then once you
participate in a game jam, feel free to reach out to
us and we'll give you a free Houdini Indie license
for an evaluation period, for the game jam. That's all I have to say today. Thanks for listening.
I think now we have lots of questions to
answer, if there are any. >>Victor: There are
certainly questions but it looks like the SideFX Games
Team there have been responding to almost all of them in chat,
but I definitely want to bring some of them up on the stream.
Thank you so much for that presentation, Paul. That
was a lot of very, very cool and useful information. There are a lot of people excited about using Houdini in Unreal,
so that's really neat. Let's see if we can get
Ben back on as well. Is he still on the line? Ba dom dom dom pop! There's Ben! Got to get this mirror
thing working out. >>Paul: Hi Ben.
>>Victor: Ben's back, everybody! Questions. Let's start from the top, going back to some portions of the earlier part of your talk. One of the viewers was
wondering if it's possible to make cloth and import it into Unreal Engine as a Physics Asset for Skeletal Meshes. >>Paul: The answer is
yes, you can. We have two types of tools
that allow you to export any type of simulation or effect
that you've made in Houdini. For example, a cloth simulation.
You can export it using a shader approach,
using something called Vertex Animation Textures,
that essentially just encodes all the movement in a Texture, making it lightweight. But that doesn't allow you
to use the Physics Engine inside of Unreal to apply
additional simulation on top of it. So we built
a second tool called Skinning Converter. The Skinning Converter
essentially looks at your cloth simulation, and it figures
out where it needs to place bones so it creates an automatic
rig with the animation. Meaning that since
it's bone based, you will have your Physics
Assets be generated inside of Unreal, allowing you to do
real-time physics simulation on top of that if you want to.
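(For the curious, a minimal NumPy sketch of the Vertex Animation Texture encoding idea: one row per frame, one texel per vertex, RGB storing the offset from the rest pose. The sizes are invented, and this illustrates the concept only, not SideFX's actual tool.)

```python
import numpy as np

# 256 vertices, 64 frames of made-up animation data.
rest = np.random.rand(256, 3).astype(np.float32)          # rest pose
frames = [rest + 0.01 * f * np.random.rand(256, 3).astype(np.float32)
          for f in range(64)]

vat = np.stack([frame - rest for frame in frames])         # (64, 256, 3)
# Normalize offsets into 0..1 so they survive an 8/16-bit texture
# format; a decoding shader would need the min/max to rescale them.
lo, hi = vat.min(), vat.max()
vat_tex = (vat - lo) / (hi - lo + 1e-8)
print(vat_tex.shape)  # (frames, vertices, RGB) -> save as an image
```

>>Victor: That's super cool.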
I'd like to see that workflow. >>Paul: I think we showed it
on the previous livestream, a long time ago. >>Victor: Okay. I'll dig that up and post it in
the announcement post so that anyone can find it,
if they're interested. Referring to the steam Texture Sheets you were showing, they were wondering if those come from some kind of predetermined noise algorithm or if you have to provide your own. >>Paul: The noise that we
were blending together? That's the question, right?
The noises we used for blending? >>Victor: I believe so. >>Paul: Okay. You can do anything. The noises we used are just the default noises that we ship with. There's a drop-down giving you lots of options, and control over the frequency, amplitude, how often it repeats,
own custom noise algorithm or write your own,
you can also do that. So you can use what we
have, which is what we used for the cinematic, or you can
write your own or modify it. Both are possible. >>Victor: Great. They were
wondering what resolution the steam Texture Sheets were at. >>Paul: That's a good question.
I think those are-- Since it's a Texture Sheet and
we're doing a cinematic it's probably 4k, maybe 8k.
But you can apply mipmaps or export them at a lower
resolution if you want to do so. The more frames you want and
the higher resolution you want each frame to be, the higher
the resolution of your Texture Sheet needs to be.
But I don't have the actual answer to that.
I just assume it's 4k or 8k.
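(The frames-versus-resolution tradeoff is easy to sanity-check; the grid layout and per-frame size below are assumed numbers, not the cinematic's actual settings.)

```python
# An 8x8 grid of frames at 512x512 each already needs a 4K sheet.
frames_x, frames_y = 8, 8
frame_res = 512
sheet_res = (frames_x * frame_res, frames_y * frame_res)
print(sheet_res)  # (4096, 4096)
```

>>Victor: Okay.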
Pretty high-res then. There was a question about
releasing the Rebirth scene for learning purposes
and as far as I know, last week when Quixel were here,
they let us know they will be releasing a scene using some of
the Megascans and Assets that were part of Rebirth,
but the actual final project, I don't believe that
will be available. But there will be a
pretty sweet sample there that you can use and tear
apart and figure out how a lot of it was made. Let's see. This was answered by
one of the SideFX people but-- Will there be a link to learn Houdini on how to create clusters with Niagara?
I believe the answer was-- and they also linked a tutorial
from earlier this year, where you can see
how you can do that. >>Paul: Correct.
>>Ben: It's in the chat. Over there. >>Victor: Over there somewhere. "What was the timeline from concept to completion for this cinematic?" I think
we mentioned that last week. But I was curious if
you guys might know. >>Paul: I don't know the exact
start date that Quixel started. Because they started before
us, of course. They had to do all this
pre-production and the actual trip of scanning
the Assets in the real world. So I don't know when,
but when SideFX got involved, that was actually October and
we built all the tools you saw until around December,
and then December onward, leading up to GDC,
we were more in a support role. We gave all the tools to Quixel,
the artists did an amazing job using those to build
the cool shots you see, and we helped them if
they ran into problems. For us, it was October to
December for the main part. >>Victor: If you're
really curious, I know we mentioned that during
last week's stream. I just don't remember the
exact timeline. But it was-- I believe it was close to
a year from the first idea that something was supposed
to happen. But I believe the work started somewhere
around October as well. Maybe the scanning
trip was before that. We'll have to go back
and look at that. >>Paul: Another reason to watch
each week's stream, right? >>Victor: Yeah. Galen showed
off plenty of interesting things there as well. Referring to the mountain
ranges you were using when you were showing how to
use the tools to blend them, one of the users was curious whether some of those ranges were pre-made, whether you just manipulate them with erosion tools, and what if you're trying to make a specific range. >>Paul: The Meshes that
you saw me rotating around, those are just the Open
Topography elevation data. It's a height map,
which is a real life mountain. We took that data,
brought it into Houdini, and we didn't just use it
as is from Open Topography inside of the cinematic. We did erosion on there,
that made it look like Iceland. Because we can't just use
Alaska. It needs to look like Iceland, and we wanted to
have it look a certain way, which you don't get directly
from the real world. So we did erosion, did lots of processing on there, and then used it inside of Unreal. As for the ranges
that are unique, one of them I showed. That's
the one with three mountains. There you can see we handcrafted
that, we didn't use any elevation maps for that. >>Victor: Okay, nice. >>Paul:
It's all meshes once again. >>Victor: They were curious, what
format does Houdini export Meshes out of to Unreal Engine? >>Paul: We support a really
wide range of geometry formats. The main format
people use is FBX. You can also export OBJ,
you can export BGeo, which is a Houdini format you
can import into Houdini Engine, which makes it really lightweight. Alembic,
and tons of other formats. We support most standard
formats there are.
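(A small example of the BGeo route through Houdini's Python API; the output path is made up, and saveToFile infers the format from the file extension.)

```python
# Houdini Python sketch: write the current SOP's geometry to disk.
# saveToFile picks the format from the extension (.bgeo, .obj, ...).
geo = hou.pwd().geometry()   # run inside a Python SOP
geo.saveToFile(hou.expandString("$HIP/export/rocks.bgeo"))
```

>>Victor: That's great. Question for you, Ben.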
What are the restrictions to the Indie license? Size of
company, size of studio, etc.? >>Ben: It's basically just
how much money you make. If you make under
$100,000 US per year, you can use Houdini Indie and
it has all the features of regular Houdini. The goal
there is to let people get it pretty cheap to get started
and then hopefully they're really successful with the Indie
license and make a ton of money and then they buy the
full Commercial license. But it's really great for
people to get started, and freelancers who aren't
making tons of money but-- Yeah, check it out. >>Victor: That's awesome. They were curious if the
Houdini Engine significantly increases the project size. >>Paul: This project in general? >>Victor: Just in general,
so I guess when you enable Houdini Engine,
they were curious if it-- >>Paul: Oh, okay. I don't know the exact size
but I think it's between 200 and 300 MB. Might be more,
might be less, I don't know. But that's stuff that only
increases the project size during Editor time. As soon as you release your build and ship it to users, that goes away. A mobile game is a good example: you don't want your app to be 300 MB by default just because you used Houdini Engine. It doesn't do that.
Houdini Engine gets stripped out as soon as you build,
so you only have native Mesh components, Skeletal Meshes and everything
else that Unreal supports. >>Victor: Right, because it
can only do it in Editor time, so there's no reason to
package it out for runtime. I had a question, which is
if the new Houdini Plugin will work with our new
Editor Utility Widgets? You mentioned Blutility there. >>Paul: That's a
developer question. That's something that Damien,
who is in the chat, probably can answer and otherwise
we can follow up. >>Victor: Alright, chat,
we all call for Damien. Victor needs answers. Hopefully I can get that
answered. I was really curious. Let's see. Actually, I don't
know what an HDA file is, but: "Is there a way to create HDA files that take up a few MB, that contain just the math and instructions to generate procedural Assets when the game is either installed or at first run?" >>Paul: HDAs are usually not
MB, they're just nodes and instructions,
so they're usually KB. But to answer your question,
it's only Editor side, so you can't deploy an HDA to a final consumer. As soon as they download your game, they won't be able to play around with the sliders anymore because that's runtime. There are other ways of
doing this, but it's not really straightforward. You
could create your own interface inside of your game with sliders, which then sends the values to an offline server somewhere and streams in the resulting geometry. That's one way of doing it,
but we don't support that out of the box.
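(Purely to illustrate that do-it-yourself approach, a hypothetical client sketch: the endpoint, payload, and response format are all invented, and nothing like this ships with Houdini Engine.)

```python
import json
from urllib import request

# Hypothetical in-game client: send slider values to a remote cook
# server and receive generated geometry back.
params = {"asset": "fence_gen", "post_count": 12, "height": 2.0}
req = request.Request(
    "https://example.com/cook",                 # made-up endpoint
    data=json.dumps(params).encode("utf-8"),
    headers={"Content-Type": "application/json"})
with request.urlopen(req) as resp:
    geometry_bytes = resp.read()                # e.g. a streamed mesh file
```

>>Victor: Alright.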
Then, just to-- Actually, that one has
already been covered. They're definitely curious
when the second version of the Plugin will be
in everyone's hands. Which was sometime at
the end of this year, I believe was your answer.
>>Paul: Correct. And updates will follow,
the closer we get. >>Victor: The next question
was if the new update might be coming to the Game
Dev Toolset in Houdini-- Let me phrase this correctly. Is there a new update coming to
the Game Dev Toolset in Houdini with these new tools, or are they
already available on GitHub? >>Paul: They were made
available right after GDC, so they've been online since
March for free, open source, so you can grab them, see what
we did, there's no hidden stuff. >>Victor: That's awesome.
Go make things. >>Paul: Tell us if you use them.
We would like to see all the cool stuff you do. >>Victor: Yeah, make sure you
hit us up with all your projects because we definitely
like to browse through and see what's being made. >>Ben: I should point out,
a lot of people don't even know about the Game Dev Toolset
because it doesn't come packaged with Houdini, so you need
to get it from GitHub. But we're always telling
people to check out those tools because there's lots of
awesome tools in there that will probably eventually
be rolled into Houdini proper, but at this point
the games team is making them so fast that we want
people to use them and show us what they're doing with them. Check out the Game Dev
Toolset for Houdini. >>Paul: It's 100 percent free.
All free, all open source. >>Victor: That's great.
Let's see here. "Can you create infinite procedural Worlds with height fields and World Composition?" They ended the question with,
any resources to learn how you would be able to do that. >>Paul: Can you? Yes. With
V2 that will be a lot easier because it will natively
support World Composition. Right now, you have to go export
Textures, height maps, tiles, and import that with
World Composition. But with V2 of the Plugin,
it can automatically write to all of these World
Composition components.
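(For the current manual route, the tiling step might look like this NumPy sketch; the tile size and file naming are arbitrary assumptions, not a documented workflow.)

```python
import numpy as np

# Sketch: split one big heightmap into square tiles for a World
# Composition import. Sizes and naming are arbitrary here.
heightmap = np.random.rand(4096, 4096).astype(np.float32)
tile = 1024
for ty in range(0, heightmap.shape[0], tile):
    for tx in range(0, heightmap.shape[1], tile):
        chunk = heightmap[ty:ty + tile, tx:tx + tile]
        # e.g. write chunk out as "terrain_x{tx}_y{ty}.r16" for import
        print(f"terrain_x{tx}_y{ty}: {chunk.shape}")
```

Resources to do this--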
the Houdini terrain tools. You can make them
as big as you want. We have lots of
resources on learning how to use these terrain tools. To do that, you can go
to the SideFX website under the tutorial section and
just type in "height field" or terrain and you'll get
at least 10 tutorials telling you how to do it. >>Victor: Awesome. >>Paul: Lots of which are also in Unreal. >>Victor: Yes, we just added
a little bit of documentation to our landscape documentation,
regarding Houdini. That's up there. >>Paul: Great. >>Victor: It also has the
link there so you can find it. Any plans to incorporate
VDB for fire and fluid Sims into Houdini Engine? >>Paul: You can't really render
Volumes directly inside of Unreal, so we could add
support, but there's no way of rendering it. Plus, of course, it comes with a huge overhead because,
imagine doing a giant, giant pyro simulation,
and importing 40 million voxels. That's going to
blow up your game. What we have instead is,
we built some tools that still allow you to do that,
but sort of fake it in a way. It's not fake-- It resembles
something but it's actually something else. That is
similar to the fog cards. The fog cards were a
Volume, a VDB if you want, that we just baked down
to a Texture Sheet, giving you something that
looks like fog, or a volumetric, inside of the Engine.
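(A minimal NumPy sketch of that bake-to-a-sheet trick: a density grid becomes a 2D slice atlas. The grid size and layout are invented numbers; this is the concept, not the fog-card tool itself.)

```python
import numpy as np

# A (depth, height, width) density grid stands in for a VDB volume.
d, h, w = 16, 64, 64
volume = np.random.rand(d, h, w).astype(np.float32)

cols = 4                                  # lay the 16 slices out 4x4
rows = d // cols
atlas = np.zeros((rows * h, cols * w), dtype=np.float32)
for i in range(d):
    r, c = divmod(i, cols)
    atlas[r * h:(r + 1) * h, c * w:(c + 1) * w] = volume[i]
print(atlas.shape)  # (256, 256) -- one grayscale sheet of 16 slices
```

>>Victor: Cool. Let's keep going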
and I think we've got two left. Then we can all get back
to drinking coffee and browsing the internet.
Because that's all we do. "Is Houdini getting any controls to lower or slow the dump of data back into Unreal, to keep from using all of the memory?" >>Paul: Can you repeat
the question? Sorry. >>Victor: Yes. "Is Houdini getting any controls to lower or slow the dump of data back into Unreal, to keep from using all of the memory?" >>Paul: You can actually
pause the cooking. If you're using Houdini
Engine, you can tell it to pause the cooking in general. As for slowing or
controlling the number of-- the size of the memory it's
using, I don't know if we have support for that right now. But like I mentioned, in V2,
you can offload all of the processing to an external
computer if you want to, and with PDG as well, you can
actually control the number of cores that are being used
to process everything on the local machine. >>Victor: Okay, so if you have
a little lower end machine, you get a little bit of
control there so that it's not entirely freezing on you. >>Paul: Yep. >>Victor: Awesome. Last question!
Let's see if I can get this one to be
an actual question. There will exist
too-- Let's see... I think they're trying to ask if
there are tools for the creation of mid/close view of terrain
for erosion and rocks. I believe the question
is if there are tools for creation of-- Yeah,
so they're much closer up. Look at erosion and rocks. >>Paul: For this cinematic,
we wanted everything to be a bit further away, super
high resolution, super dense. But that's just a
setting or slider or a configuration inside of
Houdini, so if you want it to be a low resolution, you can,
for example, use a poly reduce, which I showed with
the car example. You can specify what the
quality of the Mesh has to be, the number of polygons, the size
of Textures and everything else. You can make it as expensive
or inexpensive as you want, depending on your required
platform, be it mobile, be it console, be it desktop. We have customers and lots of
people who use them for all three. It's not like you can only
create super high-res, cinematic-quality Assets. You can make real-time Assets too. That's what Houdini
Engine is mainly used for. >>Victor: Great. There's
so much-- A whole slew of information here and I think
chat is very excited to go and check it out there. I believe YouTube was a little
sad that they didn't see the Indie license key getting
thrown at them, so maybe Ben wants to go ahead and be
a little generous to our YouTube audience as well.
I'm not making a demand, but I'm just letting you know
what the chat over there is-- >>Ben: Oh yeah. I've only
been looking at Twitch. >>Victor: Yeah,
we're doing both. >>Ben: Tell the YouTubers
to come to Twitch next time. >>Victor: Okay, okay,
I will-- Well you just did. They will be there. As always-- >>Ben: We'll get you next time. >>Victor: Alright. Well, I will
definitely have you guys back. It would be great to do
something around the release of the Plugin there, I think.
Maybe go ahead and go over a little bit of workflow for
that. I think you all can go ahead and look
forward to that. I am. I'm definitely excited about it. We do a survey every week.
If you are interested in letting us-- Sorry, if you
would like to let us know what you would like to see on
the stream, go ahead and fill the survey out, and everyone
who participates in the survey and gives us their email will
get a chance to win a t-shirt from our sweepstakes. As always, make sure
you go ahead and visit unrealengine.com/user-groups,
where you can see if there are any user groups around
your area, where you can go ahead and share
what you're working on, possibly learn from other
people in your community. If you don't have a user
group and you're excited about possibly making one there,
go ahead and send us an email to community@unrealengine.com
and we will help you know what that might involve
and give you some resources and assets to go ahead
and get that started. Make sure you visit our forums
and if you are working on anything cool and exciting,
make sure you share that with us and you might be one
of our spotlights in the upcoming weeks. We also would like to highlight
once again, because we-- For a period there, we were
receiving some really cool countdown videos, but they
tapered off a little bit and we would definitely
like to see more. What that involves is
that you take 30 minutes of development, you record that,
you speed it up to five minutes and then send that to
community@unrealengine.com with a logo and we can go
ahead and showcase your project at the beginning of the stream. If you're streaming on
Twitch, make sure you use the Unreal Engine categories,
so that we can tune in, say hi, and possibly
even raid you and that's what I believe we
are about to do now. But first, special thanks
to the entire team at SideFX and Houdini for coming on and
providing all this information. We will definitely
have you back. There's absolutely
no doubt about that. To all of you, everyone in
chat having a good time, we're happy to have you here. Without further ado,
we're going to say goodbye and let's stick around for the
raid that is going to happen in a couple seconds, whenever
that countdown goes down. Alright, that's it for
us tonight-- today. It's actually today here,
it's probably tonight somewhere around the world. Not for
you guys because you're on the other side of the coast, but across the other pond where we're at, it will be night in just a couple hours. Okay, I'm done babbling,
that's it for today. Bye everyone! >>Ben: Bye! Thanks a lot!
>>Paul: Bye, thank you. ♫ Unreal logo music ♫