ALEX COULOMBE: Yeah, there we go. Hi, everyone. It's me, Alex Coulombe, your friendly
neighborhood Unreal Engine VR guy. And I'm very excited to
be here with you today in New Orleans talking about Virtual
Reality and OpenXR in UE5, especially because it's such weird timing for the session. I've never done a talk like this
before where, halfway through, you could come up to me and say,
I really think you suck. And then I would maybe have
to recalibrate and then change what the second half of the
talk was going to be like. But also, it's nice to take a
break in the middle of a longer session like this. It might work out great. We'll see. So introduction-- again,
name's Alex Coulombe. What is something you might
want to know about me? I've been an Unreal Engine Authorized
Instructor for the past two years, working on all manner
of course development and instructor-led training. And that's what that
little logo looks like. It's really fun. You become an Authorized
Instructor, and they're like, you're allowed to use this
logo wherever you want now. I also come from a
background of architecture as well as theater and
theater architecture. And we've done a lot
of exciting projects like the Brockman Hall
for Opera here that ended up being a pretty
good representation in VR before it was real. I've been working in
VR for about 10 years. I started with the DK1
way back in the day. And I've spent a little
longer in AR actually going back to my architecture
thesis when I was but a young lad. And then also, I run two
companies in New York City. One is an XR creative studio
that has recently done things like helped Lincoln Center's
David Geffen Hall prototype what the experience was going
to be like for the railing designs and the whole balcony
setup, which was a really useful way to help all the key
stakeholders understand what's it like to lean over
different kinds of railings and position yourself ergonomically
in the different seats. And that just opened very recently-- pretty exciting. And then the other company is
an XR cloud streaming platform that does exciting things like this. [VIDEO PLAYBACK] - [WAILS] - Murphy, dreadful apparition,
why do you trouble me? [END PLAYBACK] ALEX COULOMBE: And
so what you're seeing there is actually a live actor
performing in VR as a MetaHuman being broadcast to an
audience around the world using consumer devices
like Meta Quests and being able to run
the experience off of very powerful
computers in the cloud. Lastly, I am recently a podcast
host, along with Jacob Feldman, who's here in the audience-- we have
the new Unofficial Unreal Engine Podcast. We've done three episodes so
far, a lot more on the way. And I think we are now
the torchbearers of this soon-to-be-forgotten Unreal
Engine starting scene. So with the bona
fides out of the way, what would I like
to talk about today? Well, I'm going to
assume that most of you here have done some things in VR. You're a little bit familiar with it. But I'm going to do a
bit of a basics flyby because I think it's always good
just to get our fundamentals out of the way. Also, this will be recorded,
so I'll go through that fast. But if you're watching
this as a video later on, just play it back at 0.25
speed, and then you'll be able to take it all
in at a more natural pace. I really want to talk
about the OpenXR templates. There's actually two of them. One of them was covered
pretty well the other day by the technical
account manager Simon. And I'll just give a very
quick rundown of that. Today I really want to dive into
the OpenXR VR template, which has actually gone through a lot
of changes over the past year, especially now in 5.1. I'm going to be really dangerous
today and try to do some things live in 5.1, and we'll see how it goes. As part of that, I want
to talk about desktop versus mobile, just
different considerations for building those
different platforms, and just expanding upon what's out
of the box and the VR template. Some of the VR basics
we'll start with, we'll try to think about ways that
we could go further with those and make something
even more compelling. And lastly, I know some of you
are probably here because you're curious about what's the deal
with 5.1, and Lumen, and Nanite, don't do it. Don't even try it, but I'm
going to show you anyway just in case you want to be a rebel. So let's start with, what's VR? Again, we'll go through
this pretty fast. But presence, scale,
it's like you're there. Virtual reality is, first of all, stereoscopic. Your left eye, your right
eye gets a different view. It makes you feel like
there's depth perception. It makes you feel like your eyes are
actually inside a different world. It's immersive, and it can
often be extremely interactive. AR is different from VR. AR is usually a digital
projection of things where you still see the
real world, and the AR is on top of the real world. VR, typically 100% virtual. You can't see any of the
real world around you. Now you have things
like Magic Leap starting to dim AR to feel more like
virtual reality and things like the Meta Quest Pro, which are
starting to add in other AR elements on top of a camera feed. And we call that mixed reality now. But don't worry too much about
all those confusing terms, especially once we get to OpenXR
and what the XR part actually means. This is contentious. I like to say that
360 video is not VR. I think 360 video, because of its lack of interactivity, its frequent lack of stereo, and its lack of what we call a 6 DoF experience, which I'll get to in a second, is something different. And I think it gives VR a bad name. When someone shows
someone a 360 video, even if it's a very well produced
360 video, and says, this is VR, they're going to get
the wrong impression. So briefly, 3 DoF-- Degrees of Freedom-- that means
you can do three axes of movement. Typically that would
be rotating your head. Six degrees of freedom
is more true VR where you can also move
on those other axes. So you have your rotation and your position, and those are all coming together.
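If you want to see those two halves of the pose in engine terms, here's a minimal C++ sketch. The engine function is real API; the logging wrapper around it is just for illustration:

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"

// A 6DoF headset reports both halves of the pose; on a 3DoF device,
// only the rotation part is meaningful.
void LogHeadPose()
{
    FRotator Orientation; // the three rotational degrees of freedom
    FVector  Position;    // the three positional degrees of freedom
    UHeadMountedDisplayFunctionLibrary::GetOrientationAndPosition(Orientation, Position);
    UE_LOG(LogTemp, Log, TEXT("HMD pose: rot=%s pos=%s"),
           *Orientation.ToString(), *Position.ToString());
}
```

Some of the limitations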
of VR are that you can't use a lot of the tricks that
have developed in the camera industry over time, of course. You have a limited field of view. You can't change your focal
length without potentially making people sick. You can't do as many of the screen
space effects in Unreal Engine as you can when you're just
using this in a camera actor. It's expensive to render. You're not just rendering
one eye at 1920 by 1080. You're rendering two
eyes, maybe three eyes, as you'll see when we get
into the spectator cam. And it might be 2K or even 4K. And ideally, it's running at
90 or even 120 frames a second. So a lot more optimization often
needs to go into those experiences.
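To put rough numbers on that, here's a back-of-envelope comparison. The resolutions and refresh rates are illustrative, not any specific headset's specs:

```cpp
#include <cstdio>

// Back-of-envelope render cost: illustrative numbers, not real device specs.
int main()
{
    const double DesktopPixelsPerSec = 1920.0 * 1080.0 * 60.0;       // one 1080p view at 60 fps
    const double VRPixelsPerSec      = 2.0 * 2048.0 * 2048.0 * 90.0; // two ~2K eye buffers at 90 fps
    std::printf("VR pushes roughly %.0fx the pixels per second.\n",
                VRPixelsPerSec / DesktopPixelsPerSec); // ~6x
}
```

Also, you're always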
going to encounter people that, for whatever
reason, they're not going to want to put on a VR headset. They are afraid it's going
to mess up their hair, their makeup, their clothes. They're afraid people are
going to be laughing at them. Totally fine for you guys to laugh
at me when I'm in this headset. I don't mind at all. But I understand the concern
that other people have. Some health warnings--
people can get motion sick. They can get dizzy. They can have eye strain,
neck strain, headaches. Sounds like I'm doing
a prescription pill ad. But it gets to a point where-- by the way, headset hygiene. There was an ocular
herpes outbreak at-- I'm not going to say which event. But you've got to be careful. You got to keep these things clean. You're sharing things between people. And that also is a
totally legitimate reason for someone to say they don't
want to put on a VR headset-- so all things you want
to be thinking about. Best practices-- just a
few things to consider. First of all, if you're trying to
stop motion sickness, which often is caused by your inner ear not quite
aligning with what your eyes are seeing. It can be caused by
low FPS, by trying to move the camera forcefully
without someone actually having any agency
in what's happening. It can happen from just fast
movement rotation or position. So how can you prevent that? First of all, you could just
keep the player perfectly still. You can make sure that
you are just letting them walk around freely in an actual
space, what we call room-scale VR. You can use teleportation. And you can also use some
different mechanical hardware that tends to get very
expensive, but there's all sorts of aftermarket
ways to help prevent VR sickness. Or you can just tell
people to take a Dramamine. But ideally, you don't
get people sick in VR because VR sickness can last a
day or two, sometimes longer. And they're never going
to want to do VR again. So don't ruin it for the rest of
us who are trying really hard here to make this a thing. Some of the things that we
think about with movement is the fact that teleportation is,
by far, the most comfortable way to move someone in VR, just to
snap them to another location. The problem with that is that's not
how we behave in the real world. I don't, at least. I don't know about you guys. But I usually can't just pop over
to the other side of the room and instantly be there. So there's a few places
along the spectrum that allow us to start to make
that a more comfortable experience. First one is fading to black
during that teleportation, which I find can often help it feel a little more natural.
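Here's a minimal sketch of that fade-teleport-fade pattern in C++, assuming a single-player pawn. StartCameraFade is real engine API; the pawn class and the FadeTeleport helper name are hypothetical:

```cpp
#include "Kismet/GameplayStatics.h"

// Fade to black, snap the pawn, then fade back in. AMyVRPawn is hypothetical.
void AMyVRPawn::FadeTeleport(const FVector& Destination)
{
    APlayerCameraManager* Cam = UGameplayStatics::GetPlayerCameraManager(this, 0);
    Cam->StartCameraFade(0.f, 1.f, 0.15f, FLinearColor::Black,
                         /*bShouldFadeAudio=*/false, /*bHoldWhenFinished=*/true);

    FTimerHandle Handle;
    GetWorldTimerManager().SetTimer(Handle, [this, Destination, Cam]()
    {
        SetActorLocation(Destination);                              // the actual teleport
        Cam->StartCameraFade(1.f, 0.f, 0.15f, FLinearColor::Black); // fade back in
    }, 0.15f, false);
}
```

Making sure that movement is not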
necessarily slow, but constant. Acceleration and
deceleration often tend to make people feel more nauseous. Avoiding making anything, again,
rotate or move in unexpected ways and not grabbing the camera
from people and moving them. If anyone's done a roller
coaster experience in VR, sometimes they can be really cool,
but you have to, first of all, have some good VR sea
legs, but you also need to hold yourself just right
for it to not feel too nauseating. But everyone has different
tolerances for this sort of thing. And of course, in the
real world, ideally we would just let people's movement in VR match how they move physically. But that's not always
possible if someone's in a little New York City apartment
trying to make all this work. So teleporting is actually
the default option. It comes with both the CollabViewer
template and the VR template, both of which, again, are using OpenXR. It gets handled on the Pawn-- quick,
comfortable, but it's not realistic, and it does require a
little bit of extra stuff if you're trying to set
up a scene from scratch. I remember playing in Unreal Engine
for the first time around 2016 with this stuff, and I
couldn't, for the life of me, figure out why my
character wasn't moving. I was like, I did everything
just like the template. It's just a different scene. And I hadn't set up a NavMesh,
and I hadn't set up collision. So there's a couple extra
steps there if you're not going to start from a template.
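For reference, the check that makes teleportation work is essentially a NavMesh projection. Here's a hedged C++ sketch of the same idea-- IsValidTeleportSpot is a made-up helper, but the navigation calls are real engine API:

```cpp
#include "NavigationSystem.h"

// Made-up helper: is this candidate destination actually on the NavMesh?
bool IsValidTeleportSpot(UWorld* World, const FVector& Candidate, FVector& OutProjected)
{
    UNavigationSystemV1* Nav = UNavigationSystemV1::GetCurrent(World);
    if (!Nav) return false;

    FNavLocation Projected;
    const FVector SearchExtent(50.f, 50.f, 200.f); // how far to search for the mesh
    if (Nav->ProjectPointToNavigation(Candidate, Projected, SearchExtent))
    {
        OutProjected = Projected.Location;
        return true;
    }
    return false;
}
```

Quick, little-- I'd almost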
call this a fun fact-- the way that people
perceive the world in VR, you can hack it a little bit. There are circumferences--
is that the right word? Perimeter or something
like that, where if you get to a big enough
size, you can make someone feel like they're walking
in a straight line, even if they're actually going in a
curve, which is really useful if you want to have a very expansive
experience that feels like you can just go on forever. So there are some locations that
are large enough to pull that off. Often, you're going to be thinking
about this triangle, which you see in all sorts of other things. But finding that balance between-- I mean, I would say
performance first of all. You really want to hit at least 72
frames a second, more if you can. And then you're also looking at
the quality and the features. And anything that starts
to slow everything down is going to be hurtful there. If anyone dives into
something like RenderDoc, there's all sorts of tools in
there to really figure out, why isn't my frame rate
where it should be? And we'll maybe touch
on that a little today, but that's also a
little more advanced. When you are designing a VR
experience in Unreal Engine, it's important to
think about how you're going to guide the eye
because, again, you don't have some of the
typical tools like being able to move the camera
and everything we're used to in cinematography. Even cuts-- teleportation
is a little bit like a cut. But ideally, it's a
little more like theater. It's a little more spatial,
deals more with proximity. And you are trying to naturally
guide the eye with stagecraft, trying to get people to look at certain
things in more natural ways-- audio cues, visual cues, lights. One VR experience that I
show people a lot is called The Book of Distance,
which is one that I think does a great job of most
of the scene is black. It's not trying to overwhelm
you with a huge environment. And it really helps you know
what you should be looking at as that experience goes on. Newcomers in VR, you'll often
find people just stand still. They'll put on the
headset, and they'll just look at whatever's there. And you're like, there's stuff behind you too. And they don't think to turn around. Everyone I have given a "Tilt
Brush" demo to for the first time, I find that they just
stand there, and they treat it like there's a
canvas right in front of them when, in fact, they should
be swooping through the air, and dancing, and acting
like a choreographer. They don't have to, but it's
nice to see people understand the affordances of a medium like
this that really allow for things that otherwise wouldn't be possible. And lastly, as you saw with
the David Geffen Hall mockup, tactility is a really nice
thing to be able to add to VR. It doesn't need to be something
super expensive or complicated. You can use something
like a Vive tracker, attach it to a chair
or something like that. Or just have a chair that's
fixed in one location. And just the ability
to allow someone in VR to be able to sit down and stand
up and walk around and have some synchronicity between the
things they can touch and feel in the real world and in VR
is incredibly helpful. So the last thing I
think I'd say here-- I think it's the last thing-- is
that in VR, you have to remember that you can't hide things. You can design a level
where normally if you're using like a third-person controller,
people can smack into walls and get stuck. And you can very clearly guide their
actual interactivity and movement there. In VR, naturally, you're going to
find that some people will just walk further than you want them to walk. You might have stopped the
collision area, or the walls, or the NavMesh at a certain point. But someone can still just physically
keep walking and maybe go into areas that you don't want them to be in. And typically the
solution to that is something like fading the headset to black. But you just always
have to be remembering that you don't have as much
control over what people are doing, and you can't, again, move the
camera wherever you want it to go. So you have to think more in terms
of World Building and Environmental Design than as a
filmmaker might, where you just have very key sets that
are only facing certain directions, and nothing else
needs to be built out. That being said,
everything I just said-- more like guidelines
than actual rules. Every single thing I
just said "don't do this"-- there are experiences
that have broken those guidelines and done incredible work. But I would quote Picasso here and
say, learn the rules like a pro, so you can break them like an artist. It's a good quote. So diving in now into OpenXR,
one plugin to rule them all. What is it? How is it working in
the Unreal Engine? And what can we do with it now? Right now, it is, first of all-- it's an open standard--
we'll call it that, that's the right terminology--
where the goal of it is to make it so that you don't
have to build your experience nine different times for
nine different devices. It's something that just lets
you create one experience, have all the different
inputs, and mappings, and interactions targeting
every different device that it might end up on. And Then you can build a Quest version,
and a Vi version, and a Desktop version, and a Mobile
version, and, on the way, more with augmented reality. You'll notice that most of
this is virtual reality. I'm a little bummed that Magic
Leap isn't on this list anymore. The first course I made
for Unreal Online Learning was How to Build Magic Leap in
Unreal Engine 5-- or Unreal Engine 4. And now none of that applies anymore. But the goal really is to just
make it platform-agnostic. We want to make developers'
lives as easy as possible. I say we as though
I'm a part of this. I'm not. I'm just reaping the benefits of it. First thing you're going to
want to do when you're getting started with OpenXR,
especially if you are coming from a background
of actually enabling the more platform-specific plugins. You're building something for Oculus,
so you enable the Oculus plugin. You're building
something for Steam VR, so you enable the Steam VR plugin. By the way, isn't it nice that
Unreal Engine 5 doesn't always start up with Steam VR, even if
you're not building a VR experience? It was such a little thing where
I was like, thank goodness. But the first thing
you're going to need to do is tell whatever runtime you're
using to use that one for OpenXR. Otherwise, you're going
to build an experience, you're going to launch it,
and you're going to say, oh, and it can have
different behavior. You might be able to look around,
but your controllers won't work. You'll have some kind
of unexpected behavior, unless you're actually
connecting this. And there is some
software out there that automates this a little bit more
but the simplest thing to do is just to open up Oculus, Windows Mixed Reality, or Steam and say, hey, this is the device
that I'm connecting. Please use this for all OpenXR
experiences that I'm running now. Here are some of the
plugins that you'll see if you search for OpenXR
inside Unreal Engine 5. Mostly, for now, you're just
going to want the OpenXR plugin. The eye tracking and hand tracking
isn't super implemented yet. That's all on the way. And there are other OpenXR
plugins on the Marketplace that are more tailored to
different kinds of devices. So all that being said, I
think now would be a good time to try to open up Unreal Engine. And-- oh, that's the thing I forgot
to do when the computer restarted, and I did not reopen it back up. So actually while I'm doing this
here, I'll show this for a second. Yeah, but the question
I get asked all the time is, how do you get started with
actually doing any of these things? And the answer is actually
to start with the templates. Before you dive in and you say,
I've made another experience, and I'm trying to make
it all work for VR, you actually ideally would start
with something that's already perfectly set up for it and
then take that knowledge and bring it into other projects. So we'll start with the
CollabViewer template, which we're only going to touch on. But this is your out-of-the-box
multiplayer solution in Unreal Engine. This is a very easy way to
have a host, and clients, and have everyone join together. It's made for working together on
different kinds of projects, usually design. The way you access it is by going
into either the Architecture or Automotive Product Design
and Manufacturing sections and grabbing the CollabViewer
template up there. And here's what it looked like
during Simon's lab the other day. We had 15, 20 people at a time,
all running around together-- some in desktop, some in VR. So it's really nice how
it supports that kind of multi-platform
experience in real time. I'm going to play a little video
here while I open up the project. So here's what it looks like as
you start to pop into a session. And you see that, by default,
you are actually in not VR mode. So I'm flying, and
walking, and orbiting. But then you can click VR. And as long as you've got your
headset connected, you can pop in. It looks like that. Yeah, so as I said, you can teleport
right out of the box, move around. You'll notice that whatever
direction that is pointing-- sorry. That's Unreal Engine hitching it up. Whatever direction
you're pointing, that is the direction that you'll actually
be pointing after you teleport. And then there's some features here. It's not the full menu of everything
inside the CollabViewer template. But you can do things
like draw a little bit of "Tilt Brush" kind of action here. So here, I'm like, oh, I don't know. Maybe this should be a window. So you can annotate things. You'll get a little VR
keyboard that pops up. And this can all be happening live
in a multi-user experience where everyone can see it together. And this is totally extendable. You can add additional
functionality to this. But the out-of-the-box
stuff is pretty cool, just being able to put everything
in X-ray mode to isolate things. And again, doing
this all with a group of people, if you were
viewing a project together, is really exciting. There's even Data Smith
Runtime now, so you can be loading new projects on the fly. Here I am just taking
some measurements-- just, oh, we're going to
modify this walkway, and we've got to see how
long it is right now. How tall is the building right now? Let's find out. And you'll notice, by
default, it's in meters. Unreal Engine, of course,
is in centimeters. But if you do a little
scroll down, you can convert everything
to Imperial there. And over here, the bookmark, you can
jump around to different locations that you've preset-- so
teleported waypoints, as it were. And then the last thing
I think I'll show here-- or sorry-- two more things. One is the Section Box tool. This is a little bit like for
anyone who's used Revit before. You can have these different
boxes that are either including or excluding what is inside of it. So right now I'm deleting
whatever's in the middle. Then, I inverted it. And now we're only seeing
what's inside of it. So this is a really useful
way to make sure that everyone is focused on the same thing. As I alluded to earlier, I
think a lot of times in VR, people feel too much of a
need to show everything. If you're building a
complete game or experience, yes, you don't want
people seeing nothing. But if you're in a
design review meeting, only show people what they
need to see, so they don't get hung up on some weird design
byproduct that's off to the side. Hey, Sam. What's up, man? Can I take a picture of you? Great. Look at this awesome
VR camera I have. I can move it wherever I want and
just save that into my folder, and there we are. So very quick rundown of that. I mostly want to focus,
though, on the virtual reality template today, which I would
say is a little more built out and a little more extendable. So let me see. First of all, this
is how you access it. You go into Games, and then you
can open up Virtual Reality. Again, it is in Games. But I would say that can
use this for anything. I've used it for every kind
of project you could imagine. And let's see if I can get it to run. There we go. So here's Unreal Engine, opened up. We are in the 5.1 preview right now. And I'll just start by showing
you what's going on here. And the first thing I'm going
to do is actually make sure that I'm using the
original pawn, which I am. And I'm going to make sure
that my enhanced input-- which is something
new, and we'll touch on that-- is set up the
way that I expect it to. It is. Great. So what I'm going to do now is
I'm going to hit a VR Preview. And you can hear a little bit of
the fire burning in the middle. And I'll just give you a
little bit of a rundown of what this supports-- if my
VR headset doesn't break the moment that I put it on. [GRUNTS] OK. So right now you're seeing
that I have controllers. We can change that
in a second actually. And I can teleport just by
pushing the joystick down. And that will go
wherever I want it to. And I can do things
like pick up this. And once the shaders load, these
will be little yellow balls. And there's some physics going on. We also have a very simple way for
creating grabbable objects, which I'll go over. And I can teleport over here. There's a No Teleport zone, where
you'll notice we're saying, hey, we don't want people over here. [FIRE CRACKLING] And you get a
sense of the spatial audio we have. The closer I get to this
fire, the louder it is. Now, I'll go back for a
moment, and I will open up the spectator camera, which actually
allows me to be in VR while I fly around with my WASD keys. And fun fact-- the
spectator camera actually works even if you don't
have a VR headset attached. So it can be a useful way to
just zoom around your level.
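Under the hood, this is the spectator screen API. Here's a rough sketch of pushing a SceneCapture's render target to it-- the two function calls are real engine API, while SpectatorRT is a hypothetical render target you'd create yourself:

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"

// Route a render target to the desktop window instead of mirroring the HMD.
void ShowSpectatorTexture(UTexture* SpectatorRT)
{
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(ESpectatorScreenMode::Texture);
    UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(SpectatorRT);
}
```

Although something I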
recently learned-- and I just want to share it with you-- is
even if you don't have something like the spectator camera in your
scene and you want to fly around, the semicolon key on the keyboard or the console command ToggleDebugCamera will actually let you fly around any
scene at all, which is pretty cool. So that's a very, very quick overview
of what we get out of the box. And what we can do now is show
a very, very quick preview of what the hands look like. So this literally just
released with a Preview 2. But now you have some
nice inputs there for if you want to use hands
as part of your project. And we'll dive into the blueprints
a little more for that setup. So to go over a little bit of
what you get out of the box, first of all, something amazing
about this template that blew my mind the first time I opened
it is as long as you have Android set up, you can build to Quest
and to Windows immediately. I think Linux is almost there
as well, which is pretty cool. So it's nice that
you don't need to do a whole bunch of
different configurations for Windows and for a mobile headset. It can build exactly the
same project, exactly the same starter map and
work great on both a totally mobile standalone platform
and a desktop one. So how is it set up? Now we're moving a little bit
more into the territory of what if this is something
that you actually want to implement in a project
that you've already created? Again, I would say,
start with the templates. But let's talk a bit about
what you're going to be doing. First of all, as we
alluded to with OpenXR, you can use a variety
of different headsets. There are different
considerations if you're targeting desktop versus mobile. You can have everything
scaled down so that you say mobile's the baseline. And then anything
desktop is still going to be running at mobile settings. Or you can have different
branches in your blueprints that might target
certain kinds of settings based on what the
platform that is detected. You can do that manually
or automatically. I also want to mention that something
that's happening more and more is a move toward the ability
to pixel stream, cloud stream. I mentioned my company
earlier where you can use things like NVIDIA Cloud
XR, or Airlink, or virtual desktop, or Vive Business Streaming
to use a standalone head-- or yeah, a standalone headset
like the Quest totally wirelessly, but get the fidelity of something
that's actually running off a much more powerful computer,
either one you have locally or one in the cloud. And so I'm finding more
and more of these days that I like to crank
up my VR settings, make things as
good-looking as possible, and then run it on hardware
that can handle it. Only tricky thing there
is you need to make sure you have a strong
enough internet connection to handle the latency
of going back and forth and also streaming a high
enough resolution setting. But it essentially becomes
a Netflix Bandersnatch kind of level of what the
data signal is like because you need to
tell that computer where your head is, what your hands are
doing, what buttons you're pressing. And then it needs to send you back
in milliseconds exactly the signal that you should be seeing. That being said, you might
think that's definitely going to make you sick. Not necessarily. I've been very impressed. There's a lot of
predictive algorithms that help make sure that you
are going to feel comfortable in the experience. And if you turn your
head really fast, there's certain assumptions
about the way physics works that allows some
predictions to know where you're going to be in a moment. OpenXR, again, one-stop
shop for input. You'll also see as I open up
the new enhanced input settings that it's very, very
straightforward now to make these modular
UASSET files-- you remember before with input,
you'd have basically a little text INI file
that had everything listed there. Now you have these assets that
are much easier to share and bring across projects that are
targeting different devices. And then, yes, just
a little reminder-- OpenXR with Lumen and Nanite-- it's
there, not super stable quite yet. So when you are targeting Quest,
one thing you want to set up is in your VR settings
in the project settings. Ideally you use
something like foveation. Foveation is what allows you to
render the middle of your view at a much higher resolution,
higher quality than what's around. I think the human eye
only actually is ever looking at something like three
or six degrees of its view. So being able to get closer
to only rendering that saves a lot of processing
power, especially when you're running from a mobile GPU. Being able to have
everything else be rendered at a lower resolution, or blurry,
or whatever is easier for a device to render is pretty great. And you'll notice there's
low, medium, high. There's a Dynamic Foveation level. And now as we move toward more
headsets that have eye tracking, that also helps with knowing
exactly where you're looking and how to foveate that rendering. Typically with a device that
does not have eye tracking, it's just going to be the center
that has a higher resolution. And everything else around it
is going to be more blurry. Some other things you
want to make sure are on-- Instanced Stereo, Mobile Multi-View. Be careful about enabling Mobile HDR. It says Mobile HDR. What it really means is, do you need
your post-process volume settings to be working? And again, some
post-process volume settings don't work with VR at all,
especially on a mobile device. But if you are going to
be enabling Mobile HDR, that's probably because you
have an experience where you've done the color calibration just so. You're using lookup tables, or you
want the bloom to be a certain way. You're doing a theatrical experience,
and you want all this nice-- yeah-- bloom to come
off of your lights. It's incredibly expensive. So just be aware that the moment you
hit that checkbox for Mobile HDR, you're cutting your frame
rate in half, basically. Also, Round Robin
Occlusion-- just another checkbox I actually
would recommend hitting. I don't know why I took a
screenshot where it's off. But it gives you basically
one extra frame of latency, but then actually has pretty
significant render savings.
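As a reference, here's roughly what those checkboxes write into DefaultEngine.ini. This is one possible setup, so double-check the keys against your own project settings rather than trusting it blindly:

```ini
[/Script/Engine.RendererSettings]
vr.InstancedStereo=True        ; Instanced Stereo (desktop)
vr.MobileMultiView=True        ; Mobile Multi-View (Quest and other Android HMDs)
r.MobileHDR=False              ; leave off unless you truly need post-process features
vr.RoundRobinOcclusion=True    ; one extra frame of latency for cheaper occlusion queries
```

On Android, I'm not going to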
go through the whole process of actually downloading
installing Android. There's great documentation on that. But once you do have
it installed, there's just some basic settings you want to
do for what the name of your project is. Usually you do company name, dot, whatever your project name is-- something like com.MyCompany.MyProject. And then you can make sure that
you're using the right SDK versions. And that's all down there
in the Android plugin setup. You might also see
some warnings that say, this project is not
set up for Android yet. Would you like it to be set up? Yes. So they do try to
walk you through this in a fairly straightforward way. Over in the SDK settings, this
could be populated automatically, but you will have to tell Unreal
Engine where all of those paths are. And I guess I should
have also been very clear, the reason why
we're talking about Android is because most VR standalone
devices right now run on Android. When Apple comes out with their
headset this year, in five years, in 20 years, whenever it happens,
probably will not run on Android. But right now that's the
state of the industry. Forward rendering-- so this
is right out of the box. I can tell you, this
is the thing that is going to make VR much,
much more performant for you. It also means you cannot
do Lumen and Nanite. Forward rendering allows you
to render much, much faster. It works great if you are
baking a lot of your lighting, if you are going to have a
scene that is very optimized. If you're going to have
a lot of movable lights, if you have things
that are moving around, then you actually might
get better performance with deferred rendering, which is
what happens if you just uncheck the Forward Shading checkbox. But it's going to be
a much heavier scene and would not recommend trying to
do that on a standalone headset. Another nice thing
about forward rendering is that you can use multi-sample
anti-aliasing, which is very, very sharp. Looks good in VR.
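In config terms, those two settings land in DefaultEngine.ini roughly like this. Again, verify against the Project Settings UI rather than taking these keys as gospel:

```ini
[/Script/Engine.RendererSettings]
r.ForwardShading=True   ; forward renderer; enables MSAA, rules out Lumen/Nanite
r.MSAACount=4           ; 4x multi-sample anti-aliasing
```

I wouldn't recommend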
using it with MetaHumans. I find MetaHumans look more
uncanny valley with it-- just my personal opinion. I actually try to use deferred
rendering with MetaHumans in VR, if I can. And I usually don't put them
inside a standalone VR headset. Some post-process settings you
might want to be aware of-- first of all, again,
they're expensive. So even at the engine
level, I recommend turning them off if you
can or overriding them in the post-process volume. Quick reminder about the
post-process volume-- I know a lot of people,
they see the check boxes, and they think the check boxes
mean you are turning on settings. What it actually means is
you're overriding settings. You have your post-process
settings in your INI file. That shows up in your
project settings. Those are all on by
default. Any checkboxes you have there are just on. Then you override those with
your post-process volume. So if something is off, you can
override it by saying it's on. If something is on, you can
override it and say, no, for this particular volume,
I don't want bloom on.
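That checkbox-equals-override idea is visible in C++ too. Here's a minimal sketch-- the bOverride_ flag is the "checkbox," and the helper name is made up:

```cpp
#include "Engine/PostProcessVolume.h"

// Setting the value alone does nothing; the matching bOverride_ flag
// is the "checkbox" that makes the volume actually force it.
void DisableBloomForVolume(APostProcessVolume* Volume)
{
    Volume->Settings.bOverride_BloomIntensity = true; // check the box
    Volume->Settings.BloomIntensity = 0.0f;           // the value it now forces
}
```

Another thing you want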
to be thinking about is that cameras can do
another level of overriding, and that can be your VR camera. It can be something more cinematic. But that's just another level
of overriding the overrides of project settings. Post-process volume's something else. So some basic settings here-- set
your Screen Space Ambient Occlusion low or off. It's expensive. Set your bloom low or off. It's expensive. And I just have to say, I
think convolution bloom looks-- [KISSES FINGERTIPS] chef's
kiss-- so good in VR, but it is very expensive. Screen space
reflections, if you can-- I mean, if you have something
that's very, very blurry, I think they're OK. But for the most part,
particularly in VR, I find them incredibly distracting. And in deferred
rendering in particular, they actually have something
really wonky going on with them. Then, they don't really work
with forward rendering at all. And then also with reflections,
I'd say planar reflections-- I have a slide on that in
a moment-- they look great, but they are very expensive. But they are the best
mirror quality setup in VR if you want to have good-looking
reflections outside of, of course, ray tracing and Lumen,
which you can do in VR. It's just a factor of five
more expensive than anything we're discussing here. Lens flare doesn't look good in VR. Motion blur doesn't look good in VR. Don't put dirt on
people's lenses the way you would with a free-cam war-movie cinematography effect. It's going to make people feel
very wrong, like their eye is about to die. So be careful. Here's the setup for
planar reflections, for anyone who wants
to play with that. It's just one checkbox. It'll recompile all your
shaders, and you just have to be careful
because it's expensive. And sometimes when you're using a
render target like the spectator camera does, which we'll
go over more shortly, it can look a little bit funky. But Target Hardware--
this is a setting that is just a very, very quick way
to, even in an existing project, start to make it a
little more suited for VR. If you just go ahead in
your project settings and you change it to
mobile and scalable, it's going to be
like, oh, do you want to change these 30 project settings? And you can say, yes. And now all of a sudden, your project
is much more optimized for VR. Also, this is going to be going
away soon, but it's not gone yet. So I do want to mention it. If you enable the Oculus VR
plugin, which normally you wouldn't do if you're using OpenXR,
there's a great little tool in here that you can use for any VR headset. It doesn't have to be an Oculus,
Meta, Facebook, whatever headset. And you can launch what's called
the Oculus Performance Window. And it will actually flag things
that it sees in your project that might be costing you frames. It's just a very nice kind
of, oh, hey, you're on mobile, and you've got a bunch of
movable lights casting shadow. Maybe don't do that. And it's just a good little-- I'd call it a stupid check. Even if you've been building VR
experiences for years like I have, I'll still pop into this every so
often and say, oh, yeah, of course. I made everything glass
when I could have just made all those windows blank. And on a mobile VR headset, it's
definitely worth the cost savings to make glass nothing, for
one particular example. VR Mode, I want to touch on as well. This was something I got super
excited about when it was introduced maybe five or six years ago. It was called the VR Editor then. To access it, you do actually
need to go back and enable Oculus VR, Steam VR, et cetera. It's coming to OpenXR. But what it allows you
to do is take any scene that you have running-- doesn't
even have to be a VR scene as long as you have those plugins enabled. And it lets you explore that scene. You can fly around. You can move things around. There's two different modes. This first one is basically
like the VR Editor mode that lets you pull
up your content windows. You can drag materials onto things. You can simulate physics. You can have a cute,
little flashlight that you can go around
your scene with. And especially if you start playing
with something like Lumen and Nanite in 5.1, I'd recommend
actually starting there. I find that actually runs much
smoother than trying to do something like VR Preview or a build,
probably based on the way that it handles resolution. And then that's just how
you set it up a little bit. It's a little bit of
what it looks like. It's great to paint, things like
trees and do your landscaping because you just feel like a
god who's like, mountain, tree. And you just get to set
everything up that way. You can move cameras around,
any kind of objects you want. And you can also make it so that
if you're really going hardcore into the VR Editor,
you can make it so that every time you
put on your headset, it's like, all right, we're doing
the editor, but now we're in VR. It just does it automatically. And then this is newer. This is the Virtual Scouting
tools that come inside the Virtual Production template. This gives you this wonderful
array of cameras and tripods. And you can create
dolly tracks and set up markers for A camera and B camera. And it's a very cool system for
thinking about virtual production and just feeling like you're on site. So it's a pretty easy
template to even migrate over to an existing project. And now you're walking
through your environment and setting up all of your shots. And again, the final project
might have nothing to do with VR, but it is pretty cool to be
virtually scouting a location if you're just setting
up some cool videos. OK, VR Pawn Extensions--
let's see how we're doing on time-- pretty good. [FIRE CRACKLING] So I'd like to show you
guys a little bit of how we can make very loud fire-- how we can add a few
features to the setup that's already inside of the
template that I would just call them extensions. It's taking what's there and
maybe going a little bit further into options, into scalability,
into different levels of comfort. So the first one I have here
is dash teleportation, which is instead of blinking
from one spot to another, it actually lerps you from
one location to another. And I use the timeline
node, which is basically a little Sequencer that plays
everything over a second and then brings you there. We'll look at that in a second. But actually, the first thing
I want to show you guys-- I should have moved
the slide earlier-- is a dampener to dampen your
spectator camera because I find that-- and let me just demonstrate this,
and feel free to close your eyes if you don't want to see this-- that if you just watch someone
inside VR, like me here, it can be a little bit nauseating
to just be like, my god. I might be having a great time. It's like someone who's
a crazy driver in a car, and they're having fun
swerving all over the road, and you're a passenger
who's about to throw up. So first thing I'd
like to show you is, how can we make this a
little bit less wonky? And it's actually not that hard. So let's go ahead and open up. Again, and we are literally
just in the sample scene that anyone will see if
you come into this project. And we're going to open
up BP_VRSpectator-- so Edit. And in here, we'll just go through
a very quick rundown of some of the things happening in here. We say if it's a mobile device,
we can't do a spectator camera, so don't worry about it. Initialize your input because you got
to make sure you can click things. The Level Blueprint, just a
reminder, is the only blueprint where you can just put in inputs for
things, and they're going to work. Otherwise, you need to
say, hey, please do this. Find the player controller and make
sure that it is enabled for them. This is the new enhanced
input local system. So you do this also. You do a Mapping Context. And then you say, hey, I want to use
these keyboard presses in this way. We'll dive more into
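For anyone doing this from C++ instead, the equivalent of that Add Mapping Context node looks roughly like this. The subsystem API is real; the Context pointer is whatever UInputMappingContext asset you've made:

```cpp
#include "EnhancedInputSubsystems.h"
#include "InputMappingContext.h"

// C++ equivalent of the "Add Mapping Context" Blueprint node.
void RegisterMappingContext(APlayerController* PC, UInputMappingContext* Context)
{
    if (ULocalPlayer* LocalPlayer = PC->GetLocalPlayer())
    {
        if (auto* Subsystem = LocalPlayer->GetSubsystem<UEnhancedInputLocalPlayerSubsystem>())
        {
            Subsystem->AddMappingContext(Context, /*Priority=*/0);
        }
    }
}
```

We'll dive more into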
that in a moment. But we go through this. We say, OK, is the
spectator camera enabled? Great. Then let's jump over to here where
we enable the spectator camera. You don't necessarily need to know
what every single note is doing. I often say, blueprints, it's a
little bit like driving a car, to go back to the earlier analogy. You want to know enough about your
car to be able to take care of it, but you don't
necessarily need to know how every gear, and axle, and
piston works inside of it to get it to do what you need it to do. So we enable the VR spectator. There's a toggle for
making it go on and off. And you'll notice the
input nodes here-- again, enhanced input, a different setup--
triggered, started on, going, canceled. And what we basically
want to do is we want to find what's happening
on Event Tick, which, of course, is every single frame. And so right now it's
changing the field of view as needed because we can do
that with our mouse wheel. And right now there's two modes. There's a Fly mode and
a First-Person mode. And basically the
First-Person mode is as it is. It's just going to match
what you see as a VR player. But that's the spectator view. And again, I think it's a little
bit nauseating to do that, so let's see if we can dampen that. And so what I can do
is we can say, OK, if we're in the
first-person mode, then I'm going to create a new
custom event that is going to actually handle this dampening. So we're going to come down here. I'm going to type in custom event. And I'm going to call this dampen VR
camera view, or something like that. And then just so I
don't forget, I'm just going to go back up
here and remind myself that that's what I want
this to connect to, even though it's not
doing anything yet. So if we're just in
Fly mode, if we're going around with a spectator cam
however we want, no problem at all. If we want to jump into here,
then it's going to pop down here. So what we're going to do at this
point is we're actually going to go, and we're going to find-- or actually
we can just right-click here, and we're going to find
an interpolation node. So an interpolation node-- a little bit back to lerping
and trying to smooth something over time, there's different
first-letter prefixes depending on what
you're trying to do. So you can do F for a Float. You do V for a Vector. And I'm just going to do a
Transform-- so TInterpTo. And so this is kind of a magic node. When I first was trying to
set up my camera dampening, I had 100 nodes here. And I'm like, oh, I got to--
it was a lot of lerping. And then I found that
this one really does exactly what you want
because you are going to say, hey, here's where this thing is now. Hey, here's where I
want it to get to. And it's going to do it the
speed that you want it to go at. Oh, don't autosave. So we are going to go and
set an actor transform. Actually, we don't even
have to set the actor transform. We can do the actual
camera one right now. So we can do set world transform
for our actual camera here. I'm trying to do it for the
actor, not the component. There we go. And now target self-- that's all fine. This is going to plug
into here because that's going to be the new location. And now we just need
to know, where is it? Where is it getting to? How much time has passed? And what speed do
we want to do it at? At zero, you're going to
be doing it immediately. There's a lot of things in Unreal
Engine where zero means almost ignore it. Just skip right to it. But then, as soon as I bring
this to something like 0.1, it's going to be very,
very, very slow versus then if I go to something like 10,
and it's going to be much faster. So we'll see that in a moment. I'll leave it at something like
0.5 for now, so we can see it. For delta time,
basically we just want to be getting the equivalent
of the event tick up here. So just to grab that real
quick, get delta time-- or, ooh, it's delta time or
world delta seconds. It might be world delta seconds--
how much time has changed. Remember, old math classes,
delta means a change over time. And then current,
we're going to say, OK, what is the current
transform of our actor? Oops. Not the comment box. Live demos are fun, pressing
wrong buttons and everything. So we're just going to Get
Actor Transform for self. And then the target is going
to be wherever the camera is. So we're just always, in a
lazy way, trying to keep up with where the actual VR camera is. So we're just going to
get the player pawn, zero. And we're going to get
their actor transform. Actually, you know what,
instead of the player pawn, because that won't necessarily
be where the camera is facing, we're actually going to
do a different one here. We're going to Get
Player Camera Manager. That will actually show you
where the actual camera is. And then we can connect
that into there. Now, do bear in mind that because
we're doing the entire transform, if you've got a different scale,
that can start to get messy. You could also do-- there's a node for set
location, set rotation that leaves the scale alone entirely. But this is the really simple way to
just go ahead and dampen the camera. Something like that. OK. So again, just to
review, up here, we're saying, hey, on event tick,
every single frame, let's check to see if we're in fly
mode or first-person mode. And if we're in
first-person mode, let's go ahead and dampen the VR
camera view every single frame. And so this is firing
in every single frame. We're trying to set the current
actor transform of the VR Spectator Blueprint. We're finding where it is right now. And then we're trying to very slowly,
at a speed of 0.5 at the moment-- and we could, of course,
promote this to a variable-- public variable, even,
once we compile-- public. We're just going to try to get it to
get close to whatever the VR player camera is doing.
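Here's the same dampening logic as a C++ sketch, just to make the data flow explicit. UKismetMathLibrary::TInterpTo is the node we just placed; the class name is hypothetical:

```cpp
#include "Kismet/GameplayStatics.h"
#include "Kismet/KismetMathLibrary.h"

// Called every frame while in first-person spectator mode: lazily chase
// the VR camera instead of matching it 1:1.
void AVRSpectator_Ext::DampenVRCameraView(float DeltaSeconds)
{
    const APlayerCameraManager* Cam = UGameplayStatics::GetPlayerCameraManager(this, 0);
    if (!Cam) return;

    const FTransform Target(Cam->GetCameraRotation(), Cam->GetCameraLocation());
    SetActorTransform(UKismetMathLibrary::TInterpTo(
        GetActorTransform(), Target, DeltaSeconds, /*InterpSpeed=*/0.5f));
}
```

So Save All, come out of there. And right now we could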
actually set up a button for toggling between these. But I'll just say, instead
of fly mode, down here, we're going to do first-person mode. So let's see if that works. Go ahead and hit Play. And I'm looking around, and
you see it's all wobbly. And oh, no. Stop it, Alex. And then I go into this
mode, and it is first-person. And what's actually very
interesting is right now I am actually not even
seeing this in VR. So something funky is happening
because I don't see the experience. All right, shall we resume? Great. I like that attitude. So let it be known
that I did everything right at the end of last
session, and the VR headset was just freaking out. This is a slightly more robust
version of the VR spectator template that we were just looking at. I'll walk you through a little bit
of how everything is happening, but those key functions that we
did together, that's at the bottom there. We're setting the actor transform. We're using the
transform interpolate to. We're getting the current
location of the spectator camera, and then where the
player camera manager is, and then trying to
basically connect them. And so let's just see what
that looks like in action because it is going to guide what
we're doing moving forward here. I'm going to talk about
that one in a second. So here's our scene. And I also had a 5.1 Preview one
version of the project open before. So that's why we didn't see hands. Now we should see hands. So you'll notice, by
the way, I'm also going to launch a standalone game here. The reason I was playing more in
standalone while you were all gone is because there is a feature where
you can change your size in VR. And right now it works
totally fine in a build. In the Editor, it tends
to not behave correctly. So if you do a standalone game,
it's treated almost like a build. And then the only thing you want
to make sure you do is in your-- not Project settings-- in
your Editor Preferences, you'll want to make sure that you
launch with the launch parameter -vr. So that's just like
you would do a launch parameter for a build version. That will make this work correctly. So let me show you real quick what's
going on with the spectator camera. I got to pop back into VR. You'll notice if a
headset isn't detected. That's what it will go to. But I think I'm still in it. There we go. OK, so I'm here. You'll notice I've got hands now. So this is my fully
decked out pawn that has all sorts of expansions from
where we start in the template. So we'll walk through
a bunch of those before we get to some fun, very
experimental Lumen/Nanite stuff at the end. So let me toggle over. Actually, it's doing it already. So I want you guys to
notice that if I do this, it's not nearly as shaky as if-- yeah, there we go. So that's much shakier. And then what we can do as we go
to the Spectator Camera-- so here's the Spectator Camera. And then I can go over
to match what I'm doing. And now even if I shake my head
a bunch, it's dampening it. So it's a much smoother view. And now I'm much less likely
to make all of you sick. I mean, I would say
there's a sweet spot because if you do too much
dampening, it almost feels like you're on a swing or something. So I'd almost call this a little bit
too much to the right, so to speak. But it's a lot better, I think, than
just getting the super shaky view that you would otherwise get. Someone was asking during the
break about when you have maybe not the best PC, and you do want
to provide a spectator cam that isn't going to make people sick. And I have a good answer
for that because I've been in exactly that situation
carrying around a subpar laptop to give VR demos on. And what I've tended to
do in that situation is I will go into the Spectator Camera
right here, and I'll go ahead and I will go to my Scene Capture with a texture target. I'll crank this up to
a very high resolution. I'll make sure that I'm rendering at
top quality for the capture source. I'll go into the
post-processing volume and crank up all the
quality settings there. So that all looks great. But then I'll go to the VR side
and in the VR pawn and the camera, I'll crank everything down. So that way, I'm getting a good frame
rate, and I can be in VR for a while without getting sick. But I don't have all the
bells and whistles of bloom and other high-quality
features on VR for me. I'm tailoring the
quality of the experience for the majority of the
people who are watching it, depending on the presentation
because, as I said at the beginning, sometimes you just will get people
who do not want to be in VR. They're OK with watching
someone be in VR. And again, I've done a lot
of theaters in my time, so it's a lot of Alex, can
you please lean forward? OK, I'll lean forward. And they just want to
see how the view changes. And you want that experience to
work well for them as a spectator. So let's go ahead
real quick and open up and just take a little bit of a look
at some of the additional settings we have. So this is where we started. We're firing that event. As we said, we're
connecting all this. But I also have
another feature in here that I wanted to add in
because one thing that's a little bit of an issue with
having it in the simplified setup is if you want to toggle between
the flying version of the spectator camera and the version
that matches, when you press Tab to go back
into the flying version, it is actually still
going to be matching where your head was because you've
physically moved over the spectator camera to where the head is. So what I'm doing here,
I'm saying, OK, let's flip-flop when we press the M key. I am not using the enhanced
input here-- very bad to just say we're going to press the M key. But we can see that,
first, we're going to be in the first-person mode. And then before we start
doing what we're going to do, I'm grabbing the transform. I'm saying, where is the
spectator cam right now? Let's save that. We're going to save
that, so then when we are done going through all
the wild, wonky world of being inside the VR player's head, we can
snap back to that other location that the transform was at before. So we're saying, Get
Actor Transform, set that as a transform variable
for last free transform. I then just do a
little debug that says we're now matching the player cam. And then this is a
little bit more advanced, but I'll walk you through it anyway. Then I'm actually going
to cast into the VR pawn because there's a little
bit of a problem when you do a dampened camera where, as you might have noticed, there's a headset now. It's there by default--
you could turn it off. But there's a physical
static mesh of a headset. And if you're dampening
the spectator camera, you keep bumping into that headset. So the little thing I'm doing
here is actually popping over to the spectator visibility,
and I'm saying, hey, we are we going to be
bumping into that thing? Yes. Then let's go ahead and hide the
HMD so that's not in the way. And similarly, I'm doing something
over here where for the camera mesh, I'm making it so that it is only
being seen in the Scene Capture. The Scene Capture's what's
happening over here. So that way, the VR player
doesn't have the camera constantly bumping into them in the scenario
where the camera is actually following them directly.
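The component flag doing that work is worth knowing. Here's a hedged sketch-- bVisibleInSceneCaptureOnly is a real primitive-component flag in UE5, while the helper function is made up:

```cpp
#include "Components/StaticMeshComponent.h"

// Make the camera prop visible only to the SceneCapture feeding the
// spectator screen, so the VR player never sees it in-headset.
void HideCameraMeshFromHeadset(UStaticMeshComponent* CameraMesh)
{
    CameraMesh->bVisibleInSceneCaptureOnly = true;
    CameraMesh->MarkRenderStateDirty(); // push the visibility change to the renderer
}
```

So that's a little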
bit of an extension on the extension we already did. And then down here, just a
very simple flip-flop toggle. Sometimes I like to actually
assign Caps Lock, especially
something where I actually want to see if it's on and off. And in this case, we're
doing a back and forth between the Fly spectator mode and
the First-Person spectator mode. But everything else, I think for
the spectator side, looks the same. We're just saying, up here,
if we are-- oh, you know what, I think-- yeah, OK, here we go. So out of the first
person, we come over here, and we match the VR camera. I think actually the
problem with what happened before is I think we
were actually going through here. And so this is saying, hey,
let's match the camera directly. It's not doing any kind of dampening. So that way, if we bypass
that entirely and just go right to matching the camera
with our interpolation node, then that's a lot smoother. So there we go. That's the spectator camera. One other thing I want to touch
on that came up during the break is controllers. So you'll notice right now
in my special extension pawn, I've got a bunch of assets in here
that I've called _ext for extension. And so in that pawn right
now, I have the hand. The hand comes by default in 5.1. You can see all sorts of fun hands. And if you want to also
have the controller or have the controller
and not the hand-- let's say you don't want the hand. You could go ahead and just set
the visibility off for the hand-- or sorry-- nope, not that one. We could go down to-- I didn't actually check
it-- yeah, Visible. I thought I typed that in. So you would uncheck Visible,
so you don't see the hand. And then up here in Motion Controller
Right and Motion Controller Left-- not the aim
versions because that's just for something with the
menu I'll show you in a moment. You'd actually go down to Motion
Controller, Motion Source, right, left-- make sure
those are matching. They should. And then above that,
Display Device Model-- so by default now, that is unchecked. In 5.1, it's unchecked. In 5.0 and previous, it was checked. And so you can say that you just want
to let OpenXR take care of it. You can say, oh, I bet OpenXR is
going to know what kind of device I'm on. And it, I'd say,
maybe 70% of the time will show you the
correct controllers. If it doesn't or if
you want to be more discreet in how you're setting it up,
you could go ahead and say Custom. And then you could actually tell
it exactly which model to use. We've got Pico Neo here, Samsung Odyssey-- all sorts of different controllers. You just basically
type in controller, and it'll show you a bunch of
different headset controllers. And we can do it
directly in here, or you could set up something like maybe
a select or a switch parameter that is getting the HMD data-- a little trickier with Steam VR. But then you can feed
all that out and start to automate that in your own
little custom XR solution. So for the moment, I've just
got both on at the same time because I think it's interesting to
see the hand holding the controller. I did actually have to
rotate the hand a little bit in order to fit a little nicer. So I'll just point out I've
got 17 degrees and negative 17 degrees on the y-axis to have
the hand and the controller on at the same time and looking OK.
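Those same Details-panel settings are exposed on UMotionControllerComponent in C++, so a setup sketch might look like this (the function and the optional custom mesh are my own illustration, not the template's):

```cpp
#include "MotionControllerComponent.h"

void AVRPawnExtension::SetupControllerModel(UMotionControllerComponent* MC,
                                            UStaticMesh* OptionalCustomMesh)
{
    MC->SetTrackingMotionSource(FName("Right")); // Motion Source: match left/right
    MC->SetShowDeviceModel(true);                // the Display Device Model checkbox

    if (OptionalCustomMesh)
    {
        // The "Custom" option: tell it exactly which model to use.
        MC->SetDisplayModelSource(UMotionControllerComponent::CustomModelSourceId);
        MC->SetCustomDisplayMesh(OptionalCustomMesh);
    }
    // Otherwise, the default source lets OpenXR guess the device model.
}
```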
OK, before we go back into the presentation, one thing I wanted to point
out is just a little bit of advice for desktop versus mobile. This is a little architecture
project I tend to teach with. And I have three
maps of this project. And each map is targeting
a different configuration. One is a nice, high-quality computer
that can render things fast. In this case, you can try to make
it look like a good VR experience. And there's some things I
have on here that I then start to turn off as I go
down into lower end hardware. So I thought it might be nice to
just walk through that a little bit. A couple of things I want
to draw your attention to-- the water. I've got some nice-looking water
here with some reflections. I have got-- oh, actually, I'm sorry. This is the middle version. Let me start at the high version. I forgot that I
reopened up the project. So we're going to see
the better version first. And then we'll step down. So here we go. So we got all this nice
bloom on the water. Look at the lights here. We got some nice convolution bloom. I'm going to turn off the grid. I can never find the grid. That's it-- G. Thank you. Yeah, good shortcut. I've got some plants. I've got a bunch of trees everywhere. I've got a movable light, and
I've got some higher-quality furniture in here that I've set up. I do have some baked
lighting in there, but I also have some movable lights. So if we go ahead, for
example, and look at our sun, this is set to be entirely movable. And the reason for that is
I have a little sun study that actually will physically
change the time of day. And I think we can even
see that if I just hit Play and just go into the Spectator view. So if I press, I think
it's K on the keyboard, we can see the shadows changing. And we could actually see the
sun set as it goes into night. And so that's what you
can do on the highest end. I'd like to do that where we
can actually see the camera. Hold on real quick. Let me just-- and you'll
notice I'm not even connecting to the camera
right now or the VR headset. And I've got some blockage there,
but let me press K real quick. There we go. So there's the sun going down. And you got some nice bloom and
light shafts going through there. So this is the higher end
version of what you could do. My windows are actually glass. We've got some reflections on them. And that's all great for
higher-end configuration. Also, the amount of foliage
is all going to run fine-- 90 frames per second--
on a machine like this. So then we step down a level,
and now we go Desktop Low. And then we start to do things
like decrease the foliage. Now we are baking all the lighting. And the sun, though, the sun,
you'll notice, is set to Stationary. So I'm not going to go
over how lighting works. I'm sure a lot of you have already
done this kind of stuff before. But a stationary light,
of course, is something where you can change the intensity
and the color even after baking. And so on a lower
end VR configuration, this is where I might
do a fake sun study. So I'll do exactly what I just did. I'll go ahead and hit Play, go
over to my spectator camera. And you'll notice now that
if I were to go and press K, we're going to see
everything get dark. But we're not actually
seeing the sun set. So literally all we're
doing in this sun study is turning down the intensity--
so again, no movable lights. Everything's baked. But by having one
stationary light, we can simulate the idea of a sun study
while targeting lower-end hardware.
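The fake sun study really is that small. As a hedged sketch -- SunLight standing in for a reference to the Stationary directional light component -- all it does is dim one light over time:

```cpp
#include "Components/DirectionalLightComponent.h"

void AFakeSunStudy::SetTimeOfDay(float NightAlpha) // 0 = midday, 1 = night
{
    if (SunLight)
    {
        // Stationary lights can still change intensity (and color) after baking,
        // so we just dim the sun instead of actually moving it.
        const float MiddayIntensity = 10.f; // illustrative value
        SunLight->SetIntensity(FMath::Lerp(MiddayIntensity, 0.f, NightAlpha));
    }
}
```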
By this point, I've already removed the glass because I was hitting some
frame rate issues in there, and we'll go over some of
the things like geometry merging that I did to also
help this render faster. But yes, there's no glass now. Everything that would
be glass is just clear. Translucency, of course, is very,
very expensive in any context, but in VR in particular. And I've simplified
the water a little bit, and I'm just using a standard very
light bloom, no longer convolution bloom, even though
it looks very good. Next map. So this one is targeting Quest. And in this one, you'll notice
that it's much, much, much simpler. And this actually is not
representing exactly what it would look like on Quest. There's actually an Android
mode we could enable. I'm not going to do it right
now because it will take forever to recompile. But I could click
Android Vulkan here, and that will give me a closer sense
of what it's going to look like. But here, I can't really afford
any foliage, so I'm using an HDRI. My lights are all baked. I'm not even doing
anything stationary. And for the water,
what I've done here is I've still left
the panning texture. All I've done is I've
changed the material. In fact, I just did
a material override. I think I just went into here. No, I didn't do it here. But basically it's this. This was set to Translucent, and I
went ahead and just made it Opaque. So just by doing that-- you can see the shader
instructions in the upper left-- I think I cut it down
by like a factor of 5. It was, like, 3,000 shader
instructions, and now it's 855. So you can still get a little
bit of the same effect, but now you're trying to make run
very smooth on a lower-end device. And this isn't going to be the
most beautiful scene in the world, but this is also targeted toward
what an indie developer can do when you're cranking
out models really fast. This is a fast way,
within a couple of hours, to be able to target three
different platforms without having to optimize 1,000 things.
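One note on that water override from a moment ago: a material's blend mode can't be flipped at runtime, so the per-platform trick is just an override -- the Quest map points the water mesh at an opaque variant of the same material. Roughly, with hypothetical names:

```cpp
#include "Components/StaticMeshComponent.h"

void ALowEndLevelSetup::ApplyQuestOverrides()
{
    if (WaterMesh && OpaqueWaterMaterial)
    {
        // Same panning-texture look, but Opaque instead of Translucent--
        // in the demo this cut roughly 3,000 shader instructions down to ~855.
        WaterMesh->SetMaterial(0, OpaqueWaterMaterial);
    }
}
```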
And speaking of that kind of optimization, we have some optimization view modes
over here, which are really useful-- so shader complexity. You want green. In this case, I'm letting the
fire be a special snowflake in the middle that we can
all have our attention drawn to because it does look a little bit
more robust than everything else. We can look at things like our light
complexity, which is nothing here because everything is totally baked. We can also look at our level of
detail coloration, which is useful because we can start to see some
of the higher geometry things, like the rocks are actually getting
different levels of detail there. I've also removed some
things like curtains, which, even at a
lower level of detail, were becoming very, very complicated. There's no need to have LODs
on things like these walls because they're already
very simple geometry. And of course, if you start to go
to something like Lumen and Nanite-- not compatible at all
with mobile devices-- then that's pixel-level LOD. And you don't have to
worry about that at all. So just wanted to fly through
some of those features real quick. We've looked at our
updated spectator camera. And now I'd like to go through
a few of these extensions I've done to the default VR pawn
just to give you a sense of how we can add more options. So remember our old friend
the VR Dash Teleport. So the idea here, once again, is
that blinking from one location to another-- sometimes that's fine. But if you want to
feel a little bit more like you're getting the
sequence of moving from one spot to another while still maintaining
a certain level of comfortability, you can do the teleport but actually
have that very consistent movement from point A to point B. And for
me, I'd prefer that kind of teleport over a blink one because
I still have context. As a former architect--
still kind of architect-- I'm always thinking about the
progression through spaces. And I love the sense of
compression and release. I used to commute into
Grand Central every day. And to be like on a tight, tight
train for an hour and a half and then to open up into that space
is a really incredible feeling. And the contrast is important. And you lose all that when you're
doing things like teleporting. So here's the little blueprint. We'll look at all
this stuff together, but that's what we're
going to have set up. VR Flying is another one here that
we're going to take a look at. So instead of teleportation,
if you got the sea legs for it, very cool to just fly around the
scene wherever you want to go. And the blueprint is not
that complicated to set up. You basically need two axes. One axis is going to
be forward and back. One is for strafing. And you're just adding really
that one out-of-the-box component, Add Movement Input, add a certain
vector, add a certain amount. We've got move speed in there. And actually I'm going to show you
the little node we have here is axis greater than dead zone. That's intended to be so you're not
accidentally touching the joystick, and it's moving, and you're
like, oh, god, I didn't actually plan on moving, and you feel sick. So you often will have a dead zone. And you'll do this with
Xbox games as well, where you're saying you
want people to really be putting some intention into how
they are making the player move. And with the new
enhanced input system, you don't actually need to
set up dead zones anymore. That is part of the input system.
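Here's the pre-enhanced-input version of that flying Blueprint as a C++ sketch, manual dead zone included -- Camera and MoveSpeed are assumed members on the pawn, and the names are mine:

```cpp
void AVRPawnExtension::HandleFlyAxes(float ForwardAxis, float StrafeAxis)
{
    const float DeadZone = 0.2f; // ignore accidental joystick grazes

    if (FMath::Abs(ForwardAxis) > DeadZone)
    {
        // Forward is wherever you're looking-- my personal preference.
        AddMovementInput(Camera->GetForwardVector(), ForwardAxis * MoveSpeed);
    }
    if (FMath::Abs(StrafeAxis) > DeadZone)
    {
        // Strafing off the same camera orientation.
        AddMovementInput(Camera->GetRightVector(), StrafeAxis * MoveSpeed);
    }
}
```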
So I'll show you exactly where that is now. And if I'm being totally honest,
I only discovered it an hour before my talk. So I was like, great. That can make the
blueprint even simpler. And here's a little bit
what flying can look like. And obviously we have the
spectator cam following this too because of the way it's moving. VR Scale Adjustment-- so
this is what I was just mentioning, that this
is only going to work in Editor if you do standalone. Otherwise, if you're
in VR Preview mode, you're just going to see the
controllers scaling and not anything else. But the magic of VR, of course,
is really having a sense of scale. This is something that won't look
any different if you're not in VR. But if you are in VR, you can
go from that human-scale level, to an ant level, to I'm a
god, and this is my dollhouse. And there's a lot of context
where that's really useful. There's a game my kids love to
play on Quest called "Half + Half," and there's a hide-and-seek
experience where one of you is a giant, and the
other one is very tiny. And there's this little
playground to go hide in. And games like that really
only work properly in VR because the 2D version isn't nearly as compelling in that way. So yeah, we're going to look at
all these blueprints together and how they're set up. And then we'll go into things
like Create New Grabbable Objects. So let's pause here. And by pause, I mean let's
actually open up the template. And let's also just very quickly
see all of these things in action. So I'm going to go play, save everything. What did I change? I was getting nervous
when I'm unexpectedly being told to save something. Did I delete something? Why is it new? All right, so as we
saw, I've got my hands here doing my thing--
click, click, click. And on my left hand right
now, I'm moving my joystick. And that's letting me fly. And the flying is happening
based on wherever I'm looking. Sometimes you might want to fly in
the direction of your controller. Some people prefer that. I prefer for forward to always
be the vector in the direction that I'm looking. That's my personal preference. Over here with the right joystick,
I have it so when I push Up, I'm getting bigger,
bigger, bigger, bigger. And it might not be
obvious on the screen except for my shadows and
the fact that I can now be like, oh, look how big my hands
are compared to this little ball. And you can even see from
a gravity perspective. Everything feels a little bit
different when you're a giant. I've also set up a
little toggle on here where if you just very quickly
want to snap back to human scale, you just click the joystick,
and you just snap back to being the size of a regular human. And that's pretty simple as well. One thing I haven't shown you yet
that I just want to pull up here real quick is the menu. And that can be accessed with
the B button and the Y button. And these are the default
buttons, but I'll also show you how we can add some
additional ones to this. Finally, I think this is only
true as of 5.1, you can actually do an OpenXR-based reset
of your orientation that wasn't working before. So I can be facing some weird
direction and then do that, and now this is my
new forward direction. And that works with just about
any headset, which is really nice.
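If you ever want that recenter from your own code, there's a Blueprint-exposed function library call that does this kind of reset through the active XR system -- a minimal sketch:

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"

void AVRPawnExtension::ResetView()
{
    // Recenter through the active XR runtime (OpenXR here): yaw offset of 0,
    // resetting both orientation and position.
    UHeadMountedDisplayFunctionLibrary::ResetOrientationAndPosition(0.f);
}
```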
The Restart button restarts the whole level. And the Real Life button is actually going to quit. One other thing I've got on
here is the teleportation. So by default, teleportation--
and I have no idea what direction I'm facing now. Hey, guys. The teleportation,
by default, is done by pushing the joystick forward. But because I've now mapped that
to changing the scale of myself and I've added some haptic
feedback to make that feel more, I don't know, powerful-- the fact that I'm actually changing size, which is no small accomplishment-- I've now made it so that the A
button is my teleport button. And you'll notice that I've
got two things on here. We're doing a fade to black, and
we're doing that dash teleport. So in the default option, there's
no blink, and there is none of that dash system of moving. And as I said, I do prefer this. And if I can handle flying
around, that, to me, is the most free and
wonderful version because it's so different from
what's possible in the real world. You'll also notice my body is a gun. And I forgot that one thing
I was playing with was-- I'll show you this now. Oh, and, by the way, the
whole time I was doing that, I meant to be in the spectator cam. So sorry for not being in
a smoother motion mode. But that's what I
look like right now. The reason why I was fiddling
around with that is if we go over and we open up my pawn,
there's a few things I'll point out about changes
I've made to my pawn, and I keep thinking
that's in my scene. It's not. It is right here-- vrpawn_extension. And you'll notice that over here,
this looks a little different now. The first thing I did is if you
want to be able to easily fly-- or walk around. I didn't show you walking around,
but I can just do smooth locomotion. I'll show you in a second. But the way you can do that is
by default you have a VR pawn. I don't know why I'm wearing
this like a hat now. I can take it off. We can actually go ahead and convert
our parent class to be a character. And as soon as you make it a
character, it does a few things. It adds a capsule component at the
top that becomes the new parent. It also gives you a
character movement component, which has all sorts of fun features. You can add in swimming
at that point-- flying, walking, swimming, jumping, if that's
going to be part of your experience. And that all pretty much
is there out of the box with minimal implementation. The way you do that, by the way,
is you go to your Class Settings, and you just change the parent class. By default, the parent class is a
pawn, and then you can change it to Character, and then it
makes those adjustments.
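The C++ equivalent of that reparent is just changing the base class -- a sketch of what the pawn extension's header might look like (file and class names are hypothetical; the talk does this in Blueprint via Class Settings):

```cpp
// VRPawnExtension.h
#include "GameFramework/Character.h"
#include "VRPawnExtension.generated.h"

UCLASS()
class AVRPawnExtension : public ACharacter // was a plain APawn in the stock template
{
    GENERATED_BODY()
    // ACharacter brings a capsule root component and a UCharacterMovementComponent,
    // which gives you walking, flying, swimming, and jumping almost for free.
};
```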
And pretty much everything else still works. You are going to have some problems
with snap turning because of the way rotation works with the
character component. But we'll touch on ways to fix that. But let's look at some
of the blueprints we have going on in our updated version. You'll notice that I'm color coding
my comment boxes for anything that I've changed. So let's start on the left. First of all is teleport. So you'll notice that I
decoupled this from the joystick. So this was the input action
that was already in there. I just moved it off to the side,
and I made a different one here. And we don't need to know anything
about how teleport's working, just that we want a button to trigger it. So this might be a good spot
to do a very brief introduction to the new enhanced input
system because, as you saw with the spectator camera,
I just had press the M key and then make a thing happen. That's not so bad if
you know that there's going to be a keyboard involved. But as soon as you want to map
something to controller buttons, and you know multiple headsets are
going to be using your experience, this is where something like
the new enhanced input system becomes really useful. So if I double-click
on this, I'm going to see something that doesn't
seem like it has a lot. And you're thinking, oh, wait. Where is the part where you
actually say what button it is? All we have here-- and IA is Input
Action, move teleport button. All we have here that's important is
the description and the value type. And the bool, of
course, is a Boolean. We're saying it can
either be true or false. We have a few other
options in here too. It was mapped to be an
Axis1D float because it was saying if you push the joystick
forward enough, that's going to act as the trigger. A 2D vector would be
x and y, like you're moving the joystick all around. And then a three-vector, of course,
is more complicated than that. So in this case, we just have a bool. But then you say, OK, where
But then you say, OK, where are we actually triggering what's going to happen? So with the enhanced input system, which is a bit of a mouthful, and I'm still getting used
to it, this is the top level. You have a file called a Player
Mappable input configuration. So we'll open this up. And everything in
here is exactly as it came in the OpenXR template--
something for hands the weapons, the menu. And again, I'm not going to give
a whole class on this because I'm still learning it myself. But as you can see, these
are different contexts for what you want your
input to be doing. And that's all UASSET
files, so it's very easy to migrate over to other projects. And you see Menu, Weapon, and then
the only thing I started to change is the default up here. So I have an _ext for
what I was changing. So I open this up, and
I look at the mappings. And every one of those input
actions is listed here. And this starts to look a
little more like the input that you're used to in the editor
where you have all of those input actions, and they're all basically
saying, I'm an axis, I'm a bool, I'm a 2D vector. And then you just list here
what's going to be using that. And this is the same way it's
always been-- so y-axis, x-axis, left hand, right hand,
different kinds of controllers. And you'll notice that I
added some down here for fly. And there is my move teleport button. And if anyone has an eagle
eye, you're going to say, Alex, you're very lazy because
you'll notice I didn't actually set it up for OpenXR. I was in a hurry, and I'm like,
we're just going to do it for Oculus. If I was doing this
correctly, I would keep adding the equivalent of the
A button for all the other headsets as well. So both of these, the reset scale
and the teleport button are Booleans, and it's just one thing there. And then all the way back over
here, that's what's happening here. So we're saying, whatever
that's mapped to, we can have something happen when
it's triggered, when it's started, ongoing, canceled, completed. We have all these options now for
doing the thing based on what's happening with that button.
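Those trigger events map straight across to C++ if you ever want them there. A sketch of binding the bool action -- IA_MoveTeleportButton and OnTeleportPressed are my own stand-in names:

```cpp
#include "EnhancedInputComponent.h"

void AVRPawnExtension::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    if (auto* Input = Cast<UEnhancedInputComponent>(PlayerInputComponent))
    {
        // IA_MoveTeleportButton is the bool-valued UInputAction asset;
        // Triggered is one of several events (Started, Ongoing, Canceled, Completed).
        Input->BindAction(IA_MoveTeleportButton, ETriggerEvent::Triggered,
                          this, &AVRPawnExtension::OnTeleportPressed);
    }
}
```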
Oh, and I wanted to show you the thing about dead zones. So let's open up really
quick, and then we'll walk through this in more detail-- something like walking/flying. So you'll notice I don't have anything here asking for a dead zone. This blueprint is even simpler now than the one I showed you in the PowerPoint. So all we have here for the
strafing first is we're saying, OK, is this triggered? Add a Movement Input, World Direction
x, World Direction y, don't do z. It's all based on the
direction of the VR camera. And we're multiplying it by a speed--
that's it-- to fly like a bird or to walk. We don't even have to
change much in order to toggle between those two options. And so what this looks like, in
order to still maintain that axis dead zone, is I open this up. And you'll notice we
got a little modifier. And the modifier here is just that. It's literally just a dead zone. And I can say, I only
want this to work-- because the axis is from 0 to 1-- if
someone is moving the joystick past that 0.2 point. So they need to be more than 20%
pushing their joystick in order for anything to happen. So that's a wonderful
new feature that I think helps keep our blueprints
a little bit more simple. And so that's happening there. Similar down here for Walk/Fly. You see I named these in a way
that is just helping me remember what's using x, what's using y. But same idea down here. This is for moving forward. This is our World Direction,
just going on a forward vector. And down here, I also
just called out the fact that this is what I
used to have to do. Now that's no longer needed. So now that you've all seen
that, I can just delete it. That's what was in
there in the middle. Now we can go right past it. Here's how we're doing
the scale change. So this, again, is an axis. We're getting the current world meter
scale, which by default, is 100. And then we're basically
saying, the more you're pushing Up on your
y-axis on your right controller, the more you're going
to increase your size. And then we're popping into here. We're setting the world meter
scale, and we are updating it with a little function that's
taking your motion controllers and everything under them--
hands, whatever-- and the HMD, and it's all scaling relative
to that world meter scale. And then I'm just playing
a couple of haptic effects to make you feel more
like a god or something as you grow bigger,
and bigger, and bigger.
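Under the hood, that scale change is the World To Meters value, which you can read and write from Blueprint or C++. A sketch with an illustrative per-tick growth rate -- note the template's helper also rescales the motion controllers and hands to match:

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"

void AVRPawnExtension::AdjustWorldScale(float RightStickY)
{
    const float Current = UHeadMountedDisplayFunctionLibrary::GetWorldToMetersScale(this);
    // Raising World To Meters (default 100) makes your head and eye motion cover
    // more world units, so you feel bigger; lowering it shrinks you toward ant scale.
    const float NewScale = FMath::Clamp(Current * (1.f + RightStickY * 0.02f), 1.f, 100000.f);
    UHeadMountedDisplayFunctionLibrary::SetWorldToMetersScale(this, NewScale);
}

void AVRPawnExtension::ResetScale()
{
    // The joystick-click reset: straight back to regular human scale.
    UHeadMountedDisplayFunctionLibrary::SetWorldToMetersScale(this, 100.f);
}
```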
So back over into our Event Graph, that's pretty straightforward. And then there's that
little reset that I talked about where the default value
is 100, and we're just saying, hey, did you press that button? Great. Then we're just going
to update the scale, and your new scale
is going to be 100. And then, also, we're
actually going to do-- this is a little more
complicated, but we're going to do a little teleport
trace the same way you do the teleport where you
get a nice, little arc. Now we're just pointing
straight down, and we're saying, is the ground below us? Was I up in the sky like a bird? Now we're going to bring
you down to the ground. And if you're flying, you'll
actually snap to the ground. If your capsule component is actually
working as a standard walkable component, then you'll be at the
correct height when you snap down. So that's a little
bit more complicated, but I think it's still
worth pointing out. Here's our dash teleport. This one still pretty much looks the
same-- no need for an input action. This is being triggered
over in Teleport. So first, we'll go through this. Then we'll look at it
inside the Teleport node. So we're getting the player
pawn and the actor location, and we're setting a start location. We're saying, OK,
you want to teleport? Where are you right now? And then we just need to find where
that projected teleport location is. I didn't even have to create
a new variable for that. That variable already exists when
you're doing teleport anyway. And now I'm lerping. I'm going from this vector to this
vector over a set amount of time. And every frame of the way, I
am setting the actor location. Instead of just at the
start and at the end, it's happening the whole time. And so if I open up
the timeline, you'll see I'm doing it not in a curvy-curve
way because that will likely make people feel sick. I'm doing it in a very straight
way, a consistent speed over a course of 0.4 seconds. And you could modify that
or make that a variable and try different things. That felt good to me-- 0.4 seconds of you moving from
where you started to where you end. If you wanted to get more
into the weeds of that, you could actually make it
not a set amount of time, but make it based on the actual
distance you're traveling. So if you're traveling
further, you won't go there as fast as if you're
traveling a short distance away.
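The whole dash amounts to a linear timeline driving one lerp. A sketch -- DashStartLocation and ProjectedTeleportLocation mirror the variables described above:

```cpp
// Called every frame by a 0.4 s linear timeline, with Alpha going 0 -> 1.
void AVRPawnExtension::DashTeleportTick(float Alpha)
{
    // A straight, constant-speed lerp; a curvy ease in/out here tends to
    // make people feel sick.
    SetActorLocation(FMath::Lerp(DashStartLocation, ProjectedTeleportLocation, Alpha));
    // To scale duration by distance instead, set the timeline's play rate
    // from FVector::Dist(DashStartLocation, ProjectedTeleportLocation) first.
}
```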
But I'm trying to keep things relatively simple. So the way you insert this
into the teleport system is you find the function
that gets called already in the Blueprint called Try Teleport. And again, you don't even need
to know how all this works. You just have to this is the last
step before teleportation happens. And so it's checking to see. Do you have a valid
teleport location? Hooray, hooray, hooray. The last step before teleport-- this
is where I'm just inserting myself-- I'm saying, whoa,
whoa, whoa, hold on. Before you do an
instant teleport, I'm just curious, just
asking, just wondering, is our little Boolean for
Use Dash Teleport checked? Because if it is, then before we
do that-- we really don't even need that last step at this point. Before we do that, we actually
want to then go and trigger our dash teleport. And then it pops up to here and goes
through there, which is pretty cool. And yes, I should have mentioned
that I have Booleans for all these. They're all public. That's the little eyeball here. So we can make it very easy
to set either in the Editor or while playing the game, your
move speed, your ability to fly, your ability to dash teleport,
your ability to fade-- sorry-- the fade on teleport
is the fade to black actually. And this is a little
bit more complicated, so I'll go into this if we have time. But let me say briefly about
the opacity thing here. The fade to black-- Unreal Engine has built
in a Fade to Black system. You can type in fade to black,
and something will come up. It usually doesn't work in VR. So the way that I tend to
handle that, especially if I'm creating a game that has
desktop and VR options, is I'll have both my VR and
desktop pawns or characters derive from the same
parent class that has what I'm going to show you
right here over in the Viewport, which is this fade plane. This fade plane is a
very simple material-- in this case, unlit translucent--
which is using a material parameter collection-- no time to go into that in detail-- material parameter collection
with an opacity value that goes between 0 and 1. So when I need to fade everything
up or down in my scene, I'm not relying on the
built-in system in Unreal. I have everyone with this black
plane right in front of them, which does have cost
because it has translucency. And I am group fading
up and down everything that is using that
particular opacity parameter. But the short version
is you got a fade plane. It's a big plane that's
right in front of the camera. And as we change the material
parameter collection value-- I can do it right now-- Scalar Parameter, I could go
ahead and turn this up to 1. And then I could go back into
my pawn and, oh, no, it's black. I can't see anything because the
wonderful thing about material parameter collections is you can
affect anything that is referencing that particular variable. It's also great for sequencers.
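In code, the whole group fade is one scalar write to the collection -- FadeMPC being an assumed pointer to that Material Parameter Collection asset:

```cpp
#include "Kismet/KismetMaterialLibrary.h"
#include "Materials/MaterialParameterCollection.h"

void AVRPawnExtension::SetFadeOpacity(float Opacity) // 0 = clear, 1 = black
{
    // Every fade plane (VR and desktop pawns alike) reads this one parameter,
    // so a single write fades everybody at once.
    UKismetMaterialLibrary::SetScalarParameterValue(
        this, FadeMPC, FName("Opacity"), FMath::Clamp(Opacity, 0.f, 1.f));
}
```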
And I have now spent far more time on that than I thought I was going to, so let's keep moving. All right, so down
here, fade opacity, MPC opacity-- that's
all connected to that. And then, yeah, those are
all of our public variables. So I think we've done our quick flyby
of everything inside the Event Graph there, timeline. Here's that fade-- fade on teleport. And the timeline, I'll just say-- I know I'm still talking about it--
very similar to the dash teleport. We're just saying change that
material parameter collection fade opacity using the name,
the value, which is opacity over a set amount of time. That's all. And yes, that's the event
we saw before where we say, if you are going to be using
the dampened spectator camera, we want to make sure that
the camera isn't getting in the way of everyone-- or sorry. This is the other version. This is the one where we make sure
that the headset isn't getting in the way of the spectator camera. So we just hide that in view. All right, so I'm hoping
that even though I'm whizzing through this
stuff pretty fast, there will be plenty of
moments in this video where someone watching
the on-demand version will be able to pause and get
everything they need out of this. So apologies if I go through
some of the stuff fast, but I want to squeeze
it all in if I can. Next thing-- oh, actually, let
me show you walking real quick. So when I click on my
extended pawn here, which I, again, I keep looking for
it where it doesn't exist. I could put it in the scene. I absolutely could. I would just need to make sure
it auto possesses the player. I'll just show you that real quick. And I should change the
body to not be a gun. That was just a test. I'll show you why I
was doing that test. But yeah, if I'm going to
actually have it in the scene, I just need to make sure I have
Auto Possess Player 0 down here. And then that's fine. I can use that one. And then I can make some edits that
are very specific just to this. I could turn off the
hands or something. So with the capsule here-- or sorry. When you do a character-parent
class, it actually asks you, what do you want the
mesh to be for the body? And by default, it's nothing. And I was like, oh, what do I have
access to for skeletal meshes? Because I don't actually have
Manny or any of the default elements in here. So it was really just the gun. So I was like, OK,
I'll throw in the gun. I could also just do a skeletal cube. But you could actually give
yourself a VR body that way just by dropping
something into that spot. So to toggle between
flying and walking, you'll notice that there is
a checkbox for Use Flying. And when that is unchecked, it
is just going to be walking. The other thing you
can do-- really, this is to give you the ability to fly
rather than actually turning it on and off. By default, the flying is going
to be way too ease in/ease out. So I would recommend cranking
up that Braking Deceleration to a high number. Otherwise, you're going to get
pretty sick as you fly around because you'll stop
flying, and then you'll do like a [WHIRRS]
slow-down kind of thing. But we actually want
to be able to toggle to go between flying and walking. And again, it's all going to use
exactly the same commands right here. We don't have to change this at all. One is just using
gravity, and one isn't. We go to Walking, and we have this
here-- so default land movement mode and then flying, that's there too. So what we have at the
beginning right here, I added in a little sequence
node after Pixel Density-- which, by the way, pixel density,
if you ever need a quick way to make your experience
run at a higher frame rate, bring down the pixel
density a little bit. If you bring it down to 0.7, 0.8,
not that many people will notice, and it's going to increase
your frames quite a bit. And especially if you're doing
Lumen and Nanite experimental-- don't do it in 5.1-- Pixel Density less than 1 is
going to help you not crash.
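If you'd rather set that from code than from the console, the cvar is reachable the usual way -- a quick sketch:

```cpp
#include "HAL/IConsoleManager.h"

void SetVRPixelDensity(float Density) // e.g. 0.7-0.8 for a quick perf win
{
    if (IConsoleVariable* CVar =
            IConsoleManager::Get().FindConsoleVariable(TEXT("vr.PixelDensity")))
    {
        CVar->Set(Density);
    }
}
```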
So the little extra sequence thing I'm doing here is I basically say, if you
don't say it's OK to fly, then it's actually going to use the
default value, which is walking. So you saw when I searched under--
here we go-- just to show all this. I'll explain this in a moment. But when I did the default ground
movement or the movement mode-- I'll just type in movement mode,
so we're being consistent here-- default land movement, walking,
there's our swimming right there. So we have walking on by default.
But if we have can fly or Use Flying, rather, checked, then right at
the beginning, it's going to go, and it's going to change that
movement mode from walking to flying for character movement. And then the other
thing I'm doing here is I'm actually
changing the collision profile to be Overlap All Dynamic. And the reason for that is it feels
really jarring if you're flying around free as a bird-- weee-- and then you hit an
unexpected collider, and you stop all of the sudden. So all I do with flying
is I change the collision profile to overlap all dynamic. And then you can go through
things, and you're not as likely to get sick there. Otherwise, you will
actually bump into things.
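Pulled together, that Begin Play sequence looks roughly like this in C++ -- a hedged sketch, with an illustrative braking value and my own function name:

```cpp
#include "GameFramework/CharacterMovementComponent.h"
#include "Components/CapsuleComponent.h"

void AVRPawnExtension::ApplyLocomotionSettings()
{
    if (bUseFlying)
    {
        // Walking is the default land movement mode; only switch if flying is allowed.
        GetCharacterMovement()->SetMovementMode(MOVE_Flying);

        // Crank braking so you stop when you let go, instead of a long ease-out.
        GetCharacterMovement()->BrakingDecelerationFlying = 4096.f; // illustrative

        // Overlap instead of block, so mid-flight you pass through colliders
        // rather than slamming to a halt.
        GetCapsuleComponent()->SetCollisionProfileName(TEXT("OverlapAllDynamic"));
    }
}
```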
So what I'll do here very quickly is open up my-- actually, yeah, I don't even have to do it from here. I've got an instance in the
scene, so I can do it specifically to this guy. I can scroll down and I
can see, as long as I'm at the top of that parent level,
that I've got these public variables I can access-- Move Speed, Use Flying, Use
Dash Teleport, Fade on Teleport. So I'm just going to
uncheck Use Flying, and I can-- just I'll use VR preview
because it'll load up quicker. And let's give you guys a dampened
camera view, so I'm not a monster. Or actually, you know what, we'll
do the third-person one here. And by the way, I didn't
mention you can look around. You can use the mouse wheel
to change your field of view. You go around with the WASD keys. And oh, now I've
gotten stuck in a wall. Fantastic. So what we can do is restart
level, or I could just exit out, but this is fine too. So there's that. And now if my controllers turn on. Now you'll notice that
I'm not flying anymore. I've been grounded. My wings have been clipped. And I have to get a little bit
shorter because my ground isn't quite at the right height. But I can move around without flying. And this, again, is on that
spectrum of comfort options, less comfortable than
something like dash teleport, but also feels more natural if
you're not going to actually be able to physically move around. Again, best version is just go for
a walk while wearing a VR headset, and you don't smack into anything. Cool. OK. Hope that was helpful. And now I want to show you guys,
with the time we have left, what it is like to create a new
grabbable object because it's a real simple system. This hasn't really changed
much in the past few releases, but I just want to make
sure you see how it works. So you'll notice we got a
bunch of grabbable objects. We got our ball here. We have this. And it used to be in
the previous VR template that you needed to
work with interfaces and do some stuff that was a
little more complicated than what a beginner Unreal Engine
user would want to use. But now it's just a component. You can take any actor in your scene. We could take something
that isn't grabbable right now, like this cube over here,
just a regular static mesh actor. And we could go Add,
Grabbable-- whoops-- Grab component. And then that Grab
component now, it's just going to ask us
a couple of questions. It says, hey, how do you
want to be able to grab this? Do you want to make
it so that when you grab it the spot where
you grabbed it from, that's the spot you're
holding it from? Or do you want to make it so that
it snaps to a particular location? By default, it's going
to be the pivot point. So I grab it here, but my
hand snaps to be in the middle because that's how
we want to hold it. Then, of course, the
more advanced version is you start to change the hand
pose, so if you're grabbing a vase, it's like this. And if you're grabbing
a ball, it's like this. But that's more complex. By default, Free, Snap, and
Custom are your options. But it won't work yet because
then what we need to do is we need to scroll down. You'll notice it already
went from static to movable, so that's helpful. Oh, actually, I'm sorry. It did not. The grab component says movable,
but we need to go up to cube and tell that to be movable. Otherwise, it'll be
stuck in the ground. And then we need to simulate
physics, at which point, it can actually drop and
bounce around and have weight. I could change the mass right here. And we have Enable
Gravity checked right now. If I were to disable that-- in
fact, I'll do it just for funsies-- I'll be able to actually
throw it around. The other thing is we want to
make sure we have Overlap Events and Simulation Generates Hit Events.
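Those editor clicks map to component settings you could also apply in code. A rough equivalent (the template's Grab component itself is a Blueprint, so I'm only sketching the mesh setup here):

```cpp
#include "Engine/StaticMeshActor.h"

void MakeMeshGrabReady(AStaticMeshActor* Actor)
{
    UStaticMeshComponent* Mesh = Actor->GetStaticMeshComponent();
    Mesh->SetMobility(EComponentMobility::Movable);  // or it stays stuck in the ground
    Mesh->SetSimulatePhysics(true);                  // drop, bounce, have weight
    Mesh->SetGenerateOverlapEvents(true);            // Overlap Events
    Mesh->SetNotifyRigidBodyCollision(true);         // Simulation Generates Hit Events
    // Then add the template's Grab component (Free / Snap / Custom) on top,
    // and make sure the mesh actually has collision.
}
```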
And that should work now as long as this has collision. I do know this cube has collision. If you need to check,
you can open it up, and you can go Collision
and look at the-- well, actually, right
here is one way to do it, is if you see Remove
Collision under Collision, then it has collision. But I'm forgetting the new
way to just view collision. It might be under Show. Yeah, simple collision. So there it is, the
little green box there. So that tells us it has collision. So watch this now. I go in, give you guys my
dampened spectator mode. And it's right here. Oh, [LAUGHS] and it's just
floating up like a magical bird-- not terrifying at all. But I should be able to actually
go ahead and just grab it, right? And then I can be
like, [GROANS] goodbye. And if gravity was on, of course,
then it would fall back down. But maybe you're making a space game. I don't know what you're up to. So that's that. And yeah, I mean, that's really it. That's how you set up grabbing--
nothing too complicated there. And that's me just throwing
things at the camera. I think everyone who realizes
the spectator camera is like another player
inside the experience wants to do that kind
of throwing thing. So menu items-- I
mentioned very briefly that there are some simple ones. You can reset the HMD. You can go ahead and quit
the game, restart the game. That's in there by default. It's very easy to add
additional menu items. I'll show you that in a moment and
a couple of examples of what I did. We were talking earlier about
waypoints, where you-- or sorry. You saw it very quickly in
the VR Collaborative Viewer with Bookmarks-- I think CollabViewer
is the official name of that template-- where you are
able to snap to different camera locations. So I basically just did
something similar here, where there is a class of objects
in the actors inside Unreal Engine called target points. And I just set those
up around a scene. And I said, OK, I just want
to have my menu show me that I can jump to an interior
location, an exterior location, and those can be placed anywhere. So at the beginning with a UMG,
you have your event construct, which is like Begin Play. And it goes through and it
finds all of the target points that have a certain tag, puts
them in a little bit of an array, makes sure that they
have a certain name. And then what we do is
we have some new buttons, and I'll show you how
to make the buttons that are casting to the player upon using
the fade toggle, the same one we're using for teleportation. So you fade out to black,
and then you fade up, and you're in that new location
wherever it might be in the project. And we're just using the
standard out-of-the-box teleport. I can also set the rotation
if I wanted to be consistent, but there is actually
something cool about just pointing in whatever
direction that you've set up with the target points. So that's one example.
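The construct-time gathering is one engine call plus a loop. A sketch of the idea -- UWaypointMenu, the "Waypoint" tag, and AddWaypointButton are all hypothetical names:

```cpp
#include "Kismet/GameplayStatics.h"
#include "Engine/TargetPoint.h"

void UWaypointMenu::GatherWaypoints()
{
    TArray<AActor*> Points;
    UGameplayStatics::GetAllActorsOfClassWithTag(
        this, ATargetPoint::StaticClass(), FName("Waypoint"), Points);

    for (AActor* Point : Points)
    {
        // One button per target point; clicking it fades to black, teleports
        // the pawn to the point's transform, then fades back up.
        AddWaypointButton(Point);
    }
}
```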
Stat GPU, Stat Detailed-- these are all things that you
might want to pull up while you're doing VR to just give
you more information about how everything is performing. So I just added a button that lets
me pull that up and off on the fly. And fairly recently, I think
it just happened in 5.0, now you see the console log in VR at
a fairly comfortable distance. It used to be you couldn't
quite see the corners, and it was too close to you. Now it's actually quite readable. So let's look at that real quick. Oh, I still got this guy up. We can close you. Although, actually,
here's where I've got the target points, so I'll
just show you that real quick. So I've got Target Point
Exterior right here and Target Point Interior right here. And so that's what gets
created for the buttons. So yeah, I can just type in menu. There's menu final. And so all I did to add these
buttons-- because, again, it was just a restart and a quit. Now the new one,
again, has reset HMD. But you can literally just
duplicate one of these. Let's add a new one now. So I could go ahead
and duplicate that. And you'll notice it
takes the text along too. And I don't know which one is the
original-- probably the one that says the number 1 on it. So I'd call this, like,
new thing I want to do-- whatever-- some kind of placeholder. And then I'd want to change the
text to New Thing I want to do, or whatever. And you'll notice it's making
all the buttons bigger. The UMG system has all sorts of nice
auto-fit kind of stuff going on. And then all I have to do
is click on that button-- not the text, but the button-- and then I can scroll down. And then I can say,
hey, when someone clicks that, I want something to happen. So we press that plus. It creates this new, little event. And then I can say, the
thing happened or whatever. And I made a new button that
will actually show up on my menu because this is literally
the menu that spawns whenever you press the Menu button. By default-- and I want to show
you this in the new template because it's different
from how it used to be-- when I go into here and
press the Menu button and I go into the not shaky mode-- [LAUGHS] that floating
cube back there. It's amazing. It's kind of malevolent too. There's something about the way
that it's just slowly rising up. I feel like I have to keep
turning around and be like, what's going to happen? Is it going to explode? Is a ray of sunshine
going to come out of it? Who knows? But if I press-- now it is the B button
and the Y button. You can also use the
Oculus menu button, but that's a little
trickier with Steam VR. And you can't use the
Menu button down here. So I think the intention was
to just make it really easy. So you're spawning it on either hand. One thing you could
do is you could take-- this is a little bit of
homework, you could say. You can take the same technique
we used for the spectator camera dampening the view you
see right now-- oh, and you guys can't
really see the menu. There it is. And by the way, I
can move the joystick to click something and pull
the trigger, like that. Or I can pull up the menu and get
a nice, little ray pointer here. So I like having both of
those options available. But you can take the
same dampening technique that we have with
the Spectator mode-- and I just realized you
can't see what I'm seeing. Let's go there. There you go. So yeah, so pointy point,
click, click, click there. Or I can just use my joystick on
my left hand and pull the trigger and do things that way. So what you can do is
take the same technique that we did for dampening the camera
and, instead of spawning that menu right there in front
of you, you could have either button pressing
the Menu button on either side spawn the menu right in front of you. And rather than having it
always directly in front of you, you could dampen it. You could have a target point that
you're telling the menu to spawn at, and then you could just have
it be chasing that location. And that might be a nice,
comfortable big menu right in front of you,
which is going to be useful if you want more than just
three or four buttons. You don't want 30 things pulling up
like right there in front of you. Maybe it's OK if it's
larger and directly there. So I'll just show you
very, very quickly without actually going
through the Blueprint how you could get started
with something like that. I'll type in menu-- menu. And actually, that's
the menu, but then actually it spawns
over in our pawn-- so ext, get our pawn extension. And we've got a function that
gets called for toggle menu. So toggle menu right now,
it spawns the actor menu at a 0, 0, 0 location. And that reference is actually going
toward the widget interaction spots here. But instead, you could actually
have a specific location. So when that spawn happens,
the Spawn Actor menu, that's where you could tell
it to be at a certain location and then be constantly
on event ticks setting it to be at some kind of target point. So if anyone does that on their
own and they're proud of it, find me @ibrews-- I-B-R-E-W-S-- on Twitter, and
I'll give you a prize or something because it's always fun to see people
do a thing that I told them they might want to do. OK, moving on. That's the sun study I mentioned,
being able to watch the sunset-- very cool. Packaging-- we already
covered this, but I'll just mention that you can
have everything set up for these different platforms. I was showing you a low,
medium, high-end version. But you do want to make sure
you change the game default map. This is Packaging 101. But if you've already
made everything work great across those different maps,
when you actually are saying, now I'm making it for
Quest, now I'm making it for high-end Windows
or low-end Windows, that you have that same map set up. The more advanced
technique, of course, would be you have everything
all happening in one map. And then you have all
sorts of switches and knobs that happen on Begin Play that are
detecting what the capabilities are of the hardware and turning bells
and whistles on or off based on that. I think that gets a
little bit complicated. It's easier, in my mind, to
just do it as separate maps. And maybe they share a sublevel
or something for the things that you know are going to be
the same across everywhere. With the time we have left, I do want
to have a little bit of Q&A. And I'm the last session, so we
can keep chatting for as long as anyone wants to-- some VR best practices. I do want to show you
guys Lumen and Nanite. So I'm literally going to hold these
up on screen for, like, a second. So again, on the replay,
you can look at some stuff. Can you simulate VR without an HMD? Yeah. You can use console commands like
emulatestereo, things like that. If you make the screen percentage
higher, that simulates oversampling. That's useful. Note that when you're using
Steam VR, you can simulate input in different ways. There's that Vulkan Preview
session I mentioned, some optimizations we can do. Sorry, guys. This is definitely more for
the recording than for you-- settings with level of detail,
mesh optimizations you can do. The LOD system that Unreal does out
of the box is actually pretty great. So if you just tell
things to be high detail, it's going to create a ton of
different LODs of everything you have. Material optimizations--
how expensive are materials? Translucent is very expensive. Try to use the same parent
material if you can. Your material nodes can also have
some switches based on settings. For texture optimization, generally use power-of-two sizes so the engine can create MIP maps. Things can look fuzzy if
you don't have MIP maps. Here's our optimization view
modes for light complexity and stationary light overlap. Too many stationary lights are bad. They won't work. Lighting optimization,
static when possible. Quad Overdraw, don't
have too many quads overdrawing because then you're
wasting precious time and energy. Cull Distance Volumes-- great. Merge Actor Tool-- great. Hierarchical Level of Detail
going on to the next line-- another great tool. Sorry, that'll be hard to
read, even on the recording. That's how that can look. You guys are troupers too just
let this all wash over you. Packed Level Actors are a new
system in Unreal Engine 5. That just lets you have
everything go into a level. Some C vars-- the one
here that I probably use the most is PixelDensity or
changing the SpectatorScreenMode. That is different from the
spectator cam we have here. It's literally for what
you see of the VR view. You can change what that looks like. And there's what changing the pixel density is like. It's a lot like screen resolution. Here's a bunch of System
Settings you could start with. I should give you guys the
actual text of that, so feel free to PM me if
you want any of that. Some of those profiling
commands I like to do-- stat unit. OK, here's the fun part. Initial Lumen-- yeah, like, yeah. [CLAPS] [LAUGHS] We
have three minutes left. So I'm obligated to
say it's experimental. It is not fully supported. It's not stable. Don't use it, but
it is very cool when it is working in those
little moments of bliss to be able to have real-time
reflections, real-time GI, moving light, incredibly detailed
scenes-- super, super cool. Right now, only works on desktop. Only works in deferred rendering. And at the moment,
you might not actually even see performance gains over
doing even a ray-traced version of that same scene. So just be careful. The way you need to set it up is
very similar to how you would set up any kind of Lumen/Nanite project. You want to uncheck forward shading. You want to make sure you're
using shader model 6 for Nanite. You want to be in DirectX 12. You want to make sure you have mesh
distance fields, virtual shadow maps, and virtual textures-- virtual shadow maps there. Oh, and actually, one
thing I've noticed too is-- sorry, just flicking through these. Over here when you set up Lumen for
your dynamic global illumination method and your reflection method,
make sure your reflection capture resolution is low. In my experience, if I
make that too high-- which, I normally would make it high
for when I'm baking my reflection captures-- instant crash in Editor. I'm not sure why. But again, we're not stable here yet. And I generally wouldn't even
enable hardware ray tracing. That seems to have some problems too. Definitely go ahead in Oculus
and lower the vr.PixelDensity to have any kind of
stable experience. I tend to add some little buttons
for increasing it until I crash. And I'm like, OK, I guess 0.8 is
the best I can do, and now I know. With Steam VR, I'm sorry
to say that any time you try to change the resolution,
either with pixel density or screen percentage, not going to work. So you have to have
those set at 100%. And you also need to use a-- sorry I didn't write this down. You need to use a parameter,
no RHI thread (the -norhithread launch parameter) in order to not get some very
weird shadow artifacts. Again, buggy, experimental, being
worked on, going to get better. Over here, there's just
some mesh optimizations, basic stuff about Nanite. There's some supported
features, unsupported features. Nanite now does support foliage and
VR, of course, to a certain degree. Does not support the
things on the bottom yet. And yeah, if everything
goes well, you end up having an experience like
this particular time where the entire time this is happening,
I was like, it's working. It's working. I have trees everywhere and all
these scenes from The Matrix demo. And it's going smoothly. It's running on my laptop,
which has a mobile 3080. And I can turn on the Nanite mode and
be like, look at all the geometry. This would never
render in a normal way. And it was all bliss, and
I've had this successfully happen maybe twice. Usually it'll crash pretty quickly. But it is very cool
when it is working with reflections and all sorts of
cool stuff happening like that. So I'll just let this
keep playing for a moment. Let's open it up to questions with
the little bit of time we have left. And then, yes, of
course, if anyone would like to try this demo for
yourselves, it actually does run pretty well on
the computer right here, so I'm happy to share it with you. But look real quick,
just that detail of all the little architectural
things in there. There's no LODs in this. It's all Nanite. And it feels really,
really, really good. So I'm really looking forward
to watching all that improve. This is sped up at this point
because I just wanted to-- I'm pretending that's a
locomotion technique. It's not. I'm pretending to be a locomotive. That's all that's happening. But yes, anyway. Thank you very much. I will-- yeah. I'll show you how
to get a hold of me. [APPLAUSE]