SKYE: Hey, folks! Welcome to this week's
News and Community Spotlight! Power up your launcher because
Preview One for the 4.27-Chaos build is here. First-- head on over to the
forums to find out more, and then experience the power and
ability of Chaos first hand, including cloth, vehicles,
and all the destruction your heart desires! Continuing with
4.27-- Pixel Streaming is now production-ready
with Unreal Engine 4.27. Harness its power
instead of local hardware to run your applications
entirely in the cloud. Watch the Feature Highlight
on the Unreal Engine YouTube channel. Looking for more to feast
on while you're there? Watch our latest webinar
on-demand and discover the new virtual production
features in Unreal Engine 4.27 that are pushing next-generation
filmmaking forward and what this means for anyone
exploring virtual production! Speaking of engine technology... We're humbled to receive
the IGDA Global Industry Award for Engine Technology. Thank you to the IGDA and to
everyone in the community who selected us for this honor. Inspired by the works
of Alfred Hitchcock comes a story-driven
observational thriller from the creators
of The Occupation and Ether One. White
Paper Games tells the story of an 8-year-old girl who
goes missing from Dahlia View, an English town in the
1950s in which the game is set. Take on the role
of the investigator into the disappearance. Head to the feed to find out
how this British studio uses their honed
hand-painted textures and stylized environments
and the lessons learned from previously
shipped titles to develop this compelling narrative. Stepping into the shoes of Ready
Player One seems like a dream no longer. With the new era of digital
fashion- born on the Metaverse, RTFKT is leading the
fashion of the future, merging the worlds
of digital and physical. Run over to the
feed to discover how this 'digital Supreme' creates
a bold future for fashion. Many thanks to our
top weekly karma earners: ClockworkOcean, Everynone,
EvilCleric, T_Sumisaki, mmmmGoodDinner, Kehel18,
Bloodrunner2000, ugmoe, Devin Sherry, and chrudimer. Next up- our
community spotlights: Irradiation is a short film by a
CG Animation industry veteran of 13 years,
Sava Zivkovic (Zov-Koh-Vich). In his behind-the-scenes video,
Sava talks about why he
evolved his workflow from traditional
cinematography methods to Unreal Engine & real-time. Radiating his industry
experience and passion, Sava created this
film in just four weeks! Check out the behind-the-scenes
in full and the short film itself- it's quite the
reactive performance! Are you looking for
that perfect genre mash-up? Finnish developer
and Unreal Dev Grant recipient Vaki Games is
combining tower defense with hack-and-slash. Summon minions,
fight alongside undead armies, customize your hero,
gain the experience to rival a guardian,
and craft your ideal hero or heroine to take it down! Wishlist Kingshunt on Steam. Lastly, we have 'Hollow
Road' by Ben McDonald, a fan-art environment inspired
by what Bloodborne could look like on current tech. They utilize trim sheets,
tileable materials, vertex painting,
and their technical toolset to create this beautifully
gothic real-time game scene. There are more stunning visuals
over on Ben's ArtStation page. I've been Skye,
and thanks for watching this week's News and Community Spotlight! VICTOR: And we are live. Hi, and welcome to Inside Unreal,
a weekly show where we learn, explore,
and celebrate everything Unreal. I'm product specialist,
Victor Brodin. And with me today in the studio,
in person, we have Alexander Paschall,
Unreal trainer. ALEX: Hey. VICTOR: Hey, it's weird.
I'm talking here, but then you're over here. [LAUGHS] ALEX: It's like we're
looking at our camera-- VICTOR: Talk back and forth. ALEX: Camera here, camera there. VICTOR: It's pretty good. Today we are going
to explore and cover the new VR template in 4.27. Since we have made
OpenXR production ready, this template is completely
designed for that. And about 15 minutes
ago I wasn't so sure if we were going
to be able to do this. And we'll get
into a little bit-- ALEX: You got it working, dude. VICTOR: Well, like not-- My computer has been-- I don't want to talk about it. We're doing this. We're doing this now. First,
before we dive into that, we do want to celebrate some
of the awesome creations that the community has
been using Unreal Engine 5 Early Access for. And so without further ado,
we're going to play the
little sizzle reel that we put together for you. So please. ALEX: Roll the clip. VICTOR: Roll the clip. [MUSIC PLAYING] All right,
that's absolutely stunning. We are not going to be
creating such stunning pixels today. ALEX: We're going to be doing some goofy stuff today. VICTOR: Some goofy stuff today,
yeah. ALEX: And some fun stuff. VICTOR: But it's
going to be fun. You can follow along today. However, I will note that
this is not kind of a tutorial where really you can pause
and go step by step, and sort of follow along. Everything we're doing,
you'll have access to. It's just a pure VR template. But we will be talking a lot
about the hows, the hows and the why in
between all the steps. We are not really
authoring content today for the sake of teaching
you exactly how to make that specific content. We are authoring
content so that we can talk about the tools
and the features that have been put together
for you that you're using in your template. ALEX: Yeah,
and Victor made me a nice hat. And I'm going
to have to wear it. VICTOR: And I'm going
to make you wear a hat. It's very nice. ALEX: It's pretty nice. VICTOR: Virtual hat. ALEX: Virtual hat. Yeah, everything's virtual. VICTOR: Cool, I hope everyone can hear us well out there. Like I said,
we had all kinds of issues before we got started here. But without further ado,
I think it's time to go ahead and start. If you didn't know, in 4.27 we
released a new VR template. OpenXR is now production ready. And with that, we redesigned
the VR template to follow suit. You also might
have seen a little bit of the sneak peek of the VR
template in UE5 Early Access. I should note that
is actually an older version of the template. And that's why we are
covering it in 4.27 today. But in UE5 main,
you will receive the fully updated VR template version. I'm actually working
on that next week. What is OpenXR and
why does it matter? If you haven't done VR development,
or you have, but you've sort of not
been paying attention to the development
in the industry, there is an open standard
known as OpenXR that is developed by a
working group at the Khronos Group,
of which Epic is a member. And why does this matter? Previously, we have had many,
or at least several different VR plugins that we've had to use,
because each vendor sort of had their own SDK for
their particular devices. And when you were
developing applications in Unreal Engine
for several of these, for example, Oculus and SteamVR,
you had to use each SDK. And on the engine
side of things, we had to switch which
plugin was actually active. And this can lead
to some problems, since sort of all the vendors
are doing things differently. And to combat that,
OpenXR is an open standard API that-- there's no requirement for all
VR manufacturers to follow it, but it is recommended. And going forward
in Unreal Engine, we are going to be
supporting OpenXR. To be brief today,
I'm not going to go into too much detail about
what goes on behind OpenXR, because actually
as the developer it's a little bit easier to work
with OpenXR than it was previously if you
were developing for several different devices. It's a royalty free
open standard that provides high performance
access to augmented reality and virtual reality,
collectively known as XR platforms and devices, hence the OpenXR. In short,
it's an API for XR applications. It can also be
considered as the interface between an
application and runtime. And we're going to talk
a little bit about runtimes here in a bit. If you don't know what that is,
don't worry. It's a mechanism for
interacting with VR, AR, and MR systems in a
platform agnostic way. And we're going to be talking
a little bit more about OpenXR. But if you do want to
know more and read the entire specification for
the 1.0 release of OpenXR, you can go ahead
and go to the links that are at the bottom
of this slide here. We will go ahead and
paste them in chat as well. We should have them. I'm seeing @s for
myself in chat here, so I'm not really
sure what's going on. ALEX: We're getting
your Twitter handle out there. It's @Victor1erp. But the L is a 1,
and I always forget that. VICTOR: It is,
because for some reason I managed to get my first
Twitter account suspended, by just trying to
password reset it. Anyway, let's try to not
get too off topic until we've finished with the slide deck. There is lots of room
to continue to do that. Moving on. In Unreal Engine 4.27,
and moving on 5 and onwards, we have one plugin
to rule them all. It's the OpenXR plugin. And if you have created a
new VR template in 4.27, you have already noticed that
OpenXR is enabled by default. Now, it's worth mentioning
here that if you start a blank fresh project
in Unreal Engine 4.27, SteamVR and Oculus VR
are both going to be enabled. And they are actually
prioritized over the OpenXR plugin for a couple
of reasons that we are going to get into
at the end of the talk. Because they're very
specific to shipping games like right now with 4.27. But what about extensions? If that's what you were
thinking right there, you have clearly
done your homework and read up a little bit
about how OpenXR works. And the question that you
might be asking yourself, well, wait a minute. If OpenXR is the only API,
what happens if I am developing a brand
new piece of hardware that none of the existing
vendors are working on? And I like to iterate? I like to innovate? Well, as an open standard,
Khronos is not unfamiliar
with this concept. And to solve this problem,
we use something that's known as OpenXR extensions. You can see a couple of
those on the marketplace already from Microsoft and Varjo. There's also two of them
that have shipped with 4.27. It might actually
have been even 4.26. These extensions allow the
vendors to extend functionality outside the OpenXR standard. And so hence,
you can see things. Microsoft has some specific
functions and features for their Hololens and
their mixed reality kits, as well as Varjo
who is running a very custom solution for passthrough,
where you can actually sort of do mixed reality in their headset. And these extensions are
what allow these vendors and hardware manufacturers
to extend upon the standard API that exists in OpenXR. Cause, if you think about it,
everyone needs to agree on what
that standard looks like, right? And so sometimes that
takes a little bit of time. And then the manufacturers
have the option to use these extension plugins. Like I said, if you want to
learn more about OpenXR, I highly recommend
you go and just take a look at the specification. You could find the link here. I'm actually not sure if I
gave this to Skye just yet. If not, don't worry. I will go ahead and post all of
these on the forum announcement post as soon as we're
done with the stream. I really should have
done that before, but I had some problems. [LAUGHTER] ALEX: That's life. VICTOR: That is life indeed. And you'll see what I'm going
to be talking about in a bit. Today we got quite
the packed schedule. I don't know how long we'll be here,
but buckle in. Buckle in. Buckle up. Buckle up. Buckle up, and I hope you'll
be enjoying our content today. Here is a bit of an
overview of what we're going to go over today. And like I said,
if you're watching this as the VOD, I don't recommend
you follow along if you're watching this live. But if you are
watching the VOD here, you will be able to
follow along and build the little sort of sample
extensions of the VR template. I should have used that term,
cause I just talked about OpenXR extensions. But if you want to extend
the functionality in the VR template, you will be
able to follow along and do exactly what I am doing. SPEAKER 2: Your camera's frozen. VICTOR: All right. ALEX: Your camera's frozen. VICTOR: Apparently
my camera is frozen. Let's go ahead and
see if we can fix that. Nope, not immediately. What about there? Nope, definitely not. ALEX: Oh,
but that's a great pose. VICTOR: It is. Give me one moment while I-- ALEX: That's my thinking pose. SPEAKER 2: Plug in and
unplug the [INAUDIBLE]. ALEX: Oh, no. Oh, no. All right, we got it, y'all. All right, well,
while he's doing that, I'll just say hey to everybody. Hello chat. I know you can't see me,
but I see all of you in there saying hello. I just want to say, hey,
it's good to see you all again too. Yeah, "Alex is playing
with Twitter [INAUDIBLE]." Yeah, probably that freeze frame
you're seeing down there is me checking to make sure if I'm
getting any weird messages from y'all. And I am. Thank you. Much love. Much, much love. VICTOR: Hm, I am actually
not sure what's going on here. ALEX: Uh-oh. VICTOR: Why don't we just
go be right back real quick so I can fix this,
because today is actually one of the few times
when having the camera and seeing what Alex is doing
with the VR headset matters. So Houston,
if we could please be right back. All right, we're back. Oh, no,
I need to change it in both scenes. ALEX: Oh, there we go. There we go. VICTOR: Yeah, I know. The small one works. But the big one doesn't. And the big one's important. Great,
I'm not surprised by this at all. I'm just going to do this live. ALEX: Yeah. VICTOR: I'm just
going to do this live. ALEX: We're going to do it live. VICTOR: Going to go
ahead and add that right there. ALEX: Oh, oh. VICTOR: The capture's
working all right. Oh, it needs to be below. And boom, there we go. OK, cool. Back to-- (CLAP) ALEX: Production values. VICTOR: Nice sound clap. "We're doing it live. Edit in post." Yeah, they are a lot-- All right, back on track. The VR template in 4.27,
we are officially supporting the following devices. These are the devices
that we have extensively tested the template with. Now,
what's really cool about OpenXR, though,
is that theoretically any device that runs on a runtime that
supports OpenXR should work. What that means is that you
could even develop your own. And as long as you're
targeting the OpenXR API with your runtime, you would
be able to use the VR template. I'm going to talk a
little bit about-- there are a few specifics when
it comes to the HP reverb and how that works in engine. But we will get to that. Moving on. All right, without further ado,
all I've done here is that I have just
opened up 4.27, and I have created a brand
new fresh VR template. And Alex, I have no idea why,
but apparently my frame rate is fine now. One of the problems we
were sitting with right here before the stream was
that my entire computer was running like a potato. And we had no idea why. ALEX: We were
really getting tanked. VICTOR: I was about to
cancel the stream actually. Oh, see,
there it dipped down to 64. Yep,
we still have the same problem. But you know what? Hopefully most of
you out there are not going to bother.
Just know that this is specific to my computer
for some reason the day of the livestream. I haven't shown
content from this PC ever since I started
streaming from home. And the one day I do,
of course, the computer tanks. [CHUCKLES] That's just fantastic. Anyway, let's make sure
we got the mouse going on. Cool, I didn't screw that up. So we're good. Like I mentioned before today,
it's not explicitly about OpenXR. We are going to
sort of show you how you will be working with
OpenXR in the Unreal editor. And remember that it's not
explicitly what we're building, but how and why
we're building it, and the way that we're
building it that's important here. So if you're taking notes,
that's what you should be focusing on. I saw a question,
where did we get this? The VR template is
available right in the launcher. You just go ahead and
start Unreal Engine 4.27. You click Games as
the template category, and then the VR template. And you go ahead and create. And you just
create a new project. And this is what you'll see. Now, what is really cool
here is if you've set up your VR development,
your hardware, and your runtime, everything else,
your software on your computer to be able to do VR,
you can literally just hit Play. And we can now don the headset. And as long as
Oculus link is-- yep, and we immediately
have virtual reality. Which is great. It's fantastic. That's what we want. There's no opening any maps. There's no starting VR,
or any other configuration settings in editor. It just works. Yes, Alex. That means it's your turn now. ALEX: Aw yeah. VICTOR: And from this moment on, since I'll be driving
the keyboard here, Alex will help me. ALEX: Oh, yeah,
I get to do the fun job. VICTOR: Alex gets
to do the fun job. ALEX: You know,
I was one of the earliest people to do VR testing at Epic Games. And I used to throw
myself off buildings all day until I made myself almost puke. So I was one of the
early people to learn how to make yourself puke. There we go. VICTOR: The more you know. ALEX: Yep, that's right. All right. VICTOR: Right,
so we created a new VR template project. And if everything was set up,
software and your hardware, you should be able to just play. Now, I did mention runtime. If you're unfamiliar with
what runtimes are, when we're referring
to VR runtimes, that is Oculus VR,
the application that you install and run on your computer,
or SteamVR, which is also an application
that runs on your computer that manages the-- it's sort
of part of the interface between the device
and your computer. In this case, we are running
the Oculus runtime together with Quest 2 via Oculus Link,
which is the option that you
have to actually tether what is a standalone device,
the Oculus Quest. But you can actually
tether it to your PC and use it as a PC VR device. Very handy. It's very handy. And Alex is familiar with
the Quest 2 controller, so that makes perfect
sense why we are using that. There is one
important factor here. And I can't highlight-- I can't stress this enough. OpenXR requires you to pick which
runtime you want OpenXR to use. So what that means is if
you have several devices, you got a Vive, you have a Rift,
you have a Windows mixed reality device, depending on
which one you plugged in first, that runtime might
actually be the one that is set to be used by OpenXR. You can find those settings
in both the Oculus application as well as SteamVR now. And you can set the
OpenXR runtime to the runtime that you want to use. If you launch the Unreal editor,
and you had a Vive plugged in, and you've unplugged that,
you plugged into Quest, and you are now trying
to use the Quest via Link on your computer,
and you're seeing that the Play in
VR button is gray, that could be one
of the reasons why. It might have picked SteamVR,
or vice versa, depending on how you do that. And that's just
important to know. We have documentation
on this as well, how you can change that
environment variable manually. But thankfully, these days,
you can actually change them right in the software,
both Oculus and SteamVR, which is really nice. Before we get into some-- create things,
I did want to go over the project settings briefly. And I do know where they are. A couple of things that
are different in the VR template versus a blank
Unreal Engine project, main one would be that we're actually
using a different renderer. So forward shading is enabled. This means that you don't have
access to all the same buffers that you do in the deferred. Some things like
raytracing and other features do not work in forward. So that's important to know. Other than that,
we have mobile multi-view enabled. Now,
I am not smart enough to be able to tell you
exactly what this does. But basically it is
something that you want to have enabled
for mobile platforms. And there are a couple of things
with this that we're going also to get into later. I know I'm saying
get into later. But there's just a
lot to cover today. And I'm trying to break this into
somewhat organized sections so that it makes sense. Mobile multi-view. And it's also important to
know that mobile HDR is set to false. That means that
you can't use all of the fancy
post-process effects that you otherwise could. Now, these last two settings
are only important for mobile. They don't actually
do anything when it comes to running
this on your desktop. Awesome.
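For reference, here is a minimal sketch of what those renderer settings look like on disk in the template's Config/DefaultEngine.ini. The console-variable names are real engine settings, but double-check the exact values against your own project:

```ini
[/Script/Engine.RendererSettings]
; Forward renderer instead of deferred (fewer buffers, but cheaper for VR)
r.ForwardShading=True
; Render both eye views in a single pass on mobile/standalone headsets
vr.MobileMultiView=True
; Mobile HDR off: you lose some post-process effects, but it's what mobile VR wants
r.MobileHDR=False
```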
If you are new to Unreal Engine development in general, it's important to know that
the VR template follows our typical gameplay framework. That is using a game mode. We have a pawn. There's a player controller. We're actually just using
the default one in the engine, cause there's no custom
code required for us there. That's why you're
not seeing one here. If you are unfamiliar with
our gameplay framework, I suggest you read
up a little bit about that. It will help you understand
sort of why all you're seeing is a player start
capsule in the world, and sort of how the pawn
actually appears in there, and how we possess it,
and so on and so forth. There's a little bit of
stuff that goes into that. So I suggest you go
ahead and read up on our gameplay framework.
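As a quick illustration of that wiring, here is a hypothetical C++ sketch of what the template's game mode amounts to. The template does this in Blueprint, and the asset path below is an assumption, so treat this as a sketch rather than the template's actual code:

```cpp
#include "GameFramework/GameModeBase.h"
#include "UObject/ConstructorHelpers.h"
#include "VRGameMode.generated.h"

// Sketch: the game mode only needs to name the pawn class. The engine then
// spawns that pawn at the PlayerStart capsule and possesses it with the
// default player controller, which is why the map itself looks so empty.
UCLASS()
class AVRGameMode : public AGameModeBase
{
    GENERATED_BODY()

public:
    AVRGameMode()
    {
        // Hypothetical path to the template's pawn Blueprint.
        static ConstructorHelpers::FClassFinder<APawn> PawnBP(
            TEXT("/Game/VRTemplate/Blueprints/VRPawn"));
        if (PawnBP.Succeeded())
        {
            DefaultPawnClass = PawnBP.Class;
        }
    }
};
```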
Cool, everyone following along so far? Are you following along, Alex? ALEX: You know,
you're a really fast talker. I got to give it to you. You could do really
good auctioneer work. It's pretty impressive. VICTOR: Mm, you know what? I'm glad you bring that up. ALEX: Oh,
you're going to become an auctioneer? VICTOR: No,
that's definitely not happening. But literally 30 minutes
prior to our livestream, I was kind of running
around in here trying three laptops
to get you so that you could be on Twitch chat. And my own PC wasn't working. And then I had to do audio. So I apologize. I will do my best, folks,
to try to just take it down a notch. I am sorry, Courtney. ALEX: You're such a champ,
though. VICTOR: Courtney,
our captioneer-- ALEX: Oh, yes. VICTOR: --is the one
who's actually going to have the roughest time. ALEX: They were telling me. Courtney, what they were telling me,
it's like, oh, yeah, when you two get on chat. And you're like back and forth. I'm so sorry, Courtney. They don't deserve this. VICTOR: All right, Courtney, I promise you that
I will help you out with any of my weird
pronunciations of things. I am not a native
English speaker. So some of this stuff I say
can be a little funky sometimes. Anyway, time to learn. Let's go ahead and
open up the VR pawn. And I wanted to go over a little
bit about how the VR pawn is structured, because it is where
a lot of the interaction logic will exist, or can,
I should say. You can do things in many,
many, many very different ways, but this is the best place
to put it for our purposes that we're doing here. Like I said,
everything is set up for you to work. You have motion controllers
when you hit Play in VR. You have a
teleport functionality. You can grab objects. There are a couple
other features as well. You can dive into the
logic here and sort of see how all of that's
been put together. But the first thing
that I want to do here is actually to go ahead and add
a model for the head mounted display,
because currently all we're seeing is just a pair of controllers,
if we were to look at Alex through
a third person perspective. And yes,
I have a lovely kombucha right here that I am sipping on. [LAUGHTER] ALEX: They were asking where you're from, by the way. VICTOR: Oh, I'm from Sweden. Stockholm, Sweden,
born and raised. ALEX: Yeah, you can't tell because he has no accent,
right? Yeah. VICTOR: It's there. ALEX: I thought he was-- yeah, like when I first met him too,
I didn't realize until he said some specific word. And I was like, wait a minute. You're like really good, though. It's crazy. I lived in Spain for two years,
and my Spanish was God awful. I'm so sorry. VICTOR: Well,
you didn't speak Spanish from sort of first grade, right? That's the difference in Sweden. ALEX: You got a good education. VICTOR: Well, I mean, you need to speak English for first grade,
I hope. [LAUGHS] ALEX: Yeah, OK, yes. I got that one going for me. VICTOR: All right, so all I did here, to make sure
everyone's following along, I've added a static mesh
component to the VR pawn. We're going to go ahead and
parent that under the camera so that it will follow
our camera component when we move it around. In the editor we have access
to a couple of meshes that come by default.
If you go ahead and search for HMD, you can just find the
generic HMD model. If you're not seeing that,
go ahead and go down to View Options,
and make sure that you've enabled
Show Engine Content. And while you're at it,
make sure that Show Plugin Content is enabled, because we are
going to dive into that a little bit later in the stream. Now that we have our
generic HMD model applied, it's a little small by default.
So I'm just going to go ahead and lock
the entire three-component vector here and set this to 1.5 scale. That will do a uniform scale,
and I don't have to punch it into all
three of the float values here. I also do know that I want to
punch this forward 8 Unreal units,
which equals 8 centimeters. Because that will
actually put the display somewhat relative
to what it is in real life. So if you try to punch-- oh, I'm not showing. ALEX: Yeah. VICTOR: Yep, that's-- ALEX: There you go. I was like, wait a minute. There you go. VICTOR: It's difficult here. I need someone to yell at me. You should-- ALEX: I didn't realize. I was in the headset. And I looked over. I'm like wait a minute. Then I saw chat. And chat's always on it. Thanks, chat. VICTOR: It is. Today, though, it's like,
since I'm actually driving content, it's a little bit
difficult for me to pay attention to all
of the like six monitors we got going on in here. ALEX: All right, I'll put this down until you say go. All right,
so basically the idea is, he's setting up his components
in here so that when I go in, you'll be able to visualize
where my head position is at any given time. Plus,
I won't be seeing this VR thing. Make sure it's hidden. You know, owner hidden,
and all that. VICTOR: Yes. ALEX: Owner no see. Owner no see, I mean. Because we did this
before and I had like a big VR sticking out of my face. VICTOR: That was
when I found the values. ALEX: Oh, yeah. VICTOR: Yeah, it's good stuff. I wanted to go over to--
owner no see basically allows the owner of this
component to not see it at all. And that's exactly
what we want here. What's really nice is that
the VR spectator camera that we're about to use here,
it's not the owner of the pawn, or of the HMD. So we will actually
be able to see that. Now, generally,
when I'm working with VR pawn and we talk about collision,
say you'd like to headbutt
something with this HMD. I personally prefer to add
separate collision volumes, specifically for
something like this. And then you can also
switch out the HMD model, and have the same collision. And so I'm just going
to go ahead and set this to be no collision and
generate no overlap events. And then we'll also go ahead
and add a little sphere collision component here. And that'll make sense
in just a little moment here why we want this. We're going to go ahead and
set the collision preset to pawn. And we want to make sure
that we are generating overlaps. ALEX: Yeah. VICTOR: Overlap events.
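In C++ terms, the head setup we just did boils down to a few component calls. This is a sketch: "HMDMesh" and "HeadSphere" are placeholder names for the components added above, but the functions are real UPrimitiveComponent and USphereComponent API:

```cpp
// The headset mesh is visual only: hide it from its owner and strip collision.
HMDMesh->SetOwnerNoSee(true);
HMDMesh->SetCollisionEnabled(ECollisionEnabled::NoCollision);
HMDMesh->SetGenerateOverlapEvents(false);

// A separate sphere carries the head's collision, so the model can be
// swapped out freely without retuning collision.
HeadSphere->SetSphereRadius(12.f);                  // rough head size, tweak to taste
HeadSphere->SetCollisionProfileName(TEXT("Pawn"));
HeadSphere->SetGenerateOverlapEvents(true);         // overlaps, not blocking, so spawning can't get stuck
```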
You're going to notice if you're working with collision
if you have something that's blocking attached to like the
motion controllers or the head, sometimes when the pawn
spawns you'll notice that you don't have motion controllers. That means that you
have something in the pawn that was blocking and colliding
when it spawned, and the engine is not able
to resolve that. Generally,
it tries to move the pawn. I'm not entirely sure what's
happening there, but just as a little caveat that
you should know. Let's go ahead and reduce
the size of this sphere radius. Cool, we are back. I am not showing just a camera. I'll also go ahead and
disable snapping here so that we can align that
somewhat with Alex's head. That looks like Alex's
head right there. All right,
compile and save that. ALEX: It's just a
basketball sitting there. VICTOR: It is. ALEX: Of course. VICTOR: All right,
let's go ahead and hit Play. ALEX: Basketball head. VICTOR: Alex, why don't you
go ahead and jump into VR. ALEX: Ooh, yeah, the cool part. All right, let's do this. VICTOR: Now, in the template if you are sitting on your keyboard,
you can hit Tab. And this will switch to
a VR spectator camera that the template comes with. This allows you to get a third
person perspective of the game world. And it's handy. You can use WASD,
Q to go down, E to go up. I think I also put
a Space Bar there. Yep, you can just use Space Bar. Mouse click to zoom. We're basically just changing
the field of view of the scene capture component there. And F if you want to fade,
just a couple of handy little functions
that we can use there. Alex, looks like you are not-- ALEX: No, I was going to say,
hang on a second. VICTOR: In Oculus Link? ALEX: It's in,
but it's in Create Guardian. So am I going to have to-- VICTOR: Just look
around a little bit. We might have moved
around some tables here. If you're unfamiliar with how
the tracking on these inside out devices work, if you
move all your furniture around, or change the
lights or lighting, occasionally they won't
be able to remember what the space looks
like to be able to do the SLAM technology. ALEX: Oh, oh, oh,
and it's figuring itself out. It figured itself out. VICTOR: It usually does that. ALEX: All right, there we go. [CHUCKLES] OK, whoa, whoa. OK, ah, eh. VICTOR: All right, we're back. ALEX: Hey. VICTOR: There's Alex. ALEX: Hey, I can see you. You're the camera. VICTOR: Yes, yes. I am the camera. ALEX: And you can
get out of here now. VICTOR: OK, well, maybe we'll make a little mini game
with that in the future. I'm going to go ahead,
and if you see the frame rate there, it is all my computer. I was not able
to figure that out. It's doing some weird
processing glitches. But thankfully,
everything's working good enough to the point where we
can demonstrate all of this. Perfect. We are able now to see Alex. He can play with all of the
default functionality that's in the template, but I'm going
to cut that short really quick. ALEX: Aw. VICTOR: Don't worry. We'll get to more of that. ALEX: Yeah. VICTOR: Yes, we will. I do want to dive into the
VR spectator camera a little bit before we do more,
since it's kind of crucial in the way that we will be displaying
all of our content today. It's much more
comfortable as a spectator to view the world through
a third person view. Or in some ways,
you can actually attach this spectator camera
to the VR camera component, and not have that like
cropped left eye preview that you are otherwise seeing. And so because of that,
I did want to dive into the VR spectator
camera functionality a little bit, because it is really cool,
fun, and very useful for the purposes
that we have here today. It's in right over
here in the corner. Control-E will open
up that Blueprint. And you can dive
into it and see sort of what all the actual
key binds are here. I should also note that you
can go ahead and find them under Project
Settings and Input. They are all down here as well. We are going to
dive into input as well. There's so much to cover today,
Alex. I had a difficult point putting
my notes together here, like where do we start? But I think we're just going
to churn through all of it. ALEX: We've got tons to cover. We got tons of time. I'm not in any kind of rush,
you know. VICTOR: Oh, well,
I'm going to buy you dinner later. ALEX: Ooh-ooh. VICTOR: I hope you're not. ALEX: Oh,
I'm getting dinner out of this. VICTOR: Yes, you are. You got me dinner two days ago. ALEX: Oh, that's true. VICTOR: One thing that I would like to do with the VR spectator
component that will make Alex's life a little bit easier-- ALEX: Oh,
I should have imported googly eyes. I'm so sorry. Everyone in here keeps
mentioning googly eyes. I should have brought those. VICTOR: Yes, yes,
yes, you should have. ALEX: On a flash drive. I should have. VICTOR: Yes, you should have. ALEX: Cause we can
slap those on any object. VICTOR: It would
just be immediate. ALEX: Ugh. VICTOR: Yeah,
that would have been great. Not today. ALEX: No, no. VICTOR: What we are going to do is we're going to add
sort of a selfie cam or the opportunity for Alex to
see what the actual camera, or in this case, the scene
capture component is actually drawing to our render target,
which is then drawn to our preview screen. You can find sort of all of that
functionality in the Enable VR Spectator function. This is where the
two sort of functions that make the view of
being able to hit Play, hit Tab, and sort of have
a different perspective, these are the two
functions right here that are actually
driving that. We got documentation
on that as well. So I'm not going to
dive too deep into that.
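If you are curious what those two functions are, I believe they are the spectator-screen calls from the head-mounted display function library. Roughly, as a C++ sketch (this is my assumption about the Blueprint's internals; "SpectatorRenderTarget" stands in for the render target asset in the VRSpectator folder):

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"

// Route the desktop window to a texture instead of the cropped eye preview,
// then point it at the render target the scene capture component draws into.
UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenMode(ESpectatorScreenMode::Texture);
UHeadMountedDisplayFunctionLibrary::SetSpectatorScreenTexture(SpectatorRenderTarget);
```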
For our camera here, we're just going to go
static mesh component. And this is going to
just be a simple plane. We'll go ahead and add
that to the camera mesh. And I don't think that's scaled. No, it's not. So that'll be real easy. Also, once again, in editor content,
we do have a plane. And I'm picking the one that's 100
by 100. I do know that we're going
to flip this forward 90 degrees. And then I believe we
turn it left 90 degrees. And all you have to
do here is just find the-- oh, actually,
we don't have a material yet. So we need to make a material. What we do have is
the render target texture. It's actually in the
VR Spectator folder. The render target
texture is right here. We'll go ahead and just
create a material out of this. Right click Create Material. Just go ahead and
open that up to verify that everything is good here. Something we can do just
to make this a little cheaper and I'll say easier
to work with is just go ahead and set this to Unlit. And the reason why we want
to do that is that because-- why am I-- there we go. All right,
setting that to unlit. And now we'll actually be
able to see what the spectator camera is seeing. And this is just a
little handy for Alex here so that he can
see what we are-- ALEX: Yeah, we were trying to do this last time. And I kept putting
on the hat crooked. And so we had to make
some sort of a mirror. And if you're not familiar with
how you make mirrors in Unreal, this is basically
like the core of it. And it's super fun and easy. And it should be
very reminiscent-- if you like older games,
it should feel very reminiscent of how
some older ones kind of like copied images
and rendered things. Oh, am I in? VICTOR: It's OK. Something that's
important to know here-- oh, I didn't apply the material. Something that's
important to know is that this scene capture
component here is actually running on tick, right? So every frame it's capturing a
1080p texture and drawing that. That is relatively expensive. And so this is a
feature that if you are planning to have
all of the players, including the lowest end
hardware that you're building this for to use this
during gameplay normally, you have to design your
game with that cost in mind from the beginning. Primarily what I use this for
is for recording cinematics, demonstration purposes. Yeah, trailer footage
and that kind of stuff. That's what I find
it is really useful for. But it is possible
to create sort of an asymmetrical multiplayer
game using this, which is cool. But you do need to know that
this scene capture component is relatively expensive.
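If that cost is a problem for your target hardware, one common mitigation (a sketch using real USceneCaptureComponent2D properties, not the template's code) is to stop capturing every frame and refresh only on demand:

```cpp
// Capture on demand instead of every tick.
SceneCapture->bCaptureEveryFrame = false;
SceneCapture->bCaptureOnMovement = false;

// ...later, whenever a fresh frame is actually needed:
SceneCapture->CaptureScene();
```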
All right, let's go ahead and make sure
material that we made using our render target texture. And now,
if we go ahead and fly over to Alex-- ALEX: Oh, oh, yep, I can see it. I can see you. Eh, eh, eh. VICTOR: We're switching. You can see that he can now
see himself whenever we're actually active,
because as a performance saver, I stop all of the functionality
of the VR spectator when it is disabled. Yeah, OK, you do not get
to throw more stuff at me. ALEX: No, aw, aw. VICTOR: Not today. We got lots to cover. So that is sort
of just a little pre setup I wanted to do with
the pawn and the VR spectator camera. I am going to go ahead and
reduce my camera speed just a little bit. And then we're going to go
ahead and hold Shift, drag this over so that I don't have
to fly it over every time. Rotate it so that
it's right there. And then we have-- SPEAKER 2: You're
not showing your screen. VICTOR: Aw, I was not
showing my screen there again. ALEX: I can't see anything in VR,
man. I'm sorry. VICTOR: But are you going
to sit there the entire time? That's OK, Alex. ALEX: Well,
whenever I come out of it, it's like you got to
redo your bounds. VICTOR: That is fair. That is fair. That is fair. ALEX: So I'm going
to kind of chill in it. VICTOR: I think I was
mostly just talking there. I wasn't actually showing how
to do anything other than applying the material to the
plane that I have attached to the VR spectator camera. The scene is set to chat,
not see. You know what? I'm just going to stop
with the camera full screen. I was sort of saying,
like oh, it's so nice. I got my stream
deck set up here. I can just like switch
between camera and main. And everything is going
to be nice and dandy. But you know what? I'm going to stop that. Because I clearly am
not able to handle that live. Not today. All right, VR spectator ready. VR pawn ready. So now we can get
started to actually dive into some of the frameworks
that exist in the VR template that are here to make your life a
little bit easier when you want to build VR applications. ALEX: I assume you're talking about your new
component you made. VICTOR: There's a new component. ALEX: Ooh. VICTOR: There is a menu. There are a couple other things. But yes,
mainly we are going to go ahead and dive into the grab system. As you can see on
this little table here, we have a couple of actors. They all have
something in common, which is this grab component. This grab component,
you can find it under Blueprints
and Grab Component. It is a drag and drop component
that gives that static mesh, or that skeletal mesh actor-- it can now be grabbed. And so if you look at this,
yeah, this ball actually has a grab component as well. It is a drag and drop
as long as the actor that you are applying
it to is set to movable. So for example,
this cube down here in the corner, if I was to just go ahead and
add component, grab component, you can see that it's,
by default, set to static,
which means that I wouldn't be able to grab this. So if that's what you do,
you've got to set it to movable.
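That mobility requirement is a one-liner if you are setting things up from C++ (real component API; "Cube" stands in for whatever static mesh component you are making grabbable):

```cpp
// A grabbed mesh has to be able to move; Static mobility can't.
Cube->SetMobility(EComponentMobility::Movable);
Cube->SetSimulatePhysics(true); // optional: only if you also want it to fall and tumble
```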
But everything else is actually managed by the component itself. We're going to go
ahead and open that up. And as you can see,
there is not a lot of logic in here. It is mainly a place to store
some of these functions that we're using,
and also just sort of be-- it's component based programming,
or composition rather than inheritance here,
which is really useful for something
for a feature like this. There is one thing
in here that you can tweak by default,
which is the grab type. I have made two different,
let's call it functionality, by default.
We have the free type grab. And we have the snap. Those are the two that comes
with prebuilt functionality. The free grab is what
you've seen Alex do already. He is just grabbing
the cubes anywhere where he can grab them. And the snap is
what's being used by the pistol,
which allows you to orient the grab component in a rotation. Let's go ahead and
go to grab component. In a rotation that allows
you to align your controller with whatever you're grabbing,
which is perfect for
something like a pistol. It could also be good for
something like a remote, or I don't know,
lever or something, anything else you can think of. There's a lot of things
you can do with that. If you've already
played with this, you also might have
seen that there's a none option, which is just
really handy, this enumerator. If you set that to none,
it just makes it so that you can't
grab the actor. It's really just handy. The enumerator's just none,
and it's not going to do anything. We also have the Custom option. And if you try to use that,
it doesn't do anything, because it's intended for
you to script that logic yourself outside of the VR pawn and the
grab component, which makes it easier for you to extend
that without sort of having to play around and mess with
the system that's in place already. I'm going to make sure we
leave this as default as free. And that is,
if you just need objects that can be grabbed as
easy as what Alex is going to demonstrate right here,
if that's all you need, that's all you need.
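Summarizing the four options, here is how that enumerator might look if you rebuilt it in C++. The template's version is a Blueprint enum, so the names below are assumptions:

```cpp
// Grab behaviors supported by the template's grab component.
UENUM(BlueprintType)
enum class EGrabType : uint8
{
    None,   // component present, but the actor can't be grabbed
    Free,   // grab anywhere; the object keeps its relative transform in your hand
    Snap,   // snap to the grab component's own transform (pistols, levers, remotes)
    Custom  // OnGrabbed/OnDropped still fire, but you script the behavior yourself
};
```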
ALEX: Wait, let me grab it. VICTOR: You can rotate
it for a pistol to align them. And you can-- is your
real world table aligned with the virtual one? ALEX: It's slightly taller than the virtual table. So can everybody see that OK? VICTOR: Yes, yes,
you're looking real good. ALEX: Hi, y'all. I've got a box and a gun. VICTOR: MrMets
is asking if there's a way to do two-handed
grabbing with the component? Not by default. I was working on that,
trying to figure out a really good, generic way
that wasn't too complicated with two-handed grabbing. I wasn't able. I didn't think that any of
the solutions I came up with was easy enough to
understand and generic enough that they would work in
sort of all different scenarios of two-handed grab
that people would want. And so there is, by default,
no two-handed grab in the template. Know that the
template is supposed to be relatively barebones for
the sake of not being bloated if someone only wants
a little bit of functionality. But that is where the
custom grab type comes in. And you can actually extend
and build some of that logic yourself. We'll get into some
extras in the end there. ALEX: Pew pew. VICTOR: I'll actually
be showing you how to use the custom feature. Not for two-handed,
though, I apologize. That is way too complex
for something like today. ALEX: If you want an
example of like a two-handed weapon like that, I believe you
can go into the mod tools for-- if you go into the mod
editor for Robo Recall, we have some guns that
actually are two-handed like that, I think. So you could use
that as an example. But that has a unique system,
because all that pump action. I wanted the (RELOAD SOUNDS). VICTOR: I haven't
checked that out myself. I have built two-handed grabs. And there's also-- ALEX: Oh,
there's a lot of effort in it. VICTOR: There's
a lot of ways to do it. All right,
now that's all fine and dandy if you just want
to use it as is. But I do want to dive
into a little bit about how this actually works,
and some things that are important to think about. The motion
controller components, they are our representation
of the physical device in our virtual game world. This is the component that
takes all the data that the runtime is feeding through OpenXR. And that gives
us the possibility to see our controllers
in the virtual world. If you've noticed there's
actually no models here. And so you might have asked,
hey, why am I seeing controllers? Where are they coming from? And that's actually
a nice little feature of the motion
controller component itself. It's called Device
Visualization. It's enabled by default
in the VR template. And it's just set to default.
With OpenXR, this is really handy. Because the runtime is going
to pick an interaction profile. That is also something I-- I'm just going to say that,
and then we'll go into the interaction
profiles later. It will pick an
interaction profile that matches the
controller that you're using. And that interaction
profile will then, in turn, pick a controller model. You can find all of
these controller models if you go down to
the OpenXR content. This is why I told you
earlier to make sure you enable the plugin content. OpenXR Content Devices. And here you can
find all of the models that we shipped with 4.27. I really do have
to know-- and I'm going to turn on
the camera for this. The Oculus controllers as
well as the Vive controller are a couple of
centimeters offset, including a little
bit of an off rotation. That is not on you
if you're seeing that. That is totally on
us in the library. And we are working on
getting that updated for 0.1. They just need to be
re-exported and imported with the correct origin. Now, talking about origin,
that brings us to a difference between
how the previous plugins and how the OpenXR plugin
interacts with the motion controller. Oculus VR and SteamVR
previously picked individual, let's call them points on-- individual points on
the controller to say, hey, this is my zero. OpenXR has made this standard,
which is now referred to as the grip position. And it's intended to be
nicely aligned with your hand. So however the controller fits
in your hand is what that value is going to be. So what that means is that
when you ask the motion controller component, hey, I want your
world location, what you're actually getting will be
equal to the grip position and grip rotation that the
OpenXR runtime is returning to you. You can see an example
of more kinds of data that you can currently access
through a function called get motion controller data. We are using that for
the teleport trace function in the VR pawn. I have mentioned that
we have a grip position. And I'll try and
drag this out here so we can see
a little bit better. I have mentioned grip
position and grip rotation. And that is actually equal to
the motion controllers' world position. The aim position
and aim rotation are actually going to
be a standard placement on the controller that
is nice to use for aiming, and so perfect for a teleport trace,
which is what we're using it for right here. And you can use the get motion
controller data function, break that struct, and you have
that information right there.
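For reference, the same lookup from C++. GetMotionControllerData is a real engine function, and the struct carries the grip and aim poses separately:

```cpp
#include "HeadMountedDisplayFunctionLibrary.h"

FXRMotionControllerData Data;
UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(this, EControllerHand::Left, Data);

const FVector GripPos = Data.GripPosition; // matches the motion controller component's world location
const FVector AimPos  = Data.AimPosition;  // standardized aim point, ideal for teleport traces
const FQuat   AimRot  = Data.AimRotation;  // pair it with AimPos when projecting the arc
```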
I'm going to go ahead and minimize that again. Now, a little interesting thing
that Jeff Fisher on the team here told me, that if you need
a sort of hard reference to this, and you don't want
to run this function, you can actually go ahead
and add a separate motion controller component. Now, let's call this
motion controller aim left. And if you scroll down,
we do not want the display device model enabled on this. But we do want to set our
motion source to aim, left aim. If we do this-- oh, yeah, I saw this. This is what happened
earlier yesterday, Alex, when the reference
sort of just disappeared. And I'm not sure why. I'm just going to
plug that in right there. ALEX: Oh, and there,
it did again. VICTOR: It did it again. So we do have a repro step,
which is great. So it means I
can log that issue. I was worried that
wouldn't be the case. ALEX: See,
we do actually find stuff and report it, y'all. I assure you. We got it under control. VICTOR: That has
been a lot of my work for the last couple
of months actually, has been using
all of these tools, and reporting the stuff
that doesn't work in JIRA. People that did
a good job for all the good stuff they did. ALEX: Yay, JIRA. VICTOR: Yay, JIRA. ALEX: And then thank you for everyone in the wild,
by the way, all of you out there who send
us in reports and stuff. There are human beings who
read your crash reports, by the way. I used to do that. VICTOR: And I do too. I also should
mention for everyone, going back to camera-- I promise, I'm going to
leave my button on the main. That feel free to
submit your questions. I will be going through
them once I'm done with sort of the planned presentation. And so if you don't have time
to stick around to see them answered,
know that you can later watch the VOD on Twitch.TV/UnrealEngine
or our YouTube channel. The VOD will be up there
as soon as we go offline. ALEX: Yep. VICTOR: Now, so what I did there is that I added another
motion controller component. And I set its motion
source to be left aim.
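Here is that same setup sketched in C++ inside the pawn's constructor. MotionSource and bDisplayDeviceModel are real UMotionControllerComponent properties; "LeftAim" is the source name the 4.27 dropdown exposes, but verify it against your engine version:

```cpp
// A second motion controller component that follows the OpenXR aim pose,
// giving a hard reference to the aim transform without polling
// GetMotionControllerData every frame.
UMotionControllerComponent* AimLeft =
    CreateDefaultSubobject<UMotionControllerComponent>(TEXT("MotionControllerAimLeft"));
AimLeft->SetupAttachment(RootComponent);
AimLeft->MotionSource = FName(TEXT("LeftAim")); // aim pose instead of the default grip pose
AimLeft->bDisplayDeviceModel = false;           // don't spawn a second controller model
```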
What this actually means is that when Alex now-- here, why don't we go
ahead and visualize this. I think this is a-- I'm just going to go ahead
and add a little sphere here so that we can go ahead
and visualize what this actually is. So this sphere is parented
underneath the motion control component that is using
the motion source left aim. I'm going to reduce the size
of this sphere to something like really small, even smaller. Yeah, let's do 0.05. All right. ALEX: I appreciate all of you in chat who are realizing that,
yes, we do really actually read those reports. [LAUGHS] VICTOR: All right,
I'm going to go hit Play here. And Alex,
if you bring up your left controller. ALEX: Let's do this. Oh,
I have a ball in my controller now. VICTOR: Yes. ALEX: Look at that. Isn't that cute? VICTOR: Now,
that motion controller component is actually getting
the generic aim data that whichever runtime
that's currently active it's returning
through the OpenXR, and then through
the OpenXR plugin to our motion
controller component. And that is able
to give you the aim location for the controller. And why is this a big deal? You might ask. If you haven't worked
on developing applications for different devices,
this stuff had to be hardcoded
for each controller. You had to say like,
oh, hey, is the player using an Oculus Rift? Is the player using a Vive? Is it an index? And then you had to change
where you got all of this data from. Say you wanted to
have a sphere right there, you would have to move
that manually sort of on begin play or something. OpenXR just generifies that. And so you as the developer,
all you have to care about is, hey, aim position,
aim rotation, grip position, grip rotation. And that makes
your life designing an application for
devices that might not even have been developed yet. They can also work if they're
following and using OpenXR. So I'm just going to
leave that motion controller aim left right there. I'm going to Control-W.
We're going to make a right. And then we're going to
go ahead and make sure that we set this motion
source to right aim. And this is something that I'm
hoping I can push in 0.1, because I actually
really like this solution. We're going to go ahead
and drag the widget interaction component to aim left and
the widget interaction right to aim right. What this does
is that it actually puts the menu cursors
in a comfortable position. A little bit of an
oversight of my end there, since the menu
that's in the VR template actually works
with just one hand. It can flip between the two. And now we actually
have it working well when you are using
two hands as well. So just a nice little point
of using this before 0.1. All right, that's initial-- just a little bit of background
there on sort of the grab system and how we're able to
use that data, and why it works. So another good
thing to note here is that if you're using
a Vive controller, the pistol will be
oriented correctly as well, because they're both based
on that grip position and grip rotation data, which is
default by the motion controller. I can't repeat that enough,
that grip position and grip rotation,
even though you are able to grab
those individually from this function,
if you just get the motion controller and you have that motion
controller set to the motion source of left and right,
which should be the defaults, that information is the same. Grip position and grip rotation equal the
motion controller location and motion controller rotation. Any questions, Alex? ALEX: What? VICTOR: Moving on. [LAUGHS] All right, let's go ahead and
make a custom grab actor, because I think Alex is
looking a little boring just there with the HMD. ALEX: I said, like I think I know where this is going. Now that we've gotten the
controller orientation and all that stuff put together,
I think we could make me a
generic hat asset, or generic concept of
an item that I can lift up. VICTOR: We are, indeed. I am working on it as we speak. ALEX: Oh, oh, yeah, so Victor and I are both in
the same school of thought that we should
try to do everything that we can in the editor. So I'm obsessed
with like control rig and animating an editor,
and the new modeling system that now exists in the engine, right? In the editor, I mean. And so Victor and I have
been discussing this before. It's like,
can we just do everything in here? And I've found that you
can do basically almost all the things, except for
skeletons and skeletal weighting at this point. So here comes the hat. VICTOR: Here's the top hat. Is that looking-- ALEX: Yeah. VICTOR: Is that looking good? ALEX: It's looking good. VICTOR: All right,
what colors do you want? ALEX: Let's go with
purple and green. VICTOR: Purple and green. ALEX: I'm a big fan of bright,
fancy colors. VICTOR: Pretty sure-- yup. ALEX: If anybody's
seen where I live, it's a big, purple yurt
next to a big, green shed. VICTOR: I'm just going to
create a material instance off of this projectile,
because it already has an exposed vector
parameter for its color. You said purple, right? ALEX: Yeah,
you could do purple on one. Do like the rim in green. VICTOR: Purple on the rim,
and then green top? ALEX: All right. VICTOR: Is that actually
audio coming from-- ALEX: Oh, yeah, yeah. It's the VR lounge when
we're in the Oculus lounge. You can hear that behind you? All right, there we go. ALEX: Oh, it heard you,
cause it stopped working. VICTOR: Well, no,
cause I clicked it. ALEX: Oh. [LAUGHS] VICTOR: Control-W
to duplicate an asset. ALEX: Oh, yeah. VICTOR: And then we're going to go ahead and make
the material instance green here so that Alex
can get his-- this is really important work here, everyone. ALEX: Yep, this is it. This is very core. VICTOR: All right,
we got Alex's top hat. Really nice feature. So I just dragged
in two cylinders, applied two materials. And I'm going to select both of them,
right click, and then go ahead and merge. ALEX: There you go. VICTOR: Merge actors. ALEX: What is a metaverse,
really, without top hats? All right, let's see. VICTOR: What is the
metaverse without any hats? That would be-- ALEX: I know,
it's got to have tons of hats, like lots of goofy outfits. Can't be having no
boring metaverse. [LAUGHS] VICTOR: Well, all right, I've gone ahead and made a somewhat large,
I think, top hat here. Whoops. ALEX: Hm. VICTOR: This object here
is just a model right now, which doesn't do
anything for us. But if we go ahead and
make a new Blueprint-- ALEX: Oh, that's right. We have to do the whole thing. VICTOR: I'm going
to make the Blueprint. Copy the top hat. ALEX: It doesn't understand that,
clearly, we want this out of the box. This will just be a hat. You have to do a
couple-- VICTOR: It's just a hat. We'll go ahead and add
a static mesh component. We'll set that as the root. And we'll go ahead and
apply the top hat here. What's nice about
the grab system is that it actually gets the
parent that it's attached to, and it sets the
collision profile to the collision profile that we
are using for the grab system. So you don't actually
have to set that. But if you want to be thorough,
you can go ahead and just
set it to physics actor, since that's what we're using. We'll also go ahead and just
add a grab component as the child. Compile and save that. And we'll go ahead. And I don't think Alex needs-- nah, you know what? You might want that. I'm just going to go
ahead and get rid of these. Give you a little
bit of space here. We'll drag this top hat out. And you should now be
able to grab your top hat. ALEX: There we go. All right, oh, there we go. Hey, there you are, buddy. VICTOR: Hello, buddy. ALEX: OK,
so I'm going to just hop over here, and I can grab it. VICTOR: He can grab it. Yay. ALEX: And then it kind of-- VICTOR: Right,
and why is it not falling? ALEX: (SINGING) I'm
just wild about Harry. VICTOR: Yeah. ALEX: (SINGING)
Harry's just wild about-- Aw.
object to interact or behave like a physics actor,
you can set the static mesh to simulate physics. And now the grab system actually
knows that this was simulating. Yeah, go ahead. ALEX: Oh, ah. VICTOR: Get in there. ALEX: OK, I'm good. I'm good, OK. VICTOR: That's why you're here. ALEX: I'm here. VICTOR: And then
cause I-- ALEX: Oh, wait,
there, it's falling. No, my hat! VICTOR: All right, it's falling. ALEX: No, it fell on the ground. VICTOR: Great,
I'm going to make that less jittery. All right, but a top hat is
nothing unless you can actually don the hat, right? Yeah, I mean,
he can have it fall on his head, but that's not good. Where's the rabbit? No rabbits today. No rabbits today. We want to make sure that
this hat stays on Alex's head when he lets go of it, right? And here comes a little
bit of a handy feature that actually comes with
the grab system by default. If you select the grab
component in the Blueprint actor, and you go ahead and look
down here on the events, there are two event dispatchers
that I've rigged, On Grabbed. And you also have On Dropped. And what these
do is that whenever the parent component
that the grab component is attached to gets grabbed,
these event dispatchers will fire.
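In C++ terms, those two event dispatchers are dynamic multicast delegates. A minimal sketch follows, with stand-in names, since the template's actual grab component is a Blueprint:

    // Sketch of On Grabbed / On Dropped as C++ event dispatchers.
    #include "Components/SceneComponent.h"
    #include "GrabComponentSketch.generated.h"

    DECLARE_DYNAMIC_MULTICAST_DELEGATE(FOnGrabbed);
    DECLARE_DYNAMIC_MULTICAST_DELEGATE(FOnDropped);

    UCLASS(meta = (BlueprintSpawnableComponent))
    class UGrabComponentSketch : public USceneComponent
    {
        GENERATED_BODY()
    public:
        // BlueprintAssignable is what makes these appear as event
        // dispatchers in the Events panel when the component is selected.
        UPROPERTY(BlueprintAssignable)
        FOnGrabbed OnGrabbed;

        UPROPERTY(BlueprintAssignable)
        FOnDropped OnDropped;

        // The real component broadcasts these from its Try Grab and
        // Try Release logic, as described later in the stream.
        void NotifyGrabbed() { OnGrabbed.Broadcast(); }
        void NotifyDropped() { OnDropped.Broadcast(); }
    };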
If you're not familiar with event dispatchers and other forms of Blueprint communication, I highly
suggest you go watch Zak Parrish's talk on
Blueprint communication. I think we should have that
link sitting in our main doc because we keep
mentioning that often. I will even go as far as
saying if you're using UE5, that livestream is
a gold mine when it comes to understanding how to
communicate between Blueprints and just write logic and
functionality in Unreal Engine. So go ahead and watch
that if you're unfamiliar with event dispatchers. But they're fairly easy to
use once they've been setup. And so all we need to do here is
that when the grab component is dropped, we're going to go
ahead and look for a head, and then attach
the hat to the head. ALEX: Oh, and anyone who doesn't like to
watch videos and would prefer to see that stuff written out,
one, Zak is amazing. You should totally watch it. But two, docs.unrealengine.com-- VICTOR: Oh, nice call out. ALEX: There is a Blueprint
communications page. It's docs.unrealengine.com. Go there. Look up Blueprint comms. It's super important to learn
Blueprint communications and flow control. VICTOR: Skye is dropping
the links in both Twitch and YouTube here. Just stay tuned. They're dropping. And you can also
Google those words that we've uttered in the
last couple of tens of seconds. ALEX: Buzz word,
buzz word, buzz word, buzz word. VICTOR: Buzz, buzz,
buzz, buzz, buzz. All right, the first thing
we need for this logic here is to have a place
where we are tracing from. And a friend of mine
who's been working with Unreal Engine
for many years did not know about this feature. So I am going to show it. I like to do this
a lot with models. And it also makes it
easier for the sort of artist designer relationship
once you put this into production. You can author-- so I'm in
the static mesh editor here. And you can create a socket. I'm going to call
this hat socket. Now, this socket is just
a position and rotation relative to the
mesh itself that you can reference in Blueprints. And this is super useful,
because imagine if you had 3,000 hats, right? OK, maybe that one is a little extreme. But say you had 3,000 hats. And the Blueprint is just like, give me one of these models. And you want them all to be relatively accurate when you put them on your head. And you sort of want
the rim to attach, right? You don't want to trace
from the top of the hat. And some hats might
be top hats and tall. And some hats might
be really low and short. And so this socket allows
you to get a universal point that the artist can place. And the artist can just say like,
hey, this is where the rim of the hat is. And give that to the designer. And the designer can
then work with that logic. So we're going to get our
static mesh component. We're going to
get socket location. And that's all we need here. Socket name, hat socket. Very clever. If you didn't notice,
by default, since like 4.24, I think, if you hit Q,
you can run the align function, the straighten connections function. It just defaults to Q.
Very, very, very handy. My old friend Nathan
showed me that back in the day when it was not default.
And I have actually relearned it for Q.
It used to be Spacebar for me. Anyway, we're going to run a
sphere trace for objects right here. You can do any of them really,
but we're picking objects for this case. I'm going to make an
array for object types here. We're just going
to punch in pawn. If you remember,
we set the VR pawn's sphere to the collision preset Pawn. So then we'll actually set it
to be the object type of pawn. And so that's what we're
going to go ahead and look for with our top hat right here. Start, end, same thing. For a sphere,
it will just do it from there. We'll do something generous
like 20 centimeter radius to make sure that Alex
can hit his head when we trace for this. Hold B, left click,
drop a branch. Plug that in just to
make sure that we don't need to execute logic
unless we actually had a hit. We'll go ahead and
break the hit result here. Drop that down. And what we are
interested in here is actually, we can just do this off
of the component itself. Now, simply, if this is pawn,
and we hit pawn, we want to grab this component. And we're going to
go ahead and attach to. And so attach
actor to component. Target, self. That is us, the actor. And our parent is the
hit component that we hit, which we are only
specifically looking for object types of pawn. And I believe-- ah,
we need to make sure that our rules for location here are set to keep world. Otherwise it would
take the relative location that we were at previously
and put us somewhere else that we didn't want it to be.
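Put together, the On Dropped logic just described comes out roughly like this in C++. This is a sketch: the OnDropped handler and the Mesh member are assumed names, while the socket, the pawn-only 20 centimeter sphere trace, and the keep-world attach rule follow what was built on stream:

    // Sketch: when the hat is dropped, look for a head and attach to it.
    void ATopHat::OnDropped()
    {
        // "HatSocket" is the socket authored in the Static Mesh editor.
        const FVector Start = Mesh->GetSocketLocation(TEXT("HatSocket"));

        // Zero-length sphere sweep, 20 cm radius, Pawn object type only.
        FHitResult Hit;
        const bool bHit = GetWorld()->SweepSingleByObjectType(
            Hit, Start, Start, FQuat::Identity,
            FCollisionObjectQueryParams(ECC_Pawn),
            FCollisionShape::MakeSphere(20.0f));

        if (bHit && Hit.GetComponent())
        {
            // Assumption: stop simulating so the attachment holds.
            Mesh->SetSimulatePhysics(false);

            // Keep world, so the hat stays where it was released instead
            // of snapping back to an old relative transform.
            AttachToComponent(Hit.GetComponent(),
                              FAttachmentTransformRules::KeepWorldTransform);
        }
    }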
All right, moment of truth. Does Alex get a top hat on his head? ALEX: All right, can we do this? Can all of VR, XR,
and the metaverse come together to make
me like Johnny Mnemonic? Oh, wait, hang on. Eh, oh, other way. And this way. Oh! VICTOR: Oh! ALEX: Oh, oh! VICTOR: Ooh, woo! ALEX: Look who's
got the top hat now. VICTOR: Yeah. ALEX: Look at this gun. I need to get over
here and get a gun. VICTOR: Yeah, you know,
you can move the table. ALEX: There we go. I'm just going to move myself. There we go. So I got a hat and a gun. [CHUCKLES] VICTOR: Oh, snap. ALEX: Oh, yeah. VICTOR: All right. ALEX: This is it, y'all. VICTOR: And so from here on out,
I'm fairly certain Alex is
going to put on that top hat. You can even see
the rim in your view. That's kind of cool. ALEX: Oh, yeah, you kind of can. Hang on. Let me put it at a
jaunty angle for style so everyone can
see how fancy I am. VICTOR: For style points. [HUMMING] All right,
we're at the hour mark here. So I'm going to try to move on. We got some more
stuff to build here. ALEX: We got a bunch
of stuff to do here. VICTOR: But all right, so I want to recap a little
bit what we did here, and clean up my editor from
all of the materials and things that we are not using. I'm going to leave the VR pawn. I'm going to leave the top hat here,
grab component. That's all good. All right,
that looks a little bit better. So what we did here was
that we created a new Blueprint actor, top hat. We added the mesh
that we made in editor. We gave it a grab component. And then we wrote a little
bit of logic on Dropped. And so you saw
what happened there. I didn't actually mess with
any of the grab logic that's already been implemented. We are just expanding
on top of that. And this is really useful. It's a really useful way
to work as a designer because now you're not--
imagine this was a much larger grab system, and there was all
kinds of complex solutions there. You're now just building on top of that. And you don't
have to go and mess with the code that is actually
the foundation for that system. That's been tested. And it's working as intended. Here you're just
building on top of that, which is really, really handy. And like I said,
you just click grab component. And you have those
event dispatchers there. If you want to dive into that,
you can go ahead and find
them in the Try Grab function inside the grab component. The event dispatcher
is being called right here. And I've tried to add document--
or comments everywhere to make this as
clear as possible. You can also find it
on the try release here, where we actually have the call On Dropped. Good to know that these only fire if it was actually dropped, or if it was actually grabbed, not merely when it was tried. Now, that moves us
on to talking a little bit about the custom grab type. If you want to
build a custom grab type on top of this system,
let's go ahead and just copy the top hat
for the sake of demonstration: top hat two. We'll open top hat two. If you go ahead and set the grab
component to grab type custom, you would now
be required to use-- you can also go here
and reference on grabbed. You can do it this way as well. But when you do have a
component in Blueprints like this, you can also just go
ahead and use that function, or event if you prefer to. When this is set to custom now,
nothing is actually going to
happen unless you script that logic yourself
on grabbed and on dropped. The system itself is going
to just say, yes, I was custom. Let's go ahead and
start from try grab here. We're going to try to grab. And we're going to say,
hey, grab type custom? OK, we're just going to be held. You know what? I'm just going to let
the developer there deal with that themselves,
which means that the on grabbed function is going to be called. And what you're
required to do there is to actually punch that
logic in yourself on on grabbed and on dropped.
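As a sketch, an actor using the custom grab type might bind its own handlers like this. The Grab member and the handler names are assumptions, and the handlers would need to be UFUNCTIONs:

    // With grab type Custom, the system only fires the dispatchers;
    // the actor supplies all of the actual behavior itself.
    void ADialSketch::BeginPlay()
    {
        Super::BeginPlay();

        // Bind our own responses to On Grabbed and On Dropped.
        Grab->OnGrabbed.AddDynamic(this, &ADialSketch::StartTrackingHand);
        Grab->OnDropped.AddDynamic(this, &ADialSketch::StopTrackingHand);
    }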
A few suggestions for what that could be: something like a dial,
something I was experimenting with earlier,
like a lever, or any of those kinds of interactive objects you can build inputs into. Just build them on
top of the system. Or if you're feeling
adventurous, you can just go ahead
and expand the grab type enumerator, which you can
find under VR template Blueprints. It's right here. You can actually go ahead
and add more grab types. And that will allow you
to select them and have that functionality
sort of by default. Or you can just inherit
from this actor.custom. There are a myriad of
ways you can do that. Punch that logic in. I like punching in logic. [SLAP] ALEX: Whoa,
that was a good sound. VICTOR: I'm sure
that was pretty loud. ALEX: Can we get a
foley actor in here to-- [SLAPS] VICTOR: I'm sure
that's pretty loud, since your microphone
is sitting right-- ALEX: Got a good one. [LAUGHTER] Someone who is really into
foley is going to love that. VICTOR: Mhm, OK. So we have now worked a
little bit with the grab system. We have worked a
little bit with the VR spectator that comes in handy. Oh, I should note, we do have
documentation on most of this. I had that link somewhere,
VR template documentation. All right,
we'll go ahead, and it's in the forum announcement post. You can go find it. That's where we announce
all the livestreams. ALEX: On the forums. VICTOR: On the forums,
in the events category. And we call them the
forum announcement post. Moving on,
there is something here that-- there's also
another system that allows you to do cool stuff. And if you've taken
a look at the pistols, they are using the system
that I'm talking about. I'm actually not ready,
because this here needs to be fixed. ALEX: Yeah,
they have like the two things, where you can grab them on one side, but you can also pull the trigger to shoot them, cause it has multiple input types.
Alex has been paying attention. ALEX: I have been, somewhat. VICTOR: It is listening
for a Blueprint interface event. And this one is trigger pressed. Once again,
Blueprint communication documentation or a Blueprint
communication video by Zak Parrish will tell
you a little bit about this. But to use it,
if you're just looking to use the VR template as is,
this actor, the pistol actor,
if you go to Class Settings, you can see that we've
implemented the interface VR Interaction BPI. And if you are a C++ programmer,
you know that this basically acts as a parent for all of the
actors that are implementing this interface. What that means is that we
are able to talk to any actor that implements this interface
without actually knowing anything about that actor,
other than the fact that it's implementing
the interface.
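Sketched in C++, the pattern reads like this. The interface below is a hypothetical stand-in for the template's VR Interaction BPI, which is a Blueprint asset, not engine code:

    // A Blueprint-style interface plus the "message" call that goes with it.
    #include "UObject/Interface.h"
    #include "VRInteractionSketch.generated.h"

    UINTERFACE(BlueprintType)
    class UVRInteractionSketch : public UInterface
    {
        GENERATED_BODY()
    };

    class IVRInteractionSketch
    {
        GENERATED_BODY()
    public:
        UFUNCTION(BlueprintImplementableEvent)
        void TriggerPressed();
    };

    // Caller side: no cast and no hard reference to the pistol class.
    static void SendTriggerPressed(AActor* HeldOwner)
    {
        if (HeldOwner && HeldOwner->GetClass()->ImplementsInterface(
                UVRInteractionSketch::StaticClass()))
        {
            // Fires the event only if the actor implements the interface;
            // everyone else simply never receives the message.
            IVRInteractionSketch::Execute_TriggerPressed(HeldOwner);
        }
    }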
So if you go to the VR pawn, and you go down to the input trigger left and the trigger axis left and right, and vice versa, you can see that we are getting
the component that is held. We get the owner,
which is going to be the actor. And we're then calling the
Blueprint interface message trigger pressed. And in the pistol,
we are specifically listening for event trigger pressed,
spawning a projectile from the muzzle location. That's just a scene component that's sitting right here,
very typical to how the first person shooter template is set up. We spawn the projectile. And then we play
a haptic effect, which is for the controllers. Now, even though this is
the only piece of the Blueprint interface,
sort of input functionality that exists in the VR template,
you can extend it. Because I did add a couple of-- there are a few examples of
the Blueprint interface events here. If you wanted to pass,
say, the face buttons, you go ahead and add new. Face button one. I would do face button two. And that would be like A and B.
Why did that not work? I want a new function. There we go. Face button two,
and we would use these. You would also have to go
to the Project Settings, Input. And you will have to
go and add these events and listen for them so that
you can listen for them in the VR pawn, and then do the
same thing we're doing here, and call that on the actor,
or yes, the actor that you are currently gripping. It's a really useful way,
because there's no hard references here. So designers can just
go ahead and make as many pistols, remotes,
and dials and levers, and such and such. And if you set all of those
inputs up in the VR pawn, you don't actually have to do anything there. And you can leave
that checked in. And there's no problems trying
to grab the pawn back and forth over source control. Yes, a little mention of the
VR Interaction BPI there. Let's go ahead. Should we build
something for that, just to show how
that could work? ALEX: I don't see why not. Oh, did we decide if we're
going to try to do the ragdoll thing or not in this demo? VICTOR: We're not doing
the ragdoll thing today. ALEX: It was a bit-- VICTOR: Yeah, I have way
too much to cover for that. ALEX: Yeah,
it was a little bit, eh. VICTOR: Yeah, why don't we-- there's a chance I might
actually do a livestream next week, cause we-- ALEX: We have
a lot of cool stuff today. I'm just saying. VICTOR: Well, yes. That would be a
little bit [INAUDIBLE], though,
because I haven't planned anything. I might have to
come on next week. We'll see. Hopefully not. I want other folks to come
on and talk about the cool stuff they're working on. ALEX: Oh,
there's a game dev category now, as of 30 minutes ago. VICTOR: Wow, clearly,
we were too late. ALEX: I know. We did this all wrong, y'all. VICTOR: Skye, if you can add that, please go ahead and do that,
cause that would be great. ALEX: Thank y'all in
chat for mentioning that. That's awesome. VICTOR: I don't know, we should do a little bit
of celebration over that. [HUMMING] OK. ALEX: OK, that was cool. All right, back to VR. VICTOR: Back to VR. Yeah,
why don't we go ahead and just build an example of
what that would look like if you're making a
custom actor that actually takes some form of input. And why don't we just make
that the A button and the B button on the controller,
X and Y on the left, and A and B on the right, for
go through that. That seems fairly good. All right,
I've mentioned remote a couple times. So why don't we just go
ahead and make a remote, because it has buttons. It can be held. We have somewhat of a
TV that Alex is looking at. ALEX: Yeah,
I'm monitoring everything. Oh, you mean the TV-- oh, yeah,
look at the TV over here, dude. You mean the
screen that I look at. Yeah, yeah, yeah. VICTOR: The screen
you're looking at in the-- ALEX: In the VR space. VICTOR: VR world, yes. All right,
we'll go ahead and just add a simple cube here for the
sake of demonstration purposes. Actually, no, we'll just go
ahead and build it out here. Cause that gives me
some form of reference of how big this thing is. So scale in VR
is very important. If things don't seem like
the things in the real world, they just feel very unimmersive. And your reptile brain won't be like,
oh, that looks real to me,
because it's in stereo, and it has depth, and it's
the right size, and all that. So scale is very important. Not that I'm going to nail
perfect remote scale here. But OK, that looks pretty good. All right, Alt, drag to copy. Move that up a little. Try to find a good button size. All right, there we go. I'm going to ship that. All right, we got top button. We got A and the B buttons here. And since we already have
some nice materials here, we'll do that. We'll do purple. We'll do green. We'll do purple. We'll do green. And here's our remote. Once again,
Control select all of those. Right click Merge Actors. And now we will have
a nice little SM remote. We'll go ahead and delete
those from our working scene. And there we go. We have a remote. All right,
let's go ahead and put that inside our BP remote
Blueprint instead of this cube. ALEX: This is where
you just duplicated the hat again basically. Or is this the gun you
duplicated into this? VICTOR: Come again? ALEX: Did you duplicate
the hat or the gun to get this. I forgot which one
you started off with. VICTOR: I didn't
duplicate anything actually. I just made a new actor. ALEX: Oh,
I would have duplicated something I already did, because I'm lazy. But that's me. VICTOR: Yes,
but our top hat actor already had-- and the pistol Blueprint
is the skeletal mesh with a bunch of logic. I didn't want any logic,
because I'm going to write all the logic myself. So instead of deleting it,
I made it just a new Blueprint. ALEX: See,
now it's all making more sense. VICTOR: We'll call this remote. ALEX: It's all coming together now,
folks. VICTOR: We'll add
a grab component. And for the sake of it,
we will just go ahead and set that to snap,
so that this is held in the right way. The way that I would
like to orient this is to temporarily add a
static mesh component. And that will not be SM Remote. We'll call this
Controller Preview. And we'll go ahead
and find Oculus. It's V3 for the Oculus
Quest 2 controllers. I will just pick the right one. Oh, did I-- oh, yeah,
it's just really big. Yes,
my remote is just really small. Yes,
because apparently I punched that in. Controller preview, which is
up there now and still really small. What happened? Oh, it's all of the-- me doing bad stuff. Unreal being smart. That's how it works. ALEX: Yeah,
it's using like the parent information as opposed to
being absolute, or whatever, right? VICTOR: Yes. ALEX: Yeah. VICTOR: And so
what I've done here is that I've added
just a controller model as a child of
the grab component. And I'll go ahead and set
this to be editor only, just as a precaution in case this
would happen to get shipped. I'll also make sure
that collision is set to no collision, no overlap. This is only for
visual purposes. What you can do
now is that you can just grab the grab component,
and you can align that with the controller to
find like the orientation that you would like
to hold the remote in. And you can try-- you can add-- that
was a Control-W. I'm not sure what
happened there. You can go ahead and add more,
and then pick the Vive controller, et cetera,
here if you want to align them. Let me go ahead and do the new ones,
right. Oh, that's the Cosmos. Well, you know what? That works. They both need to be
children of the grab component. That's important to make
sure that we zero out all of these here. And this is just to
give you an idea of sort of how all
of these controllers are going to be grabbing this. Now,
remember I said the Oculus Quest ones were a little bit offset? And that's why you're
going to see these two here being a little bit different. Will be gripped
somewhere about there. Cool, I'm going to go ahead
and delete one of those because it looks
a little bit weird. All right, we now have a
remote that can be grabbed. Oh, you know what? I'm going to make sure that
this is set to hidden in game as well. [INAUDIBLE] is asking if
the Cosmos is emissive. It might actually be. ALEX: Oh, that material-- VICTOR: Dive into that yourself. Yeah, I'm not entirely sure. It might be. I don't know if the real one is. I have the Focus 3. I don't have the Cosmos. All right,
so what we wanted to do here was to demonstrate now
how you can implement your own sort of logic. You can now grab this. It's set to snap grab. We oriented the grab component
relative to the remote mesh in a way that it should
feel relatively nice to grab. And remember, it's across
the board with all the controllers that you could be
using with OpenXR. But on top of that,
we now want to listen for any events that the controller
or the player is pushing on the controller. So we'll go ahead and
go to Class Settings. And we'll go ahead and
add the VR Interaction BPI. Now, earlier,
I already added the face button one and the face button
two events here, or functions, I should say. And so we are going to make
sure that we have that input actually being listened for. Like we actually need to
know when those buttons are being pushed somewhere. And we do this
in Project Settings. And this is where you
can find all of the inputs that have been set up. I would also suggest you
read up a little bit on the input system if you don't
know how it works. But it's fairly straightforward. We have our action mappings,
which are sort of binary, pressed and released. And we have our axis mappings, which are going to return a float value. So those work really well for stuff like triggers or thumbsticks, which are going to be axis mappings. Alex, can I have-- I got more controllers. OK, yeah, give me that one. ALEX: Just take that, yeah. VICTOR: Thank you very much. Buttons like these, like the A
and B buttons, are very useful. For those, we're going to do action mappings. And so I'm just going
to go ahead and add-- we'll call this-- I like to call
these generically, like face button one and two. Because you should
definitely not call them A and B.
Because the Oculus has like X and Y, and the A and B are all different. So, face button one.
need to do face button one. We'll start with left, zero. And we'll do Oculus. ALEX: Thank you to whoever posted that on the Unreal
Engine official channel, setting up inputs. VICTOR: X press. Is X the bottom one? I think X is the bottom one,
right? ALEX: Yes. VICTOR: Yes,
X is the bottom one. So that will be face button one. And we'll go ahead
and add face button. So OK,
I am adding left and right here. You don't need to do that
unless you actually have an object that you're going to grab with both hands, using both inputs at the same time on the object you are holding. So let's just do it
that way anyway. Face button two, left. Yeah, it's best to be thorough. Let's be thorough. And this will be Y, Oculus Y.
Y Press. All right, and we'll go ahead
and add the face button one, right. And that is Oculus A press. And we'll add another one,
last one, face button two, right, Oculus B, B press. All right, now that we've set
up these action mappings, we can go ahead and listen
for them in the VR pawn. So go ahead and right click. Face button one, left. And face button two, right. I'm just going to say,
Alex, so far I've not had to restart anything. And everything
is actually working. So it looks like we're
going to be able to finish the stream without a restart. I restarted my computer
like eight times in the half hour before we went live. We have those events right here. I'm just going to trust that
they're going to execute and that I did everything right. And so what we want
to do here is basically do similar logic to what
we got over on the trigger. We want to grab the components
that we are supposedly holding. That's why this is a
validated get function here. I'm just going to show
you how that works. If you're getting a held
component left here, if you grab that reference
to that variable, by default it looks like this. If you right click, go down
to convert to validate and get, you get like the variable
reference and the is valid check all in one. Very handy. It makes the Blueprint look really,
really nice and smooth.
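The pawn side of this, sketched in C++: bind the new action mappings, do the validity check, and forward the message to whatever is held. The mapping names and the HeldComponentRight member are assumptions that mirror the naming used on stream:

    // Sketch of the VR pawn listening for the new face button mappings.
    #include "Components/InputComponent.h"

    void AVRPawnSketch::SetupPlayerInputComponent(UInputComponent* Input)
    {
        Super::SetupPlayerInputComponent(Input);

        Input->BindAction("FaceButton1_Right", IE_Pressed,
                          this, &AVRPawnSketch::OnFaceButton1RightPressed);
        Input->BindAction("FaceButton1_Right", IE_Released,
                          this, &AVRPawnSketch::OnFaceButton1RightReleased);
    }

    void AVRPawnSketch::OnFaceButton1RightPressed()
    {
        // The validated get: only proceed if the right hand holds something.
        if (USceneComponent* Held = HeldComponentRight)
        {
            // Same interface-message pattern as trigger pressed above.
            SendFaceButton1Pressed(Held->GetOwner());
        }
    }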
ALEX: The inverse of this is, like, if you're doing a cast too, you can do a pure cast
by right clicking on it and then getting rid
of the is valid check. So there's like a few nodes. I realized this when I
was teaching people. There's a few nodes
that if you right click them, they have like
secret extra functions. So you guys got to try
right clicking random stuff and seeing what you get. Also, in animation Blueprints,
there's a lot of times that it's like add extra
inputs or add extra outputs. And it's all in the
right click menu. People won't realize
that they're there. VICTOR: I am just adding. ALEX: Oh, knock on wood, because you had said
things are going too well. Had to knock on wood. [CHUCKLES] VICTOR: I shouldn't have said that,
should I? ALEX: Thanks, Deviver. "De-viv-er?" "De-viver?" VICTOR: "De-viver" is how
I would pronounce it, yeah. ALEX: All right. VICTOR: But you can
correct us if we're wrong. ALEX: Yeah, yeah, yeah. VICTOR: Yeah. I just went ahead and
added pressed and released for both of them,
since that's what we're actually getting from the VR pawn here. So we're calling face
button one pressed on the owner of the potential-- if it was valid,
if we were holding something, the component. We'll go ahead and do
face button one released. ALEX: Deviver, OK. There we go. VICTOR: And here, we do not want held component left,
because that's our left one. We want to look for the right. Go ahead and copy those,
paste, spaghetti. ALEX: So we're getting a lot of questions about the
metapet April Fool's thing, like the metapet. Oh, by the way,
I did actually locate that dog model. The artist who made it,
she's fantastic. I did locate the dog model
inside of stuff. And really,
it was just made for that one prank. It doesn't do anything more
than what you see in that video. So just a heads up. I know a lot of people were like,
dude, you got to go on to the
internal stuff and get it, and show it off. Like it's really
just the one video. Sorry, y'all. VICTOR: Wah, wah. [LAUGHS] ALEX: Yeah. The googly eyes is real, though. That is actually a real thing. VICTOR: Did a
livestream on that. ALEX: Yeah, yeah, yeah. That was fun. VICTOR: All right, so we've now hooked up the action interface,
or sorry, the action input event in
the VR pawn on pressed. We're looking if we
have something held. And if we do,
we're going to call the face button one pressed interface function. And what's great here is that
our logic here doesn't actually care if the owner of our held
component implements this or not. It's just like, hey,
just send this event. Send this message. And if it's listening for it,
great. And then it can
deal with its thing. If it doesn't,
then it probably doesn't care, which is the case for
most of our objects out here, since none of them
are listening for it. But for our BP remote,
we do want something to happen when we're
calling these functions. And since we did add
our implemented interface, VR Interaction BPI-- check one. OK,
I'm just making sure I'm not muted. We'll go ahead and right-- oh,
I shouldn't have said anything. Ah, there we go. OK,
I'll go ahead and right click. There we go. And now face button one pressed. And you'll see several here. What you actually want,
you want the event, right? You're listening for
when the event is called. You don't want
to call the function. You want to listen for when
the face button one is pressed. We can delete all of this. And also,
because we're not going to do anything on tick, we'll just go ahead and disable that. And we'll save a
little bit of overhead. And now,
this remote here when it's grabbed, we should actually call this. And why don't we do this the
proper way of actually seeing if this logic works
before we start working on some more logic. So Alex-- Oh, I've got to add
the remote to the world. We'll get a remote
going in the world. And we'll snap it, if it has
collision, to the nearest vert. All right,
we'll go ahead and hit play. And hopefully, yep,
you're good to go, Alex. You should be anyway. ALEX: Can I have my hand back? VICTOR: Oh, you need your hand? ALEX: Yeah, I kind of need that. VICTOR: You know,
you could use your left hand. ALEX: I could. I guess I could. Eh, all right, let's see here. Oh, I got to have to back it
up and not walk into the table there. Aha. VICTOR: Aha, ah, all right. Could you now go ahead,
and when you're holding it, can you press the lower
button on the controller? ALEX: How do I-- A? oh, ah! VICTOR: Oh, oh. ALEX: Nope, wrong button,
wrong button. VICTOR: OK,
so here's an interesting thing. That A button on
the right controller is actually assigned
to teleport as well. ALEX: Yeah. VICTOR: So we're going to
go ahead and get rid of that. That's an accessibility thing. ALEX: The B button, the trigger? VICTOR: Let's see. Go ahead and go here. Movement axis, right, Y.
Yeah, here, we'll go ahead and just remove that. And now, let's see if
we've got that going for us. See,
I can't actually do anything. Oh, I can go ahead and
bring up questions here. ALEX: Oh-- Oh, there we go. OK, so you said which
thing is to activate it now? VICTOR: Yeah, the A button. Same that you were
using to teleport there. ALEX: Oh,
but when I'm doing this-- VICTOR: Yeah, I just want
to see the hello print string. Or at least that's
the goal here. ALEX: Are you seeing it? VICTOR: We are not seeing it. So let's go ahead and debug. And so we take it from the top. We're going to make
sure that face button-- Oh, it's because we
used face button two right. We want face button one right. ALEX: Yes. Yeah,
that would have been the B button. I could have got it to work. VICTOR: You could have, yes. ALEX: I didn't realize. VICTOR: All right. ALEX: Debugging is fun. All right. VICTOR: Hello, hello, hello. That's what we want to see. ALEX: Oh,
I can see it in the corner of my eye. VICTOR: All right. So here you can see how
we've set up some new custom logic with more inputs
for the controllers. And then we are listening for
that using the VR Interaction Blueprint interface
event face button press. But just printing hello is
not a lot of fun for a remote. So let's go ahead and interact
with like the VR spectator or something. We'll make this real easy. We'll break some rules here. We'll just get all
actors of class. I know. I know. But there's nothin-- ALEX: There's
nothing wrong with that. VICTOR: No. ALEX: It's on button press. VICTOR: Yep. ALEX: There's
nothing wrong with it. VICTOR: Yep. ALEX: It's not on tick.
It's not on some uncontrolled tick or something. VICTOR: You know what? I'm not even going to care
about making sure this is valid, because I know what's in there. And we're just
going to go ahead, and why don't we enable-- ALEX: Boy,
that's the game developer logic that'll get
you in the long run. VICTOR: Yes. ALEX: I know it'll be valid. Oh, sure you do. VICTOR: OK, OK, fine. Here, is valid index. ALEX: One little is valid check makes
Alex so much happier. VICTOR: It does. And then we'll do an and here. And we'll is valid. ALEX: Beautiful. VICTOR: Tap these in, B.
Hold B. Click left. All right, there we go. Now, all fine and dandy. Nothing will break, hopefully. All right, we have the VR
spectator object reference here from our get all actors of class. And why don't we go ahead and check is enabled? Because there's
actually enabled-- get spectator enabled. If we are enabled, we would
like to simply just disable it. This will give Alex
power over me. ALEX: Power. VICTOR: Power. And just to make
this real nice here, we can go ahead and
put all this into a function. I just mark all of them, right click, collapse to function. I will call this spectator on/off toggle. ALEX: Ah, the collapse feature. Instantly you get
your head space back, right? It just feels so much freer. VICTOR: And why I love this is because here I can
promote to a local variable, local spectator ref. And now we don't have to drag
a whole lot of spaghetti around. We can make this look
a little nice and tidy here. ALEX: And don't forget, y'all, you can always double
click on any of your wires in order to reroute them. VICTOR: Very, very nice tip. ALEX: And you can press
Q to make everything line itself up. VICTOR: So if we are enabled,
we want to disable. And if we are disabled,
we want to enable.
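Collapsed into code, the toggle comes out roughly like this. AVRSpectator and its enable and disable functions are assumptions standing in for the template's Blueprint spectator:

    // Sketch of the spectator on/off toggle, with both validity checks.
    #include "Kismet/GameplayStatics.h"

    void ABPRemoteSketch::ToggleSpectator()
    {
        TArray<AActor*> Found;
        UGameplayStatics::GetAllActorsOfClass(
            GetWorld(), AVRSpectator::StaticClass(), Found);

        // The is-valid checks Alex asked for: index first, then pointer.
        if (Found.IsValidIndex(0))
        {
            if (AVRSpectator* Spectator = Cast<AVRSpectator>(Found[0]))
            {
                Spectator->SetSpectatorEnabled(
                    !Spectator->IsSpectatorEnabled());
            }
        }
    }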
All right, so we're doing a little magic here, as if there were some
form of ray being thrown. But this right now,
if everything is hooked up correctly, it should allow
Alex to have a custom object. And the only logic he
had to write was the custom
logic that he wanted. ALEX: All right, so we're going to go up in here. I'm going to pick
up this remote. All right,
so how do we know it's working? Don't you have to like
fly around or something? VICTOR: If I'm flying around and you're hitting that button-- oh,
yep, see, now you've got the power. ALEX: I press the button
and you can't move. You can't move. VICTOR: I can't. ALEX: And then I press it. And now we all move again. [LAUGHS] And I press the button,
and you can't do anything. And I press the button,
and we're going. And I'm going to get my hat. VICTOR: Get it on there
while I get some questions up here. Cause I do want to make sure
that we cover some of them. While Alex is
having plenty of fun. ALEX: Yeah,
please, this is great. I got to fix my hat now. Hang on. VICTOR: It's good
that you're having fun. [MUMBLING] ALEX: [INAUDIBLE] my hat. VICTOR: All right,
we got two pages of questions, cool. I'm going to dive into
them in just a moment. ALEX: Aw. VICTOR: There is
one more sort of feature that the VR template
comes with that I wanted to cover in detail. And that is the built-in menu. Alex, can you do us a
favor and put the headset-- ALEX: Oh, wait. [INTERPOSING VOICES] ALEX: There you go. VICTOR: Can I have it? OK. ALEX: There you go. OK, and you need the hands. VICTOR: Someone said,
Victor, why are you not donning the headset? I was like, well, I will. I will here. And I can show this off. Got going. All right,
so on most controllers, you can click both the left
and the right menu button. However, on Oculus they
have that right one hard coded for their system menu. And so the left one is
the only one that works. If you tap that,
the little simple menu comes up. You can use the left thumbstick
to move the cursor around. But you can also just
take your right controller and just point it at the menu. Oh,
I lost track in there for a moment. Cause there's not
a lot of reference. There's like a translucent
bottle behind. And now you can also
use the right controller and point it at
the left controller. I think we might be
losing battery here. Anyway, it's really nice. This system just automatically
switch back and forth. I can use my thumb stick
to move back and forth. Or I can just point the other controller at it. Why I like that is because
it's an accessibility feature that allows folks who might
not have two hands to be able to use the menu. Otherwise, you prevent them
from playing the game entirely. Now, I do know that some
design requires two hands. But I think you
should always try to strive for the possibility of
playing the game with just one hand. Also makes it really
nice when you're working on it,
because you don't have to pick up both controllers
every single time you're debugging something. OK,
I'm going to give this back to you. ALEX: Oh, yeah. OK, OK. I'll tell you,
like way back in the day, Lauren, well, Ridge back then,
Barnes now-- Lauren Barnes, our programmer. We used to have
this issue where it's like you know how they have
like the internal detector, right? VICTOR: Mhm. ALEX: Same thing with like,
oh, everything has to detect something. And what she would
do is she would just put like a piece of tape
on the inside of this thing so it would never turn off. And she could
just keep it going. VICTOR: That's clever. There's actually a
tool specifically for the-- ALEX: Oh, yeah,
they got tools now. VICTOR: Yeah. ALEX: They got stuff for it now. This is back-- VICTOR: You can disable it. ALEX: This is in
the before beta days. VICTOR: You can disable
the proximity sensor. ALEX: Oh, yeah. VICTOR: Oh,
it was like operating like three devices at the same time. ALEX: Oh, yeah. VICTOR: Where was I? Remote. We are talking about the menu. Great, thanks, Alex. ALEX: Yeah, you're welcome. VICTOR: You'll find
the menu consisting of a Blueprint actor,
which is just called menu right here. There's a bit of logic in here. Some of this is
a little bit complex in how it's switching
back and forth. If you just want to add
options to the menu, and you like the
functionality that's there, all you have to do is to go
ahead and find the widget class that is applied to the menu. You can find it here underneath
the 3D widget component. Go read about 3D
widget components if you don't know
how this works in VR. The same goes for
how we are actually interacting with this 3D widget
using the widget interaction components that are
attached to the VR pawn. I'm not going to go into that. What I wanted to show you
is how you can extend and add your own options to the
menu in the VR template. The widget menu is
what that class is called. We can go ahead and find it. It's also just under Blueprints. Go ahead and open that up. And this is where you can
see the two options, Restart and Real Life. If you want to add another button,
you just go ahead and Control, mark these two. Control-C. Select the
vertical box that they live in. Control-V. And now we have another option. We can click the text
component in here. And we can pick. Alex, what are we doing here? What should the menu do? ALEX: Well,
so real life doesn't actually-- does that exit the
full application? VICTOR: Yes,
that exits the application. ALEX: Oh, I wasn't sure if it takes you back into
Oculus or if it just shuts it. Oh, OK, you could open a level. VICTOR: Open level, I like that. ALEX: Open level is a
very commonly needed little quick button. VICTOR: Yes, and I can tell you that that's already
what the restart button-- ALEX: Oh,
that's what the restart button does. VICTOR: That's what it does. ALEX: Well, yeah,
it's an easy open level. And then it restarts the level. That makes sense. It could either be that,
or let's see here. We could just like
change a random value in something of course. VICTOR: Change a random value. ALEX: Oh, yeah, like a dev tool. Like make it a dev tool
button that changes a value. VICTOR: Like a console command? We can do a console command. ALEX: Oh, yeah, there you go. Make it run a console command. VICTOR: OK. ALEX: Yeah,
make it run a console command. I love console commands. VICTOR: We'll call it
mystical console command. Does this even work? That's terrible. I'm just going to write
console command. ALEX: There you go. VICTOR: Console commander. All right, console command,
beautiful. All right, so we should go ahead
and name this button as well. So we'll call it console
command button, just like that. Compile that. Always save. Go back over to the graph,
which is where we write all of our logic for our widgets. And we'll say, console
command button, and on clicked. One of the reasons why I
wanted to show the widget portion of this is because when
you are using VR controllers, and you're pointing at a menu,
by default, the UMG,
Unreal Motion Graphics, is designed to be used
with like a mouse primarily. And so what that means is
if you look at these buttons here, by default we're
using the on clicked event. In VR, that is not always
the right one to use. And the reason why
is because it's really easy to have like shaky hands. Some people do. And if you're using
the on clicked event, you need to press
all the way in, like the trigger,
or the button if using that. And you also need to
release on the button for the clicked event
to actually execute. And so what you might
want to use instead is the on pressed event, which only requires the user to aim at the right spot when they press it down, and not also when they release it. Should probably have done
that here for the default ones, cause it makes it a lot easier. But that's what we're
going to go ahead and do for the console command button. We're going to
do that on pressed. And then we're
going to just execute. I like the idea of adding
debug options here. Execute console command. And something that we're
all always very, very particular about when we're doing VR
is what frame rate are we at? So I'm just going to go ahead
and add the command, stat FPS. And so this should give Alex the
opportunity to open up the menu and turn our FPS
counter on and off.
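A sketch of that handler, assuming a hypothetical widget class. The two things to take away are binding On Pressed rather than On Clicked, and the stat FPS command itself:

    // Bound to the new button's OnPressed event in the menu widget.
    #include "Kismet/KismetSystemLibrary.h"

    void UWidgetMenuSketch::OnConsoleCommandPressed()
    {
        // stat FPS toggles the frame rate counter on and off per press.
        UKismetSystemLibrary::ExecuteConsoleCommand(this, TEXT("stat FPS"));
    }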
ALEX: All right, let's take a look-see here. All right, Menu,
Console Command. There you go. VICTOR: Oh, there we go. ALEX: Oh, yeah, there it is. VICTOR: And like I've said, my computer is not performing
the way it's supposed to. It's actually really strange,
because it's jumping between all kinds of numbers. And there you restart the level. ALEX: Yeah. VICTOR: Beautiful. So that's how you can add
a custom button to the menu. It's fairly straightforward. Remember that it needs
to go in the vertical box, because there's
actually a little bit of logic here in the menu that
makes sure that you can't move the cursor out of the box. And the vertical box
is what we're using. So just put it in there. All right, before I continue
with some of the extras I have added,
I do want to cover some questions, because I am sure
there are some things. ALEX: Oh, yeah, let's see. What questions do we have? VICTOR: We have a lot of-- here,
can I share this? Oh, no, you're not on
a PC. You can't see this. So it will have to be all me. ALEX: Read them out loud and I'll see what I can do. VICTOR: OK,
we can switch-- [COUGHS] Well, I'm going to pick them. And then I'll read
them out loud. Cause there's no point
in us picking questions that we can't really answer. ALEX: Oh, yeah, yeah. VICTOR: Samir
Ghosh is asking, "For features like Oculus
Quest pass through, can we expect to
use OpenXR in 4.27?" No, that is not a feature yet; it's not part of the core OpenXR specification. However,
I am sure Oculus is working to implement that as part
of their extension plugin. I really hope so. Cause I would love
to work with that. ALEX: Yeah, it'd be cool. VICTOR: Freemason is asking,
"Is UE4 VR OpenXR only?" No, in UE4 you can use Oculus VR, SteamVR, and OpenXR. And it may be important
to talk about the difference between sort of Oculus VR,
the SDK, Oculus VR, the runtime, and like Oculus VR,
the plugin in Unreal Engine. Cause actually, it's different
how you work with them. In the VR template, we only
have the OpenXR plugin enabled, which if you're using
an Oculus device, you're most certainly using
the Oculus runtime, which is the Oculus app
in the background. I mentioned this
earlier in the stream, but it might be
worth doubling down. If you start a new
blank project in 4.27, and you have an
Oculus device plugged in, you will actually be using
the Oculus VR plugin, which is Oculus' native SDK that
is implemented in Unreal Engine, and not OpenXR. That actually gets priority over OpenXR. And so I think the question
here was in regards to if it's OpenXR only? And that is not true. We still do have the 4.27
version of the Oculus VR plugin, as well as the 4.27
version of the SteamVR plugin. OneLearner here is asking, "When sharing assets
from a VR project, like the sample Apollo 11,
to another 4.26.2 project is there a trick to
getting the files to work? Have they been optimized in
such a way they are harder to use?" A little bit of a
tricky question. When it comes to
specific projects like that, like the Apollo
mission where we're doing like pixel streaming
over for a Hololens, there's probably
a lot of stuff that's been configured
in the INI files. And you should probably
just use that project. If you want to just
use the assets, you can go ahead
and export them. But if you're trying to
grab all of the functionality in that project and
move it somewhere else, I would probably just build
it on top of the content that's already in that project. That would make
that a little easier. Let's see. Same duplicate
pass through question. [INAUDIBLE] is asking,
"Is Nanite and Lumen going to work in VR in my lifetime? I expire next year." That sucks to hear. We are working on making our
new high end rendering features work in VR. However, these are such
expensive rendering operations that actually
getting them to run at what you would be
considering a decent frame rate, so at least like 72 FPS, even on a PC, that's very unlikely. ALEX: Yeah,
especially with what Lumen is. Like I understand if
somewhere in the future Nanite,
because it's like this thing that helps you better compress and
understand polygons, maybe. But with what Lumen is in the end, it would take a lot more than that. I don't know, yeah. I would say we'd have to bring
in like one of our senior devs and really-- cause the more I'm
thinking about that, it's like it takes time. I don't want to say-- I would never say
no on anything. You can never say no. VICTOR: I mean,
hardware will get faster. The code will
get more efficient. And at some point
they all will work. But having them run at
90 FPS in the next year-- ALEX: In the next year-- VICTOR: I'm going
to say that that's a no. ALEX: Love your optimism,
though. Really like the optimism. I really hope so as well. It will work eventually,
but who knows what-- yeah, basically. VICTOR: And you might say, so why is Epic even working on making these rendering features work in the editor right now if they don't run performantly? There are enterprise
applications where turning on ray-tracing,
and Nanite, and all these things actually work
even with a lower frame rate. And so at the very least,
having the features functional in multi-view is
something that we'd like. ALEX: Yeah. VICTOR: Let's see. JimmyAnimAll is wondering,
"Is it possible to use pixel
streaming for AR project?" So Windows Holo remoting
is actually pixel streaming. And so, yes, you can do that
with the Hololens, Hololens 2. I am not sure how
you would do that with sort of a handheld AR,
like ARCore, ARKit. I don't know. ALEX: Yeah. VICTOR: Let's see. Sigrid is wondering,
"Can we get an Unreal VR tutorial on collision and
physics-based free locomotion? For example, like Boneworks. It's not as straightforward
in Unreal as it is in Unity." Because of that being
such a specific thing, and it can be implemented
in many different ways-- and doing that in Blueprints
would be very specific and a lot of hard coded things,
I don't think we'll
produce a tutorial on that. However,
there are resources in the community. Folks have already done that. So if you want to
find out how to do that, there are sample projects,
marketplace packs, and other various plugins,
et cetera, that are doing that
kind of stuff already. So you can go ahead
and search for that. Bushfire97 is wondering,
"Will all of our previous projects work on the new OpenXR? Can OpenXR effect corrupt
any of the exports to Quest?" I mean, is there a chance? ALEX: This is software. There's always a chance. VICTOR: Yeah,
pretty much what I wanted to get to. ALEX: I've learned to never talk in definitives here. VICTOR: Yeah,
the main difference you'll see, if you were using
Oculus VR previously, so you were running 4.26,
and you were not using OpenXR, is some of the stuff I talked about,
about like the motion controller orientation. You're going to have to
redesign your implementation for your motion
controllers based off of the new orientation. That's one thing. That's one main thing. You do not have hand
tracking using Oculus Quest and OpenXR right now. If you want Oculus
Quest hand-tracking, you actually need to
currently use Oculus VR. But it won't be too long before
the Oculus VR OpenXR extension plugin supports hand
tracking on Quest. And so that's what I was
mentioning a little bit earlier when we started about actually
shipping on the Oculus Store right now. The Oculus OpenXR extension plugin,
that is actually enabled in the VR template,
and we're shipping with that. I should have
mentioned that earlier. ALEX: Oh, yeah, in chat, Deviver did answer one of
your previous questions about Boneworks physics stuff. There is on the marketplace
something for that that they created. So yep,
just going to throw that out there. VICTOR: Just yep,
throw it out there. ALEX: I love our community. It's like every time I'm like,
oh, I don't know if I could do that. VICTOR: Post a link. Post a link. ALEX: Yeah, post the link. And there's always like a
good plugin or something that someone's been working on. And it's awesome. It's really great. Yeah, thank you. Please, post that. That's great. VICTOR: Please, keep-- it's always great to
hear what kind of content you're looking for. There are occasionally
opportunities for us on the livestream to cover
things that we otherwise wouldn't have done. And so make sure you fill
out the survey later when we drop it in chat. ALEX: Oh, yeah. I saw a question earlier,
but I don't think it got prompted with question. And it was just something like,
I want to work at Epic Games. What do I do? Or something like that. And I just wanted
to address that. You can go to
careers.epicgames.com. And you can see
everything we have open. And so we are
very cool with just come in,
drop your resume in the thing. Careers.epicgames.com. Just want to throw
that out there. It's not a grand secret. It's just that we don't
always advertise it very well. VICTOR: Quick mention of
the Oculus OpenXR plugin. So this is Oculus
OpenXR extension, which is the plugin that
they will be working on to support some of the
specific features to their devices. So keep an eye out
for when that gets updated with some of
the features that it currently doesn't have,
like hand tracking, for example. Early days, folks. It's a good ride to be on,
though. Just being able to see the-- like I've shipped VR games that
works on both Oculus and Vive. And previously
like the logic would be so hard coded between
the two different platforms. And here,
the fact that I can just-- I actually generally
have my Vive plugged in, and the Oculus Quest. And all I have to switch is
just the OpenXR runtime. And I restart the editor. And now I hit Play in VR. And I can switch
between the two devices. And that is super handy. Device visualization,
just rendering the right controllers that I'm using, it's really,
really, really, really handy. Let's keep going
through the questions. Good questions here. Keep them coming. ALEX: Yeah, they're really good. VICTOR: AntiAnti is wondering,
"Can I get platform independent
rotation of fingers with OpenXR?" Currently the only
hand tracking that we're supporting with OpenXR is through the get motion controller data function. And this is actually
a good question. It leads me into a
topic I should talk about. Here in the motion
controller data function, we actually do have hand key
positions and some of the data that you might want to use.
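For reference, a hedged sketch of querying that data in C++, using the engine's FXRMotionControllerData struct as it exists in 4.26 and 4.27:

    // Sketch: reading device-agnostic controller and hand data.
    #include "HeadMountedDisplayFunctionLibrary.h"

    void AVRPawnSketch::QueryRightHand()
    {
        FXRMotionControllerData Data;
        UHeadMountedDisplayFunctionLibrary::GetMotionControllerData(
            this, EControllerHand::Right, Data);

        if (Data.bValid && Data.DeviceVisualType == EXRVisualType::Hand)
        {
            // Hand key positions are only populated on platforms whose
            // runtime exposes hand tracking through OpenXR.
            const int32 NumHandKeys = Data.HandKeyPositions.Num();
        }
    }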
This is currently only functional for the Microsoft HoloLens
and their hand tracking, not for any of the other ones. I am going to write up a post
on the forums at some point how to use index
finger tracking, and how to build that
with the-- cause Oculus, these have touch sensors. Even though they don't
know what all fingers are doing, they do know what your
middle and your pinky-- sorry, your index, middle-- I know English-- index,
middle, and thumb, they do know where they are. And building a generic
system for all of that is something that I really
want us to have a sample for. ALEX: Yep. VICTOR: And I
know that's tricky. I haven't figured it out myself. ALEX: You got to also throw in-- I will totally dust
off my Leap Motion if we can get that
hand tracker working. I love the Leap Motion so much. I feel like it was
under utilized. VICTOR: Have you tried
the hand tracking in the Quest yet? ALEX: You can
actually see your real hands? I haven't done that, no. Oh! VICTOR: OK, well,
I apologize, everyone. I did not know that
Alex had not tried the hand tracking in Quest. ALEX: No, I haven't. VICTOR: So the last time you did that was with the Leap Motion? That was quite a while ago. I'll make sure you get a
demo of that before you leave. ALEX: Oh,
the careers at epicgames.com thing does reroute to a different website,
by the way, everyone, but that does work still. I got a question about it. Yeah, that's so funny. VICTOR: OneLearner asked,
do you always use a sphere for collision instead of the VR camera controller thingy? There are very few times
there's an always in game development, my friend. ALEX: Yeah, very few. VICTOR: Very few times. ALEX: Contextually
in this instance, for what we were building,
it worked very well. VICTOR: For the way
that I wanted it to work-- ALEX: Yeah, exactly. VICTOR: That's what I wanted. ALEX: Oh. VICTOR: There's
a lot of personal-- I like the collision sphere
because it's generic enough. And you can always
reference that. Say you also had a device
visualization for the head mounted display model. You were switching those based
on which device the user was using. Which by the way,
we did update the get HMD device name function. This will actually return Quest one,
Quest two, index, Vive, et cetera. So you can get that. But we do not yet have device
visualization for the camera component like we do with the
motion controller components. Maybe in the future. Let's see here. Oh, OK, LeRemy.
I think LeRemy is-- let me clarify the
question here. That's not the question. "Once you have selected
the OpenXR plugin and disabled the other plugins,
you can change--" Oh, you're answering someone. ALEX: Oh, OK. VICTOR: Yes,
so if you are developing a project in Unreal Engine using OpenXR,
you as the developer, you can work entirely with
just the Oculus, or the Vive, or Windows Mixed Reality. And then when you ship that,
your user will probably have one of those,
or should have one of those runtimes enabled. And they have to pick that as
the OpenXR runtime to be used. The only time that
you as the developer have to switch
those is if you want to, A, try a different runtime. Because you can actually
use like an Oculus device with SteamVR as the OpenXR runtime. It might be a little
complicated there. I generally just go
with the native ones. But you can. And the only times you want
to change that as a developer is if you are testing
different devices and you want to test the
different runtimes with all of the hopefully agnostic
logic that you've written for your game, or experience. OK, let's see. Joe [INAUDIBLE] is asking,
can I use the grab on a static mesh to, for example,
grab a wall and climb? No, it is designed to grab an
object and attach that to you. However,
climbing logic is not very difficult. ALEX: Yeah,
it's basically doing it in inverse. VICTOR: Yeah. ALEX: Where it's like
as you pull something, you're just going to be moving
the view, really the HMD's perspective more than anything. So if you're trying to do
something like the climb, then yeah, you just have
to invert some of your math basically. Apply it inversely. VICTOR: Here's the trick. This function, negate vector. It's basically-- ALEX: Oh, yeah, I always forget that that's there,
negate vector. VICTOR: Get your
motion controller. Get world location, like on
tick, right? You want to, yeah, on tick. Negate that. And then add this to-- ALEX: Yeah, you've basically got to get the delta of the
difference between where it was to where you're moving. And then you just
take that and use that. VICTOR: You can just
add an offset-- ALEX: Yeah, oh,
yeah, there you go. [INTERPOSING VOICES] VICTOR: Add world. Add actor world offset. Yeah, add actor world offset. Yeah, this here.
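The climb idea in sketch form; the member names here are assumptions:

    // Sketch: per-tick climb movement. Pulling the gripping hand one way
    // moves the pawn the opposite way, which is the negate vector trick.
    void AVRPawnSketch::TickClimb()
    {
        const FVector HandNow = ClimbingHand->GetComponentLocation();
        const FVector HandDelta = HandNow - LastHandLocation;

        // Negate the hand's movement and apply it to the whole actor.
        AddActorWorldOffset(-HandDelta);

        // Re-sample after moving so our own offset is not counted
        // into the next frame's delta.
        LastHandLocation = ClimbingHand->GetComponentLocation();
    }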
Yeah, I'm not going to build it all and test it out. But the VR Expansion Plugin
actually has that climb mode. I should note that
one of the things that I would like for
us to put together for maybe not 5.0-- that
would be a little tricky. But the plan is to build a
VR content examples library. So just like we have
a content examples library for a lot of the
other things in Unreal, we do want to build one for VR. And different
forms of locomotion is probably the first thing
that I would build for that. And I have a pretty good way
of doing it just in Blueprints. That makes it easy. Preferably, it should all use the character movement component, because then it works in multiplayer and such. But that's a little difficult
to do as is right now. ALEX: We should definitely have at least one bad example
of how locomotion should not be done so I can make people puke,
just cause. A little revenge. A little revenge. No? So oh, yeah,
what are the questions we got? I saw [INAUDIBLE] had asked
us a question about looking at different-- to preview materials, do I have
to keep putting the headset on and off again? There is a command for
looking through one of the eyes, but at some point
you will eventually want to look at
stereo in the headset and look at the material. So that's going to be kind of-- VICTOR: Wait, was the question of how you can develop
VR without a headset? ALEX: No, it was how you can like look
at the materials without putting on
and off the headset, and know how it's going
to look in VR versus not. And there's a way to
just look through the eyes and have them-- VICTOR: You can add -emulatestereo. ALEX: Thank you. -emulatestereo. VICTOR: -emulatestereo. I actually haven't tried this. ALEX: It's been
years since I've-- VICTOR: You have to
do it on the target line. You have to launch game,
and then -emulatestereo. ALEX: Yeah. VICTOR: I don't--
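For reference, the flag goes on the command line of a standalone game launch, something like the following; the install and project paths here are made up:

```
UE4Editor.exe "C:\Projects\MyVRGame\MyVRGame.uproject" -game -emulatestereo
```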
ALEX: Because we had to use this all the time when there was a
Vive shortage basically, and we all wanted
to build stuff out. And then we had
an Oculus shortage. But during Robo Recall,
half of the development was done using
this tricky technique. VICTOR: Yeah,
I don't off the top-- I know you can
add a target line. What I suggest,
if you don't have a VR headset and you do want to develop VR,
Microsoft has a Windows Mixed Reality simulator. ALEX: Yeah, there's that too. VICTOR: Yes, they have a-- yeah, Skye,
if you can Google that for us, please,
and drop that link in chat, the Windows Mixed
Reality Simulator. And you actually are able
to simulate a VR setup, and hit VR preview in
the editor without actually having a headset plugged in. ALEX: See, Gladpus is mentioning my Quest
keeps falling asleep. So Lauren,
she had that issue with hers. So Lauren just put a piece
of black tape over the thing so it never fell asleep. And it just assumed
it was always in VR. So that's why I said it's funny you mention that. VICTOR: PavlovTM,
is that the Pavlov devs? That'd be awesome
if you are here. ALEX: Oh, wow. VICTOR: Love the game. Maybe not. Maybe it's just someone
who likes Pavlov. Either way, question. 5.0 with Visual
Studio 2022 support? I don't know. Good question. Please ask that on the
forum announcement thread. I will find out
after the stream. ALEX: Yeah,
it'll take a little digging. VICTOR: Because after this I'm going to be a little fuzzy
from all the talking and all the prep today. So please ask it on the
forum announcement thread, or just ask it on the forums,
and @VictorLerp. ALEX: And he has
to take me to dinner. So don't y'all interrupt that. That's going to be very nice. VICTOR: No. Let's see here. Tom-- Thom the M is wondering,
"Any plans for multiplayer support in the template?" ALEX: No, wait, that is Tom. The H is silent. VICTOR: Tom, oh,
yeah, you're right. ALEX: I'm sorry. It's an English-- VICTOR: Yeah, yeah, I know. ALEX: I'm so sorry. It's a rarity in English. It's like Geoff,
Geoff, G-E-O-F-F. VICTOR: What? It's not "Gee-off?" ALEX: No, I know. I know. I do that to poor
Geoff all the time. VICTOR: Tom the M is wondering, "Any plans for multiplayer
support in the template? I feel like there's
barely any information on good common practices
for multiplayer in VR, which makes it
difficult to learn." So we don't have native
functionality in Engine that would make you ship
like a competitive game. But there's the VR Expansion Plugin, which Mordentral has been working on for years. I have shipped several games using that plugin. Just the VR Expansion Plugin. He's produced so many resources. It's up to date. Just yeah, VRE UE4. I can write this in chat myself. VREUE4.com. There's tutorials. He does weekly
updates for each patch. Bye, Luos. Thanks for hanging out. ALEX: Oh, yeah, see you, Luos. VICTOR: I'm being rude
for posting the link myself? Well, I figured I can
do a little bit of the work. [LAUGHS] Let's move through questions. We don't have too many more. And then I'll dive into some of
the performance considerations and some known issues,
and a little bit of extras that I just want to point out. DarkMatter4 is wondering,
has UE 4.27 OpenXR been tested with the
Magic Leap device? The only OpenXR head mounted,
augmented reality device that we support with OpenXR
is currently the HoloLens 2. Magic Leap, we have not tested
Magic Leap with OpenXR in 4.27. I doubt that would work. It is still a platform in 4.27. So you'll have to
build it for Lumin-- not Lumen, the global illumination method, but Lumin, the platform that you build for Magic Leap. You'll still have to do it using
their native SDK in platform. ALEX: "Unreal Engine
deleted your message." [LAUGHTER] "Unreal Engine
deleted your link." Because it's like,
that's someone else's job, mister. VICTOR: Next time
I'll get banned I bet. I need to read chat, OK? It's important. James O'Loughlin is wondering,
"I haven't dived into how
OpenXR handles hand tracking. How does OpenXR handle
hand tracking between platforms, specifically Quest, HoloLens, maybe Magic Leap, and Ultraleap, if supported?" So that specification
is still in the works. It's something that I want
to dive into moving forward. I do know that currently it
only works on the HoloLens 2. So stay tuned for that. I know that's important. I really want that
myself as well. I mean,
having generic hand-tracking between different devices
like that would be great. Currently, you cannot do Oculus Quest hand-tracking using the release version of the Oculus OpenXR plugin. And you actually have to
enable the Oculus VR plugin, if that's what you want. And you can
actually have both on. Oculus VR will take priority. But if you are using
like a SteamVR device, and you have only Oculus VR
plugin and OpenXR enabled, OpenXR will actually be used. Let's see. BryanTolar was wondering,
"Where are Focus 3 models in OpenXR content? Is Focus 3 supported yet?" The Vive Focus has
its own implementation. I know that they are working
on supporting OpenXR. I don't know when. But currently,
if you are using the Focus 3, you have to download HTC's SDK for the Focus specifically. You cannot use the tools that we shipped in 4.27. And I've done that actually. It works pretty well. Cause the thing is you're set
up for development immediately. It's really handy. ALEX: So not a question, but I saw some
people in chat talking. I saw Shadow and
Cipher are talking about me making my evil,
make-them-puke VR thing as actually like a
good example of things, how to do stuff
and not do stuff. And mentioned in there,
it's like you can actually train yourself out of VR sickness. And it's true. You can actually train your
brain out of motion sickness in a lot of ways,
which is what VR sickness kind of is. It's like motion
sickness basically. However, there is a trade off. You might do what I did,
and accidentally-- like you'll fix
your VR sickness. I never get VR sickness anymore. However, in real life,
if I look down a really far distance, I might get vertigo,
which I never had before I did all
this really early stuff. And I threw myself
off a lot of buildings, which I think is what caused it-- I was jumping off of stuff a lot, trying to test how I could get myself sim sick. So heads up. There can be trade-offs. The other thing I
discovered is never, ever, ever put a VR system on
anyone under the age of like eight. Children can very easily be
convinced that VR is 100% real. And they'll come out of it
talking about it the next day like they really did
whatever it was. And they'll have no
concept of-- yeah, so depending on their psychology and how their perspective works, just a heads up. I don't design games
for really little, little kids, because that was such a
weird thing that we bumped into. So just a couple of
VR notes from the years of odd experiences. And then, yeah,
but I would like to get a good model about vehicling, you know,
how we can pull people around. And if they feel like
they're in a vehicle, it reduces simulation sickness. So that's one thing. It's also why we do caging,
where we'll put like a false
cage that you can see. And we'll do teleporting as
opposed to pushing people. So there's a lot. Oh, yeah, did we talk about
that ages ago, Shadow? I'm sorry. Yeah,
because sometimes I forget. I've had these
conversations a lot where I'll talk to someone
who's like, I'm doing a VR game. I'm like, let me just tell you,
straight up. We had some kids,
and it was not good. It wasn't good. The psychologist that we
had and all that was like, this is not safe. So yeah, it can really mess
with your perspective of reality. And children don't have a
very solid grasp on it already. So that was like a thing. VICTOR: Thanks
for entertaining chat. ALEX: No, I just wanted to make sure that we
talked about a couple of very,
very important side notes. Vehicling is important. Knowing your
audience is important. VICTOR: I wish I'd had a co-host for the last year, because like it's
really nice actually. It's really nice to have you here,
Alex. ALEX: I miss hanging out. It's just I've been
doing so much training, like one on one with people
and clients and stuff now. So we'll have to do
something about that. Don't make a VR
multiplayer game by yourself. Also a good suggestion. VICTOR: I've been there. ALEX: Yeah, everyone has tried a little something
or other at some point and realized like,
you need someone to handle-- VICTOR: Or not. I shipped it. ALEX: Yeah, I'm glad. VICTOR: I would not
recommend that to anyone. ALEX: That's not easy. VICTOR: That was terrible. And I had never done
multiplayer before that. So that was quite
the experience. ALEX: Oh, yeah. VICTOR: Learned a
lot during those years. Let's keep going
through questions here. TimotheAwesome is wondering,
"Does this work similarly with widgets
for in-game menus?" I'm not really sure what you are
referring to as in-game menus. ALEX: Oh, wait, what was the beginning
of the question? VICTOR: "Does
this work similarly with widgets for in-game menus?" ALEX: Oh, like 3D widgets? Or we can probably get
some more clarity on that one. VICTOR: Yeah, yeah. Feel free to-- ALEX: I would think that that means like the
3D widget interaction stuff. Cause all UMG in VR
needs to be 3D widgets. You can't impose
things onto the screen. So it all has to be 3D widgets. VICTOR: CerebralFrost
is wondering, is that menu system
specific to the template? That is correct. It is. I built it in Blueprints
in the template. There's the menu
Blueprint Actor that's being spawned on the VR pawn. You can go all
the way down here. This is the bottom of
the VR pawn event graph. You can see here,
I'm running the toggle menu function. And that spawns or destroys
the menu actor, which is here.
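The Blueprint being described boils down to something like this hedged C++ sketch; Menu and MenuClass are illustrative member names rather than the template's actual identifiers:

```cpp
// Sketch: toggle by destroying the menu actor if it exists, else spawning it.
void AVRPawn::ToggleMenu()
{
    if (Menu)
    {
        Menu->Destroy();
        Menu = nullptr;
    }
    else
    {
        FActorSpawnParameters Params;
        Params.Owner = this; // keep the pawn as the menu's owner
        Menu = GetWorld()->SpawnActor<AActor>(MenuClass, GetActorTransform(), Params);
    }
}
```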
Yeah, and then everything is self-contained in the menu, which is really nice. The menu has the widget. It deals with all the
logic in regards to-- it looks a little funky here. I've tried to make it
as clear as possible. But it's switching back and forth between being like the hand cursor that you're pointing, or you just being able to use it with one hand, using the thumbstick. And that's just automatic. It's nice. I did that for a little mini
game I built at some point. And I really liked it,
because no matter how the user wants to use the menu, they can just do that. Feel free to go in and
tweak some values. There's cursor speed. And yeah,
basically cursor speed. That's actually the
only one you should be messing with unless
you're changing the logic. ALEX: There's a question in chat that scares me a little bit. I'm going to address this. OneLearner is asking,
"Can you simulate the motion of a
vehicle vibration of a vehicle in VR Vehicle? Or is it pointless?" I'm going to go ahead
and stop you there. Do not vibrate the camera. VICTOR: Don't move the-- ALEX: Do not. Don't grab the camera. Don't move the camera. Don't vibrate people. Don't use steps. Like if they're walking forward,
don't do stepping. Don't do anything
that's going to move the camera outside of how
the player is moving the camera. That is how you
make people puke. So we found out-- never, ever use camera shake tools in VR. When you're
rolling in a vehicle, you want the character to be
very smooth and just following wherever the vehicle is going. You just want to
be smooth with it. They'll move their
head around a lot. They won't even notice. But you cannot,
under any circumstances, just start pulling the camera
around for the sake of realism, cause they will totally vomit. That will do it. VICTOR: Cipher is wondering,
now, do widgets react to raycasts? Or is it still using colliders around the buttons? I'm fairly certain
it's a raycast. It's not like a-- I haven't manually
built-- like it's all based on UMG and the
3D widgets component. Let's see. Timothy Harris is wondering,
"What lighting tools are integrated for OpenXR?" So I'm not sure how
you're looking at-- those are very different things. The lighting tools in
Unreal Engine are generic, the tools themselves, no matter
which platform, whether or not you're using native Oculus,
or OpenXR, or SteamVR, or you're building for a console,
or a mobile phone. The tools themselves
are the same. There's no difference in
what lighting features are supported between the old native
VR plugins and SDKs and OpenXR. There's no difference there. The world authoring
is exactly the same. MissedThing is wondering
if they can use a PlayStation VR set to work with Unreal. You have to be a PlayStation developer. You have to be a PlayStation developer to be able to push and
develop for console. ALEX: So but just to clarify,
you can do that. You just have to
go to PlayStation. Go through their
developer stuff. VICTOR: Registration, yeah. ALEX: Yeah,
thank you, registration. Get yourself approved. They'll create an approval
through us as well. And then you will
get special support. And I think there's even like
a special forum and stuff like that that you will get to go to
with the support of a console. But you do need approval. VICTOR: But it's all under NDA. That's important to note. ALEX: Yeah,
everything is super-- VICTOR: Yeah,
that development is under NDA. ALEX: I don't even know if we've ever
mentioned that there are NDA forums for special things. Like that's how much
we don't talk about it. But yeah,
if you want PlayStation VR in Unreal, go to PlayStation,
and then they'll-- VICTOR: Talk to Sony. ALEX: Sony, sorry. Yeah, and then everything
will come through each other. VICTOR: Let's see here. Oculus already announced
a move to go with OpenXR. Do you have hints or clues
about other VR manufacturers like Pico or HTC? I know that Pico has
very limited OpenXR support for the moment. By the way, thanks for your
explanation about this new template. You're welcome, SpeedLaw. And no, I am not going
to talk for the companies. Stay tuned for whatever
announcements they make on their social media
accounts and blogs, et cetera. ALEX: Yep. VICTOR: Let's see. ALEX: Oh, yeah, all the people talking about racing
games and spinning out. Yeah, that'll do it. That's a good puke
factor right there. VICTOR: Chronoc was wondering,
"Is this the same VR template that UE5
has in it right now? I love the spectator camera,
but it goes to a black screen when activated in the UE5." There was one fix that
went in for 4.27 right after we branched. And that fix is to make
the VR spectator camera work for OpenXR. And unfortunately,
that did not get in in time for UE5 Early Access. We have talked about this in a lot of places over and over again, but it's worth mentioning: the VR template itself is actually more up to date. And the VR tools, features, and support are actually more up to date in 4.27 than in UE5 Early Access. But all of these
things are going-- all fixes, improvements,
and feature support, as well as all the
changes to the VR template are going to get
rolled into Unreal 5.0 when that gets released. Boydy222 is asking,
"Using the grab component within the template,
is it possible to do something like a pump action
with some kind of slider?" Yes, that would be a
custom grab on the left one. And you will be looking
for specifically when that's grabbed, and then animate.
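As a rough illustration of "look for when that's grabbed, then animate," here is one hedged way to drive a pump slide; every name in it (AVRShotgun, SlideMesh, SlideMinX, SlideMaxX, OnPumpRacked) is hypothetical:

```cpp
// Sketch: while the off-hand holds the foregrip, project its position onto the
// weapon's local X axis and clamp it to the slide's travel.
void AVRShotgun::UpdatePump(const FVector& HandWorldLocation)
{
    const FVector LocalHand =
        GetActorTransform().InverseTransformPosition(HandWorldLocation);

    const float SlideX = FMath::Clamp(LocalHand.X, SlideMinX, SlideMaxX);
    SlideMesh->SetRelativeLocation(FVector(SlideX, 0.f, 0.f));

    // Reaching the rear stop counts as racking the action once per cycle.
    const bool bAtRearStop = SlideX <= SlideMinX + KINDA_SMALL_NUMBER;
    if (bAtRearStop && !bRacked)
    {
        bRacked = true;
        OnPumpRacked(); // chamber the next shell, play sound, etc.
    }
    else if (!bAtRearStop)
    {
        bRacked = false;
    }
}
```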
ALEX: I know the answer to this one. You open up the mod
editor kit for Robo Recall. And you look at how they did the
pump-action shotgun in that. VICTOR: It's in there. ALEX: Cause it's
in there for that. Because that was actually
one of like the first things that everyone's like,
you've got to be able to do the (GUN RELOADING)
thing with one hand thing. And 100%,
if that didn't go in the game, I don't think they were
going to end up finishing it. But they got it working. So check it out. And also check out
how they implemented how the shotgun shoots. It shoots pellets. But it shoots them using delay-zero nodes over time. So it's like a bunch of frame delays that spread the cost out and keep VR more performant while shooting. So there's a bunch
of performance hacks inside of that as well. Yeah.
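The delay-zero trick translates to something like this hedged sketch: rather than tracing every pellet on the frame the trigger is pulled, drain one pellet per tick. PelletsRemaining and FirePellet are illustrative, not Robo Recall's actual code:

```cpp
// Sketch: amortize pellet traces across frames instead of spiking one frame.
void AVRShotgun::BeginFire(int32 PelletCount)
{
    PelletsRemaining = PelletCount; // Tick drains this, one pellet per frame
}

void AVRShotgun::Tick(float DeltaSeconds)
{
    Super::Tick(DeltaSeconds);

    if (PelletsRemaining > 0)
    {
        FirePellet(); // one line trace plus impact effects
        --PelletsRemaining;
    }
}
```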
VICTOR: Yeah, it's pretty much the entire game. ALEX: Yeah, yeah. VICTOR: That game. That game. ALEX: Yeah, Chance Ivey. Thank you, Chance, and the whole
team that worked on that. VICTOR: Well, OK, yeah. But I was going to say,
the poster did not come from Chance. ALEX: Oh, no, no, no. But the Robo Recall game. And that was the
team around that, I think it was like Nick
Donaldson and Chance Ivey. VICTOR: Lots of
resources and all of that on our YouTube
channel as well as the forums. Shout out to Alex Morrow for
getting me the awesome poster that I've had on
the wall for a while. CyrusFBon is wondering, "So what happened to the mannequin
hands in the new template? They're not in there because of OpenXR?" Not specifically. The reason why
we decided to not go with the hands in
the new template is because, one,
they don't look very good. Two, they're not very immersive. They don't really feel
like your real hands. And three,
the plan is to actually have full, generic-- a standard
for hand tracking support. And even if you don't have
sort of full hand tracking, most of the VR controllers
do have some idea of what your fingers are doing. And so, ultimately,
with OpenXR, the goal is to have that data
so that you can actually draw a good looking hand. How all of that is
going to shake out, I'm not entirely sure yet. I don't know if anyone is. But that's what we would
like to see in the future. And so the OpenXR template,
or now, the VR template is designed for some of the
standards and features that currently exist in the
OpenXR specification. And it's in this way you can
get all of the device models. So that's why we went
with the device models. Actually aligning that mannequin hand with all of the different VR controllers, and making that look good in the template-- I don't think you can do that without it overlapping, or doing a bunch of custom animation work. It's not fit for the template. Now, if you do want them,
you can just go ahead and download 4.26,
start a new template, and export those hands out,
and implement them yourself into the VR pawn. And oh,
got a few more questions. ALEX: Hey, Tom Lumen's in chat. Hey, Tom. VICTOR: Max
Sarkai is wondering, "Quest 1 performance with
the new 4.27.0 VR template is very low, around 20 FPS. Any suggestions on what's causing this? And what to change for it to run at 72 FPS again?" ALEX: 72. VICTOR: The template
has dynamic lighting. Ooh, and that actually,
you know what? This is good. I'm going to take that
as the last question. And then we're
going to dive into-- ALEX: I think you'd be wanting to hit 90, not 72. VICTOR: Well,
Quest 1 actually is at 72 hertz. ALEX: Oh, oh, oh, oh, oh. VICTOR: Yeah,
Quest 1 is at 72 hertz. And the Quest 2, you can
run at 72, 80, 90, or 120. You set that up
inside the settings. And we support all of that. ALEX: Yeah,
I think I'll skip straight to 90. I know what anything less
than that does to a body. VICTOR: Cool, so with that, I do want to talk a little bit
about the performance, cause that is something that you
do have to consider a lot, especially if you're trying
to develop something that is across mobile
VR and desktop VR. The template actually
has stationary lighting. The directional light
is a stationary light. And it is drawing
dynamic shadows. Which on mobile
is very expensive. On Quest 1, I would not do that. I would not do dynamic
shadows on Quest 1. On Quest 2,
this actually runs at 90. But this brings
me-- so right now we're going to talk a
little bit about performance considerations,
but also some of the known issues. Now, we're talking specifically
about lighting on mobile. And there is one
thing that you have to change if you want
dynamic shadows on mobile. And that is to go to
the directional light and find Dynamic Shadow Distance. This defaults to zero, so everything works in editor but the shadows won't show up on device, which is kind of strange. We're going to work on figuring this out. But you just need
to set a value here. And now, boom, save that map. And now you actually
have cascade shadow maps, the dynamic shadows
on Oculus mobile. Now, this is still very expensive. And I probably wouldn't want-- it always depends what you're doing. If you have a lot of draw calls, and a lot of dynamic, movable actors all drawing dynamic shadows,
that's going to be really expensive. But the preferred way to do
dynamic shadows on mobile is to do modulated shadows. And to do that, you hit the
cast modulated shadows button. The reason why
this is not default is because we have
mobile multi-view enabled. And currently, modulated shadows with mobile multi-view don't work. Those two together don't work. We're working on that. It's supposed to work. We want you to use modulated shadows when you're working on mobile in VR. But currently, in 4.27,
it is not working. I don't have a date or
timeline as to when it will. But we are working on
making sure that works. So right now, yes, you can
have dynamic shadows on Quest. No, it will probably
not perform on Quest 1. But on Quest 2, it will. But you have to set a
dynamic shadow distance for the stationary light.
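Both settings live on the directional light component; here's a hedged sketch of flipping them from C++ rather than the Details panel (the distance value is illustrative, and modulated shadows are subject to the multi-view caveat above):

```cpp
// Sketch: the two directional-light settings discussed above.
#include "Components/DirectionalLightComponent.h"

void ConfigureMobileShadows(UDirectionalLightComponent* Sun)
{
    // Zero (the default) means no cascaded shadow maps on device, so give
    // the stationary light a real distance, in world units (cm).
    Sun->DynamicShadowDistanceStationaryLight = 2000.f;

    // The preferred mobile path, once it works with mobile multi-view again.
    Sun->bCastModulatedShadows = true;

    Sun->MarkRenderStateDirty(); // push the changes to the render thread
}
```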
ALEX: [INAUDIBLE] lighting tricks that we did in Robo Recall
also for fake dynamic lighting that's actually static
meshes that kind of like follow stuff around and
provide fake lighting. VICTOR: Cool, so for Quest 1 and performance in general,
what I would go ahead and do is to set the directional
light to be completely static. Just flip that, bake it, and now
you can implement blob shadows. Or what I did when I designed
my little project for Quest 1, was that I actually
made it entirely unlit. And then I just built a simple
little master material that looked at a direction vector and just built a shadow
bias like on the material itself.
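The math behind that unlit trick is just a dot product. Here it is written out in C++ for clarity, though in practice it lives in the material graph; the names and the 0.4 floor are made up:

```cpp
// Sketch: fake diffuse shading for an unlit material.
float FakeShade(const FVector& WorldNormal, const FVector& LightDirection)
{
    // 1 when the surface faces the light, 0 when it faces away.
    const float NdotL = FMath::Clamp(
        FVector::DotProduct(WorldNormal.GetSafeNormal(),
                            -LightDirection.GetSafeNormal()),
        0.f, 1.f);

    // Keep a floor (the "shadow bias") so back faces don't go fully black.
    return FMath::Lerp(0.4f, 1.0f, NdotL);
}
```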
ALEX: Yeah, that makes sense to me too. VICTOR: There's
plenty of posts about that. I didn't figure the core of that out myself. I just Googled it. ALEX: Dynamic shadows in VR are always like this fun thing,
because it's still the Wild West in a few parts. Where everywhere in
like the standard 3D world there's like 100
pre-made concepts of making good dynamic shadows,
blah, blah, blah. But in VR,
because you have to render it twice, there's all these cheats and hacks that people come up with. Robo Recall
has a really clever way of handling this stuff too. And it does some of this kind
of globular [INAUDIBLE] stuff. Oh, no, did my microphone fall? My microphone fell. Someone said I'm hard to hear. VICTOR: Oh, wait. Sound check. Oh, is your battery dead? ALEX: I don't know. VICTOR: Here, check your-- [INAUDIBLE] OK, I have a spare for you. ALEX: Yeah,
but I think it was just my mic fell [INAUDIBLE]. VICTOR: No. ALEX: Oh, oh no. I must have leaned back,
and it killed it. VICTOR: Oh, you turned it off. ALEX: Oh, low battery. No, it's the battery. VICTOR: OK, here. Let's switch real quick,
cause I do want to go through a little bit of
extra stuff here. I'm going to do a quick mute. [Singing in the tune of "A Whole
New World"] A whole new mic! Yeah, yes,
you're getting a whole new mic. ALEX: A dazzling mic. OK, hang on. VICTOR: Here, you just-- yeah,
you know what? I'm going to mute this. ALEX: All right. OK. VICTOR: All right,
all right, cool. I got Alex sorted with a new,
fresh battery. Oh, so I saw a question from
MasterAgentMiyazaki in regards to the georeferencing
subsystem class. Unfortunately, I have no
idea about the intrinsic types. Post that on the forums. Yeah,
post that on the forums so we can get that in front of
[INAUDIBLE] and Sebastian. And perhaps they might
have an answer for that. ALEX: There we go. Victor sings the
Disney classics, coming to a store near you. Please! VICTOR: That's Inside
Unreal After Hours. ALEX: Please. We've got to have like a-- VICTOR: It might happen
after a few beers-- ALEX: I was going to say, we have to have
like a stream where it's like,
we've been a couple of drinks in. We'll do some
karaoke or something. Because I got-- VICTOR: It can't be a stream,
though. We'd get DMCAed. ALEX: Oh, you're right. Oh. VICTOR: We'll have to
make it all up ourselves. ALEX: Have to write
some original songs. VICTOR: All right,
I think that's it for questions. So I'm just going
to dive in to consider some of the known issues. And then we'll wrap it up
with some neat, little extras that exist in the template
that you might not know about. Talked about dynamic lighting. There is also a best
practices documentation page for VR. I'm going to go
ahead and link this. ALEX: Oh, I'm getting some echo. Am I getting feedback? Volume high. VICTOR: It's echoey. ALEX: Oh, so I have high volume and you're echoey. Where is my volume control? VICTOR: You shouldn't
have any on the-- ALEX: Oh,
there's no volume control on this? VICTOR: No. ALEX: Oh, it's on that? OK. VICTOR: I think
it's just how you put-- no the microphone
looks fine here. Let me-- ALEX: Pull the
microphone way down here. How you doing? VICTOR: Check one, hey. ALEX: This is the chest
hair livestream now. Got to get this working. VICTOR: Wow, I think the
FX was applied right there. ALEX: Oh, oh,
FX was being played. Ah. VICTOR: I'm actually
not sure what happened. It does sound really echoey. I don't know why. You know what? We don't have too much left. So unfortunately,
I'm just going to leave that on. It sounds good in here. Is that something
going through OBS? No. ALEX: OBS seems
to be picking it up. VICTOR: It looks fine. I am not sure what it is. Nope, check. Oh, check, hey. Actually, Alex,
I'm sorry about this. No, that's all good. I'm not sure. I think it's just a combination
of the two microphones right now. ALEX: I'll put up a wall here so they don't pick you up too. VICTOR: Yeah,
I'll turn Alex down a little bit. ALEX: I'm a little sensitivo. VICTOR: All right,
can you give me a sound check? ALEX: There we go, fixed. VICTOR: All right,
doing it live. ALEX: Yeah, yeah, yeah. Use your laser eyes. I will. Oh that's Kate. Hey, Kate. VICTOR: All right, so back to some of the
performance considerations. I was going to link you-- oh, that's a 404,
page not found. ALEX: Whoops. VICTOR: We have a
documentation page for best practices when it comes
to configuring your Engine INI. Go ahead and read that. There's a lot of
good notions in there about things you can toggle on,
toggle off, and quality levels you can set lower or higher.
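To give a flavor of the kind of settings that page covers, here's a hedged DefaultEngine.ini excerpt; the values are illustrative, so defer to the docs page for the authoritative list:

```ini
[/Script/Engine.RendererSettings]
; Forward renderer: cheaper for VR and enables MSAA.
r.ForwardShading=True
; MSAA instead of temporal AA in the forward path.
r.MSAACount=4

[SystemSettings]
; Render-target scale; below 1.0 trades sharpness for GPU headroom.
vr.PixelDensity=1.0
```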
Some of them have already been applied in the VR template. So if you don't want to
mess with that too much, you can just go ahead
and use the VR template. And you should be
good for a lot of cases. But I should note that, yes,
like developing for mobile VR-- developing for mobile on its own makes it difficult to hit performance, and mobile VR even more so. So make sure
that you test often, and have a good reference point
of when you're running a frame rate and when you don't. And know that you
can roll back and then figure out what's actually
happened in between there. It can be real tough. All right, back to my notes. Some of the known
issues that I've listed them on the-- there's a forum thread
for the OpenXR VR template that you can go ahead and visit. If you're working with it,
doing experiments, or just want to talk about it,
finding bugs, et cetera, please, go ahead and post there. We'd love to hear your
feedback and what you're doing. It's always exciting. Like I mentioned earlier,
the controller models are slightly offset from
what they're supposed to be. The Oculus, the Vive controller, and the Index are fine; it's the other ones that are slightly off. I'm working on getting that fixed for 4.27.1. In the meantime,
you can go ahead and re-export them. Move them in like Blender,
and then export them back
into the editor. As soon as I have
those offsets figured out, I will go ahead and
post them on the forum so that you can do that
yourself until 4.27.1 hits the shelves. And we talked about
dynamic shadows on Quest. We also talked about how the
widget interaction components should be added to motion
controller components that are using the aim
motion source instead to get them to actually
point in the right direction when you're looking at that. And then last thing I
had in notes here at least was that if you are
using the HP Reverb, you might notice that
when you hit Play in VR, you're seeing the Oculus
classic touch controllers. And the reason for that is
that natively in Unreal Engine we don't have input support
specifically for the HP Reverb controller. We do for Windows Mixed Reality. But the HP Reverb
specifically has its own-- like its controller
is different. And so this is where
OpenXR is actually really cool. And even though you're
seeing the Oculus Touch classic controller-- oh,
you're looking at me. ALEX: I don't know. I'm just looking at you
cause you're talking. VICTOR: All right, I'm talking. Even though it might seem like,
hey, this is wrong, I'm seeing the Oculus
classic touch controller, what's actually happening
there is that you as the dev actually haven't implemented
support for the HP Reverb. And the runtime is going
to pick the closest matching interaction profile. And that is the Oculus
classic controller, because it has the
A and B buttons, and the thumbstick,
and the grip, and the trigger. And this is where OpenXR is awesome,
because that actually enables hardware that hasn't
even been developed yet to actually work on games
that no one might even be iterating, or updating,
or patching anymore. And so that's actually-- what it's doing is a good thing, even though it looks wrong. Now, I do have a fix for you,
though. If you do have the
HP Reverb and you want to support it with the
correct input, go to Plugins, and enable the HP
motion controller plugin. And it gives you
controller mappings for the HP Reverb
G2 motion controller in OpenXR and SteamVR.
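If you'd rather edit files than click through the Plugins window, the same toggle can go in your .uproject, assuming the plugin's internal name is HPMotionController (check the Plugins window if yours reports something different):

```json
{
  "Plugins": [
    { "Name": "HPMotionController", "Enabled": true }
  ]
}
```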
If you enable this and you actually restart-- I'm not going to do that,
because this has been running fine the entire time. So I'm not going to restart. If you do enable that,
you can then go ahead and find the
specific input for that. And you can go in here to grab left,
and add another input. And search for HP Reverb. And you can
actually add an input. And now that you
have that enabled, the device
visualization profile-- sorry, the device
visualization in the VR pawn will actually go ahead and
draw the right controller-- render the correct controller model. Something that I really
want to work on in the future is to make sure that we
actually get access to the device visualization so that you
can update the mesh on that, disable collision on and off,
visibility, all that jazz that's currently
not exposed to Blueprints. But I would like for
that to be the case. So I'm going to work on that. All right,
that's a lot of stuff. I have more. [LAUGHS] I have more notes. Let's make this quick,
because I know Alex is hungry. I'm sure I am hungry. If you've been
watching this entire time, thanks for sticking around. This is definitely getting
drawn out a little bit here. But there is a lot of
things to talk about. 100 more things we could do. What I did want to mention,
though, as a last is that
the template comes with a really handy
master material that Chris Murphy
and I put together. If you go to Materials,
and then you look at M grid rotation, this is the master
material that's used on all of the static
geometry in the template. And this is a really
handy prototype material. I'm going to show a
little bit of an example here what you can do with the
grid default material instance. You can see that it's actually
world aligned so that it's changing when we're rotating. But if you look at the
MI grid accent material instance, you can see that it's not
actually world aligned, and it's relative
to the object itself. This is a nice
little check box that exists inside that is exposed
through the master material. And so if you go
ahead and you check that for MI grid accent here,
you can see how all these
stay world aligned. I just wanted to mention that. Because that's
really nice and handy to have that sort of
in the editor by default. You don't have to
download anything to just get a material that
works well to prototype with. Especially important for VR,
because you need some form of reference. If it's all just gray or white,
it's like really difficult to tell distance and scale. Oh, and something else
that people were discussing in Unreal Slackers,
and that I was like, huh, yeah,
where did that actually go for 4.27. There is now the possibility
to expose local variables in the material editor. And I always forget
what it's called. Cause it's not actually-- I even wrote the
GitHub link down here. And then there's a commit note. I forgot since I looked at this. But I do want to bring it up,
because it is super, super useful. And this is actually a pull
request from [INAUDIBLE] They originally submitted it. And then we just
modified it a little bit to make sure that there
were no name collisions. What was it called? If someone in chat knows this,
please let me know. I was using this earlier. It had a different name. I found it. And this is why I
wanted to bring it up, because it was difficult for
me to actually find a solution. Do you know about this, Alex? ALEX: What's the question? VICTOR: Well, it's just what the create local variable
in the material graph is actually called. ALEX: What? You mean making a parameter,
or making a constant? VICTOR: So you're able
to promote your values to a local variable,
just so that you can clean up your material graph,
and by putting it somewhere else. And I don't-- ALEX: Ah. VICTOR: I know
[INAUDIBLE] would know. ALEX: Well, I'll answer a couple of small questions
while you're working on that. VICTOR: Go ahead. ALEX: LordHeanacho was asking about-- let's see here. Was asking about the
status of ray-tracing, and basically all
the various headsets. I just want to make
sure I got that. Ray-tracing and support on
the Oculus Quest, Vive, Cosmos, or Valve Index. Now, as far as I know,
none of those support the right kind
of rendering in order to do ray-tracing. So I'm not sure
that that's going to be in the immediate
future for any of those devices. VICTOR: We already support ray-tracing in multi-view. ALEX: In multi-view? VICTOR: Mhm. ALEX: But do we do
it in those headsets? VICTOR: So our headsets,
most of them are using multi-view
for the two eye displays. So ray-tracing is
supported in multi-view. But I think there is a-- First of all, you need to use
it in the deferred renderer. And we are using
forward in the VR template. ALEX: That's what it was,
thank you. It's because it's
a deferred feature. And so you have to change
it into something else, right? So it's like something
weird like that, right? Ugh, there was some reason. Yeah, "What are use cases
for world aligned textures? Don't you pretty much always
want the object aligned?" Yeah, it's-- oh, I see. Someone said
landscapes and rocks. Yeah. VICTOR: Oh, wait, I found it. ALEX: There's a few
different examples of when you would want
the textures to actually shift based on their
position in the world in order to give them some
sort of like randomization, quote, unquote. But it actually just
makes it look cohesive. VICTOR: Overlaps,
you found it just as soon as I did. It's called a named reroute. ALEX: Named reroute. VICTOR: Named reroute. And this allows you to
expose that [INAUDIBLE] local-- ALEX: That's a new
one for me actually. VICTOR: Yes, it is. And it's so handy. Like look at that. ALEX: I thought you were just talking about parameters. I'm like-- VICTOR: So this is like, whoop. This is just like that. ALEX: Named reroute. That's nifty. VICTOR: Uh-huh,
and I've wanted to show this off, cause people are
talking about it. And I couldn't find it myself. And now I clearly
had to look it up again. It's called a named reroute. Super, super useful when
we're on the topic of materials and handy solutions for things. And I think with that,
I don't have any more notes. ALEX: Oh, wow. VICTOR: Those were all my notes. ALEX: That's what that does? It's like a tunnel? That's like a material tunnel then,
is that? That's cool. VICTOR: Going to make
sure we have a few more questions. ALEX: Yeah. VICTOR: Bryson
Abel is wondering, does or will the template have examples of three-point or full-body IK using the new IK systems? The template probably won't. Once again,
our templates are meant to be a very small,
bare bones, quick way to get started,
very much like a hello world example. The VR template has quite a lot more functionality than-- not more,
but a little bit more than some of the other templates already,
because it's a little bit special. However, that said,
I did mention that we do want to work on
content examples for VR. And full body IK would be a
great example to have there. That is definitely on my to do. So not in the template. But yes, it's something
that we'd like to work on. Then some other
unrelated questions that I can't answer,
unfortunately. Yeah, and that's-- ALEX: So I saw
one question in there that's just like,
can you speculate on? And if at any point you
feel like asking us about speculating on a thing,
the answer is, we absolutely cannot. That never goes in our favor,
unfortunately. VICTOR: Let's see here. HitPause,
the VR template actually does deploy to Oculus Quest out of the box. All you got to do is make
sure that you have your Android development set up on your PC. And then you accept
the Android SDK license. And you hit Launch. So if you've already
been doing like mobile VR or mobile development on your PC,
the VR template in 4.27 is like out of the box,
you can deploy it to Quest, which is really,
really, really handy. Because that was always
quite the nuisance otherwise. And I think with that, Alex,
it's about to wrap up. ALEX: Yeah? VICTOR: Yeah. And you know what, though? Let's go ahead and go to
Unreal Slackers for a bit. So if any one of
you out there would like to come hang out
with us after the stream, we're going to jump
into the VoIP channel in Unreal Slackers. ALEX: Yep. VICTOR: Which is
UnrealSlackers.org. It is the community,
the unofficial Unreal Engine Discord community that's run
by some very awesome folks. ALEX: Yes. VICTOR: Big shout out. We'll be hanging out in the
VoIP channel for a little bit until we get too hungry and
we need to go get dinner. But this has been pretty fun,
Alex. I think we went through most
of the stuff that I had planned. ALEX: Yeah, you managed to get through all the things. VICTOR: And no one said like, I didn't understand anything of
what you were talking about. So that's good. ALEX: Then you can
always rewatch stuff, y'all. And it will have captions on
the videos after the fact as well. VICTOR: Yeah,
I'm too tired to do a spiel today. And since I'm technically
not the host anymore, maybe I don't have to. ALEX: I don't know. You have to do your outro? Got to do the whole outro now. VICTOR: I probably should,
though. I'll mention a couple things. ALEX: Thank you
all for joining us. VICTOR: Yeah, thank you all for joining, for real,
though, not [INAUDIBLE] ALEX: Do a couple shout outs. Hey, TheMainStark. Hey, MissionaryGamer. Hey, HitPause. Hey, JimmyAnimal. Hey, OneLearner. Hey, FrederickLH. VICTOR: Deviver. ALEX: Yeah, thank you. VICTOR: Yeah,
thank you all for hanging out. It's been great. Hope some of this
information has been useful. If you have more
questions about OpenXR, this is not the last time
we're going to cover that. Feel free to hit me
up on the forums. I am working on this daily now. So there will be more
developments there. And I will try to update the
VR template discussion forum thread that's up on the forums
with any and more information. Actually, I do-- I wasn't even thinking. Oh, oh,
there's one thing I-- just real quick, just real quick. ALEX: All right,
we're going to go, but no. VICTOR: Real quick,
there is a feature. We have started to expose some
of the functionality in OpenXR. And the one thing
that I wanted to show is that you can actually
get play area bounds. This is the generic
function that will return the x and y values for your play space bounds across runtimes. ALEX: Yeah, your safe zone that you've drawn out.
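A hedged sketch of that call from C++; the exact signature and tracking-origin enum may differ between engine versions, so check HeadMountedDisplayFunctionLibrary.h in yours:

```cpp
// Sketch: query the runtime-agnostic play-area size (width and depth, in cm).
#include "HeadMountedDisplayFunctionLibrary.h"

FVector2D QueryPlaySpaceSize()
{
    return UHeadMountedDisplayFunctionLibrary::GetPlayAreaBounds(
        EHMDTrackingOrigin::Stage);
}
```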
VICTOR: Yes, and in 4.27.1, I actually built-- I updated the BP-- the VR teleport visualizer to
be entirely based on Niagara. And it's actually taking a
user Niagara parameter that's being set from this function-- Jesus, I'm lost-- set from this function, and actually drawing that by default. So in 4.27.1 you do not
have to set that up. It's just going to come
with the template.
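The hand-off to Niagara that Victor describes is a single call; the parameter name here is illustrative:

```cpp
// Sketch: push the play-area size into a Niagara user parameter, as the
// updated teleport visualizer is described as doing.
#include "NiagaraComponent.h"

void UpdateTeleportVisualizer(UNiagaraComponent* TeleportFX)
{
    const FVector2D Bounds = QueryPlaySpaceSize(); // from the sketch above
    TeleportFX->SetVariableVec2(TEXT("User.PlayAreaBounds"), Bounds);
}
```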
Cool, I had that on the list somewhere. So I had to mention it. ALEX: Eh, it's a lot of bullets. It's a big list. It's a big list. VICTOR: It's a big list. And the next time it
might even be even bigger. If we got content examples,
woo, then there's going to
be a lot to cover. Anyway, thanks,
everyone, for hanging out. Make sure you check out
forums.unrealengine.com. That is where we post
the event announcements for all of our livestreams. Do go check out
and if you missed it, we did play the UE5
community sizzle reel earlier in the stream. If you missed that,
go check it out. There are some really, really,
really awesome projects that people have been working
on with UE5 Early Access. And with that said,
I think it's time to say goodbye. ALEX: Yeah, we'll see you all. VICTOR: We don't have a
planned outro or anything. We're just going to-- ALEX: We'll see
you all on the chat and in Slackers. VICTOR: Yeah,
come hang out on Slackers if you want to talk to us. ALEX: Come watch
this video on demand, on YouTube,
or wherever later on. But you know,
we'll see you all around. See you next
Thursday and all that. Well, you'll see him next Thursday,
I'm sure. VICTOR: Me? No. I'm not supposed to do
this every week anymore. ALEX: Oh, oh, that's right. We're swapping
out different people. That's right. That's right. VICTOR: I got another job now. ALEX: We're all
doing different jobs. VICTOR: I'm working on XR,
XR stuff. But yes, there's a chance
I'll be back next week. With that said,
have a good weekend, everyone. We'll see you next
week at the same time. Don't know who. Don't know when. Or we know when. Don't know who. Someone will be on the
stream talking about Unreal. Love you all. Yeah, take care, everyone. Bye. [LAUGHTER] [RUMBLING]