♪ [MUSIC] ♪ [NARRATOR] Welcome to Unite Now where we bring Unity to you
wherever you are. So let's kick off. Welcome everybody to Meet the Devs
on the new Input System. I'm Will Goldstone. I'm Product Lead
for Workflow here at Unity. And I'm going to hand over to-- we have two guys who'll introduce themselves in a moment. But first, I'm just going
to show you the agenda that we're going to do today. So I'm just going to share
my screen and do that. Tat-ta-da. Okay, so this, like I said, is Meet the Devs, part of our Unite Now programming. And today, we're going to have
a quick introduction. So we're going to meet the team. I'll be introducing you to Andy
and Rene, our developer. And then Rene is going to give you
a quick breakdown of the new Input System,
the kind of the what and the why. And then Andy's going to show you
a live in-Editor demo of how you do a few things
with this new system. And then we're going to throw it back over to you to give us a good set of questions. So just bring us all
of your questions, like, you know,
"Does the new Input System support the Nintendo Power Glove,
or Nintendo Zapper," for example. Or, you know, easier questions, like, "What's the compatibility with DOTS," or something like that. So without further ado,
this is the team. And like I said, I'm Will. And I'll throw you over to Rene
to introduce himself. So I'm going to stop
my share right now. So you can see Rene. Hi, I'm Rene Damm. I'm the Input Team Lead
and main code monkey on the Input System, and have been for a while. - There we go.
- Andy? My name is Andy Touch. I'm based here in Denmark,
in Copenhagen. And I'm on the evangelism team, building content, demos,
examples, projects, and helping people to learn
Unity and new features. And one that folks have been asking about a lot recently is Input. Great. So speaking of which,
let's just have a quick rundown of where we are, why we've gotten
to where we are with Input, and what we're kind of replacing. So we talk about this being
the new Input System. And so I think it's kind of useful
for us to be able to say... oops, it says "sound gone." I'm getting people saying the audio has cut out for some reason. Okay, someone else
says they can hear us. Okay, great. We're going to continue
and assume everything is good. Yes, so Rene, I'm going
to throw it over to you. So we had the Input System
for the longest time. We now have a new system. What's the big difference, and
why have we developed this system? Well, I think it's no secret
that when it comes to input, with respect to the old system, the old system dates back quite a while now, to when gamepad and mouse and keyboard were pretty much the dominant forms of input. Touch wasn't even really on the horizon. So everything that came after that was a bit of an afterthought, and we've been moving pretty slowly in this space. So this is about finally getting us to a point where we have something modern in place that really
goes across all platforms and all across the various devices,
the various needs and use cases that we have now. Also with a pretty rapidly
accelerating pace in the XR space,
and all the stuff that happens with input there, which is finally breathing some new life into the whole sector. So yeah, there was a lot
that needed to be done and with the new Input System
we're finally getting to the point where we're starting to address
the needs that have emerged over that pretty long period
of Unity being out there. Awesome, and what were some of
the kind of the big missing things from the old system, if people
aren't that familiar with that? For example, there was a lot
of variety between platforms, and not a lot of standardization, even to the point where you
could have a gamepad, you plug it in on Mac,
and plug it in on Windows, it looks completely different. So we had to account
for a lot of differences between platforms, which is ironic, because Unity has always
prided itself on making it easy
to go from platform to platform. Then, all the various means
of input that came out in very wildly different ways. So, like touch
was completely separate from mouse and keyboard input. Joystick kind of came in
through the same path, but it was sort of jammed in there. So there was not a lot
of consistency. And then a lot of things
were just plain not possible. Like it was sitting
all locked away in native, you couldn't feed in
your own input, you could not create
your own support for your own devices,
and that kind of stuff. And yeah, even to the point where you couldn't even rebind
things dynamically at runtime, these kind of things. Awesome. So I remember spending
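[EDITOR'S NOTE] Runtime rebinding is now supported, and a minimal sketch looks something like the following. This is an illustration, not code from the talk; the `InputActionReference` field name is a hypothetical example.

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class RebindSketch : MonoBehaviour
{
    // Hypothetical reference to an action from your own .inputactions asset.
    public InputActionReference attackAction;

    public void StartRebind()
    {
        // An action must be disabled while it is being rebound.
        attackAction.action.Disable();

        attackAction.action.PerformInteractiveRebinding()
            .WithControlsExcluding("<Mouse>/position") // ignore noisy pointer motion
            .OnComplete(op =>
            {
                op.Dispose();                 // release the rebinding operation
                attackAction.action.Enable(); // re-enable with the new binding
            })
            .Start();
    }
}
```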
a lot of time googling what our bindings were
for an Xbox 360 controller, and "Oh, this is input 28." "Oh, cool, that's the X button." So yeah, you're obviously solving
a lot of those problems. I think everyone has, oh, sorry, I think everyone has
that one graphic image from Google, which is like, this is the-- I've seen it in so many studios,
they've printed it and put it on the wall,
the Xbox mapping thing. So it'd be cool
that this is being looked at. Yeah, ConorB says, "Yes!
I have that Xbox ref saved." Exactly!
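[EDITOR'S NOTE] The pain being joked about here is that the old Input Manager often addressed pad buttons by number, while the new Input System exposes named, cross-platform controls. A rough illustration (assuming the package is installed and, for the old-style line, that "Both" active input handling is enabled):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class ButtonComparisonSketch : MonoBehaviour
{
    void Update()
    {
        // Old Input Manager style: which physical button "JoystickButton2" maps to
        // depends on the pad and the platform, hence the printed reference sheets.
        if (Input.GetKeyDown(KeyCode.JoystickButton2))
            Debug.Log("Old style: joystick button 2 pressed");

        // New Input System style: named controls.
        // buttonSouth is "a" on an Xbox pad and "cross" on a PlayStation pad.
        var gamepad = Gamepad.current;
        if (gamepad != null && gamepad.buttonSouth.wasPressedThisFrame)
            Debug.Log("New style: south face button pressed");
    }
}
```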
I think we all do. So we've put paid to that, or Rene, you've put paid to that. So thank you. So I think the next thing to do
is to really dive in and take a look at the system. So Andy, tell us a little bit
about what we're going to see from you today in this demo. Yeah, absolutely. A project I worked on recently
is a kind of example project that covers a wide variety
of common scenarios to do with input. So this can be everything
from mapping a joystick, to a movement of a character. Which is often the core fundamental of most games
that a lot of people make, especially 3D games. All the way through
to switching bindings based on if you're playing the game
on a keyboard or a controller, such as a-- I don't have
anything fancy like Will, but a PlayStation 4 controller,
Xbox controller, things like this. All the way through to--
what about a touchscreen? A touchscreen
doesn't have joysticks, so how does this work
with a virtual joystick? And it covers
a couple of other areas, such as debugging inputs. So again, that reference sheet. So there are actually now tools inside the new Input System that allow you to preview the connected devices. There are a couple of other
really neat tools and tricks and features. Kind of things that, before I joined Unity like seven years ago, I really wish I'd had back then. I wish I had a time machine
to take this package back in time now,
and show it to myself. Pulling out less hair. But a lot of these
elements are now in. And this demo project,
which is available now on GitHub, and we'll show a link at the end, kind of covers a lot of those
core foundational features. All the way through
to like remapping of UI so you can change an X button
into a trigger button for an attack. And so that people
can kind of prepare for when they get that link
and download the project, what version of Unity should they
be opening that in? So the master branch of the project's GitHub repository is on 2019.3, or 2019.3.7 to be exact. But it should generally work across
the latest 2019.3 versions. But I've also created a branch
for 2020.1 beta. And I think it's beta 4 branch, but I'm going to keep these
kind of in parallel, so that a lot of people
that download the project will be on 2019.3, and eventually,
when 2020.1 becomes out of beta, like the f-release,
or the release candidate release, I can then upgrade that
and kind of almost swap them over. So those people who like
to jump ahead a little bit, and experiment
with 2020.1 in the betas, there's a branch for that. But the main one is 2019.3. I'll show you
the 2019.3 one as well. Yeah, totally. So people can go
and start their download in Unity Hub now,
and be totally ready to load up your project
when they get the link. Cool. Okay, let's take a look, man. Okay. So I'm just going
to jump over to... and share my screen. Just while Andy does that,
just a quick reminder to everybody, we are taking questions
using the Q&A function in Zoom. You can also fire your questions
onto the forum. It's more likely we'll get to
the forum questions after the webinar, so if you've got questions now,
feel free to ask them as we go, and we'll kind of track those
while Andy is presenting to you all. So I'm sharing my screen now. So hopefully everyone can see
a very handsome warrior standing in the middle
of the screen. I guess Will can
let me know because... [WILL] I can see it.
I think it looks great. So go for it. [ANDY] Clearly modeled
after your facial hair, Will. So here I have a character. And on this character,
if I enter "Play Mode", like most people would,
he has some very simple movement. So using my keyboard keys, W-A-S-D, he's running around
in x and z axis. And when I hit space bar,
he's going to have a simple attack. So these are kind of two
very simple inputs. One being axis-based,
and one being button-based. You push a button,
something happens. You push a direction,
and the Vector3 gets converted into a direction. So this is a very simple setup
with this "Warriors_Project". Now, this is currently using the old Input Manager. And in this project,
I've tried to include both the old Input Manager code, and the new Input System code
simultaneously. So you can kind of see differences
and comparisons with a joystick and joystick axis movement,
and also a button press. So if I leave "Play Mode" now, and go over to
the "Package Manager", let's see how we can take this simple movement, and actually use the new Input System to create a new control scheme for the warrior, and use a lot of the benefits that the new Input System comes with. So if I go over to "Package Manager,"
I've already done this step early, and installed
the Input System package. We're using the "Package Manager"
to distribute features, and also distribute projects
to come with them. And you notice
with the Input System, we have here lots of different
samples to go with. So some of these
are like "Editor Window Demo", "In-Game Hints", "Input Recorder",
"On-Screen Controls", and there's a whole wide variety to kind of cover
a lot of core elements of a typical game's Input System. I'm actually using one of these,
and that's "Rebinding UI", which is actually inside
this project. Because it's something
that a lot of people are asking about,
and they're curious about, and it was a big pain point in the old Input Manager. But with the new Input System, it's a lot easier. And this project demonstrates
how to do this. So with this "Package Manager"
Input System package installed, you can then start using
the new Input System. Now, one thing you might have to do
is underneath the "Project Settings" is locate "Other Settings". And there'll be an option,
which is "Active Input Handling". You can choose
the "Input Manager (Old)", "Input System (New)", or "Both". Now, I've included "Both"
to show the API for the old system
and the new system. So if I go back to here,
now we've got this set up, let's actually start. Now, one of the things
that the Input System brings is a new asset that you can create,
which stores "Input Actions". And this is basically creatable from the Project Asset
creation window. You notice it's nested here,
or set here as "Input Actions". When you create one of those,
you can double click it, and it opens up
a new contextual window. Kind of like if you created
a shader graph, or a VFX graph, or say, an animator,
animation controller for the animator system. The Input System's very similar
where you have this asset which holds a UI that allows you
to kind of sculpt and create your player controls. So if I now jump over
to this window, and go through it, I'm going to set up
this warrior's "Action Maps", "Actions", and "Bindings". So here for the warrior
and its input actions, there are three columns: "Action Maps", "Actions,"
and "Properties", and all of these relate
to one another. "Action Maps" are kind of like
control sets. So if you think of a typical
open-world game, let's say, "Grand Theft Auto",
or "Red Dead Redemption", or something like this, you would have
different control sets for different scenarios. For example, you'd have one
for third-person movement. So you have a joystick
to run around, perhaps an "A" button does jump. Then you'd perhaps have
a control scheme for riding a horse, driving a car,
where a joystick is going to operate differently
in the movement. And the "A" button is going to be,
say, accelerate for a car. And then maybe you have also
a "Pause" menu or some kind
of inventory selection. The joystick's not going
to be movement, but it's instead going to choose
different items or different weapons,
and things like this. And "A" is going to be
a confirmation. So with these "Action Maps"
you can create multiple of them. Currently, I just have one for the player controls
for this warrior. But you can create multiple
"Action Maps" and, through a very simple API,
switch between them at runtime. Now, for this Editor demo,
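[EDITOR'S NOTE] A minimal sketch of that runtime switch, assuming a `PlayerInput` component and map names like "Player Controls" and "Menu" (the names are illustrative, not from the project):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class ActionMapSwitcherSketch : MonoBehaviour
{
    PlayerInput playerInput;
    bool inMenu;

    void Awake() => playerInput = GetComponent<PlayerInput>();

    public void ToggleMenu()
    {
        inMenu = !inMenu;
        // Disables the currently active map and enables the named one.
        playerInput.SwitchCurrentActionMap(inMenu ? "Menu" : "Player Controls");
    }
}
```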
I'm just going to have this one, "Player Controls", for this warrior,
but when you download the project there'll be one
for a simple menu toggle to turn on and off a pause menu. And I'm looking at adding some more options in the future. So under this "Action Map"
we have two "Actions", "Movement" and "Attack",
and these are very abstract. So notice that these aren't saying,
you know, spacebar, joystick, and things like this. The way that the Input System works
is you create these actions, and then you assign
multiple bindings to them for different forms of devices
and platforms. So for "Movement",
because our warrior moves around using an axis, you notice here
with it selected the "Properties" have some options. So we can choose a movement type,
so we want the "Action Type" to be "Value", because a joystick is not really
a button, it's not a pass through, it's going to return a value. And "Control Type" is kind of what type of value
it's going to return-- is it going to be an axis,
is it going to be an analog. In this case, it's going
to return a Vector2, x and y of the joystick. Now, if I go to "Attack",
that's going to be a button, because we want to push x
or push spacebar, push a trigger or tap a screen,
to be able to attack. So it's just going
to be a button action. Now, for movements, what I can do
is I can click this little plus, and I can add a binding,
or a 2D Vector Composite. And if I create
the 2D Vector Composite, it's going to create a 2D vector,
with "Up, Down, Left," and "Right", which is basically going to replicate
kind of like a joystick or an axis. I'm going to call this "WASD Keys". You can name your bindings
anything you want. But this one's the most accurate for real gamers
or true gamers out there. And for each of these bindings,
I can then choose a path, or what "Up" corresponds to. Now, if I select this "Path",
and open this option up, we get a very long list, and I'm not going through
every single option, of all the different types of bindings
or different types of inputs that you can set up. You notice there's some
common familiar ones. "Gamepad", "Joystick",
"Keyboard", "Mouse". There's also a couple
of other options. Such as "Pen", "Sensor", "XR Controller", "XR HMD",
"Other" even. And you get quite a lot of variety
straight out of the box. Now, I haven't installed
like an Xbox SDK or a Nintendo Switch SDK,
or something like this, or anything to do with VIVE,
or other XR platforms. A lot of these settings,
a lot of these tools, already come with that package. So if I go to "Gamepad",
you'll notice that there's options for "Android Gamepad", "iOS", "PS4". This stuff all comes
with the package. So obviously, I want to set this up
for keyboard input, so I'm going to go to "Keyboard",
and notice that we then get a very long list,
or I can choose any key. So I get a very long list
of basically every key on the keyboard. I only want very specific things. So I can instead actually just type
into this "Keyboard", and it will filter by that search. So I'm going to say "Up" is "W",
"Down" is "S". To save time, you can also click
this little "Listen" button, and what it's going to do
is it's going to listen out for any key that I push. So instead of cycling
through that list, or searching, I can set just here,
"Left" is the "A" key, and I can set that up. So if I go through here,
and set this all up. I've now got "W, A, S, D",
"Up, Down, Left, Right". And that's all I need to do
for this movement binding. I'll come back to it
a bit later on. And what you'll also notice
for these bindings is you get this control scheme,
so here I have a keyboard, and you can set up different
control schemes for different types of things,
like gamepads, XR controllers,
and things like this. So these are all going to be set up
with the "Keyboard" control scheme, which you can switch later on
on an asset I'll show you in a bit. Let's set up "Attack." So if I click "Attack"
and click that little plus option, now instead of doing
a 2D Vector Composite, which is for a joystick,
I'm instead going to do a binding. And this also has a path, and what I want to do
is I want to bind this to "E". So "E" is going to be "Attack". And this is also part
of the "Keyboard". So now I've got these two "Actions":
"Movement" and "Attack". And of course, I could create
a lot more "Actions". And "Movement" is bound
to "W, A, S, D", and "Attack" is bound
to the "Keyboard". After I'm done setting this up,
I can click "Save Asset", just like you would
with a shader graph, or a VFX graph, or kind of
most other contextual assets that hold something,
hold some data that you then use. And this asset is then ready to use
in the project for the warrior. Now, if I select the warrior,
what you'll notice is that we have
this "Player Input" component. And the "Player Input" component is kind of like an instance
of this asset. So with the "Player Input" component, think of something
like an audio asset, an audio clip asset. And you have an audio source
which kind of takes that audio clip and instances it inside the world. Now, when it's instanced in the world,
it's then going to take that clip and then play it
or do something with it. Similar to do with
an animation controller being played by
an animator component in the world as well,
or in the scene as well. And the "Player Input"
is very similar to this. So if I enable this, I can then
choose which type of action sets, or which type
of action asset to use, because you can create more than one of them. So I'm going to assign the "Warrior". I can choose
which default scheme to use. So "Any" or "Keyboard". I get a couple of other options,
which I'm not going to use. But one of the most
interesting things here is that the behavior
is going to be defined on what you're going to actually do
with these "Actions". So we set up the "Actions"
"Movement" and "Attack". What you notice here is
if behavior is set to send messages, we get a couple of messages
which are going to be sent locally to this object
and its various scripts. We get some built-in stuff,
like "OnDeviceLost", "OnDeviceRegained",
"OnControlsChanged", things like that. We've also got "OnMovement"
and "OnAttack" based on these "Action" names. So if I set up another one,
say, "Jump", it's going to add "OnJump" as well. Now, if, like me,
you're a little scared of the "Send Messages",
you can also change this to be "Broadcast",
"Invoke Unity Events", "Invoke C# Events". So if I just switch over
to "Unity Events" what you'll notice
is that we get "Unity Events" to be able to bind callbacks
to each of these different things. So "Movement" will then fire off
on this method on this particular script. The first time, I'm just going
to use "Send Messages", because it's very straightforward and that's how I've set it up,
but you can obviously go in and change this however you want, especially when you set up
your own games and systems. So now that I've got this set up,
so "OnMovement" and "OnAttack", I'm just going to go to this
"Player Controller (Script)", and turn off this
"Use Old Input Manager". And I'm now going to go in
and have a look to see what this script
actually does. I won't read through
everything here, but if I go down
to these two methods here, "CalculateMovementInput",
and "CalculateAttackInput", I've wrapped them in a boolean
for "useOldInputManager", and the old Input Manager's
usage for movement is pretty straightforward. It basically uses "GetAxisRaw("Vertical")" and "GetAxisRaw("Horizontal")". If you've written any input code in Unity using the old Input Manager, you're probably
very familiar with this. And it's going to take
these two values, construct a new Vector3,
and pass them to "InputDirection" to then be used later on, to be used to move the character
in a certain direction. And "CalculateAttackInput"
is very similar as well. It just has: if (Input.GetKeyDown(KeyCode.Space)), then it plays the animation. Both are really straightforward,
very simple things. Now obviously,
with that boolean turned off, I can scroll down to the bottom,
and have a look to see what the new
Input System code does, and how it differs. So notice I had those two callbacks that were visible
in the player input: "OnMovement" and "OnAttack". These are now here;
these are now callbacks I can then use, and use their data. So, "OnMovement", you can then get
the value of Vector2. Because, remember, we set up
Vector2 in the property for movement. And now I'm basically taking
the axis result and then passing it back
into "inputDirection", just like we did
with the old Input Manager. "OnAttack" is even simpler. It basically just takes
that value of "OnAttack", detects it's a button,
and it just plays the animation. Now, this is obviously
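[EDITOR'S NOTE] With the "Send Messages" behavior described above, the callbacks are plain methods named after the actions. A hedged sketch of what such a script can look like (field and trigger names are assumptions, not the project's actual code):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class WarriorInputSketch : MonoBehaviour
{
    Vector3 inputDirection;
    Animator animator;

    void Awake() => animator = GetComponent<Animator>();

    // Called by PlayerInput for the "Movement" action (Vector2 control type).
    void OnMovement(InputValue value)
    {
        Vector2 axis = value.Get<Vector2>();
        inputDirection = new Vector3(axis.x, 0f, axis.y); // x/z plane movement
    }

    // Called by PlayerInput for the "Attack" action (a button).
    void OnAttack()
    {
        animator.SetTrigger("Attack"); // hypothetical animator trigger name
    }
}
```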
a much simpler entry-level example, but I really wanted to show you
like a really simple example of the old Input Manager code,
and the new Input System code. There's obviously a lot more
use cases and scenarios that you can run into
a bit later on. This is kind of like
the core fundamental for a simple movement,
action system that this demo has. So if I now go back. Check that everything's set up. So I got my warrior,
got the "Player Input" components, got the "Input Action" asset. It's sending the messages. And I have "Old Input Manager"
turned off. If I click play, what you'll notice
is that the warrior is running and animating. I can push my arrow keys,
"W, A, S, D", and I can hit "E" (it wasn't the space bar, it's "E" now), and he attacks. So it basically looks
almost identical to what I had before. But this is now using
the new Input System, with this visual Inspector. I didn't actually have to write
really that much code. Just a few lines. And most of the actual binding setup, hooking things up to the keyboard, is all done in this interface. Now, one thing you'll notice
is that this warrior has a little UI floating
above its head in world space. Showing number one,
showing it's one warrior that's spawned to the scene. And it basically shows the input device it's connected to, which is "Keyboard". So it's displaying input values
from the Input System. And if you look over here into
the "Player Input" component here, we have some "Debug" values
at run time. We have "User", which is "0". In the UI I've shifted the integers up by one. So, obviously, games usually have
player 1, 2, 3, 4, not player 0, 1, 2, 3. We've got the "Control Scheme",
we're using "Keyboard", and we've also got "Devices",
in this case a "Keyboard". And this data will change depending
on what type of device is connected to that "Player Input"
and that input "Actions". And all I've done
is I've just written some visual display code
that takes that debug data and then presents it
dynamically in this UI. So I figured I'd add this
to this project just to show how
you could use that data, but also present it
in your game in some form. I've seen a lot of games that have
Player 1, Player 2, Player 3, but it might also be useful
in some menu settings to show what type of device
is currently connected. So we have a character
running around like this. Now, this all works
and this is all super cool. And you're probably thinking,
"Great, you're back to basically where you were
five to ten minutes ago." And what I want to do now
is kind of now build on top of this control scheme, and add a couple more
new fun features that the new Input System gives you. So if we go back to this window,
let's set up another "Control Scheme." In this case, this one's going
to be for gamepads. You can name these "Control Schemes"
anything you want. So maybe you have like
a "Steam Control Scheme", or you have a "Control Scheme"
for a, I don't know, a conference demo,
or different platforms, it's really up to you. In this case,
I'm calling it "Gamepads". And I'm going to add
the "Gamepad" list. So now that I've created
this "Gamepads" list, one thing you might notice
is that "Movement" and "Attack" have lost their bindings. But they haven't actually
lost their bindings. They haven't disappeared anywhere. As I'm filtering
this "Control Scheme" by "Gamepads", it's not going to display the ones to do with "Keyboard". So if I switch that filter
back to "Keyboard", it's going to show you
these bindings accordingly. And if I switch to "Gamepads",
it's going to have the same actions, because our warrior still,
with a gamepad, has movement and attack. Just it's going to have
very different types of bindings. I can of course, choose
"All Control Schemes", and show kind of "Keyboard",
"Gamepads", all these different bindings
in one location. So let's set this up to work
with a PlayStation 4 controller. So I have a PlayStation 4
controller just here. Like so. I've also got
an Xbox controller, and a couple of other
devices with me. And on a PlayStation 4 controller,
we have a couple of joysticks. But we also have joysticks
on an Xbox controller, and Nintendo Switch controller. And actually, most general
controllers have some kind of left joystick
or some main joystick. What we want to do
is we want to set up movement to not just work
with a PlayStation controller, or Xbox controller. We want to be very general,
very generic. Work with kind of all
controller joysticks. We could do that quite easily
without even touching the code. So if I go to "Movement" here,
I'm going to add a new binding, because we can actually add
multiple bindings to an action. And on this binding
I'm going to choose "Gamepad". Now, I could go into these options. So I could go to "PS4 Controller",
and choose one of these. I could go to "Switch Controller", and set up bindings
for each platform. But instead, I'm going to go
to this generic list at the top and just choose "Left Stick". What you'll notice here is it says,
"Left Stick (Gamepad)". It's going to work
for the left stick for any gamepad that we support. Now, if I go to "Attack",
this is a button, this one's much easier. I want to add a new binding
for "Attack". And let's do this to... Well, this one's
a little bit trickier, because a PlayStation controller has "x", "circle",
"triangle" and "square". Now obviously, an Xbox controller
has "a", "b", "y," and "x". There's no such thing as a square,
there's no such thing as a triangle. They're all very different,
in slightly different positions. But the actual position
of the button is actually the most
interesting part of this. Because with the new Input System,
thankfully, Rene has set up "Button East", "North",
"South", and "West". So this is indicating which one
of those four directional buttons. In this case, I'm going to do
"Button South", which is the PlayStation controller's "x", or "cross"; the Xbox controller's "a"; the Nintendo Switch controller's "b". The manufacturers made it easy for us to get a little bit mixed up when setting up input. So thankfully, Rene has made it
much, much easier for everyone. So now that we've got
this all set up, I can hit "Save Asset", and then,
this will work straightaway. Now, we mentioned...
before I jump into the Editor, oh, I'm also going to set up
for the "Gamepad" control scheme here just by ticking that button. But we mentioned a bit earlier on that you may have difficulty
knowing what the references are, or perhaps debugging, you know,
what is a "triangle", what is "right trigger",
and things like this. So if I now go to "Window",
and go to "Analysis", there's a new option that comes
with the Input System package called "Input Debugger". If I open up this "Input Debugger", what you'll notice is that
it has a list of connected devices. In this case,
"Keyboard" and "Mouse". Now, if I plug in
a PlayStation controller, which I'm going to do right now, notice that the list updates, and it's updated outside
of "Play Mode", which is super cool. And it says,
"DualShock4GamepadHD". And I can actually
"double-click" this, and open up this window
just for this device. This also works for "Mouse"
and "Keyboard", and things like this. What you'll notice-- actually,
I'll make this a lot bigger so you can actually see
the full list. Is we've got all
the different buttons to do with
the PlayStation controller. And if I now open, and I'm going
to have tactically do this, I now hold up
the PlayStation controller. Notice that as I push
different buttons, this interface is basically
going to-- one sec. I need to probably make it
a little bit smaller. Okay, basically, you can see
the values are actually updating as I push them, like "Right Stick",
"Left Trigger". If I close this one
for the PlayStation controller, and open up, I don't know,
"Keyboard", and push keys on the keyboard,
notice it's logging those. And we have all
these "Keyboard" keys. So for each device, it's going
to have contextual settings or contextual values
for basically that device. So you can actually test
and simulate all the input debug stuff
when you plug in a controller, before you even use
the input "Actions", or before you even use any code. So now that I've got this PlayStation 4
controller plugged in, make sure that everything works
in terms of the warrior. So we got the "Warrior". We've got these different
"Control Schemes", so I'm now going to use
a "Default Scheme" of "Gamepads." Now, when I enter Play Mode,
what's going to happen is this warrior
is now using PlayStation. He gains a very snazzy blue look. Not exactly official Sony branding, but it's close enough. The recoloring of the team colors
is actually done with shader graphs. So if you're watching this,
and you think, "Okay, this is cool, but I'm more interested
in shader graphs and VFX", you can even download this project
and study that part if you want to. I've already set up the action--
the control map for you, including one for just the beard,
which is kind of cool. But I digress. So if I now go back
to this project, as I move this PlayStation
controller around (and I have to do this while I've got the webcam), the character's moving, and I can push "x". And basically, the character's
now going to attack using "x". And notice that I did that without actually changing
anything in the code. All I did was just choose different
bindings for these actions. And the other cool thing is you can also switch
these at run time. So you notice here that
the "Debug" controls are saying User 0, it's using "Gamepads", it's using "DualShock". And if I push a key
on the keyboard, it's going to switch
to the "Keyboard". So I'm instantly switching between
PlayStation controller and also "Keyboard" on the fly. It's basically doing autoswitching. This is going to be really useful
for a lot of people doing games on say, Steam,
or other type of platforms, in Game Jams, in games
with different types of gamepad support where they're going to be able
to support a wide range of different platforms,
a wide range of different inputs. So yeah, we have now got this. Now also, if I plug in
an Xbox controller... So notice I can just plug in
an Xbox controller. If I go to my device list,
notice we got here Xbox controllers picked up. I've actually plugged in
two Xbox controllers to show that when you plug in
more than one of these devices, such as multiplayer
with four Xbox controllers, or four PlayStation controllers, it actually detects each of these. And I've got
an Xbox Controller here. I've plugged it in.
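What Andy is showing here, devices being picked up the moment they're plugged in or pulled out, can also be observed from script through the Input System's device-change notifications. A minimal sketch (the `DeviceWatcher` class name is just an illustration, not something from the demo project):

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

// Sketch: log controllers as they are plugged in or unplugged at runtime.
public class DeviceWatcher : MonoBehaviour
{
    void OnEnable()
    {
        InputSystem.onDeviceChange += OnDeviceChange;
    }

    void OnDisable()
    {
        InputSystem.onDeviceChange -= OnDeviceChange;
    }

    void OnDeviceChange(InputDevice device, InputDeviceChange change)
    {
        switch (change)
        {
            case InputDeviceChange.Added:
                Debug.Log($"Device added: {device.displayName}");
                break;
            case InputDeviceChange.Removed:
                Debug.Log($"Device removed: {device.displayName}");
                break;
        }
    }
}
```

The same event also reports devices being disconnected and reconnected, which is what drives the "Controller Disconnected" UI shown later in the demo.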
Let me just retry that again. I've now got the Xbox One
controller here. And it's working
with the "Button South" as well. So it works with "x"
on PlayStation, and now, the "South" button,
which is "A" on Xbox. So notice I just plugged in
an Xbox controller. I didn't actually change
anything in the code. I didn't even re-save this asset. It just worked because
the general gamepad option is all set up for that. So that's pretty cool. That was a moment
where I tried this, and said, "I wish I had
this T.A.R.D.I.S. time machine to go back in time
and take Rene with me to all these projects I did
in Unity ten years ago." Yeah. So thank you, Rene. I've got a couple more
fun things to show you. So now that we have
multiple devices plugged in, PlayStation and two Xbox controllers, what about having
multiple warriors at runtime? What about
local multiplayer? This is a very common thing. A lot of people like
to build and play, especially with Nintendo Switch
having Joy-Cons and things like this. So in this example project, I've got a
"Spawn Multiple Players" button. And basically, it's going to take
this "Warrior" prefab and spawn multiple instances of it,
in this case, three. So now, if I hit play... And check that's "Warrior", okay. And enter "Play Mode"... Okay, yeah, I forgot to assign
the "Action" asset. That was my fault. What you'll notice is
it's now spawned three warriors, with "1", "2", "3"
getting that "Debug" value. I can now actually just spawn
these warriors, run around, I need more hands to play this. And they're clearly not respecting
social distancing here. I now have this PlayStation one,
Xbox one, and this is the same prefab spawned multiple times. It's not actually saying,
this is Xbox prefab, PlayStation prefab,
and things like this. It's just the same "Warrior" prefab
spawned a couple of times. And just to prove this to you,
in the "GameManager" I've got this spawn script which is just a very standard
instantiate players, and with the "Player Input"
component it's then automatically detecting
a device and assigning it to it. And also at runtime,
this "Input Debug" is very useful in showing the different types
of users connected. So here we've got
one of the... "User #0". And it's connected to a DualShock. So you can actually debug
and test all this at runtime. You can see, "Oh,
the Xbox controller to this one," or this one, and things like this. And just before I leave "Play Mode", another thing I've added
into this demo, and this was also a big pain,
is actually disconnect and reconnect to a controller. So I take the PlayStation
controller, and unplug it so now the wire is unplugged, and what you'll notice is that the UI is updated to
"Controller Disconnected". I've changed it to an "x", and the character
actually gains a white beard. It's just a simple aesthetic change. Note, that's because
there's a callback in the system now, on this "OnDeviceLost" here, which basically you can then
detect, and pop up a menu saying, "Hey, Player 3's controller
has run out of battery," or "Disconnected"
or something like this. And you can actually
re-plug it back in, all in runtime, and hey look,
PlayStation controller is now working, and everything's
working great again. So a lot of these little tools,
little menus, little APIs, and little things like this are really going
to make it a lot easier for people to create games
with different forms of input. And hopefully, this project
can be useful for people to kind of dissect
and extract these chunks of code in these little examples. Now, before I pass it over
to Rene and Will, there's one last thing
I want to show you, and that's mobile, because I knew
some people ask about it, so I put that
into the project as well. So here I have a UI input here
of a little axis, a little joystick axis,
and a little sword button. Now, a lot of people
make mobile games, and one of the elements
of mobile games for something like this,
is a virtual joystick and a virtual button. So this UI, which is using
the Unity UI system, UGUI, has components
that are compatible with it, which come with the Input System,
and at this opportune moment, my neighbor is now drilling,
so that's great. And with the on-screen stick and on-screen button
components that come with this, you can set up a control path. So for the on-screen stick,
we can set up a control path which looks almost identical,
well, is identical to this property's binding path. And here, I'm going to choose
"Gamepads", and "Left Stick", because remember,
we've already set up "Left Stick" to work with movement,
so we might as well use this UI to work with it as well. And if I go to "Button Attack",
we go to "Gamepads", we do "Button South". It's going to kind of piggyback
off of these "Gamepad" controls. So now, if I enter "Play Mode" and spawn,
let's say, one warrior, make sure that prefab
didn't have the... And we'll wait. Yup. Enter "Play Mode" here. Notice that when I move
this little joystick here, it's basically simulating
basically like a touchscreen. So I can click this little button
to move around, I move this little joystick
to move around, and I can click
this attack button as well to attack with the character. And if I now open up my phone, because I've already
built this to Android, just to prove it doesn't just work
by simulating the touch inputs in the Editor. So I've already built this project,
this exact same project, to device. So I've now got the warrior
moving around with a virtual joystick,
and I've got tapping a button to attack, and he's got
a nice purple. If you build it to a device
and use the phone as a gamepad, he has a nice purple branding
to it as well. I try to change the colors
to be topical. So we've now got basically
the same Input System without having to hard wire
different code working on a touchscreen device. And just to round off,
very quickly. I'm going to spawn five warriors, with all the controls
and inputs plugged in. And what this
should do is it will... basically spawn a different warrior
for each of these input devices that I've shown you. So we've now got one
for the keyboard, which is currently active. One for one Xbox controller,
one for the other Xbox controller, one for the PlayStation controller,
and also one for this on-screen UI. So just using that one prefab
spawning those instances, all the devices plugged
into this machine, can piggyback off of it. It also works with
the Nintendo Switch Pro controller, and probably lots
of other gamepads. But I need to borrow
a lot more gamepads, and a lot more hands,
and ask my roommates to help me. Because juggling
all these controllers and wires is a lot of fun, right? So yeah, hopefully,
this demo was useful in sort of giving
an overview of the project, giving an overview
of the Input System. There's a lot of other fun stuff,
so I actually implemented a pause menu to show
that you can actually rebind different buttons
to different things. So I can rebind the PlayStation
controller's "cross" to "R1" and then I can hit "R1"
and it will basically attack. So you can actually
download this project. There's lots of really nice
little tools and things that you can learn from
and play with. I'm going to look into extending
more of it in the future. So I'm now going to stop sharing
my screen and pass it back over. Before you jump out
of screen share real quick, would you mind showing us
the "Listen" button? I'm not sure
if we covered that in this. As in listen for input. I did show it briefly,
but I can show it again. It's just something
that always comes up as people wanting
to quickly find the input that they're pressing. I know the analysis tool
obviously shows you that you are, also confirms
that you're getting input. I think it's always useful
to remind people that they can click on the
"Listen" button if they want to just
hold up a gamepad and press something to assign it. Yeah, sure.
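The editor's "Listen" button also has a runtime counterpart, which is what powers in-game rebinding screens like the pause menu Andy mentioned: interactive rebinding. A rough sketch, where `RebindUtil` and the `action` argument are hypothetical names for illustration:

```csharp
using UnityEngine.InputSystem;

public static class RebindUtil
{
    // Sketch: wait for the next control the player presses and bind it
    // to the given action, similar to the editor's "Listen" button.
    public static void RebindInteractively(InputAction action)
    {
        action.Disable(); // an action must be disabled while it is rebound

        action.PerformInteractiveRebinding()
            .WithControlsExcluding("<Mouse>/position") // ignore noisy controls
            .OnComplete(op =>
            {
                op.Dispose(); // release the native rebind operation
                action.Enable();
            })
            .Start();
    }
}
```

The resulting binding override is stored on the action, so rebinding the PlayStation "cross" to "R1", as in the demo's pause menu, needs no code changes elsewhere.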
So I'll share my screen again. So I'm sharing my screen again. What I'm going to do now
is I'm going to... I want to leave "Play" mode. I'm going to go to this "Attack"
and add another binding. And when I click here
on this "Path", obviously you could go
through this list, and we could do a ten-hour webinar
where I go through every single option,
and ask Rene for lots of help. Because some of these
I haven't even played with yet, such as "Pen" or the "XR" stuff. We have this little "Listen" button,
so if I click "Listen" it's going to say,
"Listening for input..." So if I now, for example,
push a button on the controller, like "R1" or the "Shoulder", notice
it's automatically detected. "R1" for the PS4 controller,
or "Right Shoulder" for Gamepad. Now, I could choose "R1" if it's going to just be
a PlayStation game. But let's say, for example,
we want it to also work with Xbox controller,
which also has a "Shoulder Button", or a Switch Pro Controller,
as it also has a "Shoulder", I can then choose that. This also works with, say,
the joystick. So if I just move,
wobble the "Left Stick", it's going to show all
the different forms of "Left Stick" there
you could filter to. It also works if I, say,
push like a "Keyboard" key. So let's do, we really, really
want to bind something to "Enter"-- oh no, not "Enter", let's choose something
that's not going to move to like... [Will] I think you're in
the search [inaudible]. [Andy] Yeah, sorry. "Control", so notice
it's picked up "Control", "Right Control",
or "Any Key" on the keyboard. So "Listen" is super handy
to not go menu hunting, but instead filter
to what you want. Kind of like shader graph,
you can type in, "color" and it'll pop up
all the color-related ones. It's similar, and just as handy. Especially if you have
a controller, I mean, people are used
to PlayStation, Xbox controllers at this point, but there's a lot
of new controllers on the market which people are still wrapping
their heads around. And if they want to understand
what the terminology or the geography is
of the controllers, and "Listen" can be
super handy for that. Yeah, and worth kind of reinforcing
that when Andy's showing you there the gamepad versus PS4
or versus Xbox controller, that is that generic binding
that will work for all controllers. So I, myself, I'm working
on a game where I want to support Xbox pad, and PlayStation pad
at the same time, because I don't know
what my play testers might happen to have,
or even their 360 pads, then I use the gamepad binding,
because I want it to just work for all of them. If you are working maybe
on a console game that's specifically only
going to be on PS4, then you might choose
that specific binding. [ANDY] And I guess some people are going to filter
a little bit more. Because obviously the Switch
has "plus" and "minus" buttons for various options,
like "minus"... I think it's "minus"
on Animal Crossing, opens up "Pause" menu, right? So maybe you're making a game
you want to utilize "plus" or "minus"
for camera zoom, I don't know. They could be anything. Whereas, obviously, PlayStation
has a "Share" button. So you can obviously filter it
to be more specific, but you can also be very general. The choice is there. I guess one thing I'll jump over
to Google Chrome now. So I've got a GitHub repo. So this repo is actually public
and live now, and I think it's in a slide
Will will show or has. And I've got the project here;
master is on 2019.3, and there's a branch for 2020.1. And I've also tried to write
like kind of a bit of a project overview
with lots of information. Usage, you can download
the project, learn from it, use the tools, learn from how,
I don't know, Shader Graph is set up
with the Input System, or kind of a bit of anything. You can also build on top of it
in your own projects. If you see it and you think,
"Oh, great, I'm making a game with warrior vikings
that run around in a very open, boring,
matrix-y environment," then this is going to be perfect. Or if you want to extract code,
and you think, "Oh, I probably want to rewrite
most of that, but I really like the text floating
above the character's heads." Then you can use that as well.
It's really up to you. So you can feel free to dive in
and go to this repo. And open up issues,
and I'll have a look at the issues, and I'll fix things and look into
how I can extend it and make it better in the future,
and things like that. I'll stop sharing,
because, obviously, we're running out of time. Thanks, Andy.
So that's a fantastic demo. For those of you who want
to pick up the demo, we've put the link into the chat. But if you're watching this
after the fact, you can use this handy link. So I'm just going to present and then quickly
just share that link. So if you look on the screen now,
you can go to: "on.unity.com/InputSystemWarriors." And that's a quick link
to get to Andy's GitHub repo. But without further ado,
let's jump into some of the questions that have
been kind of flooding in through Zoom Q&A. If you are watching us
after the fact, just a reminder that there's
a dedicated forum for this. So you can use
the link on the screen: "on.unity.com/input_system"
to get to the forum easily. So we're just going to take
the live questions for now. We'll be answering things
on the forum later. So I'm going to unshare real quick. [HUMMING] Okay. So Rene, where would you like
to start with all of these questions? We have a really good amount
of questions. Some of them are similar. Is there anything that's caught
your eye so far? By the way, I don't think
the link was visible. Oh, I'm sorry. I'm just going to put that
on the screen one more time. Maybe I didn't complete
my screen share properly. Oh yeah, yeah, now it's visible. Oh yeah, our next slide, cool. Okay, so here's the links.
Sorry about that, everyone. Meanwhile, we'll look
at the questions on our other monitors. So yeah, Rene, where would
you like to start? Cool. Now unfortunately
with the time we have left, we probably will only get
to a fraction, yeah, like you said,
there's a forum section, and there's a part in there
where we will... like whatever is asked here
we will answer there, whatever we don't get to. But yeah, otherwise, okay, another thing that has
come up repeatedly is DOTS. Will we support DOTS? Yes. Basically, in terms of roadmap,
the three biggest next items we have are improvements
to the action stuff. There are things like routing input,
basically, where one action,
for example, can prevent another action
from triggering, that kind of thing. Then we have gestures.
A really big thing. Right now our touch support
is very basic. We want to flesh out
a whole lot more, and then third item,
I mentioned it last, but it doesn't mean
it's the least important, not at all, DOTS support,
like specialized support for DOTS, where you no longer have to
restrict yourself to the managed APIs
in the UnityEngine DLL. We will work on that pretty soon. Cool. Another one
I've seen come in a few times, asking about Joy-Cons versus
the Nintendo Switch controller. What's the status there? Right now, we support the Joy-Cons
and the Switch Pro Controller with the full features
only on Switch. So you need to have
the console-specific package. We do support
the Switch Pro Controller, but without motion sensors
on Mac and Windows, I think. But not the Joy-Cons. It's on the list.
We have to see there... it's not a known, documented format that Nintendo wants other
people to pick up. So we have to see, we may
not be able to support the motion sensors. We'll see, but it's on the list. Cool. Let me see
what else we got here. Tap versus double-tap versus hold? Key combos.
Jay-J is asking that question. Key combos or touch? Well, they're talking about like
if you're chaining together certain things, like how
would someone go about that? Oh, right. So we have a model
in there for interactions which includes double-taps,
slow taps that kind of thing. It's extensible, so you can write
your own interactions and add them to the system. They work just as natively
as the stuff we write ourselves. And the same goes for the composite stuff
that Andy showed. It's also an extensible system.
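As a rough sketch of the interactions and composites Rene describes, here's what adding a "hold" interaction and the built-in `ButtonWithOneModifier` composite to a hypothetical attack action looks like from code; the same setup can be done in the Actions editor UI:

```csharp
using UnityEngine;
using UnityEngine.InputSystem;

public class AttackInput : MonoBehaviour
{
    InputAction attack;

    void OnEnable()
    {
        attack = new InputAction("Attack");

        // "hold" interaction: performs only after the button is held for 0.4s.
        attack.AddBinding("<Gamepad>/buttonSouth", interactions: "hold(duration=0.4)");

        // Built-in composite for a Shift+B style combination on the keyboard.
        attack.AddCompositeBinding("ButtonWithOneModifier")
            .With("Modifier", "<Keyboard>/shift")
            .With("Button", "<Keyboard>/b");

        attack.performed += OnAttack;
        attack.Enable();
    }

    void OnDisable() => attack.Disable();

    void OnAttack(InputAction.CallbackContext ctx) => Debug.Log("Attack!");
}
```

Custom interactions and composites plug into the same registration mechanism, which is what Rene means by the system being extensible.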
You can write your own composites. There are some composites in there that we supply
including something for doing Shift+B,
or Shift+Control+B, and that stuff. We plan on extending both,
like right now, for example, we have the problem
if you want a binding for Shift+B and B,
on two different actions, both will trigger as soon
as you press Shift+B. As I said before, that's something
that is on the shortlist of things that we want
to get to next. Great. Another one I've seen come in.
So David Antonielli asked, "What considerations
are involved in deciding between the input callback styles,
SendMessage, UnityEvents. Is it just your preference, or are there technical
considerations?" There are. Like the message stuff
is really good for prototyping. Like it has zero setup,
it's super quick. But it's not efficient,
especially the first call, Unity has to do some really
extensive looking up there. It does some caching internally,
but still, it's a very slow path that in production settings
is usually good to avoid. But for game jam stuff,
and this kind of thing, it's perfect. But, in general, I would recommend
going with UnityEvents; they stay entirely
on the managed side and have a pretty
low performance overhead. Cool. An interesting one from the mysterious R. Will there be support
for Input Systems, like hand-tracking, eye-gaze,
or external interfaces, like Tap, etc.? I think that's two questions
in one really. Hand-tracking and stuff? Well... Oh, oh, like that. Well, there's definitely movement
on the XR space to build out tracking to become
ever more fine-grained. I mean if you look at the Valve Index,
we're already getting to a point where it becomes
much more articulate, and we will see that increasing. So we can get much more
tracking data and much more fine grained. And I expect that ultimately,
we will have full hand models. And probably more than just that...
arm models as well and that kind of stuff. And yes, I expect that
to be fully integrated. So it's not something
that works now, but am I right in thinking we'll probably
make that available as part of the XR
interaction toolkit, at a later date,
at some point, right? Yes, and the low-level inputs,
they will probably surface at the Input System level. We already support some... I'm not sure about
Index support, specifically, but I know that, for example,
the Oculus touch controller has near touches
and that kind of stuff. We expose all that kind of stuff,
and this will only continue. Cool, thank you.
There's one from Lee Perry. It says... I think we kind of
already answered this with Andy's example
of five, but he says, "Does this system work
with DirectInput game controllers for games supporting
more than four controllers? I'm currently supporting
ten local players in my game." We don't have support
for DirectInput, but we have support
for HID in general. So if you're looking
at XInput controllers, no, we only support four,
because we're picking up that with the XInput API,
and that is limited to just four controllers. As far as HIDs are concerned, like the DualShock controller,
and many other gamepads, we support many, so there's no limitation there. And for XInput, I expect
that sometime in the future, when we merge the UWP code base more with the classic
Win32 code base, we will hopefully also get rid
of the limit for XInput controllers. Right, cool. Interesting question
from Philip who asks, is there any emulator of PS4
or Xbox gamepads? We don't even have
a real one for testing. So we don't have
that kind of functionality, but I think the answer to that
is either obviously, to get one, but I wonder if it's something
that you would consider, Rene, as something that we would have
in a similar way to how Device Simulator
sort of provides emulation for people without phone handsets. There are two things
that you can basically already do out of the box, like you can build
your custom gamepad on-screen controls. Basically, like Andy did there,
you can just create a gamepad, basically, with a couple buttons
on screen and two sticks and control your game with that,
and use that for testing. And then, we have
extensive automation support where you can make up
any kind of device input. So you can write your own code,
for example, for automated testing, or even for manual testing. Just create an artificial gamepad
and route input into the system however you see fit. Cool. Just scanning
the rest of the questions. I'm going through and trying
to answer them as well. Because I think the answers
are then recorded somewhere in chat afterwards. Console support, I see
the question on there, yes, but there are specific packages,
because of NDA reasons that you need to get
from the licensee-specific forum sections and install them
for the specific platform, like Xbox or Switch. So you need a license
for the respective console. So if you have a contact,
with any of the vendors, then they'll be able to hook you up
with that packaged already. So migration, a migration guide. We still have to catch up,
even on the migration we had from our old Input System. As far as migration
from other solutions, I don't see us doing that. Also especially the Asset Store
packages, because we don't really
want to compete with our own Asset Store vendors. But migration guide,
like a more useful one than we have for our
old Input System, that will definitely happen. Cool, and someone else asked
if there will be tutorials. So as I understand it
from our Learn team, they're going to be using
the new Input System in some of their new projects
over the next couple of months. So we look forward to that
being implemented there. Someone else asked,
I've lost track, there's quite a few questions here, what method they'd
be using in those. But I don't know these projects
intimately enough these days, so I'll have to find out and tell you about that. ElectricShock asked,
"Can you please post what your YouTube channel is,
with a URL, if possible?" So if you go to <i>Google.com,</i>
and type in "Unity YouTube", you'll get that. Blah, blah, blah. How about VR and AR inputs? Well, like we said, there's some
support for that. And then, if you do have a look
on Package Manager, you will see there's a separate
package coming out for that purpose. It's called XR Interaction Toolkit. And it basically aims to give you
a bunch of prefabs and ready-made rigs
for controls for XR. It looks like
we will have to wrap up. Yes, we will, absolutely. So I'm just going to finish
by putting the links back on the screen,
but thanks to everyone for coming. Andy, thank you so much
for the demo. Rene for making
an amazing Input System. So I'm just going to present
and we'll close out on the screen with the links. Thanks to everyone for attending. And we'll see you
on the next Unite Now. [ANDY] Thank you, everyone. ♪ [MUSIC] ♪