[MUSIC PLAYING] TIM PSIAKI: Welcome to
the third day of I/O. It's great to be here
talking to you this morning. My name is Tim Psiaki and
I'm a software engineer on the AR and VR team. Today Tom and I are going to
talk to you about the tools that we've built to make AR
development easy and fast for every Android developer. As you heard at
the Developer Keynote, ARCore is Google's platform
for building augmented reality experiences. You've been hearing a lot about
ARCore the last couple of days. And I hope you guys are
excited about building for AR. I certainly am. Our goal is that, at
the end of this session, every single one
of you would feel like you can build an AR app. So what we're going
to cover today is how you can get
an asset, how you can build an app
using that asset, and finally how you
can iterate and launch. But before I get
into the details, I want to tell you a
little bit about something that I like to do. So I really like to time
myself doing ordinary things. I'll time myself doing
things like ironing a shirt, or even taking a shower,
or reading my kids a book. It's kind of quirky, I know,
but I like to optimize my time and do these things
as fast as possible. Sometimes my kids don't
appreciate it that much. So what I want to
do today is I want to do something in AR
as fast as I possibly can by doing something
that I call AR done fast. That's right. I'm going to do a speed run
through building an AR app. And Tom, I'm wondering if you
can help time this speed run and see just how fast I
can build an app in AR. Now to make this happen
as fast as possible, I'm going to use the
Sceneform SDK, which is our new 3D framework that
we just announced and launched. And this framework makes AR
development easy in Java. So that I can narrate
what's going on, I did a one take video
recording of developing an app last night. And we'll see how I did. Let's jump to the next slide. And Tom, are you ready? TOM SALTER: Ready. TIM PSIAKI: All right. So let's go. Start. All right. So I'm getting my asset
from poly.google.com. To make this as
illuminating as possible, I'm going to search
for a lamp post. We'll see what I find. OK. So this first lamp post from
Naomi Chen looks pretty good. Let's have a closer look
and see if I like that. This is pretty much exactly
what I'm looking for. So I'm going to
download the model file. And as you can see here
I've got it downloaded. And I've got all the files
here in my Downloads folder. Now what I'm going to do is I'm
going to open the Hello Sceneform sample app. And I'm going to see if I
can get this lamp post being used instead of Andy. So what I'm going to do is I'm
going to drag all of the model files into my
sample data folder, and then using our just released
Sceneform Tools Android Studio plug-in, I'm going to import
that asset into my project. As you can see, as
soon as I click Finish, we've got a Gradle build
going that's doing the import of our asset into our project. And we're going to see it pop up
in the viewer in just a second to see if we like what we got. There it is. So that's looking pretty much
exactly like it did on Poly. So let's see if we can get
that added in place of Andy. So we're going to jump
into the code here. And as you can see, Andy is
loaded using a resource ID. Now you might have
noticed I actually put this file into
the Assets folder so I'm going to change a
little bit how I load it. I'm going to load the file
from the Assets folder, going to drop that in, and
I'm loading the file now. And I press Play almost. Play. OK. Done. All right. Hang on with the time. Let's see what this looks
like back to the slides. There you go. OK. So here's our app. So we're done. We have an app
that's running in AR. We've got a lamp post. We can place our lamp posts. We can place a lot of
lamp posts in our office, and make our office as
decorated with lamp posts as we would like it to be. It's looking great. And so that was our speed run. And so let's see Tom,
how did I do there? TOM SALTER: One
minute, 45 seconds. TIM PSIAKI: One
minute, 45 seconds. That's minutes, not hours. [APPLAUSE] OK so that's how fast
we can build with AR. But now we're
going to slow down. And we're going to talk
through all the tools that we've built that
make this possible. So the first thing I'm going to
do is talk to you about assets. You can get your assets
that you use in an AR app from many different places. You can source them online. You could have an artist
create custom assets for you. But what I'm going
to do for this talk is we're going to get
our asset from Poly. Now Poly is Google's
library for 3D assets. On Poly you'll find
thousands of 3D assets that you can use in your apps. As you saw, I was able to
easily search for a lamp post and find exactly what
I was looking for. But if you want something
else like a tree, you can search for that. And there are lots of trees
available in all sorts of colors, shapes, sizes. There are maybe some
dangerous looking ones. Whatever you want, you'll
find what you're looking for. But if you don't know exactly
what you want and you just like to be inspired, you
can browse by category. Maybe you'll find an asset
there that will inspire you to create a whole new AR app. So have a look around and
see if you can find something that you like. And I think you will. Many of these assets are licensed under the CC-BY license. And so that means that you can
download model files for these and use them directly
in your apps. All you have to do
is credit the author, and then you can use them. So here I'm downloading
the model file. And look at this. I have an asset. As you saw in the
speed run, here it is. Now if you look closely,
you'll see that this asset has three different files. Now you want to make sure that
you keep all of these files together, because each
one of these files is important to actually be able
to render this asset the way we want to. But what do we do with these
files now that we have them? Just for fun, let's crack one
of them open in Android Studio and see what it looks like. So this doesn't really look
that much like a lamp post. Maybe a little bit,
but you probably were guessing that this wasn't
going to look like a lamp post if we just opened
up the raw files. As you can see,
we've got an image. We've got some text, which is
showing what all the vertices are in this object. You might love a file like this
if you're a graphics developer. And if you were using a
game engine like Unity, it would know exactly
what to do with this. It would be able to render this. You'd be able to tweak it. It'd be great. And I actually want
to stop for a second and let you know that at
11:30 right here on stage 7, or of course
streaming on YouTube, there is another session
called Building an AR app with the Poly
Toolkit for Unity. And you can learn a
lot more about Poly and more about using
these assets in Unity. But for now, because we're
not building a Unity app, we're going to build
a native Java app, we want to see how we can use
these assets in Android Studio. So let me talk to
you about Sceneform. So we can work with 3D
models and Android Studio using Sceneform or using
the Sceneform tools plug-in that we just announced
and is available now. We announced it at the
developer keynote on Tuesday. So Sceneform is our
high level 3D framework. And it makes building
our apps in Java easy. Now Sceneform includes
a Runtime API that handles all of your rendering. It handles working with ARCore and making your app work
on the Android side. But Sceneform also
has a set of tools that interact with
Android Studio and let you import, view,
and tweak your assets right in your IDE. So where do you
get this plug-in? Well, like most other
Android Studio plug-ins, you can get your plug-in
right from the Android Studio Plug-ins page. You go to Preferences, Plug-ins,
Browse Plug-ins, and just search for Sceneform
and you'll find it. So once we have the
plug-in, it's super easy to get your assets imported. The first thing we need to
do is place our source files in our projects. We're going to drop them
in the sample data folder. Now this is a little
bit important, because the sample data folder
is a folder that does not get bundled into your APK. Now you might wonder why
would I want these files not bundled into my APK. Well, as you can see here-- well, actually hang on. I missed one thing. As you can see here, we support
OBJ, FBX, and glTF files. And so we're going to
focus more on OBJ files right now because that's
what I got from Poly. But this exact same flow
works with FBX and glTF which you might find
other places online or sourced from artists,
things like that. But anyway, so we
drop these files into our sample
data folder like I said, because we don't want
them bundled into our app. Why don't we want them
bundled into our app? It's because we build runtime-optimized binaries that Sceneform will use to render your assets on Android. Now these binaries are
optimized so they load fast and they look great. They include all of
the data that you need in order for Sceneform
to render your models. So that file-- actually,
can you go back one slide? So that is the SFB file. And so you're going to hear a
little bit more about SFB files later, but just
suffice it to say that SFB files are
the files that we want bundled into our APK. OK. So now that we have our set
of files in our sample data folder, we're going
to right-click. And using the Sceneform
Tools plug-in, we'll click Import
Sceneform Asset. This will trigger our
import wizard flow. So this is the import wizard. As you can see, this
is fairly simple. We have three
fields to fill out. The first is the file
path to our model, and that's already
filled out for us. The second two are outputs,
and both of these files are created by
the import wizard. The first file you'll
see is an SFA file. And I'm going to go
into that in detail in just a couple of minutes. For now, you just need to know
this should be placed right next to your model,
because this is going to become one of
the source files that will show how
your model is rendered. Your SFB file is what you
want bundled into your APK, and so you're going to want to
put that somewhere that gets bundled. You kind of have
two options here. Your options are your Assets
folder and your res/raw folder. You can put it in
either of these. The loading is
slightly different when you get to the runtime,
but other than that, they'll both get bundled. And it doesn't matter, you can pick whichever one you prefer. So that's all we have to
do in the import wizard. We click Finish, and we're done. Let's see what happens next. So as you can see, as
soon as we click Finish we're going to have a
Gradle build kicking off. We're compiling here the
SFB that we just imported. Now what's going
to happen is we'll get these Gradle tasks added. We're picking these
defaults that we're going to fill in in the SFA
that I'm going to tell you about in a second,
and we're building this runtime optimized binary. And as soon as we finish this build, we're going to see the SFA and SFB files added to our project. And so you can see
we've got our SFA next to our model and
our SFB right here. I think I have it in
res/raw right here. But that's not all. We also have a viewer. And so the viewer
is going to pop open as soon as you get your
asset fully imported. And as you can see, we
see exactly what our model is going to look like. Here it looks just like it did
on Poly, which makes sense, because that's what we imported. And you don't have to
deploy to your phone. You can just see it right away. You don't even have to
have an app built yet. You could just import it right
into your project and view files without having
your app built. And this uses the exact same
renderer as our runtime, so what you see is what you get. This is going to look just
the same on your phone. OK. So this looks just
like it did in Poly. But I was thinking I wanted it
maybe a little bit different for my app. In Poly it sort of looks like it's made of rough plastic. And I was hoping it would look
maybe a little bit shinier. So let's see how
we could do that. So I talked to you about
the SFA a couple of times. So there was a text file
that you saw in the viewer. And so that file
is your SFA file. The SFA file defines
how Sceneform will render your asset. You can see it has a bunch
of different parameters. And if you look
closely, you'll see that these might affect the way
your asset is going to look. So we're going to look
at a couple of these. One of them here is metallic
and one of them is roughness. And so by default when
we import from Poly, we get something where
the roughness is one and the metallic is zero. And that's going to make it
look like it's sort of made out of plastic. And what I'd like to do is turn
the metallic up maybe to one so it looks really metallic
and turn the roughness down. Maybe not all the way to zero. I don't want it too shiny,
but I'm going to turn it down, and then I'll see what happens.
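Just to make this concrete, here is roughly what the material section of an SFA can look like after those edits. The exact structure and field names vary between Sceneform versions and asset types, so treat this as an approximation rather than the literal file from the demo:

    materials: [
      {
        name: 'lamp_post',
        parameters: [
          { baseColor: 'lamp_post_texture' },
          { metallic: 1 },
          { roughness: 0.2 },
        ],
      },
    ],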
So as soon as I save the SFA file, we're going to rebuild
the runtime bundle. And as soon as that
is complete, we're going to see that
reflected in the viewer. So here you can see,
I guess, this gif is sort of looping a little bit. But you can see once it
completes the viewer is looking a little bit different. It's looking pretty metallic. This is basically
what I'm going for. So I want to point you to
this documentation online. So I mentioned a couple
of the parameters. And there are actually
a lot more parameters that you can edit
in your SFA file. There are a lot of parameters
that affect the look and feel of your asset but,
there are also parameters that affect things like
scale or the way its collision works for when
you're tapping on your objects. And if you check out this page,
you'll see all of the details on all of those parameters. Now one thing I'll
mention, I mentioned how we support glTF,
FBX, and OBJ files. When you import
each of these files, the parameters will
be slightly different that you can edit in your SFA. And all of the details
are on this site, so I would strongly
recommend you go see which things
you can tweak in each of these types of files. OK. So we've got our asset
looking the way we want it to. Let's see what we need
to do to get it in AR. Now you remember I
talked about placing the runtime optimized
binary, the SFB, in the assets or res/raw folder. So let's look at how we would
load that in our runtime out of each of these. So as you can see here,
this is an activity that you guys are all
very familiar with. We have an onCreate method. And what we want is a ModelRenderable, an object that at runtime is going to describe all the details of how to render our lamp post. You could think of this as
the runtime representation of our file that has
our optimized model. And so what we need to
do is we need to load our file into the runtime here. And we're doing that using the ModelRenderable builder. Now loading can
take a little time, so this API is an
asynchronous API. And all we have to do is create
a builder, set its source-- here I'm setting it to
a resource ID in res/raw that points to the
lamp post binary we built. Now the ModelRenderable builder returns a CompletableFuture, and we're using its thenAccept method so that after loading completes, we can store the renderable that was loaded into our lamp post renderable field so that it can be used later on.
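To make that concrete, here's a minimal sketch of that loading code, assuming the SFB was imported as res/raw/lamppost and that lampPostRenderable is a field on the activity (both names are placeholders for this example):

    import android.util.Log;
    import com.google.ar.sceneform.rendering.ModelRenderable;

    // Inside onCreate(): build the renderable asynchronously from res/raw.
    ModelRenderable.builder()
        .setSource(this, R.raw.lamppost)   // the runtime-optimized SFB we imported
        .build()                           // returns a CompletableFuture<ModelRenderable>
        .thenAccept(renderable -> lampPostRenderable = renderable)
        .exceptionally(throwable -> {
            Log.e("HelloSceneform", "Unable to load the lamp post renderable", throwable);
            return null;
        });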
So that was loading out of res/raw. If we were to load out of assets, it's just slightly different. This code is almost exactly the same except for one line. When we set the source, we just need to set it to a URI parsed from the actual file name instead of a resource ID. You'll notice this is just the lamppost.sfb directly in the Assets folder. This is how we would load that. You can define a directory hierarchy however you please and just reference the path all the way from the Assets folder to your SFB.
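Again as a rough sketch, the only line that really changes is the setSource call, which now takes a Uri pointing at the file in the assets folder:

    import android.net.Uri;

    // Same flow as before, but loading lamppost.sfb out of the assets folder.
    ModelRenderable.builder()
        .setSource(this, Uri.parse("lamppost.sfb"))
        .build()
        .thenAccept(renderable -> lampPostRenderable = renderable);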
So there's a lot more to learn about Sceneform and about how to actually display
these in your app. When I was doing my speed
run, I used the sample, and so that's a great way
to just try this out and see how your assets will look. But if you'd like to
learn more about how to do more things
with these assets and how to display
them in the runtime, you should check
out this session-- it was yesterday-- Rendering for Android Apps. And it's available
on YouTube of course. You can learn a lot more there. So I'm going to jump right into
our actual app we've built. And let's take it for a spin. See what it looks like
now that we've tweaked the parameters in the SFA. All right. So there is our lamp post. It's lit nicely. It's got everything
that you would expect of a giant lamp post
in the middle of your office. And it's looking great. That's just what
I was going for. So let's jump back
to the slides now. So I want to talk about
one other thing, which is I'm asking you to install
this Android Studio plug-in. You might wonder, what
is this plugin going to do to my project build? Do I want to install this? And so let me tell
you a little bit about what's going
on under the hood. So we've built a
Sceneform Gradle plug-in. And this is what is actually
doing the asset builds that I showed you. When you import,
what we are going to do is we are going to
add this Sceneform Gradle plug-in to your Gradle build. The first thing we do is we just
add the classpath dependencies to your project build file. You'll note that we load
this from the Google Maven Repository so you want
to make sure that that's in your project build file. Of course, this is the default
for all new Android apps, so this shouldn't be a problem. And then in your app's build.gradle file, we apply the plug-in just once. We don't do that more than once. But then for every asset that you import, you'll see these asset definitions in your Gradle file.
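For reference, here's a hedged sketch of what those Gradle entries tend to look like; the plug-in version and the asset paths below are placeholders for this example rather than something to copy verbatim:

    // Project-level build.gradle: pull the plug-in from the Google Maven repository.
    buildscript {
        repositories {
            google()
        }
        dependencies {
            classpath 'com.google.ar.sceneform:plugin:1.0.0'  // version is illustrative
        }
    }

    // App-level build.gradle: apply the plug-in once...
    apply plugin: 'com.google.ar.sceneform.plugin'

    // ...then one asset() entry per imported model.
    sceneform.asset('sampledata/lamppost/model.obj',  // source asset (not bundled)
                    'default',                        // material
                    'sampledata/lamppost/model.sfa',  // .sfa output, next to the source
                    'src/main/res/raw/lamppost')      // .sfb output, bundled into the APK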
Now you'll see there are a few paths there. And if you remember
the import flow, you'll remember that
those look very familiar. And in fact, you're right. Those are the exact same paths
that we had in the import wizard. This is all the
import wizard does. It writes these
files into Gradle. So could you edit these Gradle
files manually if you wanted? Sure, that would be fine. Now we think that the Android
Studio plug-in is the quickest and easiest way to get
these in, but of course they can be edited and
changed here as well. And finally, once you build and
you've got these asset rules in your project, you'll
see that here I'm building an APK using the
assemble debug rule in Gradle. And what you see is we're
getting this compile asset task. And we have a compile asset
task for each of the assets that we've added to
our Gradle build. And these are added to
the dependency chain so that they'll run before
the tasks that merge in assets or resources into your APK. And so you can always be
sure that these bundles will be updated-- these optimized binaries will
be updated and ready to go every time you build an
APK, and you'll always get the latest and
greatest version of the optimized binary that
has all your updated parameters. And that should be-- that should-- yes. Oh sorry. I forgot about source control. So one thing you
might ask, since we're building these files as part of Gradle, is: do we need to check them
into source control? Where should they live? And the truth is you
don't have to check in the optimized
runtime bundles. These could just be built as
part of your Gradle build. But we would actually
highly recommend that you do check them in. Since these are
tracked in Gradle and we understand all of the dependencies and the input files that were used to generate our optimized binary, we can always understand when it needs to be rebuilt. And so if you just check in your SFBs and your SFAs, and then pull down the latest source from your source control and build, we'll know which
we have to rebuild and which we can
leave unchanged, and so you'll always have
the latest stuff to run. OK. So now we have our basic app. And I'm going to
turn it over to Tom. And he's going to tell us
how after we've built an app we would test it, iterate,
and launch that app. So here we go. Tom. TOM SALTER: Cool. Thanks, Tim. Hi everyone. I'm Tom. I'm engineering manager
on the AR and VR team. I'm going to talk
about some of the tools that we've built to make
iterating on, building, and debugging your
apps even easier. So first of all,
earlier this year, we released ARCore support
to the Android Emulator so no matter where you are,
what devices you've got, you're always able
to build for ARCore. So if you'd like to develop
your ARCore app without having to deploy your device, get
up and walk around your room, place your objects,
then you can do that. If you would like to develop
in a coffee shop, or on a bus, or on the plane, then
we've got you covered. The virtual environment that we let you walk around in has multiple rooms. It's got tables. It's got chairs. And if you also
want to see what it looks like with a cat in your
room, then you can do that too. So to get started,
all you need to do is use Android Studio 3.1 or above. Use the latest Android
Oreo system image. Make sure you use an image that
has access to the Play Store so that you can
download ARCore, or there are instructions on our website to sideload it if you'd like. And then it should be
the default option, but make sure in the
advanced settings that you're using the
virtual scene as your camera for the back-facing camera. Once you've set that up, you're good to go. So walking around
your room, it's not as easy as holding your
phone up and looking around. But if you played a lot
of video games like me then you'll find it
really, really natural. So use WASD on your keyboard
and the mouse or trackpad, and you can navigate like
a first person video game. So the really cool thing
about the emulator support is we actually run the
entire ARCore tracking stack. We don't fake anything
on that rendered image. We just run the
full tracking stack, which is really cool because
as new features for ARCore come online, then it
will just work magically within the Emulator. So this week in the keynote,
we announced support for augmented images. And whilst all that
stuff works, we don't actually have any images
in that environment just now. But if you get the Canary
build of Android Studio with the Emulator
right now, then we've added this extra
UI which lets you place an image in your world. You can change the size of it. You can put it on horizontal
surfaces, vertical surfaces. And as you can
see, we didn't have to do anything to make it work. Augmented images just works
perfectly in the Emulator. So that helps you iterate. One thing that we've
worked really hard on is helping you debug
your application. So I want to show
you a tool that we've been working on that helps
building and debugging these complex immersive
scenes really easy. With Sceneform, and Unity, and
Unreal engine integrations, we've made building
your app really easy. But sometimes you want
to go under the hood and see what's happening. So we built GAPID, which is
the graphics API debugger. So it's a really
powerful tool that we've been working on that lets you
inspect all of the graphics API commands that your
application is calling and lets you replay
them later on. So if you want to see exactly
how your frame is being built draw call by draw call, or
see what resources your app is using, or investigate
why your app doesn't look quite as you expect, then
GAPID can help. So it also helps you understand
exactly what your GPU is being asked to do as well. So here's a really, really
quick example of GAPID. We're just stepping through
draw call by draw call. You can see the Andy model,
the raw resources that this is using, and you can see
every single glDrawElements call, which contributes
to your final image. And you can see it built
up draw call by draw call. So we're going to go for a quick
whirlwind tour through GAPID. There are two main stages. The first stage
is tracing, which is capturing every single API
call that your app is making. And then secondly,
there's replay, which is on your
desktop or laptop so that you can step
through and reproduce those issues that your app has. So first of all,
to get going, you need to capture a trace
of your application. So File, Capture Trace. You have to choose your
device, choose your package, making sure your package is
debuggable, and then click OK, and then it will connect to
your device, start your package, and then it will start
streaming this information back. So once you've captured an
interesting part of your scene, this is the UI that you get. And you can replay
each frame within GAPID and step through it
draw call by draw call. For OpenGL ES applications, which most of our ARCore
applications are right now, we actually capture from
the start of the application all the way through
until you click Stop. Use the filmstrip UI at the
top to choose the frame that you'd like to
take a deeper look at. And then once you've found
an interesting frame, you can really dig
into the details. On the left, we've
got the command view. So this shows every
single command that contributes to
drawing your frame. It's neatly separated by
frame and by draw call. And any of the parameters to
those functions can be changed, and you can replay
it just to see how that might make a difference. In the center, we've
got a framebuffer pane. So that shows what
the application looks like up to the current command. And that way you can step
through every single command. We've added a bunch
of options so you can visualize that framebuffer
in a different way too. So you can edit the histogram. So if your application
is particularly dark, you can bring it
more into the light, and you can change
those preferences. On the right, there are various
tabs for your resources. So you can see the
textures that contributed to your application. You can see every mesh. Go and investigate
the polygon count. You can see it actually
gives the number of vertices and triangles. So you can take a look there. If you also want to see all the shaders that go into building your scene, and the raw OpenGL state, you can go and
at a concrete issue, the kind of thing you
might see with GAPID. So this is an example that
I created from our Seattle office. This is just an Andy hanging
out on a big chess set. It looks like just
one or two androids just hanging out there. But a really common issue
on mobile applications is overdraw. So overdraw is when you
shade a pixel multiple times. That can become a
performance bottleneck on our bandwidth-limited
mobile devices. So you generally want
to shade your pixels only one, two, or three times. So how many times do
we think those pixels on that front Android
actually get rendered? Well, it's kind of 1, 2,
3, 4, 5, 6, 7, keeps going. It actually goes
all the way to 20, and that's really, really
expensive on our mobile GPUs. So GAPID can let
you see and check your assumptions on
how your frame is actually being built up. It just lets you kind
of go under the hood and see exactly with Sceneform,
and with Unity, and Unreal, exactly what they're doing. So to work around this, you could render your objects front to
back instead of back to front. So my ask of you all who are
building AR applications here is go and take GAPID for a spin. Go and download a trace
of your application. Go under the hood. Poke around and see. Just check your assumptions. I'm always surprised every
time I go and take a capture. So there are a lot more
features within GAPID. Go and take a look at
GAPID on GitHub. And we have a lot of
tutorials and guides on there on how you
can do extra things. GAPID supports Cardboard,
Daydream, and vanilla Android apps, as well as the Vulkan APIs that are going to become
popular over the next few years. So once you've developed your
AR app, and you've debugged it, and it's looking
exactly as you want, your next step is to get
it out into the world. So we'll talk a little
bit about that now. So with AR, there are
two types of applications that you should think about when
you're developing your AR app. There's AR optional
and AR required. So your AR optional apps,
they can be installed and run on devices that
don't support ARCore. But you have to go and
test for ARCore support if you want to create
an ARCore session. On the other side,
AR required means it can't be installed on a device unless that device supports ARCore.
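For context, you declare which kind of app you're building in your AndroidManifest.xml with the com.google.ar.core meta-data entry; roughly speaking it looks like this, with the uses-feature line only needed for the AR required case:

    <!-- AR Optional: installs everywhere; you check for ARCore support at runtime. -->
    <meta-data android:name="com.google.ar.core" android:value="optional" />

    <!-- AR Required: the Play Store only offers the app to ARCore-capable devices. -->
    <uses-feature android:name="android.hardware.camera.ar" android:required="true" />
    <meta-data android:name="com.google.ar.core" android:value="required" />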
For AR optional applications, the onus is on you to go and write the code that checks to see if ARCore is installed and also checks to see if the device is ARCore compatible. So we'll go through
an example of what that code looks like right now. So we have a very
simple availability API. You call ARCore's checkAvailability and wait until it tells you whether the device is supported. If your device supports ARCore, then you can go and enable the UI widget that takes you into your AR mode within your session.
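As a rough sketch of that check (arButton and maybeEnableArButton are placeholder names for whatever UI takes the user into AR):

    // Uses com.google.ar.core.ArCoreApk, android.os.Handler, and android.view.View.
    void maybeEnableArButton() {
        ArCoreApk.Availability availability =
            ArCoreApk.getInstance().checkAvailability(this);
        if (availability.isTransient()) {
            // The answer isn't ready yet; ask again shortly.
            new Handler().postDelayed(this::maybeEnableArButton, 200);
        }
        if (availability.isSupported()) {
            arButton.setVisibility(View.VISIBLE);    // show the way into AR mode
        } else {
            arButton.setVisibility(View.INVISIBLE);  // this device can't run ARCore
        }
    }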
But before you go and check if you have support, you should check whether ARCore
is actually installed or not. So we basically have
this simple API, ArCoreApk requestInstall. That can tell you whether ARCore is installed. If it's not installed, it can go and install it in the background for you.
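A rough sketch of that flow, following the pattern in the ARCore docs (installRequested is a boolean field you keep across onResume calls):

    try {
        switch (ArCoreApk.getInstance().requestInstall(this, !installRequested)) {
            case INSTALL_REQUESTED:
                // ARCore installation was kicked off; the activity will pause and resume.
                installRequested = true;
                return;
            case INSTALLED:
                break;  // ARCore is present; fall through and create the session.
        }
        session = new Session(this);
    } catch (UnavailableUserDeclinedInstallationException
            | UnavailableDeviceNotCompatibleException e) {
        // ARCore can't be used on this device, or the user declined the install.
    }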
Once you've done all that, you can go and create your ARCore session as normal. And AR required applications are really, really simple. There's really nothing to do. We do the whitelisting on the Play Store for you so it will only
be available on devices that have ARCore support. ARCore is also
installed when the app is installed in these cases. And as new devices become ARCore enabled, those devices will just be whitelisted automatically. Just before we
finish up, we would love to get feedback
from you on this session. But just to recap,
we've shown you how to build your AR
applications really, really quickly with Sceneform
using assets from Poly. We've shown you how
to iterate quickly in the Android Emulator. Even if you don't have a
device, you can go and build ARCore applications right now. We've shown you how to debug
gnarly graphical issues in GAPID and understand
what your scene is doing. And we've shown you how to
ship them on the Play Store. All right. Thanks everyone. [MUSIC PLAYING]