RADO: Hello, everybody. Thanks for coming to the last talk of the
conference. My name is Rado. I work at Google on the Angular team and on
the team that supports the TypeScript infrastructure in Google. We're talking about building Angular applications
like Google. Google is known for scale, so there will be
a lot of talk about scaling up. Everything you see here is part of the Angular Labs umbrella project, so some of these things are experiments. We want you to take a look and be excited about them. If you want to use them in your production systems, it is probably not the right time just yet! We will see. So, we're all web developers here, so are
you familiar with this question: you go to users and you ask them: do you want a powerful
application, kind of like a desktop application with lots of features and power? Or do you want an easy-to-use application like the web, where you type the address in the URL bar and it's very easy to use? So, if you go to users and ask them, what is going to happen? Yeah. Exactly. Kim Kardashian agrees with you - not really, it is fake! They will tell you: why do you ask me such a stupid question? I want them both! Get out of here! So let's flip it around. We provide tools for you, and we ask you: do you want new JavaScript features - async/await, TypeScript - or do you want the instant edit-refresh cycle? What do you want? You want both. Exactly. So, let's figure out how to deliver that. We want both, but this is what we see
in the wild. We see that things start well. You feel like you have both, right? At first, the code base is small, you set up your tooling, it looks great: instant edit-refresh cycle. Then things start to grow in ways you didn't think about before and begin to weigh you down. Development sucks, you're not sure why, and you blame the tooling. That's what we see happen if you're
not careful. This is where we want to be: as the application scales and the lines of code increase, we want you to still enjoy building the application. Build times are going to increase, but we want that growth to be gradual. One main thing to do as we're scaling up is to know when to complain. If we complain too often, nobody's going to listen to us. If we don't complain at all, maybe we really did set up our tooling badly, with the wrong set-up, and there is something that should be complained about. You have to know what "too slow" means, and, to that end, what we've decided to shoot for is a two-second edit-refresh cycle. This is what we are trying to achieve. We think we can do it. And I will show you some of the tooling we
use to try to achieve this. Two seconds, we think, is reasonable even for very large applications. How do we know about large applications? Because we work at Google and we have a lot
of TypeScript code and Angular code. We heard from Jen about a single application with 10,000 TypeScript files, around 3,000 of them Angular. That's all of them going into one application, one front end - and in total there's a bunch more, but split amongst smaller applications. So, given this scale, we think we can achieve the two-second edit-refresh cycle, and you might be asking: how do you do it, and can I do it too? How do we do it? Incrementality. We have to use incrementality. The cost of a rebuild should be comparable to the size of the change you made, not the size of the rest of the project. On any project you're making small changes, one file at a time. You change one file; you don't want the size of the rest of the project to weigh you down and make your builds slow. To achieve that, you have to use incrementality. We need boundaries. What is the unit of incrementality? Functions, classes, modules - every file is
a module. It turns out none of them is really the right unit. We need something a little bit bigger, and TypeScript doesn't have that unit, so we have to define our own packages - something a little bit bigger that contains more than one file. So we define a TypeScript package. Each package gets compiled separately. All the dependent packages receive the .d.ts files and don't need to see the actual sources. So packages are our unit of incrementality - but what do we
do with Angular? Angular has had this from the get-go: NgModule. The ngc compiler gives us incremental builds for Angular, and we have the JSON files which describe the API shape for other packages - and, yes, that's one of the main reasons why we have NgModules. They may feel like overhead on a smaller project, but they're actually the bread and butter of really big projects: when the scale grows, NgModules are what define the packages of your Angular application. So at this point, if you're thinking packages,
or that means, you know, a new GitHub repo, versioning, releasing - heavy overhead - that's not what I'm talking about. I'm talking about thin, easy-to-create and easy-to-reuse packages. In Google we have a monorepo, so we don't have a release step for exchanging packages. These are just the units that drive our build. On average, we have around eight TypeScript
files per package in Google. I'm not saying that's the preferred number; it's just what has actually evolved. And around 1.5 components per NgModule - that's pretty low, but it's what the developers are doing. Now that you have packages, what do you do? How do you stay fast? You have to follow two simple rules - rebuild only the right packages every single time, and do no complex work over all the sources - and keep your build tooling warm. What are the right packages? When you change a file, we rebuild the package that contains that file; we have to do that. And then there's an extra step: has the API changed? If the API has changed, we have to rebuild all the dependent packages. If the API hasn't changed, we're done.
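As a rough illustration of that rule, here is a sketch of the decision - not what Bazel actually runs, and the types and names are made up for illustration:

```typescript
// Sketch of the "rebuild the right packages" rule (hypothetical data model).
interface Pkg {
  name: string;
  dependents: Pkg[]; // packages that directly depend on this one
}

function packagesToRebuild(changed: Pkg, apiChanged: boolean): Pkg[] {
  // The package containing the edited file is always rebuilt.
  const toRebuild = [changed];
  // If the emitted .d.ts (the public API) is unchanged, dependents stay cached.
  if (!apiChanged) {
    return toRebuild;
  }
  // Otherwise the direct dependents must be recompiled; whether the change
  // propagates further depends on whether *their* APIs change in turn.
  return toRebuild.concat(changed.dependents);
}
```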
Let's do that calculation on an example. So, if I work on one of the modules - module 0 - and it keeps the same API, I don't need to rebuild any of the other packages that depend on it; I just need to rebuild that one down here. As I keep making changes, I keep rebuilding just this one. If module 0 ends up with a different API, I have to rebuild it and also rebuild the parent, because maybe the parent calls methods on this package, but I don't need to touch the parent's parent. There's another scenario: if Team 0's module re-exports that API, then we have to rebuild the whole chain up, but, rest assured, the rest of the build graph doesn't need to be rebuilt. Here's an example of what I mean by an API change. Here's a component; this is how it looks right now. Now I come in to make a change: I'm going to add one more logging line in
the implementation of the click handler. The API, the .d.ts produced by TypeScript, hasn't changed; in both cases it is exactly the same. Therefore, the parent doesn't need to be rebuilt - the API is still just one method. Here's another change: this time I add another click handler. The API now looks different - there's an extra method - so now I need to rebuild the parent.
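Roughly, the two edits look like this - a sketch of the idea, with a made-up component name, not the exact code from the slide:

```typescript
import {Component} from '@angular/core';

@Component({
  selector: 'app-example', // hypothetical selector
  template: `<button (click)="handleClick()">Click</button>`,
})
export class ExampleComponent {
  handleClick() {
    // Implementation-only change: adding this log line does not change the
    // .d.ts that TypeScript emits, so dependent packages stay cached.
    console.log('clicked');
  }

  // API change: adding a new public member like this changes the .d.ts,
  // so the parent package has to be recompiled against the new API.
  // handleDoubleClick() { console.log('double-clicked'); }
}
```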
So, that's the kind of reasoning you have to do every time you make changes in order to achieve incrementality. It's not fun to do by hand. We just did it on stage, but we need a tool to do this reasoning for us, right? Maybe there will be one at the end of the talk - that's foreshadowing. All right, rule number 2: no complex work over all the sources. TypeScript and ngc can work incrementally. We also have CSS processors, we have minification
tools. The real world is more complicated than these
tools. You will probably do some work over all the
sources but my advice is don't do the complex work. When you do work over the whole project, be
cognizant of that, measure it, and don't do anything complicated there. Here's an example of what we do in our set-up. This is the build part. The cloud is what I just showed you; this is where TypeScript works. At the end of the day, there is a dev server that has to concatenate the sources. That concatenation needs to be done, but we think it is trivial; we can measure it, and it
is fast. If you have a slow global optimiser - a compiler like the Closure Compiler - we just don't run it during development mode. We kicked it out of the development flow because
it doesn't achieve the incrementality and therefore you can't reach your two-second
SLO. This is a very straightforward trick in this business: keep your build tools warm. If you're calling node.js every time - the compiler is a node.js programme - and re-parsing 1.5 megabytes of JavaScript, that's not efficient; you've got to keep it parsed and ready. Cache the right things and use all the CPUs you have ready on your machine.
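With Bazel, for instance, the TypeScript rules can run the compiler as a persistent worker process so it stays warm between builds. A sketch of how that is enabled - the target label is hypothetical, and the exact strategy name and flags may differ across rule versions:

```sh
# Run TypeScript compilations in a long-lived worker process instead of
# spawning and re-parsing a fresh node.js compiler for every action.
bazel build //src:devserver --strategy=TypeScriptCompile=worker
```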
All right. So, what is a tool that achieves all this? That's Bazel, a tool we've been using internally
in Google for many, many years - at least ten years - and it has only recently been open-sourced. You can learn more about it at bazel.build - a really cool new top-level domain, .build. This is its tag line: "Fast, correct - choose two." You don't want to make a trade-off between
fast and correct, you want it to be fast and correct. Bazel achieves that. It's a general-purpose build tool. It is used for all Google builds for all the
languages, right? Even for Java, C++ - languages that have other build tooling - we use Bazel for all of them. It is also agnostic to the version control
system. It is not really tied to anything that we
use. And why is it fast? Well, yes, it does the three things that I
just mentioned. Therefore, that is how it achieves the speed
at the scale that we use it at. We've done some benchmarking, and Bazel appears to be doing a lot better than the current state of the art - I guess that is webpack - but the usual disclaimer: don't trust benchmarks. What we are more excited about is the trend
line. We're going into the thousands of TypeScript
sources, and it's scaling pretty nicely on what appears to be a linear curve. The SLO is the green line here. So we have work to do - then again, the lines are always going to cross eventually - but we are hoping to push that point beyond the size of the applications that big enterprises are writing. Some other benefits of Bazel: it's one build tool for the front end and
back end. Like I said, for all the languages, that has
some real benefits when you, for example, go and edit the data exchange format between
front end and back end and you re-run all the tests for both sides of the story. It's very nice and it makes you very productive
when you have that available. And it really tries to achieve hermeticity
which is a fancy way of saying that whatever happens on my machine will, most probably, also happen on your co-worker's machine, so your builds are reproducible. It's the same as having reproducible tests. This has happened to many of you: you pass a project to your co-worker, it doesn't build, and you tell them to remove node_modules, or something. That's something that, with the right care
in that build system, you can avoid. Of course, if you can reproduce it across
different developers' machines, you can reproduce it in the cloud and you get other benefits
from that. All right, do you want to see it in action? Yes, enough talking about it! What does this look like? All right. So here I am. This is an enterprise application - a
very large enterprise application. This is my first day on the job, and I'm told I'm working on Component 0. Component 0 is part of Module 0, an NgModule. There are ten components in this module, there are ten modules in this team, there are ten teams, and they all work on this aeroplane application - there is one application that rolls all this up. So, if you did the maths, that means there are a thousand different component types, right? 1,000 components in this application. Not very well named - just "Component" - I don't suggest you name your components like
this! Right, but it is my first day. I don't know what to do. How do I even build it? I've been told to use Bazel. That's nice, because Bazel has a query language with which I can start querying the whole build and see what is happening. Let's first see how many NgModules there are in this application. Lots - 200 NgModules. Like I said, every NgModule is its own build unit. How do I start the application? I know that there's something called a node.js binary, so I query for it quickly and see there's only one node.js binary in the build. Let's run Bazel and profile the build into a file.
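The queries and the profiled build look roughly like this - the dev-server label is hypothetical, and the exact target labels depend on the workspace layout:

```sh
# Count the ng_module targets in the whole workspace.
bazel query 'kind(ng_module, //...)' | wc -l

# Find the nodejs_binary target that serves the application.
bazel query 'kind(nodejs_binary, //...)'

# Build from a clean state and write a profile we can inspect afterwards.
bazel build //src:devserver --profile=/tmp/build.profile
bazel analyze-profile /tmp/build.profile
```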
So I'm kicking off the build in a clean state; nothing is cached right now, and it's going to take some time to build 1,000 components. One thing you see immediately is that there are three workers running at the same time, because this machine has four cores: one is taken by Bazel, and the other three are immediately parallelised by the build. It's building; it will take a while. While it's doing that, I want to show you
the actual file that describes this build. So this is the description of the build that I'm passing to Bazel. For those of you in the audience who are a little bit older and to whom this is starting to sound like Make - good news, the syntax of Bazel is sane. There are no significant tabs; it doesn't have significant white space. It's pretty readable right away. There's some visibility at the top, I load the ng_module rule and say this is my module, it takes these files, this is my tsconfig, and there are some tests.
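A stripped-down sketch of such a BUILD file might look like this - the load path, file names, and dependency labels are illustrative and vary by version and workspace set-up, and the test targets are omitted:

```python
load("@angular//:index.bzl", "ng_module")  # load path depends on your setup

package(default_visibility = ["//visibility:public"])

ng_module(
    name = "module_0",
    srcs = glob(["*.ts"]),               # the TypeScript sources of this package
    assets = glob(["*.html", "*.css"]),  # templates and stylesheets
    tsconfig = "//:tsconfig.json",
    deps = [
        # other ng_module / ts_library packages this one imports
    ],
)
```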
It finished. It took one minute to build the whole application. I can look at the analyze-profile output. All right, so there is a lot of information that it captured. What I really care about is, I guess, total time: one minute. There's more information here that you can poke around in if you're interested. The number of actions - this is the number of compilations that happened, I guess divided by two, since there is also an action to move files - so it's around 100 compilations, which matches the fact that we have hundreds of NgModules and thousands of components. That's not very fast, but it is a big application. Let's now see the incrementality aspect of
it. What I will do next is run ibazel, which is the watch mode for Bazel. It sets up a watch on the thousands of files, so whenever any one of them changes, it will recompile.
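In this set-up that is roughly the following - again, the dev-server label is hypothetical:

```sh
# ibazel re-runs the target whenever a file in its dependency graph changes.
ibazel run //src:devserver
```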
There it is. I start the server. It didn't take one minute this time, because all the actions are cached, right? Even though I killed Bazel and restarted it, everything is cached. And now we can take a look at this amazing aeroplane application. Don't get your hopes up; it is not very
pretty! But anyway, it is there. Each one is its own component, with its own AoT compilation - all of that happened. A true enterprise application: "I'm leveraging impact to deliver value", yes! That's great. I'm told I should deliver more value, though. This is my task, this is my ticket, so let's deliver more value. I make the change, and without me touching anything else, it is automatically rebuilt and refreshed, and this took a lot less than one minute. I hope you believe that. In fact, it took 1.6 seconds. Less than before, yeah! [Applause]. How do we do it? No magic tricks - just incrementality. We rebuilt only what needed to be rebuilt. There is a two-to-one factor: there was one compilation that needed to happen,
and that resulted in two actions. And this was because the API didn't change. Remember what I was talking about earlier? I only changed the implementation, and I didn't
change any APIs, so let's do one other change where we can change the API, so instead of
leveraging the impact, let's do something more drastic. Let's do a paradigm shift. Paradigm shift! And we will just add one more handler here to double the value. I just save it, and there it is. Right. Yay. I think - I think I'm climbing the corporate ladder! And like I said, the API changed, so we had to do four actions this time, which meant two compilations - two recompilations - and it took a little bit more time, four seconds, so, again, a very far cry from the one minute
it took initially. And that is the end of my demo, but again,
I just want to point out that there were no shortcuts - this was a big application, we made incremental changes, and everything that needed to be type-checked was type-checked. I didn't push any work down onto some integration engineer, or somebody down the pipeline who had to resolve my changes. This application, as big as it is, is still completely correct TypeScript - correct as defined by TypeScript - so all the type-checking that needed to happen has happened. There were no tricks played, like asynchronous type checking; it happened synchronously, and only when it was ready was it pushed to the screen. So there were no tricks. All right, that was the demo. We used a pretty weak machine with three workers. It turns out that this demo, because it has a branching
factor of ten, will use all the CPUs you have if you run it on your own desktop - it keeps scaling as far as your build graph allows. And when you have even more parallelism in your build graph than your machine can handle, you can build in the cloud. That's what we do internally at Google. There is a project to run your own build farm with Bazel, and actually, it's just been announced at the Bazel conference - I think it's happening concurrently with this conference - that you can now build in the cloud on Google Cloud. You can see how naturally, when you have the
hermeticity you can ship and scale as much as you want. All right, that's all I have for the Bazel
demo. You can find the demo and run it; there's nothing secret about it, and it doesn't use anything that's not available. The future of this as part of your Angular build is still a little bit experimental - hence Angular Labs. What we are working on right now: Windows support, Karma testing, more benchmarking. We also want to change the build of Angular itself - Angular
is a big TypeScript application in its own right, and we want to change it to use Bazel. What we are working on maybe next year, basically down the road: code-splitting. The Closure Compiler is the JavaScript minification tool we use at Google. It already works with Angular, but it has a few rough edges that we want to improve
on, and that is something that is on our road map to improve on. The build file that you saw, it didn't look
too terrible, but we can auto-generate those files based on the TypeScript imports, so we have tooling in the pipeline to do that. And finally, we want to expose this as a kind of experimental option in the CLI so you can bootstrap it and start using it faster if you wish to do so. If you are excited about this - "I've been waiting to hear about this, I work on a big enterprise application, I want to start using it" - and you want to be part of our pilot launch, please talk to Robin and Stephen. The Nrwl team is already looking into using Bazel. Reach out, and we can collaborate on using this tooling. Want to learn more? Follow Alex. He is the driving force behind all this; he would be giving this talk if he were here - I'm here instead of him. Follow him on Twitter to get updates. This is the canonical link, not just for this
talk but for all the talks, and documentation around this project - abc standing for "Angular
Bazel Closure" - Closure being the Google tooling that we are exposing internally for
you guys to use. That's all I have, thank you. [Applause].