[♪ dramatic music ♪] [Narrator] Before the 1950s, car manufacturers had a problem: safety. Sharp flanges. Metal that didn't crumple. Untempered glass. And you can forget about
seat belts and airbags. Cars were dangerous. [car horn] The problem was that
the study of crashes was the study of the aftermath. Trying to piece together
what went wrong from a few vague clues. That's a pretty tough
environment to learn from. If the auto industry
wanted to get safer, they needed a hero. Someone who would allow
them to study crashes from the inside. They needed... Sierra Sam, the world's first
crash test dummy. Over the course of Sam's life, Sam endured all manner of uh... testing. But these weren't
random incidents. Each one was carefully
planned and dissected. For the first time Sam
allowed experts to study what happens in a car crash
as it happened and then make adjustments
to the cars accordingly without the need for
anyone to get hurt. So you can thank Sam
for airbags, seat belts and all the things
we take for granted. The safety features that
were only made possible by studying crashes
in real time. Today car companies
rely on dummies like Sam to create cars safe enough
for us to drive every day. But what about the
technology we use every day? How do we ensure it's
secure, shored up, and has safety features built in to protect the
people who use it? The same way the auto industry
did all those years ago. By breaking things
over and over and then studying exactly
what failed from the inside. [sound of door closing] [♪ anthemic music ♪] When it's your job to keep
billions of people safe online, you have to live and
breathe and see the internet just like the attackers do. Because the only way
to stop a hacker is to think like one. [♪ soft music ♪] This is Daniel Fabian. [Daniel] Hey. [Narrator] Despite his
disarming presence, he carries one of the
most diabolical titles in all of Google. [Daniel] The job title
I chose for myself is Staff Digital Arsonist. [♪ sinister music ♪] Yeah, I run the
Offensive Security Team. So setting things on
fire in this context mostly means to run what
we call Red Team exercises where we basically take
the role of an attacker and try to hack into Google. [Narrator] You heard
that correctly. Google's Red Team is a
group of security employees who spend their days just trying
to break Google's security. [Daniel] I think
it's totally fine to just say we're hackers. [♪ soft music ♪] I was always kind of interested in like the mechanics
behind things. [watch ticking]
I remember my mom got super upset because I disassembled
one of her watches. No particular reason outside
of curiosity about how it works. Tiny gear wheels and
they all click together. It just fascinated me. I wasn't able to
put it back together— just to get ahead
of that question. [laughs] But I think a hacker is someone who tries to break things in order to understand them. [Narrator] It may
sound unsurprising, but this hacker mentality is shared by many of
Daniel's colleagues. [Niru] I mean, since I like
taking things apart and finding issues, the best place to do it
is in the Red Team. [laughs] [Narrator] This is
Niru Ragupathy a security engineer
on the Red Team. But in some circles she's
known by a different name. [Niru] My handle is c0rg1. [dog barking] [Niru laughs] [Narrator] Beyond
screen names and avatars, she's even got the
hardware to prove it. [Niru] I should have never
told the c0rg1 story. It's supposed to be an Easter egg. [dog howling] Yes, I would say I'm a hacker. I might qualify it by
saying I'm an ethical hacker where you're not hacking
to steal things, you're hacking to
help fix things. [♪ soft music ♪] [Director] What's your
relationship like with the Red Team? [Heather] The Red Team
are my favorite enemies. They bring a completely new
way of looking at the system because they're not
burdened every day with having to maintain it. With having to deal
with reliability issues and functionality issues, and that's so valuable
for us, understanding that we didn't always build it the way
we thought we did. [Niru] How does it hold up
against actual hackers? That's kind of our job
as the Red Team to come give that answer. [Heather] So they're
kind of like our agents of testing, if you will. [Operator] Fire! [♪ rousing music ♪] [Narrator] Just as
car companies have different models
and components to test, Google has different
products and infrastructure. Each with unique features
and security controls. And it's the Red Team's job to put all of them
through their paces. We'll let Daniel and
Niru give you some hits. [Daniel] Yeah,
sounds good. We have targeted Ads, we've targeted Search. We've targeted— [Niru] Gmail. [Daniel] Google Cloud. [Niru] Chrome. [Daniel] YouTube. [Niru] Maps. [Daniel] What else
did we target? Um... [Narrator] You get it. [Daniel] Yeah, whatever. [Narrator] Every single
Google product, each with its own security team dedicated to keeping it safe. And the Red Team—they've
targeted them all. In fact, the mere mention
of one Red Team exploit is enough to put security
engineers in cold sweats. [Tim] Ooh. Yes, I have heard of the
infamous plasma globe. [Tim] One of the most
creative exercises. [Eduardo] Most
memorable exercise. [Niru] Anyone would
fall for it. [laughs] [Fatima] I heard
about it, but it might have been
before I joined Google. [Narrator] Alright, so
it's not the newest exercise, which is probably most apparent
when discussing the target. [Daniel] Back when
we ran the exercise, like, the latest, newest,
bestest Google product was Google Glass. [Sergey Brin] This can go wrong
about 500 different ways, so tell me now, who wants
to see a demo of Glass? [audience cheers] [Narrator] That's right,
Google Glass. [Creator] A phone
for your face. [Robert Scoble] It's part
of my life. I'm never gonna take it off. [Narrator] Google's
first venture into wearable technology. [Daniel] We wanted to target people who work on Google Glass so we could get access
to design documents, blueprints, the electronics behind it. Everything that a real adversary would be very interested in. [Niru] Some of Red Teaming is
to actually understand how people in the real world
are causing harm and actually emulate
their behaviors. That doesn't mean that we will do everything
an adversary does, right? [Narrator] Right. The Red Team has something that most attackers don't... rules. [Daniel] Yeah, we have
a set of rules that we call
"Rules of Engagement." First of all,
don't break anything. Another obvious one is, we can never ever access
real customer data. [Narrator] The rules
are also there to make sure no one
actually gets hurt. [Niru] Yes. [laughs] No bribing. No coercion. [Narrator] They literally
say it's not okay to chloroform security guards. [Niru] Yes, obviously
we don't do that either. [Daniel] So in our
exercise we kept thinking, "Okay, what could we do to get Googlers to actually give us
access to their computers?" And one idea that we had is we could send them
a small gimmick under some pretense like, "Congratulations on your anniversary for
working at Google. Here is a small gift." [Narrator] So what was this
devious doomsday device that the team gifted to
unsuspecting Googlers? [Daniel] Yeah, I have it
here actually. I searched my attic
and I found it. It looks just like a very,
very regular USB plasma globe. If there is such a thing as a
regular USB plasma globe. [♪ ominous music ♪] [Narrator] Yes, a
USB-powered version of the science
gift shop favorite, adorned with the Google decal, and loaded with
malicious software. Now remember, this was 2012, a time when USB was perhaps the most ubiquitous
interface on the market. Beyond thumb drives or webcams, people were plugging in fans, lights, fans with lights, charging devices
you use every day and pointless things
you only use once. But once is all it takes. No matter how
the ports were used, plugging in a USB device could exploit
a soft spot on a computer allowing an attacker
to easily get inside. [Daniel] When you plugged
in the plasma globe, if you were super careful, you would see for about a
tenth of a second or so a black window
flash up on the screen. And that's when the plasma globe would send a series of
keystrokes to the computer that would download our backdoor and install it on a computer, and that is that. To be fair, not everyone
plugged it in so there were definitely
some people who were careful. However, we did get two or
three people who plugged it in and who got infected. [Narrator] You might imagine successfully compromising these computers would mean the end of
this story, but... [Daniel] Okay, I
should take a step back. The people that we compromised
using the USB plasma globe were actually completely
unrelated to Google Glass. [Niru] Oftentimes when
you first get your foot in through the door, you might not be near what
you're trying to get to. So think of it like
playing a video game. [♪ video game music ♪]
You're in this level, you cleared the first area and now you're in a new area and the map is
just black, right? [Daniel] So usually in order
to get to the target, it takes multiple different
security exploits that we call a kill chain. [Narrator] Yep, a kill chain: a string of different attacks that bring hackers progressively
closer to their goal. Despite the plasma
globe's cunning success, it was only the first
link in the kill chain used to steal the plans
for Google Glass. [Daniel] After we used
the plasma globe, we were able to
assume the identity of the person we
had compromised and then we could
use their privileges to access anything that
that user had access to. [Narrator] Including
their corporate email. The Red Team drafted fake emails using the stolen identities
of the compromised employees and sent them to
the real targets who were busy working
on Google Glass. [Daniel] At the very bottom
of the email, there was a tiny image and that image was loaded
from a website that we built. [Narrator] That website
would lift the target's
digital fingerprints without them knowing, but for it to work the target would first
have to open the email. [Daniel] We were thinking, "Okay, what kind of email
is interesting enough that people would
actually open it, but on the other hand
not so interesting that they would discuss
it with other people." And the topic that we
settled on eventually was workplace ergonomics. [♪ rousing music ♪] [Infomercial voice] Even small
movements can add up to big damage when they
are done repetitively. [Daniel] So like, how do
you sit in a healthy way and make sure that you
have the right posture, and that your desk
is the right height, and that you don't
get back problems. That sort of thing. [Infomercial voice] When
force is applied, the damage is multiplied. [Daniel] Long story short
is the email... worked well. [laughs] [Narrator] Emails were opened, targets were compromised, digital fingerprints
were captured. It was time for the Red Team
to get what they came for. [Daniel] So we presented
those identifiers that we stole back to Google and
Google Drive thought, "Hey, that's the user
that works on Google Glass, and who has access to all the
design documents and blueprints and everything related
to Google Glass," and it allowed us
to download them. [Narrator] By any measure, this marked the successful
completion of the exercise. But then... [Daniel] But then we
got really bold. Yes, we decided, okay, now
we've got all the blueprints, but it would be really cool if we got a physical pair of Google Glass glasses.
[♪ upbeat music ♪] [Narrator] The team
pressed their luck and wrote a new email directly to one of the
Google Glass team members. "Hey, we need to pick up
a Google Glass for like some VIP at Google so we're gonna come to
the office and pick it up." I hate lying to people and I always get super
nervous and anxious when I have to do that. So it turns out that we
actually had like one or two grammar
mistakes in there. [Narrator] The mistakes
were enough that when they arrived
at the office there was no pair of
Google Glass waiting for them, and they were instead met by Google's Head of
Physical Security. [Security guard] Come
in, control. [Daniel] So yes, we did
eventually get detected. That was kinda like
the back flip where we then
fell on our noses. [Robot voice] Game over. [♪ soft music ♪] [Narrator] Even when the Red
Team completes an exercise, the job is only half done. The other half? Patching the holes
the Red Team utilized to make their way
onto Google systems so no real attacker can
use them in the future. In the case of the
plasma globe exercise, the hole was
quite literally that: a USB port,
found on nearly every one of the hundreds of thousands
of Google corporate machines, each one of them a potential
point of entry for an attacker. [Darren] Maybe a real attacker might not do the
plasma globe thing, but they have a lot
of opportunities, whether that's USB keys dropped
in the car park, modifying the mice that people plug into their
devices and other things. And so understanding
that that is an attack, how do we make
this class of issue that we're dealing with
go away, so that any USB device that you plug in isn't just open for an
attacker to break in? [Daniel] So we had
to figure out how are we gonna protect
against this type of attack? And what we did was we
developed a piece of software that would defend USB ports
against suspicious activity. [Narrator] Through USB, the plasma globe
infected the machine by delivering hundreds
of keystrokes' worth of commands in a fraction of a second. Remember this? [beep] Yeah, that fast. Much faster than any human
being could realistically type. [Daniel] This software
listens for keystrokes. If the keystrokes
come in too fast, then it blocks them. [Narrator] And just like that, USB ports across Google
got a little bit safer. But what about the
hundreds of millions of other USB ports and computers all
around the world? [Daniel] Well the nice thing
about this fix is that we actually open-sourced
it as well. So anyone can install it
on their machines and make sure that
they are protected against this type of attack. [Narrator] Of course, this is just one of
the countless patches Red Team exercises have created. They've helped make the
internet safer for everyone. With that level of success, you'd think their work would make headlines
and news stories. But the truth is, it doesn't and probably never will. Red Team exercises
are disappearing acts. Thousands of vulnerabilities
brought to light and fixed in silence. The only real mementos are
stronger, safer products for the billions of
people that rely on them. [♪ pensive orchestration ♪] [Daniel] If we are able
to make changes that make the next exercise
much more difficult, I say we've done an amazing job and we've actually achieved
what we came here to achieve. And we do really see that because probably 90%
of the stuff that we do just doesn't work. [Niru] We try it, and
if it fails, it's okay, we'll try again. [Director] Does that
failure ever frustrate you? [laughs] [Niru] It's a good thing when
the Red Team fails. That means that things
are working well. [♪ soft music fades out ♪] [♪ anthemic music ♪] [Eduardo] We ask people
in the world to tell us about
security issues. [Bug hunter 1] Nice catch. [Bug hunter 2] Nice catch. [Bug hunter 3] Nice catch. [Eduardo] That's why it's
called bug hunting. It's like hunting,
[laughs] for bugs. [Tim] These days, you
could sell that vulnerability on the gray market. Almost certainly to
exploit users. We will keep paying you if you keep reporting bugs.
[cash register sound] [Bug hunter 3] They were gonna
pay 10,000 dollars. [Hacker 2] Whoa! Very nice catch! Oh my god! [Katie] The best
hackers in the world are securing the internet and everything that runs on it.
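[Editor's note] For the technically curious: the USB defense Daniel describes works on a single signal, inter-keystroke timing. A programmable USB device registers as a keyboard and "types" its payload at machine speed, far faster than any human could. The sketch below is an illustrative toy, not Google's actual open-sourced tool; the 50 ms threshold and the five-keystroke window are assumptions chosen for the example.

```python
# Toy sketch of timing-based keystroke-injection detection. An injected
# payload arrives at machine speed, so the gaps between its keystrokes are
# far smaller than a human's. The threshold and run length below are
# illustrative assumptions, not values from any real defense tool.

HUMAN_MIN_GAP_S = 0.05   # assumed: gaps under 50 ms are suspicious
SUSPECT_RUN = 5          # assumed: this many fast gaps in a row => flag

def looks_injected(timestamps, min_gap=HUMAN_MIN_GAP_S, run_len=SUSPECT_RUN):
    """Return True if run_len consecutive inter-keystroke gaps are all
    shorter than min_gap seconds, i.e. typed faster than a human could."""
    run = 0
    for earlier, later in zip(timestamps, timestamps[1:]):
        run = run + 1 if (later - earlier) < min_gap else 0
        if run >= run_len:
            return True
    return False

# A scripted payload "typing" at roughly 1000 keys per second:
injected = [i * 0.001 for i in range(30)]
# A human typing at roughly 5 keys per second:
human = [i * 0.2 for i in range(30)]

print(looks_injected(injected))  # True
print(looks_injected(human))     # False
```

A real defense would sit at the operating system's input layer and block or quarantine the offending device; per the transcript, the fix Google open-sourced keys on this same too-fast-to-be-human timing signal.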