MALE SPEAKER 1: A data centre's the brains of the Internet.

MALE SPEAKER 2: The engine of the Internet.

FEMALE SPEAKER 1: It is a giant building with a lot of power, a lot of cooling and a lot of computers.

MALE SPEAKER 3: It's row, upon row, upon row of machines, all working together to provide the services that make Google function.
JOE KAVA: I love building and operating data centres. I'm Joe Kava, Vice-President of Data Centres at Google. I'm responsible for managing the teams globally that design, build and operate Google's data centres. We're also responsible for the environmental health and safety, sustainability and carbon offsets for our data centres. This data centre, here in South Carolina, is one node in a larger network of data centres all over the world.

Of all the employees at Google, only a very, very small percentage are authorised to even enter a data centre campus. The men and women who run these data centres and keep them up 24 hours a day, seven days a week are incredibly passionate about what they're doing.
MALE SPEAKER 2: In layman's terms, what do I do here?

FEMALE SPEAKER 1: I typically refer to myself as the herder of cats.

MALE SPEAKER 4: I'm an engineer.

MALE SPEAKER 3: Hardware site operations manager.

MALE SPEAKER 2: We keep the lights on.

MALE SPEAKER 1: And we enjoy doing it.
JOE KAVA: And they work very hard, so we like to provide them with a fun environment where they can play hard as well.

FEMALE SPEAKER 2: We just went past the three-million-man-hour mark for zero lost-time incidents. Three million man-hours is a really long time, and with the number of people we have on site, that is an amazing accomplishment.
JOE KAVA: I think that the Google data centres really can offer a level of security that almost no other company can match. We have an information security team that is truly second to none. You have the expression, "they wrote the book on that." Well, there are many of our information security team members who really have written the books on best practices in information security. Protecting the security and the privacy of our users' information is our foremost design criterion.
We use various layers of higher-level security the closer you get to the centre of the campus. So, just to enter this campus, my badge had to be on a pre-authorised access list. Then, to come into the building, that was another level of security. To get into the secure corridor that leads to the data centre, that's a higher level of security. And the data centre and the networking rooms have the highest level of security. The technologies that we use are different too. For instance, in our highest-level areas we even use underfloor intrusion detection via laser beams.

So, I'm going to demonstrate going into the secure corridor now. One, my badge has to be on the authorised list. And then two, I use a biometric iris scanner to verify that it truly is me.
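To make that layered model concrete, here is a minimal sketch of how tiered badge checks plus a biometric factor could be expressed in code. It is purely illustrative: the zone names, the AUTHORISED_BADGES table and the iris_match check are assumptions for the example, not Google's actual access-control system.

```python
# Illustrative sketch of layered physical access control, loosely modelled on
# the zones described above (campus -> building -> secure corridor -> data
# centre floor). All names and rules are assumed; not Google's real system.

# Zones in order of increasing security; higher tiers also require an iris scan.
ZONES = [
    ("campus", False),
    ("building", False),
    ("secure_corridor", True),
    ("data_centre_floor", True),
]

# Pre-authorised badge lists per zone (hypothetical data).
AUTHORISED_BADGES = {
    "campus": {"badge-001", "badge-002"},
    "building": {"badge-001", "badge-002"},
    "secure_corridor": {"badge-001"},
    "data_centre_floor": {"badge-001"},
}


def iris_match(badge_id: str) -> bool:
    """Placeholder for a biometric iris comparison."""
    return badge_id == "badge-001"  # pretend only this person is enrolled


def can_enter(badge_id: str, target_zone: str) -> bool:
    """A badge must clear every zone up to and including the target zone."""
    for name, needs_iris in ZONES:
        if badge_id not in AUTHORISED_BADGES[name]:
            return False
        if needs_iris and not iris_match(badge_id):
            return False
        if name == target_zone:
            return True
    return False  # unknown zone


if __name__ == "__main__":
    print(can_enter("badge-002", "building"))           # True
    print(can_enter("badge-002", "data_centre_floor"))  # False
```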
OK, here we are on the data centre floor. The first thing that I notice is that it's a little warm in here. It's about 80 degrees Fahrenheit. Google runs our data centres warmer than most because it helps with efficiency.
You'll notice that we have overhead power distribution. Coming from the yard outside, we bring in the high-voltage power and distribute it across the bus bars to all of the customised bus taps, which are basically plugs where we plug in all the extension cords.
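As a rough, back-of-the-envelope illustration of what a single bus tap has to carry, here is a sketch of the standard three-phase current calculation. The voltage, power factor and rack load are assumed numbers chosen only to show the arithmetic; they are not figures from the video.

```python
import math

# Hypothetical figures, chosen only to illustrate the arithmetic.
rack_power_w = 10_000     # assumed IT load on one bus tap (10 kW)
line_voltage_v = 415      # assumed three-phase line-to-line voltage
power_factor = 0.95       # assumed power factor

# Three-phase current draw: I = P / (sqrt(3) * V_LL * PF)
current_a = rack_power_w / (math.sqrt(3) * line_voltage_v * power_factor)

print(f"Estimated current per bus tap: {current_a:.1f} A")
# ~14.6 A under these assumptions
```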
Our racks don't really look like traditional server racks. These are custom designed and built for Google so that we can optimise the servers for hyper-efficiency and high-performance computing.
It's true that sometimes drives fail and we have to replace them, or upgrade them because maybe they're no longer efficient to run. We have a very thorough end-to-end chain-of-custody process for managing those drives, from the time they're checked out from the server until they're brought to an ultra-secure cage, where they're erased and crushed if necessary. So any drive that can't be verified as 100% clean, we crush it first and then we take it to an industrial wood chipper, where it's shredded into these little pieces like this.
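A minimal sketch of that disposition logic as described above: a retired drive is erased, and anything that can't be verified as completely clean goes to the crusher and then the shredder. The function names and statuses here are hypothetical; the real chain-of-custody process is certainly far more involved.

```python
from enum import Enum


class Disposition(Enum):
    VERIFIED_CLEAN = "erased and verified clean"
    DESTROYED = "crushed, then shredded"


def retire_drive(serial: str, erase_verified: bool) -> Disposition:
    """Sketch of the decision described in the video: drives checked out of a
    server go to a secure cage; anything not verifiably wiped is physically
    destroyed (crushed, then shredded). Inputs are hypothetical."""
    log = [f"{serial}: checked out from server",
           f"{serial}: received at ultra-secure cage"]
    if erase_verified:
        log.append(f"{serial}: erase verified, 100% clean")
        outcome = Disposition.VERIFIED_CLEAN
    else:
        log.append(f"{serial}: erase not verified -> crusher")
        log.append(f"{serial}: crushed -> industrial shredder")
        outcome = Disposition.DESTROYED
    print("\n".join(log))
    return outcome


if __name__ == "__main__":
    retire_drive("DRV-0001", erase_verified=True)
    retire_drive("DRV-0002", erase_verified=False)
```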
In the time that I've been at Google, almost six and a half years now, we have changed our cooling technologies at least five times.

Most data centres have air-conditioning units along the perimeter walls that force cold air under the floor. It then rises up in front of the servers and cools the servers. With our solution, we take the server racks and butt them right up against our air-conditioning unit. We just use cool water flowing through those copper coils that you see there. The hot air from the servers is contained in that hot aisle. It rises up and passes across those coils, where the heat from the air transfers to the water in those coils. That warm water is then brought outside the data centre to our cooling plant, where it is cooled down through our cooling towers and returned to the data centre. And that process is just repeated over and over again.
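For a sense of the scale behind that water loop, here is a rough sketch of the textbook heat-balance arithmetic (Q = m_dot * c_p * delta_T) used to size the water flow a coil would need. The heat load and temperature rise are assumed numbers for illustration only; they are not figures from the video.

```python
# Back-of-the-envelope sizing of the chilled-water flow for a hot-aisle coil,
# using the basic heat balance Q = m_dot * c_p * delta_T.
# All inputs are assumed, illustrative values.

heat_load_w = 250_000   # assumed heat rejected by one row of racks (250 kW)
c_p_water = 4186        # specific heat of water, J/(kg*K)
delta_t_k = 10.0        # assumed water temperature rise across the coil, K

# Required mass flow of water, kg/s
m_dot = heat_load_w / (c_p_water * delta_t_k)

# Water density is roughly 1 kg/L, so kg/s is approximately L/s.
print(f"Required water flow: {m_dot:.1f} kg/s (~{m_dot:.1f} L/s)")
# ~6.0 kg/s under these assumptions
```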
The thing that amazes me about Google and the data centres is the pace of innovation, and that we're always challenging the way we're doing things. So, when people say that innovation in a certain area is over, that we've kind of reached the pinnacle of what can be achieved, I just laugh.

[MUSIC PLAYING]
Hah! I love the alligators in their moat, excellent security choice.
edit: Also, clearly the VP must have the coolest scooter in the video.
My "OK google" hotword detection was triggered twice while watching this video.
Very nice, I worked in the Facebook datacenter up in Prineville, Oregon for a while. The security is far more lax than at the datacenter shown in this video. You have to be admitted through the main gate to the campus, then you must pass through security to get into the datacenter itself. Past there you must have badge access to gain admission to the hallways that can access each of the server rooms, and then you must have badge access to get into each server room. The only places where there are security personnel are the main gate and the entrance to the building itself.
Though at the same time, I feel that the cooling solution for the Facebook data center is superior to the one shown here. The lighting solution is also pretty nice: the entire datacenter floor (excluding lounge, offices, etc.) is dark. It only lights up as you walk through it, and the lights turn off behind you. Each row in the server rooms lights up as you walk into it and turns off when you leave; it's pretty posh.
Unrelated: if you can avoid working in the hot row, do so. It's scorching in there, and loud beyond belief.
I'm impressed by the size and complexity of the place, but admit I laughed a little at 1:45 when he's talking about the security team and the camera is focusing on a person with a helicopter hat :D So Google.
The dude scootering around like it's nothing :O
We have a couple of Razor scooters at my firehouse that the younger guys use to get around all the time.
Now when we do station tours, I can accurately claim that the Fire Department utilizes the same technology as the Google data centers to better run this firehouse.
People are always uploading stuff onto YouTube, so is it a race against time to add HDDs to keep up?
I didn't realize they actually used iris scanners in real life.
I wonder how long it'd take a team of three guys with guns and forcible entry tools to make it into the ultra-secure server room.