Good morning, everyone. It's great to
be back at I/O. At Google, the past year has given renewed purpose to our mission, to
organize the world's information and make it universally accessible and useful. We continue
to approach that mission with a singular goal: building a more helpful Google for everyone.
For over a decade, we've been pushing documents toward collaborative, linked content
inspired by the web. Smart canvas is our next big step. And this fall, we're
excited to bring Meet directly into Docs, Sheets, and Slides for the first time.
Today I'm excited to share our latest breakthrough in natural language understanding, LaMDA. It's a
language model for dialogue applications. Natural conversations are generative and they never take
the same path twice. LaMDA is able to carry a conversation, no matter what we talk about.
Let’s see what happened when we talked to a paper airplane.
What’s the worst place you’ve ever landed?
That would have to be in a puddle. I was just
barely still intact, so I was stuck there for a few minutes. It was quite annoying.
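For illustration, here is a minimal sketch of the kind of multi-turn dialogue loop a conversation like the one above implies: each user turn is appended to a running history and handed to a generative language model. The generate_reply function and persona prompt are hypothetical placeholders, not LaMDA's actual interface.

```python
# Minimal sketch of a multi-turn dialogue loop over a generative language
# model. The model call (generate_reply) and the persona prompt are
# hypothetical placeholders, not LaMDA's actual API.
from typing import List


def generate_reply(prompt: str) -> str:
    """Stand-in for a call to a dialogue-tuned language model."""
    raise NotImplementedError("plug a real model in here")


def chat(persona: str, turns: List[str]) -> List[str]:
    """Feed each user turn to the model with the running history,
    so the model can stay in character and keep context."""
    history = f"The following is a conversation with {persona}.\n"
    replies = []
    for user_turn in turns:
        history += f"User: {user_turn}\n{persona}:"
        reply = generate_reply(history)
        history += f" {reply}\n"
        replies.append(reply)
    return replies


# Example: the paper-airplane conversation from the demo.
# chat("a paper airplane", ["What's the worst place you've ever landed?"])
```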
But LaMDA is still trained only on text. So we need to build models that allow people to naturally ask
questions across different types of information. These are called multimodal models. For
example, when you say, "Show me the part where the lion roars at sunset," we will get you to that exact moment in a video.
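One common way to frame that kind of lookup is shared-embedding retrieval: map the text query and timestamped video frames into the same vector space and return the moment whose embedding is closest. The sketch below is only an illustration of that idea; the embedding functions are assumed stand-ins, not Google's multimodal models.

```python
# Illustrative sketch of text-to-video-moment retrieval: embed the query and
# each timestamped frame into a shared space, then return the moment whose
# embedding is most similar. The encoders are hypothetical placeholders.
import numpy as np


def embed_text(query: str) -> np.ndarray:
    raise NotImplementedError("text encoder goes here")


def embed_frame(frame) -> np.ndarray:
    raise NotImplementedError("image/video encoder goes here")


def find_moment(query: str, timestamped_frames):
    """timestamped_frames: iterable of (timestamp_seconds, frame)."""
    q = embed_text(query)
    q = q / np.linalg.norm(q)
    best_t, best_score = None, -np.inf
    for t, frame in timestamped_frames:
        v = embed_frame(frame)
        score = float(np.dot(q, v / np.linalg.norm(v)))  # cosine similarity
        if score > best_score:
            best_t, best_score = t, score
    return best_t


# find_moment("the lion roars at sunset", frames) -> timestamp of that moment
```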
Two years ago at I/O, I announced auto-delete.
We've since made auto-delete the default for all new Google accounts. Now, after 18 months, we automatically delete your activity data. And this is now active for
over two billion accounts.
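The retention rule itself is simple. Here is a minimal sketch of an 18-month auto-delete policy over timestamped activity records; the record structure (a dict with a "timestamp" field) is an assumption made for illustration, not Google's internal schema.

```python
# Minimal sketch of an 18-month auto-delete policy over activity records.
# Assumes each record is a dict with a timezone-aware "timestamp" field.
from datetime import datetime, timedelta, timezone

RETENTION = timedelta(days=18 * 30)  # roughly 18 months


def purge_old_activity(records, now=None):
    """Keep only records newer than the retention window."""
    now = now or datetime.now(timezone.utc)
    cutoff = now - RETENTION
    return [r for r in records if r["timestamp"] >= cutoff]
```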
We want to free everyone from password pain.
Password alerts will let you know if we detect that any of your saved passwords have been compromised in a third-party breach. And with a quick-fix feature in Chrome, the Assistant will help you navigate directly to your compromised accounts and change your passwords in seconds.
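One widely used technique for checking passwords against breach corpora without revealing them is a hashed-prefix (k-anonymity) lookup. The sketch below shows that generic technique only; it is not necessarily how Google's password alerts are implemented, and breach_index is an assumed mapping from 5-character hash prefixes to sets of breached hash suffixes.

```python
# Illustrative k-anonymity breach check: only the first few characters of the
# password hash are used to look up candidates; the remainder is compared
# locally, so the full hash never leaves the device. Generic technique, not
# Google's implementation; breach_index is an assumed input.
import hashlib


def is_compromised(password: str, breach_index: dict) -> bool:
    digest = hashlib.sha1(password.encode("utf-8")).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    candidates = breach_index.get(prefix, set())
    return suffix in candidates
```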
Today, I'm excited to announce the TPU v4. A single v4 pod contains 4,096 v4 chips, and each pod has 10x the interconnect bandwidth per chip at scale compared to any other networking technology. Achieving our quantum milestone
was a tremendous accomplishment, but we are still at the very beginning of a
multi-year journey. We hope to one day create an error-corrected quantum computer. And success
could mean everything from increasing battery efficiency to creating more sustainable energy
to improved drug discovery and so much more.
You might ask, "I've hiked Mount Adams. Now I want to hike Mount Fuji next fall. What should I do differently to prepare?" Search engines today can't answer that directly because it's so conversational and nuanced. MUM is changing the game with its language understanding capabilities: it highlights that Mount Fuji is roughly the same elevation as Mount Adams, but fall is the rainy season on Mount Fuji, so you might need a waterproof jacket.
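To make the comparison behind that answer concrete, here is a tiny sketch using approximate public elevation figures (Mount Adams ~3,743 m, Mount Fuji ~3,776 m); the season table simply encodes the claim from the passage and is an assumption for illustration, not MUM's internal representation.

```python
# Rough illustration of the reasoning step: compare elevations and check the
# season. Elevations are approximate public figures; the rainy-season table
# is assumed for illustration based on the passage above.
ELEVATION_M = {"Mount Adams": 3743, "Mount Fuji": 3776}
RAINY_SEASON = {"Mount Fuji": {"September", "October"}}


def prep_advice(done: str, target: str, month: str) -> str:
    similar = abs(ELEVATION_M[target] - ELEVATION_M[done]) < 200
    advice = f"{target} is roughly the same elevation as {done}." if similar else ""
    if month in RAINY_SEASON.get(target, set()):
        advice += " It may be rainy season, so pack a waterproof jacket."
    return advice.strip()


# prep_advice("Mount Adams", "Mount Fuji", "October")
```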
You’ve gotta scan the floor. So let's scan
the area. Whoa, and she's here. Yes!
And she goes for the triple double.
This is very accurate. Nails it.
Around the world, people use Lens to translate
over a billion words every day. So now, we're rolling out a new capability that combines visual
translation with educational content from the web. For instance, you can easily snap a photo of a
science problem and Lens will provide learning resources in your preferred language.
Advances in AI are helping us reimagine what a map can be. This year alone, we're on track to release more than a hundred AI-driven improvements. We're adding prominent virtual street signs to help you navigate complex intersections. And we're bringing Live View indoors to help you get around some of the hardest-to-navigate buildings.
Google Maps will soon give you the option to take the most fuel-efficient route. At scale, this has the potential to significantly reduce emissions and fuel consumption.
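The general idea can be sketched as a shortest-path search where edge weights are estimated fuel use rather than travel time. The road graph and fuel estimates below are made-up inputs for illustration, not Google Maps' actual routing model.

```python
# Dijkstra-style search minimizing estimated fuel instead of travel time.
# Illustrative only; graph = {node: [(neighbor, estimated_fuel_liters), ...]}.
import heapq


def most_fuel_efficient_route(graph, start, goal):
    best = {start: 0.0}   # lowest known fuel cost to each node
    prev = {}             # predecessor map for path reconstruction
    frontier = [(0.0, start)]
    while frontier:
        fuel, node = heapq.heappop(frontier)
        if fuel > best.get(node, float("inf")):
            continue  # stale queue entry
        if node == goal:
            break
        for neighbor, edge_fuel in graph.get(node, []):
            candidate = fuel + edge_fuel
            if candidate < best.get(neighbor, float("inf")):
                best[neighbor] = candidate
                prev[neighbor] = node
                heapq.heappush(frontier, (candidate, neighbor))
    if goal not in best:
        return None, float("inf")  # no route found
    # Walk predecessors back from the goal to rebuild the path.
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return list(reversed(path)), best[goal]
```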
More than a billion times a day, people are shopping across Google. Shopping inspiration often strikes when we see something we like in the world around us.
When you view a screenshot in Google Photos, there will be a suggestion to search the
photo with Lens. You'll see organic search results that can help you find that pair
of shoes or browse similar styles.
Today, there are more than four trillion photos
and videos stored in Google Photos.
Soon, we're launching Little Patterns.
Little Patterns show the magic in everyday moments by identifying not-so-obvious patterns in the photos you take and resurfacing them to you.
Beauty is personal. To meet this challenge, we imagined Material You, a new design that includes you as a co-creator, letting you transform the look and feel of all your apps.
Just this week, we crossed an amazing milestone. There are now three billion active
Android devices around the world.
So let's start by taking a look
at our new UI for Android.
We've overhauled everything, revamping the
way we use color, shapes, light, and motion, inspired by Material You.
We've got something new planned for Google Pixel using what we call color extraction.
The system creates a custom palette based on the colors in my photo. The result is a
one-of-a-kind design, just for you.
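As a rough illustration of deriving a small palette from a photo's dominant colors, here is a sketch using Pillow's median-cut quantizer. This is a generic approach shown for illustration, not Android's actual color-extraction algorithm.

```python
# Derive a small palette of dominant colors from a photo using median-cut
# quantization. Generic illustration, not Android's color-extraction system.
from PIL import Image


def extract_palette(photo_path: str, num_colors: int = 5):
    img = Image.open(photo_path).convert("RGB")
    img.thumbnail((256, 256))  # downsample; dominant colors survive
    quantized = img.quantize(colors=num_colors)
    # The quantized palette is a flat [r, g, b, r, g, b, ...] list.
    flat = quantized.getpalette()[: num_colors * 3]
    return [tuple(flat[i : i + 3]) for i in range(0, len(flat), 3)]


# extract_palette("wallpaper.jpg") -> e.g. [(188, 143, 160), ...]
```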
Samsung and Google have a long
history of collaborating
and now we're combining the best of our two
operating systems into a unified platform focused on faster performance, longer battery
life and a thriving developer community.
To make smartphone photography truly for everyone, we've been working with a group
of industry experts to build a more accurate and inclusive camera.
We're making auto white balance adjustments and algorithmically reducing stray
light to bring out natural brown tones and prevent the over-brightening and
desaturation of darker skin tones.
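For a sense of what an auto white balance adjustment involves, here is a simple gray-world sketch that scales each color channel so the scene's average comes out neutral. Real camera tuning, including Google's, is far more sophisticated; this is only a rough illustration of the per-channel gain idea.

```python
# Simple gray-world auto white balance: estimate per-channel gains from the
# scene average and rescale. Illustrative only; not Google's camera pipeline.
import numpy as np


def gray_world_white_balance(image: np.ndarray) -> np.ndarray:
    """image: HxWx3 float array in [0, 1]."""
    means = image.reshape(-1, 3).mean(axis=0)          # average R, G, B
    gains = means.mean() / np.maximum(means, 1e-6)     # push average toward gray
    return np.clip(image * gains, 0.0, 1.0)
```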
We've been working to make mammography better
and now we're collaborating with Northwestern Medicine on an investigational device research study
to better understand how AI can apply to
the breast cancer screening process.
Technology can and should help
close the equity gap.
We were all grateful to have video conferencing
over the last year, but there is no substitute for being together in the room with someone.
So several years ago, we kicked off a project to use technology to explore what's possible.
We call it Project Starline.
It's early, and currently available
in just a few of our offices, but we thought it'd be fun to give you a look at
people experiencing it for the first time.
Using high-resolution cameras
and custom-built depth sensors, we capture your shape and appearance
from multiple perspectives.
And we have developed a breakthrough light
field display that shows you a realistic representation of someone sitting right
in front of you in three dimensions.
By 2030, we aim to operate on
carbon-free energy, 24/7.
And today I'm excited to announce we
are the first company to implement carbon-intelligent load shifting across both time
and place within our data center network.
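The scheduling idea can be sketched simply: for a flexible batch job, pick the data center and hour with the lowest forecast grid carbon intensity before the job's deadline. The forecast structure below is an assumed input for illustration, not Google's internal system.

```python
# Sketch of carbon-intelligent load shifting across time and place: choose
# the site/hour with the lowest forecast carbon intensity. Illustrative only.
def pick_slot(forecasts, eligible_sites, deadline_hours):
    """forecasts: {site: [gCO2_per_kWh for each of the next N hours]}.
    Returns (site, hour_offset) minimizing intensity before the deadline."""
    best = None
    for site in eligible_sites:
        for hour, intensity in enumerate(forecasts[site][:deadline_hours]):
            if best is None or intensity < best[0]:
                best = (intensity, site, hour)
    return best[1], best[2]


# pick_slot({"site-a": [450, 120, 90], "site-b": [200, 210, 60]},
#           ["site-a", "site-b"], deadline_hours=3)  # -> ("site-b", 2)
```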
Investments like these are needed to get to
24/7 carbon-free energy, and it's happening right here in Mountain View too.
We are building our new campus to the highest sustainability standards.
Over the past year, we have seen how technology
can be used to help billions of people through the most difficult of times.
It’s made us more committed than ever to our goal of building a more helpful Google for
everyone. I hope to see you in person next year.