[MUSIC PLAYING] SUNDAR PICHAI: Good morning,
everyone, and welcome. Let's actually make
that welcome back. I'm excited to
show you how we are deepening our understanding
of information so that we can turn
it into knowledge and advancing the
state of computing so that knowledge is easier
to access, no matter who or where you are. Today, I'm excited
to announce that we are adding 24 new languages
to Google Translate. This includes the first
Indigenous languages of the Americas. And, together, these languages
are spoken by more than 300 million people. Using advances in 3D mapping
and machine learning, we are fusing billions of
aerial and street-level images to create a new high-fidelity
representation of a place. We are using computer
vision to detect buildings at scale from satellite images. As a result, we have increased
the number of buildings on Google Maps in
Africa by five times.
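The keynote doesn't describe the model itself, but the general shape of the approach — run a segmentation model over satellite tiles and count the distinct building footprints it finds — can be sketched roughly as below. The mask-prediction function here is a placeholder heuristic, not Google's actual detector or pipeline.

```python
# Hypothetical sketch of counting building footprints in satellite tiles:
# a segmentation model (stood in for by a placeholder heuristic) predicts a
# per-pixel "building" mask, and connected blobs in that mask are counted
# as candidate buildings. Not Google's actual pipeline, only an illustration.
import numpy as np
from scipy import ndimage

def predict_building_mask(tile: np.ndarray) -> np.ndarray:
    """Stand-in for a trained segmentation model; returns a binary mask."""
    return (tile.mean(axis=-1) > 128).astype(np.uint8)  # placeholder heuristic

def count_buildings(tile: np.ndarray) -> int:
    mask = predict_building_mask(tile)
    # Each connected component of "building" pixels is treated as one footprint.
    _, num_footprints = ndimage.label(mask)
    return num_footprints

tile = np.random.randint(0, 256, size=(256, 256, 3), dtype=np.uint8)
print(count_buildings(tile))
```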
At Google, whenever I get a long document or email, I look for a TL;DR at the top. TL;DR is short for "too
long, didn't read". And it got us
thinking, wouldn't life be better if more
things had a TL;DR? That's why we have introduced
automated summarization for Google Docs, and Docs
are only the beginning. We are launching summarization
for other products in Workspace.
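Google hasn't said which model powers Docs summaries, but the general idea — feed a long document to an abstractive summarization model and surface a short TL;DR at the top — can be sketched with an open-source model. The model name below is an assumption chosen for illustration, not what Workspace uses.

```python
# Illustrative TL;DR-style summarization with an open-source model; this is
# not the model or code behind Google Docs summaries.
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

document = (
    "Meeting notes: the team reviewed the Q2 roadmap, agreed to move the "
    "launch to September, and assigned follow-ups for design review, "
    "localization, and the updated privacy checklist before the next sync."
)

# Produce a short abstractive summary to show as the document's TL;DR.
tldr = summarizer(document, max_length=40, min_length=10, do_sample=False)
print(tldr[0]["summary_text"])
```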
PRABHAKAR RAGHAVAN: Later this year, we'll add a new way to search
for local information with multisearch near me. Let's say there's quite a
tasty-looking dish online. With this new
capability, I can quickly identify that it's japchae, find
nearby restaurants that serve it, and enjoy it in no time. With an advancement we're
calling scene exploration, you'll be able to use
multisearch to pan your camera and ask a question and
instantly glean insights about multiple objects
in a wider scene. ANNIE JEAN-BAPTISTE: Today,
we're excited to share how we're starting to
use the Monk Skin Tone Scale to build more inclusive
products across Google. We've started using the
Monk Scale to help improve how we understand and
represent skin tone in products like photos and search. JEN FITZPATRICK: Even as technology
grows increasingly complex, we keep more people safe online
than anyone else in the world. We were the first consumer
technology company to offer two-step verification, and
we're now the first to turn it on by default. Whether
you're on Android or iOS, just one tap on your
phone and you're in-- no six-digit codes. We're constantly monitoring the
security of your Google account to give you peace of mind. We're now adding a safety
status on your profile picture. So if anything needs your
attention, we'll let you know and then guide you
through simple steps to ensure your
account is secure. We've engineered a
new technical approach we call Protected Computing. At its core, Protected
Computing is a growing toolkit of technologies that
transforms how, when, and where data is processed to
technically ensure the privacy and safety of your data. DANIELLE ROMAIN: Soon, if you
find search results that contain your contact details,
such as your phone number, home address, or email address,
that you want taken down, you can easily request their
removal from Google Search. SUNDAR PICHAI: The journey we
have been on with computing is an exciting one. We've always thought computers
should be adapting to people, not the other way
around, so we continue to push ourselves to
make progress here. To share more about how we are
making computing more natural and intuitive with the
Google Assistant, here's-- SISSIE HSIAO: You should be able
to easily initiate conversations with your Assistant. First is a new feature for Nest
Hub Max called Look and Talk. I can simply look
over and ask, show me some beaches in Santa Cruz. ASSISTANT: I found a few
beaches near Santa Cruz. SISSIE HSIAO: We're also improving
how the Assistant understands you by being more responsive
as you just speak naturally. For example, I might tap
and hold on my Pixel Buds and say, play the
new song from-- Florence and the something? ASSISTANT: Got it. Playing "Free" from Florence
and the Machine on Spotify. SUNDAR PICHAI: Today, we
are excited to announce LaMDA 2, our most advanced conversational AI yet. We are at the
beginning of a journey to make models like
these useful to people and we feel a deep
responsibility to get it right. That's why we have
made AI Test Kitchen. It's a new way to explore
AI features with a broader audience. This demo tests if
the model can take a creative idea you
give it and generate imaginative and
relevant descriptions. JOSH WOODWARD: And this will be the first-ever live demo of LaMDA from the stage. I'm going to tap
Start, and this is a project I've been
thinking a lot about lately: plant a vegetable garden. I'll send this off to
LaMDA, and there it is. On the fly, it's come up
with these different steps and broken it down into
this list of subtasks. One of the other
things LaMDA can do is not just break a task down into a list, but also generate tips. So here, when I tap
Generate a Tip-- it's never seen this
one before, actually. It's telling me that if I have a small yard or patio, there are different vegetables I might be able to grow. You can see all the
different pathways that LaMDA is helping
me think through and giving me tips
along the way.
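The AI Test Kitchen demo isn't driven by any public code, but the flow Josh described — ask a conversational model to break a goal into subtasks, then follow up for a tip — can be approximated with plain prompting. The `chat` function below is a hypothetical placeholder, not the LaMDA API.

```python
# Rough sketch of the demo flow described above; `chat` is a hypothetical
# placeholder for a conversational language model, not the LaMDA API.
def chat(prompt: str) -> str:
    """Stand-in for a conversational model; a real model would generate text."""
    return f"(model completion for: {prompt!r})"

goal = "plant a vegetable garden"

# Step 1: ask the model to break the idea down into a list of subtasks.
subtasks = chat(f"Break the project '{goal}' into a short list of subtasks.")
print(subtasks)

# Step 2: "Generate a tip" is just a follow-up prompt about the same goal,
# e.g. tailored to someone with only a small yard or patio.
tip = chat(
    f"Give one practical tip for someone with a small yard or patio who wants to {goal}."
)
print(tip)
```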
SUNDAR PICHAI: As you just saw, LaMDA 2 has incredible
conversational capabilities. To explore other aspects
of natural language processing and AI, we recently
announced a new model. It's called Pathways Language
Model, or PaLM for short. It's our largest model to date
with 540 billion parameters. And when we combine that
scale with a new technique called chain-of-thought
prompting, the results are promising. Chain-of-thought
prompting allows us to describe
multistep problems as a series of
intermediate steps. Now if you ask a model how many hours are in the month of May, it actually answers correctly.
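Chain-of-thought prompting isn't tied to any particular API: the idea is simply that the few-shot examples in the prompt spell out their intermediate reasoning, so the model tends to do the same before giving its final answer. Below is a minimal, hypothetical sketch using the "hours in May" question; the `generate` function and `COT_PROMPT` text are placeholders, not PaLM or its actual prompts.

```python
# Minimal sketch of chain-of-thought prompting; `generate` is a placeholder
# for a large-language-model call, not a real PaLM API.
def generate(prompt: str) -> str:
    """Stand-in for a large language model completing the prompt."""
    # A real model, primed by the worked example below, would tend to answer
    # with intermediate steps such as: 31 days * 24 hours = 744 hours.
    return ("There are 31 days in May and 24 hours in a day. "
            "31 * 24 = 744. The answer is 744 hours.")

# The few-shot example shows its reasoning, not just the final answer.
COT_PROMPT = """\
Q: How many hours are in a week?
A: There are 7 days in a week and 24 hours in a day. 7 * 24 = 168.
   The answer is 168 hours.

Q: How many hours are in the month of May?
A:"""

print(generate(COT_PROMPT))
```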
Recently, we announced plans to invest $9.5 billion in data centers and offices across the US. One of our state-of-the-art data centers is in Mayes County, Oklahoma. I'm excited to
announce that there, we are launching the
world's largest publicly available machine
learning hub for all our Google Cloud customers. We hope this will fuel
innovation across many fields, from medicine to logistics,
sustainability, and more. And, speaking of
sustainability, this hub is already operating at
90% carbon-free energy. This is helping us make
progress on our goal to become the first
major company to operate all our data centers and
campuses globally on 24/7 carbon-free energy by 2030. SAMEER SAMAT: Android 13
builds on our Material You design language so your
phone has even more ways to adapt to your style. Of course, Android 13 comes
jam-packed with dozens of new security and
privacy features. When you send a message from
your phone to someone else's, you want to be sure
it's private and secure. That's why we've worked with
carriers and device makers all over the world to
upgrade SMS text messaging to a new standard
called RCS, which can enable important
privacy protections, like end-to-end encryption. Today, we're excited to
introduce the new Google Wallet. It's a digital
wallet for Android that gives you
fast, secure access to your everyday essentials. TRYSTAN UPSTILL: We are
now approaching 270 million active users
on large-screen devices. Starting today, we'll be
updating more than 20 Google apps to look amazing
on large screens. LIZA MA: And here's one
of my favorite new features. Copy something on your phone
and paste it on your tablet. It could be a URL, an
address, or even a picture or screenshot. RICK OSTERLOH: An ambient approach
gets the tech out of your way so you can live your life while
getting the help you need. It doesn't matter what
device you're using, what context you're in,
whether you're talking, typing, or tapping. The technology in your life
works together seamlessly. SONIYA JOBANPUTRA: Pixel 6a gives
you the same great Android experience as our Pixel 6 Pro. We truly believe it's the
best smartphone we've ever offered for this price. BRIAN RAKOWSKI: Let me give
you an early preview so you can see what's
coming this fall. Pixel 7 and 7 Pro are
designed to deliver the most helpful, most
personal experience you can get in a smartphone. They'll use the next generation
of our Google Tensor SoC, bringing even more AI-heavy
breakthroughs and helpful, personalized experiences across
speech, photography, video, and security. Pixel Buds Pro-- these earbuds
have all the helpfulness and smarts you'd
expect from Google embedded in the best mobile
audio hardware we've ever designed. And they're the first Pixel Buds
with active noise cancellation. The new Google Pixel Watch. It's the first watch built
inside and out by Google, and it's coming this fall
with our new Pixel 7. SUNDAR PICHAI: Today, we talked
about all the technologies that are changing how we use
computers and access knowledge. Looking ahead, there
is a new frontier of computing, which
has the potential to extend all of
this even further, and that's augmented reality. At Google, we have been
heavily invested in this area. Let's take language
as an example. Language is just so fundamental
to connecting with one another-- the joy that comes
with speaking naturally to someone, that
moment of connection to understand and be understood. Let's see what happens when
we take our advancements in translation and
transcription and deliver them in your line of sight in one
of the early prototypes we've been testing.
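The prototype glasses weren't described in technical detail, but the basic loop — transcribe incoming speech, translate it, and render the result as a caption in your line of sight — can be sketched as below. Every function here is a hypothetical placeholder, not a Google API.

```python
# Hypothetical sketch of a translate-and-display loop; none of these
# functions are Google APIs, they are placeholders for the real components.
from dataclasses import dataclass

@dataclass
class Caption:
    original: str
    translated: str

def transcribe(audio_chunk: bytes) -> str:
    """Stand-in for streaming speech recognition."""
    return "¿Dónde está la estación de tren?"

def translate(text: str, target_lang: str = "en") -> str:
    """Stand-in for machine translation into the listener's language."""
    return "Where is the train station?"

def show_in_line_of_sight(caption: Caption) -> None:
    """Stand-in for rendering the caption on the AR display."""
    print(f"{caption.original}  ->  {caption.translated}")

def on_audio(audio_chunk: bytes) -> None:
    text = transcribe(audio_chunk)
    show_in_line_of_sight(Caption(original=text, translated=translate(text)))

on_audio(b"")  # one pass of the loop with placeholder audio
```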
SPEAKER: I'm actually looking straight into your eyes, and it seems like you're
looking right at me. SUNDAR PICHAI: That's what
our focus on knowledge and computing is all
about, the ability to spend time focusing on what
matters in the real world, in our real lives. And it's what we strive for every day, with products that are built to help. Thank you so much.