JESSICA DENE EARLEY-CHA: You can integrate your Android widgets with Assistant to give users quick answers and interactive app experiences on Assistant-enabled surfaces like Android and Android Auto. Let's see how widgets work within App Actions, how Assistant can use them, and the steps needed to make an existing widget accessible to Google Assistant. Since App Actions is
needed to integrate widgets with Assistant,
let's start there. App Actions is how you can
integrate Google Assistant into your Android
app, and is how end users can launch and control
Android apps with their voice. So when a user says, "Hey, Google, order a pizza from example app," Assistant will process the user's input with natural language understanding, matching the request to either a built-in intent or a custom intent, like the ORDER_MENU_ITEM built-in intent. It also performs entity extraction to pull out supported parameters, like the menuItemName parameter from the user's input, which in this case is "pizza." Check out our reference docs for the full list of supported built-in intents. Assistant opens the
Android app, passing along the data extracted from the user's voice request, and opens a screen that starts a pizza order. That's great for
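Capabilities like this are declared in the app's shortcuts.xml. Here's a rough sketch, with a hypothetical package and activity name (the capability and its menuItem.name field follow the built-in intents schema, so double-check them against the reference docs):

```xml
<!-- shortcuts.xml: sketch of an ORDER_MENU_ITEM capability.
     The package and activity names are hypothetical. -->
<capability android:name="actions.intent.ORDER_MENU_ITEM">
  <intent
      android:action="android.intent.action.VIEW"
      android:targetPackage="com.example.pizzaapp"
      android:targetClass="com.example.pizzaapp.OrderActivity">
    <!-- Maps the BII parameter to an extra on the Android intent. -->
    <parameter
        android:name="menuItem.name"
        android:key="menuItemName" />
  </intent>
</capability>
```

When Assistant matches the user's request to this capability, it launches the activity with "pizza" available under the menuItemName extra.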
starting a pizza order. But I know that when I'm hungry
and I'm waiting for my food, I want to know the status
of my order quickly. I don't want to
wait for a whole app to open just to find
out about my order. So when I say, "Hey,
Google, check my pizza order on example app," Assistant will process the input, match it to the GET_ORDER built-in intent, and extract "pizza" from the input as the orderedItemName parameter. Since the app has
been configured to return a widget
for this intent, my order status is displayed
within the Assistant. No need to load the entire app
just to give me a quick update. The widget is displayed right within the Assistant UI, and Assistant can also provide a spoken response to go along with it, with the response text provided by the widget itself. We'll revisit this text-to-speech feature later when we go over the steps needed to integrate an existing widget with Assistant. Widgets can be invoked by
the Assistant in two ways. The first is just what we
did, where the user asked for information that triggers
a built-in intent or custom intent. The widget is displayed
within the Assistant UI. The second is when
a user directly requests the app's widget. Users can say, "Hey, Google, show example app widget." By integrating your widgets with Assistant, users will also be able to discover them, since widgets are displayed within the Assistant UI with a chip that lets users add the widget to their home screen. In situations where the user
hasn't unlocked their device, or on Android Auto, widgets can still be surfaced as the result of a query. Now let's dive into how
to integrate your Android widgets with Assistant. Any existing widget can be
configured for Assistant widget fulfillment. If you don't have
one already, let's go over a few key concepts of
implementing an Android widget. Widgets are a
great way for users to quickly monitor information, complete tasks, or get inspired, right from their home screens. You can think of them as at-a-glance views of an app's most important data and functionality, accessible right from the user's home screen. A widget is built from a few elements: an AppWidgetProvider class that defines the widget's behavior, AppWidgetProviderInfo metadata that describes the widget, and a broadcast receiver declaration that allows the widget to be triggered. After implementing a widget,
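As a minimal sketch of those pieces (the class, layout, and view IDs here are hypothetical), the provider side looks like this; the matching receiver entry and appwidget-provider metadata go in AndroidManifest.xml:

```kotlin
import android.appwidget.AppWidgetManager
import android.appwidget.AppWidgetProvider
import android.content.Context
import android.widget.RemoteViews

// AppWidgetProvider is itself a BroadcastReceiver, which is why the
// widget is declared as a <receiver> in the manifest.
class OrderStatusWidget : AppWidgetProvider() {
    override fun onUpdate(
        context: Context,
        appWidgetManager: AppWidgetManager,
        appWidgetIds: IntArray
    ) {
        for (appWidgetId in appWidgetIds) {
            // Build the widget's layout and push it to the home screen.
            val views = RemoteViews(context.packageName, R.layout.order_status_widget)
            views.setTextViewText(R.id.status_text, "No active orders")
            appWidgetManager.updateAppWidget(appWidgetId, views)
        }
    }
}
```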
you'll create or modify an App Actions capability by adding an app-widget tag. Let's use the prior example of checking the status of my pizza order. Here is an example of a GET_ORDER capability with an app-widget tag. This built-in intent
contains parameters which are pulled from
the user's query. For example, "Check my pizza order on example app" will pull "pizza" as the value for the parameter called "name." This data will be sent as
extras via the Android intent. Also included in the intent extras is the "hasTts" configuration, which allows you to include custom introductions for your widgets. By setting it to "true," you let Assistant know that the widget will provide text to be read aloud using the text-to-speech function and displayed on the screen. We have some best practices
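Putting that together, a sketch of such a capability could look like the following; the widget class name is hypothetical, and the order.orderedItem.name field follows the GET_ORDER schema, so verify it against the built-in intents reference:

```xml
<capability android:name="actions.intent.GET_ORDER">
  <app-widget
      android:identifier="GET_ORDER"
      android:targetClass="com.example.pizzaapp.OrderStatusWidget">
    <!-- Maps the BII parameter to the intent extra key "name". -->
    <parameter
        android:name="order.orderedItem.name"
        android:key="name" />
    <!-- Tells Assistant this widget supplies its own TTS response. -->
    <extra
        android:name="hasTts"
        android:value="true" />
  </app-widget>
</capability>
```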
for text-to-speech later in this video. There are many times
when the user might not provide the value
of the parameter, for example, "Hey, Google, check my order on example app." So we'll need to include a fallback intent. A fallback intent requires no parameters; you can think of it as the "else" in a conditional statement. Fallbacks are needed whenever you have a parameter in your app widget. Here's an example of a fallback
intent for this capability. It only contains information
on constructing your Android intent. Now that the
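A sketch of that fallback, with a hypothetical widget class name: it is simply another app-widget element inside the capability, listed after the parameterized one and carrying no parameter mappings:

```xml
<!-- Fallback fulfillment: no <parameter> elements, so it matches
     queries like "check my order" that omit the item name. -->
<app-widget
    android:identifier="GET_ORDER_FALLBACK"
    android:targetClass="com.example.pizzaapp.OrderStatusWidget">
  <extra
      android:name="hasTts"
      android:value="true" />
</app-widget>
```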
capability is updated, the Android widget needs to be modified. It'll first need to extract the built-in intent name and its parameters to construct the widget for Assistant to use. This is done by adding the
Widget Extension Library and then extracting
the information. Add the library to the
Dependencies section of build.gradle. Here's an example
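Something like the following; the artifact coordinates are the ones documented for the App Actions widgets extension, but check the docs for the current group, name, and version:

```groovy
dependencies {
    // App Actions widgets extension library. Replace <version>
    // with the latest release listed in the documentation.
    implementation "com.google.assistant.appactions:widgets:<version>"
}
```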
of a class that extracts the built-in
intent name and parameters from the widget options bundle. It imports the AppActionsWidgetExtension class. Here we're accessing the data that was sent via the Android intent as extras. In this example, we are using bundles and the AppActionsWidgetExtension to pull the built-in intent name and its parameters. Then you could use that data to
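As a sketch (the widget class and the "name" parameter key are hypothetical; the extra constants are the ones exposed by AppActionsWidgetExtension, so verify them against the current API reference):

```kotlin
import android.appwidget.AppWidgetManager
import android.appwidget.AppWidgetProvider
import android.content.Context
import android.os.Bundle
import com.google.assistant.appactions.widgets.AppActionsWidgetExtension

class OrderStatusWidget : AppWidgetProvider() {
    override fun onAppWidgetOptionsChanged(
        context: Context,
        appWidgetManager: AppWidgetManager,
        appWidgetId: Int,
        newOptions: Bundle
    ) {
        // The BII that triggered the widget, e.g. "actions.intent.GET_ORDER".
        val bii = newOptions.getString(AppActionsWidgetExtension.EXTRA_APP_ACTIONS_BII)
        // The extracted parameters, e.g. name -> "pizza".
        val params = newOptions.getBundle(AppActionsWidgetExtension.EXTRA_APP_ACTIONS_PARAMS)
        val itemName = params?.getString("name")
        // Use bii and itemName here to build the widget's RemoteViews.
    }
}
```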
construct the UI of the widget. Next is to add the
text-to-speech to the widget. Here is a continuation of the previous example, where it sets the speech and text strings. This is the text-to-speech that was mentioned before in the capability configuration. The speech string will
be played to the user, and the text string
will be displayed as text in the Assistant UI. You may want to use
two different strings due to the differences between
written and spoken language. For example, you might want
to include the order number in the text to be displayed. But in the text-to-speech, you
should just refer to the order as "your order." You'll then update the widget with the text to present and update the widget UI. By doing this, the SDK automatically enables launcher pinning, adding the Add This Widget chip to the response. Going back to the text-to-speech
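A sketch of that step, continuing the order example (the builder methods follow the widgets extension API; the strings are illustrative):

```kotlin
// Attach the spoken response and the display text, then push the
// update; this also enables launcher pinning for the Assistant UI.
val widgetExtension = AppActionsWidgetExtension.newBuilder(appWidgetManager)
    .setResponseSpeech("Your order is out for delivery")   // read aloud by Assistant
    .setResponseText("Order 43512 is out for delivery.")   // shown in the Assistant UI
    .build()
widgetExtension.updateWidget(appWidgetId)
```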
feature that enables Assistant to speak the response
along with the widget, we have some style
recommendations. The first recommendation is
using simple and plain language because it has the
broadest appeal, making it accessible to people
of all backgrounds. Here's an example of a
speech where the spoken phrase is "Your order has been delivered," while the display text contains "Your order 43512 has been delivered." In situations where the order number isn't important or relevant, the speech phrase can omit it. Use contractions in your speech. Words like "cannot" and "do not"
can sound punishing and harsh. This example speech contains the
phrase "Account doesn't exist," while its text contains
"Account does not exist." Use a serial comma in a list of three or more items. Without the serial comma, individual items in your list can be misheard or misinterpreted as groups. For example, "Your last orders were yellow roses, daffodils, daisies, and sunflowers." Without the comma, "daisies and sunflowers" sound like they come together, while "daisies, and sunflowers" are clearly separated. Use numerals instead of text. Numerals make visual
content more glanceable. Similar to numerals, use symbols, like a currency sign instead of the spelled-out word, for glanceability. So instead of using the word
"dollar," use the dollar sign. Now, a few things to
avoid, like niceties. Niceties make responses feel distant and formal. Ditch them, and keep the tone friendly and informal. In this example, removing "Sure, I could tell you that" keeps the content plain and simple. Finally, avoid
exclamation points. They can be perceived
as shouting. Widgets can be used as fulfillment for a user's query. They are ideal for simple answers or brief confirmations. And since they're displayed within the Assistant UI, they can help users discover your widgets. The SDK automatically enables launcher pinning, so users will see the Add This Widget button and can add the widget to their home screen. And in hands-free contexts, widgets can be surfaced on lock screens and Android Auto. To learn more about App
Actions, check out our docs, codelabs, and videos. Join our developer community on Reddit, where you can chat with other App Actions developers. And stay up to date by
following us on Twitter. I'm Jessica. Thanks for watching, and I can't
wait to see what you build.