[AUDIO LOGO] FAHD IMTIAZ: Good
morning, everyone. Thanks for joining us today. [CHEERS AND APPLAUSE] It's fantastic. It's fantastic to be
here live in Shoreline and connect with so many of you. With Android reaching more
devices from phones to foldables to tablets to
Chromebooks, building apps that seamlessly adapt has
never been more crucial. We believe now is the
time to elevate your apps and make them shine
on any Android device, and this talk is
just about that. We are going to cover all
things adaptive, the strategies, the tools, and
the mindset shift. I'm Fahd, Product Manager on
the Android Developer Team. [CHEERS AND APPLAUSE] Over to Alex. ALEX VANYO: Hi, everyone. I'm Alex, a Developer
Relations Engineer on Android. [CHEERS AND APPLAUSE] FAHD IMTIAZ: 2023
marked the launch of our exciting new tablet
and foldable devices-- the Pixel Tablet
and the Pixel Fold. But what made it phenomenal
is working with you-- our developer partners--
optimizing your apps and games for the growing Android
device ecosystem of foldables and tablets. We have worked
with many apps that were optimized for these
devices, delivering powerfully unique experiences, and also
worked with more than 50 Google apps to drive optimizations
for these devices. And we have learned
a tremendous amount from your incredible work
optimizing your apps, but also directly from users
about how they engage with apps on these
devices in unique ways. Today, we'll share
those insights with you to help you build
outstanding experiences. Our talk today will cover
building adaptive layouts and how the new Compose Material
3 adaptive library helps, supporting input beyond
touch, and lastly, testing your adaptive apps. Let's get started. There are now over 300 million
active Android large-screen devices out there, including
foldables, tablets, and Chromebooks. Since the last I/O, in
addition to the Pixel Fold, more OEMs have launched
new foldable models showcasing the incredible
potential of this unique form factor. We can't wait to see the amazing
app experiences you will create for this exciting form factor. And speaking of foldables,
Counterpoint Research predicts foldable
smartphone shipments alone will top 100 million by 2027. That's massive. Let's switch gears to Play now. Building on the success of our
tablet-optimized Play Store experience, the new
foldable-specific Play Store experience with
tailored recommendations and foldable-specific
collections is a great way for
users to discover amazing large-screen
experiences, and for you to get the
spotlight on foldables. Last year, Google Play
began updating its featuring and ranking system on
large-screen devices to rank high-quality apps and games higher, both in editorial picks
and organic rankings, all based on the large-screen
App Quality guidelines. Play also increases
the visibility for apps to multi-device
users by allowing users to see tablet-specific ratings
and reviews for your apps on their phones. Apps that fail to meet the
technical quality threshold will also display a red
message on the app details page that warns users when
apps may not perform well on their device. Users want apps that meet these
essential levels of quality, but they also want apps that
can help them do so much more. So the takeaway-- maximize
the reach for your app as high-quality apps are more
discoverable across the board on tablets, foldables,
Chromebooks, and even phones for multi-device users. ALEX VANYO: This is why you
should build adaptive apps, apps that seamlessly adjust layouts
based on the available window space and more. We're committed to
making it easier than ever for you to build
adaptive UI into your apps because this isn't just about
the devices of the future, it's about optimizing
for the experiences that users demand today. We'll talk about some
new libraries soon, but let's first get
on the same page. What makes an app
truly adaptive? Traditionally, apps
could assume they are running full screen, either
on a phone or on a tablet. But fast forward
to today, and apps are running across many,
many different devices. This assumption that apps will
always take up the full screen is brittle as apps can be placed
in any arbitrary window size independent of the
physical display size. For example, you might be in
split-screen mode browsing the web to research
a topic, while also watching a video on YouTube. And we're seeing this firsthand
with Pixel Fold and Pixel Tablet users. 47% of Pixel Fold
monthly active users use split-screen
mode for multitasking and increased productivity
on their devices. Similarly, apps could
previously assume that they'd never be
resized, and that's simply no longer the case. Apps cannot opt out
of this behavior. With all of these different
devices where your apps can run and all of the
ways users interact with those apps
across these devices, it's time for a mindset shift. Here's how you should
think about adaptive apps. Adaptive apps
adjust layouts based on conditions, such
as changes to the size of the window in which your app
is running, like the window size changing when your app enters
split-screen mode, changes to the device posture,
like users placing a foldable into tabletop mode,
or users adjusting the screen density or changing
the font size. Adaptive apps dynamically
adjust their layouts by swapping components,
showing or hiding content based on the
available window size instead of just simply
stretching UI elements. For example, you should swap
the bottom navigation bar for a vertical navigation
rail on larger window sizes. These conditions provide a
granular and meaningful way to understand the context so
that apps can adjust the layout to optimize for the user's
experience accordingly. For adaptive apps, the available
window size is the guiding star. With features like freeform
windowing and split-screen mode on Android tablets,
your app won't always occupy the entire display. So the key to ensuring
that your app consistently functions across devices
in any windowing mode is to use the available window
size to determine your layout. Adaptive apps also seamlessly
respond to changes, like a foldable device opening
or closing, a device rotating, or the size of an app window
changing in split-screen mode, and while doing
so, adaptive apps retain their state and offer
seamless continuity to users when conditions change. Expectations are changing
for Android app development. Adaptability for your
app to any UI size unlocks the seamless experiences
that users demand today. We expect apps to be used across
more device surfaces and window sizes, and apps will need to
become more adaptive in turn. Building an adaptive
app also sets you up for long-term success. Let's discover how. FAHD IMTIAZ: So with
those use cases in mind, we are releasing a
new Compose library that helps you build adaptive
layouts, which is now in beta. This library includes
layouts and components that adapt as users
expect when switching between small and
larger window sizes. Let's take a look
at this in action with SAP, who have integrated
the Compose Material 3 adaptive library into the SAP
Mobile Start app. This app provides a
native mobile entry point into SAP's business
applications and content. First, let's take a look
at the navigation UI. For optimal ergonomics,
users expect the navigation UI to
adapt to the window size, swapping between a
navigation bar and navigation rail as the window size changes. The new NavigationSuiteScaffold
component automatically switches between a navigation
bar and navigation rail depending on the
overall window size. SAP was already swapping these
components with custom logic. By using the new
scaffold, they were able to replace that custom
code with the logic provided by the Compose
Material 3 adaptive library without changing
visible behavior to users. SAP told us they were able
to delete their custom logic and switch to
NavigationSuiteScaffold within five minutes. Their previous logic was
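Adopting the scaffold can look roughly like this in Compose (a minimal sketch; the destinations, icons, and state handling here are illustrative, not SAP's code):

```kotlin
import androidx.compose.material.icons.Icons
import androidx.compose.material.icons.filled.Favorite
import androidx.compose.material.icons.filled.Home
import androidx.compose.material3.Icon
import androidx.compose.material3.Text
import androidx.compose.material3.adaptive.navigationsuite.NavigationSuiteScaffold
import androidx.compose.runtime.*
import androidx.compose.runtime.saveable.rememberSaveable

@Composable
fun AppScaffold() {
    var selected by rememberSaveable { mutableIntStateOf(0) }
    val destinations = listOf("Home" to Icons.Filled.Home, "Favorites" to Icons.Filled.Favorite)

    // Shows a navigation bar on compact windows and a navigation rail on
    // larger ones -- no custom breakpoint logic needed.
    NavigationSuiteScaffold(
        navigationSuiteItems = {
            destinations.forEachIndexed { index, (label, icon) ->
                item(
                    selected = index == selected,
                    onClick = { selected = index },
                    icon = { Icon(icon, contentDescription = label) },
                    label = { Text(label) },
                )
            }
        }
    ) {
        // Content for the selected destination goes here.
    }
}
```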
using window size classes to decide which
navigation UI to show. And the new component
does the exact same thing with the configurable
default behavior. Because app may not-- [APPLAUSE] --apps may not always
take the full screen, the default behavior is based
on using window size classes as UI breakpoints. Window size classes partition
the raw available window size into buckets. Each window has a
width and height, and the possible widths
and heights of a window are distributed into three
window size class buckets each. Let's take a look. For width, a window that
is less than 600 dp wide falls into the compact-width window size class. A window that is at least 600 dp wide and less than 840 dp wide falls into the medium-width window size class, and a window that is 840 dp or larger falls into the expanded-width window size class. Similarly, there are
also separate breakpoints that define compact,
medium, and expanded-height window size classes. But available width
is usually more important than available
height due to the ubiquity of vertical scrolling. The default behavior in
Compose Material 3 Adaptive for NavigationSuiteScaffold
is to show a navigation bar if the window has compact width or compact height. Otherwise, on medium or
larger-width window size classes, or medium or
larger-height window size classes, show a navigation rail. Let's look at another
example of how the adaptive layouts can help. The new library also has
ListDetailPaneScaffold and SupportingPaneScaffold,
which help you implement
canonical layouts that we recommend in many cases like
list detail and supporting pane. Looking at
ListDetailPaneScaffold specifically, this
component handles presenting a list and
detail view together in a way that adapts to how much
available window space there is. If there is room
to display both, say, in expanded-width
window size class, then both the list content
and the detail content will be displayed
together side by side. If the window shrinks, say, down
to compact-width window size class and there's not
enough room to show both, then only one or the
other will be shown. The component also supports
swapping between the list and detail panes
with animation when there's only room to show one. SAP has integrated the
SupportingPaneScaffold in the Mobile Start app
for the To-Do screen. This allows SAP to
display the comments next to the main content
on expanded-width windows to allow users more context
during their approval workflows. The new component in
Compose Material 3 adaptive helped SAP build an
experience that adapts to the available window size. The available window
size is not a constant as users rotate, fold,
and resize your apps. With SupportingPaneScaffold,
SAP was able to provide users a
delightful experience that preserves continuity across
different devices and device states. Building UI with the
Material 3 adaptive library has a full, dedicated
talk that dives into the specifics
of these new APIs, so please make sure
to check that out. ALEX VANYO: Even if you
aren't using the new Material 3 adaptive library
quite yet, we recommend that all apps use the
principles and the APIs that the new library is
using under the hood that make it adaptive. When working on the Compose
Material 3 adaptive library, one of the main principles
we isolated was conditions-- the inputs driving how
your layout should adapt. Conditions are meant to simplify
how top-level layouts behave. There's nothing particularly
special about conditions in the architectural sense. They're just pieces
of state that we can use to drive behavior. What is important
about these conditions is defining exactly which
inputs to use and when. The window size and
the foldable posture are the main
conditions that we see users expect apps to react to. There are some possible
pieces of state that could be used as
inputs, but we do not use them as
conditions because we have seen them cause an
extremely confusing, buggy user experience. Physical display size, device
orientation, window orientation, and device type
are all not inputs that users are expecting
apps to react to. Let's talk a bit more
about all those inputs that are related
to size and go over first some examples of what
users expect when using apps. The physical display size
is a piece of information that you can query
from the system. However, in almost
every situation, this is not a useful
API for you to use. Your app is not guaranteed to
fill up the entire display size. As you saw earlier, users
want to use your apps in multi-window
mode to multitask, and you can't prevent this. Your users will put your
app into multi-window mode and expect it to work. If you are using the
physical display size and you're assuming that
your app is filling up that entire space, that
assumption will be wrong. If you make layout decisions
based on that assumption, your layouts will be wrong
when your app's window doesn't match up with the display size. And speaking of
the display size, it might have been
true in years past that there was a single
display size to worry about, but that is no longer true. Foldables and flipables often
have two physical displays, each of which has
a different display size and other parameters. The number of available physical
displays also isn't a constant. Many devices can be plugged
into an external monitor, adding a new physical
display at runtime. What really matters to your app
is your current window size. This is also what users expect. If they resize your
app, they expect it to care about the window
size that they are intentionally giving it. And this means that your
window size will change, and you also can't prevent this. When your app gets resized, that
causes a configuration change to occur. By default, each configuration
change recreates your activity. When this happens, all
of the in-memory state in your old activity is lost
as a new activity is created. It is critical that your app
save its state correctly, either in saved instance
state or by retaining it in a view model. If you lose state, then
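To make the failure mode concrete, here's a deliberately simplified, non-Android model of recreation (these are our own toy classes, not the framework API): fields held only in memory vanish when the instance is recreated, while values routed through saved state come back.

```kotlin
// Toy model of activity recreation. A configuration change destroys the old
// instance and builds a new one, so only what goes through the saved-state
// map survives.
class ToyActivity(savedInstanceState: Map<String, String> = emptyMap()) {
    var draft: String = savedInstanceState["draft"] ?: ""  // restored on recreation
    var scratch: String = ""                               // in-memory only, lost

    fun onSaveInstanceState(): Map<String, String> = mapOf("draft" to draft)
}

// What the system effectively does on a configuration change.
fun configurationChange(old: ToyActivity): ToyActivity =
    ToyActivity(old.onSaveInstanceState())
```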
users see a completely broken experience with your app. Not only did the
app not adapt its layout when they resized it, it also lost their place and their work. As a user, it is aggravating
to be almost finished with some form, rotate the device, and then be forced to go back through and fill it all out again. To calculate the window
size, both for the first time and each time it changes, you
should use the WindowMetricsCalculator computeCurrentWindowMetrics
WindowManager. This is the same method that
the new Compose Material 3 adaptive library uses
to determine the window size to decide how to
adapt its components. Once you are using
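The metrics come back in pixels, while window size classes are defined in dp, so you divide by density first. A standalone sketch of that conversion (the helper is ours; the actual entry point is WindowMetricsCalculator.getOrCreate().computeCurrentWindowMetrics(activity)):

```kotlin
// px -> dp conversion used before bucketing a window into size classes.
// dp = px / density.
data class WindowDpSize(val widthDp: Float, val heightDp: Float)

fun windowDpSize(widthPx: Int, heightPx: Int, density: Float): WindowDpSize =
    WindowDpSize(widthPx / density, heightPx / density)
```

For example, a 1080 x 2400 px window at density 2.625 is about 411 x 914 dp, which is compact width.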
the window size, you have a couple of options
to decide what layouts to use. You could look at the width,
the height, or the orientation, but we don't recommend
using the orientation. Instead, we recommend
using the window size classes that are based on
the width and the height. Let's take a closer look at why. It has been common to have two
different layouts-- a landscape layout and a portrait
layout-- for your app, depending on the
orientation of your window. However, this can lead
to awkward behavior for window sizes that are very
close to a one-by-one aspect ratio. For example, if we're showing
two panes in landscape and one pane in
portrait, then a window that is just barely landscape
will show two panes. If the user then makes
the window slightly bigger by expanding the
height of the window, the window will become
portrait and, therefore, we will show one pane. But this is counter-intuitive. Resizing the app to
give it more space resulted in taking
away information. So instead of using
landscape or portrait to decide how many
panes to show, you should instead use
window size classes. The default behavior
recommended by Material Design and implemented in Material 3 adaptive is to have one
pane when the window is compact or medium width and
two panes when the window is in expanded width. And because the number
of panes displayed is based only on the width,
we get much more predictable behavior as the window resizes. Let's talk more
about panes and how they compare to another term
that is hard to define-- screens. In common usage,
you probably refer to different parts of your
app as different screens. You might have a settings
screen, a profile screen, a home screen, a list screen,
and a detail screen. The term screen here,
however, has a drawback. It implies that your
screen is filling up the entire physical screen. On an expanded window
size, you have enough room to display both your list screen
and detail screen together. So now, neither
screen is filling up the entire physical screen,
so you have multiple screens on screen together. To avoid this confusion, see
if you can avoid the screen term altogether. Think about grouping your
content into different panes. Each pane is still
a set of content that you can navigate to. On smaller window sizes, you'll
probably see one pane at a time, and navigating
somewhere will just show one pane at
each destination. If the content in
a pane can nicely adapt to fill up
the entire window, there's nothing
wrong with always showing one pane that
fills up the entire window, even at larger window sizes. But the content in your
panes relates to each other, so you can often do
better than that. Your list pane displays
a list of content, and your detail pane
displays more information about a specific
piece of that content if you have enough space
to show both the list pane and detail pane together. And if you don't
have enough space, you can fall back to only
showing one pane at a time. When the window
size changes, you should hide or show
a pane when reacting to how much space you now have. But which pane you
show or hide should depend on which
destination you are at. This is compatible with one of
the principles of navigation that creates a consistent,
predictable user experience. How many panes you are showing
at a destination may change, but the destination
you're at doesn't. You should always have
some specific pane that remains visible
regardless of how many times the window size
shrinks or expands. Grouping content into panes
instead of just screens also unlocks more
flexible opportunities to rearrange panes, including
by user interaction. One request that
we've seen from users is the ability to resize panes. When in a two-pane
layout, sometimes a user really wants to
focus on the content in one pane over the other. The solution here should be
a pretty familiar interaction: expanding
panes with a draggable handle that will resize and
hide one of the two panes. Let's see this in action
with Google Calendar. The Google Calendar app now
features a new supporting side pane on expanded-width
window sizes. This feature allows users to
see relevant information side by side. Users can view details
of a specific event while retaining the context
of their daily schedule or review pending
tasks while checking their weekly commitments. And with an easy drag, the
size of the supporting pane can be adjusted as
the user desires. In Android 15, this
feature is being added to activity
embedding, and the support will also be added to
Material 3 adaptive. So everything we
just covered was related to the primary condition
for adapting: the overall window size. As a summary, don't use
the physical display size. Don't use the
device orientation. And don't use the
window orientation to drive top-level layouts. Instead, use window
size classes. There are other
conditions that also influence how your
app layouts should adapt to a specific situation. The posture of foldables
is another condition that you can use to
provide an app that adapts. The folding features of a device are another piece of information that Jetpack WindowManager provides. With this information, you can
adapt layouts in tabletop mode to provide interactable UI on
the bottom half of the window, and consumable UI on the
top half of the window. These are the main two
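As a sketch, the two half-opened postures can be classified like this (our own enums stand in for Jetpack WindowManager's FoldingFeature.State and FoldingFeature.Orientation):

```kotlin
// Standalone stand-ins for FoldingFeature.State and FoldingFeature.Orientation.
enum class FoldState { FLAT, HALF_OPENED }
enum class HingeOrientation { HORIZONTAL, VERTICAL }

// Tabletop: half-opened around a horizontal hinge, bottom half lying flat.
fun isTabletop(state: FoldState, hinge: HingeOrientation): Boolean =
    state == FoldState.HALF_OPENED && hinge == HingeOrientation.HORIZONTAL

// Book: half-opened around a vertical hinge.
fun isBook(state: FoldState, hinge: HingeOrientation): Boolean =
    state == FoldState.HALF_OPENED && hinge == HingeOrientation.VERTICAL
```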
conditions that the Material 3 adaptive library
reacts to today, and makes available through
its lower-level APIs-- the available window size
class, and the foldable posture of the device. As we see devices continue
to become more varied and adaptive
themselves, we expect that the types of
conditions that users expect apps to adapt to will grow. Let's see how all of
these areas come together when looking at an
app that is using the camera while being resized. When in this mode, it should
become immediately clear that the orientation and aspect
ratio of the camera sensor is, in general,
completely independent of the orientation and
aspect ratio of your app's window and its UI. As the window resizes
and posture changes, the layout should adapt
to the available space. Components shouldn't
just stretch. Some should swap
out for components that make better use
of the space available. FAHD IMTIAZ: So we have talked
about some decisions to avoid, but we see many apps
following more anti-patterns. Let's take a look at
some of these now. You may have been
wondering about locking the orientation of your
activity and if doing that will prevent window
size changes. The answer is no. Entering multi-window mode
will resize your activity. Folding and unfolding a device
will still resize your activity. Resizing an app in
freeform windowing mode will still resize your activity. Additionally, apps
should not be locking orientation in the first place. If a user wants to lock the
orientation of their device, they can choose to do so
using the system level auto-rotate feature. If the users have system-level
auto-rotate enabled and they rotate their devices,
they expect apps to rotate. Apps locking orientation is
a usability and accessibility issue. For that reason, apps
that lock orientation will be pillarboxed on
large-screen devices. Pillarboxed apps see
decreased discoverability on Play browse surfaces
across foldables, tablets, and Chromebooks. Starting in Android 14, users
can override the app behavior and stretch these pillarbox
apps to fill the landscape screen to make better
use of the large display. But here's the thing, stretching
isn't the same as optimizing, and it's far from the polished
experience that users desire. As users want to use their
device and their apps differently, they will rotate,
fold, and resize many times. All of these change
the available window space for your app,
and users expect apps to adapt to that space. Another anti-pattern that we
see apps having issues with is unnecessarily
requiring features. Play Store will prevent users
from downloading your app on devices that don't
support all of the features that your app requires. This system made a lot of
sense with single-purpose apps. I mean, it wouldn't
really make sense to install a camera
app on a device that did not support a camera. But apps can do so
much more these days, and features may
only be required in specific user journeys. Many permissions like
camera and location can result in your app
requiring a feature, even if that feature is not
essential to your app's core experience. This completely blocks your
app from being installed on devices that don't satisfy
the feature requirements, which can include phones, tablets,
foldables, Chromebooks, and even cars running
Android Automotive OS. To avoid that, mark these
features in the manifest as not being required. Don't require a camera
or extra camera features. Don't require a physical GPS. Don't require the device
to support both portrait and landscape orientations. And lastly, don't
require a touchscreen. Let's talk about
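In the manifest, that means declaring such features with required="false" (the feature names below are the standard Android ones matching the four examples just mentioned; keep only the entries relevant to your app):

```xml
<!-- The app can use these features when present, but installs without them. -->
<uses-feature android:name="android.hardware.camera.any" android:required="false" />
<uses-feature android:name="android.hardware.location.gps" android:required="false" />
<uses-feature android:name="android.hardware.screen.portrait" android:required="false" />
<uses-feature android:name="android.hardware.screen.landscape" android:required="false" />
<uses-feature android:name="android.hardware.touchscreen" android:required="false" />
```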
another crucial aspect now, how users
interact with your app across different device types. Let's dive into input where
we will cover supporting input methods like keyboard,
trackpad, and stylus. If you have developed
an app for phones, chances are touch input
was your main focus. But think about it. Your users are using your
apps across device types, from foldables to tablets
to desktop devices like Chromebooks. Larger screens
invite productivity and creative expression. Users turn to keyboards,
mice, and styluses for precision, expression,
and accessibility. The use of keyboards
with tablets has been increasing as it
allows users to be even more productive, allowing them to get
things done more efficiently. And on Chromebooks,
90% of app interactions are with the keyboard. So the takeaway-- make sure
your app supports input methods beyond touch, like keyboard,
trackpad, mouse, and stylus. Jetpack Compose
over the past year has been enhanced to provide
you with more interactions out of the box. Keyboard interactions like tab
navigation and interactions like mouse or trackpad
click, select, scroll are now available out
of the box with Jetpack Compose 1.7. In Android 15, we are revamping
the System Shortcuts Helper, which provides you with
an easy standardized way to make your apps'
keyboard shortcuts more discoverable to users. By overriding the
onProvideKeyboardShortcuts method, you can publish
your supported shortcuts with individual KeyboardShortcutInfo objects, which will then be easily discoverable
by users in the System Shortcuts Helper. Let's see an example
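Inside an Activity, that override can look roughly like this (the group label and key bindings are illustrative):

```kotlin
// Sketch: publishing shortcuts to the System Shortcuts Helper.
override fun onProvideKeyboardShortcuts(
    data: MutableList<KeyboardShortcutGroup>?,
    menu: Menu?,
    deviceId: Int
) {
    val tabShortcuts = KeyboardShortcutGroup("Tabs").apply {
        addItem(KeyboardShortcutInfo("New tab", KeyEvent.KEYCODE_T, KeyEvent.META_CTRL_ON))
        addItem(KeyboardShortcutInfo("Close tab", KeyEvent.KEYCODE_W, KeyEvent.META_CTRL_ON))
    }
    data?.add(tabShortcuts)
}
```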
with the Chrome app. Android productivity
users familiar with desktop Chrome
shortcuts like opening a new tab or a window, can
make an easy transition to Android tablets. With recent updates to
the Chrome Android app, supported shortcuts now align
to and support desktop shortcuts that were previously missing. These additions don't only
make basic navigation easier but also provides easier
access to Chrome features such as the bookmark manager. Let's move to styluses now. Styluses with Android
tablets and foldables allows users to express
themselves without any limits, be it capturing those fleeting
ideas quickly, or adding those artistic touches. Stylus users on
Android can remain more productive with
new support for stylus handwriting in text fields. With Jetpack Compose
1.7, stylus handwriting is supported out of the
box with BasicTextField. Android is also
enabling your app to help users be more
productive with the notes role in Android 15. When you register your
app as a note-taking app, users can open your app
right from the lock screen or run it as a floating window
for enhanced note-taking. Make sure to check out the talk on increasing user productivity with large screens and accessories to learn more
input methods. So we have talked about layouts
and we have talked about input. How about making sure it all
comes together and works well? Over to Alex to talk about
testing your adaptive apps. ALEX VANYO: We've
heard your feedback that supporting additional
window sizes leads to increased testing and maintenance costs. To know what's most
important to test, our large-screen-quality
guidelines and test checklists are
a great place to start. They're designed to simplify
the testing process, outlining the
critical user journeys you must test to ensure
a great experience across different devices. Part of lowering the cost of
maintaining adaptive layouts is Compose itself. Compose makes it
easier to build layouts that adapt without needing
to duplicate and maintain different layout files
like you would with views. We are also adding
new testing tools that help with testing UI
on a variety of device configurations,
including window sizes. DeviceConfigurationOverride is
a new testing API in Compose 1.7 that allows locally overriding
pieces of device configuration. Previously, things like
locale, font size, dark theme or light theme, and
arbitrary sizes for UI were harder to test
because these are all device-level parameters. You can try to change these
parameters at runtime, but this requires setup and
tear down around your tests that you have to
orchestrate carefully to not impact other tests. You could also launch
multiple emulators with different
configurations, but this requires running tests multiple
times across a suite of devices for complete coverage. Either way, it makes a
complete group of tests hard to coordinate. Instead,
DeviceConfigurationOverride is a composable API that allows
specifying an override that is applied locally
just for the content under test contained
inside of it. For example, DeviceConfigurationOverride.LayoutDirection allows overriding
the layout direction locally for the
content under test so that you can verify that
content isn't cut off or shown incorrectly with right to left
compared to left to right. DeviceConfigurationOverride.FontScale allows setting the
font scale locally for the content under test. By passing two to
font scale, you can check how the
content behaves when non-linear font scaling is
applied in Android 14 and above. And my favorite is
DeviceConfigurationOverride.ForcedSize. This override takes a DpSize
for a width and a height. The content inside the override
will be given that much space to render, regardless of
how much space is actually available. For example, you
can run tests when your UI is rendering at
840 dp wide by 700 dp tall, even if that test is
running on a phone emulator. Normally, this wouldn't work
since the UI would be cut off if you tried to use something
like Modifier.requiredSize, and then assertions for whether
something would be displayed could fail. Instead, the trick here
is that ForcedSize locally modifies the effect
of density which makes the content inside think
it has enough space to render. With this approach,
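The arithmetic behind that trick is small: since px = dp × density, forcing a dp size just means picking the density at which the requested size fits inside the pixels you really have. This is our own sketch of the idea, not the library's implementation:

```kotlin
// Choose the density at which requestedWidthDp x requestedHeightDp fits
// inside the physically available pixels (px = dp * density).
fun forcedDensity(
    availableWidthPx: Float,
    availableHeightPx: Float,
    requestedWidthDp: Float,
    requestedHeightDp: Float,
): Float = minOf(
    availableWidthPx / requestedWidthDp,
    availableHeightPx / requestedHeightDp,
)
```

For instance, to render 840 x 700 dp inside a 1080 x 1920 px phone window, the override only needs a density of about 1.29 instead of the device's real density.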
you can test your UI in different amounts
of available space on a single emulator with a
single run of your test suite, so you don't have to run
your tests multiple times in different emulators
and maybe filter tests based on which environment
you are running in. You can also parameterize
tests to run multiple times in different amounts of space or
with different font sizes or any of the other device
configuration overrides. Integration tests that
use semantic information to control the UI are
great to verify and prevent regressions for the
functionality of your UI, but they aren't verifying what
your layouts look like visually. For that, we are
launching official support for host-side screenshot
tests of your Compose layouts with the Android Gradle plugin. Host-side screenshot tests
provide a fast and scalable way to verify that the UI
visually hasn't changed, and these can be used
to test the UI is looking the way you'd expect
across many different window sizes. And if you're already familiar
with creating Compose previews, it's just as easy to create
new host-side screenshot tests for your app. And finally, some updates
on manual testing tools. When developing,
use multi-preview annotations to check
what your UI looks like across multiple
different sizes, font scales, and more. The UI checks for
Compose previews also highlight common
usability issues that might be present in your
previews, such as buttons or text fields being too wide. We also have a
suite of emulators, including the Pixel
Fold and Pixel Tablet, to help you test your app's behavior in all the different ways
that it's being used. The resizable emulator
is a single emulator that can swap between
a phone-size screen, a foldable-size screen,
and a tablet-size screen. And there's also the
desktop emulator, which makes it easier to test
freeform windowing, mouse hover, and keyboard shortcuts. And we also know that emulators
aren't perfect for every case. And there's often no
replacement for debugging an issue on a real device. For that, we are integrating
remote device streaming into Android Studio. Android Device Streaming
powered by Firebase allows you to securely connect
to remote physical Android devices, including a selection
of the latest Pixel and Samsung devices, which are hosted
in Google data centers. While connected, you can
install and debug apps, run ADB commands,
rotate and fold, and make sure your
app is working great across a variety
of real devices. Try Android Device
Streaming today at no cost in the latest version
of Android Studio Jellyfish after you sign in and
connect your IDE to a Firebase project. With testing, there's
no single method that's perfect for everything
you should be verifying, but that doesn't mean you
shouldn't test at all. We have a new guide on
developer.android.com that goes into
more detail on how
all the window sizes that users are putting them in. So we've covered a
few different areas that all relate to building apps
that work across different form factors: adaptive
layouts, inputs, and testing. For every feature that you
design, develop, and change, take these aspects into account
as part of your workflow to make your app better for your users. If you're reviewing
code today, you should already be on the
lookout for hard-coded strings or assumptions about left-to-right layout. Leaving these in becomes
immediate tech debt and will cause issues for
translation and right-to-left languages. You should now become
similarly watchful if you ever make layout
decisions based on the window orientation, the physical
type of the device, or the physical display size. Instead, you should
be using window size classes for top-level layouts. Put your content into panes to
unlock displaying more than one at a time. You should also be wary if
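A minimal sketch of that pattern using the Material 3 adaptive library (`TwoPaneLayout` and `SinglePaneLayout` are hypothetical composables standing in for your own layouts):

```kotlin
import androidx.compose.material3.adaptive.currentWindowAdaptiveInfo
import androidx.compose.runtime.Composable
import androidx.window.core.layout.WindowWidthSizeClass

@Composable
fun HomeScreen() {
    // Branch the top-level layout on the window size class,
    // never on device type, orientation, or physical display size.
    val windowSizeClass = currentWindowAdaptiveInfo().windowSizeClass
    if (windowSizeClass.windowWidthSizeClass == WindowWidthSizeClass.EXPANDED) {
        TwoPaneLayout()    // hypothetical: e.g. list and detail side by side
    } else {
        SinglePaneLayout() // hypothetical: one pane at a time
    }
}
```

Because the size class comes from the current window, the same code adapts when the app is resized, folded, or put into a freeform window.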
you're ever locking orientation or don't consider
and test what happens when the window size changes. Your app will be resized. The user will rotate
and fold their device and resize your app in a window. It is absolutely
critical to save UI state as configuration changes happen. Users don't only
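One low-effort way to do that in Compose is rememberSaveable; this sketch, assuming the Material 3 `TextField`, keeps a search query across rotation, folding, and window resizing:

```kotlin
import androidx.compose.material3.TextField
import androidx.compose.runtime.Composable
import androidx.compose.runtime.getValue
import androidx.compose.runtime.mutableStateOf
import androidx.compose.runtime.saveable.rememberSaveable
import androidx.compose.runtime.setValue

@Composable
fun SearchBar() {
    // rememberSaveable survives configuration changes (rotating, folding,
    // resizing in a window), unlike plain remember.
    var query by rememberSaveable { mutableStateOf("") }
    TextField(value = query, onValueChange = { query = it })
}
```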
use the touch screen to interact with your app. Styluses, keyboards,
mice, and trackpads all have the potential to
make your users more productive with your apps. Make sure you aren't making it
impossible to use them instead and introducing
accessibility issues. And just like everything
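As a small illustrative sketch, a composable can handle hardware keyboard input alongside touch (`onTogglePlayback` is a hypothetical callback; the element must be focusable to receive key events):

```kotlin
import androidx.compose.foundation.focusable
import androidx.compose.foundation.layout.Box
import androidx.compose.runtime.Composable
import androidx.compose.ui.Modifier
import androidx.compose.ui.input.key.Key
import androidx.compose.ui.input.key.KeyEventType
import androidx.compose.ui.input.key.key
import androidx.compose.ui.input.key.onKeyEvent
import androidx.compose.ui.input.key.type

@Composable
fun PlayerSurface(onTogglePlayback: () -> Unit) {
    Box(
        modifier = Modifier
            .onKeyEvent { event ->
                // Let hardware keyboards drive the same action as a tap.
                if (event.type == KeyEventType.KeyUp && event.key == Key.Spacebar) {
                    onTogglePlayback()
                    true // event consumed
                } else {
                    false
                }
            }
            .focusable() // key events are delivered to the focused element
    ) { /* content */ }
}
```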
else in your app, you should be concerned if
you aren't testing for it. There's a selection of
tools for each area: the new
DeviceConfigurationOverride APIs in Compose, screenshot
testing, along with more types of emulators and
device streaming. All of these areas should be
brought up as soon as possible for the same reason that
hard-coded strings and layout-direction assumptions should be
caught as soon as possible: these are all causing
real usability and accessibility issues for your users today. Android apps need to be built
with an adaptive mindset. Anything else is already
harming your users' experience, and it will only become
a worse experience as devices continue to become
more adaptive themselves. Users want your apps to
be running in more places, and they want them to be
running at their best. Android has a solid foundation
for making that a reality. And we are building
more tools that make it easier to unlock even more
surfaces for your app to shine. Jetpack Compose is the best
way to build adaptive UI. Try out the new Compose
Material 3 adaptive library, and watch Building UI with
Material 3 adaptive library for a deep dive into
the components in it. And for developers, if you
haven't thought about most of these things in your app yet, and even
if you aren't using Compose yet, start with the most
impactful things. Save your UI state. Remove references to
physical display size. Remove orientation locks. And ensure different
input types are supported. For designers, give guidance
on the simple changes that go a long way,
like avoiding buttons, text, and text fields that fill up
the entire width of a tablet. Check out the
Designing adaptive apps talk for what goes into designing
apps that are adaptive and to learn more about why
just stretching components isn't enough. Please continue to give
us feedback, both on what experiences your
users want to see and how the platform and
tools can solve problems that you are facing. And also, please check
out our other talks. Thank you very much. [MUSIC PLAYING]