(music) How is your data protected
when using Microsoft Copilot, and how do you get ready for it? Well, in the next few minutes, with Copilot for Microsoft 365, now more broadly available to
organizations of all sizes, I'll unpack how you can
securely take advantage of generative AI across
Microsoft 365 app experiences. And I'll also go through the steps and resources to deploy it at scale. Now, if you're new to
Copilot for Microsoft 365, it lets you use natural language prompts to interact with your organization's data and generate personalized
content and responses with relevant insights that are
unique to your work context. While you only see the generated response to your original prompt, behind the scenes
Copilot for Microsoft 365 interprets your request, and if necessary, will find information you have access to within your organization
from your work files sitting in SharePoint and OneDrive, as well as email and calendar
via the Microsoft Graph. And it presents this information
as additional context along with your original prompt
to the large language model to generate a personalized
and informed response.
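To make that flow a little more concrete, here is a minimal conceptual sketch in Python of how a prompt can be grounded with retrieved, permission-trimmed context before it reaches the model. The function names and the tiny in-memory "corpus" are illustrative stand-ins, not Copilot's actual orchestration or any Microsoft API.

# Conceptual sketch only: grounding a prompt with retrieved work context.
# The stubs stand in for retrieval via the Microsoft Graph and for the LLM.
def search_work_content(query: str, user: str) -> list[str]:
    # Stand-in for retrieval: return only snippets this user may read.
    corpus = {
        "alex": ["Contoso purchase agreement: key terms and payment schedule.",
                 "Q3 planning notes for the sales team."],
    }
    words = [w for w in query.lower().split() if len(w) > 3]
    return [s for s in corpus.get(user, []) if any(w in s.lower() for w in words)]

def call_model(prompt: str) -> str:
    # Stand-in for the large language model; nothing is retained here.
    return "[generated response grounded in " + str(prompt.count("\n")) + " lines of prompt]"

def answer(user: str, user_prompt: str) -> str:
    snippets = search_work_content(user_prompt, user)        # grounding step
    context = "\n".join(snippets)
    grounded = "Context from your work data:\n" + context + "\n\nRequest:\n" + user_prompt
    return call_model(grounded)                               # prompt + retrieved context

print(answer("alex", "List the key points from the purchase agreement"))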
And even though none of this information is retained by the large language model, to securely take full
advantage of generative AI, you can and should protect
data at every stage: from the information contained in the user prompt, to the information retrieved based on user access permissions, to the generated response itself, so that sensitive data is
not inadvertently exposed when it shouldn't be. And the good news is controls for security and privacy over your
data exist at every stage, and will leverage the sensitivity labels and the corresponding policies that you've already got in place. Now, I'll start by showing
you a couple of examples of the benefits of these data
protection controls in action, and then I'll show you how to configure them yourself as an admin. In this case, I'm using
Copilot in microsoft365.com. I'll prompt it to list the key points from the Contoso purchase agreement, and the information
that was retrieved shows the sensitivity label for the
document that it referenced. Now to be clear, this is a file that I
have explicit access to. And if I move over to the
source document itself, you can also see the
confidential sensitivity label was previously applied to it. So you saw how Copilot
was able to inform me of the sensitivity of the
document that it retrieved, all as part of its response. Now, let's see how it works for content generation
using Microsoft Word. So here I'm going to
prompt Copilot in Word to generate a confirmation letter that's based on the
same purchase agreement with the sensitivity
label that we saw before. And right after I referenced
that protected document, you'll see that with
this shield icon here, it immediately recognizes
this as a sensitive file. So now I'm going to hit generate, and it will author a
draft confirmation letter for the purchase agreement. Notice that when it has completed
the confirmation letter, because the originating document
has a confidential label, that same label is automatically applied to the generated file as shown in the information
bar above the document. So the protection is inherited from the labeled source material. So as an admin, what are the steps you can take to protect your data? Well, it all starts by looking at your data access permissions and applying the principles of just enough access as well as least privilege for information across
your entire data estate. And one of the first recommended steps that you can take as a
Microsoft 365 administrator is to review SharePoint site access, prioritizing the sites containing the most sensitive information. Now, here you can start
by looking for sites that have their privacy set to public, which means that all employees
can discover and access them. And from there, you can require that site owners verify ownership as well as who should be members or visitors of these sites to limit access.
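If you want to find those public sites at scale, one option is to query Microsoft Graph for Microsoft 365 groups whose visibility is set to Public, since group-connected team sites inherit that privacy setting. The sketch below assumes an access token with a group-read permission such as Group.Read.All is available in a GRAPH_TOKEN environment variable.

# Sketch: list Microsoft 365 groups set to Public so their connected
# SharePoint sites can be reviewed. Assumes GRAPH_TOKEN holds a token
# with permission to read groups (e.g. Group.Read.All).
import os
import requests

headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}
url = "https://graph.microsoft.com/v1.0/groups?$select=displayName,mail,visibility"

public_groups = []
while url:
    resp = requests.get(url, headers=headers, timeout=30)
    resp.raise_for_status()
    data = resp.json()
    public_groups += [g for g in data.get("value", []) if g.get("visibility") == "Public"]
    url = data.get("@odata.nextLink")  # follow paging until all groups are read

for g in public_groups:
    print(g["displayName"], g.get("mail", ""))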
Then for content classification and labeling, one of the simplest controls to put in place is to automatically classify files saved to sensitive locations, as you can see here with
the site owner controls for this document library. Now, that means that any
content created in that location will get the label applied automatically, and corresponding policies can lock files down to the right people. Then, as another easy test, you can use Search in Microsoft 365 even before you deploy Copilot to evaluate whether different users can discover and access sites or files that they should not have access to.
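One way to run that kind of access test programmatically is the Microsoft Search API, which security-trims results to what the calling user can actually see, so anything it returns is content that test user can reach. The sketch below assumes a delegated access token for the test user in GRAPH_TOKEN; the query string is just an example.

# Sketch: run a search as a test user with the Microsoft Search API.
# Results are security-trimmed, so hits show what that user can discover.
import os
import requests

headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}"}
body = {"requests": [{
    "entityTypes": ["driveItem", "listItem", "site"],
    "query": {"queryString": "purchase agreement confidential"},
}]}

resp = requests.post("https://graph.microsoft.com/v1.0/search/query",
                     headers=headers, json=body, timeout=30)
resp.raise_for_status()

for container in resp.json().get("value", []):
    for hit_group in container.get("hitsContainers", []):
        for hit in hit_group.get("hits", []):
            resource = hit.get("resource", {})
            print(resource.get("name") or resource.get("webUrl"))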
The labels and classifications applied in those locations are configured and managed
using Microsoft Purview. In fact, let me show you those controls as well as additional
more advanced controls to protect your data using its auto-labeling and data loss prevention capabilities. The labels you apply in Microsoft Purview can automatically help you discover sensitive content, limit its sharing radius, and apply encryption
directly using policies. These can also be applied
based on the content within the documents
using data loss prevention or DLP policies with
sensitive information types. So here for example,
I've started a DLP policy for personally identifiable information, and I've added a few sensitive
information types already. And I can add even more
with over 300 options here for things like banking
numbers, addresses, identification types, tax
information, and more.
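To illustrate what a sensitive information type does under the hood, here is a small, self-contained sketch that pairs a card number pattern with a Luhn checksum, roughly the combination of pattern plus corroborating evidence that these detections rely on. It's a simplified illustration, not Purview's actual detection logic.

# Simplified illustration of a sensitive information type: a pattern match
# plus a corroborating check (here, the Luhn checksum for card numbers).
import re

CARD_PATTERN = re.compile(r"\b(?:\d[ -]?){13,16}\b")

def luhn_valid(number: str) -> bool:
    digits = [int(d) for d in re.sub(r"\D", "", number)][::-1]
    total = 0
    for i, d in enumerate(digits):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return total % 10 == 0

def find_card_numbers(text: str) -> list[str]:
    # A match only counts when both the pattern and the checksum agree,
    # which keeps the false-positive rate down.
    return [m.group() for m in CARD_PATTERN.finditer(text) if luhn_valid(m.group())]

print(find_card_numbers("Order ref 1234, card 4111 1111 1111 1111, thanks"))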
Additionally, using trainable classifiers, there are dozens of built-in document types that I can choose from,
including source code, healthcare, HR, and more
to auto-apply labels. Then moving on to device restrictions, I can also set up endpoint
DLP policies to prevent users from copying sensitive
data to their clipboards and then, for example, into
unapproved AI assistant sites. Next, beyond data protection policies, let me explain how Copilot
for Microsoft 365 activities can all be audited. Using content search in Microsoft Purview, all activity from Copilot for
Microsoft 365 is discoverable, as you can see here.
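If you want to pull those Copilot interaction events programmatically, one route is the Microsoft Graph Audit Log Query API. This endpoint was in beta at the time of writing, and the "copilotInteraction" record type, the field names, and the required audit-query read permission shown below are assumptions to verify against current documentation before relying on them.

# Sketch: query the unified audit log for Copilot interaction records via
# the Microsoft Graph Audit Log Query API (beta at the time of writing).
import os
import time
import requests

headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}",
           "Content-Type": "application/json"}
base = "https://graph.microsoft.com/beta/security/auditLog/queries"

# Create an asynchronous audit query scoped to Copilot interaction records.
query = requests.post(base, headers=headers, json={
    "displayName": "Copilot interactions",
    "recordTypeFilters": ["copilotInteraction"],
}, timeout=30).json()

# Poll until the query completes, then read back the matching records.
status = ""
while status != "succeeded":
    time.sleep(30)
    status = requests.get(f"{base}/{query['id']}", headers=headers,
                          timeout=30).json().get("status")

for record in requests.get(f"{base}/{query['id']}/records", headers=headers,
                           timeout=30).json().get("value", []):
    print(record.get("userPrincipalName"), record.get("operation"),
          record.get("createdDateTime"))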
Retention policies can also be used to retain the content in Copilot prompts and responses, based on your requirements. eDiscovery is also supported
for Copilot interactions as you can see here
with this case example. And communication compliance will likewise flag any content with
established policy matches like the one you see here
for codename Obsidian. Of course, another important consideration is how data is processed and where it resides when using
Microsoft Copilot services. Microsoft hosts and operates
large language model instances in Microsoft data centers and will never use your data
to train large language models. And data residency with Microsoft Copilot is consistent with Microsoft
365 and the locations where your data is already
stored and processed today, which means that if your organization is based in the European Union, Copilot data is likewise stored and processed within the EU data boundary, like the rest of your data. Additionally, the Microsoft
Copilot copyright commitment means that content generated using Copilot also comes with legal
protections from Microsoft. Now, let's move on to how you can fine-tune policy settings and configurations for
Copilot as an admin. And for that, we've added new controls in Microsoft 365's admin center, including links to many of the tools and concepts I've shown today. So here you can see the status
of your Copilot assignments as well as the latest
information on Copilot. Under settings, you can find what you need to manage Microsoft Copilot experiences found in Bing, the Edge
browser, and in Windows, as well as deep links to many of the data security
and compliance controls. Next, there are admin controls to submit feedback about Copilot for Microsoft 365 services on behalf of users, then configurations for plugins and their permissions from the integrated apps page, as well as tenant-wide controls to allow the public web to
be used as grounding data in Copilot for Microsoft 365 and more. Now, with the right protections
and configurations in place, you can take full
advantage of generative AI and start deploying Copilot for Microsoft 365 services at scale. Now, this starts with ensuring that you've got the right
Microsoft 365 services in place. And recently this was expanded
to organizations of all sizes with Microsoft 365 Business
and Enterprise suites, as well as to faculty members with Microsoft 365 Academic suites. Next, for Copilot capabilities to light up in Microsoft 365 apps, using the Microsoft 365 Apps admin center at config.office.com, you'll want to deploy either the Monthly Enterprise Channel, Current Channel, or Current Channel (Preview). From there, in the Microsoft
365 admin center under setup, you can use the "Get ready
for Microsoft Copilot for Microsoft 365" setup guide to configure any remaining items, and it walks you through many
of the steps I just presented to prepare your organization. From here, you can even
assign Copilot licenses to users and groups in
scope for your deployment and send a welcome email to help them get started with Copilot.
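If you prefer to script that license assignment, Microsoft Graph exposes an assignLicense action on the user object. The SKU ID below is a placeholder you'd replace with the Copilot SKU listed under your tenant's subscribedSkus, and the token needs a permission such as User.ReadWrite.All; both are assumptions in this sketch.

# Sketch: assign a license with the Microsoft Graph assignLicense action.
# COPILOT_SKU_ID is a placeholder; look up the real SKU ID for your tenant
# under /v1.0/subscribedSkus before running this.
import os
import requests

headers = {"Authorization": f"Bearer {os.environ['GRAPH_TOKEN']}",
           "Content-Type": "application/json"}

user = "adele.vance@contoso.com"  # example user in scope for the deployment
COPILOT_SKU_ID = "00000000-0000-0000-0000-000000000000"  # placeholder SKU ID

resp = requests.post(
    f"https://graph.microsoft.com/v1.0/users/{user}/assignLicense",
    headers=headers,
    json={"addLicenses": [{"skuId": COPILOT_SKU_ID}], "removeLicenses": []},
    timeout=30,
)
resp.raise_for_status()
print("License assigned to", user)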
and improving adoption is to establish an internal community of Microsoft Copilot users. And the Copilot hub at
adoption.microsoft.com/copilot gives you additional resources by role to help users learn about and
get the most from Copilot. So that was an overview of
how security and privacy with Copilot for Microsoft 365 works, and how you can get ready for
Copilot in your organization. For more deep dives on other
Microsoft Copilot tech, check out aka.ms/M365CopilotMechanics, and keep checking back for the latest AI updates. Thanks for watching. (music)