New Microsoft 365 Copilot Tuning | Create fine-tuned models to write like you do
Fine-tuning adds new skills to foundational models, simulating experience in the tasks you teach the model to do. This complements Retrieval Augmented Generation, which uses search in real time to find related information, then adds it to your prompts for context. Fine-tuning helps ensure that responses meet your quality expectations for specific repeatable tasks, without needing to be a prompting expert. It’s great for drafting complex legal agreements, writing technical documentation, authoring medical papers, and more — using detailed, often lengthy precedent files along with what you teach the model. Using Copilot Studio, anyone can create and deploy these fine-tuned models to use with agents without data science or coding expertise. There, you can teach models using data labeling, ground them in your organization’s content — while keeping the information in place and maintaining data security and access policies. The information contained in the task-specific models that you create stays private to your team and organization. Task-specific models and related information are only accessible to the people and departments you specify — and information is not merged into shared large language models or used for model training. Jeremy Chapman, Director on the Microsoft 365 product team, shows how this simple, zero-code approach helps the agents you build write and reason like your experts — delivering high-quality, detailed responses. Keep information permissions as-is. Use your organization’s knowledge and sharing controls. See how Copilot Tuning works. Guide Copilot with labeled examples. Copilot learns to reason and write like you and your expert team. Check it out. Build Copilot agents powered by your fine-tuned models. Automate work with your tone, structure, and standards. Take a look at Copilot Chat. QUICK LINKS: 00:00 — Fine-tune Copilot 01:21 — Tailor Copilot for specialized tasks 05:12 — How it works 05:57 — Create a task-specific model 07:43 — Data labeling 08:59 — Build agents that use your fine-tuned model 11:42 — Wrap up Link References Check out https://aka.ms/FineTuningCopilot Unfamiliar with Microsoft Mechanics? As Microsoft’s official video series for IT, you can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft. Subscribe to our YouTube: https://www.youtube.com/c/MicrosoftMechanicsSeries Talk with other IT Pros, join us on the Microsoft Tech Community: https://techcommunity.microsoft.com/t5/microsoft-mechanics-blog/bg-p/MicrosoftMechanicsBlog Watch or listen from anywhere, subscribe to our podcast: https://microsoftmechanics.libsyn.com/podcast Keep getting this insider knowledge, join us on social: Follow us on Twitter: https://twitter.com/MSFTMechanics Share knowledge on LinkedIn: https://www.linkedin.com/company/microsoft-mechanics/ Enjoy us on Instagram: https://www.instagram.com/msftmechanics/ Loosen up with us on TikTok: https://www.tiktok.com/@msftmechanics Video Transcript: -You can now teach or fine-tune your Microsoft 365 Copilot experience by creating your own task-specific fine-tuned models that channel your expertise and experience to carry out specialized jobs and tasks accurately and on your behalf. 
In fact, from Copilot Studio, anyone can use this zero-code approach to teaching Copilot’s underlying model the skills from your organization to produce more usable, high-quality responses that can be as detailed as they need to be, even hundreds of pages long to get the job done. And the model remains exclusive to your organization and only the people and departments you specify. -If you compare this to the traditional way of doing this until now, this level of customization would require data science, machine learning, and coding skills. So this process is a lot simpler. And unlike existing approaches where, as a data scientist, you may be copying data into locations that may not be aware of your protections and access controls, this is enterprise-grade by design. You just focus on the outcome that you want to achieve. And because your data stays in place, your existing data access and protection policies are respected by default. Let me show you the power of this in action by comparing the results of an agent that’s calling a fine-tuned, task-specific model of Copilot versus one that’s just calling the original underlying Copilot model. So both agents are configured to author loan agreement documents. On the left is our agent using the task-specific model, on the right is our SharePoint-based agent using a general model. -Now, both agents are focused on the same exact underlying knowledge. It’s all in a SharePoint location, as you can see here with this precedent file set. And both user prompts are identical with example reference files and the client term sheets containing new information. In fact, this is a precedent file that I’ll use. It’s a long and detailed document with 14 pages and more than 5,000 words. The term sheet is quite a bit shorter as you can see here, but it’s still long and detailed with information about the loan amounts, all the details, and if I scroll all the way down to the bottom, you’ll see signatory information for both parties. -So let’s go back to our side-by-side view and run them. So, I’ll start with the general model agent on the right. And it starts to generate its response. And I’ll let this one respond for a moment until it completes. There we go. And now I’ll move over to the agent on the left. It immediately informs me that it’ll receive an email once it’s finished. Now, this is going to be a longer-form document, so we’ll fast forward in time to see each completed response. -So, starting with the general model, I’ve copied it into a Word document, and the output is solid. You’ll see that the two parties are correct, the loan structure, all the amounts are also correct from the term sheet, but it has a few tells. It’s missing a lot of specificity and nuance that a member of our legal team would typically include in all of the terms. It’s also very summarized and not how our firm would draft an agreement like this. When I scroll down to the bottom, the signatories and addresses are captured correctly and match the term sheet. That said, though, it’s just four pages long and has around 800 words, versus more than 5,000 words in our precedent document. So it kind of follows the 80–20 rule where a good portion of the response could maybe work with some edits, but it’s not reflecting how my firm thinks and how it writes when authoring legal documents like this one. -So let’s go ahead and look at the results of a fine-tuned, task-specific agent. So immediately, you can see this document is verbose. It’s 14 pages long with more than 5,300 words. 
The word count doesn’t always equate to quality, so let’s look at the document itself. Now, as I scroll down, you’ll see that this agent has been taught our firm-specific patterns and the clauses that we use in existing case files. It has structured and worded things just like the precedent document. It’s reasoning and writing with more precision, like an experienced member of our firm would. And while, as with any other AI-generated document, I still need to check it for accuracy, it really captures that extra detail and polish to save us time and effort. So model fine-tuning is a powerful way to tailor state-of-the-art large language models that are used behind Copilot to your specific needs. -And as you saw, it also can significantly improve the handling of specialized tasks. So let me explain how fine-tuning works in this case. Unlike Retrieval Augmented Generation, it doesn’t rely on search and orchestration processes that run external to the large language model. The additional knowledge added as part of the fine-tuning process is a protected container of information that attaches to the large language model to effectively teach it a new skill. Now, it’s never merged into the LLM or used for future model training, and is temporarily attached to the LLM when it’s needed. Again, the skill and knowledge that it contains is exclusive to you and the people or groups that you’ve shared it with, so it can’t be accessed without the right permissions. -Next, let me show you what it takes to create and fine-tune your own task-specific model. I’m in Microsoft Copilot Studio, which you can reach from your browser by navigating to copilotstudio.microsoft.com. I’m on the task-specific model page and I want to customize a model to generate partner agreements. So I’ll paste in a corresponding name. Then I’ll paste in a description. Then as the task type, I’ll select a customization recipe that reflects what I want it to do. And my options here include expert Q&A, document generation, and document summarization, with more task types coming over time. From there, I can provide additional instructions to tailor the fine-tuning recipe, like how the model should use original files, for example, to inform the structure, formatting, company-specific clauses, and other areas important to your model, like we saw before. -Next, I can define my own knowledge sources. Now, these can use information from SharePoint sites and folders, and soon, you’ll be able to add information external to Microsoft 365 using Microsoft Graph connectors. In this case, I’ll define a SharePoint source. Then browse the sites that I have access to. I’ll choose this folder inside the Agreements library. And from there, I can even drill into specific folders for the precise information that I want to use to teach the model, which I’ll do here with the Agreements folder. -For permissions, this process aligns to the enterprise-grade controls that you already have in your organization backed by your Microsoft Entra account. Now, the next step is to process the data you selected for training or what’s known as data labeling. So here, you’ll be presented with data labeling tasks in small, iterative batches. They’re kind of like questionnaires for you to complete, where the fine-tuning process will generate documents and request assessment of them for clarity, completeness, accuracy, and professionalism. This process requires subject matter expertise to open these documents and rate the quality of the generative output for each. 
I’m just going to show one question here, but you’d repeat this process for every batch. And once all batches are labeled, I can start model training. Now, this will take some time to process, so I’ll fast forward a little in time. -Now with everything finished, I can publish the model to my Microsoft 365 tenant. And it will be available to anyone we’ve shared it with, like our audit team from before, to build new agents. And the process I just showed is called supervised learning, where the model is trained on labeled data. And soon, you’ll also have the option to use reinforcement learning to enhance the agent’s reasoning capabilities. Now let me show you how to build an agent from Copilot Chat that can leverage our new task-specific model for partner agreement generation. So I’m going to select Create agent. And for the purpose, I have a new option here to build a task-specific agent. Next, I can choose from the existing task-specific models. So I’m going to choose the one that we just created for new partner agreements. There we go. And with any agent, I just need to give it a name. Now I’ll paste in a description for people on the team to know its purpose and what it can do. -And next, I can specify additional instructions as guidelines to provide more context to the agent, as I’m doing here to ensure the structure aligns with our organizational standards. Because this is a very specific agent to write partner agreements, I’ll just specify one starter prompt with details for referencing a precedent source document to start with and a term sheet to get specific new information from, kind of like we saw before. Now, the preview on the right looks good, and I can create the agent right from here. For sharing, permissions also need to align with whoever my task-specific model was shared with, which, as you’ll remember, again, was our audit team. In this case, for my own validation, I’ll select Only you so that I can test it before sharing it out with other auditors on my team. -So let’s go ahead and test it out. So I’m going to use the starter prompt. Then I’ll replace the variable file names here. I’ll use the forward slash reference, starting with the precedent file. Now I’ll look for the term sheet file. There it is. From there I can submit my prompt. This is going to take a moment for the response. You can see the structure with sections based on our task-specific files used with the fine-tuning. It tells me again that it’ll send me a Word document and email once it’s finished. In fact, if I fast forward in time a little, I’ll move over to Outlook. And this is the email the agent sent me with links to the new agreement draft. So I’ll open it using Word in the browser. There’s my agreement. And you’ll see it follows exactly how we wrote the precedent agreement. As I scroll through the document, I can see all the structure and phrasing aligned with how we write these types of agreements. In fact, this Representations and Warranties section is word for word direct from our standard terms that our firm always incorporates. And that’s it. My agent is now backed with my task-specific, fine-tuned knowledge, and it’s ready to go and I’m ready to share it with my team. -So those are just a few examples of how fine-tuning in Microsoft 365 Copilot can give you on-demand expertise, and task-specific models respond more accurately using your specified voice and process so that you and your team can get more done. 
-To find out more, check out aka.ms/FineTuningCopilot, and keep watching Microsoft Mechanics for the latest tech updates, subscribe to our channel, and thanks for watching.
Microsoft 365 Copilot Wave 2 Spring updates
Streamline your day with new, user-focused updates to Microsoft 365 Copilot. Jump into work faster with a redesigned layout that puts Chat, Search, and your agents front and center. New Copilot Search lets you use natural language to find files, emails, and conversations — even if you don’t remember exact keywords — and get instant summaries and previews without switching apps. Create high-impact visuals, documents, and videos in seconds with the new Copilot Create experience, complete with support for brand templates. Tap into powerful agents like Researcher and Analyst to handle deep tasks or build your own with ease. And if you manage Copilot across your organization, you now have better tools to deploy, monitor, and secure AI use — all from a single view. Describe what you want. Don’t know the keywords to find your content in Microsoft 365? You don’t need to. See how the new Copilot Search works. On-demand expertise. Use agents like Researcher or Analyst to do the thinking for you. Start here. View AI agent activities in Microsoft Purview. Find data security policy matches and see if agents are being used with sensitive information or by risky users. Watch here. Watch our video here. QUICK LINKS: 00:00 — Microsoft 365 Copilot new capabilities 00:36 — Microsoft 365 Copilot app 01:49 — Copilot Search 03:09 — Specialized agents 04:06 — Create experience 06:07 — Copilot Notebooks 07:40 — Updates for IT admins 08:16 — Data security with AI apps & agents in Purview 08:51 — Reports 09:20 — Wrap up Link References Check out https://aka.ms/CopilotWave2Spring Unfamiliar with Microsoft Mechanics? As Microsoft’s official video series for IT, you can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft. Subscribe to our YouTube: https://www.youtube.com/c/MicrosoftMechanicsSeries Talk with other IT Pros, join us on the Microsoft Tech Community: https://techcommunity.microsoft.com/t5/microsoft-mechanics-blog/bg-p/MicrosoftMechanicsBlog Watch or listen from anywhere, subscribe to our podcast: https://microsoftmechanics.libsyn.com/podcast Keep getting this insider knowledge, join us on social: Follow us on Twitter: https://twitter.com/MSFTMechanics Share knowledge on LinkedIn: https://www.linkedin.com/company/microsoft-mechanics/ Enjoy us on Instagram: https://www.instagram.com/msftmechanics/ Loosen up with us on TikTok: https://www.tiktok.com/@msftmechanics Video Transcript: -So, Microsoft 365 Copilot keeps getting better, and today I’ll show you how the Copilot experience is evolving to make everything easier with new AI-powered capabilities to help you get even more done, and if you’re in IT, I’ll show you new options for agent management, including updates in the Microsoft 365 Admin Center, new data security views and controls, and Microsoft Purview’s Data Security Posture Management for AI, as well as improved reporting and visibility into Copilot analytics from Viva Insights. So let’s start with updates to the Microsoft 365 Copilot app experience, which has evolved to make every interaction easier and more intuitive. 
Chat is the core of the experience and where the app starts by default, and there’s a new navigation that moves what’s important to the left side of your screen, where you’ll find links to Search, Chat, your Agents, new Notebooks, and Create experiences that I’ll show you in a moment. -Now moving to the center of the app, you’ll notice that there’s a more streamlined view with the prompt box taking center stage. Under that, you’ll find personalized suggestions for what to do next, including upcoming meetings. And as you author prompts, you can quickly pull up an agent right from here to bring in content sources you want, like your files, interactions with people, your meetings, emails and more to generate what you want. And even without referencing your work directly, Copilot is connected to it and can find the matching information that you have access to. That’s because behind the scenes, Microsoft 365 Copilot uses advanced AI native vector-based search to find the most relevant content. And now outside of Copilot Chat, you can use this directly from the new Copilot Search experience. It brings together AI search and your work information across Microsoft Graph. -From here, you can easily get to your recommendations and quick access to what you’ve been working on. Then moving to the search bar, you’ll see that search now goes way beyond the keyword search that you’re used to. Compared to Copilot Chat, this is optimized to find specific content items. Here, you can ask in simple terms, for example, based on what you remember, to quickly find your files, your emails, chats, and meetings in Microsoft 365, and even information in other Graph-connected line-of-business systems, like you’re seeing here with Jira. Now it knows related concepts, synonyms, and contextual information around a topic, so you don’t need to know keywords or be precise with search terms. In fact, many of these results don’t contain words from the search, but they’re highly relevant. And based on the top items in the results, Copilot can summarize what search finds inline to save you time. And without having to open the cited items in separate apps, Copilot will also help summarize and preview those files right from the Copilot app. -Next, we’ve made it easier to access specialized agents, which give you on-demand help to complete tasks that would normally require an expert. Now these include both Analyst to find insights in your data as well as Researcher for written reports, both first-of-their-kind reasoning agents for work. In fact, we dedicated an entire show on reasoning agents that you can check out at aka.ms/reasoningmechanics. And here in the navigation you’ll also find your pinned and recently used agents on top. And clicking into all agents takes you to the new agent store where you can find more agents that are built by Microsoft, also the broader ecosystem, and your company’s own custom agents. And of course, you can also create your own agents right from here by describing what you want your agent to do or configuring it directly with your instructions and knowledge sources. -Then beyond agents and AI-generated content from Copilot Chat, the new create experience lets everyone tap into powerful and personalized design skills where you can create images like the samples you’re seeing on the screen here, powered by the GPT-4o model for image generation. And you can design a poster or flyer like these ones and they’re also great for cover pages to your reports. 
And speaking of that, from Create, you can write a stylized draft document using templates and right from here, you can also upload and edit your own images to make them stand out and select parts of images to remove distractions like this tree. Importantly, what sets the Create experience really apart is that you can use brand templates and even bring in your company’s brand kit, and these include your approved company logos, fonts, and colors. In fact, let me show you how this works by creating a new image. -So I’ll start by describing what I’m looking for. I’ll choose a picture of this new shoe from my local device to work from, and now I’ll choose my style that matches what I want to create. Here’s where my company brand comes in. I can choose the brand kit I want with the right color palette and icon. Now this will take a moment to generate and now I have an image that fits my brand and I can add to this same image. I’d like to see a little ground cover in the image. I’d like to ask for some moss and some rocks. Then I’ll give it a second to render a new image and it gets even better with new direct editing options like background removal, object transform, and enhancements. I’m going to choose the erase option, then select this rock and this plant on the right and hit erase. That’s better. Now, I just need to add a text element and I’ll paste in the shoe name and now it’s ready to go and I can download the image right from here. -And for your bigger projects and tasks, Copilot Notebooks is then another new capability. These help you bring together all of your relevant content for your task at hand, including Copilot Chats. And I’ll open this one for Copilot Craft and you’ll see that I can chat with Copilot about everything in here, and it’s filled with reference content and related chat history to keep interactions in scope to what’s here and even create an audio overview of this notebook. Now the last major update that I’ll show from the Microsoft 365 Copilot app experience is with personalization and memory. Where from Copilot settings, you can specify custom instructions and enable Copilot memory. -First, custom instructions let you add details about your interests, your preferences, the tone of what you expect from Copilot responses. Think of this information as something that will get appended to your initial prompts in future Copilot sessions to improve its output. Then, moving back to personalization settings, Copilot memory works in the same way to recall a handful of notable memorable items from previous conversations in real time. Again, this information sits outside of the large language model and is retrieved for future chat sessions. And you have full visibility and control over what is maintained in Copilot memory and can delete what you don’t want to personalize its responses. -Next, I’ll move on to updates for IT admins. We’re adding more controls to the Copilot Control System so that you have the tools that you need to manage, govern, and measure Copilot and now also agents across your organization. You can now manage the agents and agent deployment right from Microsoft 365’s admin center. Here you’ll see a list of agents in use and the ones you’ve blocked. Also, apps where the agents are supported and usage details. You can also deploy agents from here as well, scoping the right users and groups. -Next, we’re also adding more insights and controls for data security with AI apps and agents and Microsoft Purview. 
The new AI apps and agents page in Data Security Posture Management for AI gives you a single dashboard to view and create policies for your AI apps and agents, where you’ll find coverage for data protection and compliance policies that you already have in place, and clicking into any of these items lets you discover more insights, including potentially risky interactions, inappropriate use, as well as sensitive information being shared. -And finally, for reports that you can share beyond your administrator and data security teams, using Copilot Analytics and Viva Insights, you can measure the usage and business impact of your agents. And direct from Viva Insights, the new Copilot Studio agents report can be shared with your team, and it provides a comprehensive view of agent use, session outcomes, and you can see how assisted actions are contributing to overall ROI. -So Microsoft 365 Copilot continues to evolve to help you get more done, along with enterprise-grade IT controls to help keep your data protected. Now, to find out more, check out aka.ms/CopilotWave2Spring and keep checking back to Microsoft Mechanics for the latest updates. Thanks so much for watching.
Microsoft Purview protections for Copilot
Use Microsoft Purview and Microsoft 365 Copilot together to build a secure, enterprise-ready foundation for generative AI. Apply existing data protection and compliance controls, gain visibility into AI usage, and reduce risk from oversharing or insider threats. Classify, restrict, and monitor sensitive data used in Copilot interactions. Investigate risky behavior, enforce dynamic policies, and block inappropriate use — all from within your Microsoft 365 environment. Erica Toelle, Microsoft Purview Senior Product Manager, shares how to implement these controls and proactively manage data risks in Copilot deployments. Control what content can be referenced in generated responses. Check out Microsoft 365 Copilot security and privacy basics. Uncover risky or sensitive interactions. Use DSPM for AI to get a unified view of Copilot usage and security posture across your org. Block access to sensitive resources. See how to configure Conditional Access using Microsoft Entra. Watch our video here. QUICK LINKS: 00:00 — Microsoft Purview controls for Microsoft 365 Copilot 00:32 — Copilot security and privacy basics 01:47 — Built-in activity logging 02:24 — Discover and Prevent Data Loss with DSPM for AI 04:18 — Protect sensitive data in AI interactions 05:08 — Insider Risk Management 05:12 — Monitor and act on inappropriate AI use 07:14 — Wrap up Link References Check out https://aka.ms/M365CopilotwithPurview Watch our show on oversharing at https://aka.ms/OversharingMechanics Unfamiliar with Microsoft Mechanics? As Microsoft’s official video series for IT, you can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft. Subscribe to our YouTube: https://www.youtube.com/c/MicrosoftMechanicsSeries Talk with other IT Pros, join us on the Microsoft Tech Community: https://techcommunity.microsoft.com/t5/microsoft-mechanics-blog/bg-p/MicrosoftMechanicsBlog Watch or listen from anywhere, subscribe to our podcast: https://microsoftmechanics.libsyn.com/podcast Keep getting this insider knowledge, join us on social: Follow us on Twitter: https://twitter.com/MSFTMechanics Share knowledge on LinkedIn: https://www.linkedin.com/company/microsoft-mechanics/ Enjoy us on Instagram: https://www.instagram.com/msftmechanics/ Loosen up with us on TikTok: https://www.tiktok.com/@msftmechanics Video Transcript: -Not all generative AI is created equal. In fact, if data security or privacy-related concerns are holding your organization back, today I’ll show you how the combination of Microsoft 365 Copilot and the data security controls in Microsoft Purview provide an enterprise-ready platform for GenAI in your organization. This way, GenAI is seamlessly integrated into your workflow across familiar apps and experiences, all backed by unmatched data security and visibility to minimize data risk and prevent data loss. First, let’s level set on a few Copilot security and privacy basics. Whether you’re using the free Copilot Chat that’s included with Microsoft 365 or have a Microsoft 365 Copilot license, they both honor your existing access permissions to work information in SharePoint and OneDrive, your Teams meetings and your email, meaning generated AI responses can only be based on information that you have access to. 
-Importantly, after you submit a prompt, Copilot will retrieve relevant indexed data to generate a response. The data only stays within your Microsoft 365 service trust boundary and doesn’t move out of it. Even when the data is presented to the large language models to generate a response, information is kept separate from the model, and is not used to train it. This is in contrast to consumer apps, especially the free ones, which are often designed to collect training data. As users upload files into them or paste content into their prompts, including sensitive data, the data is now duplicated and stored in a location outside of your Microsoft 365 service trust boundary, removing any file access controls or classifications you’ve applied in the process, placing your data at greater risk. -And beyond being stored there for indexing or reasoning, it can be used to retrain the underlying model. Next, adding to the foundational protections of Microsoft 365 Copilot, Microsoft Purview has activity logging built in and helps you to discover and protect sensitive data, where you get visibility into current and potential risks, such as the use of unprotected sensitive data in Copilot interactions; classify and secure data, where information protection helps you to automatically classify and apply sensitivity labels to data, ensuring it remains protected even when it’s used with Copilot; and detect and mitigate insider risks, where you can be alerted to employee activities with Copilot that pose a risk to your data; and much more. -Over the next few minutes, I’ll focus on Purview capabilities to get ahead of and prevent data loss and insider risks. We’ll start in Data Security Posture Management for AI, or DSPM for AI for short. DSPM for AI is the one place to get a rich and prioritized bird’s eye view on how Copilot is being used inside your organization and discover corresponding risks, along with recommendations to improve your data security posture that you can implement right from the solution. Importantly, this is where you’ll find detailed dashboards for Microsoft 365 Copilot usage, including agents. -Then in Activity Explorer, we make it easy to see recent activities with AI interactions that include sensitive information types, like credit cards, ID numbers or bank accounts. And you can drill into each activity to see details, as well as the prompt and response text generated. One tip here: if you are seeing a lot of sensitive information exposed, it points to an information oversharing issue where people have access to more information than necessary to do their job. If you find yourself in this situation, I recommend you also check out our recent show on the topic at aka.ms/OversharingMechanics where I dive into the specific things you should do to assess your Microsoft 365 environment for potential oversharing risks to ensure the right people can access the right information when using Copilot. -Ultimately, DSPM for AI gives you the visibility you need to establish a data security baseline for Copilot usage in your organization, and helps you put in place preventative measures right away. In fact, without leaving DSPM for AI on the recommendations page, you’ll find the policies we advise everyone to use to improve data security, such as this one for detecting potentially risky interactions using insider risk management and other recommendations, like this one to detect potentially unethical behavior using communication compliance policies and more. 
From there, you can dive in to Microsoft Purview’s best-in-class solutions for more granular insights, and to configure specific policies and protections. -I’ll start with information protection. You can manage data security controls with Microsoft 365 Copilot in scope with the information protection policies, and the sensitivity labels that you have in use today. In fact, by default, any Copilot response using content with sensitivity labels will automatically inherit the highest priority label for the referenced content. And using data loss prevention policies, you can prevent Copilot from processing any content that has a specific sensitivity label applied. This way, even if users have access to those files, Copilot will effectively ignore this content as it retrieves relevant information from Microsoft Graph used to generate responses. Insider risk management helps you to catch data risk based on trending activities of people on your network using established user risk indicators and thresholds, and then uses policies to prevent accidental or intentional data misuse as they interact with Copilot where you can easily create policies based on quick policy templates, like this one looking for high-risk data leak patterns from insiders. -By default, this quick policy will scope all users in groups with a defined triggering event of data exfiltration, along with activity indicators, including external sharing, bulk downloads, label downgrades, and label removal in addition to other activities that indicate a high risk of data theft. And it doesn’t stop there. As individuals perform more risky activities, those can add up to elevate that user’s risk level. Here, instead of manually adjusting data security policies, using Adaptive Protection controls, you can also limit Copilot use depending on a user’s dynamic risk level, for example, when a user exceeds your defined risk condition thresholds to reach an elevated risk level, as you can see here. -Using Conditional Access policies in Microsoft Entra, in this case based on authentication context, as well as the condition for insider risk that you set in Microsoft Purview, you can choose to block their permission when attempting to access sites with a specific sensitivity label. That way, even if a user is granted access to a SharePoint site resource by an owner, their access will be blocked by the Conditional Access policy you set. Again, this is important because Copilot honors the user’s existing permissions to work with information. This way, Copilot will not return information that they do not have access to. -Next, Communication Compliance is a related insider risk solution that can act on potentially inappropriate Copilot interactions. In fact, there are specific policy options for Microsoft 365 Copilot interactions in communication compliance where you can flag jailbreak or prompt injection attempts using Prompt Shields classifiers. Communication compliance can be set to alert reviewers of that activity so they can easily discover policy matches and take corresponding actions. For example, if a person tries to use Copilot in an inappropriate way, like trying to get it to work around its instructions to generate content that Copilot shouldn’t, it will report on that activity, and you’ll also be able to see the response informing the user that their activity was blocked. 
-Once you have the controls you want in place, it’s a good idea to keep going back to DSPM for AI so you can see where Copilot usage is matching your data security policies. Sensitive interactions per AI app shows you interactions based on sensitive information types. Top unethical AI interactions surfaces insights based on the communication compliance controls you’ve defined. Top sensitivity labels referenced in Microsoft 365 Copilot reports on the labels you’ve created and applied to referenced content. And you can see Copilot interactions mapped to insider risk severity levels. Then digging into these reports shows you a filtered view of activities in Activity Explorer with time-based trends and details for each. Additionally, because all Copilot interactions are logged, like other Microsoft 365 activities in email, Microsoft Teams, SharePoint and OneDrive, you can now use the new data security investigation solution. This uses AI to quickly reason over thousands of items, including Copilot Chat interactions to help you investigate the potential cause of risks for known data leaks in similar incidents. -So that’s how Microsoft 365 Copilot, along with Microsoft Purview, provides comprehensive controls to help protect your data, minimize risk, and quickly identify Copilot interactions that could lead to compromise so you can take corrective actions. No other AI solution has this level of protection and control. To learn more, check out aka.ms/M365CopilotwithPurview. Keep watching Microsoft Mechanics for the latest updates and thanks for watching.
Microsoft 365 Copilot Power User Tips
Take control of your workday — summarize long emails instantly, turn meeting transcripts into actionable plans, and build strategic documents in seconds using your own data with Microsoft 365 Copilot. Instead of chasing down context, ask natural prompts and get clear, detailed results complete with tone-matched writing, visual recaps, and real-time collaboration. Get up to speed on complex email threads, transform insights from missed meetings into next steps, and pull relevant content from across your calendar, inbox, and docs — all without switching tools or losing momentum. Mary Pasch, Microsoft 365 Principal PM, shows how whether you’re refining a plan in Word, responding in Outlook, or catching up in Teams, Copilot works behind the scenes to help you move faster and focus on what matters. Cut through inbox clutter. Microsoft 365 Copilot in Outlook condenses long email chains into key takeaways. See how to save time with Copilot. Build strategy docs in minutes. Researcher agent asks smart questions and connects the dots. See how to use AI with chain-of-thought reasoning in Microsoft 365 Copilot. From teammate input to polished copy. Prompt Microsoft 365 Copilot to incorporate key meeting info into a shared document. See how it works. Watch our video here. QUICK LINKS: 00:00 — How to put Copilot to work for you 01:09 — Use Copilot in Outlook to summarize email threads 01:57 — Use chain-of-thought reasoning with Researcher 03:55 — Reference your content & meeting recap 05:29 — Use Copilot in Word to build on existing content 06:56 — Use Copilot in Microsoft Teams when late to a meeting 07:52 — Wrap up Link References Check out the free Copilot Academy at https://aka.ms/copilotacademy Unfamiliar with Microsoft Mechanics? As Microsoft’s official video series for IT, you can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft. Subscribe to our YouTube: https://www.youtube.com/c/MicrosoftMechanicsSeries Talk with other IT Pros, join us on the Microsoft Tech Community: https://techcommunity.microsoft.com/t5/microsoft-mechanics-blog/bg-p/MicrosoftMechanicsBlog Watch or listen from anywhere, subscribe to our podcast: https://microsoftmechanics.libsyn.com/podcast Keep getting this insider knowledge, join us on social: Follow us on Twitter: https://twitter.com/MSFTMechanics Share knowledge on LinkedIn: https://www.linkedin.com/company/microsoft-mechanics/ Enjoy us on Instagram: https://www.instagram.com/msftmechanics/ Loosen up with us on TikTok: https://www.tiktok.com/@msftmechanics Video Transcript: -If you have Microsoft 365 Copilot in your organization today, I’m going to walk you through the top five power user tips, and the lesser known ways in which you can really put Copilot to work for you, saving you time. We’ll go beyond the rich Microsoft 365 Copilot chat experience that’s available to every Microsoft 365 user, where you might be uploading information to inform generated responses. And I’ll focus on what you can do with a Microsoft 365 Copilot license, which lights up experiences within your familiar Office apps, automatically connecting your work data in Microsoft 365 to help you in context as you work. I’m going to start in Outlook, because who doesn’t need help with their inbox? 
You might already be using Microsoft 365 Copilot in Outlook to help write and quickly respond to emails, or to get help rewriting your existing drafts with auto rewrite, or by using your own detailed instructions to get it just right. -And if you haven’t already tried Copilot in Outlook, prompt suggestions for the things you can do in Outlook are built in for you to get started. These are all time-savers that are core to the experience, but have you ever tried using Microsoft 365 Copilot to help you get up to speed on a long email thread? Well, here, I’ve been added to an email thread, and I don’t necessarily have all the context. As I scroll this super long email with multiple people, there’s too much to take in, and it would take a lot of effort for me to parse it, and this is where Copilot in Outlook can help. By clicking on Summary by Copilot, the entire email thread is processed, and I’m left with a quick summary of the main points from the thread, including key actions specific to me. It’s boiled down about 10 pages of emails into these four bullets. It looks like my team needs my help researching a potential fit for new outdoor and adventure goods with our current electronics products, and I have less than a week to pull everything together. This normally would be a time-consuming effort, but this brings me to my second power user tip. I can now use AI with chain-of-thought reasoning to gather information and work with me to create a new product strategy doc. -From the Microsoft 365 Copilot app, I’ll use a new agent called Researcher. I’ll ask Researcher to develop a product strategy to enter a new market for outdoor and adventure goods. After I enter my prompt, Researcher goes to work. You can see that as part of its first response, it’s paused, and this time, it’s asking me clarifying questions about both the scope and format of what I want it to write. So I’ll respond with key details to answer both of its questions, then it uses my response to move forward. It takes my prompts, understands the task, and starts to build a plan that it’ll use to author a detailed report, and I can follow along. It’s reasoning over information that I have permission to access from internal locations. As it works, I can take a look at its reasoning process in real time. It tells me what it’s doing. It’s identifying our existing business lines, clarifying our product categories, analyzing the potential fit for outdoor products, looking for relevant meetings that I’ve been invited to to analyze the transcripts, and even researches industry trends from the web. What I love about this agent is that it’s actually doing the research to create what I need. -So let’s jump ahead to that final result. On the right, you can see that it’s delivered a thorough response with a fully-documented product strategy in line with what I’d expect from an expert. Starting with an analysis of my existing business, it’s also analyzed the outdoor and adventure gear sector. Then it’s built insights based on our existing business, and how it intersects with outdoor products. It’s added strategic positioning, and a detailed go-to-market plan. So I’ve saved a ton of time, and now, I have a solid, well-researched draft that I can build on with my team. -Next, because my team uses Word to build out these types of plans, I brought everything over to a new doc, and in Word, there are two power user tips for building on existing content using Copilot that I’ll show you. 
First, you can pull up Copilot on any blank line using the Copilot icon or the Alt + I keyboard shortcut. Beyond what’s here, I know that my team just brainstormed ideas about in-store experiences during a Teams meeting, and I want to use those details directly from the meeting recap to add that to our plan. -With Copilot, I can do that by using the Reference Your Content button. In the Meetings tab, I’ll locate the meeting I want, this one, for location planning, which uses the same Copilot-generated content for meeting recaps in Teams. I’ll pause a second before I complete this prompt in Word to show you the meeting recap first, and give you some context on that meeting. For any transcribed meeting, I can find the recap by going back to the original meeting invite from my calendar, then clicking on the Recap tab. These are AI-generated, and capture what was discussed in meetings that you were in, meetings you were late to, or meetings that you were invited to but couldn’t attend. For example, I missed this meeting, and without anyone taking notes, from AI notes, I can see they discussed placing outdoor products in our retail stores and creating connected outdoor display in our store. There are five cities listed here in Washington and Oregon to launch them. -If I go to the Mentions tab, you can see that I was even mentioned 33 minutes into the meeting. So with that context, I’m going to go back to Word and finish writing my prompt. I’ll ask Copilot to add a paragraph for creating in-store displays discussed in the meeting, with a few additional instructions, and it’s taken the details from the meeting and adding it to our plan. And you can see those five store locations that we saw before in the meeting recap, and in seconds, we’ve transformed the actions from a spoken Teams meeting to add to our written plan. It didn’t just insert the paragraph to the rest of the doc, it’s actually matched the tone and altitude of the rest of our plan, so it doesn’t feel out of place. -And now, everyone’s working together on this document. I can see Adele and Daichi are here. In fact, as I scroll down, there’s also a comment from Daichi to add details about our outdoor products that we’re already working on to release later this year. For this power user tip, I’ll open Copilot, and use a forward slash and start typing. Then choose the email from Daichi, and complete my prompt to add those details. Now, we have details about the outdoors electronics we’ll be launching soon to complete our plan. -And by the way, if I need more inspiration from Copilot in Word, I can use Copilot from the ribbon, and then use the Add menu in Copilot in Word to ask an agent, add an image, and view prompts from the prompt gallery with lots of great options here. This is also available across other Microsoft 365 apps with prompt tips specific to each app. That said, let me show you our next power tip, which is something that’s super powerful. If you’re ever late to a meeting, and join the meeting after it started, you can use Copilot to catch up on what you missed, even shared visual content that was presented during the meeting. -Here, I’m joining a brainstorming meeting, and you can see that I’ve missed the first seven and a half minutes, but that’s okay. I can ask Copilot to bring me up to speed by asking what I missed, and Copilot tells me exactly what was covered before I joined. 
Next, I can also ask if there were any visuals shared, and not only does it provide a summary of the content that I missed on screen, including embedded text, but it also shares still images of the shared content themselves. This one is an important prototype of the in-store campsite display, highlighting our product lineup. I can even zoom in for a closer look at the image. And because I’m caught up with everything I missed, I don’t need to ask my team to back up and repeat what they’ve already presented. -As you saw with the power user tips I shared today, whenever you use Microsoft 365 Copilot inside your apps, your work data is automatically delivered into the experience. There’s no need to upload or paste work content into your prompts. This also means that your work information retains its protections. For more things to try, check out the free Copilot Academy at aka.ms/copilotacademy. And keep watching Mechanics for the latest updates from Microsoft, and thanks for watching.
Introducing Copilot in the Microsoft 365 admin centers
Streamline daily admin tasks with AI-powered insights, natural language queries, and automation using Copilot in Microsoft 365 admin centers. Quickly recap key updates, monitor service health, and track important changes — all in one place. No more digging through multiple pages — just ask Copilot for the answers you need, grounded in real-time data from your tenant. From finding users and managing licenses to generating visual insights and automating tasks with PowerShell, use Copilot to simplify complex admin workflows and save valuable time. For Copilot in the admin center to light up, all you need is one active Microsoft 365 Copilot license for any user in your tenant and from the Microsoft 365 admin center, you can get started right away. Jeremy Chapman, Director of Microsoft 365, demonstrates how to leverage Copilot for proactive guidance, whether in the Microsoft 365 admin center or directly within Copilot Chat. Save time with Copilot. Type Recap to instantly see critical admin updates and actions in one view. Check it out in the Microsoft 365 admin center. Stay on top of changes. Copilot summarizes new features & updates from the Message Center, so you never miss an important rollout. Get started. Instant visual insights. Ask Copilot how many Copilot licenses are left and see a breakdown, no manual reports needed. Watch it here. Watch our video here. QUICK LINKS: 00:00 — Copilot in Microsoft 365 admin centers 00:42 — Use Copilot for change management 02:13 — Stay ahead of upcoming changes 03:31 — User and licensing queries 04:21 — Generate Visual Insights for Licensing and Usage 04:50 — Author PowerShell scripts for bulk operations 06:07 — Copilot Chat using Microsoft 365 Admin agent 07:37 — Copilot coming soon to other admin centers 07:51— Wrap up Link References For more information, check out https://aka.ms/CopilotinMAC Start using Copilot in the Microsoft 365 admin center at https://admin.microsoft.com Unfamiliar with Microsoft Mechanics? As Microsoft’s official video series for IT, you can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft. Subscribe to our YouTube: https://www.youtube.com/c/MicrosoftMechanicsSeries Talk with other IT Pros, join us on the Microsoft Tech Community: https://techcommunity.microsoft.com/t5/microsoft-mechanics-blog/bg-p/MicrosoftMechanicsBlog Watch or listen from anywhere, subscribe to our podcast: https://microsoftmechanics.libsyn.com/podcast Keep getting this insider knowledge, join us on social: Follow us on Twitter: https://twitter.com/MSFTMechanics Share knowledge on LinkedIn: https://www.linkedin.com/company/microsoft-mechanics/ Enjoy us on Instagram: https://www.instagram.com/msftmechanics/ Loosen up with us on TikTok: https://www.tiktok.com/@msftmechanics Video Transcript: -If you’re a Microsoft 365 admin, you can now take advantage of Copilot and generative AI to perform tasks across different Microsoft 365 services. In the next few minutes, in fact, I’ll show you how you can interact with it using natural language, get contextual guidance, and find proactive suggestions for common admin tasks. For the experience to light up, all you need is one active Microsoft 365 Copilot license for any user in your tenant. 
And from the admin center, you can get started right away. That said, before we get started, in case you’re wondering, Copilot in Microsoft 365 admin centers does not make configuration changes autonomously on your behalf. As I’ll show you, it’s designed to save you time with many of the things that you do every day as an admin or business owner. -And I’ll start by showing you an example of how you can use it for change management. I’m in the Microsoft 365 admin center, and now Copilot can help you keep track of new capabilities rolling out, as well as changes that you need to action as an admin. In fact, you can use the starter prompt, recap, and I’ll add “the latest admin info,” and you’ll see that Copilot is generating an up-to-date view of important information and key insights across service health, message center, and Microsoft 365 Copilot usage insights. This summary is personalized to you based on your specific admin role, highlighting the parts of the admin center that you use most, and your real-time individual tenant information. So this saves you time looking for information and insights that are typically spread across multiple locations in the Microsoft 365 admin center. And you can click on the see details controls to expand each area, and find out more, as well as where you can go to take any corresponding actions. For example, with these now expanded, I can see my tenant service health status, and a summary of active incidents, issues and advisories. In this case, I have one issue and three total advisories across Microsoft 365 suite, Microsoft Purview, and others. -From here I can even use these as deep links to click into my active issue, in this case for the updated attack simulation training URL endpoint, to find out more. Copilot can also help you stay ahead of upcoming changes, along with the items that you need to take care of from the message center. For example, back in my recap, I can see details highlighting three new features, and also three feature updates. So for this new feature, I can see details about Copilot in Edge’s new contextual features to find out more about its capabilities and rollout details. I can also use the view in buttons for deep links directly into service health or the message center, like you’re seeing here with all my recent unread messages. -So as we saw, Copilot helps you stay on top of issues with its suggested prompt starters, like recap, and, of course, you can author your own prompts too, and they’ll also be grounded on data from your individual tenant. In this case, I’ll type in “Summarize my announcements for Outlook,” and Copilot generates a full summary with feature updates from the past week for Outlook. For example, here’s a new capability rolling out for the Microsoft 365 app, getting updated to be the Microsoft 365 Copilot app, and corresponding changes to the Outlook apps for iOS and Android. This change will allow more people to experience Copilot chat from their mobile apps. And now you have all the details you need to prepare for the update. -Next, let me show you how Copilot can help you with common admin tasks, like user and licensing queries using natural language. So I’ll prompt Copilot to find users in the marketing department with a Copilot license, and submit. Now, behind the scenes, it’s combining a directory attribute, the marketing department, with a licensing attribute for Copilot, which would’ve previously required advanced filtering or PowerShell. And it finds three people that match the query. 
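For context on what that query previously looked like, here is a rough sketch of an equivalent lookup with the Microsoft Graph PowerShell SDK. It is illustrative only, not the mechanism Copilot uses behind the scenes: the “Marketing” department value comes from the demo, while the wildcard SKU match and the scopes shown are assumptions you would confirm in your own tenant (for example, by inspecting the output of Get-MgSubscribedSku).

```powershell
# Hedged sketch: find users in the Marketing department who hold a Copilot license.
# Requires the Microsoft Graph PowerShell SDK (Install-Module Microsoft.Graph).
Connect-MgGraph -Scopes "User.Read.All", "Organization.Read.All"

# Look up Copilot SKU IDs in the tenant; the "*COPILOT*" match is an assumption and
# may return more than one SKU (for example, Copilot Studio), so review the result.
$copilotSkuIds = (Get-MgSubscribedSku |
    Where-Object { $_.SkuPartNumber -like "*COPILOT*" }).SkuId

# Get Marketing users along with their assigned licenses. Filtering on department
# uses advanced query parameters, hence ConsistencyLevel and CountVariable.
$marketingUsers = Get-MgUser -Filter "department eq 'Marketing'" -All `
    -ConsistencyLevel eventual -CountVariable userCount `
    -Property Id, DisplayName, Department, AssignedLicenses

# Keep only the users whose assigned licenses include one of the Copilot SKUs.
$marketingUsers |
    Where-Object { $_.AssignedLicenses.SkuId | Where-Object { $copilotSkuIds -contains $_ } } |
    Select-Object DisplayName, Department
```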
And if it’s a larger group of people, you can use the CSV file option to export a list that you might use for a broader email campaign or with PowerShell scripting. -To be clear, everything that you’ve just seen is running under the permissions context of the admin using Copilot, so it can only find information that the individual account specifically has access to. Now, another area where Copilot can help is with generating visualizations for bulk insights into things like usage and licensing. For example, you might want to see how many Copilot licenses in your tenant have been acquired, and how many are available to assign. So for that, I can prompt Copilot, “How many Copilot licenses do I have available to assign?” And it generates an inline bar chart with details about Microsoft 365 Copilot, Copilot Studio, and Sales Copilot licenses available to assign. -And since this is Mechanics, let me show you an early look at a more advanced admin scenario to help author PowerShell scripts for bulk operations. Now, this is useful where performing specific tasks in the admin center at scale might be too manual or in cases where the control is not available in the admin center. For example, as part of my Microsoft 365 Copilot rollout, if I want to enable restricted SharePoint search using a list of allowed sites, which is only possible using PowerShell, I can prompt Copilot with “How do I get the SharePoint Online PowerShell module, then enable restricted SharePoint search using a CSV file with allowed sites using PowerShell?” And Copilot will use the Microsoft 365 admin documentation and PowerShell reference guides so that I can save time by not having to look that information up myself. And notice that everything is formatted so I can easily parse what the commands are doing, and I just need to change the placeholder values for the URL and file path, and I can run what’s presented. -Okay, and just to prove that it will work, let’s test it out. So these are the cmdlets that we just saw from Copilot with updated placeholders. So I’ll go ahead and run it. You’ll see there are no errors. Now, I’ll get the status of the feature. It’s enabled. Then get the list of allowed sites, and there they are. Next, let me show you another early look for performing these tasks in Copilot Chat using a Microsoft 365 admin agent. If your day-to-day Microsoft 365 account is the same account that you use for admin tasks, and you don’t use a separate admin-only account, you’ll be able to access these admin experiences from Copilot Chat. I’m in the Microsoft 365 Copilot app. I just need to type the @ symbol to pull up a list of available agents or I could directly type @Microsoft365Admin. -And from here I can run the same admin recap we saw earlier by typing “recap important info for me” as my prompt. You’ll see that it surfaces the same information that we saw before in the admin center. In fact, when I expand the details under service health, there’s our attack simulation training URL endpoint update. The view in buttons also link me directly to the Microsoft 365 admin center. And because it’s an agent, you’ll also be able to access the Microsoft 365 admin agent from other app endpoints, like you’re seeing here with Microsoft Word. I can get the same information with my recap from before, and once it completes, I can use that right from Word, for example, if I wanted to write a change management report. 
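The generated cmdlets themselves aren’t shown on screen in the demo, so as a reference point, here is a minimal sketch of what enabling restricted SharePoint search from a CSV of allowed sites generally looks like with the SharePoint Online management shell. The admin URL and file path are placeholders, and the cmdlet and parameter names (particularly the allowed-list file parameter) are recalled from the restricted search cmdlets rather than confirmed here, so verify them against the current module documentation before running anything.

```powershell
# Minimal sketch with placeholder values; verify cmdlet and parameter names against
# the current Microsoft.Online.SharePoint.PowerShell documentation before use.
Install-Module -Name Microsoft.Online.SharePoint.PowerShell -Scope CurrentUser

# Connect to your tenant's SharePoint admin endpoint (placeholder URL).
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Turn on Restricted SharePoint Search for the tenant.
Set-SPOTenantRestrictedSearchMode -Mode Enabled

# Load the allowed sites from a CSV of site URLs (placeholder path).
Add-SPOTenantRestrictedSearchAllowedList -SitesListFileUrl "C:\Temp\AllowedSites.csv"

# Confirm the feature status and the allowed list, as shown in the demo.
Get-SPOTenantRestrictedSearchMode
Get-SPOTenantRestrictedSearchAllowedList
```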
Now, it’s worth pointing out that while everything I showed from the admin center does not require a Microsoft 365 Copilot license for your admin account, to use the Microsoft 365 admin agent, your admin account would need one. Finally, the Copilot in the admin center experiences will extend to other surface areas, as we presented in November, including the admin centers for Microsoft Teams and SharePoint Online, and more details for those are coming soon.

-So those are just a few examples of how Copilot can help you as an admin save time with your day-to-day work and give you proactive suggestions for different admin tasks. Again, all you need is just one active Microsoft 365 Copilot license in your tenant, and you can get started right away. To find out more, check out aka.ms/CopilotinMAC, and start using it today in the Microsoft 365 admin center at admin.microsoft.com. Keep watching Microsoft Mechanics for the latest tech updates. Subscribe to our channel and thanks for watching.

Oversharing Control at Enterprise Scale | Updates for Microsoft 365 Copilot in Microsoft Purview
Minimize risks that come with oversharing and potential data loss. Use Microsoft Purview and its new Data Security Posture Management (DSPM) for AI insights, along with new Data Loss Prevention policies for Microsoft 365 Copilot, and SharePoint Advanced Management, which is now included with Microsoft 365 Copilot. Automate site access reviews at scale and add controls to restrict access to sites if they contain highly sensitive information. Erica Toelle, Microsoft Purview Senior PM, shows how to control data visibility, automate site access reviews, and fine-tune permissions with Pilot, Deploy, Optimize phases.

Protect your data from unwanted exposure. Find and secure high-risk SharePoint sites with Microsoft Purview’s oversharing report. Start here.

Secure Microsoft 365 Copilot adoption at scale. Check out the Pilot-Deploy-Optimize approach to align AI use with your organization’s data governance. Watch here.

Boost security, compliance, and governance. Scoped DLP policies enable Microsoft 365 Copilot to respect data labels. Take a look. Watch our video here.

QUICK LINKS:
00:00 — Minimize risk of oversharing
01:24 — Oversharing scenarios
04:03 — How oversharing can occur
05:38 — Restrict discovery & limit access
06:36 — Scope sites
07:15 — Pilot phase
08:16 — Deploy phase
09:17 — Site access reviews
10:00 — Optimize phase
10:54 — Wrap up

Link References
Check out https://aka.ms/DeployM365Copilot
Watch our show on the basics of oversharing at https://aka.ms/SMBoversharing

Unfamiliar with Microsoft Mechanics? As Microsoft’s official video series for IT, you can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft.
Subscribe to our YouTube: https://www.youtube.com/c/MicrosoftMechanicsSeries
Talk with other IT Pros, join us on the Microsoft Tech Community: https://techcommunity.microsoft.com/t5/microsoft-mechanics-blog/bg-p/MicrosoftMechanicsBlog
Watch or listen from anywhere, subscribe to our podcast: https://microsoftmechanics.libsyn.com/podcast

Keep getting this insider knowledge, join us on social:
Follow us on Twitter: https://twitter.com/MSFTMechanics
Share knowledge on LinkedIn: https://www.linkedin.com/company/microsoft-mechanics/
Enjoy us on Instagram: https://www.instagram.com/msftmechanics/
Loosen up with us on TikTok: https://www.tiktok.com/@msftmechanics

Video Transcript:
-Are you looking to deploy Microsoft 365 Copilot at scale, but concerned that your information is overshared? Ultimately, you want to ensure that your users and teams can only get to the data required to do their jobs and nothing more. For example, while using Microsoft 365 Copilot and interacting with work data, you don’t want information surfaced that users should not have permissions to view. So, where do you even start to solve for this? You might have hundreds or thousands of SharePoint sites to assess and right-size information access. Additionally, knowing where your sensitive or high-value information resides, and making sure that the policies you set continue to protect information and avoid returning to an oversharing state, can come with challenges.

-The good news is there are a number of updated tools and resources available to help you get a handle on all this.
In the next few minutes, I’ll unpack the approach you can take to help you minimize the risks that come with oversharing and potential data loss using Microsoft Purview and its new Data Security Posture Management for AI insights, along with new Data Loss Prevention policies for Microsoft 365 Copilot and more, as well as SharePoint Advanced Management, which is now included with Microsoft 365 Copilot. This helps you automate site access reviews at scale and adds controls to restrict access to sites even if they contain highly sensitive information. First, let’s look at how information oversharing can inadvertently occur when using Microsoft 365 Copilot, just as it would with everyday search.

-I’ll explain how it works. When you submit a prompt, before presenting it to a large language model, Copilot interprets the prompt and, using a process called Retrieval Augmented Generation, finds and retrieves grounding information that you are allowed to access in places like SharePoint, OneDrive, Microsoft Teams, your email and calendar, and optionally the internet, as well as other connected data sources. The retrieved information is appended to your prompt as additional context. Then that larger prompt is presented to the large language model. With that added grounding information, the response is generated and then formatted for the app that you’re using. For this to work well, the information retrieval step relies on accurate search. And what’s important here is that as you use Copilot, it can only retrieve information that you explicitly have access to and nothing else. This is how search works in Microsoft 365 and SharePoint. The controls you put in place to achieve just enough access will reduce data security risk, whether you intend to use Microsoft 365 Copilot or not.

-So, let me show you a few examples you may have experienced where content is overshared. I’ll start in Business Chat. I’m logged in as Adele Vance from the sales team. Her customers are pressuring her for information about new products that haven’t been internally or externally announced. She submits a prompt for 2025 product plans, and the response returns a few clearly sensitive documents that she shouldn’t have access to, and the links in the response and in the citations take Adele right to those files.

-Now, I’m going to switch perspectives to someone on the product planning team building the confidential plan stored in a private SharePoint site. I’m working on the 2025 product plan on a small team. This is the same doc that Adele just found in Business Chat, and if you look at the top of the document right now, there is one other person who I expect in the document. Then suddenly a few more people appear to have the document open. I don’t know who these people are, and they shouldn’t be here. So, this file is definitely overshared.

-Now, I’m going to switch back to Adele’s perspective, because beyond the product planning doc, the response also describes a new project with the code name Thunderbolt. So, I’ll choose the Copilot-recommended prompt to provide more details about Project Thunderbolt, and we can see a couple of recent documents with information that I, as Adele, should not have access to as a member of the sales team. In fact, if I open the file, I can get right to the detailed specifications and pricing information.

-Now, let’s dig into the potential reasons why this is happening, and then I’ll cover how you discover and correct these conditions at enterprise scale.
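To make the retrieval flow described above more concrete, here is a purely illustrative, self-contained sketch in PowerShell; none of the data or functions below are a real Copilot API, they simply model how grounding is security-trimmed to what the signed-in user can already read.

```powershell
# Illustrative sketch only: the data and function below are stand-ins, not a real Copilot API.
# The point it models: retrieval is security-trimmed, so Copilot can only ground a prompt
# in content the signed-in user already has permission to read.

# A toy index of documents and their permitted readers (invented example data).
$documents = @(
    @{ Title = '2025 Product Plan';  Snippet = 'Confidential roadmap details...'; Readers = @('planner@contoso.com') },
    @{ Title = 'Published Sales FAQ'; Snippet = 'Approved talking points...';      Readers = @('adele@contoso.com', 'planner@contoso.com') }
)

function Invoke-GroundedPrompt {
    param([string]$UserPrompt, [string]$UserId)

    # 1. Retrieval: find related content, trimmed to what this user is permitted to access.
    $grounding = $documents | Where-Object {
        ($_.Readers -contains $UserId) -and ($UserPrompt -match [regex]::Escape($_.Title.Split(' ')[-1]))
    }

    # 2. Append retrieved snippets to the prompt as additional context.
    $context = ($grounding | ForEach-Object { "$($_.Title): $($_.Snippet)" }) -join "`n"
    $augmented = "$UserPrompt`n`nContext:`n$context"

    # 3. A real implementation would now send $augmented to the large language model
    #    and format the response for the app; here we return it so the flow is visible.
    return $augmented
}

# Adele cannot read the product plan, so its snippet never reaches her grounded prompt.
# If oversharing added her as a reader, the confidential text would start appearing here.
Invoke-GroundedPrompt -UserPrompt 'Summarize the 2025 product plan' -UserId 'adele@contoso.com'
```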
First, privacy settings for SharePoint sites can be set to public or private. These settings are most commonly configured as sites are created. Often sites are set to public, which means anyone in your organization can find content contained within those sites, and by extension, so can Microsoft 365 Copilot.

-Next is setting the default sharing option to everyone in an organization. One common misperception here is that just by creating the link, you’re enabling access to that file, folder, or site automatically. That’s not how these links work, though. Once a sharing link is redeemed, or clicked on by the recipient, that person will have access to and be able to search for the shared content. There are, however, sharing approaches which auto-redeem sharing links, such as pasting the link into an email and sending that to lots of people. In that case, those recipients have access to the content and will be able to search for it immediately.

-Related to this is granting permissions to the everyone except external users group as you define membership for your SharePoint sites. This group gives everyone in your organization access and the ability to search for that information too. And you’ll also want to look into permissions granted to other large and inclusive groups, which are often maintained using dynamic group membership. And if you’re using Data Loss Prevention, information protection, or other classification controls from Microsoft Purview, labeled content can also trigger sharing restrictions.

-So, let’s move on to addressing these common issues and the controls you will use in Microsoft 365, Microsoft Purview, and SharePoint Advanced Management. At a high level, there are two primary ways to implement protections. The first approach is to restrict content discovery so that information doesn’t appear in search. Restricting discovery still allows users to access content they’ve previously accessed, as well as content shared with them. The downsides are that content people should not have access to is still accessible, and importantly, Copilot cannot work with restricted content even if it’s core to a person’s job. So, we recommend restricting content discovery as a short-term solution.

-The second approach is to limit information access by tightening permissions on sites, folders, and individual files. This option has stronger protections against data loss, and users can still request access if they need it to do their jobs, meaning only people who need access have access. We recommend limiting access as an ongoing best practice. Then, to scope the sites that you want to allow and protect, we provide a few options to help you know where to start. First, you can use the SharePoint Active sites list, where you can sort by activity to discover which SharePoint sites should be universally accessible for all employees in your organization. Then, as part of the new Data Security Posture Management for AI reporting in Microsoft Purview, the oversharing report lets you easily find the higher-risk sites containing the most sensitive information that you want to protect. The sites you define to allow access and limit access will be used in later steps. Now, let’s move on to the steps for preparing your data for Microsoft 365 Copilot. We’ve mapped best practices and tools for Copilot adoption across Pilot, Deploy, and Optimize phases.

-First, in the Pilot phase, we recommend organization-wide controls to easily restrict discovery when using Copilot.
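The video addresses these causes through admin center and Microsoft Purview controls rather than scripts, but for reference, here is a minimal sketch of how the sharing-link defaults described above could be tightened on a single site with the SharePoint Online Management Shell; the site URLs are placeholders, and the parameter values reflect my reading of the module and should be confirmed before use.

```powershell
# Minimal sketch (placeholder URLs; confirm parameter values against current documentation).
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Make new sharing links on this site default to "specific people" rather than
# "people in your organization", and remove the company-wide link option entirely.
Set-SPOSite -Identity "https://contoso.sharepoint.com/sites/ProductPlanning" `
    -DefaultSharingLinkType Direct `
    -DisableCompanyWideSharingLinks Disabled
```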
This means taking your list of universally accessible sites previously mentioned, then using a capability called Restricted SharePoint Search, where you can create an allow list of up to 100 sites, then allow just those sites to be used with search in Copilot. Then, in parallel in Microsoft Purview, we’ll configure ways to get visibility into Copilot usage patterns, where you can enable audit mode using Data Loss Prevention policies to detect sharing of labeled or unlabeled sensitive content. And likewise, you’ll enable analysis of Copilot interactions as part of communication compliance. Again, these approaches do not impact information access, only discoverability via Copilot and search.

-Now, let’s move on to the broader Deploy phase, where you will enable Copilot for more users. Here you’ll use the list of identified sites from Microsoft Purview’s oversharing report to identify sites with the most sensitive information. Controls in Microsoft Purview provide proactive information protection with sensitivity labels for your files, emails, meetings, groups, and sites. For each item, you can use more targeted controls to right-size site access by assigning permissions to specific users and groups. And when applied, these controls on the backend will move public sites to private and control access to defined site members based on the permissions you set. Next, you can enable the new Data Loss Prevention for Microsoft 365 Copilot policies to exclude specific labels from Copilot prompts and responses. And you can change your DLP policies from the audit mode that you set during the Pilot phase to start blocking unnecessary sharing of labeled content, where you’ll now turn on the policies in order to enforce them.

-Then, two options from SharePoint Advanced Management are to use restricted access control to limit access to individual sites, so that only members in defined security groups will have access, and to limit site access by operationalizing site owner access reviews. Then, as an additional fine-tuning option, you can target restricted content discovery on individual sites, like you see here with our leadership site, to prevent Copilot from using its content as you continue to work through access management controls. And as part of the Deploy phase, you’ll disable restricted SharePoint search once you have the right controls in place. Together, these options will impact both access permissions, as well as discovery via Copilot and search.

-Next, the final Optimize phase is about setting your organization up for the long term. This includes permissioning, information classifications, and data lifecycle management. Here you’ll continually monitor your data security risks using oversharing reports. Then implement auto-labeling and classification strategies using Microsoft Purview, and ensure that as new sites are created, site owners and automated provisioning respect access management principles. These processes help ensure that your organization doesn’t drift back into an oversharing state, to keep your data protected and ongoing permissions in check. Now, if we switch back to our initial user examples in Business Chat with our controls in place, if we try the same prompts as before, you’ll see that Adele can no longer access sensitive information, even if she knows exactly what to look for in her prompts. The data is now protected, and access has been right-sized for everyone in the organization.
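For teams that prefer scripting these changes, here is a minimal sketch of the two SharePoint Advanced Management options just described, restricted access control and restricted content discovery, using the SharePoint Online Management Shell; the site URLs and security group GUID are placeholders, and the parameter names reflect my understanding of the module, so verify them against current documentation before use.

```powershell
# Minimal sketch (placeholder URLs and group GUID; verify parameter names before use).
Connect-SPOService -Url "https://contoso-admin.sharepoint.com"

# Restricted access control: only members of the specified security group retain access.
Set-SPOSite -Identity "https://contoso.sharepoint.com/sites/ProjectThunderbolt" `
    -RestrictedAccessControl $true
Set-SPOSite -Identity "https://contoso.sharepoint.com/sites/ProjectThunderbolt" `
    -AddRestrictedAccessControlGroups "00000000-0000-0000-0000-000000000000"

# Restricted content discovery: keep the leadership site out of Copilot and org-wide search
# without changing who can access the site directly.
Set-SPOSite -Identity "https://contoso.sharepoint.com/sites/Leadership" `
    -RestrictContentOrgWideSearch $true
```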
-So, those are the steps and tools to prepare your information for Microsoft 365 Copilot at enterprise scale, and help ensure that your data is protected and that everyone has just enough access to do their jobs. To learn more, check out aka.ms/DeployM365Copilot. Also, watch our recent show on the basics of oversharing at aka.ms/SMBoversharing for more tips to right-size permissions for SharePoint site owners. Keep watching Microsoft Mechanics for the latest updates, and thanks for watching.

Microsoft 365 Copilot Wave Two updates - Pages, Excel, OneDrive, and agents
Check out Microsoft 365 Copilot Wave Two updates, featuring Business Chat and the new Copilot Pages for enhanced collaboration, advancements in Excel data analysis, AI-driven file comparisons in OneDrive, and easy-to-create Copilot agents for automating business processes.