M365 Copilot
Unleashing Copilot: 'Aha' Moments | ESPC24
Live Demonstrations, No PowerPoint

First off, let me just say, no death by PowerPoint here! This session was all about seeing the good stuff—live and in action. Bryan Wofford, who is clearly as passionate about adoption as a puppy is about belly rubs 🐶, showed us how adoption kits from Microsoft can transform users into Copilot connoisseurs.

"Aha" Moments Galore

Bryan talked about guiding users to their "aha" moments with Copilot. Honestly, this session gave me several "aha" moments about the adoption kit itself. It's like I was seeing the content through new eyes. Thanks, Bryan!

Essential Tools and Resources

Now let's talk tools. The mission of this session was to understand the adoption tools at our disposal: adoption.microsoft.com/Copilot is updated more often than my Netflix recommendations, courtesy of Bryan's team. There's even a section dedicated to all the news, so this is the place to visit to stay on top of the latest announcements.

Interactive Library and Downloadable Guidance

Bryan introduced us to the interactive library and downloadable guidance. It's like a treasure chest of functional scenario guidance, ready to be tailored to your organization.

Starting Points and Champions

Starting at adoption.microsoft.com/adoption, Bryan and his team keep the content current with the latest and greatest. You'll also find the Copilot Success Kit and a link to join the community. The interactive scenario library is another gem; each role has a kit with templates, and there's a "Top 10 to Try First" for onboarding new users. Each role deck includes suggestions for KPIs specific to your department. No need to wade through endless PowerPoints; everything is on the website, ready for you to download and use. The suggestion prompts are just starting points, but the real magic happens when Champions take it further. And let's not forget the fun factor to boost adoption—who wouldn't want to find out what kind of superhero they are based on their email tone?

Industry-Specific Scenarios

The functional scenario library now includes industry-specific scenarios for Education, Energy, Financial Services, Healthcare, and more. The Copilot prompt library is another treasure trove for "aha" moments, with a guide on crafting the perfect prompt.

Beyond ESPC24, continue the learning...

The following Microsoft Ignite session recording takes you a little deeper into "Microsoft 365 Copilot: a power-user masterclass". Watch it now:

Conclusion

To sum up: this session was a rollercoaster of insights, laughs, and "aha" moments. My Copilot world has definitely been improved in the best way possible. If you ever get the chance to attend a session with Bryan Wofford, don't miss it. Your inner Copilot enthusiast will thank you. I hope this post brought you a few "aha" moments of your own.

Caroline Kallin

How to disable the option "Only during the meeting" (formerly Without transcription) tenant-wide?
Testing on MS Teams calls has shown that the option "Only during the meeting" (which used to be called "Without transcription") results in some parties not being informed that they are being "listened to" by M365 Copilot. These parties include anyone joining from a tenant different from the hosting tenant, anyone who dials in, and anyone joining from an iOS device. This is unethical, so we'd like to turn off the "Only during the meeting" option tenant-wide. How can we do this?

Copilot for Microsoft 365 : Architecture and Key Concepts
The following diagram displays the Copilot for Microsoft 365 service and tenant logical architecture.

Architecture of Copilot for Microsoft 365

Copilot for Microsoft 365 can generate responses anchored in the customer's business content, such as:
- User documents
- Emails
- Calendar
- Chats
- Meetings
- Contacts
- Other business data

Copilot for Microsoft 365 follows these foundational principles:
- Built on Microsoft's comprehensive approach to security, compliance, and privacy.
- Architected to protect tenant, group, and individual data.
- Committed to responsible AI.

Key components of Copilot for Microsoft 365 include:
- Large Language Models (LLMs)
- Natural Language Processing (NLP)
- Microsoft 365 apps
- Microsoft Copilot (chat)
- Microsoft Syntex
- Microsoft Graph

How Microsoft Copilot for Microsoft 365 works - YouTube

Users can initiate Copilot prompts from devices that have Microsoft 365 apps installed. Copilot components include:
- The Copilot service, which orchestrates the responses to user prompts.
- An instance of Microsoft Graph for the data of your Microsoft 365 tenant.
- Your Microsoft 365 tenant that contains your organization data.

Key concepts

- Microsoft 365 Copilot only works with files saved to OneDrive. If files are stored locally on your PC, you need to move them to OneDrive before Copilot can use them (as of March 2024).
- Microsoft's Azure OpenAI Service privately hosts the LLMs used by Copilot for Microsoft 365.
- Copilot for Microsoft 365 only displays organizational data to which individual users have at least View permissions. It's important that organizations use the permission models available in Microsoft 365 services, such as SharePoint, to help ensure the right users or groups have the right access to the right content.
- Copilot for Microsoft 365 uses Microsoft Graph to synthesize and search content from multiple sources within your tenant. The Microsoft Graph API brings more context from user signals into the prompt, such as information from emails, chats, documents, and meetings. This information includes data from services like Outlook, OneDrive, SharePoint, Teams, and more. Only data a user has access to is returned in query responses, as illustrated in the following diagram and in the short sketch after these key concepts.
- Microsoft 365 keeps your data logically isolated by tenant. This design, together with encryption, ensures privacy during processing and at rest.
- Prompts, responses, and data accessed through Microsoft Graph aren't used to train foundation LLMs, including those used by Copilot for Microsoft 365.
- Copilot is a shared service, just like many other services in Microsoft 365. Communication between your tenant and Copilot components is encrypted.
- Microsoft Copilot for Microsoft 365 uses Azure OpenAI services for processing, not OpenAI's publicly available services.

The Copilot for Microsoft 365 service and tenant logical architecture supports an organization's security and compliance in several ways:
- Copilot operates as a shared service within Microsoft 365, ensuring encrypted communication between your tenant and Copilot components. Your data remains confidential and secure.
- Existing security and compliance policies deployed by your organization continue to apply. Copilot adheres to these policies, safeguarding sensitive information.
- The tenant boundary ensures data privacy, location compliance, and adherence to security protocols. Your data remains within the Microsoft 365 service boundary, protected by Microsoft 365's robust security measures.
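To make the permission-trimming point above a little more concrete, here is a minimal sketch (not part of Copilot itself, and not an official Copilot API) that queries the Microsoft Graph search endpoint with a delegated access token. The token acquisition, query string, and result handling are illustrative assumptions; the behavior it relies on is that Graph only returns items the signed-in user can already access.

```python
import requests

# Microsoft Graph search endpoint (delegated permissions such as Files.Read.All
# are assumed to have been granted to the calling app).
GRAPH_SEARCH_URL = "https://graph.microsoft.com/v1.0/search/query"


def search_my_content(access_token: str, query: str) -> list:
    """Search OneDrive/SharePoint documents via Microsoft Graph.

    Graph trims the results to items the signed-in user is permitted to see,
    which is the same permission model Copilot for Microsoft 365 relies on.
    """
    payload = {
        "requests": [
            {
                "entityTypes": ["driveItem"],      # documents the user can access
                "query": {"queryString": query},
                "from": 0,
                "size": 10,
            }
        ]
    }
    response = requests.post(
        GRAPH_SEARCH_URL,
        headers={
            "Authorization": f"Bearer {access_token}",
            "Content-Type": "application/json",
        },
        json=payload,
        timeout=30,
    )
    response.raise_for_status()

    # Flatten the hits from the response envelope.
    hits = []
    for search_response in response.json().get("value", []):
        for container in search_response.get("hitsContainers", []):
            hits.extend(container.get("hits", []))
    return hits


# Example usage (token and query are placeholders):
# for hit in search_my_content("<delegated-token>", "quarterly budget review"):
#     print(hit.get("summary"))
```

Running the same query as two different users returns two different result sets, which is a quick way to demonstrate why getting SharePoint and OneDrive permissions right matters before rolling out Copilot.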
To ensure that Copilot for Microsoft 365 uses your content effectively, administrators should:
- Add a urlToItemResolver when you create your connection. A urlToItemResolver enables the platform to detect when users share URLs from your external content with each other. Copilot for Microsoft 365 has a higher likelihood of displaying content that has been shared with that user. As such, you should add a urlToItemResolver in activitySettings when you create your connection (see the sketch after this list).
- Apply semantic labels. Semantic labels help Copilot for Microsoft 365 interpret the semantic meaning of your schema. Apply as many semantic labels to your schema as applicable.
- Add user activities on your items. For a list of supported user activity types, see external activity. The system assigns greater importance to items that have more activities.

Administrators can choose to let data out of the compliance boundary; for example, to query public web content using Microsoft Bing. For more information, see How to make your Graph connector work better with Copilot.
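As a rough illustration of the first two recommendations, the sketch below uses the Microsoft Graph connectors API to create an external connection whose activitySettings include a urlToItemResolver, and to register a schema whose properties carry semantic labels. The connection id, base URL, URL pattern, and property names are hypothetical placeholders, and token acquisition is assumed to happen elsewhere; treat this as a hedged sketch rather than a production-ready connector.

```python
import requests

GRAPH = "https://graph.microsoft.com/v1.0"


def create_connection_with_url_resolver(token: str) -> None:
    """Create a Graph connector connection that includes a urlToItemResolver."""
    connection = {
        "id": "contosotickets",              # hypothetical connection id
        "name": "Contoso Tickets",
        "description": "Sample external content surfaced to Copilot for Microsoft 365.",
        "activitySettings": {
            "urlToItemResolvers": [
                {
                    "@odata.type": "#microsoft.graph.externalConnectors.itemIdResolver",
                    "urlMatchInfo": {
                        "baseUrls": ["https://tickets.contoso.com"],  # hypothetical site
                        "urlPattern": "/items/(?<itemId>[0-9]+)",
                    },
                    "itemId": "{itemId}",    # captured group becomes the external item id
                    "priority": 1,
                }
            ]
        },
    }
    resp = requests.post(
        f"{GRAPH}/external/connections",
        headers={"Authorization": f"Bearer {token}"},
        json=connection,
        timeout=30,
    )
    resp.raise_for_status()


def register_schema_with_semantic_labels(token: str) -> None:
    """Register a schema whose properties carry semantic labels (title, url, ...)."""
    schema = {
        "baseType": "microsoft.graph.externalItem",
        "properties": [
            {"name": "ticketTitle", "type": "string", "isSearchable": True,
             "isRetrievable": True, "labels": ["title"]},
            {"name": "ticketUrl", "type": "string",
             "isRetrievable": True, "labels": ["url"]},
            {"name": "lastEdited", "type": "dateTime",
             "isRetrievable": True, "labels": ["lastModifiedDateTime"]},
        ],
    }
    resp = requests.patch(
        f"{GRAPH}/external/connections/contosotickets/schema",
        headers={"Authorization": f"Bearer {token}"},
        json=schema,
        timeout=30,
    )
    resp.raise_for_status()
```

The service processes schema registration asynchronously, so in practice you would wait for the operation to complete before ingesting items and recording external activities on them.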