Copilot for Microsoft 365: Architecture and Key Concepts


The following diagram displays the Copilot for Microsoft 365 service and tenant logical architecture.

[Figure: Copilot for Microsoft 365 service and tenant logical architecture (without semantic index)]

Architecture of Copilot for Microsoft 365:

[Figure: Microsoft 365 Copilot logical architecture]

Copilot for Microsoft 365 can generate responses anchored in the customer’s business content, such as:

  • User documents
  • Emails
  • Calendar
  • Chats
  • Meetings
  • Contacts
  • Other business data

Copilot for Microsoft 365 follows these foundational principles:

  • Built on Microsoft’s comprehensive approach to security, compliance, and privacy.
  • Architected to protect tenant, group, and individual data.
  • Committed to responsible AI.

Key components of Copilot for Microsoft 365 include:

  • Large Language Models (LLMs)
  • Natural Language Processing (NLP)
  • Microsoft 365 apps
  • Microsoft Copilot (chat)
  • Microsoft Syntex
  • Microsoft Graph

For a video overview, see How Microsoft Copilot for Microsoft 365 works on YouTube.

 

  • Users can initiate Copilot prompts from devices that have Microsoft 365 apps installed.
  • Copilot components include (a conceptual sketch of this flow follows the list):
    • The Copilot service, which orchestrates the responses to user prompts.
    • An instance of Microsoft Graph for the data in your Microsoft 365 tenant.
    • Your Microsoft 365 tenant, which contains your organizational data.
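
The sketch below is purely conceptual: the Copilot service's internal orchestration is not a public API, and the helper functions in it (retrieveGroundingData, generateWithLlm) are hypothetical stand-ins used only to illustrate the flow of a prompt through grounding in Microsoft Graph data, generation by the LLM, and a response back to the app.

```typescript
// Conceptual sketch only. The Copilot orchestration is not a public API; the helpers
// below are hypothetical stand-ins for permission-trimmed Microsoft Graph retrieval
// and for the privately hosted Azure OpenAI deployment used by Copilot.

type GroundingSnippet = { source: string; text: string };

// Hypothetical: retrieval through Microsoft Graph, trimmed to what the user can access.
async function retrieveGroundingData(_prompt: string, _userId: string): Promise<GroundingSnippet[]> {
  return [{ source: "contoso-roadmap.docx", text: "FY25 roadmap highlights ..." }];
}

// Hypothetical: generation by the LLM hosted in Azure OpenAI Service (not OpenAI's public service).
async function generateWithLlm(groundedPrompt: string): Promise<string> {
  return `Draft answer based on: ${groundedPrompt.slice(0, 80)} ...`;
}

// The Copilot service grounds the user's prompt in tenant data, calls the LLM,
// and returns the response (with its sources) to the Microsoft 365 app that sent the prompt.
export async function answerPrompt(prompt: string, userId: string) {
  const grounding = await retrieveGroundingData(prompt, userId);
  const context = grounding.map((g) => `[${g.source}] ${g.text}`).join("\n");
  const answer = await generateWithLlm(`${prompt}\n\nContext:\n${context}`);
  return { answer, sources: grounding.map((g) => g.source) };
}
```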

Key concepts 

  • Microsoft 365 Copilot only works with files saved to OneDrive. If files are stored locally on your PC, you need to move them to OneDrive before Copilot can use them (as of March 2024).
  • Microsoft's Azure OpenAI Service privately hosts the LLMs used by Copilot for Microsoft 365.
  •  Copilot for Microsoft 365 only displays organizational data to which individual users have at least View permissions.
  • It's important that organizations use the permission models available in Microsoft 365 services, such as SharePoint, to help ensure the right users or groups have the right access to the right content.
  • Copilot for Microsoft 365 applies Microsoft Graph to synthesize and search content from multiple sources within your tenant. The Microsoft Graph API brings more context from user signals into the prompt, such as information from emails, chats, documents, and meetings. This information includes data from services like Outlook, OneDrive, SharePoint, Teams, and more.
  • Only data a user has access to is returned in query responses; the permission-trimmed search sketch after this list illustrates the same behavior.
  • Microsoft 365 keeps your data logically isolated by tenant. This design, together with encryption, helps keep your data private during processing and at rest.
  • Prompts, responses, and data accessed through Microsoft Graph aren't used to train foundation LLMs, including those used by Copilot for Microsoft 365.
  • Copilot is a shared service, just like many other services in Microsoft 365. Communication between your tenant and Copilot components is encrypted.
  • Microsoft Copilot for Microsoft 365 uses Azure OpenAI services for processing, not OpenAI’s publicly available services.
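
The permission trimming described above is the same behavior you can observe with the Microsoft Graph search API: a query issued with a delegated token runs in the context of the signed-in user and returns only items that user can already access. A minimal sketch, assuming an access token with the appropriate search permissions has already been acquired:

```typescript
// Minimal sketch of a Microsoft Graph search query. Results are security-trimmed:
// content the signed-in user cannot access is never returned. Token acquisition is
// omitted here; assumes a runtime with a global fetch (Node 18+ or a browser).
async function searchAsUser(accessToken: string, queryString: string) {
  const response = await fetch("https://graph.microsoft.com/v1.0/search/query", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      requests: [
        {
          // Search files and SharePoint list items on behalf of the signed-in user.
          entityTypes: ["driveItem", "listItem"],
          query: { queryString },
        },
      ],
    }),
  });
  return response.json();
}
```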

The Copilot for Microsoft 365 service and tenant logical architecture supports an organization's security and compliance in several ways:

  • Copilot operates as a shared service within Microsoft 365, ensuring encrypted communication between your tenant and Copilot components. Your data remains confidential and secure.
  • Existing security and compliance policies deployed by your organization continue to apply. Copilot adheres to these policies, safeguarding sensitive information.
  • The tenant boundary ensures data privacy, location compliance, and adherence to security protocols. Your data remains within the Microsoft 365 service boundary, protected by Microsoft 365's robust security measures.

 

To ensure that Copilot for Microsoft 365 uses your content effectively, administrators should:

 

  • Add a urlToItemResolver in activitySettings when you create your connection. A urlToItemResolver enables the platform to detect when users share URLs to your external content with each other, and Copilot for Microsoft 365 is more likely to display content that has been shared with a user. (A sketch follows this list.)
  • Apply semantic labels. Semantic labels help Copilot for Microsoft 365 interpret the semantic meaning of your schema; apply as many semantic labels to your schema as are applicable. (A sketch follows this list.)
  • Add user activities on your items. For a list of supported user activity types, see external activity. The system assigns greater importance to items that have more activities. (A sketch follows this list.)
  • Note that administrators can also choose to let data leave the compliance boundary, for example to query public web content by using Microsoft Bing.
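
For the urlToItemResolver recommendation, the following is a minimal sketch of creating a Graph connector connection whose activitySettings include an itemIdResolver. The connection id, name, base URL, and URL pattern are placeholders for your own values.

```typescript
// Sketch: create a Microsoft Graph connector connection whose activitySettings include a
// urlToItemResolver, so a shared URL like https://contoso.example/tickets/123 can be mapped
// back to the external item with id "123". Connection id, name, and URLs are placeholders.
async function createConnection(accessToken: string) {
  const response = await fetch("https://graph.microsoft.com/v1.0/external/connections", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${accessToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      id: "contosotickets",
      name: "Contoso tickets",
      description: "Tickets from the Contoso ticketing system",
      activitySettings: {
        urlToItemResolvers: [
          {
            "@odata.type": "#microsoft.graph.externalConnectors.itemIdResolver",
            urlMatchInfo: {
              baseUrls: ["https://contoso.example"],
              urlPattern: "/tickets/(?<ticketId>[0-9]+)",
            },
            // The named capture group above becomes the external item id.
            itemId: "{ticketId}",
            priority: 1,
          },
        ],
      },
    }),
  });
  return response.json();
}
```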
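
For semantic labels, this sketch registers a connection schema in which each property carries a label such as title, url, lastModifiedBy, or lastModifiedDateTime so Copilot can interpret the fields. The property names are placeholders for your own schema.

```typescript
// Sketch: register a schema whose properties carry semantic labels so Copilot for
// Microsoft 365 can interpret them. Property names are placeholders for your own schema.
async function registerSchema(accessToken: string, connectionId: string) {
  const response = await fetch(
    `https://graph.microsoft.com/v1.0/external/connections/${connectionId}/schema`,
    {
      method: "PATCH",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        baseType: "microsoft.graph.externalItem",
        properties: [
          { name: "ticketTitle", type: "string", isSearchable: true, isRetrievable: true, labels: ["title"] },
          { name: "ticketUrl", type: "string", isRetrievable: true, labels: ["url"] },
          { name: "lastEditedBy", type: "string", isRetrievable: true, labels: ["lastModifiedBy"] },
          { name: "updatedAt", type: "dateTime", isRetrievable: true, labels: ["lastModifiedDateTime"] },
        ],
      }),
    }
  );
  // Schema registration is a long-running operation; check the response status for completion.
  return response;
}
```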
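
For user activities, this sketch records a "created" activity on an external item through the addActivities action. The connection id, item id, user id, and timestamp are placeholders, and the supported activity types are listed in the external activity documentation.

```typescript
// Sketch: record a user activity on an external item so the system can weight it more
// heavily. Connection id, item id, Microsoft Entra user id, and timestamp are placeholders.
async function addCreatedActivity(accessToken: string, connectionId: string, itemId: string, userId: string) {
  const response = await fetch(
    `https://graph.microsoft.com/v1.0/external/connections/${connectionId}/items/${itemId}/addActivities`,
    {
      method: "POST",
      headers: {
        Authorization: `Bearer ${accessToken}`,
        "Content-Type": "application/json",
      },
      body: JSON.stringify({
        activities: [
          {
            "@odata.type": "#microsoft.graph.externalConnectors.externalActivity",
            type: "created",
            startDateTime: "2024-03-01T10:00:00Z",
            performedBy: { type: "user", id: userId },
          },
        ],
      }),
    }
  );
  return response.json();
}
```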

For more information, see How to make your Graph connector work better with Copilot.
