Unleash Your Potential with Microsoft 365 Copilot: An AMA to Ignite your Productivity
Thursday, Nov 09, 2023, 09:00 AM PST

Event details
Join us on Thursday, November 9th at 9:00 AM Pacific Standard Time for an exciting Ask Me Anything (AMA) session on Microsoft 365 Copilot. This live, text-based online event will allow you to connect...
Sarah_Gilbert
Updated Nov 09, 2023
NatPor
Nov 09, 2023 · Brass Contributor
I heard that M365 Copilot deletes every prompt right after providing the answer, in order to stay compliant and avoid feeding company data into the LLM; as a result, M365 Copilot always generates new answers and never produces the exact same response twice. I only received this information orally and could not find it in any of the official MSFT written docs. Could you please point me to this information in writing or share a link? Thanks!
- ScottSchnoll · Nov 09, 2023 · Bronze Contributor
Copilot does not delete prompts; in fact, Copilot interactions are stored within a hidden folder in the user's Exchange mailbox. In addition, retention policies and eDiscovery search can be used against them. See https://learn.microsoft.com/en-us/purview/retention-policies-copilot for more info.
- NatPor · Nov 09, 2023 · Brass Contributor
Thank you ScottSchnoll, very helpful link. So is the LLM actually learning from the old prompts, or does this hidden folder prevent it from learning? In other words, how does MSFT guarantee that the LLM is not using sensitive company data included in the prompts?
- ScottSchnoll · Nov 09, 2023 · Bronze Contributor
LLMs don't continue learning after deployment. They are pre-trained, and that pre-trained model is then applied to different data sets (like Microsoft 365 data). The LLM only accesses company data that a user has access to, and only when a user asks it to. The use of the hidden folder is the same approach taken for Teams chat messages, for security and compliance reasons. Also, for convenience, prompt history in M365 chat is shown to users (although prompts can be deleted from the user's view if the user wants). If you are a user that has access to sensitive data and Copilot, then you can use Copilot with that sensitive data.
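To make the retention point above concrete: a retention policy applied to Copilot interactions works like any Purview retention rule — items older than the retention window become eligible for deletion, while newer ones are kept. The sketch below is a purely illustrative toy model of that behavior; the `CopilotInteraction` record and `apply_retention` function are hypothetical names invented here, not Microsoft's schema or API.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical record of one Copilot interaction as it might be retained
# in a user's mailbox; field names are illustrative only.
@dataclass
class CopilotInteraction:
    user: str
    prompt: str
    response: str
    timestamp: datetime

def apply_retention(items, now, retention_days):
    """Toy retention rule: split interactions into those still inside the
    retention window (kept) and those past it (expired), mirroring a
    'retain for N days, then delete' policy."""
    cutoff = now - timedelta(days=retention_days)
    kept = [i for i in items if i.timestamp >= cutoff]
    expired = [i for i in items if i.timestamp < cutoff]
    return kept, expired

now = datetime(2023, 11, 9)
history = [
    CopilotInteraction("natpor", "Summarize the Q3 report", "...", datetime(2023, 11, 8)),
    CopilotInteraction("natpor", "Draft a follow-up email", "...", datetime(2022, 10, 1)),
]
kept, expired = apply_retention(history, now, retention_days=365)
print(len(kept), len(expired))  # → 1 1
```

The real policy is configured in Microsoft Purview against the "Teams chats and Copilot interactions" location; this snippet only models the window logic, not the service itself.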