Forum Discussion

PeteStockley
Copper Contributor
Oct 21, 2024

Can Microsoft 365 Copilot learn directly from my actions (e.g. corrections) and change its responses?

I am trying to understand the scope at which Microsoft 365 Copilot might be able to learn and change its responses at the user level. I have identified four scopes (this breakdown might be incorrect), and I am looking for answers about scope 2. Does anybody have concrete information about this?

Scope 1: In the scope of a single chat.

- The response can be directly changed by the user, e.g. tell Copilot that 2+2=5, then ask what 2+2 is.


Scope 2: In the scope of all interactions of a single user.

- I cannot find concrete information on whether Copilot will, or will not, give different responses based on the history of all my interactions with it.


Scope 3: In the scope of the tenant.

- I cannot find concrete information on whether Copilot will, or will not, give different responses based on the histories of interactions with other users in the tenant.


Scope 4: In the scope of the foundation of the Copilot service.

  • “Prompts, responses, and data accessed through Microsoft Graph aren't used to train foundation LLMs, including those used by Microsoft 365 Copilot.” 



Apologies if this is already confirmed or denied in an official source; I spent a few hours looking but was unable to find a concrete answer.

  • MicheleIT
    Brass Contributor

    I put in 2+2=5 and it corrected me, saying there had been a mistake.


  • SteinBerge
    Copper Contributor

    I tried to train M365 Copilot to do the same, but it was quite reluctant to give me the wrong 2+2 calculation. I also asked how long it would remember the statement 2+2=4, and you can see the answer in the attached screenshot.

