Event details
We are very excited to announce a Microsoft 365 Copilot and Bing Chat Enterprise AMA! Get answers to your questions around Microsoft 365 Copilot and Bing Chat Enterprise from our team of experts.
...
Sarah_Gilbert
Updated Aug 10, 2023
Someone449
Aug 10, 2023
Brass Contributor
Is Microsoft 365 Copilot prone to hallucinations like BCE, or are hallucinations not a concern?
- KevinSherman
Aug 10, 2023
Microsoft
As you probably know, hallucinations are a bit of a necessary evil of LLMs. The thing that makes them so creative is the same thing that can cause them to get things wrong at times. We're getting much better at reducing the frequency of hallucinations across the board. Additionally, Copilot is less likely to hallucinate than many chat-connected LLM systems because it grounds prompts before calling the LLM. Grounding those prompts adds the context and content the LLM needs to reach an accurate answer more often. That said, we do know that Copilot will at times be usefully wrong, and we've built the UX and UI to help the user recognize this and work with Copilot to get to a more accurate answer.
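For readers curious what "grounding a prompt" means in practice, here is a minimal, hypothetical sketch of the general pattern: retrieve relevant content first, then fold it into the prompt so the model answers from supplied facts rather than from memory alone. This is not Microsoft's actual Copilot pipeline; the retrieval logic and the llm_complete() placeholder below are assumptions for illustration only.

```python
from typing import List


def retrieve_context(query: str, documents: List[str], top_k: int = 3) -> List[str]:
    """Naive keyword-overlap retrieval standing in for a real search/index step."""
    query_terms = set(query.lower().split())
    scored = sorted(
        documents,
        key=lambda doc: len(query_terms & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:top_k]


def build_grounded_prompt(query: str, context_snippets: List[str]) -> str:
    """Prepend retrieved content so the model is steered toward supplied facts."""
    context_block = "\n".join(f"- {snippet}" for snippet in context_snippets)
    return (
        "Answer using only the context below. If the context is insufficient, say so.\n"
        f"Context:\n{context_block}\n\n"
        f"Question: {query}\nAnswer:"
    )


def llm_complete(prompt: str) -> str:
    """Placeholder for a real model call (e.g., an HTTP request to an LLM endpoint)."""
    return f"[model response to a {len(prompt)}-character grounded prompt]"


def answer(query: str, documents: List[str]) -> str:
    """Ground the query with retrieved context, then call the model."""
    context = retrieve_context(query, documents)
    prompt = build_grounded_prompt(query, context)
    return llm_complete(prompt)


if __name__ == "__main__":
    docs = [
        "Bing Chat Enterprise provides commercial data protection.",
        "Microsoft 365 Copilot works across Word, Excel, and Teams.",
        "Unrelated note about cafeteria hours.",
    ]
    print(answer("What does Bing Chat Enterprise protect?", docs))
```

The design point is simply that the instruction to stay within the supplied context, plus the retrieved content itself, narrows the space of plausible completions, which is why grounded prompts tend to hallucinate less than open-ended ones.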