HollyJ2022
Jan 22, 2024 · Brass Contributor
Bing Chat / Copilot "Invents" fake "facts" - "I don't know" should be a valid answer!
I asked Bing to analyze a poem that I wrote - THIS MORNING - and tell me who the author was. Here is the whole interaction: Analyze this poem and tell me who the author is. [as-yet-unpublis...
Texas_Food_Historian
Mar 04, 2024 · Copper Contributor
Microsoft Copilot does not have the ability to learn from the information it collates. When the user informs Copilot that it made an error in its response, it apologizes and then makes the same error again and again. It does not listen. I submitted a photo to Copilot and told it what it was and where it was located. It didn't understand the photo and generated stock facts about a different building. The bottom line is that Copilot can't respond "I don't know the answer to that query."
- Deleted · Mar 04, 2024
Thank you. 🙂
Yes, that's right. For example, it's now helping me correct that answer, which is awesome.