Forum Discussion
HollyJ2022
Jan 22, 2024 · Brass Contributor
Bing Chat / Copilot "Invents" fake "facts" - "I don't know" should be a valid answer!
I asked Bing to analyze a poem that I wrote - THIS MORNING - and tell me who the author was. Here is the whole interaction: Analyze this poem and tell me who the author is. [as-yet-unpublis...
Deleted
Jan 22, 2024
This discussion opens up an interesting topic. Trust in AI answers should, in my opinion, be limited.
It is only valuable information when the AI provides its sources and I can verify them.
KidFeedbackers2
Jan 22, 2024 · Iron Contributor
Deleted HollyJ2022 Yeah, it's often hard to agree 100% with an AI-generated answer. I've gotten false facts before. But remember that it doesn't always make things up, since it reads the web and articles. It forms its opinions based on the web, but they aren't always accurate.
HollyJ2022
Jan 25, 2024 · Brass Contributor
It used to claim it couldn't use Search and had no data more recent than 2021, but now it can - proven because I asked it for the attribution of a fairly recently posted poem that I wrote and published on my own website, and it found it.
Bard, at least, says "I don't know" or "I don't have enough information to answer that" when asked the same question I asked Bing the other day. Sometimes, that is the BEST answer; Bing should be taught to say "I don't know" rather than inventing non-factual answers.