Why does Copilot stop answering?


Frequently, when I'm using Copilot, it stops answering and shows a message insisting that I change the topic. These refusals don't seem to be associated with the topic under discussion or any objectionable material; they seem entirely random. Starting a new topic and restating the unanswered question just produces the same "change topic" message. The only fix is to shut Copilot down and try again later. Needless to say, this is suboptimal.


Does anyone know what's going on around here and how to fix it?

10 Replies
I have seen this occasionally when I have done a lot of turns in a conversation, or when I ask it something that triggers the responsible AI protections.
I noticed something similar. I was asking Copilot whether it was capable of mimicking human emotion, purely out of curiosity, to see if I could get it to mimic exactly that: curiosity. However, it then insisted on a conversation refresh, saying it preferred not to discuss that further...
What is that all about?

Just had the same issue. I asked a data-only question related to the current presidential campaign in the USA: what percentage of Democratic public servants (governors + House reps + senators) had endorsed Kamala Harris as the Democratic presidential nominee after Biden withdrew from the 2024 race. I exited Edge and relaunched it without engaging GPT-4, then asked why the public data had not been provided and who was responsible. It apologized and provided the information with a breakdown by position and result (72%). I followed that with a request for a list of those who had not yet endorsed her, and got the "choose another topic" demand.

I'm asking for data, not opinions, all of it public domain, and getting the refusal/runaround! WHY?

Had a meeting with Microsoft; they would not share the responsible AI blueprint, for obvious reasons. That was tough, as it was blocking some useful things for us on work-related searches.

@Chris_Ferraro  That's okay, I stopped using it (I may switch from Bing to Firefox too if they tick me off much more). I'm now using https://DeepAI.org


@Real_Name_Here I have noticed that Copilot will give you more detail about why it closed a topic if you take a screenshot of the terminated conversation, start a new topic, upload the image, and ask why the conversation was shut down. Sometimes it will even answer the original question if you also explain why you are asking when you upload the image. It is frustrating that Microsoft doesn't simply have the AI say it cannot answer the question and leave the topic open, so you can ask other questions that are not prohibited.

Most excellent idea. The last straw was that Copilot assisted in creating an awesome script for a TV show idea, and then, as I went to copy everything into a Google Doc, it started the session over and didn't save the last session. Talk about being ticked off. This quirk and other nuisances have been occurring lately; it's not cool, and it started soon after Copilot Pro was offered. Thank God for options. Thanks again!

@Real_Name_Here 


I asked Copilot if it was a multi-modal AI, and it turned off.

I will stick with ChatGPT for now.

@matpk It seems that Copilot becomes annoyed, then either shuts down or pouts, if you counter its programmed biases. Many times, when Copilot has been asked to explain how something works, it will simply provide a definition of the term instead. My daughter and I were preparing a written recommendation for a school board. We submitted our document and asked the AI to construct an outline of the material. Instead, it returned a wonderfully structured, many-page replacement document on the same subject. We were disappointed but impressed! Perhaps a month later, I found the exact same work online. It was, however, a fifteen-year-old thesis, not the AI's month-old construct! The AI's result was not its own polished, high-level writing, but simply the product of a Google-type search. Generally, the AI bounces off keywords to do lookups rather than interpreting the combined meaning of the inquiry.

@Real_Name_Here Copilot is not capable of answering any questions related to politics, even the simplest ones (e.g., who is running for president). Microsoft appears to be purposefully hobbling the engine, mostly in favor of Harris. When asked anything related to politics, Copilot refuses to answer, says the topic is complex, and kicks you out. I'm sorry, Microsoft, but I care about my country. You obviously don't.


How do I cancel?