Stress Testing Copilot (Preview)


While testing conversational responses across different topics with Copilot, I came across an unusual string of characters attached to the end of the AI's response. I'm rusty with software development syntax, but it was obvious the output was unintended: a bug. I know this isn't the appropriate place to report or track bugs, but I'd like to start a discussion around this issue.

Some background: I'm a former QA Analyst, and I was stress testing Copilot across various conversational topics when I encountered what I can only conclude is a bug. Any thoughts?



[Screenshot: Copilot's response ending with the stray "<|im_end|>" string]


I tested further to try to reproduce it. The behavior didn't seem tied to the complexity of my questions or how I phrased them. So far, whenever I ask anything about this "<|im_end|>" string in a conversation, Copilot apologizes and then, without further explanation, terminates the conversation, forcing me (or an end user) to start a new one, even if the conversation has only just started.
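For context, "<|im_end|>" matches the end-of-message delimiter from the ChatML format used by OpenAI-style chat models, so its appearance in visible output presumably means the serving layer failed to strip a special token before display. Purely as an illustration (this is not Copilot's actual pipeline), a minimal client-side filter for leaked delimiters of this kind might look like the following sketch:

```python
import re

# ChatML-style special tokens such as "<|im_start|>" and "<|im_end|>"
# delimit messages internally and should never reach the user.
# Hypothetical example; pattern covers only the two common delimiters.
SPECIAL_TOKEN_PATTERN = re.compile(r"<\|im_(?:start|end)\|>")

def sanitize_chunk(chunk: str) -> str:
    """Strip leaked ChatML delimiter tokens from a chunk of model output."""
    return SPECIAL_TOKEN_PATTERN.sub("", chunk)

if __name__ == "__main__":
    leaked = "Certainly! Here is the answer you asked for.<|im_end|>"
    print(sanitize_chunk(leaked))
    # -> "Certainly! Here is the answer you asked for."
```

One caveat: in a streaming UI a delimiter can be split across chunks, so a real filter would need to buffer partial matches rather than operate chunk by chunk.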


I'll continue testing to see if I can reproduce results similar to this. Any helpful input on this matter would be greatly appreciated.


1 Reply
If I didn't know any better, I'd say it was dodging my question.