Apr 11 2024 01:00 AM - edited Apr 12 2024 04:53 AM
While testing conversational responses across different topics with Copilot, I came across an unusual string of characters appended to the end of the AI's response. I'm very rusty with software development syntax, but it was obvious that this was unintended output, in other words a bug. I know this isn't the appropriate place to report or track bugs and issues, but I'd like to start a discussion around it.
Some background information: I'm a former QA Analyst and was stress testing the Copilot AI across various conversational topics when I encountered what I can only conclude is a bug. Any thoughts?
I tested further to try to reproduce the issue. Neither the complexity of my questions nor how I phrased them made a difference. So far, whenever I ask anything about this "<|\im_end|>" string within a conversation, Copilot apologizes and then, without any further explanation, terminates the conversation, forcing me (or an end user) to start a new one, even if the conversation has only just begun.
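If I understand correctly (and I'm happy to be corrected by someone less rusty than me), this string looks like a special delimiter token from the ChatML message format used by OpenAI-based models, which would explain why it leaking into a reply is unintended. Here's a minimal sketch of how I understand such tokens are supposed to work; this assumes Copilot serializes conversations ChatML-style, and the function names are purely my own illustration, not anything from Copilot itself:

```python
# Hypothetical sketch of ChatML-style message delimiting.
# The <|im_start|>/<|im_end|> token names come from the publicly
# documented ChatML format; everything else here is an assumption.

IM_START = "<|im_start|>"
IM_END = "<|im_end|>"

def format_chatml(messages):
    """Serialize a list of {role, content} dicts into a ChatML prompt string."""
    parts = []
    for m in messages:
        parts.append(f"{IM_START}{m['role']}\n{m['content']}{IM_END}")
    # The model is expected to end its reply with IM_END; the serving
    # layer normally treats that as a stop token and strips it out.
    parts.append(f"{IM_START}assistant\n")
    return "\n".join(parts)

def strip_stop_token(raw_output: str) -> str:
    """What the serving layer presumably does. If this step is skipped,
    or the token comes back slightly mangled (e.g. "<|\\im_end|>"),
    the marker leaks into the text the user sees."""
    return raw_output.split(IM_END, 1)[0].rstrip()

print(format_chatml([{"role": "user", "content": "Hello"}]))
print(strip_stop_token("Hi there!<|im_end|>"))  # -> "Hi there!"
```

If that reading is right, seeing the token in the response would mean the post-processing step failed to recognize or strip it, which might also explain why mentioning the token mid-conversation confuses the model into ending the session.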
I'll continue testing to see if I can reproduce similar results, and please, any helpful input on this matter would be greatly appreciated.