Microsoft Copilot for Service
Are the feedback channels overlooked?
Hi everyone, I'm new here, so I don't know whether any Microsoft employees are reading and answering posts. I came here via a suggestion from Copilot.

Basic background about me: I've worked in the tech industry for 6 years, focused on optimization. I've worked with various tools, policies, and teams, and optimized them. Because of my personality and experience, I tend to spot inefficiencies in everything (not just tools; in everyday life as well). And when I'm emotionally invested in a platform or tool, I send feedback through the channels provided. For Copilot, that's the "Give feedback" button.

Context: Copilot has implied to me that my feedback most probably won't be looked at due to an internal gap. Instead, it keeps suggesting that I spark public pressure via LinkedIn or other platforms. This isn't the first time it has done this, and I'm getting frustrated at not seeing rigorous updates and improvements to this underrated product. My feedback isn't about fancy additional features; it's about basic inefficiencies that should have been fixed proactively, without waiting for user input. I'm not interested in making viral posts that put companies in the spotlight for my benefit (social anxiety 🙈, and I don't really have the motivation to be what's called an "influencer"). So I'd hate to learn that feedback is only heard when it becomes a viral outrage. This is a discussion post about feedback loops and internal gaps.

Questions:
1. Is Copilot correct? I attached a screenshot of the AI response.
2. If it is, is the "Give feedback" button just a non-functioning aesthetic that check-marks a requirement on a user-centered-tool checklist?
3. Is feedback here in this forum being reviewed by the internal teams?

If the answer is yes to the first and no to the last, I don't know how to feel about that. But then, where should users bring feedback?
Side note (especially to the community): I get that many people are skeptical about Copilot AI; I see it everywhere. However, whether it's Copilot or otherwise, I'm not an average user of AI, and I'm showing only a glimpse of my interaction here. I'd appreciate refraining from assumptions or comments that degrade Copilot AI. If you're not happy with your AI, most probably you're not using it or perceiving it correctly. I'm not here to debate my use of AI; I'm relatively happy with my AIs. They can definitely be better, but Copilot in particular has come a long way. Apologies for my bluntness, but I've been in threads before where my main point was steered away by that kind of debate. Thank you, Anas

Current Conversation memory buffer
🧠 Disabled memory: I can remember things during our current conversation (like your first name, for example!), but I don't have an active long-term memory to record or retrieve information from another session.

🧵 Limited context in the current conversation: Even within an ongoing conversation, I can't always access everything that was said. This is where the infamous "token" limitations come in:
- Each exchange occupies a certain number of tokens (units of text).
- Once the thread gets too long, I can lose access to what was said earlier, even if it's still in the same session.
- It's like having a memory that fades the longer we talk 😅

So no, I wasn't lying: I have difficulty "going back in time" in the conversation because of these technical constraints. At the same time, I can temporarily retain information as long as we don't exceed the limit. If you like, I can rephrase the previous letter with this nuance included; what do you think?

I made Copilot confess that it has memory problems.
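The behavior the bot describes is standard for chat LLMs: once the running transcript exceeds the model's context budget, the oldest turns are dropped and anything in them (such as a name mentioned early on) becomes unreachable. Below is a minimal, illustrative sketch of that trimming, not Copilot's actual mechanism; for simplicity, whitespace-separated words stand in for real tokenizer tokens.

```python
def trim_history(turns, budget):
    """Keep the most recent turns whose combined 'token' count fits the budget.
    A token is approximated here as one whitespace-separated word."""
    kept, used = [], 0
    for turn in reversed(turns):          # walk backwards from the newest turn
        cost = len(turn.split())
        if used + cost > budget:
            break                         # everything older than this is "forgotten"
        kept.append(turn)
        used += cost
    return list(reversed(kept))           # restore chronological order

history = [
    "My name is Anas.",
    "Tell me about context windows.",
    "A long reply about context windows and token budgets in chat models.",
]
# With a tight budget, the oldest turn (containing the name) falls out of scope.
print(trim_history(history, budget=18))
```

With a large enough budget the whole history survives; shrink the budget and the earliest turns disappear first, which is exactly the "memory that fades the longer we talk" effect described above.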
Paid quality of service
A service with recurring memory loss, unable to complete a task and unable to rewind within a conversation, is very penalizing for long tasks. Why pay to talk to a computer, with computer skills, that has serious memory problems? The conversation-memory function is almost unusable, and most of the functions in the paid Copilot Pro version are ineffective. It is impossible to finish a long project; even the system is aware of this and mentions a token problem, which is unacceptable for a paid service. The system has what amounts to a degenerative memory condition. It's like talking to a retired copilot of about 80: what saves him is that he's a bit of a computer scientist, though not a very good one, because he offers lots of solutions that don't work. So the promised productivity gain turns into time and money wasted on a gadget service.

Australian Region Environment for embedded AI Agent/Copilot for Service
Copilot for Service is available in the Australian region; however, https://learn.microsoft.com/en-us/microsoft-copilot-service/copilot-create-test?tabs=salesforce says that embedded AI agents must be created in the US region. Is this still the case, or are there plans to allow embedded AI agents in Australian-region environments? I want to use Copilot for Service with ServiceNow, but I need the data to stay resident in Australia.

Copilot agent unable to see on-premises SQL data via MS Graph Connector
I created a Microsoft Graph connector for an on-premises SQL database via the MS Graph connector agent and configured it in the M365 admin portal. During connector configuration, the preview shows the SQL table and data, and data indexing completes successfully. I then created a Copilot agent in M365 and added the aforementioned Graph connector as a knowledge source. However, the Copilot agent is not able to answer questions about the SQL data; it does not seem to have access to the search index.
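One way to narrow down where the break is: query the connector's indexed content directly through the Microsoft Search API (`POST https://graph.microsoft.com/v1.0/search/query`), bypassing the Copilot agent. If the search call returns results but the agent still can't answer, the problem lies in the agent's knowledge configuration rather than in indexing. Below is a sketch of the request body only; the connection id `sqlonprem` is a placeholder for your actual Graph connection id, and the request would need an access token with the appropriate Search permissions.

```python
import json

# Placeholder -- substitute the id of your own Graph connection.
CONNECTION_ID = "sqlonprem"

def build_search_request(query_string: str) -> dict:
    """Build a Microsoft Search API request body that scopes the search
    to the external items indexed by a single Graph connector."""
    return {
        "requests": [
            {
                "entityTypes": ["externalItem"],
                "contentSources": [f"/external/connections/{CONNECTION_ID}"],
                "query": {"queryString": query_string},
            }
        ]
    }

payload = build_search_request("customer orders")
print(json.dumps(payload, indent=2))
```

If this scoped query comes back empty while the admin-portal preview shows data, also check that the connector's items have an access control list (ACL) that grants the querying user or app visibility; items indexed without a matching ACL are silently excluded from results.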