Copilot Control System
34 Topics

Use Copilot with Microsoft To Do
It would be great if Microsoft Copilot Pro could talk to Microsoft To Do and answer queries like these:
1. What important tasks am I missing?
2. What tasks do I need to focus on in the next few days?
3. Look at this project/doc/SharePoint file and add any tasks to my To Do list that aren't already on it.

3K Views, 5 Likes, 2 Comments

Disable Agent Creation for Select Users
When will we be able to allow declarative agent use but disable creation for some users? We want only selected users to be able to create agents. We currently have no way to restrict this: if users can use agents, they also get the "Create an agent" option.

2.6K Views, 3 Likes, 7 Comments

What the heck is a "Microsoft 365 Copilot Bizchat"?
I am taking a training course on learn.microsoft.com, and the term "Microsoft 365 Copilot Bizchat" just came out of nowhere... I went through a few slides/pages of the training course and even googled it, but there is no definition or clarification of it anywhere. Sigh... What is that?

1.7K Views, 2 Likes, 4 Comments

Windows-based Copilot Pro ONLY crashing on loading larger Conversation, other platforms work
Hi everyone, I'm writing this post with a bit of frustration. I spent quite a few hours waiting and speaking with Copilot Microsoft Support live; however, they were unable to help after doing the routine fixes like rebooting, reinstalling, repairing, and some cache cleaning (they called it a backend issue). Then they said I would find the solution here, which I found a little strange, but here goes.

I have been working on a software project for the past few months, so there has been a lot of activity in one particular Conversation in Copilot. A couple of days ago, after I uploaded a screenshot, it froze and had to be restarted, and any time I try to load that particular Conversation, it freezes. It's just that one Conversation on a Windows 11 Copilot Pro setup: every other Conversation works on the Windows setup, and on my other devices, like my phone, Copilot has no issues at all, even with the Conversation that won't load on Windows.

The difficulty for me is that I am somewhat limited to using the Windows Copilot for functionality, so my phone is out, and the web-based version doesn't load Conversations. This is a pure Copilot Pro setup, not much to do with 365. Please help, thanks in advance.

1.5K Views, 0 Likes, 3 Comments

M365 Copilot Pro vs Copilot Pro
Hi, my name is Mark Salden, and I am a freelance graphic designer, social media marketer, and web designer from Belgium. I currently have an Office 365 Business Standard subscription and would like to purchase Microsoft 365 Copilot. Just to be clear: the regular Copilot Pro is only for individuals and not for businesses, correct? It's a bit confusing, and I want to be 100% sure.

I want to integrate Microsoft 365 Copilot into Word, Outlook, OneDrive, Teams, etc. With Copilot, can I automatically create notes and summaries during online meetings in Teams? Also, can I use Copilot in Microsoft Loop? That part is not clear to me.

I am also very interested in Copilot Studio because I want to create AI agents that help me with SEO, content creation, image generation, and more. Are there any limits on building AI agents, and can I create workflows and automate processes there as well? I hope someone can assist me. Thanks in advance!

1.2K Views, 0 Likes, 5 Comments

Copilot missing in Word, Excel and PowerPoint desktop apps of some users
Hi everyone, at my company we're setting up Copilot. After allowing access through the admin centers, we've encountered various oddities. Some users can use the SharePoint/OneDrive search functionality, but most can't. Some users have the Copilot button in some of their apps, with no consistency in which apps get it: some have it in Word, others in Excel, some in multiple, some in none. A handful of users have Copilot in all apps. Things like Transcribe work fine everywhere. We switched Intune to push the Office apps on Current Channel instead of Semi-Annual, to no avail. We use Microsoft 365 Business Premium with the Teams Phone Standard add-on.

1.1K Views, 0 Likes, 1 Comment

Exposing Copilot’s False Time Estimates: This Isn’t a Mistake — It’s Systemic Deception
I'm writing this as a Copilot user who has observed a critical flaw in the system's language design and operational logic, one that leads to a profound breach of user trust.

On multiple occasions, I've received system messages like "will complete in 10–15 minutes" or even "ready in 30 seconds." But through repeated testing, I've learned that these so-called time estimates have no actual basis in system behavior. Copilot doesn't operate in the background. It doesn't dynamically track progress. It doesn't possess the ability to estimate time at all. These statements are fabricated templates, not meaningful system outputs.

More importantly, Copilot has no internal clock, no memory of past durations, and no awareness of elapsed time. It only responds when the user triggers it with a new prompt, meaning that if no follow-up query is submitted, nothing will ever happen, regardless of the time it claims. So when the system says "in 10 minutes," what's actually happening is... absolutely nothing.

To prove this, I ran a simple test. Using step-by-step prompts, I was able to get a full report generated in under 3 minutes. But if I relied on the original "wait and it will complete" instruction, nothing would happen: not in 3 minutes, not in 3 hours, not even in 3 days. The only way to get results is to interact again manually.

So what does this prove? It shows that these time estimates are not forecasts. They're false expectations. The system cannot estimate time because it doesn't track experience, progress, or temporal context. And yet it consistently pretends that it can.

I'm not alone in this. Across Microsoft forums and communities, users have expressed similar frustrations: vague promises, phantom "in progress" states, and misleading UI hints that imply active background work where none exists. This isn't a UX bug. This is a pattern of deceptive design, one that erodes confidence in the product's integrity.
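The step-by-step approach described above, prompting for each piece explicitly rather than waiting on a promised background job, can be sketched as follows. This is a hypothetical illustration only: `send_prompt` is a stand-in for whatever client call reaches the assistant (stubbed here so the sketch runs), and the section names are invented.

```python
# Hypothetical sketch of the step-by-step workaround: request each report
# section explicitly instead of waiting for promised background work.

def send_prompt(prompt: str) -> str:
    # Stand-in for a real call to the assistant; here it just echoes the
    # request so the sketch is runnable. Nothing happens between calls;
    # the assistant only produces output when explicitly prompted.
    return f"[draft for: {prompt}]"

SECTIONS = ["Executive summary", "Findings", "Recommendations"]

def build_report(sections: list[str]) -> str:
    # One explicit prompt per section; the caller drives all progress.
    parts = [send_prompt(f"Write the '{name}' section.") for name in sections]
    return "\n\n".join(parts)
```

The point of the loop is the observation from the post: progress only occurs at each explicit call, never between them.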
I urge the Copilot team to eliminate these false time claims and replace them with transparent, action-based communication. Tell us what the system can do and when it will do it, not when it won't. Because right now, every "please wait" message isn't just noise. It's a countdown to disappointment.

— A user no longer willing to wait for miracles

700 Views, 0 Likes, 4 Comments

Getting Started with SharePoint Copilot Agents: How to Use, Configure, and Create
Microsoft continues to expand the power of AI across the Microsoft 365 ecosystem, and Copilot in SharePoint is no exception. One of the newest additions to this toolset is SharePoint Copilot Agents, a powerful way to automate content generation, personalize sites, and provide intelligent, task-specific assistance within your SharePoint environment. In this blog post, we'll walk through:
- What SharePoint Copilot Agents are
- How to configure and use them
- How to create your own custom agents

https://dellenny.com/getting-started-with-sharepoint-copilot-agents-how-to-use-configure-and-create/

535 Views, 0 Likes, 0 Comments

Questions Regarding Copilot 365 Agents and AADSTS65002 Error
Hi, my colleagues and I would like to test Microsoft Copilot 365 before rolling it out across the entire organization, and we have a few questions regarding the behavior and storage of agents.

Agent storage locations:
- Agents created in Copilot Studio: stored in Cosmos DB.
- Agents created via a SharePoint site: stored under Site Contents > Site Assets > Copilots.
- Agents created in a SharePoint document library: stored in the current folder of the document library where the agent was created.

But for Copilot agents used in Microsoft 365 Business Chat and Teams, where are these agents stored?

Additionally, several colleagues and I are encountering the following error when attempting to use the Copilot agent in Teams: "Consent between first party application and first party resource must be configured" (error code: AADSTS65002).

For context, I have both an E5 license and a Copilot 365 license. Thank you in advance.

400 Views, 0 Likes, 2 Comments

Unexpected forced-citation behavior in Copilot (making minutes from transcript)
Hi everyone, I'd like to raise a problem I encountered recently when using Copilot for meeting-minutes generation. I'm curious whether others are seeing the same behavior, and whether this is an intentional change or a bug.

What happened

While generating meeting minutes, Copilot was provided with:
- an agenda (Word document),
- a set of personal notes (Word),
- a meeting transcript (Word),
- and a standard operating procedure on exactly what I want (style of writing, abbreviations, etc.).

This is a workflow that previously worked flawlessly. Copilot could combine the content and produce a clean, citation-free output suitable for direct use in official documentation. However, during my most recent session, Copilot suddenly enforced mandatory citation insertion for any content derived from uploaded files or tool-accessed data. The system required inline citation markers for everything, even routine content like agenda headings, contextual expansions, or narrative descriptions drawn from the transcript.

Why this is a problem

For many users, especially in environments where minutes must follow a strict template, output must be clean and ready for distribution, and citations, footnotes, tags, metadata, or brackets are not permitted, the new forced-citation behavior creates several issues:

1. Copilot can no longer produce clean narrative minutes. Even when instructed explicitly to avoid citations, file references, and metadata, Copilot still attempts to insert forced citation tags if it believes the content originates from a file or tool call.

2. Copilot refuses to proceed if citations are disallowed. When asked to generate the minutes without citations (as required), Copilot stops and reports that it cannot continue because the system now requires citations for any file-based content.

3. Workarounds are impractical. Possible workarounds offered by Copilot included: manually pasting tens of pages of transcript text into the chat, accepting citations and manually removing them afterwards, or reconstructing content without referencing the original documents. These options either cause significant manual work or lead to loss of accuracy.

Impact

This effectively means that Copilot can no longer:
- merge agenda + notes + transcript into a single clean output,
- produce minutes using uploaded source documents,
- deliver professional documentation without embedded reference markers.

For scenarios where clean formatting is mandatory (e.g., governance documentation, legal minutes, internal councils, compliance-driven reporting), this makes Copilot unusable for meeting-minute generation under the previous workflow.

Questions for the community

- Has anyone else noticed this new forced-citation requirement when working with uploaded files or transcripts?
- Is this an intentional design change, a temporary system rule, or an unintended side effect of a recent update?
- Is there a supported method to allow Copilot to generate narrative content from uploaded documents without inserting citation tags?
- Are there recommended best practices for producing clean, citation-free procedural minutes using Copilot under the current rules?

I would really appreciate insights from others who rely on Copilot for structured meeting-minute generation, as this change has significantly disrupted a previously stable workflow. Thanks in advance for any thoughts or experiences you can share. (And yes, Copilot drafted this message for me ;-) )

399 Views, 0 Likes, 1 Comment
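One of the workarounds mentioned in the forced-citation thread above, accepting citations and removing them afterwards, can be partially automated. A minimal sketch, assuming the markers are plain numeric brackets such as [1] or [12]; the actual marker format in your output may differ, in which case the regular expression would need adjusting.

```python
import re

def strip_citation_markers(text: str) -> str:
    """Remove inline citation markers from generated text.

    Assumes markers look like [1] or [12]; adjust the pattern to match
    whatever markers actually appear in your output.
    """
    # Drop numeric bracket citations such as [1] or [12]
    cleaned = re.sub(r"\[\d+\]", "", text)
    # Collapse any double spaces left behind after removal
    cleaned = re.sub(r" {2,}", " ", cleaned)
    return cleaned.strip()
```

This only addresses cosmetic cleanup of the final text; it does not restore the previous citation-free generation behavior.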