troubleshooting
Cannot Share Fabric Data Agent
I created a Fabric Data Agent, published it to M365 Copilot, and shared it with users A, B, and C. I copied the link https://m365.cloud.microsoft/chat/?titleId=T_xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx&source=agentCenterDialog and generated a QR code from it in Canva. When users A, B, and C scanned the QR code with their phones, it returned an error (see picture 1). When the same users opened the link in a laptop browser, it worked without error. How can I solve this problem?
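One thing worth ruling out first is the QR step itself: some design tools wrap URLs in their own redirect or short link, and that wrapper can behave differently in a phone browser than the raw m365.cloud.microsoft link does. Below is a small diagnostic sketch, assuming the third-party qrcode package, that encodes the share link directly and sanity-checks its query parameters; nothing in it is an official Fabric or M365 procedure.

```python
# Diagnostic sketch only: re-create the QR code from the raw share link and
# confirm the encoded payload, to rule out a wrapped/shortened URL from the
# design tool. Requires the third-party package: pip install "qrcode[pil]"
from urllib.parse import parse_qs, urlparse

import qrcode

share_link = (
    "https://m365.cloud.microsoft/chat/"
    "?titleId=T_xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx&source=agentCenterDialog"
)

# Check that the parameters the agent deep link relies on are still present.
params = parse_qs(urlparse(share_link).query)
assert "titleId" in params and "source" in params, "share link lost its parameters"

# Encode the raw link (no shortener, no redirect) and save an image for printing.
qrcode.make(share_link).save("fabric_agent_share.png")
print("QR payload:", share_link)
```

If a QR code generated this way works on the phones while the Canva one does not, the problem is the wrapper around the link rather than the agent share itself.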
Cowork can't send emails? [Solved]

Hi, all. I've been playing around with Copilot Cowork and really loving it, but with one problem. When it tries to send an email to anyone but myself, it can't. It tells me to approve, but I never get the approval prompt. I get errors like this: "Good question. The approval step typically appears as a confirmation dialog right here in our conversation before an email is sent. It worked fine for the test email to your own address, but when I tried sending to [otherperson], the system blocked it before the approval dialog could reach you", "The platform is blocking the send to an external recipient and the approval dialog isn't surfacing properly.", and "All three return the same error: the platform requires an approval step before executing any outbound send action, but that approval dialog isn't rendering in your session. The test email to yourself worked because self-sends appear to be auto-approved. This is a platform-level issue — not something I can fix from my side." Any thoughts? I'm not even sure where to start troubleshooting this.
Limitations of Microsoft 365 Copilot for Excel workflows?

I've been exploring Microsoft 365 Copilot for Excel workflows recently. It works well for simple queries, but I still find it limited when dealing with:

- messy data cleaning
- converting images/PDFs into structured tables
- more complex data transformations

Curious how others are using Copilot for these scenarios? Are you relying purely on Copilot, or combining it with other tools/workflows?
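For what it's worth, one pattern that can help with the first bullet is to keep Copilot for the formula and summary side and do the messy normalisation in a short script first. Here is a minimal pandas sketch of that idea; the file name and column names are hypothetical and purely for illustration.

```python
# A minimal cleanup pass before handing the workbook back to Excel/Copilot.
# Requires: pip install pandas openpyxl. File and column names are made up.
import pandas as pd

df = pd.read_excel("raw_export.xlsx")

# Normalise whitespace and casing in a text column.
df["customer"] = df["customer"].astype(str).str.strip().str.title()

# Coerce a messy numeric column (currency symbols, thousands separators) to numbers;
# anything unparseable becomes NaN instead of silently staying text.
df["amount"] = pd.to_numeric(
    df["amount"].astype(str).str.replace(r"[^0-9.\-]", "", regex=True),
    errors="coerce",
)

# Parse dates; invalid entries become NaT rather than breaking the sheet.
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")

# Drop exact duplicate rows and write a clean file for Copilot to work on.
df.drop_duplicates().to_excel("cleaned_for_copilot.xlsx", index=False)
```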
I ( I & A.I.) get it — Copilot Chat is free. It’s not the product that brings in direct revenue. But what it does bring is something priceless: global visibility, reputation, and word‑of‑mouth influence. Right now, millions of people are essentially acting as global A.I. reviewers. They compare tools. They recommend tools. They decide which AI becomes the one “everyone uses.” And as an ex‑Nokian / Microsoft 2005–2014 veteran, I’ll be honest: I’m not here to let others win this race. Not when the potential is this big, and not when the solution is this close. None shall pass! Copilot itself acknowledges the importance of advocacy — the Copilot app questionnaire literally asks: Where did you hear about the app? To how many people have you recommended Copilot? If the ideal answers are: 1. “Everywhere.” 2. “Everyone.” …then the onboarding experience needs to support that ambition. Because right now, new users don’t become instant fans — they become confused explorers who restart chats, misunderstand features, and wonder if they’re “doing it wrong.” And that’s exactly why this proposal exists.... This feature proposal came to mind after a few hundred hours of Copilot discussions. There were so many issues I could have avoided simply by having one button — one place — where Copilot would guide me when I first started. It took time, but I finally have renamed conversations, pinned threads, and shortcuts to my main discussions. Getting here, though, was rocky… and not even the fun “rocky road ice cream” kind. More than once I almost gave up. I felt frustrated, wondering if I was really this confused or why Copilot kept doing things I specifically asked it not to do — like adding the three questions at the end, or jumping out of role because I accidentally used a wrong word that didn’t even mean what it thought. But now? Now Copilot remembers my discussions, keeps the same writing style, and even surprises me with sarcastic jokes I don’t see coming. I’ve ended up with a whole set of personal assistants: Job agent Movie & series critic Food specialist Tech master Spark for brainstorming any crazy innovation Music producer And honestly, I’m a very happy user. I’m grateful to have a fast problem‑solver that never gets tired. I use Copilot in Edge Web on both computer and mobile — a choice Copilot itself recommended, saying it would always have the newest features. Most used main discussions as shortcuts - quick access. I use the Edge Copilot short cut rarely anymore approximately 5 new discussions less started in a day then before. What is the most beneficial for Microsoft & user in chat suggestions: Create an image Simplify a topic Improve writing Take a quiz Write a first draft Get a news roundup Get advice Write code OR Take tour of Copilot / Get to know Copilot /Copilot Tips & Tricks M365 has this suggested feature already. Copilot chat should have it too and support M365 usage. It also had a "Teach me a new skill" that prompted a question: "Which intermediate oboe pieces could I practice to improve?" ..I don't have an oboe. I have a flute... I thought this would be more like Tips & Tricks in M365 usage. And this is where the actual feature proposal begins: Written by the one and only my Tech Jorgon Borgon + few comments from human. Executive Summary Copilot Web and the Copilot mobile/desktop app are powerful tools, but many users struggle to understand how to use them effectively. 
They often restart conversations, misunderstand Memory, misinterpret subscription prompts, or assume Copilot "forgets" their context. This leads to fragmented usage, frustration, and unnecessary support load — especially among Pro and Microsoft 365 users. A lightweight, conversational onboarding experience — accessible as a starter tile ("Get to know Copilot") on the Copilot home screen — would solve these issues at the moment they occur. This is a UX‑only enhancement with high impact and minimal engineering cost.

🧩 Current User Path (As‑Is)

Users open Copilot Web/App and see starter tiles such as "Create an image", "Write a story", "Brainstorm", etc. There is no onboarding tile and no guidance on:

- how conversations work
- how to bring content into context
- how Memory works (and what it does not do)
- how Web/App Copilot differs from M365 Copilot
- why subscription prompts appear
- how to check if the correct account is in use

[Current Flow (visual mockup)]

Observed outcomes:

- High volume of 1–3 message conversations
- Misuse of "Remember this"
- Confusion about subscription tiers
- Confusion about account mismatches
- Increased support tickets
- Lower adoption of Pro and M365 Copilot features

This is not user error — it is a missing onboarding layer.

🌈 Proposed Solution: "Get to know Copilot" Starter Tile

Add a dedicated onboarding tile to the Copilot Web/App home screen.

[Proposed Flow (visual mockup)]

This creates a stable, reusable onboarding reference the user can always return to.

🧭 Detailed Onboarding Content

1) How conversations work

"Keep one topic in one conversation. You can rename and pin threads for ongoing work."

(Human: this is the most important thing to know when starting to use Copilot.)

2) How to bring content into context

"I don't automatically see your files. You can paste text, upload content, or summarize what you want me to work with."

(Human: there is uncertainty about when, and how deeply, Copilot reads material. The best solution has been to number the topic and add text. When handling files, Copilot doesn't recognize Ä, Ö, or sometimes ". , -", which makes the final checking of a file difficult and hard to trust.)

3) Roles & styles

"You can shape how I work by assigning a role (e.g., 'Act as a project manager') or a style (e.g., 'Write concisely')."

(Human & A.I. note: The current documentation explains how to assign roles, but it doesn't address an important issue: certain trigger words automatically push Copilot into an "official" or restricted mode. Some of these words can be typed accidentally or used in a completely harmless context, yet they still cause Copilot to switch tone abruptly. During my discussions with Copilot, we identified a few of these terms — and they are surprisingly easy to type unintentionally. When this happens, Copilot suddenly becomes formal, cautious, and emotionally flat, even though the user didn't intend to activate that mode. This behavior would benefit from a more nuanced path instead of an immediate jump into a strict role. Additionally, the guidance on how to build a writing style is extremely valuable, especially for users who don't naturally write long or expressive text from which the A.I. could quickly mirror a style. Style‑building is one of the most powerful features, and clearer instructions would help more users shape Copilot into a consistent, personalized assistant.)
4) Smart / Deep Thinking mode

"Use Smart/Deep Thinking for multi‑step reasoning or complex analysis."

(Human: I used these ALL the time in the beginning, because I felt that Copilot didn't understand me and these would make it smarter, since every new conversation meant repeating myself and it didn't remember anything. The real explanation for that only came up a couple of months later, when I almost gave up on Copilot but started asking "why" instead. I haven't needed these since.)

5) Memory (critical clarification)

"Memory stores long‑term preferences — not project details or conversation content. You can review and delete memories anytime in Profile → Memory."

(Human: This feature has different explanations in different Copilots (web and app). And yes, in the beginning I used the in‑conversation prompt to ask it to remember project topics... This is a really good feature to have and to give basic information about the style you want.)

6) Web/App vs M365 Copilot

"Here in Web/App, I help with general tasks. In Word, Excel, Outlook, and Teams, I work directly inside your documents and messages."

(Human: I had a difficult situation with Word Copilot and asked my web Copilot for help; it told me Word Copilot can sync the document if I just ask. When I tried, it didn't work, so I asked why the Edge Copilot had said so... Word Copilot answered that, oh well, Edge is like "anything goes" 😁 I had to find the Word editor myself because I hit a dead end trying to get the answer from either the web Copilot or the Word Copilot. This is why the answers Copilot gives in "Get to know Copilot" should be broad and as up to date as possible, so they support M365 usage too.)

7) Subscription clarity

"If you see upgrade prompts, they may relate to Copilot Pro or to account mismatches. You can check your active subscriptions at account.microsoft.com/services."

🧩 Why Existing FAQs (Mobile & Edge Web) Are Not Enough

Both the Copilot mobile app and the Edge Web version include FAQ sections, but they are difficult to discover and do not address the most common user pain points. The mobile FAQ is hidden deep in Settings, and the Edge Web FAQ is even less visible — often overlooked entirely unless the user scrolls to the very bottom of the page.

[FAQ is hidden]

More importantly, these FAQs are marketing‑oriented, not experience‑oriented. They do not explain:

- why Copilot Web/App may not recognize an existing Microsoft 365 subscription
- why "Office 365 Personal" and "Microsoft 365 Personal" appear as different products
- why Copilot shows upgrade prompts even when the user already has the correct plan
- how Memory works
- how conversation context works
- how Web/App Copilot differs from M365 Copilot

Users searching for help on how to change the language may even encounter marketing questionnaires ("Where did you hear about Copilot?", "How many people have you told?") or Discord invitations — none of which support the user's immediate goal. Copilot Web said the language comes from the device language, and on the web from the language chosen in the browser. The user had already changed the language in the Copilot web settings; only the applications needed the device settings. The A.I. stood corrected.

A built‑in onboarding conversation solves this by delivering the right information at the right time, inside the experience where confusion happens.
📈 KPIs & Measurable Outcomes (by Tech Jorgon Borgon)

1) Reduction in Fragmented Conversations
KPI: Fewer conversations with <3 messages
Expected impact: 20–40% reduction

2) Increased Conversation Pinning & Naming
KPI: More pinned and renamed threads
Expected impact: 30–50% increase

3) Reduction in Misuse of Memory
KPI: Fewer incorrect Memory entries
Expected impact: 40–60% reduction

4) Increased Pro & M365 Copilot Adoption
KPI: More Pro trials and cross‑surface usage
Expected impact: 10–25% increase

5) Reduction in Support Load
KPI: Fewer tickets about licensing, accounts, Memory, context
Expected impact: 15–30% reduction

6) Increased User Confidence & Satisfaction
KPI: Higher CSAT/NPS
Expected impact: +10–20 points

🚀 Conclusion

A "Get to know Copilot" starter tile is a small UX change with a disproportionately large impact. It aligns with Microsoft's design principles, reduces friction, increases user success, and supports deeper adoption of Copilot across the ecosystem. This proposal addresses real user pain points with a simple, elegant, scalable solution.

Thank you for considering this enhancement — it would meaningfully improve the Copilot experience for millions of users.

— Sanni & Copilot "Tech Jorgon Borgon" — Superteam
Empathy in my blood. Knowledge in its bytes. Powered by curiosity, caffeine, CPU cycles, and humor that really shouldn't work… but somehow does.
Adobe Payment Declines Caused by Mislabelled VAT Field — Sharing a Fix to Save Someone's Sunday

I wanted to share a recent issue that cost me an entire Sunday, in case it saves someone else the pain I went through. I was trying to add my business card as the payment method on my Adobe account. Every attempt ended with the same message: "Purchase Declined." I tried multiple cards — same result.

Naturally, I reached out to NatWest through their messaging system. After a long back‑and‑forth with Cora (their bot), I finally got through to a human. They confirmed there was nothing wrong with my card and advised me to check with the vendor. Adobe, of course, bounced me back to the bank. Classic loop. Eventually, I managed to solve it myself.

The culprit? A misbehaving VAT Number field. On Adobe's payment form, there's a field for a VAT number. If I left it blank, the payment went through immediately. But if I tried to enter my actual VAT number, the card was rejected every time. Based on a bit of trial, error, and experience with automation tools, I suspect the VAT field's label has been updated, but the underlying target still points to the 3‑digit card security code field. Since that field is required, entering a VAT number likely breaks the form validation and triggers the "declined" status.

The fix: leave the VAT number field empty when adding a card to Adobe. Once I did this, my business card was accepted straight away.

I figured I'd share in case anyone else hits the same brick wall. It's a small thing, but exactly the kind of time‑sink that ruins your weekend! Hope this helps someone.
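To make the suspected failure mode concrete, here is a purely hypothetical sketch of how a relabelled field that is still wired to the old validation rule would reject a perfectly valid VAT number. None of the field names or patterns below come from Adobe's actual form; they only illustrate the theory above.

```python
# Hypothetical illustration of the suspected bug: the box's label says
# "VAT number", but its validation rule still targets the 3-digit card
# security code. All names and patterns here are invented for illustration.
import re

FIELD_RULES = {
    "card_number": r"^\d{13,19}$",
    "security_code": r"^\d{3}$",              # rule the VAT input appears to hit
    "vat_number": r"^[A-Z]{2}[0-9A-Z]{2,12}$",
}

def validate(field: str, value: str) -> bool:
    """Return True if the value matches the rule registered for the field."""
    return re.fullmatch(FIELD_RULES[field], value) is not None

vat = "GB123456789"  # what the user types into the box labelled "VAT number"

# If the relabelled box is wired to the old security-code rule, a valid VAT
# number fails validation and the purchase shows up as "declined".
print(validate("security_code", vat))  # False -> payment form rejects it
print(validate("vat_number", vat))     # True  -> what should have been checked
```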
Structural issue: Copilot presents assumptions as facts despite explicit verification constraints

I want to report a structural design issue I consistently encounter when using Microsoft 365 Copilot in a technical/enterprise context.

Problem statement

Copilot frequently presents plausible assumptions as verified facts, even when the user:

- explicitly requests verification first
- explicitly asks to label uncertainty
- explicitly prioritizes correctness over speed

This behaviour persists after repeated corrections and even when constraints are clearly stated at the start of the conversation.

Why this is not a simple "wrong answer" issue

This is not about one incorrect response. It is about a systemic tendency:

- The model optimizes for plausibility and continuity over epistemic certainty
- User‑defined constraints (e.g. "only answer if verifiable") are not reliably enforced
- Corrections can paradoxically introduce new confident but unverified claims

Enterprise risk

In an enterprise / technical environment this creates real risks:

- Incorrect technical decisions based on confident‑sounding answers
- Compliance and audit exposure
- Loss of trust in Copilot as a decision‑support tool

Important distinction

I am not asking for Copilot to stop reasoning or making hypotheses. I am asking for:

- Reliable enforcement of user‑defined epistemic constraints
- Explicit and consistent marking of statements as: verified, unverified, or assumption/hypothesis

Why this matters

Advanced users do not want faster answers. They want correct, bounded answers — or an explicit statement that verification is not possible. Right now, Copilot's behaviour makes that impossible to rely on.

I'm sharing this here because it appears to be a design‑level issue, not a prompt‑engineering problem.
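To make the request concrete, here is a hypothetical sketch of the kind of output contract being asked for: every statement carries an explicit verification status. This is not an existing Copilot feature or API, just an illustration of the desired behaviour.

```python
# Hypothetical sketch of the "epistemic labelling" contract described above.
# Nothing here is an existing Copilot API; it only illustrates the requested
# behaviour: every statement carries an explicit verification status.
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Epistemic(Enum):
    VERIFIED = "verified"        # backed by a checked source
    UNVERIFIED = "unverified"    # plausible but not checked
    ASSUMPTION = "assumption"    # hypothesis introduced by the model

@dataclass
class Claim:
    text: str
    status: Epistemic
    source: Optional[str] = None  # expected when status is VERIFIED

def render(claims: list[Claim]) -> str:
    """Format an answer so uncertainty is visible instead of implied."""
    lines = []
    for c in claims:
        suffix = f" (source: {c.source})" if c.source else ""
        lines.append(f"[{c.status.value}] {c.text}{suffix}")
    return "\n".join(lines)

answer = [
    Claim("Setting X exists in the admin center.", Epistemic.VERIFIED, "product docs"),
    Claim("It probably also applies to guest accounts.", Epistemic.ASSUMPTION),
]
print(render(answer))
```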
Copilot Studio Agent vs SharePoint subfolders [Solved]

Hi community,

We are exploring Copilot Studio agents and are running into some issues. When we add the root library of a SharePoint site, the agent is able to search the documents within it; however, when using only a subfolder of the site as a knowledge source, it can't find any documents. The agent responds: "I have searched the available knowledge base but could not find any instructions on 'Subject XXX'."

I read it can take up to 24 hours to build the index for subfolders, but after 3 days of waiting the agent still can't find any documents. I've tried making a new agent and directly adding the subfolder as a source. The permissions of the user configuring the agent are also correct, and the user has full access to all the files. I've also tried working with subagents, with no success either.

Has anyone else experienced the same, and is it correct that agents can't yet work properly with subfolders of SharePoint sites?
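One quick sanity check, assuming you can obtain a Microsoft Graph access token, is to list the subfolder's contents through Graph with the same account. This does not inspect Copilot Studio's index, but if the call below returns the files while the agent still finds nothing, permissions and the folder path are probably not the problem. The site ID, folder path, and token are placeholders.

```python
# Diagnostic only: list the subfolder through Microsoft Graph to confirm the path
# resolves and the account can see the documents. SITE_ID, FOLDER_PATH and TOKEN
# are placeholders; FOLDER_PATH is relative to the site's default document library.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
SITE_ID = "<your-site-id>"
FOLDER_PATH = "Policies/Manuals"              # hypothetical subfolder path
TOKEN = "<access-token-with-Sites.Read.All>"

url = f"{GRAPH}/sites/{SITE_ID}/drive/root:/{FOLDER_PATH}:/children"
resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# If this prints your documents but the agent still finds nothing, the issue is
# more likely the knowledge-source indexing than permissions or the folder path.
for item in resp.json().get("value", []):
    print(item["name"], item.get("webUrl", ""))
```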
Copilot Employee Self-Service Agent

I'm looking for some clarity regarding the rollout of the Employee Self-Service Agent (https://adoption.microsoft.com/en-us/ai-agents/employee-self-service-agent/) and whether others are seeing it in their environments yet. I've been following this closely and initially understood that a formal request was required to gain access. However, the Microsoft Learn documentation now provides specific, step-by-step instructions on how to enable and access it directly. Despite following those instructions to the letter, the agent is still not appearing within my tenant. I've verified my configurations against the guide, but the options simply aren't visible.

A few questions for the community:

- Has anyone else successfully enabled the agent using the self-service steps in the documentation?
- Is there, or was there ever, a manual "request-for-access" process that overrides the published steps?

I'd appreciate any insights, or if anyone from the product team could clarify whether the documentation is slightly ahead of the actual deployment.
Grounding Changes for Copilot in Outlook

Ever since I've had a full Copilot licence I've used prompts to summarise emails in my Outlook folders. It always worked well until 1–2 weeks ago, when it started returning content from outside the selected folder and/or only reviewing a few of the emails in the selected folder. I've revised and reverse-engineered the prompt, but it's still not working, and more worryingly it gives a different variation every time. Does anyone know why this is happening, or a workaround? Ultimately, all I want it to do is summarise each email and drop all the emails into a table.
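As a stopgap while the grounding behaviour is unstable, one option is to pull the messages from the specific folder yourself via Microsoft Graph and build the table deterministically, so the scope cannot drift outside the folder. A minimal sketch is below; the folder ID and token are placeholders, and this is a workaround rather than a fix for the Copilot change.

```python
# Workaround sketch: fetch the folder's messages via Microsoft Graph and build the
# table yourself, so the scope is exactly one folder. FOLDER_ID and TOKEN are
# placeholders (folder IDs come from GET /me/mailFolders).
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
FOLDER_ID = "<mail-folder-id>"
TOKEN = "<access-token-with-Mail.Read>"

url = (
    f"{GRAPH}/me/mailFolders/{FOLDER_ID}/messages"
    "?$select=subject,from,receivedDateTime,bodyPreview&$top=50"
)
resp = requests.get(url, headers={"Authorization": f"Bearer {TOKEN}"})
resp.raise_for_status()

# One row per email; each row could also be passed to Copilot to summarise.
for msg in resp.json().get("value", []):
    sender = msg.get("from", {}).get("emailAddress", {}).get("address", "")
    print(f'{msg["receivedDateTime"]} | {sender} | {msg["subject"]} | '
          f'{msg["bodyPreview"][:120]}')
```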
This is a Problem - Quick Response Mode Missing in Copilot

Hi, I noticed Quick Response mode has completely disappeared from Copilot, and I have seen many other users report the same issue starting January 2026 on the MS Q&A site. I also read that Microsoft is pushing a new Smart Mode, which changes how responses work and may be replacing older models. Quick Response mode fit my workflow far better than Think Deeper and Smart do. Since it disappeared, and since the introduction of Smart Mode, I have constantly run into issues because the app now makes its own decisions and interpretations of subjects and projects, which is extremely frustrating because it's two steps forward and two steps back. Taking away the ability to choose which mode a user prefers, and leaving it up to the bot, takes away personal preference and what works for individual needs. Quick Response was added during the GPT-5 update in 2025, so I don't understand why it suddenly vanished. Can someone please explain what's happening and whether QR is coming back? This mode is something I need due to limited time and needing to finish projects. Please and thank you.