Recent Discussions
Copilot returning document context error
I was using Copilot in PowerPoint to create presentations from Word files without problems. Without my changing anything, PowerPoint suddenly started returning an error when I opened the Copilot pane. The issue occurs on only one computer; my other laptop, using the same account, works properly. I have tried signing out of and back in to my account and also reinstalling Microsoft 365 on the affected computer.

Error message: Cannot read properties of null (reading 'DocumentContext')

Error stack:
TypeError: Cannot read properties of null (reading 'DocumentContext')
  at https://fa000000129.resources.office.net/033f92d3-bc6d-439a-858a-a17acf70360a/1.0.2509.1003/en-us_web/appchat.js:1:841804
  at Object.useMemo (https://fa000000129.resources.office.net/033f92d3-bc6d-439a-858a-a17acf70360a/1.0.2509.1003/en-us_web/taskpane.js:2:100202)
  at t.useMemo (https://fa000000129.resources.office.net/033f92d3-bc6d-439a-858a-a17acf70360a/1.0.2509.1003/en-us_web/taskpane.js:2:34060)
  at _i (https://fa000000129.resources.office.net/033f92d3-bc6d-439a-858a-a17acf70360a/1.0.2509.1003/en-us_web/appchat.js:1:841672)
  at ua (https://fa000000129.resources.office.net/033f92d3-bc6d-439a-858a-a17acf70360a/1.0.2509.1003/en-us_web/taskpane.js:2:94279)
  at Xl (https://fa000000129.resources.office.net/033f92d3-bc6d-439a-858a-a17acf70360a/1.0.2509.1003/en-us_web/taskpane.js:2:146207)
  at zu (https://fa000000129.resources.office.net/033f92d3-bc6d-439a-858a-a17acf70360a/1.0.2509.1003/en-us_web/taskpane.js:2:133543)
  at Iu (https://fa000000129.resources.office.net/033f92d3-bc6d-439a-858a-a17acf70360a/1.0.2509.1003/en-us_web/taskpane.js:2:133471)
  at Ou (https://fa000000129.resources.office.net/033f92d3-bc6d-439a-858a-a17acf70360a/1.0.2509.1003/en-us_web/taskpane.js:2:133334)
  at Cu (https://fa000000129.resources.office.net/033f92d3-bc6d-439a-858a-a17acf70360a/1.0.2509.1003/en-us_web/taskpane.js:2:130331)
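For readers debugging similar task-pane failures: the TypeError above means some object the code expects (here, whatever holds `DocumentContext`) was null when the pane initialized. This is a minimal sketch of that failure mode using hypothetical names — `HostContext` and `readDocumentContext` are illustrative, not the actual Copilot internals.

```typescript
// Hypothetical sketch of the failure mode seen in the stack trace above.
type HostContext = { DocumentContext: string } | null;

function readDocumentContext(ctx: HostContext): string {
  // Unguarded access (`ctx.DocumentContext`) on a null ctx throws:
  //   TypeError: Cannot read properties of null (reading 'DocumentContext')
  // Optional chaining with a fallback avoids the crash:
  return ctx?.DocumentContext ?? "unavailable";
}

console.log(readDocumentContext(null)); // "unavailable"
console.log(readDocumentContext({ DocumentContext: "presentation" })); // "presentation"
```

Since the add-in code is Microsoft's, end users can't apply such a guard themselves; the sketch only explains why clearing local state (sign out/in, reinstall) may not help when the null value comes from the service side.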
Copilot Agent in SharePoint App

I created a new Copilot agent (using GPT-4) that refers to knowledge from SharePoint sites and published it to a SharePoint site. I configured 'Authenticate with Microsoft' under 'Security', approved the agent from the SharePoint site, and set it as the default for the site. It works as expected when launched from a SharePoint web page. However, when the agent is launched from the SharePoint app, it says 'connect to your Microsoft account', and if I click the Sign in button, it throws a 'Cannot connect' error. Then the agent fails. What am I missing here?
🚀 A Proposal to Eliminate Copilot Fragmentation and Unify the UX

Currently in Microsoft 365, there are multiple entry points for Copilot: Copilot Pages, app-specific Copilots, and Copilot in Loop. This fragmentation often confuses users about which Copilot to use and where the outputs are saved. I propose a simple unification:

1. General-purpose Copilot → Copilot Pages only. Even when opening Word/Excel/Outlook/Loop via the sidebar, all interactions are handled by the Copilot in Copilot Pages.
2. Specialized Copilot → Toggle switch. Add a toggle at the top of Copilot Pages ("General ↔ Specialized") to switch into app-specific Copilot functions only when needed.
3. Save format → Standardize on Loop files. All generated content is saved as .loop files in OneDrive/SharePoint and can be opened directly in the Loop app.

Benefits:
- Eliminates confusion caused by Copilot fragmentation, for a more intuitive UX.
- Users only decide "general or specialized", a very simple flow.
- Future apps can be added via the sidebar, giving high scalability.
👉 In short: "General = Pages, Specialized = Toggle, Save = Loop". This would greatly simplify and improve the Copilot experience. If you agree, please give it a like 👍

Japanese version (for Japanese users): A proposal to eliminate Copilot sprawl and unify the UX. In today's Microsoft 365 there are multiple entry points, such as Copilot Pages, app-specific Copilots, and Loop + Copilot, and users are confused about which Copilot they should use. I propose unifying them as follows: consolidate the general-purpose Copilot into Copilot Pages; call up specialized Copilots via a toggle switch; standardize the save format on Loop files. Effects: the confusion caused by sprawl disappears and the UX becomes intuitive; the user journey reduces to a single choice between general and specialized; as apps are added in the future, they can simply be added to the sidebar. 👉 In one phrase: "General = Pages, Specialized = Toggle, Save = Loop". This would greatly tidy up the Copilot experience. If you agree, please give it a like! 👍

Deleting AI developed meeting recaps
Hello, We have recently begun utilizing the meeting transcript and AI-generated recap functionality for our meetings. However, we can discuss potentially sensitive information on our calls, so we would like to understand how to delete the AI notes from the recap once we have had adequate time to digest and review them. I can see how to delete the full transcript, but I do not see an option to delete the AI meeting notes. We also need to confirm that, once deleted, they are cleared for good and cannot be deemed discoverable. Thank you for any help anyone can provide.

Researcher - Formatting Issues
I am having serious trouble forcing Researcher to follow a specific citation format in its responses. Although it delivers superior content, it complies with neither the instructions (Word file), nor the template, nor the prompt concerning citations. I want clickable (hyperlinked) citations throughout the text, listed in a references section at the bottom of the report. I have re-engineered the instructions, template, and prompt hundreds of times according to its own suggestions, without success. It always misses something. I also prompted it "not to synthesize anything until the Template and Instructions rules are applied". That failed too. Are other folks having similar challenges? Have you found any tips or tricks to force Researcher to comply with formatting requests?

Copilot Task Switching
Are there any planned enhancements to improve task-switching UI features or enable docking of Copilot chats and agents for easier return and continuity? Our users frequently engage with Copilot across multiple workflows throughout the day and often struggle to locate previous prompts. For example, a sales representative might begin a client proposal in Researcher, pause to retrieve a document for their manager, use Writing Coach to format an email to a leader, and then want to return to the original Researcher prompt. Simultaneously, they may run a skill builder agent during their lunch break. These transitions can be challenging without a more robust way to manage and revisit Copilot interactions. As a suggestion, it would be helpful to have the ability to pin active tasks or Copilot sessions to the left-hand navigation bar, allowing users to quickly return to key workflows without losing context.

Researcher Compiling Consistency
I have noticed that for exactly the same query with the same input, Researcher delivers partially inconsistent reports (e.g. reports that do not follow the Instructions and Template) if run during specific hours. I suspect this has something to do with request traffic and the prioritisation applied by its creators. Has anybody else noticed similar behaviour, or would you like to share your experience?

CrowdStrike EDR repeatedly flags m365copilot_autostarter.exe
We are seeing recurring CrowdStrike informational alerts for m365copilot_autostarter.exe, located under the WindowsApps directory for Microsoft OfficeHub (versions 19.2509.32081.0 and 19.2508.51171.0). The alerts are flagged as "meeting the machine learning-based on-sensor AV protection's lowest-confidence threshold for malicious files". Two hashes are repeatedly seen across different customer environments:
- 2ee039508706a40e1ca608d2da670d8f8b4b3605343ae4601e7f2407db6a35e (timestamp: Sept 2, 2025)
- ade2675e1247ffd1cbe4e408716a559fb502aeca26985a53d35755d1c13827f3 (timestamp: Aug 21, 2025)
Both files appear clean in reputation checks, but they are unsigned and have no vendor information, which is raising questions in security tooling. Since these alerts are consistently triggered across Windows 10 and 11 endpoints in multiple environments, we are trying to confirm:
- Is this a legitimate, recently introduced OfficeHub / Copilot component?
- Why is it unsigned compared to other OfficeHub binaries?
Any clarification from Microsoft would be appreciated.
Microsoft Explains the Differences Between Copilot Memories
Copilot memory is a term that refers to different things, including Copilot communication memory, a method that uses the Graph to personalize responses for users. The idea is to use all the sources of information available through the Graph as Copilot responds to user prompts in Microsoft 365 apps, instead of limiting sources to whatever the app works with. It's a good idea, provided the Graph sources are accurate. https://office365itpros.com/2025/09/03/copilot-memory-types/

How Retail Businesses Are Automating Customer Service with Copilot Studio
The retail industry is evolving rapidly, and customer expectations are higher than ever. Shoppers demand instant answers, personalized recommendations, and seamless support across digital and physical channels. Traditional customer service models, with their heavy reliance on human agents and call centers, struggle to meet these demands at scale. Enter Copilot Studio: a platform that empowers retailers to automate customer service workflows while maintaining personalization and efficiency. https://dellenny.com/how-retail-businesses-are-automating-customer-service-with-copilot-studio/

Combining Generative AI and Business Logic with Copilot Studio
Generative AI is reshaping how businesses build, scale, and optimize their digital solutions. By enabling natural language interactions, content creation, and decision support, AI offers a new level of flexibility. But to be truly impactful in a business environment, generative AI must work hand-in-hand with business logic: the structured rules and workflows that power day-to-day operations. This is where Copilot Studio comes in. https://dellenny.com/combining-generative-ai-and-business-logic-with-copilot-studio/

Outlook + OneDrive = Silent Syncing? A Serious Privacy Concern
🚨 Hey folks, I stumbled upon something that left me stunned — and frankly, furious. I was using classic Outlook (not the new web-based version) on my Windows machine. After what I assume was a silent update, email accounts I had previously configured on my Mac (also using Outlook) mysteriously appeared on my Windows device. I had deleted all profiles and started fresh, yet my IMAP and POP accounts reappeared. Yes, you read that right: accounts from my Mac showed up on my Windows PC without my input. Digging deeper, I realized OneDrive had been quietly syncing my mail data, including accounts linked to cPanel. OneDrive was active but wouldn't launch properly. Turns out, it was busy harvesting my account configurations behind the scenes. The moment I uninstalled OneDrive, the syncing stopped. Outlook no longer recognized any of my accounts. That's when it hit me: my data had been synced somewhere, without my consent.

🔍 Here's where I looked:
• Outlook Data Files (.pst/.ost) — empty
• Credential Manager — nothing
• Registry — traces of profiles, no accounts
• Microsoft Account Sync Settings — disabled

I also found a new folder under Account Settings > Data Files linked to OneDrive. The path couldn't be deleted or changed. Conclusion: Classic Outlook appears to sync account data via OneDrive or another Microsoft service, without asking the user. That's not just sneaky. It's a serious security and privacy issue. 💣 Message to Microsoft: If you're charging for software, don't sneak behind users' backs. If you're going to sync data, ask first. I paid for Outlook — and got surveillance. I'm still shocked this is even possible. If anyone else has experienced something similar, speak up. This needs visibility.

Copilot Memory enabled but not working – lack of transparency and support
Hello everyone, I'm a Microsoft 365 and Copilot Pro user, with all apps fully updated and set to the Current Channel. Although the Copilot Memory feature appears as enabled in my settings, it has stopped working since August 10–11, 2025. I contacted Microsoft support, provided remote access, and was escalated to senior agents. Eventually, I was told that it *might* be a server-side issue — but no official confirmation was given. As paying users of premium services, I believe we deserve:
- Clear acknowledgment of the issue
- An estimated resolution timeline
- Transparent communication from Microsoft
Has anyone else experienced this? Is there any official update on the status of Copilot Memory? Thanks in advance!

Catch Up on Meetings and Tasks with Copilot in Teams
Microsoft Teams is a central hub for collaboration, meetings, and productivity. With the integration of Copilot in Teams, users can efficiently catch up on missed meetings, summarize key discussions, and stay on top of their tasks without sifting through hours of recordings. This blog will guide you on how to leverage Copilot to enhance your workflow in Teams. https://dellenny.com/catch-up-on-meetings-and-tasks-with-copilot-in-teams/

Copilot Studio and (SharePoint) Metadata
Hi fellow members. Short question: I would like to use Copilot Studio (Power Virtual Agents) to include in the semantic index not only documents' contents, but also the documents' metadata found in the SharePoint library where the documents reside. Does anyone know: a) whether this is possible, b) whether it is planned, or c) how to report it as a feature request? Appreciated. Regards

Copilot Chat. Disable Agent Creation, Allow Agent use
Hi all, first time posting here. I would like to manage agent creation centrally via the fully licensed Copilot Studio, and disable end users' ability to create agents for themselves. It seems as though there is no configuration option for this scenario; agents can either be enabled or disabled. Can anyone suggest a solution? Many thanks, Rem

Copilot for Microsoft 365: Architecture and Key Concepts
The following diagram displays the Copilot for Microsoft 365 service and tenant logical architecture.

Architecture of Copilot for Microsoft 365:

Copilot for Microsoft 365 can generate responses anchored in the customer's business content, such as:
- User documents
- Emails
- Calendar
- Chats
- Meetings
- Contacts
- Other business data

Copilot for Microsoft 365 follows these foundational principles:
- Built on Microsoft's comprehensive approach to security, compliance, and privacy.
- Architected to protect tenant, group, and individual data.
- Committed to responsible AI.

Key components of Copilot for Microsoft 365 include:
- Large Language Models (LLMs)
- Natural Language Processing (NLP)
- Microsoft 365 apps
- Microsoft Copilot (chat)
- Microsoft Syntex
- Microsoft Graph

https://www.youtube.com/watch?v=B2-8wrF9Okc

Users can initiate Copilot prompts from devices that have Microsoft 365 apps installed. Copilot components include:
- The Copilot service, which orchestrates the responses to user prompts.
- An instance of the Microsoft Graph for the data of your Microsoft 365 tenant.
- Your Microsoft 365 tenant, which contains your organization's data.

Key concepts:
- Microsoft 365 Copilot only works with files saved to OneDrive. If files are stored locally on your PC, you will need to move them to OneDrive to activate Copilot (as of March 2024).
- Microsoft's Azure OpenAI Service privately hosts the LLMs used by Copilot for Microsoft 365.
- Copilot for Microsoft 365 only displays organizational data to which individual users have at least View permissions. It's important that organizations use the permission models available in Microsoft 365 services, such as SharePoint, to help ensure the right users or groups have the right access to the right content.
- Copilot for Microsoft 365 uses Microsoft Graph to synthesize and search content from multiple sources within your tenant.
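Copilot's internal retrieval over the Graph isn't exposed, but the same tenant content can be queried directly through the Microsoft Graph search API, which illustrates the permission-trimmed retrieval described above. Below is a hedged sketch of such a request payload; the query string and paging values are placeholders, while the endpoint and payload shape follow the Graph `/search/query` API.

```typescript
// Sketch of a Microsoft Graph search request over files (driveItem) that the
// signed-in user can access; results are automatically permission-trimmed.
const searchRequest = {
  requests: [
    {
      entityTypes: ["driveItem"],            // search OneDrive/SharePoint files
      query: { queryString: "quarterly report" }, // placeholder query
      from: 0,                               // paging offset
      size: 5,                               // number of results
    },
  ],
};

// The payload would be POSTed (with a bearer token) to:
//   https://graph.microsoft.com/v1.0/search/query
console.log(JSON.stringify(searchRequest));
```

This is only an analogy for how Graph-grounded retrieval behaves, not a claim about Copilot's actual implementation.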
The Microsoft Graph API brings more context from user signals into the prompt, such as information from emails, chats, documents, and meetings. This information includes data from services like Outlook, OneDrive, SharePoint, Teams, and more. Only data a user has access to is returned in query responses, as illustrated in the following diagram.

Microsoft 365 keeps your data logically isolated by tenant. This design, together with encryption, ensures privacy both during processing and at rest. Prompts, responses, and data accessed through Microsoft Graph aren't used to train foundation LLMs, including those used by Copilot for Microsoft 365.

Copilot is a shared service, just like many other services in Microsoft 365. Communication between your tenant and Copilot components is encrypted. Copilot for Microsoft 365 uses Azure OpenAI services for processing, not OpenAI's publicly available services.

The Copilot for Microsoft 365 service and tenant logical architecture supports an organization's security and compliance in several ways:
- Copilot operates as a shared service within Microsoft 365, ensuring encrypted communication between your tenant and Copilot components. Your data remains confidential and secure.
- Existing security and compliance policies deployed by your organization continue to apply. Copilot adheres to these policies, safeguarding sensitive information.
- The tenant boundary ensures data privacy, location compliance, and adherence to security protocols. Your data remains within the Microsoft 365 service boundary, protected by Microsoft 365's robust security measures.

To ensure that Copilot for Microsoft 365 uses your content effectively, administrators should add a urlToItemResolver when creating a connection.
A urlToItemResolver (https://learn.microsoft.com/en-us/graph/api/resources/externalconnectors-urltoitemresolverbase) enables the platform to detect when users share URLs from your external content with each other. Copilot for Microsoft 365 is more likely to display content that has been shared with a user, so you should add a urlToItemResolver in activitySettings (https://learn.microsoft.com/en-us/graph/api/resources/externalconnectors-activitysettings) when you create a connection (https://learn.microsoft.com/en-us/graph/connecting-external-content-manage-connections#create-a-connection).

Apply semantic labels. Semantic labels help Copilot for Microsoft 365 interpret the semantic meaning of your schema. Apply as many semantic labels (https://learn.microsoft.com/en-us/graph/connecting-external-content-manage-schema#semantic-labels) to your schema as applicable.

Add user activities on your items. For a list of supported user activity types, see https://learn.microsoft.com/en-us/graph/api/resources/externalconnectors-externalactivity. The system assigns greater importance to items that have more activities.

Administrators can choose to let data out of the compliance boundary; for example, to query public web content using Microsoft Bing. For more information, see https://learn.microsoft.com/en-us/microsoftteams/platform/copilot/how-to-extend-copilot#how-to-make-your-graph-connector-work-better-with-copilot.
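To make the urlToItemResolver guidance above concrete, here is a hedged sketch of what an external connection payload with a resolver might look like. The connection id and name, base URL, and URL pattern are placeholder values for a hypothetical "contoso" connector; the @odata.type and property names follow the Microsoft Graph external connectors API.

```typescript
// Hypothetical external connection payload with a urlToItemResolver that maps
// shared ticket URLs like https://contoso.example/tickets/123 to item id "123".
const connectionPayload = {
  id: "contosohelpdesk",
  name: "Contoso Help Desk",
  activitySettings: {
    urlToItemResolvers: [
      {
        "@odata.type": "#microsoft.graph.externalConnectors.itemIdResolver",
        urlMatchInfo: {
          baseUrls: ["https://contoso.example/tickets"],
          urlPattern: "/(?<ticketId>[0-9]+)",
        },
        // The named capture group from urlPattern becomes the item id.
        itemId: "{ticketId}",
      },
    ],
  },
};

// The payload would be POSTed to https://graph.microsoft.com/v1.0/external/connections
console.log(JSON.stringify(connectionPayload, null, 2));
```

Check the property names against the current Graph external connectors reference before using this in anger; the sketch is illustrative rather than a tested connector definition.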
Copilot Agent Publishing on AppSource

Hi, does anyone know when Copilot agents will be available to publish on AppSource? I heard it's supposed to launch soon. Right now, it seems they only work within Teams apps that support Copilot. Just looking for any updates or confirmation. Thanks!

Failed to Publish Copilot Studio Agent
Hello! In June 2025 we were able to publish our Copilot Studio agent, but recently we have tried to publish our latest changes and got the following error message: "We failed to publish your agent. Try publishing again later. Dynamics 365 Contact Center is not provisioned in the given environment. Telephony and NLU+ features require Dynamics 365 Contact Center to be enabled." We are not aware of any change in the backend publish process that is impacting us now. Should the tenant admins provision any new features as per the message, or does the user have to subscribe to something else? Thanks in advance
Events
Register Here: GPT-5 in Copilot – Microsoft Adoption
Discover how the latest GPT-5 model is transforming the Microsoft Copilot experience across M365, Copilot Chat, and emerging Agent capabiliti...
Tuesday, Sep 09, 2025, 08:00 AM PDT, Online
Recent Blogs
- Safeguard your data: Learn how to govern Microsoft 365 Copilot and agents by mitigating oversharing. Sep 02, 2025
- Welcome to the August 2025 edition of What's new in Microsoft 365 Copilot! Aug 29, 2025