Recent Discussions
Researcher - Formatting Issues
I am having serious trouble getting Researcher to follow a specific citation format in its responses. Although it delivers superior content, it complies with neither the instructions (Word file), nor the template, nor the prompt concerning citations. I want clickable (hyperlinked) citations throughout the text, listed in a references section at the bottom of the report. I have re-engineered the instructions, template, and prompt hundreds of times according to its own suggestions, without success. It always misses something. I also prompted it "not to synthesize anything until the Template and Instructions rules are applied". That failed as well. Are other folks having similar challenges? Have you found any tips or tricks to force Researcher to comply with formatting requests?
Copilot Task Switching
Are there any planned enhancements to improve task-switching UI features or enable docking of Copilot chats and agents for easier return and continuity? Our users frequently engage with Copilot across multiple workflows throughout the day and often struggle to locate previous prompts. For example, a sales representative might begin a client proposal in Researcher, pause to retrieve a document for their manager, use Writing Coach to format an email to a leader, and then want to return to the original Researcher prompt. Simultaneously, they may run a skill builder agent during their lunch break. These transitions can be challenging without a more robust way to manage and revisit Copilot interactions. As a suggestion, it would be helpful to have the ability to pin active tasks or Copilot sessions to the left-hand navigation bar, allowing users to quickly return to key workflows without losing context.
Researcher Compiling Consistency
I have noticed that for exactly the same query with the same input, Researcher delivers partially inconsistent reports (e.g. reports that do not follow the Instructions and Template) if run during specific hours. I suspect this has something to do with request traffic and the prioritisation applied by its creators. Has anybody else noticed similar behaviour, or would you like to share your experience?
CrowdStrike EDR repeatedly flags m365copilot_autostarter.exe
We are seeing recurring CrowdStrike informational alerts for m365copilot_autostarter.exe, located under the WindowsApps directory for Microsoft OfficeHub (versions 19.2509.32081.0 and 19.2508.51171.0). The alerts are flagged as “meeting the machine learning-based on-sensor AV protection’s lowest-confidence threshold for malicious files”. Two hashes are repeatedly seen across different customer environments:
- 2ee039508706a40e1ca608d2da670d8f8b4b3605343ae4601e7f2407db6a35e (timestamp: Sept 2, 2025)
- ade2675e1247ffd1cbe4e408716a559fb502aeca26985a53d35755d1c13827f3 (timestamp: Aug 21, 2025)
Both files appear clean in reputation checks, but they are unsigned and have no vendor information, which is raising questions in security tooling. Since these alerts are consistently triggered across Windows 10 and 11 endpoints in multiple environments, we are trying to confirm:
- Is this a legitimate, recently introduced OfficeHub / Copilot component?
- Why is it unsigned compared to other OfficeHub binaries?
Any clarification from Microsoft would be appreciated.
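For anyone who wants to reproduce the hash comparison locally, here is a minimal Python sketch using only the standard library. The file path is a placeholder (the real binary sits in a version-specific folder under WindowsApps), and this only covers the hash check, not signature or reputation verification.

```python
# Minimal sketch (Python 3, standard library only): compute the SHA-256 of a local
# copy of the binary and compare it against the hashes quoted in the alerts above.
# The file path below is a placeholder, not the real WindowsApps location.
import hashlib
from pathlib import Path

# Hashes copied verbatim from the detections above.
FLAGGED_HASHES = {
    "2ee039508706a40e1ca608d2da670d8f8b4b3605343ae4601e7f2407db6a35e",
    "ade2675e1247ffd1cbe4e408716a559fb502aeca26985a53d35755d1c13827f3",
}

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Stream the file in chunks so large binaries never need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

if __name__ == "__main__":
    binary = Path(r"C:\path\to\m365copilot_autostarter.exe")  # placeholder path
    file_hash = sha256_of(binary)
    print(file_hash)
    print("matches a flagged hash" if file_hash in FLAGGED_HASHES
          else "no match with the flagged hashes")
```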
Copilot Agent in SharePoint App
I created a new Copilot agent (using GPT-4) that refers to knowledge from SharePoint sites and published it to a SharePoint site. I configured 'Authenticate with Microsoft' under 'Security', approved the agent from the SharePoint site, and set it as the 'default' for the site. It works as expected when launched from a SharePoint web page. However, when the agent is launched from the SharePoint app, it says 'connect to your Microsoft account', and if I click the Sign in button, it throws a 'Cannot connect' error. Then the agent fails. What am I missing here?
Microsoft Explains the Differences Between Copilot Memories
Copilot memory is a term that refers to several different things, including Copilot communication memory, a method of using the Graph to personalize responses for users. The idea is to use all the sources of information available through the Graph as Copilot responds to user prompts in Microsoft 365 apps, instead of limiting sources to whatever the app works with. It's a good idea, provided the Graph sources are accurate. https://office365itpros.com/2025/09/03/copilot-memory-types/
How Retail Businesses Are Automating Customer Service with Copilot Studio
The retail industry is evolving rapidly, and customer expectations are higher than ever. Shoppers demand instant answers, personalized recommendations, and seamless support across digital and physical channels. Traditional customer service models—heavy reliance on human agents and call centers—struggle to meet these demands at scale. Enter Copilot Studio: a platform that empowers retailers to automate customer service workflows while maintaining personalization and efficiency. https://dellenny.com/how-retail-businesses-are-automating-customer-service-with-copilot-studio/
Combining Generative AI and Business Logic with Copilot Studio
Generative AI is reshaping how businesses build, scale, and optimize their digital solutions. By enabling natural language interactions, content creation, and decision support, AI offers a new level of flexibility. But to be truly impactful in a business environment, generative AI must work hand-in-hand with business logic—the structured rules and workflows that power day-to-day operations. This is where Copilot Studio comes in. https://dellenny.com/combining-generative-ai-and-business-logic-with-copilot-studio/
Outlook + OneDrive = Silent Syncing? A Serious Privacy Concern
🚨 Hey folks, I stumbled upon something that left me stunned — and frankly, furious. I was using classic Outlook (not the new web-based version) on my Windows machine. After what I assume was a silent update, email accounts I had previously configured on my Mac (also using Outlook) mysteriously appeared on my Windows device. I had deleted all profiles and started fresh, yet my IMAP and POP accounts reappeared. Yes, you read that right: accounts from my Mac showed up on my Windows PC without my input. Digging deeper, I realized OneDrive had been quietly syncing my mail data — including accounts linked to cPanel. OneDrive was active but wouldn’t launch properly. Turns out, it was busy harvesting my account configurations behind the scenes. The moment I uninstalled OneDrive, the syncing stopped. Outlook no longer recognized any of my accounts. That’s when it hit me: my data had been synced somewhere, without my consent.
🔍 Here’s where I looked:
• Outlook Data Files (.pst/.ost) — empty
• Credential Manager — nothing
• Registry — traces of profiles, no accounts
• Microsoft Account Sync Settings — disabled
I also found a new folder under Account Settings > Data Files linked to OneDrive. The path couldn’t be deleted or changed. Conclusion: Classic Outlook appears to sync account data via OneDrive or another Microsoft service — without asking the user. That’s not just sneaky. It’s a serious security and privacy issue.
💣 Message to Microsoft: If you're charging for software, don’t sneak behind users’ backs. If you're going to sync data, ask first. I paid for Outlook — and got surveillance. I’m still shocked this is even possible. If anyone else has experienced something similar, speak up. This needs visibility.
Copilot Memory enabled but not working – lack of transparency and support
Hello everyone, I'm a Microsoft 365 and Copilot Pro user, with all apps fully updated and set to the Current Channel. Although the Copilot Memory feature appears as enabled in my settings, it has stopped working since August 10–11, 2025. I contacted Microsoft support, provided remote access, and was escalated to senior agents. Eventually, I was told that it *might* be a server-side issue — but no official confirmation was given. As a paying user of premium services, I believe we deserve:
- Clear acknowledgment of the issue
- Estimated resolution timeline
- Transparent communication from Microsoft
Has anyone else experienced this? Is there any official update on the status of Copilot Memory? Thanks in advance!
Catch Up on Meetings and Tasks with Copilot in Teams
Microsoft Teams is a central hub for collaboration, meetings, and productivity. With the integration of Copilot in Teams, users can efficiently catch up on missed meetings, summarize key discussions, and stay on top of their tasks without sifting through hours of recordings. This blog will guide you on how to leverage Copilot to enhance your workflow in Teams. https://dellenny.com/catch-up-on-meetings-and-tasks-with-copilot-in-teams/
Copilot Studio and (SharePoint) Metadata
Hi fellow members. Short question: I would like to use Copilot Studio (Power Virtual Agents) to include in the Semantic Index not only the documents' contents, but also the documents' metadata found in the SharePoint library where the documents reside. Does anyone know:
a) whether this is possible, or
b) whether it is planned, or
c) how to report it as a requested feature?
Appreciated. Regards
Copilot Chat. Disable Agent Creation, Allow Agent use
Hi all, first time posting here. I would like to manage agent creation centrally via the fully licensed Copilot Studio and disable end users' ability to create agents for themselves. It seems as though there is no configuration option for this scenario, and agents can only be either enabled or disabled. Can anyone suggest a solution? Many thanks, Rem
Copilot for Microsoft 365: Architecture and Key Concepts
The following diagram displays the Copilot for Microsoft 365 service and tenant logical architecture.
Architecture of Copilot for Microsoft 365:
Copilot for Microsoft 365 can generate responses anchored in the customer’s business content, such as:
- User documents
- Emails
- Calendar
- Chats
- Meetings
- Contacts
- Other business data
Copilot for Microsoft 365 follows these foundational principles:
- Built on Microsoft’s comprehensive approach to security, compliance, and privacy.
- Architected to protect tenant, group, and individual data.
- Committed to responsible AI.
Key components of Copilot for Microsoft 365 include:
- Large Language Models (LLMs)
- Natural Language Processing (NLP)
- Microsoft 365 apps
- Microsoft Copilot (chat)
- Microsoft Syntex
- Microsoft Graph
https://www.youtube.com/watch?v=B2-8wrF9Okc
Users can initiate Copilot prompts from devices that have Microsoft 365 apps installed. Copilot components include:
- The Copilot service, which orchestrates the responses to user prompts.
- An instance of the Microsoft Graph for the data of your Microsoft 365 tenant.
- Your Microsoft 365 tenant that contains your organization data.
Key concepts
- Microsoft 365 Copilot only works with files saved to OneDrive. If files are stored locally on your PC, you will need to move them to OneDrive to activate Copilot (as of March 2024).
- Microsoft's Azure OpenAI Service privately hosts the LLMs used by Copilot for Microsoft 365; it does not use OpenAI’s publicly available services.
- Copilot for Microsoft 365 only displays organizational data to which individual users have at least View permissions. It's important that organizations use the permission models available in Microsoft 365 services, such as SharePoint, to help ensure the right users or groups have the right access to the right content.
- Copilot for Microsoft 365 uses Microsoft Graph to synthesize and search content from multiple sources within your tenant. The Microsoft Graph API brings more context from user signals into the prompt, such as information from emails, chats, documents, and meetings. This information includes data from services like Outlook, OneDrive, SharePoint, Teams, and more. Only data a user has access to is returned in query responses, as illustrated in the following diagram.
- Microsoft 365 keeps your data logically isolated by tenant. This design, together with encryption, ensures privacy while processing and at rest.
- Prompts, responses, and data accessed through Microsoft Graph aren't used to train foundation LLMs, including the LLMs used by Copilot for Microsoft 365.
- Copilot is a shared service, just like many other services in Microsoft 365. Communication between your tenant and Copilot components is encrypted.
The Copilot for Microsoft 365 service and tenant logical architecture supports an organization's security and compliance in several ways:
- Copilot operates as a shared service within Microsoft 365, ensuring encrypted communication between your tenant and Copilot components. Your data remains confidential and secure.
- Existing security and compliance policies deployed by your organization continue to apply. Copilot adheres to these policies, safeguarding sensitive information.
- The tenant boundary ensures data privacy, location compliance, and adherence to security protocols.
- Your data remains within the Microsoft 365 service boundary, protected by Microsoft 365's robust security measures.
To ensure that Copilot for Microsoft 365 uses your content effectively, administrators should:
- Add a urlToItemResolver when you create your connection. A urlToItemResolver (https://learn.microsoft.com/en-us/graph/api/resources/externalconnectors-urltoitemresolverbase) enables the platform to detect when users share URLs from your external content with each other. Copilot for Microsoft 365 has a higher likelihood of displaying content shared with that user. As such, you should add a urlToItemResolver in activitySettings (https://learn.microsoft.com/en-us/graph/api/resources/externalconnectors-activitysettings) when you create a connection (https://learn.microsoft.com/en-us/graph/connecting-external-content-manage-connections#create-a-connection?azure-portal=true).
- Apply semantic labels. Semantic labels help Copilot for Microsoft 365 interpret the semantic meaning of your schema. Apply as many semantic labels (https://learn.microsoft.com/en-us/graph/connecting-external-content-manage-schema#semantic-labels?azure-portal=true) to your schema as applicable.
- Add user activities on your items. For a list of supported user activity types, see https://learn.microsoft.com/en-us/graph/api/resources/externalconnectors-externalactivity. The system assigns greater importance to items that have more activities.
A minimal sketch of the connection and schema calls is shown below. Administrators can choose to let data out of the compliance boundary; for example, to query public web content using Microsoft Bing. For more information, see https://learn.microsoft.com/en-us/microsoftteams/platform/copilot/how-to-extend-copilot#how-to-make-your-graph-connector-work-better-with-copilot?azure-portal=true.
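The connector recommendations above map onto the Microsoft Graph external connections API. The following is a minimal sketch, assuming the `requests` package and an already-acquired app-only Graph token with the ExternalConnection.ReadWrite.OwnedBy permission; the connection id, base URL, and property names are illustrative placeholders, not values from the post.

```python
# Minimal sketch: create a connection with a urlToItemResolver and register a schema
# with semantic labels, following the recommendations above.
# Assumptions: `requests` is installed; ACCESS_TOKEN is a valid app-only Graph token;
# the connection id, URLs, and property names are placeholders.
import requests

GRAPH = "https://graph.microsoft.com/v1.0"
ACCESS_TOKEN = "<app-only-token>"  # placeholder; acquire via MSAL in real code
HEADERS = {"Authorization": f"Bearer {ACCESS_TOKEN}"}

# 1) Create the connection and include a urlToItemResolver so that URLs users share
#    can be mapped back to connector items (first recommendation above).
connection = {
    "id": "contosotickets",  # placeholder connection id
    "name": "Contoso tickets",
    "description": "Sample external content surfaced to Copilot",
    "activitySettings": {
        "urlToItemResolvers": [
            {
                "@odata.type": "#microsoft.graph.externalConnectors.itemIdResolver",
                "urlMatchInfo": {
                    "baseUrls": ["https://tickets.contoso.com"],  # placeholder base URL
                    "urlPattern": "/items/(?<ticketId>[0-9]+)",
                },
                "itemId": "{ticketId}",
                "priority": 1,
            }
        ]
    },
}
resp = requests.post(f"{GRAPH}/external/connections", headers=HEADERS, json=connection)
resp.raise_for_status()

# 2) Register a schema that applies semantic labels (title, url, lastModifiedDateTime)
#    so Copilot can interpret the meaning of each property (second recommendation).
schema = {
    "baseType": "microsoft.graph.externalItem",
    "properties": [
        {"name": "ticketTitle", "type": "string", "isSearchable": True,
         "isRetrievable": True, "labels": ["title"]},
        {"name": "ticketUrl", "type": "string", "isRetrievable": True,
         "labels": ["url"]},
        {"name": "lastEdited", "type": "dateTime", "isRetrievable": True,
         "labels": ["lastModifiedDateTime"]},
    ],
}
# Schema registration is a long-running operation (202 Accepted with a Location header
# to poll); confirm the exact verb and endpoint against the Graph docs linked above.
resp = requests.patch(f"{GRAPH}/external/connections/contosotickets/schema",
                      headers=HEADERS, json=schema)
resp.raise_for_status()
```

The third recommendation (user activities) would be reported separately per item via the external activities resource linked above; it is omitted here to keep the sketch short.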
Copilot Agent Publishing on AppSource
Hi, does anyone know when Copilot agents will be available to publish on AppSource? I heard it's supposed to launch soon. Right now, it seems they only work within Teams apps that support Copilot. Just looking for any updates or confirmation. Thanks!
Failed to Publish Copilot Studio Agent
Hello! In June 2025 we were able to publish our Copilot Studio agent, but recently we tried to publish our latest changes and got the following error message: "We failed to publish your agent. Try publishing again later. Dynamics 365 Contact Center is not provisioned in the given environment. Telephony and NLU+ features require Dynamics 365 Contact Center to be enabled." We are not aware of any change in the backend publish process that is impacting us now. Should the tenant admins provision any new features as per the message, or is it the user that has to subscribe to something else? Thanks in advance
Copilot PC on Windows 11 — “The request took longer than expected” error on startup
Hi everyone, I’ve been trying to launch Copilot PC on Windows 11, but I consistently get the error: **“The request took longer than expected.”** Here’s what I’ve tried so far:
- Reinstalled Copilot PC
- Checked network and firewall settings
- Tweaked registry keys (details below)
Despite all that, the app still fails to launch. Has anyone else faced this issue or found a reliable fix?
**System info:**
- Windows 11 Pro (build 22631.2861)
- Copilot PC version: 1.0.21023.1001
- Region: Vietnam
Any help or insights would be appreciated!
Is copilot capable of rebuilding an EXISTING PowerPoint presentation based on a referenced POTX
Copilot told me it is possible: "Yes, it is indeed possible to use Copilot for M365 in PowerPoint to open an old presentation and prompt Copilot to rebuild it using a specific company template... After activating Copilot, you'll need to interact with it to specify your requirements. You can prompt Copilot with a command such as: 'Rebuild this presentation using the [specific company template]'... Copilot will then process your request and apply the specified company template to your old presentation. This includes updating the design, layout, and formatting to match the template guidelines." However, when I try this, Copilot in PowerPoint says "Creating a new presentation will replace your existing slides, so you may want to save a copy first. Do you want me to make changes?" I think Copilot is not capable of rebuilding existing presentations based on a template. Correct?
Summarize Email Thread Feature Coming to Outlook
In late August, Microsoft plans to release the Copilot summarize email thread feature in Outlook clients without the need for a Microsoft 365 Copilot license. This news might seem surprising, but it’s simply a matter of business. If Microsoft doesn’t make basic AI features available in Outlook, ISVs (including OpenAI) will fill the gaps with add-ons. And that might make it harder to sell Microsoft 365 Copilot licenses. https://office365itpros.com/2025/08/26/summarize-email-thread/
Events
Register Here: GPT-5 in Copilot – Microsoft Adoption
Discover how the latest GPT-5 model is transforming the Microsoft Copilot experience across M365, Copilot Chat, and emerging Agent capabilities...
Tuesday, Sep 09, 2025, 08:00 AM PDT (Online)
Recent Blogs
- Safeguard your data: Learn how to govern Microsoft 365 Copilot and agents by mitigating oversharing. (Sep 02, 2025)
- Welcome to the August 2025 edition of What's new in Microsoft 365 Copilot! (Aug 29, 2025)