SharePoint: Using agents, AI-powered authoring, and automation for high-impact content management
With over 2 billion files added and 2 million sites created per day, SharePoint is the world’s most powerful and flexible content management platform. SharePoint allows you to create stunning intranets, orchestrate powerful workflows, and develop business-critical applications, while also powering a breadth of innovations across Microsoft 365 in Teams, OneDrive, and Copilot. Today’s SharePoint event showcased an exciting range of innovations. With these innovations, we believe SharePoint is the best platform for managing content for AI, and the best application for using AI to achieve high-impact business outcomes. The innovations shown today span three key areas:

- Agents built in SharePoint: create and manage AI experts for your SharePoint content with just a few clicks.
- AI-powered authoring for stunning intranets: use AI to easily create beautiful intranet sites using the best of what the web has to offer.
- Automation for streamlining business workflows: automate critical content-based business processes with AI.

Visit the new event microsite on adoption.microsoft.com for all new videos, adoption resources, hackathon details, and more: https://aka.ms/SharePointEvent/Adoption

Agents built in SharePoint: AI experts for every user

At Microsoft Ignite 2024, we announced the general availability of agents built in SharePoint. These agents are tailored assistants scoped to specific SharePoint sites and content, becoming your subject matter experts working on behalf of a person, team, or organization to handle simple tasks or more complex business processes. Every SharePoint site now includes an agent grounded in that site’s data, ready for immediate use. With just a few clicks, users can easily create their own custom agents with specialized skills scoped to their specific SharePoint files and folders.
And of course, agents adhere to existing SharePoint content permissions and governance policies, ensuring your data and content are used in a safe and secure manner. We see our customers using agents today for a range of scenarios such as onboarding, product support, planning, and more. Amey is just one such customer using agents built in SharePoint to get the most out of their content and knowledge real estate on SharePoint.

AI-powered authoring: Create and communicate with AI

Today you saw a range of capabilities that allow you to build SharePoint sites that look better than ever. With improvements like flexible layouts, design ideas, co-authoring, and Copilot throughout the experience, this represents a huge step forward for SharePoint’s UX capabilities. Creating high-quality, engaging content on SharePoint has never been easier.

Creating with Copilot

With the new "create with Copilot" feature you can use a prompt or a selection of templates to create an engaging page, grounded in the content of your choosing, in minutes. This lets you spend less time on the mechanics of creating the page and focus instead on the core message and content to maximize engagement and impact.

Using Microsoft 365 Copilot in SharePoint to help create and design the sites you want with ease and assistance.

Design Ideas: Augment creativity with AI

Design Ideas leverages your content and provides several professionally designed suggestions, automatically taking your content-specific requirements into account. You can invoke Design Ideas with the click of a single button and, in seconds, transform your sections into user-ready content. You can even use Design Ideas starting with blank sections or plain blocks of text!

Design Ideas in the right pane of a SharePoint page showcasing different suggestions.

Flexible sections: Build unconstrained

For a long time, SharePoint restricted you to building within a three-column framework.
We heard your feedback: with Flexible sections, you now have access to all 12 columns on the SharePoint canvas! Move images, text, and all your other favorite web parts around with fine control, and freely resize them in an easy, intuitive manner. AI-based authoring features like Design Ideas take full advantage of Flexible sections to provide the best possible layout recommendations for your content.

12-column canvas with Flexible sections on a SharePoint page.

These innovations and more have been rolling out to customers over the past several months. See how customers such as Avanade are using these capabilities to create engaging and beautiful content, and how Takeda sees a future with the latest innovations.

AI and automation for streamlining your work

Billions of pieces of content are added to SharePoint each day, relevant to both small team collaborations and broad company-wide initiatives. This puts SharePoint in the middle of business processes big and small. From simple out-of-the-box approvals and automation, to site, document library, and list templates, all the way to Power Platform integration, automation in SharePoint spans the continuum of business processes to get work done faster and at higher quality. As just one example, over 3 billion Power Automate flows run against SharePoint every week. In this next phase of innovation, we are excited to share more about how automation and AI integrate.

Enhance business processes with AI

Autofill is one feature that enriches content by extracting and generating structured metadata at scale, increasing the value of your content real estate. Using a natural language prompt, you describe the metadata you need, easily classifying, extracting, or generating new content as metadata. Autofill automatically processes new and modified files, saving you time and enhancing your business processes.
Price change updates

As part of our commitment to make advanced AI accessible to everyone, we are also excited to announce that SharePoint Autofill pricing has dropped from $0.05 per page to $0.005 per page starting in March 2025! Learn more about SharePoint pay-as-you-go services.

Next steps

We are excited to announce the next SharePoint Hackathon and invite you, our customers, partners, and MVPs, to craft experiences using AI and the latest SharePoint features. See more details and register here! SharePoint Hackathon - Share your designs and engage with like-minded makers: https://aka.ms/SharePointHackathon

And finally, I personally invite you all to attend the Microsoft 365 Community Conference (May 6-8, 2025) in Las Vegas. We’re showcasing the latest news and best practices for SharePoint and M365 with leaders, product makers, partners, and MVPs. We can’t wait to hear more about how you put Microsoft 365 to work in your organization. Thank you!

Learn more and explore!

To skill up your SharePoint IQ even further, review the full event and AMA, visit our new microsite, register for the hackathon, and watch our new, in-depth SharePoint learning series videos from our incredible product team members:

- "SharePoint: From concept to creation to impact + Live AMA"
- New SharePoint Event microsite on adoption.microsoft.com
- SharePoint Hackathon + upcoming webinar series
- YouTube playlist of all 17 new video assets (main event, learning series, customer voices, and upcoming hackathon webinars)
- 5-part SharePoint learning series
- SharePoint customer stories: Amey, Takeda, and Avanade

The trust and feedback from you all, the SharePoint community, developers, customers, and MVPs, have helped us evolve SharePoint into the best content cloud solution in the world. Thank you!

Related resources

Getting started with SharePoint is a breeze! Check out the latest tutorial on building a SharePoint site.
- NEW agents built in SharePoint adoption guide (adoption.microsoft.com)
- Agents built in SharePoint (adoption.microsoft.com)
- Subscribe to the SharePoint community blog
- Add and upvote feature requests: SharePoint Feedback Portal

[iOS] Custom App Icon still renders as white square
I am following up on a rendering issue where my custom app icon appears as a blank white square on the iOS client.

Current Status: Despite following the standard guidelines, the icon fails to render on iOS (it works perfectly on Android and desktop).

What I have tried (and failed):

- Changed accentColor: Updated the manifest from #FFFFFF to #4B6BF5 (brand blue) to avoid white-on-white issues.
- Version bump: Increased the manifest version from 1.0.17 to 1.0.18 to force cache invalidation.
- Re-installation: Completely uninstalled the app on iOS, cleared app data, and re-added it.
- Direct upload via Teams Admin Center: I tried uploading the app package directly through the Admin Center, but the icon still fails to render on iOS.
- Commercial Marketplace: Users who installed the app via the Commercial Marketplace are also seeing a blank icon.
- "Publish to Org" via Developer Portal: I also tried the "Publish to Org" feature within the Developer Portal, and the result is the same.

Icon validation:

- color.png: 192x192, PNG, transparent background.
- outline.png: 32x32, PNG, pure white pixels only with a transparent background.

Manifest snippet:

{
  "version": "1.0.18",
  "manifestVersion": "1.22",
  "id": "e4fae5da-30ea-4f90-9eae-807b2a13a127",
  "icons": {
    "outline": "outline.png",
    "color": "color.png"
  },
  "accentColor": "#4B6BF5"
}

Questions & request for assistance: Has anyone faced a similar "blank white square" issue on iOS recently? Any suggestions would be greatly appreciated.

Implementing A2A protocol in .NET: A Practical Guide
As AI systems mature into multi-agent ecosystems, the need for agents to communicate reliably and securely has become fundamental. Traditionally, agents built on different frameworks such as Semantic Kernel, LangChain, custom orchestrators, or enterprise APIs do not share a common communication model. This creates brittle integrations, duplicate logic, and siloed intelligence. The Agent2Agent (A2A) protocol addresses this gap by defining a universal, vendor-neutral standard for structured agent interoperability.

A2A establishes a common language for agents, built on familiar web primitives: JSON-RPC 2.0 for messaging and HTTPS for transport. Each agent exposes a machine-readable Agent Card describing its capabilities, supported input/output modes, and authentication requirements. Interactions are modeled as Tasks, which support synchronous, streaming, and long-running workflows. Messages exchanged within a task contain Parts (text, structured data, files, or streams) that allow agents to collaborate without exposing internal implementation details.

By standardizing discovery, communication, authentication, and task orchestration, A2A enables organizations to build composable AI architectures. Specialized agents can coordinate deep reasoning, planning, data retrieval, or business automation regardless of their underlying frameworks or hosting environments. This modularity, combined with industry adoption and Linux Foundation governance, positions A2A as a foundational protocol for interoperable AI systems.

A2A in .NET: Implementation Guide

Prerequisites:

- .NET 8 SDK
- Visual Studio 2022 (17.8+)
- A2A and A2A.AspNetCore packages
- curl/Postman (optional, for direct endpoint testing)

The open-source A2A project provides a full-featured .NET SDK, enabling developers to build and host A2A agents using ASP.NET Core, or to integrate with other agents as a client. The A2A and A2A.AspNetCore packages power the experience.
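Before diving into the SDK, it helps to see the wire format the SDK abstracts away. A message-send request in A2A's JSON-RPC 2.0 framing might look roughly like the sketch below; the method and field names should be checked against the A2A specification for your SDK version, and the ids are placeholders:

```json
{
  "jsonrpc": "2.0",
  "id": "1",
  "method": "message/send",
  "params": {
    "message": {
      "role": "user",
      "messageId": "msg-001",
      "parts": [
        { "kind": "text", "text": "100 USD to EUR" }
      ]
    }
  }
}
```

The SDK's A2AClient and TaskManager types described next generate and consume this framing for you, so you rarely build it by hand.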
The SDK offers:

- A2AClient: call remote agents
- TaskManager: manage incoming tasks and message routing
- AgentCard / Message / Task models: strongly typed protocol objects
- MapA2A(): ASP.NET Core routing integration that auto-generates protocol endpoints

This lets you expose an A2A-compliant agent with minimal boilerplate.

Project Setup

Create two separate projects:

- CurrencyAgentService: an ASP.NET Core web project that hosts the agent
- A2AClient: a console app that discovers the agent card and sends a message

Install the packages from the prerequisites in both projects.

Building a Simple A2A Agent (Currency Agent Example)

Below is a minimal Currency Agent implemented in ASP.NET Core. It responds by converting amounts between currencies.

Step 1: In the CurrencyAgentService project, create a CurrencyAgentImplementation class to implement the A2A agent. The class contains the logic for:

a) Describing itself (agent "card" metadata).
b) Processing incoming text messages like "100 USD to EUR".
c) Returning a single text response with the conversion.

The AttachTo(ITaskManager taskManager) method hooks two delegates on the provided taskManager:

a) OnAgentCardQuery → GetAgentCardAsync: returns agent metadata.
b) OnMessageReceived → ProcessMessageAsync: handles incoming messages and produces a response.

Step 2: In Program.cs of the CurrencyAgentService project, create a TaskManager, attach the agent to it, and expose the A2A endpoint. Typical flow:

- GET /agent → the A2A host asks OnAgentCardQuery → returns the card
- POST /agent with a text message → the A2A host calls OnMessageReceived → returns the conversion text

All fully A2A-compliant.
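The original code listings for Steps 1 and 2 are not included in this text, so here is a minimal sketch of what they could look like. The delegate signatures and model properties are based on the A2A .NET SDK names mentioned above but may differ in your installed package version, and the conversion rate is a made-up placeholder:

```csharp
// CurrencyAgentImplementation.cs (sketch, assumes the A2A .NET SDK)
using System;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;
using A2A;

public class CurrencyAgentImplementation
{
    // Step 1: hook the two delegates onto the task manager.
    public void AttachTo(ITaskManager taskManager)
    {
        taskManager.OnAgentCardQuery = GetAgentCardAsync;
        taskManager.OnMessageReceived = ProcessMessageAsync;
    }

    private Task<AgentCard> GetAgentCardAsync(string agentUrl, CancellationToken ct)
        => Task.FromResult(new AgentCard
        {
            Name = "Currency Agent",
            Description = "Converts amounts between currencies, e.g. '100 USD to EUR'.",
            Url = agentUrl
        });

    private Task<AgentMessage> ProcessMessageAsync(MessageSendParams sendParams, CancellationToken ct)
    {
        // Naive parse of "100 USD to EUR"; a real agent would validate strictly
        // (see Best Practice 7 below) before converting.
        var text = sendParams.Message.Parts.OfType<TextPart>().First().Text;
        var amount = decimal.Parse(text.Split(' ')[0]);
        var converted = amount * 0.92m; // placeholder USD→EUR rate

        return Task.FromResult(new AgentMessage
        {
            Role = MessageRole.Agent,
            MessageId = Guid.NewGuid().ToString(),
            Parts = [new TextPart { Text = $"{text} = {converted} EUR" }]
        });
    }
}

// Program.cs (sketch) — Step 2: wire the agent into ASP.NET Core.
// var builder = WebApplication.CreateBuilder(args);
// var app = builder.Build();
// var taskManager = new TaskManager();
// new CurrencyAgentImplementation().AttachTo(taskManager);
// app.MapA2A(taskManager, "/agent"); // auto-generates the A2A protocol endpoints
// app.Run();
```

With the service running, a GET to /agent should return the card JSON and a POST carrying a text message should return the conversion reply, matching the flow described above.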
Calling an A2A Agent from .NET

To interact with any A2A-compliant agent from .NET, the client follows a predictable sequence: identify where the agent lives, discover its capabilities through the Agent Card, initialize a correctly configured A2AClient, construct a well-formed message, send it asynchronously, and finally interpret the structured response. This keeps your client aligned with the agent's advertised contract and resilient as capabilities evolve. Below are the steps to call the A2A agent from the client:

1. Identify the agent endpoint.
Why: You need a stable base URL to resolve the agent's metadata and send messages.
What: Construct a Uri pointing to the agent service, e.g., https://localhost:7009/agent.

2. Discover agent capabilities via the Agent Card.
Why: Agent Cards provide a contract: name, description, the final URL to call, and features (like streaming). This decouples your client from hard-coded assumptions and enables dynamic capability checks.
What: Use A2ACardResolver with the endpoint Uri, then call GetAgentCardAsync() to obtain an AgentCard.

3. Initialize the A2AClient with the resolved URL.
Why: The client encapsulates transport details and ensures messages are sent to the correct agent endpoint, which may differ from the discovery URL.
What: Create an A2AClient using new Uri(currencyCard.Url) from the Agent Card for correctness.

4. Construct a well-formed agent request message.
Why: Agents typically require structured messages for roles, traceability, and multi-part inputs. A unique message ID supports deduplication and logging.
What: Build an AgentMessage:
- Role = MessageRole.User clarifies intent.
- MessageId = Guid.NewGuid().ToString() ensures uniqueness.
- Parts contains the content; for simple queries, a single TextPart with the prompt (e.g., "100 USD to EUR").

5. Package and send the message.
Why: MessageSendParams can carry the message plus any optional settings (e.g., streaming flags or context).
Using a dedicated params object keeps the API extensible.
What: Wrap the AgentMessage in MessageSendParams and call SendMessageAsync(...) on the A2AClient.
Outcome: Await the asynchronous response to avoid blocking and to stay scalable.

6. Interpret the agent response.
Why: Agents can return multiple Parts (text, data, attachments). Extracting the appropriate part avoids assumptions and keeps your client robust.
What: Cast to AgentMessage, then read the first TextPart's Text for the conversion result in this scenario.

Best Practices

1. Keep Agents Focused and Single-Purpose
Design each agent around a clear, narrow capability (e.g., currency conversion, scheduling, document summarization). Single-responsibility agents are easier to reason about, scale, and test, especially when they become part of larger multi-agent workflows.

2. Maintain Accurate and Helpful Agent Cards
The Agent Card is the first interaction point for any client. Ensure it accurately reflects:
- Supported input/output formats
- Streaming capabilities
- Authentication requirements (if any)
- Version information
A clean and honest card helps clients integrate reliably without guesswork.

3. Prefer Structured Inputs and Outputs
Although A2A supports plain text, using structured payloads through DataPart objects significantly improves consistency. JSON inputs and outputs reduce ambiguity, eliminate prompt-engineering edge cases, and make agent behavior more deterministic, especially when interacting with other automated agents.

4. Use Meaningful Task States
Treat A2A Tasks as proper state machines. Transition through states intentionally (Submitted → Working → Completed, or Working → InputRequired → Completed). This gives clients clarity on progress, makes long-running operations manageable, and enables more sophisticated control flows.
5. Provide Helpful Error Messages
Make use of A2A and JSON-RPC error codes such as -32602 (invalid params) or -32603 (internal error), and include additional context in the error payload. Avoid opaque messages; error details should guide the client toward recovery or correction.

6. Keep Agents Stateless Where Possible
Stateless agents are easier to scale and less prone to hidden failures. When state is necessary, store it externally or pass it through messages or task contexts. For local POCs, in-memory state is acceptable, but design with future statelessness in mind.

7. Validate Input Strictly
Do not assume incoming messages are well-formed. Validate fields, formats, and required parameters before processing. For example, a currency conversion agent should confirm both currencies exist and the value is numeric before attempting a conversion.

8. Design for Streaming Even if Disabled
Streaming is optional, but it is a powerful pattern for agents that perform progressive reasoning or long computations. Structuring your logic so it can later emit partial TextPart updates makes it easy to upgrade from synchronous to streaming workflows.

9. Include Traceability Metadata
Embed and log identifiers such as TaskId, MessageId, and timestamps. These become crucial for debugging multi-agent scenarios, improving observability, and correlating distributed workflows, especially once multiple agents collaborate.

10. Offer Clear Guidance When Input Is Missing
Instead of returning a generic failure, consider shifting the task to InputRequired and explaining what the client should provide. This improves usability and makes your agent self-documenting for new consumers.

Bot configuration config/fetch returns same channel ID when switching channels within same team
Summary

When invoking a bot's configuration flow (configuration.fetchTask) via an @mention-based settings entry point across different channels within the same team, the request body consistently contains the same channelId, even after switching channels. https://learn.microsoft.com/en-us/microsoftteams/platform/bots/how-to/bot-configuration-experience?tabs=teams-bot-sdk1%2Cteams-bot-sdk2%2Cteams-bot-sdk3 This makes it impossible to reliably determine which channel initiated the configuration.

Environment

- Microsoft Teams (desktop client)
- Bot installed at team scope
- Same team, multiple channels
- Bot supports team scopes
- Configuration enabled via manifest:

"configuration": {
  "team": {
    "fetchTask": true
  }
}

Steps to Reproduce

1. Install the bot into a team with multiple channels
2. Navigate to Channel A
3. Invoke the bot configuration using an @Bot settings/configuration entry point
4. Observe the incoming config/fetch request payload and note the channelId in the request body
5. Switch to Channel B (same team)
6. Invoke the same configuration entry point again and observe the incoming config/fetch request payload

Expected Behavior

The config/fetch request body should include a channel identifier corresponding to the channel where the configuration was initiated, e.g. channelData.channel.id, or another channel-scoped identifier that uniquely identifies the initiating channel.

Actual Behavior

- The config/fetch request body always contains the same channel ID
- The channel ID does not change when switching channels
- The value appears to be either the team's General channel or a cached/team-level channel context

As a result, the bot cannot determine which channel the user intended to configure.
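For reference, the kind of channel-scoped context expected in the incoming invoke activity would look something like the fragment below. This is an illustrative sketch following the general Teams channelData shape; the ids are placeholders and the exact payload for config/fetch may differ:

```json
{
  "name": "config/fetch",
  "channelData": {
    "team": { "id": "19:team-thread-id@thread.tacv2" },
    "channel": { "id": "19:channel-b-thread-id@thread.tacv2" }
  }
}
```

The report above is that the channel.id-style value stays fixed across channels instead of reflecting the initiating channel.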
Impact

This behavior prevents implementing per-channel configuration using the bot configuration experience because:

- Configuration requests cannot be reliably scoped to the initiating channel
- All configuration actions appear to target the same channel
- Users configuring different channels in the same team unintentionally overwrite the same settings

Additional Notes

- This occurs within the same team
- Reproduced consistently across multiple channels
- Observed even when the configuration is invoked after explicitly switching channels
- The behavior suggests the configuration context may be team-scoped rather than channel-scoped, but this is not clearly documented

SharePoint Online Content Security Policy (CSP): Enforcement Dates and Guidance
Content Security Policy (CSP) is a critical browser security feature designed to protect web applications. SharePoint Online is rolling out CSP enforcement for all tenants starting March 1, 2026.

Data Quality Error (Internal Service Error)
I am facing an issue while running the DQ scan. Both manual and scheduled scans fail with an Internal Service Error (DataQualityInternalError: Internal service error occurred. Please retry or contact Microsoft support). Data profiling runs successfully, but DQ does not work for any asset. After the lineage patch that Microsoft shipped, they introduced a Custom SQL option for creating rules, and I have been facing this issue only since then. Is anyone else facing the same? I tried different data sources (ADLS and Synapse) and it is the same for both. If anyone has an idea, do share it here; it will be helpful.

What if post adaptive card and wait for a response timeout
Hi Team,

Scenario: Post an adaptive card to the team chat. If the user doesn't respond within 2 hours, the adaptive card should be updated to "Expired" with a new design in the same conversation. To implement this, I used "Post adaptive card in a chat and wait for a response."

Challenges: What happens if I post an adaptive card and the wait for a response times out? I don't get a response; an error is thrown instead. My expectation is that, even if a timeout occurs, the action should return a Message Id, since the card has already been posted and has a message ID. Would you kindly respond if you have discovered a solution?

Moving a column of text data into 3 columns of data?
I have a column of text data: cells 1, 2, 3, 4, 5, 6, 7, 8, 9 and longer. I want to create 3 columns of data to graph and manipulate, with cells in columns like: 1,2,3 / 3,4,5 / 5,6,7 / 8,9,10 and longer. So I need to create 3 columns of data from 1 column of data. I am using Mac Excel 16 and I cannot make this happen. I have tried all sorts of solutions. Help? Thank you.

Call SharePoint Server Subscription Edition REST API via App Only Authentication
I need to call the SharePoint Server Subscription Edition REST API via app-only authentication and found the article below: https://learn.microsoft.com/en-us/sharepoint/dev/solution-guidance/security-apponly-azureacs

Can I do this using TokenHelper.cs as suggested in the article? It seems the request tries to access ACS because of the settings below in TokenHelper.cs, and requests time out since the code is executed from a server with no access to Azure ACS.

private static string GlobalEndPointPrefix = "accounts";
private static string AcsHostUrl = "accesscontrol.windows.net";

Any support on how to do this for on-prem SharePoint is really appreciated.

Data Security: Azure Key Vault in Databricks
Why this article?

To remove the vulnerability of exposing the database connection string directly in a Databricks notebook, by using Azure Key Vault. Database connection strings are extremely confidential, vulnerable data that we should not expose in a Databricks notebook explicitly. Azure Key Vault is a secure option to read the secrets and establish the connection.

What do we need?

- Tenant Id of the app registration with access to the Azure Key Vault secrets
- Client Id of the app registration with access to the Azure Key Vault secrets
- Client secret of the app registration with access to the Azure Key Vault

Where to find this information?

Under the app registration, you can find the (application) Client Id and the Directory (tenant) Id. The client secret value is found in the app registration, under Manage -> Certificates & secrets. You can use an existing secret or create a new one and use it to access the Key Vault secrets. Make sure the application is granted Get access to read the secrets. Verify that the key vault you are using in Databricks is the one the application can read: go to the Azure Key Vault -> Access policies and search for the application name. It should show up in the search, which confirms the application's access.

What do we need to set up in the Databricks notebook?

Open your cluster and install azure.keyvault and azure-identity (the installed versions should be compatible with your cluster configuration; refer to https://docs.databricks.com/aws/en/libraries/package-repositories). In a new notebook, start by importing the necessary modules. Your notebook would start with the modules, followed by the tenantId, clientId, client secret, Azure Key Vault URL, the secretName of the connection string in the Key Vault, and the secretVersion.
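The article's code cells are not included in this text, so here is a sketch of what the setup and fetch could look like, assuming the azure-identity and azure-keyvault-secrets packages; the vault URL, secret name, and credential values are placeholders you would replace with your own:

```python
# Sketch: fetch a DB connection string from Azure Key Vault in a Databricks notebook.
# Assumes azure-identity and azure-keyvault-secrets are installed on the cluster.
from azure.identity import ClientSecretCredential
from azure.keyvault.secrets import SecretClient

tenant_id = "<tenant-id>"          # Directory (tenant) Id from the app registration
client_id = "<client-id>"          # Application (client) Id
client_secret = "<client-secret>"  # Secret value from Certificates & secrets
vault_url = "https://<your-vault-name>.vault.azure.net/"  # placeholder vault URL
secret_name = "db-connection-string"                      # placeholder secret name

# Authenticate as the app registration that has Get access to secrets.
credential = ClientSecretCredential(tenant_id, client_id, client_secret)
client = SecretClient(vault_url=vault_url, credential=credential)

# Fetch the secret; pass version=... to get_secret to pin a specific secretVersion.
connection_string = client.get_secret(secret_name).value
```

Keeping the tenant/client values themselves out of the notebook (for example, in a Databricks secret scope) is a further hardening step beyond this sketch.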
Lastly, we fetch the secret, and voilà, we have the DB connection string to perform the CRUD operations.

Conclusion

By securely retrieving your database connection string from Azure Key Vault, you eliminate credential exposure and strengthen the overall security posture of your Databricks workflows. This simple shift keeps your notebooks clean, compliant, and production-ready.