Conditional Access for Canvas Apps with Entra
In today's Power Platform landscape, administrators face a tough task: securing an ever-increasing inventory of canvas apps across the tenant. Canvas apps often connect to sensitive data, run on a variety of devices, and serve diverse groups of users. That is why Conditional Access has become one of the most powerful tools in an admin’s toolkit, giving you fine-grained control over how, where, and under what conditions users can access your apps. In this post, I will walk through what Conditional Access means for canvas apps, how it empowers admins to maintain strong security without adding friction for legitimate users, and example steps to apply your own Conditional Access policies to an app with PowerShell.

What Conditional Access Brings to Canvas Apps

Conditional Access brings granular, app-level security controls from Microsoft Entra ID directly into Power Apps. Instead of applying blanket restrictions across the entire tenant, you can enforce requirements—like MFA, compliant devices, or trusted networks—only on the apps that need them. This lets you match security to the sensitivity of each individual app.

Key Benefits for Admins

- Tailored protection for sensitive apps: Not every app requires strict controls. Conditional Access allows you to tighten security only for apps that handle sensitive or regulated data, without over-restricting everything else.
- Control access by device type: Admins can easily block or allow specific device categories—like preventing mobile access to a high-risk app or requiring managed devices for apps that contain confidential information.
- Alignment with Zero Trust: Conditional Access enforces identity, device, and session checks in real time, supporting a Zero Trust approach without adding unnecessary friction for legitimate users.
- Environment-specific flexibility: You can apply stricter policies in production and lighter ones in development or testing, helping teams build efficiently while keeping sensitive environments locked down.
A Stronger Security Model

Conditional Access does not replace existing app or data permissions—it complements them. App-level security roles control what users can do inside an app, while Conditional Access governs whether they can get into the app at all. Together, they create a much more robust security posture.

How to Enable Conditional Access for a Canvas App: An Example

In this example, I will detail the steps to set up Conditional Access for a canvas app so that tenant guest users cannot access the app.

Step 1: Create an Authentication Context in Entra ID

1. Go to the Microsoft Entra admin center.
2. Navigate to Protection → Conditional Access → Authentication context.
3. Click + New authentication context.
4. Name it (e.g., BlockGuests_PowerAppX).
5. Enable Publish to apps.
6. Save and note the Authentication Context ID.

Step 2: Create a Conditional Access Policy

1. Go to Conditional Access → Policies → + New policy.
2. Name the policy (e.g., Block Guests from Power App X).
3. Assignments → Users or workload identities → Include: Guest or external users.
4. Target resources: choose Authentication context and select the one you created earlier.
5. Access controls → Grant: select Block access.
6. Enable the policy and click Create.

Step 3: Assign the Authentication Context to the Power App

Use PowerShell to bind the authentication context to the specific Power App:

1. Open PowerShell as Administrator.
2. Connect to Power Apps:

```powershell
Add-PowerAppsAccount
```

3. Run the command to attach the context to your canvas app:

```powershell
Set-AdminPowerAppConditionalAccessAuthenticationContextIds `
    -EnvironmentName "<your-environment-name>" `
    -AppName "<your-app-id>" `
    -AuthenticationContextIds "<your-auth-context-id>"
```

This binding tells Power Apps: “When this app opens, trigger the Conditional Access policy tied to this context.”

Step 4: Test the Policy

Try accessing the app as a guest user. You should see access blocked based on the Conditional Access policy.
Wrap Up

Bottom Line: Conditional Access gives admins the flexibility to apply the right security to the right app. Whether you are enforcing MFA, restricting device types, or securing production environments, it helps you protect sensitive data without slowing down the organization.

Documentation for further reading:
Manage Power Apps - Power Platform | Microsoft Learn
Demo from Power CAT: Conditional Access Policies for Canvas Apps - Power CAT Live

Drive demand for your offers with solution area campaigns in a box
Take your marketing campaigns further with campaigns in a box (CiaBs): collections of partner-ready, high-quality marketing assets designed to deepen customer engagement, simplify your marketing efforts, and grow revenue. Microsoft offers both new and refreshed campaigns for the following solution areas: Data & AI (Azure), Modern Work, Business Applications, Digital & App Innovation (Azure), Infrastructure (Azure), and Security. Check out the latest CiaBs below and get started today by visiting the Partner Marketing Center, where you’ll find resources such as step-by-step execution guides, customizable nurture tactics, and assets including presentations, e-books, infographics, and more.

AI transformation

Generate interest in AI adoption among your customers. As AI technology grabs headlines and captures imaginations, use this campaign to illustrate your audience’s unique opportunity to harness the power of AI to deliver value faster. Learn more about the campaign:

- AI Transformation (formerly Era of AI): Show audiences how to take advantage of the potential of AI to drive business value and showcase the usefulness of Microsoft AI solutions delivered and tailored by your organization.

Data & AI (Azure)

Our Data & AI campaigns demonstrate how your customers can win customers with AI-enabled differentiation. Show how they can transform their businesses with generative AI, a unified data estate, and solutions like Microsoft Fabric, Microsoft Power BI, and Azure Databricks. Campaigns include:

- Unify your intelligent data - Banking: Help your banking customers understand how you can help them break down data silos, meet compliance demands, and deliver on rising customer expectations.
- Innovate with the Azure AI platform: Help your customers understand the potential of generative AI solutions to differentiate themselves in the market—and inspire them to build GenAI solutions with the Azure AI platform.
- Unify your intelligent data and analytics platform - ENT: Show enterprise audiences how unifying data and analytics on an open foundation can help streamline data transformation and business intelligence.
- Unify your intelligent data and analytics platform - SMB: Create awareness and urgency for SMBs to adopt Microsoft Fabric as the AI-powered, unified data platform that will suit their analytics needs.

Modern Work

Our Modern Work campaigns help current and potential customers understand how they can effectively transform their businesses with AI capabilities. Campaigns include:

- Connect and empower your frontline workers: Empower your customers' frontline workers with smart, AI-enhanced workflows based on Microsoft Teams and Microsoft 365 Copilot. Use this campaign to show your customers how they can make their frontline workers feel more connected, leading to improved productivity and efficiency.
- Microsoft 365 Copilot SMB: Increase your audience's understanding of the larger potential of Microsoft 365 Copilot and how AI capabilities can accelerate growth and transform operations.
- Smart workplace with Teams: Use this campaign to show your customers how to use AI to unlock smarter communication and collaboration with Microsoft Teams and Microsoft 365 Copilot. This campaign demonstrates to customers how you can help them seamlessly integrate meetings, calls, chat, and collaboration to break down silos, gain deeper insights, and focus on the work that matters.
- Cloud endpoints: Help customers bring virtualized applications to the cloud by providing secure, AI-powered productivity and development on any device with Microsoft Intune Suite and Windows in the cloud solutions.

Business Applications

Nurture interest with audiences ready to modernize and transform their business operations with these BizApps go-to-market resources.
Campaigns include:

- AI-powered customer service: Highlight how AI-powered solutions like Microsoft Dynamics 365 are transforming customer service with more personalized experiences, smarter teamwork, and improved efficiency.
- Migrate and modernize your ERP with Microsoft Dynamics 365: Position yourself as the right partner to modernize or replace your customers' legacy on-premises ERP systems with a Copilot-powered ERP.
- Business Central for SMB: Offer customers Microsoft Dynamics 365 Business Central, a comprehensive business management solution that connects finance, sales, service, and operations teams with a single application to boost productivity and improve decision-making.
- AI-powered CRM: Help your customers enhance their customer experiences and close more deals with Microsoft Dynamics 365 Sales by making data AI-ready, which empowers them to create effective marketing content with Microsoft 365 Copilot and pass qualified leads on to sales teams. Use this campaign to show audiences how Copilot and AI can supercharge their CRM to increase productivity and efficiency, ultimately leading to better customer outcomes.
- Modernize at scale with AI and Microsoft Power Platform: This campaign is designed to introduce the business value unlocked with Microsoft Power Platform, show how low-code solutions can accelerate development and drive productivity, and position your company as a valuable asset in the deployment of these solutions.

Digital & App Innovation (Azure)

Position yourself as the strategic AI partner of choice and empower your customers to grow their businesses by helping them gain agility and build new AI applications faster with intelligent experiences. Campaigns include:

- Build and modernize AI apps: Help customers building new AI-infused applications and modernizing their application estate take advantage of the Azure AI application platform.
- Accelerate developer productivity: Help customers reimagine the developer experience with the world’s most-adopted AI-powered platform. Use this campaign to show customers how you can use Microsoft and GitHub tools to help streamline workflows, collaborate better, and deliver intelligent apps faster.

Infrastructure (Azure)

Help customers tap into the cloud to expand capabilities and boost their return on investment by transforming their digital operations. Campaigns include:

- Modernize VDI to Azure Virtual Desktop - SMB: Show SMB customers how they can meet the challenges of virtual work with Azure Virtual Desktop and gain flexibility, reliability, and cost-effectiveness.
- Migrate VMware workloads to Azure: Help customers capitalize on the partnership between VMware and Microsoft so they can migrate VMware workloads to Azure in an efficient and cost-effective manner.
- Migrate and secure Windows Server, SQL Server, and Linux - ENT: Showcase the high return on investment (ROI) of using an adaptive cloud purpose-built for AI workloads, and help customers understand the value of migrating to Azure at their own pace.
- Modernize SAP on the Microsoft Cloud: Reach SAP customers before the 2027 end-of-support deadline for SAP ECC to show them the importance of having a plan to migrate to the cloud. This campaign also underscores the value of moving to Microsoft Azure in the era of AI.
- Migrate and secure Windows Server, SQL Server, and Linux estate - SMB: Use this campaign to increase understanding of the value gained by migrating from an on-premises environment to a hybrid or cloud environment. Show small and medium-sized businesses that they can grow their business, save money, improve security, and more when they move their workloads from Windows Server, SQL Server, and Linux to Microsoft Azure.

Security

Demonstrate the power of modern security solutions and help customers understand the importance of robust cybersecurity in today’s landscape.
Campaigns include:

- Defend against cybersecurity threats: Increase your audience's understanding of the powerful, AI-driven Microsoft unified security platform, which integrates Microsoft Sentinel, Microsoft Defender XDR, Security Exposure Management, Security Copilot, and Microsoft Threat Intelligence.
- Data security: Show customers how Microsoft Purview can help fortify data security in a world facing increasing cybersecurity threats.
- Modernize security operations: Use this campaign to sell Microsoft Sentinel, an industry-leading cloud-native SIEM that can help your customers stay protected and scale their security operations.

FY24 Business Applications Partner Incentives Overview - November!
When: November 7, 2023, 4:00 PM (Europe/Dublin). Register here.

Join us to learn about the priorities and strategy for the Business Applications solution area. We’ll discuss how you can integrate partner incentives into your business strategy to grow your business and deliver excellent customer value. Topics include:

- Trends shaping partner opportunity in Business Applications
- How Microsoft’s partner incentives portfolio is aligned to our solution priorities in FY24
- Changes coming to partner incentives in FY24
- Earning opportunities available in Microsoft Commerce Incentives

Register here!

FY24 Business Applications Partner Incentives Overview
Summary: Join us to learn about the priorities and strategy for the Business Applications solution area. We’ll discuss how you can integrate partner incentives into your business strategy to grow your business and deliver excellent customer value. Content is focused on partners interested in earning incentives when delivering Business Applications solutions to customers. Topics include:

- Trends shaping partner opportunity in Business Applications
- How Microsoft’s partner incentives portfolio is aligned to our solution priorities in FY24
- Changes coming to partner incentives in FY24
- Earning opportunities available in Microsoft Commerce Incentives

When: August 2, 2023, 3:30 PM (Europe/Dublin)
Duration: 60 minutes
Click here to register

Breaking the Shackles of Legacy Portals: Power Pages as Enterprise SaaS
It's time to stop building "portals" and start deploying enterprise SaaS. For years, enterprise teams building web portals have been shackled by rigid Dynamics 365 schemas and heavy, template-driven UIs. Traditional Power Apps Portals required developers to follow the portal's own schema structure—page templates, web forms, lists, content snippets—and inherit data models dictated by D365 modules. That era is over. Power Pages has evolved into a secure, enterprise-grade, low-code SaaS platform for creating, hosting, and administering business websites—and as of early 2026, two milestone GA releases have removed the last remaining constraints. Here are six ways those shackles are broken:

🎨 1. UI Liberation with Single-Page Applications — Now GA

Single-page application (SPA) support in Power Pages reached general availability on February 8, 2026, starting with site version 9.8.1.x and later. Developers can now build fully custom, client-side rendered web applications using React, Angular, or Vue and deploy them directly to Power Pages using the Power Platform CLI. This is not a workaround or a bolt-on—Microsoft describes this GA release as making the SPA experience "production ready".

What this means in practice: the traditional portal constructs—ASP.NET and Liquid templates, web forms, lists—become optional implementation details, not architectural constraints. Your UI is completely custom and API-driven, calling Power Pages Web APIs for all data operations. The GA release also resolved issues where Power Pages platform styles could override custom CSS, and included updated guidance for authentication configuration and local development setup. Developers can run SPAs locally with full authentication and Web API access, enabling JavaScript hot reload and local debugging without deploying changes to the portal on every iteration. At this point, the traditional portal schema becomes an implementation option—not a constraint.
(Ensure your Power Platform CLI is on the latest version for full capabilities.)

🗄️ 2. Data Model Autonomy — Your Entities, Your Rules

Power Pages connects to Microsoft Dataverse, but you are no longer forced to borrow a Dynamics 365 schema. Teams can design their own data model from scratch—whether it has five tables or hundreds with complex relationships—tailored to the business domain. Those custom Dataverse tables serve the SPA directly via Web APIs, without needing to build model-driven or canvas apps.

This is a fundamental departure. The platform uses the same shared business data stored in Dataverse that other Power Platform components can leverage, but your portal is no longer tethered to any pre-existing Dynamics module. You own your entity model entirely. The result: headless CMS flexibility backed by the security and reliability of Dataverse, without the overhead of a CRM schema you didn't ask for.

☁️ 3. Fully Managed Platform — No Infrastructure Burdens

Goodbye, custom web hosting and plumbing. Power Pages is a fully managed SaaS platform—Microsoft handles provisioning, hosting, CDN, scaling, and availability. Authentication is built in, with full support for enterprise identity providers including Microsoft Entra ID and Microsoft Entra External ID, along with table permissions and web roles enforced on every API call. Organizations can also allow anonymous access or configure private sites as needed.

Even advanced backend needs are now covered natively. Server Logic in Power Pages reached general availability on April 1, 2026, delivering native server-side JavaScript execution with the maturity, governance, and extensibility required for enterprise production workloads. Alongside GA, Microsoft announced two enhancements that reinforce enterprise readiness:

- Governance control to disable external calls: administrators can restrict outbound connectivity from the Server Logic layer to comply with internal policies and regulatory requirements.
- Support for unbound Dataverse custom actions: enabling deeper integration with existing business logic layers.

The result? Teams focus only on business logic, integrations, and user experience. As Hope Bradford, Senior Director of IT at Kelly Staffing, stated: "Power Pages lets us build personalized client experiences without managing complex infrastructure while maintaining enterprise trust and security." Kelly Staffing's Helix UX portal, built on Power Pages, Dataverse, and Power Automate, now handles over 38,000 client interactions per day.

🛡️ 4. Enterprise-Grade Security and Telemetry

Security and governance are first-class citizens on the platform. The 2025–2026 release wave introduced enterprise-grade controls for Power Pages, including role-based access and authentication through Entra, data loss prevention (DLP) rules for external data access, IP-based restrictions, maintenance mode options, and built-in diagnostics and monitoring dashboards. Across the broader Power Platform, Microsoft is investing in enterprise observability—the April 2026 update introduced alerting and data metrics in Power Platform Monitor (covering metrics such as app open success rate, time to interactive, data request success rate, and data request latency), enabling IT teams to define health thresholds, receive proactive notifications, and take guided action. This level of governance—audit, monitoring, diagnostics—traditionally required significant custom engineering. Now it is out of the box.

💲 5. Scalable, Usage-Based Licensing

One of the most significant licensing shifts: Power Pages became its own product, decoupled from Power Apps licensing entirely. Both internal and external users now fall under the same licensing model, making Power Pages viable for internal use cases like HR services and request management—not just external portals.
The model is usage-based (monthly active users), purchased as capacity packs per site:

- Authenticated users (pre-paid): $200 per site/month for 100 users
- Anonymous users (pre-paid): $75 per site/month for 500 users
- Authenticated users (pay-as-you-go): $4.00 per user/site/month, on demand
- Anonymous users (pay-as-you-go): $0.30 per user/site/month, on demand

Each authenticated-user subscription plan includes 2 GB of database capacity and 16 GB of file capacity. For applications serving tens of thousands of users, this capacity-based model is strategically superior to per-user or per-app seat licenses. Pay-as-you-go costs roughly twice as much as pre-paid capacity packs but suits seasonal or unpredictable usage patterns (e.g., tax season, annual HR enrollment). Tradeoff to consider: pre-paid packs require upfront commitment and do not roll over month to month, so organizations with highly variable traffic must carefully model usage to avoid over- or under-provisioning.

⚠️ Pricing disclaimer: The figures above are illustrative examples sourced from publicly available Microsoft documentation. Actual costs may vary based on customer type (enterprise vs. corporate), volume commitments, negotiated agreements, and account structure. Final pricing is determined through Microsoft account teams and contracts.

⚡ 6. Rapid Modernization with AI-Assisted Development

Power Pages now integrates directly with AI-assisted development workflows. Microsoft announced the public preview of the Power Pages plugin for GitHub Copilot CLI and Claude Code on February 24, 2026, providing an AI-assisted workflow for creating, deploying, and managing modern SPA sites on Power Pages. Developers can scaffold pages, configure data access, and wire up logic using natural language commands, dramatically reducing the time to modernize large enterprise applications. SPAs are deployed using Power Platform CLI commands, and the entire development loop is designed to be streamlined for professional developers.
This means that even large, complex in-house enterprise applications—hundreds of tables, complex relationships, tens of thousands of users—can be remodeled on Power Pages far more efficiently than legacy approaches allowed. You migrate your own custom model into Dataverse, build your SPA, wire up integrations, and the platform handles everything else.

The Bottom Line

If you are still managing custom Azure websites, maintaining SQL servers, or stitching together bespoke PaaS stacks for internal business tools, you are carrying unnecessary operational weight. Power Pages is no longer just a D365 portal. It is a fully managed, enterprise-grade SaaS platform that gives you total UI freedom (SPA support: GA since February 2026), native server-side logic (GA since April 2026), your own data architecture without D365 schema dependencies, built-in security and governance, and a licensing model that scales to enterprise volumes. The industry is underestimating this shift. The shackles are off. Deploy, don't build.

When Your CRM Plays Hide-and-Seek: The Mystery of Missing Columns in Dynamics 365
What Happened?

Personal views across multiple organizations started showing empty columns, even though the data was there. For businesses that rely on these views for daily decisions, this isn’t a normal glitch—end users can easily assume that “no data” in the view means no data in the record and make decisions based on an incomplete picture of customer information.

The Detective Work

Our investigation uncovered the culprit: a mismatch between two behind-the-scenes players—View XML and Fetch XML. Think of them as the blueprint and the builder. When they don’t talk to each other properly, your view looks fine but can’t fetch the data it needs.

Why It Matters

This isn’t just a tech hiccup—it’s a reminder of how small cracks in system design can ripple into big business headaches. It also highlights the need for smarter automation and better error detection in enterprise platforms.

The Fix (and the Frustration)

The good news? A manual fix was identified over a year ago. The bad news? It was manual, and many impacted users didn’t even know that their views were broken. We needed a better solution, and now we have one. Starting in October 2025, Microsoft rolled out a behind-the-scenes fix option. Once it’s turned on, any time a user opens a corrupted view, that view will be automatically updated to display the correct data. If the user has permission to edit the view, the updates will be saved so that the view is permanently fixed. But there’s still a catch: Microsoft doesn’t want to enable a process that makes data changes (in this case, the data is the view definition) without your company’s permission.
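To make the blueprint/builder analogy concrete, here is a simplified, hypothetical illustration of the mismatch. The XML fragments and the `missing_fetch_columns` helper are my own inventions for this sketch, not the actual Dynamics 365 view definitions; the real layout XML and Fetch XML are richer, but the failure mode is the same: a column the layout renders is never retrieved by the fetch, so it displays empty.

```python
import xml.etree.ElementTree as ET

# Hypothetical, simplified view definition: the layout renders three columns,
# but the fetch only retrieves two -- so 'emailaddress1' shows up empty.
layout_xml = """<grid><row>
  <cell name="name" width="150"/>
  <cell name="telephone1" width="100"/>
  <cell name="emailaddress1" width="150"/>
</row></grid>"""

fetch_xml = """<fetch><entity name="account">
  <attribute name="name"/>
  <attribute name="telephone1"/>
</entity></fetch>"""

def missing_fetch_columns(layout: str, fetch: str) -> set:
    """Return columns the layout will render but the fetch never retrieves."""
    layout_cols = {cell.get("name") for cell in ET.fromstring(layout).iter("cell")}
    fetch_cols = {attr.get("name") for attr in ET.fromstring(fetch).iter("attribute")}
    return layout_cols - fetch_cols

print(missing_fetch_columns(layout_xml, fetch_xml))  # {'emailaddress1'}
```

A non-empty result is exactly the "blueprint without a builder" situation described above; the October 2025 fix effectively repairs the fetch side so the two definitions agree again.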
If your organization is running into this issue, here’s a quick test to confirm that the new Microsoft fix will work for you:

1. Identify views that you know are not rendering properly and that you will be able to access. (If you create a copy of a corrupted view, the copy will also have the corruption, so you can create additional views to test.)
2. In your browser URL bar, append the following to the end of your Dynamics URL: &flags=FCB.DataSetViewFixMissingFetchColumns=true. This enables a “feature flag” that fixes the views issue, but only for the user that added the flag to the URL and only until their session expires. Other users will not see the update. If your URL looks like this:

https://org7909d641.crm.dynamics.com/main.aspx?appid=12345678-1234-1234-1234-123456789012

… make it look like this and hit Enter:

https://org7909d641.crm.dynamics.com/main.aspx?appid=12345678-1234-1234-1234-123456789012&flags=FCB.DataSetViewFixMissingFetchColumns=true

3. Wait for Dynamics to reload.
4. Test opening the corrupted views. You’ll see that they’re magically working as expected.
5. When you are satisfied that this is working for you as desired, open a case with Microsoft Support and request that your organization be enabled for the DataSetViewFixMissingFetchColumns FCB. This will enable the fix for all users across your Dynamics organization.

Takeaway: If your CRM starts acting like a magician hiding data, don’t panic. The data is still there—you just need to coax it back with the right fix. And now there’s an option to make sure this issue goes away for good.

🚀 Export D365 CE Dataverse Org Data to Cosmos DB via the Office365 Management Activity API
📘 Preface

This post demonstrates one method to export Dynamics 365 Customer Engagement (CE) Dataverse organization data using the Office 365 Management Activity API and Azure Functions. Customers can build a custom lakehouse architecture on top of this feed, enabling advanced analytics, archiving, or ML/AI scenarios.

https://learn.microsoft.com/en-us/office/office-365-management-api/office-365-management-activity-api-reference

🧭 When to Use This Custom Integration

While Microsoft offers powerful native integrations like Dataverse Synapse Link and Microsoft Fabric, this custom solution remains relevant in the following scenarios:

- Third-party observability and security tools already use this approach. Solutions such as Splunk and other enterprise-grade platforms commonly implement integrations based on the Office 365 Management Activity API to ingest tenant-wide audit data. This makes it easier for customers to align with existing observability pipelines or security frameworks.
- Customers opt out of Synapse Link or Fabric. Whether due to architectural preferences, licensing constraints, or specific compliance requirements, some customers choose not to adopt Microsoft’s native integrations. The Office Management API offers a viable alternative for building custom data export and monitoring solutions tailored to their needs.

🎯 Why Use the Office 365 Management Activity API?

- Tenant-wide data capture: Captures audit logs and activity data across all Dataverse orgs in a tenant.
- Integration flexibility: Enables export to Cosmos DB, cold storage, or other platforms for analytics, compliance, or ML/AI.
- Third-party compatibility: Many enterprise tools use similar mechanisms to ingest and archive activity data.

🏗️ Architecture Overview

- Azure Function App (.NET Isolated): Built as a webhook; processes notifications, fetches audit content, and stores filtered events in Cosmos DB.
- Cosmos DB: Stores audit events for further analysis or archiving.
- Application Insights: Captures logs and diagnostics for troubleshooting.

🛠️ Step-by-Step Implementation

https://learn.microsoft.com/en-us/office/office-365-management-api/get-started-with-office-365-management-apis#build-your-app

1. Prerequisites

- Azure subscription
- Dynamics 365 CE environment (Dataverse)
- Azure Cosmos DB account (SQL API)
- Office 365 tenant admin rights
- Auditing enabled in the Dataverse org

2. Register an Azure AD App

- Go to Azure Portal > Azure Active Directory > App registrations > New registration
- Note the Application (client) ID and Directory (tenant) ID
- Create a client secret
- Grant API permissions: ActivityFeed.Read, ActivityFeed.ReadDlp, ServiceHealth.Read
- Grant admin consent

3. Set Up Cosmos DB

- Create a Cosmos DB account (SQL API)
- Create a database (officewebhook) and a container (dynamicsevents) with partition key /tenantId
- Note the endpoint URI and primary key

4. Create the Azure Function App

- Use Visual Studio or VS Code
- Create a new Azure Functions project (.NET 8 Isolated Worker)
- Add NuGet packages: Microsoft.Azure.Functions.Worker, Microsoft.Azure.Cosmos, Newtonsoft.Json
- Function logic: webhook validation, notification processing, audit content fetching, event filtering, storage in Cosmos DB

5. Configure Environment Variables

```json
{
  "OfficeApiTenantId": "<your-tenant-id>",
  "OfficeApiClientId": "<your-client-id>",
  "OfficeApiClientSecret": "<your-client-secret>",
  "CrmOrganizationUniqueName": "<your-org-name>",
  "CosmosDbEndpoint": "<your-cosmos-endpoint>",
  "CosmosDbKey": "<your-cosmos-key>",
  "CosmosDbDatabaseId": "officewebhook",
  "CosmosDbContainerId": "dynamicsevents",
  "EntityOperationsFilter": {
    "incident": ["create", "update"],
    "account": ["create"]
  }
}
```
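To illustrate how the EntityOperationsFilter setting gates incoming events, here is a small Python sketch. It is illustrative only: `should_store` is a name of my own, the sample records are hypothetical, the field names mirror the C# snippets in this post, and the organization-name check is omitted for brevity.

```python
import json

# Same shape as the EntityOperationsFilter setting shown above.
entity_ops_filter = json.loads('{"incident": ["create", "update"], "account": ["create"]}')

def should_store(event: dict) -> bool:
    """Keep only Dynamics/Power Platform events whose entity and operation
    both appear in the configured filter; everything else is ignored."""
    if event.get("Workload") not in ("Dynamics 365", "CRM", "Power Platform"):
        return False
    allowed_ops = entity_ops_filter.get(event.get("EntityName", ""), [])
    return event.get("Message", "") in allowed_ops

# Hypothetical sample records shaped like Dynamics 365 audit events
print(should_store({"Workload": "Dynamics 365", "EntityName": "incident", "Message": "update"}))  # True
print(should_store({"Workload": "Exchange", "EntityName": "incident", "Message": "update"}))      # False
print(should_store({"Workload": "CRM", "EntityName": "account", "Message": "update"}))            # False
```

The Azure Function applies the same three-part gate (workload, entity, operation) before ever fetching full audit content, which keeps Cosmos DB writes limited to the records you actually care about.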
6. Deploy the Function App

- Build and publish using Azure Functions Core Tools or Visual Studio
- Restart the Function App from the Azure Portal
- Monitor logs via Application Insights

🔔 How to Subscribe to the Office 365 Management Activity API for Audit Notifications

To receive audit notifications, you must first subscribe to the Office 365 Management Activity API. This is a two-step process:

https://learn.microsoft.com/en-us/office/office-365-management-api/office-365-management-activity-api-reference#start-a-subscription

1. Fetch an OAuth2 Token

Authenticate using your Azure AD app credentials to get a bearer token:

https://learn.microsoft.com/en-us/office/office-365-management-api/get-started-with-office-365-management-apis

```powershell
# Define your Azure AD app credentials
$tenantId = "<your-tenant-id>"
$clientId = "<your-client-id>"
$clientSecret = "<your-client-secret>"

# Prepare the request body for the token fetch
$body = @{
    grant_type    = "client_credentials"
    client_id     = $clientId
    client_secret = $clientSecret
    scope         = "https://manage.office.com/.default"
}

# Fetch the OAuth2 token
$tokenResponse = Invoke-RestMethod -Method Post -Uri "https://login.microsoftonline.com/$tenantId/oauth2/v2.0/token" -Body $body
$token = $tokenResponse.access_token
```

2. Subscribe to the Content Type

Use the token to subscribe to the desired content type (e.g., Audit.General):

https://learn.microsoft.com/en-us/office/office-365-management-api/office-365-management-activity-api-reference#working-with-the-office-365-management-activity-api

```powershell
$contentType = "Audit.General"
$headers = @{
    Authorization  = "Bearer $token"
    "Content-Type" = "application/json"
}
$uri = "https://manage.office.com/api/v1.0/$tenantId/activity/feed/subscriptions/start?contentType=$contentType"
$response = Invoke-RestMethod -Method Post -Uri $uri -Headers $headers
$response
```

⚙️ How the Azure Function Works

🔸 Trigger

The Azure Function is triggered by notifications from the Office 365 Management Activity API.
These notifications include audit events across your entire Azure tenant—not just Dynamics 365.

🔸 Filtering Logic

Each notification is evaluated against your business rules:

- Organization match
- Entity type (e.g., incident, account)
- Operation type (e.g., create, update)

These filters are defined in the EntityOperationsFilter environment variable:

```json
{ "incident": ["create", "update"], "account": ["create"] }
```

🔸 Processing

If the event matches your filters, the function fetches the full audit data and stores it in Cosmos DB. If not, the event is ignored.

🔍 Code Explanation: The Run Method

1. Webhook Validation

https://learn.microsoft.com/en-us/office/office-365-management-api/office-365-management-activity-api-reference#webhook-validation

```csharp
string validationToken = query["validationToken"];
if (!string.IsNullOrEmpty(validationToken))
{
    await response.WriteStringAsync(validationToken);
    response.StatusCode = HttpStatusCode.OK;
    return response;
}
```

2. Notification Handling

https://learn.microsoft.com/en-us/office/office-365-management-api/office-365-management-activity-api-reference#receiving-notifications

```csharp
var notifications = JsonConvert.DeserializeObject<dynamic[]>(requestBody);
foreach (var notification in notifications)
{
    if (notification.contentType == "Audit.General" && notification.contentUri != null)
    {
        // Process each notification
    }
}
```

3. Bearer Token Fetch

```csharp
string bearerToken = await GetBearerTokenAsync(log);
if (string.IsNullOrEmpty(bearerToken)) continue;
```

4. Fetch Audit Content

https://learn.microsoft.com/en-us/office/office-365-management-api/office-365-management-activity-api-reference#retrieve-content

```csharp
var requestMsg = new HttpRequestMessage(HttpMethod.Get, contentUri);
requestMsg.Headers.Authorization = new AuthenticationHeaderValue("Bearer", bearerToken);
var result = await httpClient.SendAsync(requestMsg);
if (!result.IsSuccessStatusCode) continue;
var auditContentJson = await result.Content.ReadAsStringAsync();
```

5. Deserialize and Filter Audit Records

https://learn.microsoft.com/en-us/office/office-365-management-api/office-365-management-activity-api-schema#dynamics-365-schema

```csharp
var auditRecords = JsonConvert.DeserializeObject<dynamic[]>(auditContentJson);
foreach (var eventData in auditRecords)
{
    string orgName = eventData.CrmOrganizationUniqueName ?? "";
    string workload = eventData.Workload ?? "";
    string entityName = eventData.EntityName ?? "";
    string operation = eventData.Message ?? "";

    if (workload != "Dynamics 365" && workload != "CRM" && workload != "Power Platform") continue;
    if (!entityOpsFilter.ContainsKey(entityName)) continue;
    if (!entityOpsFilter[entityName].Contains(operation)) continue;

    // Store in Cosmos DB
}
```

6. Store in Cosmos DB

```csharp
var cosmosDoc = new
{
    id = Guid.NewGuid().ToString(),
    tenantId = notification.tenantId,
    raw = eventData
};
var partitionKey = (string)notification.tenantId;
var resp = await cosmosContainer.CreateItemAsync(cosmosDoc, new PartitionKey(partitionKey));
```

7. Logging and Error Handling

https://learn.microsoft.com/en-us/office/office-365-management-api/office-365-management-activity-api-reference#errors

Wrap the Cosmos DB write in a try/catch so a single failed document doesn't stop the rest of the batch:

```csharp
try
{
    var resp = await cosmosContainer.CreateItemAsync(cosmosDoc, new PartitionKey(partitionKey));
    log.LogInformation($"Stored notification in Cosmos DB for contentUri: {notification.contentUri}, DocumentId: {cosmosDoc.id}");
}
catch (Exception dbEx)
{
    log.LogError($"Error storing notification in Cosmos DB: {dbEx.Message}");
}
```

🧠 Conclusion

This solution provides a robust, extensible pattern for exporting Dynamics 365 CE Dataverse org data to Cosmos DB using the Office 365 Management Activity API. Solution architects can use this as a reference for building or evaluating similar integrations, especially when working with third-party archiving or analytics solutions.

Transform business process with agentic business applications (Americas)
September 30 - October 1, 2025 | 7:00 AM – 10:00 AM AMER (PDT)

Overview

This bootcamp is designed to equip you with the AI skills and clarity needed to harness the power of Copilot Studio and AI agents in Dynamics 365. Participants will learn what AI agents are, why they matter in Dynamics 365, and how to design and build agents that address customer needs today while preparing for the AI-native ERP and CRM future.

Building from the fundamentals of Copilot Studio and its integration across Dynamics 365 applications, we'll explore how first-party agents are built, why Microsoft created them, and where their current limitations open opportunities for partner-led innovation. We'll then expand into third-party agent design and extensibility, showing how partners can create differentiated solutions that deliver unique value. Finally, we will provide a forward-looking perspective on Microsoft's strategy with Model Context Protocol (MCP), Agent-to-Agent (A2A) orchestration, and AI-native business applications - inspiring partners to create industry-specific agents that solve real customer pain points and showcase their expertise.

Join this virtual event to accelerate your technical readiness journey on AI agents in Dynamics 365. Register today and mark your calendars to gain valuable insights from our Microsoft SMEs. Don't miss this opportunity to learn about the latest developments and elevate your partnership with Microsoft.

Event prerequisites

Participants should have some familiarity and work experience with the associated solutions. Additionally, we suggest having knowledge of the relevant role-based certification content (although passing the exam is not mandatory). You can find free self-paced learning content and technical documentation related to the workshop topics at Microsoft Learn.

Earn a digital badge

Attendees who participate in the live sessions of this workshop will earn a digital badge.
These badges, which serve as a testament to your engagement and learning, can be conveniently accessed and shared through the Credly digital platform. Please note that accessing on-demand content does not meet the criteria for earning a badge.

REGISTER TODAY!

🚀 Scaling Dynamics 365 CRM Integrations in Azure: The Right Way to Use the SDK ServiceClient
This blog explores common pitfalls and presents a scalable pattern using the .Clone() method to ensure thread safety, avoid redundant authentication, and prevent SNAT port exhaustion.

🗺️ Connection Factory with Optimized Configuration

The first step to building a scalable integration is to configure your ServiceClient properly. Here's how to set up a connection factory that includes all the necessary performance optimizations:

```csharp
public static class CrmClientFactory
{
    private static readonly ServiceClient _baseClient;

    static CrmClientFactory()
    {
        ThreadPool.SetMinThreads(100, 100);                 // Faster thread ramp-up
        ServicePointManager.DefaultConnectionLimit = 65000; // Avoid connection bottlenecks
        ServicePointManager.Expect100Continue = false;      // Reduce HTTP latency
        ServicePointManager.UseNagleAlgorithm = false;      // Improve responsiveness

        _baseClient = new ServiceClient(connectionString);
        _baseClient.EnableAffinityCookie = false;           // Distribute load across Dataverse web servers
    }

    public static ServiceClient GetClient() => _baseClient.Clone();
}
```

❌ Anti-Pattern: One Static Client for All Operations

A common anti-pattern is to create a single static instance of ServiceClient and reuse it across all operations:

```csharp
public static class CrmClientFactory
{
    private static readonly ServiceClient _client = new ServiceClient(connectionString);

    public static ServiceClient GetClient() => _client;
}
```

This struggles under load due to thread contention, throttling, and unpredictable behavior.

⚠️ Misleading Fix: New Client Per Request

To avoid thread contention, some developers create a new ServiceClient per request. However, the code below does not truly create a separate connection unless the RequireNewInstance=True connection string parameter or the useUniqueInstance:true constructor parameter is used. These details are often missed, causing the same connection to be shared across threads with high lock times, compounding overall slowness.
```csharp
public async Task Run(HttpRequest req)
{
    var client = new ServiceClient(connectionString); // Full OAuth flow on every request
    // Use client here
}
```

Even with the above flags, there is a risk of auth failures and SNAT exhaustion in a high-throughput service integration scenario, because a full OAuth authentication occurs every time a ServiceClient instance is created through its constructor.

✅ Best Practice: Clone Once, Reuse Per Request

The best practice is to create a single authenticated ServiceClient and use its .Clone() method to generate lightweight, thread-safe copies for each request:

```csharp
public static class CrmClientFactory
{
    private static readonly ServiceClient _baseClient = new ServiceClient(connectionString);

    public static ServiceClient GetClient() => _baseClient.Clone();
}
```

Then, in your Azure Function or App Service operation:

❗ Avoid calling the factory again inside helper methods. Clone once and pass the client down the call stack.

```csharp
public async Task HandleRequest()
{
    var client = CrmClientFactory.GetClient(); // Clone once per request
    await DoSomething1(client);
    await DoSomething2(client);
}

public async Task DoSomething1(ServiceClient client)
{
    var record = new Entity("account");
    await client.CreateAsync(record); // No extra clone: use the passed-down client as is
}
```

🧵 Parallel Processing with Batching

When working with large datasets, combining parallelism with batching using ExecuteMultiple can significantly improve throughput—if done correctly.

🔄 Common Mistake: Dynamic Batching Inside Parallel Loops

Many implementations dynamically batch records inside Parallel.ForEach, assuming consistent batch sizes.
But in practice, this leads to:

- Inconsistent batch sizes (1 to 100+)
- Unpredictable performance
- Difficult-to-analyze telemetry

✅ Fix: Chunk Before You Batch

```csharp
public static List<List<Entity>> ChunkRecords(List<Entity> records, int chunkSize)
{
    return records
        .Select((record, index) => new { record, index })
        .GroupBy(x => x.index / chunkSize)
        .Select(g => g.Select(x => x.record).ToList())
        .ToList();
}

public static void ProcessBatches(List<Entity> records, ServiceClient serviceClient, int batchSize = 100, int maxParallelism = 5)
{
    var batches = ChunkRecords(records, batchSize);

    Parallel.ForEach(batches, new ParallelOptions { MaxDegreeOfParallelism = maxParallelism }, batch =>
    {
        using var service = serviceClient.Clone(); // Clone once per thread

        var executeMultiple = new ExecuteMultipleRequest
        {
            Requests = new OrganizationRequestCollection(),
            Settings = new ExecuteMultipleSettings
            {
                ContinueOnError = true,
                ReturnResponses = false
            }
        };

        foreach (var record in batch)
        {
            executeMultiple.Requests.Add(new CreateRequest { Target = record });
        }

        service.Execute(executeMultiple);
    });
}
```

🚫 Avoiding Throttling: Plan, Don't Just Retry

While it's possible to implement retry logic for HTTP 429 responses using the Retry-After header, the best approach is to avoid throttling altogether.

✅ Best Practices

- Control DOP and batch size: Keep them conservative and telemetry-driven.
- Use alternate app registrations: Distribute load across identities, but do not overload the Dataverse org.
- Avoid triggering sync plugins or real-time workflows: These amplify load.
- Address long-running queries: Optimize slow operations (with Microsoft support's help if needed) before scaling.
- Relax time constraints: There's no need to finish a job in 1 hour if it can be done safely in 3.
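When a throttled call does slip through despite these precautions, the SDK surfaces service protection limits as an OrganizationServiceFault whose error details carry the Retry-After duration. The following is a minimal sketch of honoring that value; the ExecuteWithRetry wrapper and maxRetries parameter are illustrative names, not SDK members:

```csharp
using System;
using System.ServiceModel;
using System.Threading;
using Microsoft.PowerPlatform.Dataverse.Client;
using Microsoft.Xrm.Sdk;

public static class ThrottleAwareExecutor
{
    // Sketch only: retry a request after waiting out the server-supplied Retry-After.
    public static OrganizationResponse ExecuteWithRetry(ServiceClient service, OrganizationRequest request, int maxRetries = 3)
    {
        for (int attempt = 0; ; attempt++)
        {
            try
            {
                return service.Execute(request);
            }
            catch (FaultException<OrganizationServiceFault> ex) when (attempt < maxRetries)
            {
                // Service protection faults include a "Retry-After" TimeSpan in ErrorDetails.
                if (ex.Detail.ErrorDetails.ContainsKey("Retry-After"))
                {
                    Thread.Sleep((TimeSpan)ex.Detail.ErrorDetails["Retry-After"]);
                }
                else
                {
                    throw; // Not a throttling fault, so don't blindly retry
                }
            }
        }
    }
}
```

Treat this as a safety net for the occasional 429, not a license to push past the limits: if retries fire regularly, revisit DOP and batch size first.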
🌐 When to Consider Horizontal Scaling

Even with all the right optimizations, your integration may still hit limits in the HTTP stack—such as:

- WCF binding timeouts
- SNAT port exhaustion
- Slowness not explained by Dataverse telemetry

In these cases, horizontal scaling becomes essential.

- App Services: Easily scale out using autoscale rules.
- Function Apps (service model): Scale well with HTTP or Service Bus triggers.
- Scheduled Functions: Require deduplication logic to avoid duplicate processing.
- On-premises VMs: When D365 SDK-based integrations are hosted on VM infrastructure, scale horizontally by adding servers.

🧠 Final Thoughts

Scaling CRM integrations in Azure is about resilience, observability, and control. Follow these patterns:

- Clone once per thread
- Pre-chunk batches
- Tune with telemetry evidence
- Avoid overload when you can
- Scale horizontally when needed—but wisely

Build integrations that are fast, reliable, and future-proof.
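To make the scheduled-function deduplication point concrete, one lightweight approach is to claim records before processing them, so an overlapping run skips anything already taken. A rough sketch, assuming a hypothetical new_integrationjob table with a new_status choice column (not a real schema; 1 = Pending and 2 = InProgress are made-up values):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.PowerPlatform.Dataverse.Client;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Query;

public static class JobClaimer
{
    // Sketch: claim-before-process deduplication for overlapping scheduled runs.
    public static List<Entity> ClaimPendingRecords(ServiceClient service, int top = 100)
    {
        var query = new QueryExpression("new_integrationjob")
        {
            TopCount = top,
            ColumnSet = new ColumnSet("new_payload")
        };
        query.Criteria.AddCondition("new_status", ConditionOperator.Equal, 1); // 1 = Pending (assumed)

        var pending = service.RetrieveMultiple(query).Entities.ToList();

        foreach (var job in pending)
        {
            // Mark as in-progress immediately so a parallel run skips it
            var claim = new Entity(job.LogicalName, job.Id) { ["new_status"] = new OptionSetValue(2) };
            service.Update(claim);
        }

        return pending;
    }
}
```

A small race window remains between the retrieve and the update; where stronger guarantees matter, prefer optimistic concurrency on the update or move to a queue-triggered design so each message is delivered to one worker.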