SAP on Azure

The Journey of Copilot: From Setup to Mastery for Azure SAP Customers
Introduction: GitHub Copilot integrates as an extension or plugin within developer tools commonly used in SAP and Azure scenarios, such as Visual Studio, Visual Studio Code, and other supported IDEs. These tools are often used alongside SAP development (e.g., ABAP, CAP, or integrations with S/4HANA and Azure services). Before you begin, ensure you have access to Copilot through an organizational license (common in enterprise environments).

Install GitHub Copilot

Step 1: Install Required Extensions
- Open Visual Studio Code
- Go to Extensions (Ctrl + Shift + X)
- Install the following extensions:
  - GitHub Copilot
  - GitHub Copilot Chat
  - GitHub Copilot for Azure (Microsoft extension)
When installing the Azure extension, it may prompt you to install additional Azure tools; accept all required components.

Step 2: Sign in and Authenticate
- Sign in to your GitHub account
- Sign in to your Azure account
- Complete authentication in the browser
- Return to VS Code
Both logins are required: GitHub enables Copilot, and Azure enables Azure resource access and tools.

Step 3: Enable and Verify Setup
- Open Copilot Chat (Ctrl + Alt + I)
- Check that Copilot is active
- Verify Azure integration by typing a test prompt: "What Azure resources are deployed and running in my subscription?"
If you get a response, the setup is successful.

Step 4: Configure Azure Context (Important for SAP)
- Set your Azure tenant / subscription (Entra ID)
- Ensure the correct environment for:
  - SAP on Azure (S/4HANA, SAP NetWeaver)
  - SAP BTP extensions
- Optional: Enable Agent Mode for automation tasks (deployments, scripts)

Get Started in Your SAP Development Environment
- Open your preferred IDE (Visual Studio, VS Code, or Eclipse with SAP tooling)
- Access the Copilot chat or assistant panel within the IDE
- Sign in with your GitHub account (and organizational account if required)
- Start using Copilot in your SAP development scenarios

Use Copilot for SAP Workloads
- Inline suggestions: get real-time code suggestions for SAP-related languages (e.g., JavaScript, Java, ABAP extensions, CAP models)
- Ask questions in chat: understand existing logic, SAP APIs, or integration patterns (e.g., "Explain this service" or "How does this SAP function work?")
- Generate and improve code: create boilerplate logic, unit tests, and integration code faster; identify performance or design improvements in existing SAP code

Enhance with SAP Context
- Provide additional context (files, APIs, or SAP objects) to improve suggestions
- Optionally connect Copilot to SAP data or services using enterprise integrations
- Use Copilot to support SAP BTP extensions, S/4HANA integrations, and Fiori/UI5 development and APIs

Once you start interacting with Copilot, it acts as an AI assistant within your SAP development workflow, helping you write code faster, understand existing logic, and accelerate innovation across your SAP and Azure landscape.

The Hidden Layer: Network Configuration for SAP Customers

As you begin using GitHub Copilot within your SAP development and integration environment, you may notice performance differences, especially when working within corporate networks. In most cases, Copilot connects securely to GitHub services over the internet using HTTPS, without requiring additional setup.
However, in SAP enterprise environments where strict governance, security policies, and compliance controls are in place, network traffic is often routed through proxies, firewalls, or VPNs.

What You Need to Know
- Copilot may require additional configuration when operating behind corporate proxies or firewalls
- Proxy settings can be configured directly within your IDE, or through environment variables such as HTTP_PROXY and HTTPS_PROXY
- Authentication to enterprise proxies may require basic credentials, or enterprise mechanisms such as Kerberos-based authentication

Enterprise Considerations for SAP Landscapes
- Organizations may require custom SSL certificates for secure outbound connections
- Network security policies may restrict access to external services
- Required Copilot and GitHub endpoints must be allowed to ensure connectivity

Why This Matters for SAP Customers
In SAP environments, especially those involving S/4HANA, SAP BTP, or hybrid/on-premise systems, network security is tightly controlled. Proper configuration ensures that Copilot can securely interact with external services while still complying with enterprise security standards. Once configured correctly, Copilot integrates seamlessly into your SAP development workflow, enabling secure, reliable, and high-performance AI-assisted development within your governed enterprise environment.

Configure Network Settings (if required) for Azure SAP Environments
In Azure-hosted SAP landscapes (such as S/4HANA on Azure, SAP BTP, or hybrid environments), network configuration plays a critical role in enabling GitHub Copilot securely. Network setup is primarily required in enterprise environments where security controls such as proxies, firewalls, VPNs, or Azure networking policies are enforced.
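As a quick sanity check of the proxy settings discussed above, the following sketch builds an HTTPS opener that honors either an explicit proxy URL or the HTTP_PROXY/HTTPS_PROXY environment variables, using only the Python standard library. The proxy address shown is a hypothetical placeholder; substitute your organization's values.

```python
# Minimal sketch (assumptions: proxy host/port are placeholders).
import urllib.request

# Connectivity test endpoint documented for Copilot diagnostics.
COPILOT_PING = "https://copilot-proxy.githubusercontent.com/_ping"


def build_opener(proxy_url=None):
    """Build a URL opener that uses an explicit proxy when given,
    otherwise the HTTP_PROXY / HTTPS_PROXY environment variables."""
    if proxy_url:
        handler = urllib.request.ProxyHandler(
            {"http": proxy_url, "https": proxy_url}
        )
    else:
        # getproxies() reads proxy settings from the environment.
        handler = urllib.request.ProxyHandler(urllib.request.getproxies())
    return urllib.request.build_opener(handler)


def check_connectivity(opener, url=COPILOT_PING):
    """Return True when the ping endpoint answers with HTTP 200."""
    with opener.open(url, timeout=10) as resp:
        return resp.status == 200


# Example: build an opener against a hypothetical corporate proxy.
opener = build_opener("http://proxy.contoso.example:8080")
# check_connectivity(opener) would perform the actual HTTPS request;
# run it from the Azure VM or developer machine where Copilot is used.
```

This mirrors the curl-based diagnostics the article recommends, but lets you script the check across a fleet of development VMs.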
Default Behavior
- GitHub Copilot connects securely over HTTPS
- No additional configuration is required in open network environments

Proxy & Enterprise Network Configuration
If your Azure SAP environment uses controlled outbound access:
- Configure proxy settings directly within your IDE (Visual Studio, VS Code), or via the HTTP_PROXY and HTTPS_PROXY environment variables
- Supported authentication methods: basic authentication and Kerberos (common in enterprise identity setups)
- Ensure required GitHub/Copilot endpoints are allowed in Azure Firewall or network security groups
- Install custom SSL certificates if your organization uses SSL inspection
Note: Visual Studio typically inherits Windows/Azure VM proxy settings.

Troubleshooting Network Issues in Azure SAP Scenarios
If Copilot stops responding or behaves inconsistently, the issue is often related to enterprise network controls in Azure or hybrid SAP architectures.

Common Causes
- Proxy or firewall blocking outbound connectivity
- SSL certificate validation failures
- VPN or private network restrictions (ExpressRoute / private endpoints)

Quick Diagnostics
Test connectivity from your Azure VM or development machine:
  curl --verbose https://copilot-proxy.githubusercontent.com/_ping
If using a proxy:
  curl --verbose -x http://PROXY:PORT -i -L https://copilot-proxy.githubusercontent.com/_ping
An HTTP 200 response means connectivity is working; errors indicate network blocking or a configuration issue.

Recommended Troubleshooting Steps
- Verify proxy settings are correctly configured
- Check SSL certificates and the trust chain
- Review Azure Firewall, NSG, or proxy rules
- Validate that required endpoints are reachable
- Enable verbose logs or diagnostics in your IDE for deeper analysis

Best Practice for Azure SAP Customers
Adopt a structured troubleshooting approach: validate connectivity, trace the network path (proxy, firewall, DNS), and fix configuration issues systematically. This aligns with the governance and operational discipline already used in SAP and
Azure environments.

Outcome: A Confident Copilot User in Azure SAP
By following this approach, you move beyond basic usage and gain full control of Copilot within your enterprise landscape. You will be able to:
- Deploy and use Copilot across Azure SAP environments
- Integrate it securely within enterprise networking constraints
- Troubleshoot issues with confidence using systematic diagnostics

Conclusion: GitHub Copilot is no longer a black box; it becomes a trusted, secure, and intelligent AI assistant seamlessly integrated into your Azure and SAP development ecosystem. As you adopt it into your workflow, development becomes faster, cleaner, and more efficient. More importantly, you gain a reliable partner that enhances productivity and supports innovation, ensuring that you are no longer coding alone, but collaborating with AI to deliver better outcomes.

Reference links:
- https://docs.github.com/en/copilot/how-tos/set-up/install-copilot-extension
- Get Started with GitHub Copilot - Visual Studio (Windows) | Microsoft Learn
- https://learn.microsoft.com/en-us/azure/developer/github-copilot-azure/get-started?pivots=visual-studio-code
- Network settings for GitHub Copilot - GitHub Docs
- Troubleshooting network errors for GitHub Copilot - GitHub Docs
- GitHub Copilot for SAP ABAP in VS Code: Setup Guid... - SAP Community

Enabling Agentic Data Governance with Hybrid Cloud Flexibility in Azure
The "Why"
Do you manage data in a complex multi-cloud environment? Are you struggling with data silos, evolving regulations, and the pressure to maintain control and compliance across on-prem and multiple clouds? Do you ever wish an intelligent assistant could help shoulder the load of data governance? If so, I can relate. Let me tell you a story that might sound familiar.

Meet Mark. He is a data governance officer at Contoso (a fictional but very representative enterprise). Mark's day job is ensuring data governance and compliance across his company's vast hybrid cloud estate: think around 2 million data assets sprawled across 12+ datacenters on-premises and in different public clouds. Regulatory requirements are constantly shifting. Customer data is increasingly sensitive. Each department and region has its own way of doing things. Mark is fighting an uphill battle with data silos and disconnected cloud operations. He bounces between a patchwork of tools (spreadsheets, cloud consoles, governance portals) trying to answer basic questions: Where is our data? Who's using it? Are we in compliance? Armed with an old desk calculator and a pile of paper-based reports (a perfect 1990s backdrop), he is drowning in data that has exploded in volume and complexity.

What if Mark had a single pane of glass: one that both reflects and acts? It reflects your governance state and enforces compliance, a self-hydrating pane of glass accompanied by a conversational AI.

And he's not alone. We're all living in a data overload era. Every day, organizations generate and ingest more information than ever before. Transistors and mainframes gave way to the internet boom of the '90s, then an explosion of mobile devices in the 2000s, social media in the 2010s, and now widespread cloud computing, all funneling data into our systems at an exponential rate.
On top of that, a new wave of AI and conversational interfaces has arrived here in the mid-2020s, making data more accessible but also raising expectations for real-time insight. It's no wonder modern IT leaders feel overwhelmed. But these challenges are also opportunities. The way I see it, the incredible growth of data and cloud capabilities means we have a chance to reimagine data governance. The fact that I'm writing about this right now is no coincidence. My customers are looking to resolve problems in this space. In my conversations with them, I hear the same needs: we want better governance, more visibility, streamlined oversight... and, cherry on top, we want it in an "agentic" fashion. In other words, they want to delegate the grunt work to the platform toolset, augmented by AI, so they can focus on higher-value tasks.

The "What"
That vision, agentic data governance with hybrid cloud flexibility, became the driver for this work. This is a modular solution: you have building-block-style components (cloud services, governance tools, AI agents) that you can snap together into an intended solution. Think of it as a jumpstart kit for continuous data governance across multiple clouds, with autonomous ("agentic") assistance baked in that you can leverage and build upon. It's not the final, productized solution; it's more a vision of what's possible.
Contoso's Requirements
These are the high-level requirements from Contoso:
- Data governance across clouds under one roof
- A single-pane-of-glass dashboard consolidating reporting on five governance domains:
  - Visibility into data residency and lineage
  - PII (Personally Identifiable Information) must run on Confidential Compute (CC)
  - Security software (Defender) compliance
  - Resource tagging compliance (foundational for a good governance posture)
  - OS updates compliance
- Ability to enforce compliance in an agentic manner with a human in the loop
- Agentic enforcement of compliance pertaining to residency and confidential compute

Solution – The Breakdown
The solution comprises 8 modules addressing these requirements:
1. Foundational (landing zones, data sources, operational setup, policies, etc.)
2. Dashboard Hydration + Agentic Reporting – Residency Compliance
3. Dashboard Hydration + Agentic Reporting – Confidential Compute for PII Compliance
4. Dashboard Hydration + Agentic Reporting – MS Defender Compliance
5. Dashboard Hydration + Agentic Reporting – Resource Tag Compliance
6. Dashboard Hydration + Agentic Reporting – OS Updates/Patch Compliance
7. Enforce Compliance via Copilot Agent – Residency Compliance
8. Enforce Compliance via Copilot Agent – CC PII Compliance

Solution – The Architecture View
These are the main technical components that make up the solution architecture:
- Data sources of all shapes and sizes on the left, governed by the native Azure or the Arc plane
- Additional Azure services across the bottom layer for the foundational governance posture
- Microsoft Purview, in the top middle, as the unified data governance platform
- Microsoft Fabric, in the bottom middle, as the end-to-end ingestion and analytics platform
- Microsoft Power Platform, on the right, as the low-code/no-code business flow and the copilot agent experience

Solution – The End User View
So how does Mark see this solution as a data governance officer?
He doesn't see all the intricacies of the solution integration and the logic execution. He sees two things:
- A Power BI dashboard running on Microsoft Fabric, with:
  - A compliance dashboard showing an overall score in each of the five compliance domains, alongside scores for each of the data products across these domains
  - Additional reporting views for more granular reporting
  - A Fabric-based pipeline that hydrates the underlying semantic models from various sources to keep the reports fresh and current
- A Copilot agent (in Teams) for both:
  - Reporting on all compliance domains
  - Enforcing in-scope compliance across selected domains
The agent takes care of the rest: it queries Fabric's semantic model, calls Azure Function endpoints, updates Purview glossary terms, applies Azure tags, and sends Teams notifications.

The "How" – Residency Compliance
Let's pick a few modules to walk through how they work together to give Mark a cohesive agentic governance experience. It's Monday morning, and Mark logs into the Contoso governance portal with a cup of coffee in hand. Instead of a dozen browser tabs, he has two main tools open: the Data Governance Dashboard and the Contoso Governance Copilot agent. To address some inquiries assigned to him, he interacts with the agent. During this interaction, he not only validates whether any residency information is missing in the unified data governance platform (Purview), but also addresses a mismatch between Purview and an Azure resource, based on the designed principles. Here is a snippet of the chat:

Under the hood, several components work on behalf of the agent to perform this governance check and apply the necessary course of action. Even before Mark's conversation with the agent, an ongoing hydration process keeps the Fabric Power BI dashboard up to date.
Dashboard Hydration + Agentic Reporting – Residency Compliance
1. A Fabric notebook runs the residency scorecard code block through a pipeline. It reads two Lakehouse tables containing the latest residency information from Purview and the approved region list.
2. The notebook obtains a Microsoft Entra bearer token.
3. Once acquired, the notebook calls an Azure Function endpoint.
4. This endpoint searches for the Azure resources associated with the data products in Purview, using an Azure resource tag.
5. The notebook compares the declared Purview residency with the approved region list and the associated resource's region.
6. The notebook calculates the final 0 / 25 / 50 / 75 / 100 residency compliance score and a reason. For example: a data product without an associated Azure resource gets a 0, while a data product whose Purview residency is a Contoso-approved region that also matches the associated Azure resource's region gets a 100.
7. It writes the results to the relevant residency compliance Lakehouse tables.
8. The dedicated compliance table feeds the semantic model for reporting.
9. The compliance Power BI dashboard is hydrated.

Enforce Compliance via Copilot Agent – Residency Compliance
With the dashboard data regularly updated, the agent follows this logic, using the updated reporting data and the actions at its disposal, during the earlier conversation with Mark:
1. Mark initiates the conversation with the agent.
2. The agent calls a Power Automate flow.
3. This flow retrieves Purview's residency information stored in the Fabric semantic model.
4. (Diagram steps 5-8) When Mark asks to investigate a data product further, the agent carries the conversation using a topic, which leverages a flow, which uses a Power Automate custom connector to access an Azure Function endpoint.
5. This endpoint retrieves the latest glossary (residency) information about the data product in question from Purview, and provides a preview back to the user.
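The 0 / 25 / 50 / 75 / 100 scoring rule above can be sketched as a small pure function. The 0 and 100 cases come from the article's own example; the intermediate tiers and the field names are illustrative assumptions, not the actual notebook code.

```python
# Illustrative sketch of the residency scoring rule.
# Assumption: the 25/50/75 tiers shown here are hypothetical refinements;
# the article only specifies the 0 and 100 endpoints explicitly.
def residency_score(purview_region, resource_region, approved_regions):
    """Return (score, reason) for one data product.

    purview_region  - residency declared in the Purview glossary (or None)
    resource_region - region of the tagged Azure resource (or None)
    approved_regions - Contoso's approved region list
    """
    if resource_region is None:
        return 0, "No associated Azure resource found via tag lookup"
    if purview_region is None:
        return 25, "Resource exists but Purview declares no residency"
    if purview_region not in approved_regions:
        return 50, f"Declared residency '{purview_region}' is not approved"
    if purview_region != resource_region:
        return 75, "Approved residency declared, but resource region differs"
    return 100, "Declared residency approved and matches resource region"


approved = {"westeurope", "northeurope"}
print(residency_score("westeurope", "westeurope", approved))  # score 100
print(residency_score(None, None, approved))                  # score 0
```

In the real solution this logic runs inside the Fabric notebook and writes its results to the residency compliance Lakehouse tables that feed the semantic model.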
If the update criteria are met, and there is no conflict, and with Mark's approval, the topic then calls another flow to access the Function's Purview Update endpoint and makes the glossary (residency) update in Purview for that data product (diagram steps 10-13).

The "How" – Confidential Compute for PII Compliance

Dashboard Hydration + Agentic Reporting – Confidential Compute for PII Compliance
The following snippet shows how Mark addresses the compliance risk with a critical data product (application), S/4HANA, and performs the necessary compliance actions, such as tagging the associated resources and notifying the data product owners via a Teams channel. The following diagram shows the under-the-hood hydration flow for confidential compute compliance.

Enforce Compliance via Copilot Agent – CC PII Compliance
Finally, the diagram below shows how Mark's conversation flows through the main solution components.

Outcome
Stepping back, what did we accomplish for Mark and Contoso? We turned an onslaught of governance challenges into an opportunity to modernize how data is managed. This gave Mark:
- Centralized visibility into data assets across the landscape, through Purview and a unified dashboard
- Proactive compliance, enabled with automated checks and controlled with Purview exports and Fabric pipeline schedules
- Compliance enforcement using an agent
- Hybrid cloud consistency, by using Azure Arc and a foundational data plane management setup
- Reduced operational overhead, with agentic reporting and compliance

Though the solution comprises a wide variety of components and services, it is built from standard building blocks and is relatively simple to implement. In total, the solution combined around a dozen Azure services and over 40 distinct components (from Purview catalogs to data pipelines, to custom functions and flows). You can choose to implement some or all of the compliance domains. Or, better yet, build upon them to create new domains and pave new paths.
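The confidential-compute-for-PII rule described in this module can also be sketched as simple decision logic. Treating Azure DC- and EC-series VM sizes as confidential compute, and the exact action strings, are simplifying assumptions for illustration; the real policy and remediation live in the Purview/Fabric/Power Platform flows.

```python
# Sketch of the CC-for-PII compliance check (assumptions noted above).
# DC/EC prefixes stand in for a real confidential-compute SKU policy.
CONFIDENTIAL_PREFIXES = ("Standard_DC", "Standard_EC")


def cc_pii_compliant(contains_pii, vm_size):
    """PII workloads must run on a confidential compute SKU;
    non-PII data products always pass this check."""
    if not contains_pii:
        return True
    return vm_size.startswith(CONFIDENTIAL_PREFIXES)


def remediation_actions(product, contains_pii, vm_size):
    """Actions the agent would propose (with a human in the loop)
    when a violation is found."""
    if cc_pii_compliant(contains_pii, vm_size):
        return []
    return [
        f"Tag resources of '{product}' as non-compliant (CC-PII)",
        f"Notify data product owners of '{product}' via Teams channel",
    ]


# A PII-bearing S/4HANA product on a general-purpose VM triggers both actions.
print(remediation_actions("S/4HANA", True, "Standard_E8s_v5"))
```

This mirrors the walkthrough: the agent detects the violation, tags the associated resources, and notifies the owners, all gated by Mark's approval.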
Wrap-up
I believe many enterprises could take a similar journey. If you're facing these issues, consider this an invitation to think differently about data governance. Start with the pieces you already have (your own building blocks of cloud services and data) and imagine what you could build. Chances are that a lot of the heavy lifting can be orchestrated with today's technology. And with the rise of AI copilots, the dream of agentic data governance, where your policies are continuously enforced by smart agents, is no longer science fiction. It's here, right now, waiting for you to take it for a spin.

Next steps
- Watch the video narrative on the SAP on Azure YouTube channel
- Build it with the GitHub repository: https://github.com/moazmirza/data-sov-and-hyb-cloud
- Comments/questions: here, or on LinkedIn /moazmirza

Solution Selfies
- Azure Policy Compliance – Foundational Governance Posture
- Purview Data Product Catalog and Data Lineage
- Purview Governance Metadata → Fabric Lakehouse
- Fabric Semantic Model
- Additional Fabric Power BI Dashboard
- Copilot Studio Topic Flow
- Azure Function Endpoints

Chaos Engineering vs. STAF for SAP: Resilience Validation vs. Functional Assurance
Introduction: As SAP environments transition to cloud platforms such as Azure, one strategic question consistently surfaces: "STAF proves SAP works; Chaos Engineering proves it survives. Why do we need both?" The short answer: STAF and Chaos Engineering serve different purposes, and treating them as interchangeable can expose SAP production environments to unseen risk.

A Quick Comparison for Mission-Critical SAP Engagements
In the world of SAP on Azure, reliability and resilience are non-negotiable. Two powerful approaches, Chaos Engineering for SAP and the SAP Testing Automation Framework (STAF), help ensure mission-critical workloads remain robust. But what sets them apart, and how do they complement each other?

Why This Matters
SAP workloads often underpin core business processes. Downtime or misconfiguration can lead to significant operational and financial impact. While both Chaos Engineering and STAF aim to improve system reliability, they do so in very different ways.

Chaos Engineering for SAP
Chaos Engineering is about proactively testing resilience by introducing controlled failures into your environment. Using tools like Azure Chaos Studio, engineers simulate real-world disruptions such as VM shutdowns, DNS failures, or network latency to validate how SAP systems recover under stress.
Key Benefits:
- Identifies hidden weaknesses in architecture
- Improves operational resilience through real-world failure scenarios
- Enables game days and BCDR drills for mission-critical workloads

SAP Testing Automation Framework (STAF)
STAF focuses on automating high availability (HA) and configuration compliance testing for SAP clusters on Azure. It uses Ansible playbooks and Python modules to execute controlled failover scenarios, such as node crashes or process termination, and generates auditable reports.
Key Benefits:
- Speeds up deployment readiness
- Reduces manual testing effort
- Validates HA configurations against best practices
Side-by-Side Comparison

| Aspect | Chaos Engineering for SAP | SAP Testing Automation Framework (STAF) |
| --- | --- | --- |
| Primary goal | Validate resiliency under unpredictable conditions | Automate HA and configuration compliance testing |
| Scope | Infrastructure-level stress and failure injection | SAP cluster failover and HA validation |
| Approach | Simulate real-world outages (VM shutdown, DNS failure) | Controlled failover scenarios (node crash, process kill) |
| Tools used | Azure Chaos Studio | Ansible playbooks + Python modules |
| Output | Observability insights, recovery behavior reports | Auditable HTML compliance reports |
| Use case | BCDR drills, game days, proactive risk identification | Pre-go-live readiness, periodic HA audits |
| Complementarity | Tests resilience beyond planned scenarios | Ensures HA configuration meets best practices |

When to Use Each
- STAF → before go-live, or during periodic audits to validate the HA setup
- Chaos Engineering → for resilience testing under unexpected failures and operational stress

Key Takeaway
These approaches are complementary, not competing. Use STAF for structured HA validation and compliance. Use Chaos Engineering for real-world resilience testing and operational confidence.

Next Steps
- Explore Azure Chaos Studio for chaos experiments
- Download STAF from GitHub and integrate it into your SAP deployment pipeline
- Combine both for a comprehensive resiliency strategy

Conclusion: STAF and Chaos Engineering are not alternatives but complements. The former verifies that the SAP system and its business processes function correctly, while the latter subjects the system to real-world failures to confirm it can cope with them in the Azure cloud environment. STAF alone gives us confidence that the SAP system works as expected; adding Chaos Engineering gives us confidence that the system will keep working even when things go wrong.
Ref links:
SAP Testing Automation Framework (STAF):
- About SAP Testing Automation Framework | Microsoft Learn
- SAP Testing Automation Framework High Availability Testing | Microsoft Learn
- anukarnam/SAPTesting-Automation-Framework- (GitHub): a set of tools and solutions developed to simplify and automate the testing of SAP systems and associated third-party applications, helping overcome the challenges of manual testing with strong automation solutions
Chaos Engineering – Resilience & Failure Readiness:
- What is Azure Chaos Studio? - Azure Chaos Studio | Microsoft Learn
- Understand chaos engineering and resilience with Chaos Studio - Azure Chaos Studio | Microsoft Learn
- Using Azure Chaos Studio to Fortify SAP Systems Testing and Resiliency | Microsoft Community Hub

Azure SRE Agent Architecture and Creation: Practical Benefits for SAP on Azure Customers
Introduction: Azure SRE Agent is an AI-powered service designed to support site reliability engineering practices through automation and intelligent decision-making. It reduces operational toil, improves uptime, and delivers consistent results by integrating seamlessly with Azure services and external systems to perform operational tasks with limited manual intervention. By automating routine and repetitive tasks, the agent allows teams to concentrate on high-impact initiatives. Operational work frequently involves managing diverse Azure resources in combination with on-premises environments, often requiring orchestration across multiple tools; SRE Agent delivers an AI-driven platform that unifies these systems and automates operational workflows from start to finish.

How Azure SRE Agent Architecture and Creation Benefit SAP on Azure Customers: The SRE Agent architecture is particularly well suited for SAP workloads, which are inherently mission-critical and span multiple Azure services, including compute, storage, networking, databases, and monitoring. By creating an Azure SRE Agent and associating it with SAP-related resource groups, customers gain a unified operational control plane that continuously analyzes telemetry from Azure Monitor, logs, and metrics to identify issues impacting SAP availability, performance, and stability. Through automated diagnostics, root cause analysis, and guided or approval-based remediation, Azure SRE Agent significantly reduces manual troubleshooting during SAP incidents. In addition, its support for scheduled health checks, configuration validation, and compliance audits aligns closely with SAP best practices and change-controlled environments, enabling customers to transition from reactive operations to a proactive, automated, and scalable model that improves uptime and operational confidence at scale.
Centralized Azure Service Management Capabilities: This diagram illustrates Azure SRE Agent as the centralized automation and intelligence layer that manages Azure resources through Azure CLI and REST APIs, providing a unified control plane for operational tasks across the platform. From this single point, the agent connects to five core service domains:
- Compute: Virtual Machines, App Service, Container Apps, AKS, Functions, and more
- Storage: Blob Storage, file shares, managed disks, and storage accounts
- Networking: VNets, load balancers, application gateways, and network security groups
- Databases: Azure SQL, Cosmos DB, PostgreSQL, MySQL, and Redis
- Monitoring & Management: Azure Monitor, Log Analytics, Application Insights, and Azure Resource Manager
Together, the layout shows how Azure SRE Agent enables consistent, automated, and scalable operations across diverse Azure services from a single, AI-driven management layer.

Creating an SRE Agent in the Azure Portal:
1. Access the Azure portal. From Home → Create a resource, search for "sre agent" in the Azure Marketplace. The results highlight Azure SRE Agent (Preview) as an official Microsoft Azure service, confirming that it is provisioned like any other native Azure resource rather than an external tool or add-on.
2. Select the Azure subscription in which the agent will be deployed and confirm the available Azure SRE Agent (Preview) plan.
3. In the Basics step, select the subscription and resource group where the Azure SRE Agent will be created. Then provide agent-specific details, including the agent's name and the Azure region in which it will be deployed and operated. This configuration ensures that the SRE Agent is established as a first-class Azure resource: governed, scoped, and managed using the same Azure constructs as any other native service.
4. Define the level of access the agent will have over the Azure resource groups it manages, ensuring alignment with your organization's security and governance requirements. Two permission levels are available:
- Reader: The agent has read-only access to the assigned resource groups. It can observe resource state, analyze telemetry, and generate insights, but any remediation actions require temporary elevation using the user's permissions after explicit approval.
- Privileged: The agent is granted permission to execute approved actions directly on detected resources and resource types within its assigned resource groups. This enables faster, more automated remediation while still operating within Azure RBAC controls and approval workflows.
5. The deployment screen confirms that the Azure SRE Agent (Preview) has been successfully deployed. The banner "Your deployment is complete" indicates that all required resources were provisioned without errors and that the agent is now active in the selected subscription and resource group.
6. After deployment, the Azure portal search surfaces the new agent. Typing its name in the top search bar returns both services and resources associated with the subscription; under the Resources section, the newly created Azure SRE Agent instance (for example, mysreapp) appears, confirming that the agent is registered.
7. The Azure SRE Agent chat interface for the deployed agent (mysreapp) is the primary interaction surface, where users engage with the agent in natural language to monitor, diagnose, and remediate issues across the Azure resources associated with it. On the left navigation pane, users can manage chat threads, review activities, access the agent builder, monitor health and insights, and configure settings.
The main panel displays a new chat thread with a prompt inviting the user to ask a question or execute a command. The quick-action buttons (such as App Services, Container Apps, and AKS) provide guided entry points to common operational scenarios, helping users get started quickly without needing to remember specific commands. Once the chat window opens, Azure SAP customers can begin interacting with the Azure SRE Agent using natural language to monitor and manage their SAP landscapes on Azure.

To get started, try questions such as:
- What can you help me with for my SAP systems?
- Which SAP subscriptions, resource groups, or SAP-related resources are you managing?
- What alerts should I configure for my SAP workload (for example, SAP HANA, ASCS, or application servers)?
- Show me a comparison of successful requests versus errors for SAP-dependent applications across all subscriptions.

If you are troubleshooting a specific SAP issue, you can ask more targeted questions, for example:
- Why is my SAP system or SAP HANA database slow?
- Why is my SAP application or central services instance not responding?
- Can you investigate issues with my SAP workload?
- Can you retrieve key metrics (such as CPU, memory, disk I/O, or HANA latency) for my SAP resources?

Conclusion: Azure SRE Agent empowers SAP customers with a centralized, AI-driven operations layer built for managing complex SAP landscapes on Azure. By integrating natively with Azure and using standard management interfaces, the agent delivers continuous, end-to-end visibility across the compute, storage, networking, database, and monitoring layers that underpin SAP workloads. This unified operational view enables teams to detect and understand issues affecting SAP availability, performance, and stability faster and with greater confidence.
By combining automated diagnostics, intelligent root-cause analysis, and guided or approval-based remediation, Azure SRE Agent dramatically reduces manual effort and accelerates incident resolution. Built-in support for proactive health checks, configuration validation, and compliance auditing aligns with SAP best practices and change-controlled environments, allowing customers to move beyond reactive firefighting.

Reference links:
Tutorial: Troubleshoot an App Using Azure SRE Agent and Azure App Service Preview | Microsoft Learn
Billing for Azure SRE Agent Preview | Microsoft Learn
Incident Management in Azure SRE Agent Preview | Microsoft Learn

SAP RISE & HANA Data Migration: AWS S3 to Azure Blob Storage via Azure Storage Mover
Introduction:
Cloud migration using Azure Storage Mover enables high-volume data transfer from Amazon S3 into Azure Blob Storage as part of SAP on Azure modernization initiatives, including RISE and SAP HANA deployments. Azure Arc multicloud connectors provide control-plane integration with AWS, allowing Azure to authenticate to AWS, discover S3 buckets, and expose them as managed external resources. Storage Mover then orchestrates the end-to-end data movement using S3 source and Blob target endpoints, supporting large-scale migrations with telemetry, integrity validation, and incremental sync options. Migrated datasets land in Azure Blob Storage, where they can be consumed by SAP HANA, SAP Data Intelligence, SAP S/4HANA, BW/4HANA, and downstream Azure analytics or backup workflows. The process completes as the data lands in Azure Blob Storage, establishing a cloud-native destination for ongoing analytics, applications, or archival needs. This guide outlines the essential prerequisites, service limits, and configuration steps required to implement the migration with clarity and operational confidence.

End-to-End Migration with SAP Consumption Diagram:
The migration process begins with Amazon S3, which serves as the source storage containing the files and folders you intend to transfer to Azure. Azure then connects securely to your AWS environment through the Azure Arc Multicloud Connector, which authenticates to AWS, discovers your S3 buckets, and exposes their inventory to Azure as part of its cross-cloud control-plane integration. Once connected, Azure Storage Mover, a fully managed migration engine, executes the data transfer using the configured S3 source endpoint and Azure Blob target endpoint. Storage Mover supports large-scale migrations of up to 500 million objects per job, provides incremental sync options, and offers detailed logging and job tracking for operational visibility.
Once migrated, datasets are staged in Azure Blob Storage, where they can be leveraged by SAP HANA, SAP Data Intelligence, SAP S/4HANA, BW/4HANA, and integrated Azure analytics or backup workflows. Blob Storage is a cloud-native destination that supports containers, lifecycle policies, and seamless integration with Azure analytics and application services such as Azure Synapse, Azure Data Factory, and AI workloads.

1) Prerequisites:
An active Azure subscription with the required permissions to manage Azure Storage Mover and Azure Arc resources.
An AWS account with access to the Amazon S3 bucket you plan to migrate.
A destination Azure Storage account to receive the transferred data.
An Azure Storage Mover resource already deployed in your Azure environment.

2) Service Limits:
Supports up to 500 million objects per migration job.
Allows a maximum of 10 concurrent migration jobs per subscription (higher limits available through support).
Archived objects are not automatically rehydrated: data in Deep Archive or Glacier must be restored before migration.
Private networking is not supported: secure data transfer is enforced using trusted Azure IP ranges.

3) Create Multicloud Connector for AWS:
Use Azure Arc to connect AWS services to Azure, enabling discovery and data transfer operations. Steps include selecting the subscription, region, and AWS account ID, and adding the Inventory and Storage Data Management solutions.

4) Configure Endpoints:
AWS S3 source endpoint: Navigate to Storage Endpoints → Source endpoints → Add endpoint, then select the AWS S3 bucket from the multicloud connector.
Azure Blob Storage target endpoint: Select your subscription, storage account, and blob container, and assign the Storage Blob Data Contributor RBAC role.

5) Creating Migration Project & Job Definition:
Create a migration project under Project Explorer.
Add a job definition specifying the source S3 endpoint and target Blob endpoint.
Select the migration mode: Mirror, Incremental, or Full copy.
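The service limits above determine how a large estate must be partitioned into jobs. As a minimal illustration (this is planning arithmetic only, not a Storage Mover API), the following Python sketch computes how many job definitions a given object inventory needs and how many sequential waves those jobs require under the documented concurrency cap:

```python
# Illustrative planning sketch based on the documented service limits:
# 500 million objects per migration job, 10 concurrent jobs per subscription.
import math

MAX_OBJECTS_PER_JOB = 500_000_000
MAX_CONCURRENT_JOBS = 10

def plan_migration(total_objects: int) -> dict:
    """Return the number of jobs and sequential execution waves needed."""
    jobs = math.ceil(total_objects / MAX_OBJECTS_PER_JOB)
    waves = math.ceil(jobs / MAX_CONCURRENT_JOBS)
    return {"jobs": jobs, "waves": waves}

# Example: a 2.3 billion object dataset needs 5 jobs, runnable in one wave.
print(plan_migration(2_300_000_000))  # {'jobs': 5, 'waves': 1}
```

Estates that exceed ten jobs must be scheduled in waves, or a higher concurrency limit requested through support, as noted in the service limits.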
6) Run & Monitor Migration Job:
Start the job from the Job Properties pane.
Monitor speed, progress percentage, and estimated completion time in the Migration Overview.
Review logs for warnings or transfer errors.

7) Post-Migration Validation:
Verify data integrity and completeness.
Conduct UAT testing.
Enable incremental sync if needed.
Optionally delete the S3 bucket after the migration is validated.

Conclusion:
The combined use of Azure Storage Mover and the Azure Arc Multicloud Connector provides a robust, scalable, and secure architecture for migrating data from Amazon S3 to Azure Blob Storage, especially for SAP, RISE, and HANA modernization scenarios. By extending Azure’s control plane into AWS, Arc enables authenticated discovery of S3 buckets while Storage Mover orchestrates high-throughput, policy-driven data movement using defined source and target endpoints. With support for large object counts, telemetry, and sync capabilities, the solution ensures operational visibility and data integrity throughout the migration lifecycle. Once transferred, data lands in Azure Blob Storage, ready to integrate with SAP HANA, SAP Data Intelligence, SAP S/4HANA, BW/4HANA, and Azure-native analytics and AI services, establishing a high-performance foundation for SAP workloads in the cloud.

Reference links:
Migrate data from Amazon S3 to Azure Storage - Azure Data Factory | Microsoft Learn
3476767 - How-To: Connect S/4HANA on premise with Azure Blob | SAP Knowledge Base Article

Migration from SAP ERP On-Premises to SAP S/4HANA in Microsoft Azure
This article describes the tool-guided migration of an on-premises SAP ERP system into Microsoft Azure, combined with a system conversion to SAP S/4HANA. The Software Update Manager (SUM) acts as the technical engine for the conversion. I will also explain how the SAP Cloud Appliance Library streamlines this process through a step-by-step approach.

In general, there are three primary paths for migrating to SAP S/4HANA:
Selective Data Transition
New Implementation
SAP S/4HANA System Conversion

1. System Conversion
The conversion breaks down into the Preparation Phase and the Realization Phase.

Preparation Phase

System Requirements
This phase ensures that the current SAP ECC system, infrastructure, database, and operating system meet the minimum prerequisites for a conversion to SAP S/4HANA. Key activities include:
Verifying supported OS, DB, and Unicode requirements
Checking add-ons and third-party components
Confirming hardware capacity (CPU, RAM, storage)
Assessing source release compatibility for SUM execution
This step forms the technical foundation before any planning can begin.

Maintenance Planner
SAP Maintenance Planner validates and prepares the system stack for conversion. It checks:
Active add-ons and their compatibility
Installed components and required upgrade paths
Target SAP S/4HANA release stack
Required XML file generation for SUM
Outcome: a stack XML file used by SUM to guide the technical conversion.

Simplification Item Check (SICheck)
SICheck analyzes the ECC system for mandatory functional and technical changes required by SAP S/4HANA. This includes:
Identifying simplification items (for example, Finance, Logistics, and master data changes)
Highlighting inconsistencies in custom or standard objects
Showing mandatory actions before conversion (for example, Customer Vendor Integration (CVI) for Business Partner (BP), Open Item Management updates)
This provides a detailed “to-do list” to bring the system into an S/4HANA-compliant state.
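In practice, the SICheck "to-do list" is a set of items with areas and mandatory pre-conversion actions that teams track to completion. The sketch below illustrates that tracking idea only; the item IDs, fields, and statuses are hypothetical and do not reflect the actual SICheck output format:

```python
# Hypothetical tracker for simplification-item findings; the item names,
# fields, and statuses are illustrative, not the real SICheck output.
items = [
    {"id": "SI_FIN_CVI", "area": "Finance",     "mandatory": True,  "status": "open"},
    {"id": "SI_LOG_ATP", "area": "Logistics",   "mandatory": False, "status": "open"},
    {"id": "SI_MD_BP",   "area": "Master Data", "mandatory": True,  "status": "done"},
]

def open_blockers(items):
    """Mandatory items still open block the start of the SUM conversion."""
    return [i["id"] for i in items if i["mandatory"] and i["status"] != "done"]

print(open_blockers(items))  # ['SI_FIN_CVI']
```

The point of such a view is simply that no mandatory item may remain open before SUM starts; optional items can be deferred to post-conversion cleanup.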
Custom Code Preparation
This phase ensures that custom developments (Z programs, enhancements, exits) will work in the S/4HANA environment. Activities include:
Running ABAP Test Cockpit (ATC) checks
Identifying usage-based custom code via SAP Readiness Check / UPL
Adapting code for removed or deprecated data structures (for example, MATDOC and tables replaced in S/4HANA)
Planning remediation for performance or syntax changes
This ensures custom code does not break after conversion.

Realization Phase

Software Update Manager (SUM)
The Software Update Manager serves as the technical engine for system conversion. It performs:
Database migration to SAP HANA
Software component upgrade to S/4HANA
Data conversion and migration
Technical downtime execution
Post-processing of the system landscape
SUM combines upgrade, migration, and conversion into one guided procedure.

You can perform the conversion using the in-place option, allowing the existing SAP ECC system to remain on-premises. Alternatively, you can combine the move with a transition to a hyperscaler, an approach that becomes particularly powerful in the context of RISE with SAP.

RISE with SAP provides a comprehensive, modular cloud transformation offering that bundles software, infrastructure, and managed services into a single contract. It enables organizations to modernize their SAP landscape by running SAP S/4HANA in a hyperscaler environment (such as Microsoft Azure) while SAP takes responsibility for technical operations at the application layer. This includes lifecycle management, technical monitoring, SLA-backed operations, security patching, and upgrade orchestration. RISE also supports business transformation through embedded tools, extensibility options, and continuous innovation.
By integrating your system conversion with RISE with SAP, you can streamline the journey to S/4HANA, reduce operational overhead, shift from CAPEX (capital expenditure) to OPEX (operational expenditure), and accelerate innovation using cloud-scale capabilities while SUM delivers the technical conversion engine underneath.

Application-Specific Follow-Up Activities
After the technical conversion, functional teams complete configuration and validation tasks specific to their modules. Examples:
Finance: Activation of the Universal Journal, data reconciliation, asset accounting migration
Logistics: Credit management migration, new ATP setup
Security: Role and authorization adjustments
SAP Basis and ABAP team: Fiori activation and launchpad configuration
Techno-functional: Business validation and testing
SAP Basis: Cutover activities and go-live preparation
These steps ensure that the converted system is functional, optimized, and ready for productive use.

Summary View:
Preparation – System Requirements: Ensure the technical foundation is ready
Preparation – Maintenance Planner: Validate the system stack and generate the stack XML
Preparation – SICheck: Identify required functional simplifications
Preparation – Custom Code Preparation: Analyze and adapt custom developments
Realization – Software Update Manager: Perform the technical upgrade and data conversion
Realization – Application Follow-up: Complete module-specific configuration and validation

New Implementation:
New Implementation with DMO (Database Migration Option)
A New Implementation (Greenfield approach) means building a completely new SAP S/4HANA system and migrating selected data into it.
How DMO fits in: DMO is not used on the new S/4HANA system itself; instead, it is used on the source ECC system when needed to support the transition process.
You would use DMO when you want to:
Upgrade and/or migrate the old ECC system to the SAP HANA database temporarily
Enable smoother extraction of data using SAP Migration Cockpit or third-party ETL tools
Prepare the source system technically and functionally before data migration to the new S/4HANA system
DMO prepares the old system, but the final target is a clean, newly installed S/4HANA instance.

We have two options when using the Software Update Manager (SUM):
DMO with System Move: In this scenario, SUM begins the procedure on the source system and then continues execution on the target system. This is typically used when migrating to a new host or infrastructure while performing the upgrade and database migration in one combined process.
DMO Migration Option – Move to SAP S/4HANA on a Hyperscaler (DMOVE2S4): Here, SUM starts an additional application server that runs in the target environment but still belongs to the source system landscape. This enables a controlled transition to a hyperscaler environment (such as Azure) while completing the conversion and migration steps required for SAP S/4HANA.
In both cases, several preparatory tasks must be completed in the target environment; these are described under Target Environment Preparation Tasks below.

Selective Data Transition (SDT) with DMO:
Selective Data Transition is a hybrid approach between Brownfield and Greenfield. It allows moving only the data you choose, such as:
Specific company codes
Selected historical periods
Organizational carve-outs
M&A (merger and acquisition) landscape consolidation
How DMO fits in: DMO is typically used as the first step in preparing the source ECC system. It:
Migrates the source ECC system to SAP HANA
Performs required technical upgrades
Ensures compatibility with the S/4HANA data model
Prepares system objects so that selective extraction tools (SNP, Natuvion, CBS, etc.) can run
After DMO, partner tools extract selected data into the target S/4HANA system.
DMO modernizes and upgrades the source system, enabling selective extraction and migration.

Target Environment Preparation Tasks (for DMO with System Move & DMOVE2S4)
Before SUM can execute migration steps in the target environment, several technical preparations must be completed. These ensure that the new infrastructure (IaaS or hyperscaler) is fully ready for the handover from the source system.

Provisioning the Target Infrastructure
You must set up a clean, properly sized environment that will serve as the new application host or target system. This includes:
Creating virtual machines or hosts (on a hyperscaler)
Ensuring CPU, RAM, and storage meet SAP sizing guidelines
Preparing an appropriate disk layout for /usr/sap, /sapmnt, /hana/shared, and log/data volumes (for HANA scenarios)

Operating System Preparation
The OS must meet SAP and SUM prerequisites:
Install a certified OS version (for example, RHEL, SLES, or Windows if applicable)
Apply required OS patches and kernel versions
Configure OS locales, time synchronization, and system limits (ulimits, transparent huge pages, UUID configuration)
Create SAP system users (for example, <sid>adm and sapadm) if not automatically provisioned

Network and Connectivity Setup
DMO requires bidirectional connectivity between source and target systems:
Open required TCP ports (e.g., DIAG, RFC, HANA SQL ports, SAP Host Agent ports)
Validate hostname resolution using DNS or /etc/hosts
Set up VPN, ExpressRoute, or peering if migrating to a hyperscaler
Ensure no restrictive firewalls block SUM or SAP Host Agent communication

SAP Host Agent Installation
SUM requires a functional Host Agent on the target system:
Install SAP Host Agent (latest version recommended)
Configure the Host Agent service user and permissions
Validate connectivity from the source to the target Host Agent

File System Preparation
Depending on your architecture:
Set up NFS shares for /sapmnt (if shared in distributed system environments)
Prepare directories for SUM extraction and temporary files
Ensure proper ownership and permissions: <sid>adm:sapsys

Database Preparation
For HANA-based targets:
Provision the SAP HANA DB following sizing guidelines
Configure data and log volumes with recommended I/O throughput
Install the correct HANA version compatible with your SUM stack
Validate HANA OS parameters (vm.dirty_background_ratio, THP/huge pages)
Ensure the network configuration supports SAP HANA replication if needed

Software Staging and Media Preparation
For SUM to continue on the target:
Download and stage SAP software media (kernel, stack files, SAP HANA installation media, archives)
Ensure directories are accessible to SUM during the handover phase
Upload SUM SAR files and extract them on the target host if required by your scenario

Security and User Setup
Depending on your landscape:
Configure secure shell (SSH) trust between source and target (for SUM)
Set up service users and groups (sapsys, <sid>adm)
Validate OS-level sudo rules if needed for certain scripts or root actions

Parameter Alignment Between Source & Target
To ensure SUM can continue seamlessly:
Synchronize system parameters (for example, time zone, code page, locale)
Ensure consistent SAP profiles (DEFAULT.PFL, instance profiles)
Confirm kernel patch levels where required

Storage & Backup Preparation
Before running SUM:
Configure snapshot policies if supported by your hyperscaler
Ensure backup tools or agents are installed (Azure Backup, third-party agents)
Validate I/O throughput to avoid SUM performance bottlenecks during migration

Validation & Health Checks
Before starting SUM:
Run OS validation scripts provided by SAP
Test Host Agent connectivity from the source system
Confirm network speed between source and target meets SAP minimum requirements
Validate free disk space for SUM logs, dumps, and temporary directories

Summary Comparison
New Implementation – Target system: a brand-new S/4HANA installation. Role of DMO: prepares the source ECC (HANA migration + upgrade) before extracting data. When to use: modernization, redesign, best-practice adoption.
Selective Data Transition – Target system: a part-new, part-reused S/4HANA system. Role of DMO: prepares the source ECC (technical readiness for selective extraction). When to use: carve-outs, mergers, consolidations, partial history moves.

System Conversion – Data Migration Option to Microsoft Azure:
The diagram illustrates a coordinated, tool-driven, end-to-end migration path where:
The Customer sets direction and validates outcomes
Maintenance Planner creates a certified conversion plan
SAP CAL automatically provisions the Azure target landscape
SUM executes all technical conversion and database migration steps
Together, these components create a standardized, repeatable, and automated path to move from SAP ERP on-premises to SAP S/4HANA on Microsoft Azure.

Conclusion:
The Database Migration Option (DMO) of the Software Update Manager provides a powerful and flexible framework for transitioning SAP systems to SAP S/4HANA, whether through a classic system conversion, a new implementation scenario, or a selective data transition approach. Both DMO with System Move and DMOVE2S4 extend these capabilities by enabling migrations to new infrastructure or hyperscale environments while maintaining a controlled, SAP-supported technical procedure.

Regardless of which DMO scenario is selected, success hinges on thoroughly preparing the target environment. Proper provisioning of infrastructure, operating system configuration, network readiness, SAP Host Agent installation, file system setup, software staging, and security alignment ensure a smooth and stable handover from source to target. These preparation activities minimize technical risk, reduce downtime, and enable SUM to execute migration and conversion with high reliability.
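The connectivity and free-space validations described under the target-environment preparation tasks can be scripted as a small preflight. The Python sketch below is illustrative only; the hostname, port, and threshold values are placeholders, not SAP-mandated figures:

```python
# Minimal preflight sketch for target-environment validation; the hostname,
# port, and disk-space threshold below are illustrative placeholders.
import shutil
import socket

def tcp_reachable(host: str, port: int, timeout: float = 2.0) -> bool:
    """Check that a required TCP port (e.g., SAP Host Agent) accepts connections."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def enough_free_space(path: str, required_gib: float) -> bool:
    """Check free disk space for SUM logs, dumps, and temporary directories."""
    free_gib = shutil.disk_usage(path).free / 1024**3
    return free_gib >= required_gib

# Example usage with placeholder values:
checks = {
    "host agent port": tcp_reachable("target-host.example.com", 1128),
    "SUM work dir space": enough_free_space("/tmp", 50),
}
for name, ok in checks.items():
    print(f"{name}: {'OK' if ok else 'FAIL'}")
```

Running such a script on the target host before starting SUM gives a quick, repeatable pass/fail view of the checklist items, alongside the official OS validation scripts provided by SAP.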
By combining SAP’s proven migration tooling with a well-prepared target landscape, organizations can confidently modernize their SAP footprint, leverage hyperscale scalability, and move toward a future-ready SAP S/4HANA platform aligned with cloud transformation strategies.

Reference links:
SUM & DMO on SAP Help Portal: Software Update Manager | SAP Help Portal
SAP Note 2377305 – DMO: Database Migration Option: https://me.sap.com/notes/2377305
SAP Readiness Check: Integration Between SAP Readiness Check and SAP Cloud ALM | SAP Help Portal
SAP CAL Homepage: https://cal.sap.com/
RISE with SAP Overview: RISE with SAP | Transformation journey to SAP Business Suite
RISE with SAP S/4HANA Cloud documentation: https://help.sap.com/docs/RISE_WITH_SAP
SAP on Azure migration: SAP on Azure Migration – SAP Intelligent Enterprise | Microsoft Azure

Azure delivers the first cloud VM with Intel Xeon 6 and CXL memory - now in Private Preview
Intel released its new Intel Xeon 6 6500/6700 series processors with P-cores this year. Intel Xeon 6 processors deliver outstanding performance and scalability for transactional and analytical workloads, with scale-up capacities of up to 64 TB of memory. In addition, Intel Xeon 6 supports the new Compute Express Link (CXL) standard, which enables memory expansion to accommodate larger data sets in a cost-effective manner. CXL Flat Memory Mode is a unique Intel Xeon 6 capability that enhances the ability to right-size the compute-to-memory ratio and improve scalability without sacrificing performance. This can help run SAP S/4HANA more efficiently, enable greater configuration flexibility to better align with business needs, and improve total cost of ownership.

In collaboration with SAP and Intel, Microsoft is delighted to announce the private preview of CXL technology on the Azure M-series family of VMs. We believe that, when combined with advancements in the new Intel Xeon 6 processors, it can tackle the challenges of managing the growing volume of data in SAP software, meet the increased demand for faster compute performance, and reduce overall TCO.

Stefan Bäuerle, SVP, Head of BTP, HANA & Persistency at SAP, noted: “Intel Xeon 6 helps deliver system scalability to support the growing demand for high-performance computing and growing database capacity among SAP customers.”

Elyse Ge Hylander, Senior Director, Azure SAP Compute, stated: “At Microsoft, we are continually exploring new technological innovations to improve our customer experience. We are thrilled about the potential of Intel’s new Xeon 6 processors with CXL and Flat Memory Mode.
This is a big step forward to deliver the next-level performance, reliability, and scalability to meet the growing demands of our customers.”

Bill Pearson, Vice President of Data Center and Artificial Intelligence at Intel, states: “Intel Xeon 6 represents a significant advancement for Intel, opening up exciting business opportunities to strengthen our collaboration with Microsoft Azure and SAP. The innovative instance architecture featuring CXL Flat Memory Mode is designed to enhance cost efficiency and performance optimization for SAP software and SAP customers.”

If you are interested in joining our CXL private preview in Azure, contact Mseries_CXL_Preview@microsoft.com

Co-author: Phyllis Ng - Senior Director of Hardware Strategic Planning (Memory and Storage) - Microsoft

SAP on Azure Product Announcements Summary – SAP TechEd 2025
Today at SAP TechEd 2025, we are excited to share the next evolution of the Microsoft-SAP partnership. Building on decades of collaboration, we continue to advance RISE with SAP on Azure and deepen integrations with SAP S/4HANA Cloud public edition. Our latest innovations deliver enhanced security for SAP and non-SAP workloads, while unified analytics and AI-driven Copilot experiences empower customers to make smarter decisions. These advancements are designed to help customers accelerate their digital transformation, drive operational excellence, and unlock new business value.

Customer Spotlight: Medline
Medline’s SAP transformation on Microsoft Azure is fueling new levels of agility and intelligence across its operations. The company’s migration boosted system resilience, improved key SAP workload transaction times by more than 80%, and enabled real-time collaboration and predictive analytics for clinicians and business users - laying the groundwork to extend these insights through Copilot and Azure AI.
“When we partnered on the migration, it ushered in a completely new way in which Microsoft and Medline work together. It became a partnership, with the cloud migration becoming a stepping stone to bigger and brighter, more business-outcome–driven engagements.” — Jason Kaley, SVP, IT Operations & Architecture, Medline

Customer Spotlight: Commerz Real
Commerz Real, a German financial services firm specializing in real estate, infrastructure, and leasing, modernized its SAP infrastructure by migrating its complete SAP landscape to SAP RISE on Azure. Built to address stringent regulatory, security, and performance demands, the platform delivers high scalability, real-time monitoring, and faster, more stable operations.
“The decision to use Microsoft Azure was a deliberate one. In the past, security concerns and strict regulatory requirements kept us from moving SAP to the cloud.
Today we say: If you don’t do that, you won’t survive in the market.” — Nadine Felderer, Head of SAP Services, Commerz Real

We are pleased to announce additional SAP with Microsoft product updates and details to further help customers innovate on the most trusted cloud for SAP:
Bi-directional agent-to-agent communication between Microsoft Copilot and SAP Joule.
Enterprise-ready SAP API enablement for AI through MCP in Azure API Management.
General availability of our agentless Sentinel for SAP data connector, with significantly simpler onboarding through SAP Integration Suite.
Ready for the future: SAP released the S/4HANA Cloud public edition add-on for our Sentinel solution for SAP.
Microsoft Entra ID advances SAP identity governance with new OAuth 2.0 support, an SAP IAG integration preview, and expanded SAP Access Control migration for unified, secure access.
Advanced support for high availability with SAP ASE (Sybase) database backup on Azure Backup.
SAP Deployment Automation Framework now supports highly available scale-out architectures with HANA System Replication for large-scale resilient configurations.
SAP Testing Automation Framework enhances high availability testing with offline Pacemaker cluster validation for RHEL/SUSE and native Linux-based validation tools and quality checks.
Enhanced SAP Inventory and Observability Dashboard to reduce operational risk and support production-ready SAP systems, along with a customizable Windows Quality Checks PowerShell template.

Let's dive into the summary details of product updates and services.

Extend, Innovate, and Secure

Copilot Studio and SAP Joule
Since the release of the Joule and Copilot integration earlier this year, we have seen great interest and adoption with customers and partners. The Joule-as-a-host integration is planned to be released later this year.
Integrating Joule with Microsoft 365 Copilot | SAP Help Portal

For customers on their journey towards RISE and GROW, we also worked with the Azure API Management team to enable the exposure of SAP OData services from your SAP systems as an MCP server, which can then be consumed in Copilot using Microsoft Copilot Studio. This enables end-users to interact with their SAP system based on any OData service. For more details, check out Expose REST API in API Management as MCP server and Copilot + SAP: Azure API Management, MCP and SAP OData.

To simplify the integration and help customers and partners get started faster, we are releasing a preconfigured Copilot Studio agent that can orchestrate across other agents such as SAP, Fabric, and Microsoft 365. Customers can use these agents out of the box or use them as a foundation to extend and build their own Copilot agents.

Microsoft Security for SAP
Security is being reengineered for the AI era - moving beyond static, rule-bound controls and after-the-fact response toward platform-led, machine-speed defense. Attackers think in graphs - Microsoft does too. We are bringing relationship-aware context to the Microsoft Security suite - so defenders and AI can see connections, understand the impact of a potential compromise (blast radius), and act faster across pre-breach and post-breach scenarios.

SAP S/4HANA Cloud public edition Add-on for Microsoft Sentinel for SAP (preview): Enables deep, native integration of SAP telemetry with Sentinel, bringing advanced threat detection, investigation, and response to SAP workloads running in the cloud.

Microsoft Sentinel for SAP Agentless Data Connector: Now generally available, the agentless connector significantly simplifies deployment while delivering secure, high-fidelity ingestion of SAP audit and application logs into Sentinel.
Expanded Security Guidance: Enhanced guidance for Microsoft Defender, Ransomware Protection, and Cyber Defense for SAP, helping customers implement best practices for hardening SAP environments and responding to evolving threats.

Cost-Efficient Long-Term Log Storage: Organizations can now take advantage of Sentinel Data Lake to retain SAP logs for 12 years at scale for compliance (NIS2, DORA) and forensic use cases - at a fraction of traditional storage costs.

Microsoft Purview is shipping the most requested feature updates for our existing SAP connectors (SNC mode support in preview, CDS view support, and scoped metadata scanning) and a new connector for BW/4HANA.

SAP has reiterated the end of maintenance for SAP Identity Management (SAP IDM) by the end of 2027 and is collaborating with Microsoft so customers can migrate identity scenarios to Microsoft Entra ID as the recommended successor approach.

Provisioning backbone in place: Microsoft Entra released new features for the built-in connector for SAP Cloud Identity Services (CIS) to support authentication with OAuth 2.0 and provisioning of groups to streamline authorization management in downstream SAP targets like SAP S/4HANA and SAP BTP, enabling HR-driven, end-to-end identity lifecycles.

Private Preview: Microsoft Entra Integration with SAP IAG: The private preview for Microsoft Entra integration with SAP Identity Access Governance (IAG) is now underway. Selected customers are testing Entra ID Governance access packages that include SAP IAG roles as resources, routing of access approvals through SAP IAG, and provisioning of roles across both systems. Sign up here.

Enhanced Integration Scope with SAP Access Control (AC): Driven by direct customer feedback, Microsoft and SAP are expanding the migration and integration scope to include SAP Access Control (AC).
This enhancement will enable comprehensive access management, risk analysis, and policy enforcement on-premises, leveraging Microsoft Entra’s governance capabilities for improved security and compliance. Together, these innovations give customers end-to-end visibility and protection across SAP landscapes - spanning public cloud, hybrid, and on-premises deployments.

SAP on Azure Software Products and Services

Azure Backup for SAP
We are committed to expanding backup support for additional SAP workloads. Following the general availability of ASE backup, we have further enhanced its capabilities with the introduction of high availability configuration support. This enhancement delivers automatic backup support for SAP systems set up with Replication Server, ensuring seamless protection after failover or failback events without the need for manual intervention. As a result, users benefit from immediate and continuous data protection, along with a simplified restore process using a single backup chain.

We have expanded our snapshot backup capability for SAP HANA by adding Recovery Services vault support. This will help customers store their snapshot backups with long-term retention while gaining protection from ransomware attacks. Vault support brings capabilities like immutability, soft-delete enablement, and multi-user authorization to further safeguard the data. We have also launched the preview of scale-out support configurations for SAP HANA streaming backup, expanding our overall topology support.

SAP Deployment Automation Framework
We are releasing updates to the SAP Deployment Automation Framework (SDAF) and SAP Testing Automation Framework (STAF) that expand testing coverage, improve reliability, and provide additional deployment flexibility for SAP environments on Azure.

SAP Deployment Automation Framework (SDAF)
SDAF deployment and configuration scenarios now include scale-out architectures with HANA System Replication (HSR).
This enhancement addresses resiliency requirements for large-scale deployments requiring multi-node scale-out configurations with built-in replication capabilities.

SDAF now supports GitHub Actions in addition to existing deployment methods, including Azure DevOps pipelines, CLI scripts, and the WebApp interface. Organizations using GitHub for source control and infrastructure management can now deploy and manage SAP environments using their existing workflows and tooling preferences.

SAP Testing Automation Framework (STAF)

STAF now supports offline validation for SAP Pacemaker clusters. This capability enables testing of resource agent failover mechanisms without executing live cluster operations, reducing risk during validation cycles and allowing pre-deployment verification of high availability configurations.

The high availability testing suite has been updated to include SAPHanaSR-angi tests, ensuring compatibility with SUSE Linux Enterprise Server 15 and SAP HANA 2.0 SP5 environments. This update addresses the requirements of organizations running current SAP HANA releases on modern SUSE distributions.

Configuration Checks, now in preview, is a rewrite of the open-source Quality Checks tool, integrated as a native capability within STAF. The tool validates SAP on Azure installations against Microsoft reference architecture and configuration guidance.

Azure Center and Azure Monitor for SAP Solutions

We are pleased to share that Azure Center for SAP solutions (ACSS) is now available in Italy North, providing end-to-end SAP workload management to more customers across Europe. Additionally, Azure Monitor for SAP solutions (AMS) is now available in Italy North. AMS continues to help SAP customers reliably monitor their mission-critical workloads on Azure with comprehensive insights.

Get started:
Azure Center for SAP solutions | Microsoft Learn
What is Azure Monitor for SAP solutions? | Microsoft Learn
Azure Portal: Azure Center for SAP solutions

Tools and Frameworks

We have refreshed our SAP on Azure Well-Architected Framework and the accompanying SAP on Azure Assessment to reflect the latest platform guidance. The update aligns with recent Azure innovations, including VMSS Flex, Premium SSD v2, Capacity Reservation Groups, the Mv3 series, and NVMe-based SKUs, so architects and administrators can plan and deploy with current best practices. The assessment is also now surfaced on the main Assessments hub for easier access and can be used as a repeatable checkpoint throughout your SAP deployment lifecycle.

Quality Checks (PowerShell) for Windows: We have published a lightweight, read-only script for customers running SAP on Windows and SQL Server on Microsoft Azure. It performs post-provisioning health checks and outputs a color-coded HTML report plus JSON. Use it as a baseline template: customize the thresholds to your environment, and feel free to contribute enhancements to cover your configuration requirements.

Observability Dashboard: Based on customer feedback, we have expanded the dashboard to surface design-impacting signals for running specialized workloads on Azure. It now offers Overview, Security, Networking, and Inventory views, plus extended reports for managers and hands-on engineers. Updates make it easier to review VM redundancy, spot orphaned resources, see Capacity Reservation Groups with their associated VMs in the primary region, and count public IPs on the Basic SKU, helping you stay on top of infrastructure hygiene and avoid unsupported configurations.

SAP + Microsoft Co-Innovations

Microsoft and SAP are always working on new solutions to help our customers adapt and grow their businesses in several areas, including AI, Business Suite, Data, Cloud ERP, Security, and SAP BTP, among others.
Recently, we started a new era of agentic AIOps collaboration between SAP and Microsoft, with a fully orchestrated multi-agent ecosystem for mission-critical workloads. Please check out this blog to learn more.

Agentic Integration with SAP, ServiceNow, and Salesforce
Copilot/Copilot Studio Integration with SAP (No Code)

By integrating SAP Cloud Identity Services with Microsoft Entra ID, organizations can establish secure, federated identity management across platforms. This configuration enables Microsoft Copilot and Teams to connect seamlessly with SAP's Joule digital assistant, supporting natural language interactions and automating business processes efficiently.

Key resources from the SAP documentation (image courtesy of SAP):
Configuring SAP Cloud Identity Services and Microsoft Entra ID for Joule
Enable Microsoft Copilot and Teams to Pass Requests to Joule

Copilot Studio Integration with ServiceNow and Salesforce (No Code)

Integration with ServiceNow and Salesforce has two main approaches:

Copilot agents using Copilot Studio: Custom agents can be built in Copilot Studio to interact directly with Salesforce CRM data or ServiceNow knowledge bases and helpdesk tickets. This enables organizations to automate sales and support processes using conversational AI.
Create a custom sales agent using your Salesforce CRM data (YouTube)
ServiceNow Connect Knowledge Base + Helpdesk Tickets (YouTube)

Third-party agents using Copilot for Service Agent: Microsoft Copilot can be embedded within Salesforce and ServiceNow interfaces, providing users with contextual assistance and workflow automation directly inside these platforms.
Set up the embedded experience in Salesforce
Set up the embedded experience in ServiceNow

MCP or Agent-to-Agent (A2A) Interoperability (Pro Code) (image courtesy of SAP)

If you choose a pro-code approach, you can either implement the Model Context Protocol (MCP) in a client/server setup for SAP, ServiceNow, and Salesforce, or leverage existing agents for these third-party services using Agent-to-Agent (A2A) integration. Depending on your requirements, you may use either method individually or combine them. The recently released Azure Agent Framework offers practical examples for both MCP and A2A implementations.
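To make the MCP side of the pro-code approach a little more concrete: MCP messages are framed as JSON-RPC 2.0, so whatever client library you use, an agent calling a tool on an SAP-facing MCP server ultimately sends a request shaped like the one below. The tool name `sap_read_sales_orders` and its arguments are purely hypothetical examples; a real MCP server advertises its actual tools through the `tools/list` method.

```python
import json

def jsonrpc_request(req_id: int, method: str, params: dict) -> str:
    """Frame a JSON-RPC 2.0 request, the wire format MCP builds on."""
    return json.dumps({"jsonrpc": "2.0", "id": req_id, "method": method, "params": params})

# Hypothetical tool and arguments for illustration only -- discover the
# real tool names with a "tools/list" request against your MCP server.
call = jsonrpc_request(
    1,
    "tools/call",
    {"name": "sap_read_sales_orders", "arguments": {"customer_id": "100042", "top": 5}},
)
print(call)
```

The same framing applies regardless of transport (stdio or HTTP), which is what makes MCP servers for SAP, ServiceNow, and Salesforce interchangeable from the agent's point of view.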
Below is the detailed SAP reference architecture, illustrating how A2A solutions can be layered on top of SAP systems to enable modular, scalable automation and data exchange.

Agent2Agent Interoperability | SAP Architecture Center

Logic Apps as Integration Actions

Logic Apps is a key component of the Azure integration platform. Among its many connectors, it offers connectors for all three platforms (SAP, ServiceNow, and Salesforce). Logic Apps can be invoked from a custom agent (as a built-in action in Foundry) or from a Copilot agent. The same applies to Power Platform/Power Automate.

Conclusion

This article provides a comprehensive overview of how Microsoft Copilot, Copilot Studio, Foundry (via A2A/MCP), and Azure Logic Apps can be combined to deliver robust, agentic integrations with SAP, ServiceNow, and Salesforce. The narrative highlights the importance of secure identity federation, modular agent orchestration, and low-code/pro-code automation in building next-generation enterprise solutions.
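As a closing sketch of the Logic Apps pattern described above: a workflow exposed through the HTTP Request trigger is invoked with a plain HTTPS POST to its callback URL, which is typically how an agent action calls it. The URL and payload fields below are placeholders, not real endpoints; copy the actual callback URL from your workflow's trigger in the Azure portal before uncommenting the send.

```python
import json
import urllib.request

# Placeholder callback URL -- replace with the real URL shown on the
# Logic App's "When a HTTP request is received" trigger.
LOGIC_APP_URL = "https://example.logic.azure.com/workflows/demo/triggers/manual/paths/invoke"

# Illustrative payload; the workflow's request schema defines the real shape.
payload = {"system": "SAP", "action": "create_ticket", "summary": "Demo from agent"}

req = urllib.request.Request(
    LOGIC_APP_URL,
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)
# urllib.request.urlopen(req)  # uncomment once a real callback URL is in place
print(req.method, len(req.data), "bytes")
```

The same single-POST call path works whether the caller is a Foundry agent action, a Copilot Studio agent, or a Power Automate flow.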