Azure delivers the first cloud VM with Intel Xeon 6 and CXL memory - now in Private Preview
Intel released its new Intel Xeon 6 6500/6700 series processors with P-cores this year. Intel Xeon 6 processors deliver outstanding performance and scalability for transactional and analytical workloads and provide scale-up capacities of up to 64 TB of memory. In addition, Intel Xeon 6 supports the new Compute Express Link (CXL) standard, which enables memory expansion to accommodate larger data sets in a cost-effective manner. CXL Flat Memory Mode is a unique Intel Xeon 6 capability that makes it easier to right-size the compute-to-memory ratio and improve scalability without sacrificing performance. This can help run SAP S/4HANA more efficiently, enable greater configuration flexibility to better align with business needs, and improve total cost of ownership.

In collaboration with SAP and Intel, Microsoft is delighted to announce the private preview of CXL technology on the Azure M-series family of VMs. We believe that, combined with the advancements in the new Intel Xeon 6 processors, it can tackle the challenges of managing the growing volume of data in SAP software, meet the increased demand for faster compute performance, and reduce overall TCO.

Stefan Bäuerle, SVP, Head of BTP, HANA & Persistency at SAP, noted: "Intel Xeon 6 helps deliver system scalability to support the growing demand for high-performance computing and growing database capacity among SAP customers."

Elyse Ge Hylander, Senior Director, Azure SAP Compute, stated: "At Microsoft, we are continually exploring new technological innovations to improve our customer experience. We are thrilled about the potential of Intel's new Xeon 6 processors with CXL and Flat Memory Mode. This is a big step forward to deliver the next-level performance, reliability, and scalability to meet the growing demands of our customers."

Bill Pearson, Vice President of Data Center and Artificial Intelligence at Intel, stated: "Intel Xeon 6 represents a significant advancement for Intel, opening up exciting business opportunities to strengthen our collaboration with Microsoft Azure and SAP. The innovative instance architecture featuring CXL Flat Memory Mode is designed to enhance cost efficiency and performance optimization for SAP software and SAP customers."

If you are interested in joining our CXL private preview in Azure, contact Mseries_CXL_Preview@microsoft.com.

Co-author: Phyllis Ng - Senior Director of Hardware Strategic Planning (Memory and Storage) - Microsoft

SAP on Azure Product Announcements Summary – SAP TechEd 2025
Today at SAP TechEd 2025, we are excited to share the next evolution of the Microsoft-SAP partnership. Building on decades of collaboration, we continue to advance RISE with SAP on Azure and deepen integrations with SAP S/4HANA Cloud public edition. Our latest innovations deliver enhanced security for SAP and non-SAP workloads, while unified analytics and AI-driven Copilot experiences empower customers to make smarter decisions. These advancements are designed to help customers accelerate their digital transformation, drive operational excellence, and unlock new business value.

Customer Spotlight: Medline
Medline's SAP transformation on Microsoft Azure is fueling new levels of agility and intelligence across its operations. The company's migration boosted system resilience, improved key SAP workload transaction times by more than 80%, and enabled real-time collaboration and predictive analytics for clinicians and business users - laying the groundwork to extend these insights through Copilot and Azure AI.

"When we partnered on the migration, it ushered in a completely new way in which Microsoft and Medline work together. It became a partnership, with the cloud migration becoming a stepping stone to bigger and brighter, more business-outcome-driven engagements." - Jason Kaley, SVP, IT Operations & Architecture, Medline

Customer Spotlight: Commerz Real
Commerz Real, a German financial services firm specializing in real estate, infrastructure, and leasing, modernized its SAP infrastructure by migrating its complete SAP landscape to SAP RISE on Azure. Built to address stringent regulatory, security, and performance demands, the platform delivers high scalability, real-time monitoring, and faster, more stable operations.

"The decision to use Microsoft Azure was a deliberate one. In the past, security concerns and strict regulatory requirements kept us from moving SAP to the cloud. Today we say: If you don't do that, you won't survive in the market." - Nadine Felderer, Head of SAP Services, Commerz Real

We are pleased to announce additional SAP with Microsoft product updates to further help customers innovate on the most trusted cloud for SAP:
- Bi-directional agent-to-agent communication between Microsoft Copilot and SAP Joule.
- Enterprise-ready SAP API enablement for AI through MCP in Azure API Management.
- General availability of our agentless Sentinel for SAP data connector, with significantly simpler onboarding through SAP Integration Suite.
- Ready for the future: SAP released an S/4HANA Cloud public edition add-on for our Sentinel solution for SAP.
- Microsoft Entra ID advances SAP identity governance with new OAuth 2.0 support, an SAP IAG integration preview, and expanded SAP Access Control migration for unified, secure access.
- Advanced support for high availability with SAP ASE (Sybase) database backup on Azure Backup.
- SAP Deployment Automation Framework now supports highly available scale-out architectures with HANA System Replication for large-scale resilient configurations.
- SAP Testing Automation Framework enhances high availability testing with offline Pacemaker cluster validation for RHEL/SUSE and native Linux-based quality checks.
- An enhanced SAP Inventory and Observability Dashboard that reduces operational risk and supports production-ready SAP systems, along with a customizable Windows Quality Checks PowerShell template.

Let's dive into the summary details of product updates and services.
Extend and Innovate and Secure

Copilot Studio and SAP Joule
Since the release of the Joule and Copilot integration earlier this year, we have seen great interest and adoption among customers and partners. The "Joule as host" integration is planned to be released later this year: Integrating Joule with Microsoft 365 Copilot | SAP Help Portal

For customers on their journey towards RISE and GROW, we also worked with the Azure API Management team to enable the exposure of SAP OData services from your SAP systems as an MCP server, which can then be consumed in Copilot using Microsoft Copilot Studio. This enables end-users to interact with their SAP system through any OData service. For more details, check out Expose REST API in API Management as MCP server and Copilot + SAP: Azure API Management, MCP and SAP OData.

To simplify the integration and help customers and partners get started faster, we are releasing a preconfigured Copilot Studio agent that can orchestrate across other agents such as SAP, Fabric, and Microsoft 365. Customers can use these agents out of the box or use them as a foundation to extend and build their own Copilot agents.

Microsoft Security for SAP
Security is being reengineered for the AI era - moving beyond static, rule-bound controls and after-the-fact response toward platform-led, machine-speed defense. Attackers think in graphs - Microsoft does too. We are bringing relationship-aware context to the Microsoft Security suite so defenders and AI can see connections, understand the impact of a potential compromise (blast radius), and act faster across pre-breach and post-breach scenarios.
- SAP S/4HANA Cloud public edition add-on for Microsoft Sentinel for SAP (preview): Enables deep, native integration of SAP telemetry with Sentinel, bringing advanced threat detection, investigation, and response to SAP workloads running in the cloud.
- Microsoft Sentinel for SAP agentless data connector: Now generally available, the agentless connector significantly simplifies deployment while delivering secure, high-fidelity ingestion of SAP audit and application logs into Sentinel.
- Expanded security guidance: Enhanced guidance for Microsoft Defender, ransomware protection, and cyber defense for SAP, helping customers implement best practices for hardening SAP environments and responding to evolving threats.
- Cost-efficient long-term log storage: Organizations can now take advantage of Sentinel Data Lake to retain SAP logs for 12 years at scale for compliance (NIS2, DORA) and forensic use cases - at a fraction of traditional storage costs.
- Purview is shipping its most-requested feature updates for our existing SAP connectors (SNC mode support in preview, CDS view support, and scoped metadata scanning) and a new connector for BW/4HANA.

SAP has reiterated the end of maintenance for SAP Identity Management (SAP IDM) by the end of 2027 and is collaborating with Microsoft so customers can migrate identity scenarios to Microsoft Entra ID as the recommended successor.

Provisioning backbone in place: Microsoft Entra released new features for the built-in connector for SAP Cloud Identity Services (CIS) to support authentication with OAuth 2.0 and provisioning of groups to streamline authorization management in downstream SAP targets like SAP S/4HANA and SAP BTP, enabling HR-driven, end-to-end identity lifecycles.

Private preview: Microsoft Entra integration with SAP IAG: The private preview for Microsoft Entra integration with SAP Identity Access Governance (IAG) is now underway.
Selected customers are testing Entra ID Governance access packages that include SAP IAG roles as resources, routing of access approvals through SAP IAG, and provisioning of roles across both systems. Sign up here.

Enhanced integration scope with SAP Access Control (AC): Driven by direct customer feedback, Microsoft and SAP are expanding the migration and integration scope to include SAP Access Control (AC). This enhancement will enable comprehensive access management, risk analysis, and policy enforcement on-premises, leveraging Microsoft Entra's governance capabilities for improved security and compliance.

Together, these innovations give customers end-to-end visibility and protection across SAP landscapes - spanning public cloud, hybrid, and on-premises deployments.

SAP on Azure Software Products and Services

Azure Backup for SAP
We are committed to expanding backup support for additional SAP workloads. Following the general availability of ASE backup, we have further enhanced its capabilities with the introduction of high availability configuration support. This enhancement delivers automatic backup support for SAP systems set up with Replication Server, ensuring seamless protection after failover or failback events without the need for manual intervention. As a result, users benefit from immediate and continuous data protection, along with a simplified restore process using a single backup chain.

We have expanded our snapshot backup capability for SAP HANA by adding Recovery Services vault support. This helps customers store their snapshot backups with long-term retention while gaining protection from ransomware attacks. Vault support brings in capabilities like immutability, soft delete, and multi-user authorization to further safeguard the data. We have also launched the preview of scale-out configuration support for SAP HANA streaming backup, expanding our overall topology support.

SAP Deployment Automation Framework
We are releasing updates to the SAP Deployment Automation Framework (SDAF) and SAP Testing Automation Framework (STAF) that expand testing coverage, improve reliability, and provide additional deployment flexibility for SAP environments on Azure.

SAP Deployment Automation Framework (SDAF)
SDAF deployment and configuration scenarios now include scale-out architectures with HANA System Replication (HSR). This enhancement addresses resiliency requirements for large-scale deployments requiring multi-node scale-out configurations with built-in replication capabilities.

SDAF now supports GitHub Actions in addition to existing deployment methods, including Azure DevOps pipelines, CLI scripts, and the WebApp interface. Organizations using GitHub for source control and infrastructure management can now deploy and manage SAP environments using their existing workflows and tooling preferences.

SAP Testing Automation Framework (STAF)
STAF now supports offline validation for SAP Pacemaker clusters. This capability enables testing of resource agent failover mechanisms without executing live cluster operations, reducing risk during validation cycles and allowing for pre-deployment verification of high availability configurations.

The high availability testing suite has been updated to include SAPHanaSR-angi tests, ensuring compatibility with SUSE Linux Enterprise Server 15 and SAP HANA 2.0 SP5 environments. This update addresses the requirements of organizations running current SAP HANA releases on modern SUSE distributions.
Configuration Checks, now in preview, is a rewrite of the open-source Quality Checks tool, integrated as a native capability within STAF. This tool validates SAP on Azure installations against Microsoft reference architecture and configuration guidance.

Azure Center and Azure Monitor for SAP solutions
We are pleased to share that Azure Center for SAP solutions (ACSS) is now available in Italy North, providing end-to-end SAP workload management to more customers across Europe. Additionally, Azure Monitor for SAP solutions (AMS) is now available in Italy North. AMS continues to help SAP customers reliably monitor their mission-critical workloads on Azure with comprehensive insights. Get started: Azure Center for SAP solutions | Microsoft Learn and What is Azure Monitor for SAP solutions? | Microsoft Learn (Azure portal: Azure Center for SAP solutions).

Tools and Frameworks
We have refreshed our SAP on Azure Well-Architected Framework and the accompanying SAP on Azure Assessment to reflect the latest platform guidance. The update aligns with recent Azure innovations - including VMSS Flex, Premium SSD v2, Capacity Reservation Groups, Mv3-series, and NVMe-based SKUs - so architects and admins can plan and deploy with current best practices. The assessment is also now surfaced on the main Assessments hub for easier access and can be used as a repeatable checkpoint throughout your SAP deployment lifecycle.

Quality Checks (PowerShell) for Windows: We have published a lightweight, read-only script for customers running SAP on Windows and SQL Server on Microsoft Azure. It performs post-provisioning health checks and outputs a color-coded HTML report plus JSON. Use it as a baseline template - customize the thresholds to your environment, and feel free to contribute enhancements to cover your configuration requirements.

Observability Dashboard: Based on customer feedback, we have expanded the dashboard to surface design-impacting signals for running specialized workloads on Azure. It now offers Overview, Security, Networking, and Inventory views, plus extended reports for managers and hands-on engineers. Updates make it easier to review VM redundancy, spot orphaned resources, see Capacity Reservation Groups with their associated VMs in the primary region, and count Public IPs on the Basic SKU - helping you stay on top of infrastructure hygiene and avoid unsupported configurations.

SAP + Microsoft Co-Innovations
Microsoft and SAP are always working on new solutions to help our customers adapt and grow their businesses across several areas, including AI, Business Suite, Data, Cloud ERP, Security, and SAP BTP. Recently, we started a new era of Agentic AIOps collaboration between SAP and Microsoft with a fully orchestrated multi-agent ecosystem for mission-critical workloads. Please check out this blog to learn more.

Agentic Integration with SAP, ServiceNow, and Salesforce
Copilot/Copilot Studio Integration with SAP (No Code)
By integrating SAP Cloud Identity Services with Microsoft Entra ID, organizations can establish secure, federated identity management across platforms. This configuration enables Microsoft Copilot and Teams to seamlessly connect with SAP's Joule digital assistant, supporting natural language interactions and automating business processes efficiently. Key resources from the SAP documentation (image courtesy SAP):
- Configuring SAP Cloud Identity Services and Microsoft Entra ID for Joule
- Enable Microsoft Copilot and Teams to Pass Requests to Joule

Copilot Studio Integration with ServiceNow and Salesforce (No Code)
Integration with ServiceNow and Salesforce has two main approaches:
- Copilot agents using Copilot Studio: Custom agents can be built in Copilot Studio to interact directly with Salesforce CRM data or ServiceNow knowledge bases and helpdesk tickets. This enables organizations to automate sales and support processes using conversational AI. See: Create a custom sales agent using your Salesforce CRM data (YouTube) and ServiceNow Connect Knowledge Base + Helpdesk Tickets (YouTube).
- Third-party agents using Copilot for Service Agent: Microsoft Copilot can be embedded within Salesforce and ServiceNow interfaces, providing users with contextual assistance and workflow automation directly inside these platforms. See: Set up the embedded experience in Salesforce and Set up the embedded experience in ServiceNow.

MCP or Agent-to-Agent (A2A) Interoperability (Pro Code) - (image courtesy SAP)
If you choose a pro-code approach, you can either implement the Model Context Protocol (MCP) in a client/server setup for SAP, ServiceNow, and Salesforce, or leverage existing agents for these third-party services using Agent-to-Agent (A2A) integration. Depending on your requirements, you may use either method individually or combine them. The recently released Azure Agent Framework offers practical examples for both MCP and A2A implementations. The detailed SAP reference architecture illustrates how A2A solutions can be layered on top of SAP systems to enable modular, scalable automation and data exchange: Agent2Agent Interoperability | SAP Architecture Center.

Logic Apps as Integration Actions
Logic Apps is a key component of the Azure integration platform. Among its many connectors, it provides connectors for all three of these platforms (SAP, ServiceNow, and Salesforce). Logic Apps can be invoked from a custom agent (as a built-in action in Foundry) or from a Copilot agent; the same applies to Power Platform/Power Automate.

Conclusion
This article provides a comprehensive overview of how Microsoft Copilot, Copilot Studio, Foundry via A2A/MCP, and Azure Logic Apps can be combined to deliver robust, agentic integrations with SAP, ServiceNow, and Salesforce. The narrative highlights the importance of secure identity federation, modular agent orchestration, and low-code/pro-code automation in building next-generation enterprise solutions.

SAP Business Data Cloud Now Available on Microsoft Azure
We're thrilled to announce that SAP Business Data Cloud (SAP BDC), including SAP Databricks, is now available on Microsoft Azure, marking a major milestone in our strategic partnership with SAP and Databricks and our commitment to empowering customers with cutting-edge Data & AI capabilities. SAP BDC is a fully managed SaaS solution designed to unify, govern, and activate SAP and third-party data for advanced analytics and AI-driven decision-making. Customers can now deploy SAP BDC on Azure in US East, US West, and Europe West, with additional regions coming soon, and unlock transformative insights from their enterprise data with the scale, security, and performance of Microsoft's trusted cloud platform.

Why SAP BDC on Azure Is a Game-Changer for Data & AI
Deploying SAP BDC on Azure enables organizations to accelerate their Data & AI initiatives by modernizing their SAP Business Warehouse systems and leveraging a modern data architecture that includes SAP HANA Cloud, data lake files, and connectivity to Microsoft technology. Whether it's building AI-powered intelligent applications, enabling semantically rich data products, or driving predictive analytics, SAP BDC on Azure provides the foundation for scalable, secure, and context-rich decision-making. Running SAP BDC workloads on Microsoft Azure unlocks the full potential of enterprise data by integrating SAP systems with non-SAP data using Microsoft's powerful Data & AI services - enabling customers to build intelligent applications grounded in critical business context.

Why Azure Is an Ideal Platform for Running SAP BDC
Microsoft Azure stands out as a leading cloud platform for hosting SAP solutions, including SAP BDC. Azure's global infrastructure, high-performance networking, and powerful Data & AI capabilities make it an ideal foundation for large-scale SAP workloads. When organizations face complex data environments and need seamless interoperability across tools, Azure's resilient backbone and enterprise-grade services provide the scalability and reliability essential for building a robust SAP data architecture.

Under the Hood: SAP Databricks in SAP BDC Is Powered by Azure Databricks
A key differentiator of SAP BDC on Azure is that SAP Databricks, a core component of BDC, runs on Azure Databricks - Microsoft's first-party service. Azure Databricks is a fully managed first-party service, making Microsoft Azure the optimal cloud for running Databricks workloads. It uniquely offers:
- Native integration with Microsoft Entra ID for seamless access control.
- Optimized performance with Power BI, delivering unmatched analytics speed.
- Enterprise-grade security and compliance, inherent to Azure's first-party services.
- Joint engineering and unified support from Microsoft and Databricks.
- Zero-copy data sharing between SAP BDC and Azure Databricks, enabling frictionless collaboration across platforms.
This deep integration ensures that customers benefit from the full power of Azure's AI, analytics, and governance capabilities while running SAP workloads.

Expanding Global Reach: What's Next
While SAP BDC is now live in three Azure regions - US East, US West, and Europe West - we're just getting started. Over the next few months, availability will expand to additional Azure regions such as Brazil and Canada.
For the remaining regions, a continuously updated roadmap can be found on the SAP Roadmap Explorer website.

Final Thoughts
This launch reinforces Microsoft Azure's longstanding relationship with SAP, backed by over 30 years of trusted partnership and co-innovation. With SAP BDC now available on Azure, customers can confidently modernize their data estate, unlock AI-driven insights, and drive business transformation at scale. Stay tuned as we continue to expand availability and bring even more Data & AI innovations to our joint customers over the next few months.

MSL correction from clone to multistate HANA DB Cluster SUSE activation
Introduction: SAP HANA system replication involves configuring one primary node and at least one secondary node. Any changes made to the data on the primary node are replicated to the secondary node synchronously. This ensures a consistent and up-to-date copy of the data, which is crucial for maintaining its integrity and availability.

Problem description: An Azure VM was in a degraded state, causing a major outage because the SAP cluster was unable to start. The node health score (-1000000) did not reset automatically after redeployment and remained until manual intervention.

Consider the configuration below if your cluster nodes are running SLES 12 or later. Please note that the promotable clone type is not supported here (see the Important Points later in this article). Replace <placeholders> with your instance number and HANA system ID.

sudo crm configure primitive rsc_SAPHana_<HANA SID>_HDB<instance number> ocf:suse:SAPHana operations $id="rsc_sap_<HANA SID>_HDB<instance number>-operations" op start interval="0" timeout="3600" op stop interval="0" timeout="3600" op promote interval="0" timeout="3600" op monitor interval="60" role="Master" timeout="700" op monitor interval="61" role="Slave" timeout="700" params SID="<HANA SID>" InstanceNumber="<instance number>" PREFER_SITE_TAKEOVER="true" DUPLICATE_PRIMARY_TIMEOUT="7200" AUTOMATED_REGISTER="false"

sudo crm configure ms msl_SAPHana_<HANA SID>_HDB<instance number> rsc_SAPHana_<HANA SID>_HDB<instance number> meta notify="true" clone-max="2" clone-node-max="1" target-role="Started" interleave="true"

sudo crm resource meta msl_SAPHana_<HANA SID>_HDB<instance number> set priority 100

Cutover steps: These steps encompass pre-steps, execution steps, post-validation steps, and the rollback plan. The pre-steps cover the preparations and checks that must be completed before the main execution, ensuring that everything is in order and ready for the next phase. The execution steps are the core actions that carry out the change and must be followed meticulously to avoid any issues. The post-validation steps come after the execution and involve verifying the results and confirming that everything works as expected.
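Before starting the cutover, it helps to confirm how the SAPHana wrapper is currently defined. This is a minimal, read-only check, assuming the resource naming used above; the matching configuration line will start with either "clone" or "ms".

```bash
# Show whether the SAPHana wrapper is defined as a (promotable) clone or as a multi-state (ms) resource.
sudo crm configure show | grep -E '^(clone|ms) +msl_SAPHana_'
```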
Pre-steps:
- Check cluster status:
  crm status
  crm configure show
  SAPHanaSR-showAttr
- Ensure there are no pending operations or failed resources:
  crm_mon -1
- Confirm replication is healthy:
  hdbnsutil -sr_state
  SAPHanaSR-showAttr
- Back up the current configuration:
  crm configure show > /root/cluster_config_backup.txt

Execution steps:
- Enable maintenance mode:
  sudo crm configure property maintenance-mode=true
- Delete the incorrect clone resource:
  crm configure delete msl_SAPHana_<SID>_HDB<instance>
- Recreate the wrapper as a multi-state (ms) resource:
  sudo crm configure ms msl_SAPHana_<SID>_HDB<instance> rsc_SAPHana_<SID>_HDB<instance> meta notify="true" clone-max="2" clone-node-max="1" target-role="Started" interleave="true" maintenance="true"
  sudo crm resource meta msl_SAPHana_<SID>_HDB<instance> set priority 100
- Disable maintenance mode:
  crm configure property maintenance-mode=false
- Refresh the resource and take it out of maintenance:
  sudo crm resource refresh msl_SAPHana_<SID>
  wait 10 seconds
  check that the HSR status matches across SAPHanaSR-showAttr, crm_mon -A -1, and hdbnsutil -sr_state
  sudo crm resource maintenance msl_SAPHana_<SID> off

Post-validation steps:
  crm status
  crm configure show
  SAPHanaSR-showAttr

Rollback plan:
- Enable maintenance mode:
  crm configure property maintenance-mode=true
  sudo crm resource maintenance msl_SAPHana_<SID> on
- Restore the configuration from backup:
  crm configure load update /root/cluster_config_backup.txt
- Recreate the previous clone configuration if needed:
  crm configure clone msl_SAPHana_<SID>_HDB<instance> rsc_SAPHana_<SID>_HDB<instance> meta notify=true clone-max=2 clone-node-max=1 target-role=Started interleave=true promotable=true
- Disable maintenance and refresh resources:
  crm configure property maintenance-mode=false
  sudo crm resource refresh msl_SAPHana_<SID>
  wait 10 seconds
  sudo crm resource maintenance msl_SAPHana_<SID> off

Perform the steps below during the actual execution (responsible team in parentheses):

Pre-step: Submit a CAB request for approval. (Basis)

Perform pre-checks (Basis):
- Check cluster status: SBD, Pacemaker, corosync services, SBD messages, iscsi, constraints
  crm status
  crm configure show
  SAPHanaSR-showAttr
- Ensure no pending operations or failed resources:
  crm_mon -R1 -Af -1
- Confirm replication is healthy:
  hdbnsutil -sr_state
- Back up the current configuration (pre-change):
  crm configure show > /hana/shared/SID/dbcluster_backup_prechange.txt
  crm configure show | sed -n '/primitive rsc_SAPHana_SID_HD/,/^$/p'
  crm configure show | sed -n '/clone msl_SAPHana_SID_HD/,/^$/p'

Execution:
- Get go-ahead from the leadership team. (Basis)
- Step 0 – Put the cluster into maintenance mode. (Basis)
  crm resource maintenance g_ip_SID_HD on
- Back up the current configuration while the cluster, msl, and g_ip are in maintenance: (Basis)
  crm configure show > /hana/shared/SID/dbcluster_backup_prehealth.txt
- Step 1 – (If not already done) clear the Node 1 health attributes and ensure the topology/azure-events resources are running on both nodes; this avoids scheduler surprises when the resources are re-managed.
  Execute on m1vms* (ideally it can be executed on any node): (SOPS)
  crm_attribute -N vm** -n '#health-azure' -v 0
  crm_attribute --node vm** --delete --name "azure-events-az_curNodeState"
  crm_attribute --node vm** --delete --name "azure-events-az_pendingEventIDs"
  Clean up the cluster resources: (Basis)
  crm resource cleanup health-azure-events-cln
  crm resource cleanup cln_SAPHanaTopology_SID_HD
- Back up the current configuration once the health correction is complete and only the msl correction remains: (Basis)
  crm configure show > /hana/shared/SID/dbcluster_backup_premsl.txt
- Step 2 – Convert the wrapper inside a single atomic transaction. (Basis)
  We delete the promotable clone wrapper only (not the primitive), then create the ms wrapper with the same name msl_SAPHana_SID_HD so existing colocation/order constraints that reference the name keep working.
  # Remove the promotable clone wrapper (keeps the rsc_SAPHana_SID_HD primitive intact)
  crm configure delete msl_SAPHana_SID_HD
  # Recreate as multi-state (ms) for the classic agents
  sudo crm configure ms msl_SAPHana_SID_HD rsc_SAPHana_SID_HD meta notify="true" clone-max="2" clone-node-max="1" target-role="Started" interleave="true" maintenance="true"
  sudo crm resource meta msl_SAPHana_SID_HD set priority 100
- Step 3 – Re-enable cluster management of the IP and HANA resources. (Basis)
  Pre-checks by the Microsoft and SUSE teams. (MSFT/SUSE)
  Pre-check by the Basis team. (Basis)
  crm configure property maintenance-mode=false
  crm resource refresh msl_SAPHana_SID_HD
  wait 10 seconds
  crm resource maintenance msl_SAPHana_SID_HD off
  crm resource maintenance g_ip_SID_HD off

Validation (Basis):
  crm_mon -R1 -Af -1
  crm status
  crm configure show
  SAPHanaSR-showAttr

Rollback plan (Basis):
- Enable maintenance mode:
  crm configure property maintenance-mode=true
  crm resource maintenance msl_SAPHana_SID_HD on
  crm resource maintenance g_ip_SID_HD on
- Restore the configuration from backup (decide which state to revert to and use the corresponding backup):
  crm configure load update /hana/shared/SID/dbcluster_backup_prechange/prehealth/premsl.txt
- Recreate the previous clone configuration if needed:
  crm configure clone msl_SAPHana_SID_HD rsc_SAPHana_SID_HD meta notify=true clone-max=2 clone-node-max=1 target-role=Started interleave=true promotable=true maintenance="true"
- Disable maintenance and refresh resources:
  crm configure property maintenance-mode=false
  crm resource refresh msl_SAPHana_SID_HD
  wait 10 seconds
  crm resource maintenance msl_SAPHana_SID_HD off
  crm resource maintenance g_ip_SID_HD off

Important points:

1. Are there known version-specific considerations when migrating from clone to ms? If you are using SAPHanaSR, ensure you are using 'ms'. If you are working with SAPHanaSR-angi, you should use 'clone'. There are three different sets of HANA resource agents and SRHook scripts: two older ones and one newer one.

2. Does this change apply across the board for SUSE OS and/or Pacemaker versions? The packages for the older sets are SAPHanaSR, which is for scale-up HANA clusters, and SAPHanaSR-ScaleOut, which is for scale-out HANA clusters. The package for the new set is SAPHanaSR-angi, which covers both scale-up and scale-out clusters (angi stands for "advanced next generation interface"). When using the older SAPHanaSR or SAPHanaSR-ScaleOut resource agents and SRHook scripts, SUSE only supports the multi-state (ms) clone type for the SAPHana (scale-up) or SAPHanaController (scale-out) resource. The older resource agents and scripts are supported on all service packs of SLES for SAP 12 and 15. When using the newer SAPHanaSR-angi resource agents and scripts, SUSE only supports the regular clone type for the SAPHanaController resource (scale-up and scale-out) with the "promotable=true" meta-attribute set on the clone. The newer "angi" resource agents and scripts are supported on SLES for SAP 15 SP5 and higher and on SLES for SAP 16 when it is released later this year.
So, with SLES for SAP 15 SP5 and higher, you can use either the older or the newer resource agents and scripts. For all service packs of SLES for SAP 12 and service packs of SLES for SAP 15 prior to SP5, you must use the older resource agents and scripts. Starting with SLES for SAP 16, you must use the new angi resource agents and scripts. Installing the new SAPHanaSR-angi package will automatically uninstall the older SAPHanaSR or SAPHanaSR-ScaleOut packages if they are already installed. SUSE has published a blog on how to migrate from the older resource agents and scripts to the newer ones (see the SUSE reference link below). A quick way to confirm which resource agent package - and therefore which clone type - applies to a given node is sketched after the reference links.

Conclusion: Set up and ensure that system replication is active; this is crucial to avoid business disruption during critical operational hours. Taking these steps enhances the cluster architecture and resilience of your systems. Implementing these replication strategies bolsters business continuity and improves overall resilience, so operations run more smoothly and efficiently and can handle future demands with ease.

Reference MS links:
High availability for SAP HANA on Azure VMs on SLES | Microsoft Learn
https://www.suse.com/c/how-to-upgrade-to-saphanasr-angi/
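To make the ms-versus-clone decision concrete, the check below reads which of the packages named above is installed and prints the clone type SUSE supports for it. This is a minimal sketch, assuming a SLES for SAP node where rpm is available; adapt the package names if your distribution packages them differently.

```bash
#!/bin/bash
# Determine which SAPHanaSR package family is installed and which clone type applies.
if rpm -q SAPHanaSR-angi >/dev/null 2>&1; then
    echo "SAPHanaSR-angi installed: use a regular clone with promotable=true"
elif rpm -q SAPHanaSR >/dev/null 2>&1 || rpm -q SAPHanaSR-ScaleOut >/dev/null 2>&1; then
    echo "Classic SAPHanaSR/SAPHanaSR-ScaleOut installed: use a multi-state (ms) resource"
else
    echo "No SAPHanaSR resource agent package found on this node"
fi
```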
Gen1 to Gen2 Azure VM Upgrade in Rolling Fashion

Introduction: Azure offers Trusted Launch, a seamless solution designed to significantly enhance the security of Generation 2 virtual machines (VMs) and provide robust protection against advanced and persistent attack techniques. Trusted Launch is composed of several coordinated infrastructure technologies, each of which can be enabled independently. These technologies work together to create multiple layers of defense, so virtual machines remain secure against sophisticated threats. With Trusted Launch, we can confidently improve our security posture and safeguard our VMs from potential vulnerabilities.

Upgrading Azure VMs from Generation 1 (Gen1) to Generation 2 (Gen2) involves several steps to ensure a smooth transition without data loss or disruption.

Rolling fashion upgrade process:
First and foremost, take a complete backup of the virtual machines before starting the upgrade. This step is essential to protect valuable data in case of any unforeseen issues during the process. Having a backup gives you peace of mind and ensures that data is safe and secure.

Always perform any new process or implementation in pre-production systems first. This makes it possible to identify and resolve potential issues before moving to the production environment, maintaining the integrity and stability of the systems.

Run the pre-validation steps below before you bring down the VM (a consolidated script version of these checks appears at the end of this section):
- SSH into the VM: connect to the Gen1 Linux VM.
- Identify the boot device:
  bootDevice=$(echo "/dev/$(lsblk -no pkname $(df /boot | awk 'NR==2 {print $1}'))")
- Check the partition type (must return 'gpt'):
  sudo blkid $bootDevice -o value -s PTTYPE
- Validate the EFI system partition (e.g., /dev/sda2 or /dev/sda3):
  sudo fdisk -l $bootDevice | grep EFI | awk '{print $1}'
- Check the EFI mount point (/boot/efi must be in /etc/fstab):
  sudo grep -qs '/boot/efi' /etc/fstab && echo '/boot/efi present in /etc/fstab' || echo '/boot/efi missing in /etc/fstab'

Once the complete backup is in place and the pre-validation steps are completed, the SAP Basis team proceeds with stopping the application. As part of the planned procedure, once the application has been taken down, the Unix team shuts down the operating system on the ERS servers. The Azure team then follows the steps below to perform the generation upgrade on the selected, approved servers (example screenshots omitted):
- Start the VM: Start-AzVM -ResourceGroupName myResourceGroup -Name myVm (or start it from the Azure portal).
- Log in to the Azure portal and check that the VM generation has successfully changed to V2.
- Unix team to validate the OS on the approved servers.
- SAP Basis team to generate a new license key based on the new hardware, apply it, and start the application.
- Unix team to perform failover of the ASCS cluster.
- SAP Basis team to stop the application server.
- Unix team to shut down the OS on ERS for the selected VMs and validate the OS.
- SAP Basis team to apply the new hardware key and start the application.
- Unix team to perform failover of the ASCS cluster.
- Azure team to work on capacity analysis to find the path forward for hosting Mv2 VMs in the same proximity placement group (PPG).
- Once successfully completed, test rollback on at least one app server for rollback planning.
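For convenience, the individual pre-validation checks above can be run as one script. This is a minimal sketch assuming a Linux Gen1 VM with a /boot mount and the standard util-linux tools; it only reads system state and makes no changes.

```bash
#!/bin/bash
# Gen1 -> Gen2 pre-validation: boot disk layout checks (read-only).

# Identify the disk that holds /boot.
bootDevice="/dev/$(lsblk -no pkname "$(df /boot | awk 'NR==2 {print $1}')")"
echo "Boot device: $bootDevice"

# The partition table type must be GPT for a Gen2/UEFI boot.
ptType=$(sudo blkid "$bootDevice" -o value -s PTTYPE)
echo "Partition table type: $ptType (expected: gpt)"

# An EFI system partition must exist on the boot device.
efiPart=$(sudo fdisk -l "$bootDevice" | grep EFI | awk '{print $1}')
echo "EFI system partition: ${efiPart:-NOT FOUND}"

# /boot/efi must be mounted via /etc/fstab.
if sudo grep -qs '/boot/efi' /etc/fstab; then
    echo '/boot/efi present in /etc/fstab'
else
    echo '/boot/efi missing in /etc/fstab'
fi
```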
Here are the other methods to achieve this:

Method 1: Using Trusted Launch direct upgrade
- Prerequisites check: Ensure your subscription is onboarded to the preview feature Gen1ToTLMigrationPreview under the Microsoft.Compute namespace. The VM should be configured with a Trusted Launch-supported size family and OS version, and a successful backup should be in place.
- Update the guest OS volume: Update the guest OS volume to a GPT disk layout with an EFI system partition. Use the PowerShell-based orchestration script for MBR2GPT validation and conversion.
- Enable Trusted Launch: Deallocate the VM using Stop-AzVM, then enable Trusted Launch by setting -SecurityType to TrustedLaunch with the Update-AzVM command:
  Stop-AzVM -ResourceGroupName myResourceGroup -Name myVm
  Update-AzVM -ResourceGroupName myResourceGroup -VMName myVm -SecurityType TrustedLaunch -EnableSecureBoot $true -EnableVtpm $true
- Validate and start the VM: Validate the security profile in the updated VM configuration, start the VM, and verify that you can sign in using RDP or SSH.

Method 2: Using Azure Backup
- Verify backup data: Ensure you have valid and up-to-date backups of your Gen1 VMs, including both OS disks and data disks, and verify that the backups completed successfully and can be restored.
- Create Gen2 VMs: Create new Gen2 VMs with the desired specifications and configuration. There is no need to start them initially; just have them created and ready for when they are needed.
- Restore VM backups: In the Azure portal, go to the Azure Backup service, select "Recovery Services vaults", and then select the existing backup vault that contains the backups of the Gen1 VMs. Inside the Recovery Services vault, go to the "Backup Items" section and select the VM you want to restore. Initiate a restore operation for the VM; during the restore process, choose the target resource group and the target VM (the newly created Gen2 VM).
- Restore the OS disk: Choose to restore the OS disk of the Gen1 VM to the newly created Gen2 VM. Azure Backup restores the OS disk to the new VM, effectively migrating it to Generation 2.
- Restore data disks: Once the OS disk is restored and the Gen2 VM is operational, proceed to restore the data disks. Repeat the restore process for each data disk, attaching them to the Gen2 VM as needed.
- Verify and test: Verify that the Gen2 VM is functioning correctly and that all data is intact. Test thoroughly to ensure all applications and services are running as expected.
- Decommission Gen1 VMs (optional): Once the migration is successful and you have verified that the Gen2 VMs are working correctly, you can decommission the original Gen1 VMs.

Important notes:
- Before proceeding with any production migration, thoroughly test this process in a non-production environment to ensure its success and identify any potential issues.
- Make sure you have a backup of critical data and configurations before attempting any migration.
- While this approach focuses on using Azure Backup for restoring the VMs, other migration strategies are available that may better suit your specific scenario. Evaluate them based on your requirements and constraints.
- Remember, migrating VMs between generations involves changes in the underlying virtual hardware, so thorough testing and planning are essential to ensure a smooth transition without data loss or disruption.

Why is a Generation 2 upgrade without Trusted Launch not supported?
Trusted Launch provides foundational compute security for VMs at no additional cost, which means we can enhance our security posture without incurring extra expense. Moreover, Trusted Launch VMs are largely on par with Generation 2 VMs in terms of features and performance, so upgrading to Generation 2 without enabling Trusted Launch does not provide any added benefit.

Unsupported Gen1 VM configurations: A Gen1 to Trusted Launch VM upgrade is NOT supported if the Gen1 VM is configured with any of the following options:
- Operating system: Windows Server 2016, Azure Linux, Debian, or any other operating system not listed under the Trusted Launch supported operating system (OS) versions. Ref link: Trusted Launch for Azure VMs - Azure Virtual Machines | Microsoft Learn
- VM size: Gen1 VM configured with a VM size not listed under the Trusted Launch supported size families. Ref link: Trusted Launch for Azure VMs - Azure Virtual Machines | Microsoft Learn
- Azure Backup: Gen1 VM configured with Azure Backup using the Standard policy. As a workaround, migrate Gen1 VM backups from the Standard to the Enhanced policy. Ref link: Move VM backup - standard to enhanced policy in Azure Backup - Azure Backup | Microsoft Learn

Conclusion: We will enhance Azure virtual machines by transitioning from Gen1 to Gen2. By implementing these approaches, we can seamlessly unlock improved security and performance, ensuring operations run more smoothly and that virtual machines are more robust and capable of handling future demands. An Azure CLI equivalent of the Method 1 enablement flow is sketched after the reference links below.

Ref links:
Upgrade Gen1 VMs to Trusted launch - Azure Virtual Machines | Microsoft Learn
GitHub - Azure/Gen1-Trustedlaunch: aka.ms/Gen1ToTLUpgrade
Enable Trusted launch on existing Gen2 VMs - Azure Virtual Machines | Microsoft Learn
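For teams that prefer the Azure CLI over PowerShell, the same deallocate / enable / start flow from Method 1 can be scripted as shown below. This is a minimal sketch, assuming a recent Azure CLI version in which az vm update supports the --security-type, --enable-secure-boot, and --enable-vtpm parameters; the resource group and VM names are placeholders.

```bash
#!/bin/bash
# Enable Trusted Launch on an eligible VM (placeholder names; requires a recent Azure CLI).
RG="myResourceGroup"
VM="myVm"

# The VM must be deallocated before its security profile can be changed.
az vm deallocate --resource-group "$RG" --name "$VM"

# Switch the security type to Trusted Launch and enable Secure Boot and vTPM.
az vm update --resource-group "$RG" --name "$VM" \
  --security-type TrustedLaunch \
  --enable-secure-boot true --enable-vtpm true

# Start the VM and confirm you can sign in over RDP or SSH afterwards.
az vm start --resource-group "$RG" --name "$VM"
```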
Deep dive into Pacemaker cluster for Azure SAP systems optimization

Introduction: Pacemaker is the open-source cluster resource manager used to deliver high availability for SAP systems on Azure. Combined with Azure fencing and monitoring integration, it gives organizations a central way to detect failures, trigger automated failover, and keep mission-critical SAP workloads running. Well-configured Pacemaker clusters bring efficiency and peace of mind: issues surface promptly through cluster alerts, and it becomes easier to keep track of cluster resources and respond quickly when problems arise.

Current customer challenges:
- Configuration: Common misconfigurations occur when customers don't follow up-to-date HA setup guidance from learn.microsoft.com, leading to failover issues.
- Testing: Manual testing causes untested failover scenarios and configuration drift, and limited expertise in HA tools complicates troubleshooting.

Key use cases for SAP HA testing automation: To keep testing and validation at the highest standard, there are a few important updates to our procedures. First, we need to automate validation on new OS versions. This ensures that the Pacemaker cluster configuration remains up to date and functions smoothly with the latest OS releases, so compatibility issues can be addressed promptly. Next, we should run loop tests on a regular cadence. These tests catch regressions early and ensure that customer systems remain robust and reliable over time; continuous monitoring is essential to maintain optimal performance. Finally, we must validate our high availability (HA) configurations against the documented SAP on Azure best practices. This ensures effective failover and quick recovery, minimizes downtime, and maximizes system uptime.

SAP Testing Automation Framework (Public Preview): The recommended approach for validating Pacemaker configurations in SAP HANA clusters is through the SAP Deployment Automation Framework (SDAF) and its High Availability Testing Framework. This framework includes a comprehensive set of automated test cases designed to validate cluster behavior under various scenarios such as primary node crashes, manual resource migrations, and service failures. Additionally, it rigorously checks OS versions, Azure roles for fencing, SAP parameters, and Pacemaker/Corosync configurations to ensure everything is set up correctly. Low-level administrative commands are employed to validate the captured values against best practices, with a particular focus on constraints and meta-attributes. This thorough validation process ensures that clusters are reliable, resilient, and adhere to industry standards. A few of the manual commands the framework automates are shown below for reference.
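For readers who want to spot-check a cluster by hand before (or after) running the automated tests, the commands below cover the same ground the framework validates: cluster state, resource configuration, Corosync settings, and HANA system replication status. This is a minimal, read-only sketch, assuming a Pacemaker cluster node with the crmsh and SAPHanaSR tooling installed.

```bash
#!/bin/bash
# Quick manual health check of an SAP HANA Pacemaker cluster (read-only).

# Overall cluster state: nodes online, resources started, no failed actions.
sudo crm_mon -1

# Full resource configuration, including constraints and meta-attributes.
sudo crm configure show

# Corosync membership and quorum configuration.
sudo cat /etc/corosync/corosync.conf

# HANA system replication attributes as seen by the cluster.
sudo SAPHanaSR-showAttr

# HANA system replication state as seen by HANA itself (run as <sid>adm):
# su - <sid>adm -c "hdbnsutil -sr_state"
```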
SAP System High Availability on Azure: reference architecture diagrams are provided for SAP HANA scale-up and SAP Central Services (diagrams omitted here).

Support matrix

Linux distribution:

| Distribution | Supported Release |
| --- | --- |
| SUSE Linux Enterprise Server (SLES) | 15 SP4, 15 SP5, 15 SP6 |
| Red Hat Enterprise Linux (RHEL) | 8.8, 8.10, 9.2, 9.4 |

High availability configuration patterns:

| Component | Type | Cluster Type | Storage |
| --- | --- | --- | --- |
| SAP Central Services | ENSA1 or ENSA2 | Azure Fencing Agent | Azure Files or ANF |
| SAP Central Services | ENSA1 or ENSA2 | iSCSI (SBD device) | Azure Files or ANF |
| SAP HANA | Scale-up | Azure Fencing Agent | Azure Managed Disk or ANF |
| SAP HANA | Scale-up | iSCSI (SBD device) | Azure Managed Disk or ANF |

High availability test scenarios:

| Test Type | Database Tier (HANA) | Central Services |
| --- | --- | --- |
| Configuration Checks | HA resource parameter validation; Azure Load Balancer configuration | HA resource parameter validation; SAPControl; Azure Load Balancer configuration |
| Failover Tests | HANA resource migration; primary node crash | ASCS resource migration; ASCS node crash |
| Process & Services | Index server crash; node kill; kill SBD service | Message server; enqueue server; enqueue replication server; SAPStartSRV process |
| Network Tests | Block network | Block network |
| Infrastructure | Virtual machine crash; freeze file system (storage) | Manual restart; HA failover to node |

Reference links:
- SLES: Set up Pacemaker on SUSE Linux Enterprise Server (SLES) in Azure | Microsoft Learn; Troubleshoot startup issues in a SUSE Pacemaker cluster - Virtual Machines | Microsoft Learn
- RHEL: Set up Pacemaker on RHEL in Azure | Microsoft Learn; Troubleshoot Azure fence agent issues in an RHEL Pacemaker cluster - Virtual Machines | Microsoft Learn
- STAF: GitHub - Azure/sap-automation-qa: the repository supporting quality assurance for SAP systems running on Azure.

Conclusion: This innovative tool is designed to significantly streamline and enhance high availability deployments of SAP systems on Azure by reducing potential misconfigurations and minimizing manual effort. Please note that this framework performs multiple failovers sequentially to validate cluster behavior, so it is not recommended to run it directly on production systems. It is intended for use in new high availability deployments that are not yet live or for non-business-critical systems.

Announcing Public Preview for Business Process Solutions
In today's AI-powered enterprises, success hinges on access to reliable, unified business information. Whether you are deploying AI-augmented workflows or fully autonomous agentic solutions, one thing is clear: trusted, consistent data is the fuel that drives intelligent outcomes. Yet in many organizations, data remains fragmented across best-of-breed applications - creating blind spots in cross-functional processes and throwing roadblocks in the path of automation. Microsoft is dedicated to tackling these challenges, delivering a unified data foundation that accelerates AI adoption, simplifies automation, and reduces risk - empowering businesses to unlock the full potential of unified data analytics and agentic intelligence.

Our new solution offers cross-functional insights across previously siloed environments and includes:
- Prebuilt data models for enterprise business applications in Microsoft Fabric
- Source system data mappings and transformations
- Prebuilt dashboards and reports in Power BI
- Prebuilt AI agents in Copilot Studio (coming soon)
- Integrated security and compliance

By unifying Microsoft's Fabric and AI solutions, we can rapidly accelerate transformation and de-risk AI rollout through repeatable, reliable, prebuilt solutions.

Functional Scope
Our new solution currently supports a set of business applications and functional areas, enabling organizations to break down silos and drive actionable insights across their core processes. The platform covers key domains such as:
- Finance: Delivers a comprehensive view of financial performance, integrating data from general ledger, accounts receivable, and accounts payable systems. This enables finance teams to analyze trends, monitor compliance, and optimize cash flow management, all from within Power BI. The associated Copilot agent not only provides access to this data via natural language but will also enable financial postings.
- Sales: Provides a complete perspective on customers' opportunity-to-cash journeys, from initial opportunity through invoicing and payment, via Power BI reports and dashboards. The associated Copilot agent can help improve revenue forecasting by connecting structured ERP and CRM data with unstructured data from Microsoft 365, while also tracking sales pipeline health and identifying bottlenecks.
- Procurement: Supports strategic procurement and supplier management, consolidating purchase orders, goods receipts, and vendor invoicing data into a complete spend dashboard. This empowers procurement teams to optimize sourcing strategies, manage supplier risk, and control spend.
- Manufacturing (coming soon): Will extend coverage to manufacturing and production processes, enabling organizations to optimize resource allocation and monitor production efficiency.

Each item within Business Process Solutions is delivered as a complete, business-ready offering. These models are thoughtfully designed to ensure that organizations can move seamlessly from raw data to actionable execution. Key features include:

Facts and dimensions: Each model is structured to capture both transactional details (facts) and contextual information (dimensions), supporting granular analysis and robust reporting across business processes.

Transformations: Built-in transformations automatically prepare data for reporting and analytics, making it compatible with Microsoft Fabric. For example, when a business user needs to compare sales results from Europe, Asia, and North America, the solution's transformations handle currency conversion behind the scenes.
This ensures that results are consistent across regions, making analysis straightforward and reliable - without the need for manual intervention or complex configuration.

Insight to action: Customers will be able to leverage prebuilt Copilot agents within Business Process Solutions to turn insight into action. These agents are deeply integrated not only with Microsoft Fabric and Microsoft Teams but also with connected source applications, enabling users to take direct, contextual actions across systems based on real-time insights. By connecting unstructured data sources such as emails, chats, and documents from Microsoft 365 apps, the agents can provide a holistic and contextualized view to support smarter decisions. With embedded triggers and intelligent agents, automated responses can be initiated based on new insights - streamlining decision-making and enabling proactive, data-driven operations. Ultimately, this empowers teams not just to understand what is happening at a holistic level, but also to take faster, smarter actions with greater confidence.

Authorizations: Data models are tailored to respect organizational security and access policies, ensuring that sensitive information is protected and only accessible to authorized users. The same user-credential principles apply to the Copilot agents when interacting with or updating the source system in the user context.

Behind the scenes, the solution automatically provisions the required objects and infrastructure to build the data warehouse, removing the usual complexity of bringing data together. It guarantees consistency and reliability, so organizations can focus on extracting value from their data rather than managing technical details. This reliable data foundation serves as one of the key inputs to agentic business processes.

Accelerated Insights with Prebuilt Analytics
Building on these robust data models, Business Process Solutions offers a suite of prebuilt Power BI reports tailored to common business processes. These reports provide immediate access to key metrics and trends, such as financial performance, sales effectiveness, and procurement efficiency. Designed for rapid deployment, they allow organizations to:
- Start analyzing data from day one, without lengthy setup or customization.
- Adapt existing reports to your organization's exact business needs.
- Demonstrate best practices for leveraging data models in analytics and decision-making.
This approach accelerates time-to-value and empowers users to explore new analytical scenarios and drive continuous improvement.

Extensibility and Customization
Every organization is unique, and our new solution is designed to support this, allowing you to adapt analytics and data models to fit your specific processes and requirements. You can customize scope items, bring in your own tables and views, integrate new data sources as your business evolves, and combine data across Microsoft Fabric for deeper insights. Similarly, the associated agents will be customizable from Copilot Studio to adapt to your specific enterprise apps configuration. This flexibility ensures that, no matter how your organization operates, Business Process Solutions helps you unlock the full value of your data.

Data Integration
Business Process Solutions uses the same connectivity options as Microsoft Fabric and Copilot Studio but goes further by embedding best practices that make integration simpler and more effective.
We recognize that no single pattern can address the diverse needs of all business applications. We also understand that many businesses have already invested in data extraction tools, which is why our solution supports a wide range of options, from native connectivity to third-party options that bring specialized capabilities to the table. With Business Process Solutions, we ensure data can be worked with in a reliable, high-performance way, whether you are dealing with massive volumes or complex data structures.

Getting started
If your organization is ready to unlock the value of unified analytics, getting started is simple. Just send us a request using the form at https://aka.ms/JoinBusAnalyticsPreview. Our team will guide you through the next steps and help you begin your journey.

Backup SAP Oracle Databases Using Azure VM Backup Snapshots
This blog article provides a comprehensive step-by-step guide for backing up SAP Oracle databases using Azure VM backup snapshots, ensuring data safety and integrity. A short script sketch of the mount and hot-backup steps follows the list below.

- Installation of CIFS utilities: The process begins with the installation of cifs-utils on Oracle Linux, which is the recommended OS for running Oracle databases in the cloud.
- Setting up environment variables: Users are instructed to define the necessary environment variables for the resource group and storage account names.
- Creating SMB credentials: The guide explains how to create a folder for SMB credentials and retrieve the storage account key, emphasizing the need for appropriate permissions.
- Mounting the SMB file share: Instructions are provided for checking the accessibility of the storage account and mounting the SMB file share, which serves as a backup location for archived logs.
- Preparing the Oracle database for backup: Users must place the Oracle database in hot backup mode to ensure a consistent backup while allowing ongoing transactions.
- Initiating the snapshot backup: Once the VM backup is configured, users can initiate a snapshot backup to capture the state of the virtual machine, including the Oracle database.
- Restoration process: The document outlines the steps for restoring the Oracle database from the backup, including updating IP addresses and starting the database listener.
- Final steps and verification: Users are encouraged to verify the configuration and ensure that all necessary backups completed successfully, including the SMB file share.
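To make the mount and hot-backup steps concrete, here is a minimal sketch. The storage account, share, credential path, and mount point are placeholders; it assumes cifs-utils is installed, the script runs as root on the Oracle Linux VM, and the Azure VM snapshot backup is triggered while the database is in backup mode.

```bash
#!/bin/bash
# Placeholder values - replace with your own storage account, share, and paths.
STORAGE_ACCOUNT="mystorageaccount"
SHARE_NAME="oraclearchivelogs"
CRED_FILE="/etc/smbcredentials/${STORAGE_ACCOUNT}.cred"
MOUNT_POINT="/backup/archivelogs"

# Mount the Azure Files SMB share used for archived redo logs.
mkdir -p "$MOUNT_POINT"
mount -t cifs "//${STORAGE_ACCOUNT}.file.core.windows.net/${SHARE_NAME}" "$MOUNT_POINT" \
  -o credentials="$CRED_FILE",dir_mode=0777,file_mode=0777,serverino

# Put the Oracle database into hot backup mode so the VM snapshot is consistent.
su - oracle -c "sqlplus -s / as sysdba <<'EOF'
ALTER DATABASE BEGIN BACKUP;
EXIT;
EOF"

# ... trigger the Azure VM snapshot backup here (for example, from Azure Backup) ...

# Take the database out of hot backup mode once the snapshot has completed.
su - oracle -c "sqlplus -s / as sysdba <<'EOF'
ALTER DATABASE END BACKUP;
EXIT;
EOF"
```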