Partner Case Study | Infosys
Managing student and teacher data at scale is a high-stakes challenge. Modernizing a Student Information System (SIS) that tracks daily attendance, enrollment, academic records, scheduling, and other student data requires deep technical expertise, a structured approach to cloud migration, and a strong foundation in security and governance. Infosys, a Microsoft partner for over 25 years, has long been a leader in these secure modernizations and cloud transformations.

Infosys holds Solutions Partner designations spanning Data and AI, Digital and App Innovation, Infrastructure, Modern Work, and Security. As a leading adopter of GitHub Copilot and Microsoft 365 Copilot, Infosys has delivered enterprise-scale transformations. Leveraging Infosys Cobalt, a proprietary set of solutions for cloud transformation, and Infosys Topaz, an AI-first suite of offerings, Infosys has supported multiple large enterprises in their AI-first, cloud-first transformation journeys. With deep expertise across Microsoft Azure, GitHub Copilot, and Microsoft 365, Infosys has built a reputation for delivering scalable, secure, and cost-effective solutions.

This experience and collaborative history are part of what made Infosys a 2025 Microsoft Partner of the Year for Azure – Secure Migration and Modernization. It’s also what made the company so successful when it was tapped to stabilize, modernize, and migrate the SIS for one of the largest districts in the country to Azure—creating one of the largest SIS cloud deployments for any US school district in the process.

Stabilizing and scaling a strained system

Each morning, teachers and students generate a massive amount of data for Infosys’s client, with 30,000 teachers taking attendance for more than half a million students. The district’s legacy SIS was creating intense traffic spikes and exposing the limits of their infrastructure.
“There were 1,400-plus schools, 500,000 students,” said Sambit Mohanty, AVP Education Practice at Infosys. “You can imagine the traffic between 7:00 AM and 10:00 AM.”

Originally built on a legacy Microsoft .NET framework and deployed on premises, the SIS was designed to support a vast and complex student population. But the system was not easy to scale, and it had begun to show its age, with disruptive performance issues that made it difficult for teachers to submit attendance and caused delays when administrators tried to access student records. During school openings each semester—one of the most critical periods for any district—the instability of the SIS could present major issues.

“The school opening is the biggest event for a school district,” Mohanty explained. “Everything has to be fine. You should have proper enrollment, grades should be moved to the next level, students should be scheduled, and teachers should be able to take attendance.”

In addition to the suboptimal performance, the SIS was also costly and inefficient. To manage the start-of-the-year chaos, the district assembled war rooms with more than 50 experts across IT, network, and security teams trying to juggle and fix issues in real time. But they needed a partner who could support efforts to stabilize the system and modernize it for long-term scalability and security.

Continue reading here

Explore all case studies or submit your own. Subscribe to the case studies tag to follow all new case study posts. Don't forget to follow this blog to receive email notifications of new stories!

Partner Case Study | DeepJudge
Legal work depends on precision, precedent, and the ability to apply institutional knowledge across diverse matters. For many firms, that knowledge is documented but not always easy to access or act on. DeepJudge, a Microsoft partner, is helping legal teams bridge that gap with AI-powered search and workflow tools built on Microsoft Azure.

DeepJudge specializes in enterprise search and agentic AI workflows tailored for legal professionals. DeepJudge participated in Microsoft for Startups and the Pegasus Program, a selective initiative that helps high-potential partners scale through technical guidance, go-to-market support, and early access to Microsoft innovations. Today, the company’s platform is built on Microsoft Azure—including Azure OpenAI Service, Azure Kubernetes Service, and Microsoft Defender for Cloud—to promote high performance and robust data protection.

Making internal knowledge accessible—and secure

CMS Switzerland, a full-service law firm with more than 160 employees and a legacy spanning over 80 years, offers tailored legal solutions for businesses, investors, and private individuals. Known for its deep legal expertise and cross-border capabilities, CMS Switzerland built a strong foundation of internal knowledge—contracts, case files, templates, presentations, and precedent documents—stored across various systems and folders. While this information was historically well maintained, it wasn’t always easy for lawyers to locate and apply consistently across cases.

“Clients hire law firms for their expertise—but law firms often underestimate their breadth and depth of existing knowledge and experience,” said Stefan Brunnschweiler, Managing Partner at CMS Switzerland.

The firm wanted to better surface and apply its internal expertise across teams—without disrupting existing workflows or compromising on data protection. The goal was to institutionalize internal know-how so that all employees could access and apply it confidently in their daily work.
Security was a critical consideration. As part of CMS, one of the largest international law firms with over 7,200 lawyers in 92 offices across 50 countries, CMS Switzerland needed a solution that could meet strict data protection requirements while offering the flexibility and performance of modern AI tools. “Data security is our top priority,” Brunnschweiler emphasized. “The fact that Microsoft hosts our data in Switzerland and that DeepJudge, through Microsoft, also ensures high data protection convinced us.”

CMS Switzerland began exploring options that could help surface internal knowledge more efficiently, reduce time spent on manual research, and support faster onboarding of new employees. The firm was looking for a solution that could meet the highest standards for security, reliability, and usability—while also aligning with the operational realities of legal work.

Continue reading here

Explore all case studies or submit your own. Subscribe to the case studies tag to follow all new case study posts. Don't forget to follow this blog to receive email notifications of new stories!

Announcing Neon Serverless Postgres as an Azure Native Integration (Preview)
Note: This service is now retired, but you can browse similar database services in Azure.

We are excited to announce that Neon Serverless Postgres is now available as an Azure Native Integration (in preview) within the Azure Cloud ecosystem. This integration enhances the developer experience by combining the power and flexibility of Neon’s serverless Postgres database service with Azure's robust cloud infrastructure.

“We’re excited to bring Neon to all Azure developers, especially AI platforms. Neon Serverless Postgres scales automatically to match your workload and can branch instantly for an incredible developer experience. And for AI developers concerned about scale, cost efficiency, and data privacy, Neon enables them to easily adopt database-per-customer architectures, ensuring real-time provisioning and data isolation for their customers.” - Nikita Shamgunov, CEO, Neon

What is Neon Serverless Postgres?

Neon offers a serverless Postgres solution that leverages the principles of serverless computing to provide scalable and flexible database services. By abstracting away infrastructure complexities, Neon allows businesses to focus on application development rather than database administration. The key features of Neon’s Postgres service include:

Instant Provisioning: Neon's architecture allows the creation of new databases in under a second, thanks to its custom-built storage engine.
Efficient Scaling: Neon automatically scales resources based on load, ensuring optimal performance during traffic spikes without the need for overprovisioning.
Integrated Developer Workflows: With features like database branching, Neon enables shorter software development lifecycles and cost-effective integration into CI/CD pipelines.

What is Neon Serverless Postgres as an Azure Native Integration?

The Azure Native Integration of Neon Serverless Postgres enables users to create a Neon organization from the Azure portal.
Users can find the Neon Serverless Postgres offering on the Azure portal and Azure Marketplace. This integration paves the way to effectively use Neon Postgres alongside other Azure services.

“At Microsoft, we are committed to providing seamless and innovative solutions for our Azure developers. The introduction of Neon Postgres as an Azure Native Integration is a significant milestone in this journey. This integration not only simplifies the provisioning and management of Neon organizations directly from Azure but also enhances the overall developer experience. We are excited to see how this collaboration will empower developers to build intelligent and scalable applications on Azure with ease.” - Shireesh Thota, CVP, Azure Databases

Benefits of the native integration

This Azure Native Integration brings many benefits to developers and businesses:

Seamless Provisioning from Azure: Developers can create and manage Neon organizations directly within the Azure portal, without switching platforms.
Single Sign-On (SSO): Users can access Neon via SSO using their Microsoft credentials, streamlining the login process and enhancing security.
Enhanced Developer Experience: The integration allows developers to use the Azure CLI and SDKs of their choice from .NET, Java, Python, Go, and JavaScript to manage Neon organizations alongside other Azure resources, keeping development workflows consistent.
Unified Billing: Neon usage can be included on existing Azure invoices, simplifying billing and financial management for businesses. By purchasing Neon through Azure, customers can also decrement their Microsoft Azure Consumption Commitment (MACC), if they have one with Microsoft.

How to create a Neon organization from Azure

You can find the details of how to create a Neon organization from Azure in the Microsoft docs. The section below summarizes the key steps to aid you in resource creation.
Step 1: Discover and subscribe to Neon from Azure

You can start your journey from either the Azure portal or Azure Marketplace. Search for Neon Serverless Postgres in the search bar and select the offering. This will take you to the Marketplace landing page for Neon Serverless Postgres. Choose one of the three available public plans; if you are new and exploring, you can start with the free plan. Click Subscribe to move forward to the resource configuration stage.

Step 2: Complete your Neon resource configuration on Azure

You are now creating a Neon resource on Azure. The process is similar to creating other Azure resources and requires basic details such as the Azure subscription, resource group, and resource details. At public preview, the resource can be created in the East US 2, Germany West Central, and West US 3 regions; please check the region dropdown to view the currently available regions. The creation flow also simultaneously creates a Neon organization, so provide a name for your Neon organization. Once all details have been filled in, review the information under Review + Create and select Create to trigger the deployment and resource creation. Congratulations! You just created a Neon organization from Azure. Let us now visit the Neon organization we just created.

Step 3: Transition to Neon from the Azure portal

Go to the resource you created and you will land on the overview blade, where you will find the resource details. Single sign-on from the Azure portal to Neon is supported: click the SSO link to transition to the Neon portal, where you can continue with creating projects and databases, inviting users, and much more.

Step 4: Create projects, branches, and databases on Neon

On the Neon portal, you will land on the project creation view. Proceed to create your first Neon project. When the project is created, a default branch and database are created as well. Visit the project dashboard to view project details.
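The project created in Step 4 exposes a standard PostgreSQL connection URL, so any Postgres driver can consume it directly. As a minimal sketch of what such a URL contains, it can be pulled apart with the Python standard library (the endpoint, user, and database names below are made-up placeholders, not a real Neon endpoint):

```python
from urllib.parse import urlparse

# Placeholder connection URL in the shape Neon issues (all values invented
# for illustration).
conn_url = (
    "postgresql://app_user:s3cret@ep-example-123456.eastus2.azure.neon.tech"
    "/neondb?sslmode=require"
)

parts = urlparse(conn_url)
print(parts.scheme)            # postgresql
print(parts.hostname)          # ep-example-123456.eastus2.azure.neon.tech
print(parts.path.lstrip("/"))  # neondb

# With a driver such as psycopg installed, the same URL connects directly:
# import psycopg
# with psycopg.connect(conn_url) as conn:
#     print(conn.execute("SELECT version()").fetchone())
```

The `sslmode=require` query parameter matters: Neon endpoints expect TLS, and most Postgres drivers honor it when it is part of the URL.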
You can copy the connection URL of the newly created database and use it in your Azure application stack to connect to the database. Go ahead and create more projects in the Azure regions of your choice and explore interesting features like branches and AI-based query generation. Now you are ready to use Neon Serverless Postgres in your real-world applications.

Real-world applications

Neon’s Serverless Postgres service is ideal for a variety of use cases, including:

AI and Machine Learning: With the ability to store and query vector embeddings and integrate with Azure AI services, Neon is well suited for AI and machine learning applications. Neon’s autoscaling ensures that even resource-intensive AI models operate seamlessly during periods of high demand without manual intervention.
SaaS Applications: The scalability and flexibility of Neon’s Postgres service make it a strong fit for SaaS applications that need to handle varying levels of traffic. Its serverless architecture eliminates the need for infrastructure management, allowing developers to focus on building features while ensuring cost-effective scaling to meet demand.

For more use cases and success stories, visit Case Studies - Neon to understand how Neon, now on Azure, can create value in your organization.

Ready to try Neon Serverless Postgres as an Azure Native Integration? Check out the next steps and share your feedback with us. This is just the beginning for Neon Serverless Postgres on Azure; stay tuned as we make this integration even more seamless with more features.

Next steps

Subscribe to Neon Serverless Postgres on the Azure portal or Azure Marketplace
Learn more about Neon Serverless Postgres in the Microsoft docs
Read the launch blog post by Neon
Discover more about Neon
Submit feature suggestions and questions in the Neon Discord community or contact feedback@neon.tech. Please mention that you are using Neon Serverless Postgres on Azure in your messages.
Learn about Microsoft’s investment in Neon

Thank you for reading this blog! Please follow for more updates on Neon Serverless Postgres as an Azure Native Integration.

Partner Blog | Expanded partner benefits are now available: What’s new in February 2026
Expanded partner benefits are now available across the Microsoft AI Cloud Partner Program. These updates reflect continued investment in the tools, resources, and support partners rely on to build, differentiate, and grow, and they incorporate feedback we hear consistently across the ecosystem. If you read our January post about planning ahead for the February refresh, this is the follow-up: the new benefits are now rolling out, and partners with eligible offers will find them in Partner Center as they become available.

What’s new

You’ll find a range of meaningful additions designed to empower you to move faster with AI, support security needs, and improve go-to-market execution. Highlights include:

Copilot additions in select offers: The FY26 refresh introduces new Copilot-related benefits across parts of the program, including Microsoft 365 Copilot, Copilot Studio, and Microsoft Dragon Copilot (per user) in select partner offers where available.
Security benefits expansion: Security-focused benefits have been broadened, including additions such as Microsoft Defender Suite, Microsoft Entra Suite, and Microsoft Intune Suite in select offerings.
Azure credit updates: Azure benefits are being updated across multiple offers, including new additions and increases in value for certain cloud benefits. These credits are designed to support solution development, testing, and expansion of your practice.
Go-to-market resources: As partners continue to access marketing benefits and resources through the program, Microsoft is simplifying discovery and execution—so you can bring campaigns to market with less friction.

Continue reading here

Partner Case Study | Mead Johnson Nutrition
In the race to innovate, global enterprises aren’t usually held back by a lack of ideas but rather by the friction hidden in their data. Legacy SAP systems, siloed data storage environments, and inconsistent data structures can quietly slow operations, including reporting, forecasting, compliance, and even cutting-edge AI adoption. This friction can be especially acute in the retail life sciences and consumer health sectors, where any new product must adhere to strict standards of safety and scientific rigor, all at enterprise scale. In these data-intensive environments, the stakes are high, and the pressure to innovate safely and without business disruption is even higher.

Healthcare and consumer health organizations need a partner who can help them move fast while respecting the boundaries, systems, and regulations that keep them compliant. In other words, they need a partner with the speed and precision of a tiger—and Tiger Analytics is true to its namesake.

Founded in 2011, Tiger Analytics is an innovator in AI and data science consulting with a team of more than 6,200. A Microsoft partner with specializations in Analytics and AI and Machine Learning, the team helps enterprises around the world modernize complex data ecosystems at scale so they can harness insights for real-world innovation. With deep experience in SAP, data engineering, and Azure cloud services, they’ve built a reputation for solving high-stakes data challenges with clarity and speed. And it was these capabilities that made Tiger Analytics an ideal option when global pediatric nutrition leader Mead Johnson Nutrition (MJN) decided to transform their digital operations and lay the foundation for faster, more efficient, and scalable enterprise AI.
Modernizing data architecture to fuel smarter innovation

With more than a century of experience developing science-based formulas for infants and children, MJN serves millions of families in highly regulated markets across North America, Asia, and Latin America. The scale of their operation makes any digital transformation a significant undertaking. So when they initiated a major effort to modernize their enterprise resource planning—which included migrating from SAP ECC to SAP S/4HANA—there was some inherent risk involved. The switch disrupted the existing data replication layer that fed analytics and reporting systems across the enterprise. To maintain business continuity, MJN needed to rapidly rebuild and enhance this layer.

MJN already had an established partnership with Tiger Analytics, having worked with them on various data engineering initiatives. MJN brought Tiger Analytics in again to help design, build, and operationalize the new replication layer, enabling seamless data flow from the upgraded S/4HANA system to MJN’s analytics platform. MJN outlined three key strategic goals they needed the solution to support:

Real-time replication from S/4HANA into the DnA Delta Lakehouse for advanced analytics, AI and machine learning use cases, and enterprise-wide reporting.
Clean and secure nutrition data delivery from SAP S/4 to the Global EDAP team, which manages centralized analytics and data processing across business units.
A smooth, timely transition to ensure business teams retained access to business-critical SAP tables.

With MJN's goals clearly articulated and the partnership already well established, Tiger Analytics was ready to get to work.

Continue reading here

Explore all case studies or submit your own. Subscribe to the case studies tag to follow all new case study posts.
Don't forget to follow this blog to receive email notifications of new stories!

Announcing seamless integration of Apache Kafka with Azure Cosmos DB in Azure Native Confluent
Integrate Azure Cosmos DB with Kafka applications

Confluent announced general availability (GA) of the fully managed V2 Kafka connector for Azure Cosmos DB, enabling users to seamlessly integrate their Azure Cosmos DB containers with Kafka-powered event streaming applications, without worrying about provisioning, scaling, or managing the connector infrastructure. The Confluent Cosmos DB v2 connector offers significant advantages over the v1 connector in terms of higher throughput, enhanced security and observability, and increased reliability.

Seamless integration with Azure Native Confluent

We are excited to announce a new capability in the Azure Native Confluent service that enables users to create and configure Confluent-managed Cosmos DB Kafka connectors (v2) for Azure Cosmos DB containers through a direct, seamless experience in the Azure portal. Users can also provision and manage environments, Kafka clusters, and Kafka topics from within the Azure Native Confluent service, creating a holistic end-to-end experience for integrating with Azure Cosmos DB. This eliminates the need for users to switch between the Azure and Confluent Cloud portals.

Key highlights

Bi-directional support: Users can create source connectors to stream data from Cosmos DB to Kafka topics, or sink connectors to move data from Kafka into Cosmos DB.
Secure authentication: Users can authenticate to the Kafka cluster using service accounts, enabling least-privilege access controls when provisioning connectors, aligned with Confluent’s recommended security guidelines.

Create a Confluent Cosmos DB (v2) Kafka connector from the Azure portal

The following section summarizes the key steps required to provision the connector from the Azure Native Confluent service.

Navigate to the native Confluent resource in Azure.
Navigate to Connectors -> Create new connector. You can also create an environment, cluster, and topic from within the Azure portal.
Choose the desired connector type: Source to stream data from Azure Cosmos DB, or Sink to move data into Azure Cosmos DB.
Select ‘Azure Cosmos DB V2’ as the connector plugin.
Enter the connector name. Then select the required Kafka topics, Azure Cosmos DB account, and database.
Select Service Account authentication and provide a name for the service account. When the connector is created, a new service account is created on Confluent Cloud. Optionally, you can select user-account-based authentication by provisioning an API key on Confluent Cloud.
Complete the required connector configuration. Enter the topic-container mapping in the form ‘topic1#container1,topic2#container2,…’.
Review the configuration summary and click Create. Your connector will appear in the list with real-time status indicators.

Other resources

Try out the Azure Native Confluent service right away! Every new sign-up gets a free $1000 credit!
To learn more, check out the Microsoft Docs.
Follow the Azure Partner discussion board to keep up to date on all Azure announcements and join the conversation with subject matter experts!
If you would like to give us feedback on this feature or the overall product, or have any suggestions for us to work on, please drop your suggestions in the comments.

Unleashing New Business Opportunities for Microsoft Partners with PostgreSQL & MySQL on Azure
The latest innovations announced at Microsoft Ignite 2025 for PostgreSQL and MySQL running on Azure are more than just technical upgrades—they’re a launchpad for new business growth, deeper customer engagement, and accelerated digital transformation. Here’s how these advancements can help you deliver greater value and unlock new opportunities for your clients.

1. Introducing Azure HorizonDB: Built for Performance and AI Workloads

We’re excited to unveil Azure HorizonDB in private preview—a new, fully managed PostgreSQL service engineered for businesses and developers alike. HorizonDB is designed for ultra-low latency, high read scale, and built-in AI capabilities, offering seamless scaling up to 192 virtual cores and 128 TB of storage. Deep integration with developer tools, including GitHub Copilot, delivers performance, resilience, and simplicity at any scale. With HorizonDB, teams can:

Build AI apps at scale using advanced DiskANN vector indexing, pre-provisioned AI models, semantic search, and unified support for both relational and graph data.
Accelerate app development with built-in extensions, including the PostgreSQL extension for Visual Studio Code integrated with GitHub Copilot. Copilot in VS Code is context-aware for PostgreSQL and enables one-click performance debugging.
Unlock data insights through deep integrations with Microsoft Fabric and Microsoft Foundry.
Expect reliability with enterprise-ready features from day one, including Entra ID integration, Private Link networking, and Azure Defender for Cloud.

Business Opportunity: Position your practice as an early adopter and expert in next-generation database solutions by introducing customers to Azure HorizonDB. Use this conversation to offer migration, modernization, and AI-powered application development services leveraging Azure Database for PostgreSQL, with future migrations to HorizonDB.
Help clients build resilient, high-performance, and intelligent data platforms—driving new revenue streams and deeper customer engagement.

2. Modernize Data Infrastructure with Limitless Scale and Performance

Azure’s new Elastic Clusters in Azure Database for PostgreSQL enable organizations to scale their databases horizontally across multiple nodes, supporting virtually unlimited throughput and storage. This means you can help clients build and grow multi-tenant SaaS applications and large-scale analytics solutions without the complexity of manual sharding or the limitations of legacy infrastructure. Azure’s managed service automates shard management, tenant isolation, and cross-node query coordination, freeing up your teams to focus on innovation instead of administration.

Business Opportunity: Position your practice as the go-to partner for scalable, future-proof data platforms. Offer migration services, architecture consulting, and managed solutions that leverage Azure’s unique scale-out capabilities.

3. Accelerate Innovation with AI-Ready Databases

Azure is leading the way in AI integration for open-source databases. With the PostgreSQL extension for Visual Studio Code and native Microsoft Foundry support, developers can build smarter apps and AI agents leveraging advanced AI capabilities directly in the database. Features like natural language querying, vector search, and seamless Copilot integration mean your clients can unlock new insights and automate processes faster than ever.

Business Opportunity: Expand your offerings to include AI-powered analytics, intelligent agent development, and custom Copilot solutions. Help organizations harness their data for real-time decision-making and enhanced customer experiences.

4. Simplify and Accelerate Migrations from Legacy Systems

The new AI-assisted Oracle to PostgreSQL migration tool dramatically reduces the effort and risk of moving off expensive, proprietary databases.
Integrated into the PostgreSQL extension for VS Code, it automates schema and code conversion, provides inline AI explanations, and ensures secure, context-aware migrations.

Business Opportunity: Lead migration projects that deliver rapid ROI. Offer assessment, planning, and execution services to help clients escape legacy costs and embrace open-source flexibility on Azure.

5. Enable Seamless Analytics and Real-Time Insights

With support for Parquet in the Azure storage extension for PostgreSQL and Fabric zero-ETL mirroring for Azure Database for MySQL and Azure Database for PostgreSQL, Azure is bridging operational databases and analytics platforms.

Business Opportunity: Build solutions that unify data estates, streamline analytics workflows, and deliver actionable intelligence. Position your team as experts in data integration and real-time analytics.

6. Drive Industry-Specific Transformation

Ignite 2025 showcased real-world success stories from industries like healthcare (Apollo Hospitals), automotive (GM), and finance (Nasdaq), demonstrating how Azure’s open-source databases power resilient, scalable, and AI-driven solutions.

Business Opportunity: Use these case studies to inspire clients in regulated or complex sectors. Offer tailored solutions that meet strict compliance, security, and performance requirements.

Why Partners Win with Azure’s Latest Innovations

Faster time-to-value: Help clients adopt the latest tech with minimal downtime and risk.
Expanded service portfolio: From migration to AI, analytics to managed services, the new capabilities open doors to new revenue streams.
Trusted platform: Azure’s enterprise-grade security, compliance, and high availability mean you can deliver solutions with confidence.

Ready to help your customers achieve more? Dive deeper into the Ignite 2025 announcements and start building the next generation of intelligent, scalable, and AI-powered solutions on Microsoft Azure.
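To ground the vector search capability mentioned in point 3: a vector search ranks stored embeddings by their similarity to a query embedding, most commonly cosine similarity. The pure-Python sketch below illustrates only the idea; inside the database this is handled by vector operators and optimized indexes such as DiskANN, and the document names and 3-dimensional "embeddings" here are toy values:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "embeddings" keyed by document id (real embeddings have hundreds
# or thousands of dimensions).
docs = {
    "contract_faq": [0.9, 0.1, 0.0],
    "onboarding_guide": [0.2, 0.8, 0.1],
    "release_notes": [0.0, 0.2, 0.9],
}

query = [1.0, 0.0, 0.1]  # embedding of the user's question

# Rank documents by similarity to the query, best match first.
ranked = sorted(docs, key=lambda d: cosine_similarity(query, docs[d]), reverse=True)
print(ranked[0])  # contract_faq
```

A database-side vector index exists precisely to avoid this brute-force scan over every row, returning approximate nearest neighbors in sublinear time.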
Learn more here: https://ignite.microsoft.com/en-US/home

Updated requirements for the SAP on Microsoft Azure specialization
The SAP on Microsoft Azure specialization is a credential that validates your deep expertise in planning, migrating, and operating SAP (Systems, Applications, and Products in Data Processing) workloads on Microsoft Azure. Microsoft is updating this specialization to make it more accessible for a broader range of partners. These changes are designed to expand opportunity—making it possible for more organizations to demonstrate SAP on Azure expertise, earn official Microsoft recognition, and unlock exclusive go-to-market advantages that strengthen differentiation in a competitive market.

What’s changed

Lower ACR threshold: Beginning in January 2026, the Azure consumed revenue (ACR) requirement decreased from $30,000 to $7,500,* totaled over three months, significantly reducing the barrier to entry while maintaining a high standard of technical capability.

Streamlined skilling validation: Partners can validate required Microsoft learning coursework directly within Partner Center, removing the need to do so in the third-party audit for this specialization. Skilling requirements may be met through either:

The Azure for SAP Workloads Specialty certification.
Completion of the Run SAP on the Microsoft Cloud learning path.

This simplified approach accelerates the path to earning the specialization.

Next actions

Visit Partner Center to review the updated SAP on Microsoft Azure specialization requirements and apply or renew. Be sure to follow the Specialization Updates blog to stay up to date on all announcements!

*Throughout this document, $ refers to US dollar (USD).

Partner Blog | Azure updates for partners: December 2025
At Microsoft Ignite 2025, we explored what it means for organizations to move into the era of Frontier transformation. This shift is focused on embedding AI across every part of the business to improve decision-making, increase speed, and create new value. Organizations leading in AI make it foundational: they rethink processes and integrate new technologies from the start to improve efficiency. For partners, this move toward Frontier represents a significant opportunity to lead customers into this new era. By building AI-powered solutions, connecting data for intelligent insights, and deploying Microsoft Azure’s cloud-ready platforms, partners can deliver value faster and scale confidently through the Microsoft ecosystem.

Microsoft Ignite came with a significant number of announcements, so I’ve gathered the Azure updates that matter most for partners. These are the capabilities that can strengthen your ability to deliver intelligent solutions, drive operational efficiency, and differentiate your product or service in the market. You can also explore how partners are turning momentum into action, access highlights, and grab practical guidance from my Microsoft Ignite session.

Continue reading here

Don’t miss Building Agents with Microsoft Foundry and Microsoft Foundry Agent Service!
Our dynamic four-part webinar series, Agentic AI + Copilot Partner Skilling Accelerator, empowers you to harness the Microsoft AI ecosystem to unlock new revenue streams and enhance customer success. Across the four sessions, Microsoft partners can expect to learn how to apply AI tools in no-code, low-code, and pro-code scenarios to build intelligent chat and workflow solutions, extend and customize capabilities, and create advanced, custom AI functionality.

Don't miss the final session in the series, Building Agents with Microsoft Foundry and Microsoft Foundry Agent Service, where you'll learn how to design and deploy intelligent agents with Microsoft Foundry and Microsoft Foundry Agent Service, including multi-agent architectures and key protocols such as A2A and MCP. The live virtual event is scheduled for December 15, 2025. Register today to reserve your spot!

Be sure to follow this Partner news blog for all partner-related announcements by clicking follow above!