The Bridge - How Azure Arc brings cloud innovation to SQL Server anywhere
The ability to manage data effectively is more critical than ever, and increasingly complex, with organizations hosting data on-premises, at the edge, and across multiple clouds. With growing pressure on businesses to maximize existing resources, one thing is clear: today's organizations need a bridge that brings cloud innovation to their data estate across all environments, with a consistent and efficient experience for managing, governing, and securing data. That bridge is Microsoft's Azure Arc.
Arc enabled Azure migration & modernization journey
This blog explains how SQL Server instances connected to Azure Arc can make the migration and modernization process to Azure SQL easier and faster. SQL Server enabled by Azure Arc allows SQL Server instances hosted outside of Azure to access Azure services, which yields operational efficiencies and cost reductions. It also offers a simple migration path to Azure with minimal or no downtime.

Modernize your data estate on SQL from cloud to edge
Experience SQL innovation with consistency and rich Azure integration

At this year's Microsoft Ignite 2022 event, we talked about doing more with less and unlocking the value of your data. Leveraging the cloud to reduce costs, drive productivity, and accelerate insights and decision-making can make a measurable impact on an organization's competitiveness, particularly in uncertain times. This is true for your SQL Server data. At Ignite, we made several announcements that will help you modernize your SQL Server data by tapping into the power of Azure. These announcements (and a few sneak peeks) represent the ongoing investment we're making in our family of SQL databases to deliver what you need for your business.

Whether you're currently on-premises, in the cloud, or somewhere in between with a hybrid strategy, what really sets Microsoft SQL Server and the Azure SQL family of databases apart is the underlying SQL engine, language, and tools consistency that extends from cloud to edge, and the level of integration they have with other Azure services.

Take SQL Server 2022, for example, currently in preview. It is truly a hybrid data platform that connects to Azure in ways never seen before; in fact, it's the most cloud-connected SQL Server yet. It uses Azure SQL Managed Instance for managed disaster recovery and Azure Synapse Link for SQL for near-real-time analytics. It also integrates with Microsoft Purview for insights, lineage, and centralized policy management, making it a true hub for security.

Part of the enabling technology behind SQL Server 2022's cloud connection comes from Azure Arc. Azure Arc-enabled SQL Server, our new branding for an existing member of the Arc-enabled family, connects your existing SQL Server to the cloud, allowing you to leverage Azure benefits such as single sign-on with Azure Active Directory with multi-factor authentication, Microsoft Defender for Cloud, and Microsoft Purview access policies for governance.

Coming soon, you can take advantage of pay-as-you-go licensing for Azure Arc-enabled SQL Server. This means you can connect your SQL Server to Azure and choose flexible licensing options, from traditional SQL Server licensing to pay-as-you-go pricing. Pay for what you use, whether your SQL Server is on-premises, hybrid, or multi-cloud.

Data permeates all aspects of a business, so faster actionable insights from that data can make all the difference. Azure Synapse Link for SQL enables near-real-time analytics over your data in Azure SQL Database or SQL Server 2022, without having to create or maintain ETL pipelines. We've had some exciting new capabilities launch during its public preview, including a new schedule mode to control data ingestion timing, support for smaller-scale workloads, and full precision support for datetime data types in SQL Server. We're looking forward to the general availability of Azure Synapse Link for SQL coming soon.

Migrate and modernize your SQL to Azure

Azure SQL Managed Instance combines the broadest SQL Server compatibility with all the benefits of a fully managed cloud database service. Not only does it provide industry-leading price performance using our latest premium-series hardware, but the latest generally available innovations help you further modernize your data for the cloud. Windows authentication for Azure Active Directory principals, for example, enables you to migrate legacy applications that cannot use Azure Active Directory authentication, with minimal to no changes in the application stack. And new data virtualization capabilities let you access external data while keeping it in its original format and location, eliminating the need for ETL processes and getting you faster insights from your data.
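To make the data virtualization point concrete, here is a minimal sketch of querying Parquet files in Azure Blob Storage in place, using the OPENROWSET-over-external-data-source pattern that data virtualization exposes. The storage account, container, path, and SAS secret are hypothetical placeholders, and the exact setup (master key, credentials, allowed location prefixes) varies by environment.

```sql
-- Hypothetical storage location and credential; a database master key must already exist.
CREATE DATABASE SCOPED CREDENTIAL LakeCred
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET   = '<sas-token-placeholder>';

CREATE EXTERNAL DATA SOURCE SalesLake
WITH (
    LOCATION   = 'abs://sales@contosolake.blob.core.windows.net/',
    CREDENTIAL = LakeCred
);
GO

-- Query the Parquet files where they live; no ETL copy into the database.
SELECT TOP (100) o.*
FROM OPENROWSET(
         BULK 'orders/2022/*.parquet',
         DATA_SOURCE = 'SalesLake',
         FORMAT = 'PARQUET') AS o;
```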
If you're interested in migrating your Oracle data to Azure SQL, we have some exciting news for you. Migration planning just got easier for Oracle customers wanting to modernize on Azure SQL, including Azure SQL Database Hyperscale, with two public preview announcements: the Database Migration Assessment for Oracle and the Database Schema Conversion Toolkit for Oracle, both of which are extensions in Azure Data Studio. With these announcements, customers can receive migration recommendations, an evaluation of code complexity, and target sizing recommendations, all from the same tool. Schema migration between the different database platforms is also automated, and it runs on Linux environments, too.

Today we also announced a comprehensive migration and modernization assessment for SQL Server on Azure Migrate, our free self-service migration tool. This new capability helps you easily discover and assess your entire SQL estate at scale, now including Hyper-V and bare-metal environments, and helps identify ideal migration and modernization targets for Azure SQL, with cost optimization guidance built in.

A modern data platform for your next app on Azure

We also provided an early view of innovations in Azure SQL Database to help developers build apps faster and with less "plumbing code." Coming soon to SQL Database are new ways to integrate data with other services by invoking REST endpoints, boosted price-performance with memory-optimized premium-series hardware, and access to data using modern REST and GraphQL endpoints. We're very excited about the investments we're making, like the offline SQL Database emulator and CI/CD integration, to help developers do even more with their SQL data. We invite you to watch our new Microsoft Mechanics video that takes a closer look at some of the upcoming features that make Azure SQL Database great for modern, cloud-based apps.

Enable comprehensive data governance at scale

Finally, getting holistic and actionable insights from data, particularly in light of constantly evolving regulations, has become a real challenge. Microsoft Purview provides a unified hybrid data governance platform that creates a map of your data estate spanning Azure data sources, multi-cloud, and SaaS data systems. It's truly the one-stop shop for technical, governance, and business stakeholders to map and curate their metadata and govern their hybrid data estate. This year's Ignite saw a lot of new features rolling out to Microsoft Purview, with public previews for ML-based classifications, manual data lineage, metamodels, and self-service data access on Azure SQL. We also announced the general availability of dynamic lineage for Azure SQL Database. You can learn about all these announcements and more here.

Your next steps

If you like what you've read so far, we invite you to dive deeper into related topics by:

- Watching the Ignite breakout sessions
- Getting started with Microsoft Learn to build your skills
- Exploring other Azure announcements at Microsoft Ignite
SQL Server enabled by Azure Arc now assists in selecting the best Azure SQL target

To make the SQL Server migration journey more efficient, SQL Server enabled by Azure Arc can now help customers assess the readiness of their SQL Server workloads for migration to Azure SQL and choose the most suitable Azure SQL configuration.
Azure Arc enabled SQL Server - determine your Azure SQL target with ease and confidence
Selecting the best Azure SQL target for your Azure Arc enabled SQL Server with confidence is now easier, all while you continue to manage, secure, and govern your SQL Server estate from Azure.

Unlocking Enterprise AI: SQL Server 2025 and NVIDIA Nemotron RAG Accelerate AI
Today, most of the world's data remains untapped, sitting in databases, documents, and systems across organizations. Enterprises are racing to unlock this data's value by building the next wave of generative AI applications: solutions that can answer questions, summarize documents, and drive smarter decisions. At the heart of these innovations are retrieval-augmented generation (RAG) pipelines, which enable users to interactively engage with large amounts of data that continuously evolve.

Yet, as promising as RAG pipelines are, enterprises face real challenges in making them work at scale. Handling both structured and unstructured data, processing massive volumes efficiently, and ensuring privacy and security are just a few of the hurdles. This is where the integration between SQL Server 2025 and NVIDIA Nemotron RAG models, deployed as NVIDIA NIM microservices, comes in, offering a new approach that streamlines AI deployment and delivers enterprise-grade performance, whether you're running workloads in the cloud or on-premises.

"As AI becomes core to every enterprise, organizations need efficient and compliant ways to bring intelligence to their data," said Joey Conway, Senior Director of Generative AI Software at NVIDIA. "With SQL Server 2025's built-in AI and NVIDIA Nemotron RAG, deployed as NIM microservices, enterprises can deploy and run AI models close to their data on premises or in the cloud without complex integration, accelerating innovation while maintaining data sovereignty and control."

Overcoming the complexity of generating embeddings at scale

Customer Challenge

Building responsive AI applications with RAG requires converting SQL data into vector embeddings, a process that feeds huge amounts of text through complex neural networks. This work is inherently parallel and compute-intensive, often creating performance bottlenecks that prevent real-time data indexing. The result? Slow applications and poor user experiences.

Moreover, enterprises need flexibility. Different embedding models excel at different tasks, such as semantic search, recommendations, and classification, and each comes with its own tradeoffs in accuracy, speed, and cost. Businesses want to mix and match models, balance premium performance with budget constraints, and stay resilient against model deprecation or API changes. Rapid experimentation and adaptation are key to staying ahead, so developers want models that offer flexible customization and full transparency.

The Solution: SQL Server 2025 + NVIDIA Nemotron RAG

SQL Server 2025 brings AI closer to your data, allowing you to natively and securely connect to any model hosted anywhere. You can generate embeddings directly in SQL using extensions to T-SQL, with no need for new languages, frameworks, or third-party tools. By connecting SQL Server 2025 to the llama-nemotron-embed-1b-v2 embedding model from NVIDIA, you eliminate bottlenecks and deliver the massive throughput needed for real-time embedding generation.

llama-nemotron-embed-1b-v2 is a best-in-class embedding model that offers multilingual and cross-lingual text question-answering retrieval with long-context support and optimized data storage. It is part of NVIDIA Nemotron RAG, a collection of extraction, embedding, and reranking models fine-tuned with the Nemotron RAG datasets and scripts to achieve the best accuracy. These models offer flexible customization, enabling easy fine-tuning and rapid experimentation. They also offer full transparency, with open access to models, datasets, and scripts.

llama-nemotron-embed-1b-v2 is the model of choice for embedding workflows, but this high-speed inference pipeline is not limited to that model and can potentially call any optimized AI model served as an NVIDIA NIM microservice, seamlessly powering every stage of the RAG pipeline. From multimodal data ingestion and advanced retrieval to reranking, all operations run directly on your data within SQL Server. Such RAG systems can be applied across a wide range of use cases, enabling intelligent, context-aware applications across industries.
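As an illustration of this in-engine embedding path, here is a minimal sketch using the T-SQL surface described for SQL Server 2025 (CREATE EXTERNAL MODEL, AI_GENERATE_EMBEDDINGS, and the vector type). The endpoint URL, model identifier, table, and vector dimension are hypothetical placeholders rather than the exact configuration of the joint solution, and the NIM microservice is assumed to expose an OpenAI-compatible embeddings API.

```sql
-- Register the embedding endpoint (e.g., a locally hosted NIM microservice) as an external model.
-- URL and model name are placeholders.
CREATE EXTERNAL MODEL NemotronEmbed
WITH (
    LOCATION   = 'https://nim.internal.contoso.com/v1/embeddings',
    API_FORMAT = 'OpenAI',
    MODEL_TYPE = EMBEDDINGS,
    MODEL      = 'llama-nemotron-embed-1b-v2'
);
GO

-- Store source text next to its embedding.
-- The vector dimension must match what the model actually returns; 1536 is a placeholder.
CREATE TABLE dbo.SupportArticles
(
    ArticleId INT IDENTITY PRIMARY KEY,
    Body      NVARCHAR(MAX) NOT NULL,
    Embedding VECTOR(1536) NULL
);
GO

-- Generate embeddings in-engine, with no external ETL pipeline.
UPDATE dbo.SupportArticles
SET Embedding = AI_GENERATE_EMBEDDINGS(Body USE MODEL NemotronEmbed)
WHERE Embedding IS NULL;
```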
Customer Benefits

With GPU acceleration and the built-in AI of SQL Server 2025, you can achieve optimal inference performance that meets the demands of modern applications. A flexible approach lets you mix and match models to suit different use cases, striking the right balance between accuracy and cost. And with open models that enable vendor flexibility and rapid adaptation, you gain the resilience to stay ahead of the curve in an ever-changing AI landscape.

Streamlining AI Model Deployment with Enterprise-Grade Confidence

Customer Challenge

Integrating advanced AI models into enterprise workflows has historically been slow and complex. Specialized teams must manage intricate software dependencies, configure infrastructure, and handle ongoing maintenance, all while navigating the risks of deploying unsupported models in mission-critical environments. This complexity slows innovation, drains engineering resources, and increases risk.

The Solution: Simplified, Secure Model Deployment with NVIDIA NIM

This collaboration simplifies and de-risks AI deployment. The llama-nemotron-embed-1b-v2 model is available as an NVIDIA NIM microservice for secure, reliable deployment across multiple Azure compute platforms. Prebuilt NIM containers for a broad spectrum of AI models can be deployed with a single command and integrated into enterprise-grade AI applications using the built-in REST APIs of SQL Server 2025 and just a few lines of code, regardless of where you run your SQL Server workloads and NVIDIA NIM, on-premises or in the cloud. NIM containers package the latest AI models, the best inference technology from NVIDIA and the community, and all dependencies into a ready-to-run container, abstracting away the complexity of environment setup so customers can spin up AI services quickly. NVIDIA NIM is also enterprise-grade: it is continuously managed by NVIDIA with dedicated software branches, rigorous validation processes, and support. As a result, developers can confidently integrate state-of-the-art AI into their data applications. This streamlined approach significantly reduces development overhead and provides the reliability needed for mission-critical enterprise systems. NVIDIA NIM containers are discoverable and deployable via Microsoft Azure AI Foundry's model catalog.
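To give a sense of the "few lines of code" claim, here is a hedged sketch of calling a NIM embeddings endpoint over REST from T-SQL, assuming the sp_invoke_external_rest_endpoint procedure available in Azure SQL Database and described for SQL Server 2025's REST integration. The URL, model identifier, and response path are placeholders; outbound endpoints typically need to be allowed and credentialed in your environment, and the response JSON shape may differ in practice.

```sql
-- Hypothetical NIM endpoint and model id; adjust the JSON paths to the actual response shape.
DECLARE @response NVARCHAR(MAX);

EXEC sp_invoke_external_rest_endpoint
     @url      = N'https://nim.internal.contoso.com/v1/embeddings',
     @method   = N'POST',
     @payload  = N'{"model":"llama-nemotron-embed-1b-v2","input":["How do I rotate an encryption key?"]}',
     @response = @response OUTPUT;

-- Pull the embedding array out of the returned JSON for downstream use.
SELECT JSON_QUERY(@response, '$.result.data[0].embedding') AS embedding_json;
```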
Customer Benefits

Rapid deployment with minimal setup means you can start leveraging AI without specialized engineering, and SQL Server 2025 makes it even easier with built-in support for AI workloads and native REST APIs. Enterprise-grade security and monitoring ensure safe, reliable operations, while SQL Server's integration with Entra ID and advanced compliance features provide added protection. Direct integration into SQL workflows reduces complexity and risk, and with SQL Server's hybrid flexibility, you can run seamlessly across on-premises and cloud environments, simplifying modernization while maintaining control.

Innovating Without Compromise on Security or Flexibility

Customer Challenge

Organizations in regulated industries often face a tough choice: adopt powerful AI or maintain strict data residency and compliance. Moving sensitive data to external services is often not an option, and many companies run AI inference workloads both in the cloud and on-premises to balance scalability, privacy, regulatory compliance, and low-latency requirements.

The Solution: Flexible, Secure Integration On-Premises and in the Cloud

SQL Server 2025 enables organizations in regulated environments to securely integrate locally hosted AI models, ensuring data residency and compliance while minimizing network overhead. This architecture boosts throughput by keeping sensitive data on-premises and leveraging SQL Server's native extensibility for direct model invocation. With SQL Server 2025 and Nemotron RAG, deployed as NVIDIA NIM microservices, you get the best of both worlds: the solution can be deployed in the cloud with serverless NVIDIA GPUs on Azure Container Apps (ACA) or on-premises with NVIDIA GPUs on Azure Local. Sensitive data never leaves your secure environment, allowing you to harness the full power of Nemotron models while maintaining complete data sovereignty and meeting the strictest compliance mandates.

Customer Benefits

SQL Server 2025 helps you maintain compliance by supporting data residency and meeting regulatory requirements across regions. Sensitive data stays protected on-premises with enterprise-grade security, including consistent access controls, ledger support, and advanced encryption to minimize risk. At the same time, SQL Server's hybrid flexibility lets you deploy AI workloads wherever they're needed, on-premises, in the cloud, or across a hybrid environment, while leveraging built-in AI features like vector search (sketched at the end of this post) and secure integration with locally hosted models for performance and control.

Conclusion: Powering the Next Wave of Enterprise AI

The collaboration between Microsoft and NVIDIA is more than a technical integration; it's designed to help enterprises overcome the toughest challenges in AI deployment. By streamlining vector embedding and vector search, delivering enterprise-grade performance, and enabling secure, flexible integration across cloud and on-premises environments, this joint solution empowers organizations to unlock the full value of their data. Whether you're building conversational AI, automating document analysis, or driving predictive insights, SQL Server 2025 and NVIDIA Nemotron RAG models, deployed as NIM microservices, provide the tools you need to innovate with confidence. The future of enterprise AI is here, and it's flexible, secure, and built for real business impact.

Get started today:

- Learn more about SQL Server 2025 and download it today
- Learn more about our joint solution from NVIDIA's Technical Blog
- GitHub: Microsoft SQL Server 2025 and NVIDIA Nemotron RAG
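And to round out the vector search side referenced above, here is a short retrieval sketch that reuses the hypothetical table and external model from the earlier embedding example, together with the VECTOR_DISTANCE function; names and dimensions remain placeholders.

```sql
-- Embed the user's question, then return the closest stored articles by cosine distance.
DECLARE @question NVARCHAR(MAX) = N'How do I rotate a customer-managed encryption key?';
DECLARE @q VECTOR(1536) = AI_GENERATE_EMBEDDINGS(@question USE MODEL NemotronEmbed);

SELECT TOP (5)
       ArticleId,
       Body,
       VECTOR_DISTANCE('cosine', Embedding, @q) AS Distance
FROM dbo.SupportArticles
WHERE Embedding IS NOT NULL
ORDER BY Distance ASC;   -- smaller cosine distance means more similar
```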
Reimagining Data Excellence: SQL Server 2025 Accelerated by Pure Storage

SQL Server 2025 is a leap forward as an enterprise AI-ready database, unifying analytics, modern AI application development, and mission-critical engine capabilities such as security, high availability, and performance from ground to cloud. Pure Storage's all-flash solutions are engineered to optimize SQL Server workloads, offering faster query performance, reduced latency, and simplified management. Together, they help customers accelerate the modernization of their data estate.