<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Azure Data Blog articles</title>
    <link>https://techcommunity.microsoft.com/t5/azure-data-blog/bg-p/AzureDataBlog</link>
    <description>Azure Data Blog articles</description>
    <pubDate>Sat, 14 Mar 2026 05:28:51 GMT</pubDate>
    <dc:creator>AzureDataBlog</dc:creator>
    <dc:date>2026-03-14T05:28:51Z</dc:date>
    <item>
      <title>The new frontier of data for the next generation of innovation</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/the-new-frontier-of-data-for-the-next-generation-of-innovation/ba-p/4463236</link>
      <description>&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;A decade ago, an agent that could gather insights from data, trigger actions and make intelligent decisions was science fiction. Today, that kind of intelligent technology is not only possible, but&amp;nbsp;it’s&amp;nbsp;becoming&amp;nbsp;a business requirement.&amp;nbsp;Enterprises must find new and meaningful ways to propel AI innovation that meets customer and business needs.&amp;nbsp;The next&amp;nbsp;generation of innovation&amp;nbsp;requires&amp;nbsp;a data foundation that is unified,&amp;nbsp;secure,&amp;nbsp;and&amp;nbsp;addresses persistent challenges of latency, rigidity,&amp;nbsp;and&amp;nbsp;complexity, while&amp;nbsp;infusing&amp;nbsp;data with AI to&amp;nbsp;optimize&amp;nbsp;performance and accelerate development.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;This week at Ignite, Microsoft is taking a bold step toward making a future-ready data foundation a reality, unveiling innovations that deliver performance, scale and flexibility, and bridge the gap between analytical intelligence and operational agility. The innovations we’re announcing are catalysts for businesses to modernize faster and build the intelligent applications of tomorrow. By leveraging a unified data strategy on Azure,&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://www.microsoft.com/en/customers/story/19769-bmw-ag-azure-app-service" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;BMW&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; is much closer to their goal of predictive maintenance for vehicles and smart factories. With&amp;nbsp;these&amp;nbsp;releases&amp;nbsp;announced today at Ignite, Azure is poised to deliver&amp;nbsp;a&amp;nbsp;resilient, scalable, and AI-integrated&amp;nbsp;data&amp;nbsp;foundation&amp;nbsp;that will help&amp;nbsp;you&amp;nbsp;unlock innovation at&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;scale.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H2 aria-level="2"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 2"&gt;Modeling&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 2"&gt;&amp;nbsp;a future-proofed data platform&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:160,&amp;quot;335559739&amp;quot;:80,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H2&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;A future-proofed data platform&amp;nbsp;should&amp;nbsp;deliver performance at scale, flexibility and openness, unified operations and analytics, streamlined management,&amp;nbsp;and&amp;nbsp;seamless integration with&amp;nbsp;developer&amp;nbsp;tools, all backed by&amp;nbsp;security and trust. The innovations&amp;nbsp;we’re&amp;nbsp;announcing&amp;nbsp;reflect these priorities.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H3 aria-level="3"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;Performance at any scale&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:160,&amp;quot;335559739&amp;quot;:80}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H3&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;We’re&amp;nbsp;releasing performance enhancements across the database portfolio that&amp;nbsp;let users&amp;nbsp;easily scale&amp;nbsp;performance to support&amp;nbsp;intelligent&amp;nbsp;agents and applications&amp;nbsp;that&amp;nbsp;can&amp;nbsp;stand apart in any&amp;nbsp;industry.&amp;nbsp;These new features&amp;nbsp;enable&amp;nbsp;applications&amp;nbsp;backed by&amp;nbsp;Microsoft&amp;nbsp;databases&amp;nbsp;to&amp;nbsp;seamlessly handle massive throughput and global user loads without performance bottlenecks.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H5 aria-level="4"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;Introducing&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;Azure&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;HorizonDB&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:80,&amp;quot;335559739&amp;quot;:40}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H5&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;We’re&amp;nbsp;excited to&amp;nbsp;unveil&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://aka.ms/PostgreSQL_Reimagined" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Azure HorizonDB&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;in private&amp;nbsp;preview,&amp;nbsp;a new fully&amp;nbsp;managed PostgreSQL service built for&amp;nbsp;performance and AI workloads&amp;nbsp;that will offer&amp;nbsp;scaling up to 192 virtual cores and 128&amp;nbsp;TB of storage.&amp;nbsp;Azure&amp;nbsp;HorizonDB&amp;nbsp;is built for business and engineered for developers.&amp;nbsp;Ultra-low latency, high read scale, built in AI, and&amp;nbsp;deep&amp;nbsp;integration with&amp;nbsp;developer&amp;nbsp;tools&amp;nbsp;including&amp;nbsp;GitHub Copilot&amp;nbsp;delivers&amp;nbsp;performance,&amp;nbsp;resilience&amp;nbsp;and simplicity at any scale.&amp;nbsp;With&amp;nbsp;HorizonDB, teams can:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="5" data-list-defn-props="{&amp;quot;335551671&amp;quot;:0,&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="0" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Build AI apps&amp;nbsp;that perform at scale with advanced&amp;nbsp;DiskANN&amp;nbsp;vector indexing, pre-provisioned AI models, semantic search, and unified support for relational and graph data.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="5" data-list-defn-props="{&amp;quot;335551671&amp;quot;:0,&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Accelerate&amp;nbsp;app&amp;nbsp;development with&amp;nbsp;built-in extensions, including the PostgreSQL extension for Visual Studio (VS) Code integrated with&amp;nbsp;GitHub Copilot.&amp;nbsp;GitHub Copilot&amp;nbsp;in VS Code&amp;nbsp;is context aware of&amp;nbsp;PostgreSQL&amp;nbsp;and&amp;nbsp;includes one-click&amp;nbsp;performance debugging.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="5" data-list-defn-props="{&amp;quot;335551671&amp;quot;:0,&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="2" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Unlock data insights&amp;nbsp;with deep integrations&amp;nbsp;with Microsoft Fabric&amp;nbsp;and&amp;nbsp;Microsoft Foundry.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="5" data-list-defn-props="{&amp;quot;335551671&amp;quot;:0,&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="3" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Expect&amp;nbsp;reliability with a service that is enterprise ready on day one,&amp;nbsp;integrated with&amp;nbsp;Entra ID, Private Link networking, and Azure Defender for Cloud.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;H5 aria-level="4"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;Perfecting performance across the portfolio&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:80,&amp;quot;335559739&amp;quot;:40}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H5&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;We’re&amp;nbsp;also addressing&amp;nbsp;cloud-ready&amp;nbsp;performance and scaling needs in Azure Database for PostgreSQL and Azure SQL.&amp;nbsp;&lt;/SPAN&gt;&lt;A class="lia-external-url" href="https://aka.ms/ignite-25-pgsql" target="_blank"&gt;Elastic Clusters&lt;/A&gt; &lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;for Azure Database for PostgreSQL&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;,&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;now generally available, enables developers to easily scale&amp;nbsp;a single database&amp;nbsp;across&amp;nbsp;a cluster&amp;nbsp;of&amp;nbsp;read and write&amp;nbsp;nodes using a simple SQL command.&amp;nbsp;Additionally, new&amp;nbsp;v6&amp;nbsp;SKUs, which support up to 192&amp;nbsp;vCores,&amp;nbsp;and&amp;nbsp;general availability of&amp;nbsp;PostgreSQL 18 gives Azure Database for PostgreSQL&amp;nbsp;users&amp;nbsp;a potent performance boost.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;With the release of&amp;nbsp;the&amp;nbsp;next-generation&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://aka.ms/mi-next-gen-gp-blog" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Azure SQL Managed Instance&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;,&amp;nbsp;we’re&amp;nbsp;helping&amp;nbsp;you&amp;nbsp;modernize SQL Server&amp;nbsp;in the cloud&amp;nbsp;with better performance and easier migration.&amp;nbsp;You’ll&amp;nbsp;now have access to&amp;nbsp;the latest technology, unlocking&amp;nbsp;better performance and scale with&amp;nbsp;more&amp;nbsp;storage and database capacity. Flexible&amp;nbsp;compute,&amp;nbsp;storage&amp;nbsp;and memory options&amp;nbsp;also&amp;nbsp;enhance the ROI of migration and offer broad compatibility for unique workload demands.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H3 aria-level="3"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;Multi-modal&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;,&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;flexible&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;&amp;nbsp;and open&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:160,&amp;quot;335559739&amp;quot;:80,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H3&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;A comprehensive&amp;nbsp;data&amp;nbsp;strategy&amp;nbsp;isn’t&amp;nbsp;one-size&amp;nbsp;fits&amp;nbsp;all.&amp;nbsp;Openness and flexibility are core tenants of a future-ready data platform.&amp;nbsp;Flexibility&amp;nbsp;means&amp;nbsp;you&amp;nbsp;get to&amp;nbsp;choose the deployment model that works for&amp;nbsp;your business, whether&amp;nbsp;it’s&amp;nbsp;on-premises, cloud only or hybrid.&amp;nbsp;Beyond having flexibility for where your data lives, the modern data platform&amp;nbsp;should&amp;nbsp;also support multiple data models and open APIs to reduce complexity and enable extensibility as workload needs and team resources evolve.&amp;nbsp;That’s&amp;nbsp;why Azure fully embraces, supports,&amp;nbsp;and contributes to open-source innovation.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H5 aria-level="4"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;Meet the new Azure&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;DocumentDB&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:80,&amp;quot;335559739&amp;quot;:40}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H5&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;We’re&amp;nbsp;excited to announce the&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://aka.ms/ignite25/documentdb" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;general availability of Azure DocumentDB&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;,&amp;nbsp;the new name for our MongoDB-compatible NoSQL document database service with hybrid and multi-cloud flexibility.&amp;nbsp;Powered by the open-source&amp;nbsp;DocumentDB&amp;nbsp;engine&amp;nbsp;managed&amp;nbsp;by&amp;nbsp;the Linux Foundation, Azure&amp;nbsp;DocumentDB&amp;nbsp;is designed for enterprise workloads with the flexibility to build anywhere and run managed on Azure. It includes native vector search&amp;nbsp;powered by&amp;nbsp;DiskANN&amp;nbsp;and&amp;nbsp;full-text search,&amp;nbsp;and&amp;nbsp;it&amp;nbsp;supports&amp;nbsp;advanced search scenarios that combine fuzzy search and BM25 ranking&amp;nbsp;for&amp;nbsp;smarter, more&amp;nbsp;accurate&amp;nbsp;query&amp;nbsp;results.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H3 aria-level="3"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;Support for&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;t&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;ranslytical&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;workload&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;s&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:160,&amp;quot;335559739&amp;quot;:80,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H3&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;A&amp;nbsp;translytical&amp;nbsp;data platform&amp;nbsp;is designed&amp;nbsp;to support both transactional and analytical workloads. This&amp;nbsp;combo&amp;nbsp;is&amp;nbsp;crucial for responsive, real-time AI applications. A future-proofed data&amp;nbsp;strategy&amp;nbsp;should natively bridge operational data and analytical insight; a capability&amp;nbsp;we’re&amp;nbsp;delivering with&amp;nbsp;Microsoft Fabric.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H5 aria-level="4"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;Unifying data with Fabric databases&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:80,&amp;quot;335559739&amp;quot;:40}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H5&gt;
&lt;P&gt;&lt;A href="https://aka.ms/Fabric-databases-GA" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Fabric databases&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;are now generally available&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, bringing together SQL&amp;nbsp;database&amp;nbsp;and&amp;nbsp;Cosmos DB inside Microsoft Fabric.&amp;nbsp;Built natively into Microsoft Fabric,&amp;nbsp;Fabric databases&amp;nbsp;bridge the gap between traditional databases and data lakes,&amp;nbsp;enabling real-time analytics, transactional processing, and AI workloads to run side by side&amp;nbsp;in one governed environment. Every Fabric Database automatically&amp;nbsp;connects to&amp;nbsp;your organizational data mesh, ready for Power BI, AI, and Copilot experiences.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:278}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H5 aria-level="4"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;Replicating&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;databases&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;with zero ETL,&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;in near real time&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;,&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;with mirroring&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:80,&amp;quot;335559739&amp;quot;:40}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H5&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;If&amp;nbsp;you&amp;nbsp;prefer to keep&amp;nbsp;your&amp;nbsp;operational databases where they are,&amp;nbsp;you&amp;nbsp;can still take advantage of Fabric’s unified data foundation with&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://aka.ms/Fabric-databases-GA" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;database mirrorin&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;g&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, which is now generally available in Microsoft Fabric, supporting SQL Server, Azure Cosmos DB, and Azure Database for PostgreSQL. With mirroring, you can replicate these databases in Fabric for business analytics and AI scenarios without migrating or refactoring.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:278}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Several early adopters&amp;nbsp;are already experiencing&amp;nbsp;real results&amp;nbsp;with&amp;nbsp;Fabric&amp;nbsp;Databases&amp;nbsp;and mirroring. AP Pension,&amp;nbsp;a Danish pension fund,&amp;nbsp;has&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://www.microsoft.com/en/customers/story/1771984633872247465-ap-pension-azure-databricks-insurance-en-denmark" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;consolidated decades of fragmented data using Microsoft Fabric&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, enabling a unified, governed analytics platform for actuarial, finance, and development teams. With Fabric,&amp;nbsp;they’ve&amp;nbsp;built a centralized medallion architecture, automated data delivery via APIs, and supported real-time write-back from Power BI through SQL Databases — all with strong governance and security baked in.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H5 aria-level="4"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;G&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;eneral&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;&amp;nbsp;availability of SQL Server 2025&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:80,&amp;quot;335559739&amp;quot;:40,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H5&gt;
&lt;P&gt;&lt;A href="https://aka.ms/sqlserver2025blog" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;SQL Server 2025&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;is&amp;nbsp;now generally available following an outstanding preview with&amp;nbsp;10,000&amp;nbsp;participating&amp;nbsp;organizations,&amp;nbsp;double the&amp;nbsp;download rates of&amp;nbsp;SQL Server&amp;nbsp;2022, and more than one million databases created so far.&amp;nbsp;Built on SQL Server’s&amp;nbsp;foundation of&amp;nbsp;trusted&amp;nbsp;security, performance and availability, SQL Server 2025 redefines&amp;nbsp;what's&amp;nbsp;possible for enterprise data. With built-in AI and developer-first enhancements, SQL Server 2025 empowers customers to accelerate AI innovation using the data they already have, securely and at scale, all within SQL Server using the familiar T-SQL language.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H3 aria-level="3"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;AI at the core&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:160,&amp;quot;335559739&amp;quot;:80,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H3&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;We&amp;nbsp;believe that AI is a force multiplier for the data platform itself, so&amp;nbsp;every Azure database is&amp;nbsp;deeply embedded&amp;nbsp;with AI capabilities&amp;nbsp;transforming them&amp;nbsp;from passive data&amp;nbsp;stores&amp;nbsp;into active, intelligent&amp;nbsp;engines&amp;nbsp;for AI-powered applications.&amp;nbsp;Azure databases are built&amp;nbsp;to understand, reason and act, which translates to faster, more&amp;nbsp;accurate&amp;nbsp;search, smarter recommendations, streamlined developer&amp;nbsp;workflows, and the ability to power agentic and generative AI workloads without friction.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H5 aria-level="4"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;Azure SQL&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;&amp;nbsp;integrates&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;DiskANN&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:80,&amp;quot;335559739&amp;quot;:40}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H5&gt;
&lt;P&gt;&lt;A href="https://www.microsoft.com/en-us/research/project/project-akupara-approximate-nearest-neighbor-search-for-large-scale-semantic-search/?msockid=0798d296edf1695b0555c40aec006846" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;DiskANN&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;,&amp;nbsp;Microsoft’s cutting-edge vector search algorithm,&amp;nbsp;is now natively integrated into&amp;nbsp;SQL Server 2025,&amp;nbsp;Azure&amp;nbsp;SQL Database and&amp;nbsp;Azure&amp;nbsp;SQL Managed Instance.&amp;nbsp;DiskANN&amp;nbsp;delivers fast, scalable, and highly&amp;nbsp;accurate&amp;nbsp;approximate nearest neighbor (ANN) search, handling millions to billions of vectors with low latency and high recall.&amp;nbsp;This enables developers to build intelligent, AI-powered applications more efficiently directly within the database engine,&amp;nbsp;eliminating&amp;nbsp;the need for external vector databases and simplifying&amp;nbsp;the&amp;nbsp;architecture for future AI-native apps.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H5 aria-level="4"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;Azure&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;Cosmos&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;DB&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;supercharged by AI&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:80,&amp;quot;335559739&amp;quot;:40}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H5&gt;
&lt;P&gt;&lt;A href="https://devblogs.microsoft.com/cosmosdb/" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Azure&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Cosmos&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;DB&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;continues to evolve as the backbone for AI-powered, globally distributed applications.&amp;nbsp;At Ignite,&amp;nbsp;we’re&amp;nbsp;introducing a new wave of enhancements that make vector search, text retrieval, and semantic relevance faster, more intuitive, and more intelligent for modern AI workloads.&amp;nbsp;One of the biggest improvements comes from advancements in Azure Cosmos DB vector search powered by&amp;nbsp;DiskANN. The latest engine optimizations significantly boost throughput and reduce latency for vector insert and update operations.&amp;nbsp;Additional&amp;nbsp;enhancements include:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="7" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;General availability of&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://aka.ms/Ignite25/CosmosDBSearch" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;fuzzy search in Azure Cosmos DB full-text search&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, which&amp;nbsp;enables more flexible text matching.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="7" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="2" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;General availability of&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://aka.ms/ignite25/cosmosfleets" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Azure Cosmos DB Fleets&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;,&amp;nbsp;allowing&amp;nbsp;multi-tenant apps to share throughput capacity across multiple database accounts while&amp;nbsp;maintaining&amp;nbsp;full security and performance isolation for their tenants.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="7" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="3" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Public preview of&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://aka.ms/ignite25/cosmosfleets" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;fleet analytics&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;that&amp;nbsp;provide&amp;nbsp;insights for multi-tenant workload optimization and growth planning.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;H5 aria-level="4"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 4"&gt;Azure Database for PostgreSQL optimizes developer experiences&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:80,&amp;quot;335559739&amp;quot;:40,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H5&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;We’ve also made improvements to Azure Database for PostgreSQL to help developers streamline their workflows and build and scale next-gen AI solutions faster and with confidence. The &lt;/SPAN&gt;&lt;A href="https://aka.ms/ignite-25-pgsql-dev" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;PostgreSQL extension for Visual Studio (VS) Code&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, now generally available, seamlessly unifies DBA and developer workflows for PostgreSQL databases—on Azure or anywhere. The improved extension, which already reached more than 250K installs in preview, gives developers a familiar, productive environment to work with PostgreSQL, complete with Microsoft Entra ID authentication and GitHub Copilot AI assistance for SQL coding. Azure Database for PostgreSQL is also now natively integrated with &lt;/SPAN&gt;&lt;A href="https://aka.ms/ignite-25-pgsql-dev" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Microsoft Foundry&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, enabling developers to build intelligent, secure AI apps and agents with minimal friction.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H3 aria-level="3"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;Grounded in&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;&amp;nbsp;security and trust&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:160,&amp;quot;335559739&amp;quot;:80,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H3&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Innovation&amp;nbsp;shouldn’t&amp;nbsp;come at the expense of security.&amp;nbsp;A unified data platform&amp;nbsp;should&amp;nbsp;have end-to-end governance and security built in for enterprise-grade&amp;nbsp;resilience&amp;nbsp;as you build&amp;nbsp;what’s&amp;nbsp;next.&amp;nbsp;At&amp;nbsp;Microsoft, we&amp;nbsp;continue to deliver&amp;nbsp;a secure cloud environment for your data with features like&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/purview/data-governance-overview" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Microsoft Purview for data governance&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;and unified identity and access controls across&amp;nbsp;the&amp;nbsp;entire&amp;nbsp;Azure&amp;nbsp;data estate.&amp;nbsp;&amp;nbsp;Most recently,&amp;nbsp;we’ve&amp;nbsp;announced:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="·" data-font="Symbol" data-listid="4" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;·&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;A href="https://github.com/Azure-Samples/Access-token-refresh-samples/tree/main" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Access token refresh with Entra ID&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;&amp;nbsp;for Azure Database for PostgreSQL&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;,&amp;nbsp;which enables database connections using Microsoft Entra ID credentials to automatically renew tokens,&amp;nbsp;eliminating&amp;nbsp;disruptions&amp;nbsp;and ensuring strong identity-based security without added complexity.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="·" data-font="Symbol" data-listid="4" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;·&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="2" data-aria-level="1"&gt;&lt;A href="https://techcommunity.microsoft.com/blog/adforpostgresql/announcing-azure-confidential-computing-for-azure-database-for-postgresql-ga/4454893" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Confidential Compute&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;&amp;nbsp;in Azure Database for PostgreSQL&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;,&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;which&amp;nbsp;provides&amp;nbsp;access to Confidential Virtual Machines (CVMs) to protect&amp;nbsp;sensitive&amp;nbsp;data&amp;nbsp;even&amp;nbsp;during processing.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
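&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;To give a feel for the token-refresh pattern in the first announcement, here is a minimal, illustrative sketch in Python. The fetch_token callable is a hypothetical stand-in for a real credential call (for example, azure-identity’s DefaultAzureCredential); see the linked sample for the actual implementation.&lt;/SPAN&gt;&lt;/P&gt;

```python
import time

# Illustrative sketch: cache an access token and renew it shortly before it
# expires, so database connections never present a stale credential.
# fetch_token is a hypothetical stand-in returning (token, expires_at);
# a real app would call something like azure-identity's DefaultAzureCredential.
class TokenCache:
    def __init__(self, fetch_token, refresh_margin=300):
        self._fetch = fetch_token      # callable returning (token, expires_at)
        self._margin = refresh_margin  # renew this many seconds before expiry
        self._token = None
        self._expires_at = 0.0

    def get(self, now=None):
        now = time.time() if now is None else now
        # Renew when the token is missing or inside the refresh margin.
        if self._token is None or now >= self._expires_at - self._margin:
            self._token, self._expires_at = self._fetch()
        return self._token
```

&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;A connection pool would call get() when opening each connection, so identity-based authentication stays valid without interrupting the application.&lt;/SPAN&gt;&lt;/P&gt;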
&lt;H2 aria-level="2"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 2"&gt;The next frontier of data is here&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:160,&amp;quot;335559739&amp;quot;:80,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H2&gt;
&lt;P&gt;&lt;A href="https://www.microsoft.com/en/customers/story/19769-bmw-ag-azure-app-service" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;BMW&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;has already begun to embrace the next frontier of data. They modernized their Mobile Data Recorder (MDR) system on Azure to deploy multi-agent AI that enables their engineers to instantly analyze telemetry data. Azure&amp;nbsp;Cosmos DB&amp;nbsp;provides persistent storage for chat conversations and memory, while Azure Database for PostgreSQL supports structured telemetry analysis and feedback mechanisms. Both integrate seamlessly with Microsoft Foundry Agent Service, which BMW leverages to orchestrate specialized agents. This solution helped them deliver insights 12x faster, embed AI-driven workflows into daily engineering, and accelerate innovation across their global operations.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;As someone&amp;nbsp;who’s&amp;nbsp;seen data technology&amp;nbsp;evolve,&amp;nbsp;I’m&amp;nbsp;excited&amp;nbsp;about how the&amp;nbsp;latest&amp;nbsp;capabilities and integrations across the&amp;nbsp;Azure database portfolio&amp;nbsp;simplify&amp;nbsp;architectures while opening new doors.&amp;nbsp;They&amp;nbsp;represent&amp;nbsp;Microsoft’s commitment to helping customers innovate with a unified, intelligent data estate.&amp;nbsp;The organizations that lead in the AI era will be those that have their data house in order; our goal at&amp;nbsp;Microsoft&amp;nbsp;is to give you the keys to that house.&amp;nbsp;With a unified data platform,&amp;nbsp;you’re&amp;nbsp;not just solving today’s problems—you’re&amp;nbsp;building a foundation for endless&amp;nbsp;innovation.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Join us online or in person at&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://ignite.microsoft.com/en-US/home" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Microsoft Ignite, November 18–21, 2025&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;,&amp;nbsp;to&amp;nbsp;see these announcements in action and get insights on building your&amp;nbsp;own&amp;nbsp;future-ready data strategy.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:276}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
      <pubDate>Tue, 18 Nov 2025 15:59:48 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/the-new-frontier-of-data-for-the-next-generation-of-innovation/ba-p/4463236</guid>
      <dc:creator>Shireesh_Thota</dc:creator>
      <dc:date>2025-11-18T15:59:48Z</dc:date>
    </item>
    <item>
      <title>Building Intelligent AI Apps with Microsoft Databases</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/building-intelligent-ai-apps-with-microsoft-databases/ba-p/4413833</link>
      <description>&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;We’re living in a transformative era—where AI is no longer a futuristic vision, but an urgent, present-day imperative. Organizations across every industry are racing to integrate intelligent solutions that can streamline operations, unlock insights, and deliver deeply personalized experiences. But here’s the truth: AI readiness doesn’t start with the model—it starts with the data. And for enterprises, having the right data infrastructure in place isn’t just helpful, it’s mission critical.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;That’s where Azure databases come in. At Microsoft, we understand that to be truly AI-ready, organizations need a data foundation that’s built on trust, scale, and resilience, so they can innovate boldly, securely, and at scale. Modern AI solutions—from copilots to autonomous agents—thrive on data at scale. And not just traditional structured data like customer records or transactions, but massive volumes of unstructured data: documents, images, logs, conversations, sensor outputs, and more.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Today's announcements highlight our commitment to helping customers fully unlock the potential of their data, building upon a strong foundation for using AI to create new possibilities and opportunities.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Modernize your data for AI Readiness&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Modernizing your data is a critical first step in making it more accessible, reliable and usable for AI app development. SQL developers love the scalability and agility they can achieve in the cloud, and we’ve brought many of those cloud benefits on premises with SQL Server 2025. The release has already resonated, drawing more than 3,400 applicants to our private preview program, with adoption running twice as fast as it did for SQL Server 2022. We’re excited to announce &lt;/SPAN&gt;&lt;A href="https://aka.ms/Build/sql2025blog" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;public preview of SQL Server 2025&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, our most significant release of SQL Server in the last decade.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN style="display: block; text-align: center;"&gt; &lt;IMG style="width: 80%; display: block; margin-left: auto; margin-right: auto;" src="http://cdn.techcommunity.microsoft.com/assets/SQLServer/MSFT-2503_Azure_Cosmos_DB_SQL_Server_2025_GIF.gif" alt="" /&gt; &lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;SQL Server 2025 empowers developers to build modern AI applications using their own data. Built on a foundation known for best-in-class security and performance, SQL Server 2025 will accelerate time-to-market for your new applications with features designed to boost developer productivity while keeping your data safe. Use this release to:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI data-leveltext="·" data-font="Symbol" data-listid="14" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;·&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Boost search intelligence using advanced semantic search alongside full-text search and filtering, allowing you to run generative AI models of your choice using your own data.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="·" data-font="Symbol" data-listid="12" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;·&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Process and manage data flows more simply and efficiently using native JSON support, a built-in REST API, and Change Event Streaming for real-time data updates.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="·" data-font="Symbol" data-listid="12" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;·&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="2" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Leverage the most secure database to improve credential management and reduce potential vulnerabilities with support for Microsoft Entra managed identities through Azure Arc.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="·" data-font="Symbol" data-listid="11" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;·&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Increase workload uptime and improve concurrency for SQL Server applications with enhanced query optimization, optimized locking and improved failover reliability.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="·" data-font="Symbol" data-listid="11" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;·&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="2" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Achieve zero-ETL, real-time analytics by replicating SQL Server data to Microsoft OneLake with Fabric database mirroring.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
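&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;The hybrid-search idea from the list above (semantic similarity combined with keyword filtering) can be illustrated outside the database. The sketch below uses tiny hand-made vectors and pure Python in place of model-generated embeddings and SQL Server 2025’s built-in vector search; it shows the shape of the technique, not the actual T-SQL.&lt;/SPAN&gt;&lt;/P&gt;

```python
import math

# Toy illustration of hybrid search: filter documents by keyword first,
# then rank the survivors by cosine similarity to a query embedding.
# The vectors are tiny hand-made stand-ins for real embeddings.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def hybrid_search(docs, query_vec, keyword, top_k=2):
    # Keyword filter narrows the candidate set (like a WHERE clause) ...
    candidates = [d for d in docs if keyword in d["text"].lower()]
    # ... then semantic ranking orders whatever remains.
    candidates.sort(key=lambda d: cosine(d["vec"], query_vec), reverse=True)
    return [d["id"] for d in candidates[:top_k]]
```

&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In the database, the keyword step would be full-text search or an ordinary predicate and the ranking step a vector-index lookup; combining the two is what keeps generative AI answers both relevant and precise.&lt;/SPAN&gt;&lt;/P&gt;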
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;See &lt;/SPAN&gt;&lt;A href="https://aka.ms/Build/sql2025video" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;SQL Server 2025 in action&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; and learn how you can &lt;/SPAN&gt;&lt;A href="https://aka.ms/sql2025" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;get started&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; today.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;P class="lia-align-center"&gt;&lt;SPAN data-contrast="auto"&gt;“We are excited about the future of SQL Server 2025 and the transformative improvements it brings to database performance and management.”&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P class="lia-align-center"&gt;&lt;SPAN data-contrast="auto"&gt;Madhab Paudel, Database Engineer, Entain&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;SQL Server Management Studio (SSMS) is the go-to tool for developers and database administrators managing, configuring and querying SQL Server databases. Today we’re taking a leap forward with the general availability of&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://aka.ms/ssms-ga-release" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;SQL Server Management Studio 21&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, now built on Visual Studio 2022 and including 64-bit support, Git integration, an enhanced user interface and more.&amp;nbsp; We’re further enhancing the user experience with the preview of &lt;/SPAN&gt;&lt;A href="https://aka.ms/copilot-ssms-release" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Copilot in SSMS 21&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, providing an AI-powered assistant to help with writing, editing and fixing T-SQL queries using natural language.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Visual Studio (VS) Code has become a favorite among developers for its lightweight design, powerful extensions, and seamless support for a wide range of programming languages and workflows. Today we’re announcing the preview of the &lt;/SPAN&gt;&lt;A href="https://aka.ms/vscode-mssql-copilot" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;MSSQL extension for Visual Studio Code with GitHub Copilot&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; capabilities. GitHub Copilot brings natural language assistance to the extension, helping developers explore schemas, optimize queries with intelligent suggestions, and streamline database interactions—all within VS Code.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;For mission-critical workloads, performance and availability are non-negotiable.&amp;nbsp; We’re excited to announce the general availability of two &lt;/SPAN&gt;&lt;A href="https://aka.ms/hsenhancements" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;major performance upgrades&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; in Azure SQL Database Hyperscale. Continuous priming now ensures faster failover recovery by proactively warming up secondary replicas, minimizing downtime and boosting resilience. Additionally, the log generation rate has increased to 150 MiB/s, enabling significantly faster data ingestion and improved performance for write-heavy workloads. These enhancements make Hyperscale even more powerful for critical applications.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;P class="lia-align-center"&gt;&lt;SPAN data-contrast="auto"&gt;“When we process new data, we have a massive amount that needs to go out immediately. Azure SQL Database Hyperscale was the only tier that allowed us to do a massive data push without affecting the user experience in our platform.”&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P class="lia-align-center"&gt;&lt;SPAN data-contrast="auto"&gt;Marco De Sanctis, Chief Technology Officer, Mondra Global Limited&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Modern application development demands speed and flexibility, and JSON enables developers to model complex, evolving data without rigid schemas. We’re happy to announce that the&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/sql/t-sql/data-types/json-data-type?view=azuresqldb-current" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;native JSON type and JSON aggregates&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; in Azure SQL Database are generally available, enabling more efficient reads, writes and storage. &lt;/SPAN&gt;&lt;A href="https://aka.ms/json-index" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;JSON indexing&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, now in preview for Azure SQL Database, allows for more efficient searching of JSON documents using JSON functions and relational operators.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
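&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;As a rough illustration of what a JSON aggregate produces, the Python sketch below collapses relational rows into one JSON document per group, similar in spirit to aggregating with GROUP BY on the server; see the linked documentation for the actual T-SQL syntax and functions.&lt;/SPAN&gt;&lt;/P&gt;

```python
import json
from collections import defaultdict

# Plain-Python sketch of what a JSON aggregate does server-side: collapse
# relational (order_id, item, qty) rows into one JSON document per order,
# ready to store in a native json column. The row shape here is invented
# for illustration only.
def rows_to_json(rows):
    grouped = defaultdict(list)
    for order_id, item, qty in rows:
        grouped[order_id].append({"item": item, "qty": qty})
    return {oid: json.dumps(items, separators=(",", ":"))
            for oid, items in grouped.items()}
```

&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Doing this in the database avoids shuttling rows to the application just to reshape them, which is where the efficiency gains of the native type and aggregates come from.&lt;/SPAN&gt;&lt;/P&gt;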
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Experience open flexibility with Azure scale&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Open source databases play a critical role in shaping the future of intelligent applications, with PostgreSQL being the most popular according to &lt;/SPAN&gt;&lt;A href="https://survey.stackoverflow.co/2024/technology#1-databases" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Stack Overflow’s 2024 Developer Survey&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;. Microsoft proudly supports open source, and our recent investments into both the open-source Postgres project and our cloud services underscore our commitment to making the developer experience better than ever, using AI to shape how apps are built.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Today, we’re announcing the preview of a new and improved &lt;/SPAN&gt;&lt;A href="https://aka.ms/PostgreSQL-VScode-extension" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;PostgreSQL extension for Visual Studio (VS) Code with GitHub Copilot&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt; &lt;SPAN data-contrast="auto"&gt;capabilities. With this powerful tool, you can connect to PostgreSQL database instances, run queries, create and manage connection profiles, take advantage of Entra ID support, and more—all within VS Code. Designed to boost your productivity and streamline development, the extension lets you chat with GitHub Copilot for PostgreSQL-aware context and AI assistance, and it supports deployments in Docker, on premises, or on Azure.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;For AI app developers, database speed and availability are critical to ensuring AI models can access and process data quickly. We are pleased to announce that &lt;/SPAN&gt;&lt;A href="https://aka.ms/pg-diskann-blog" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;DiskANN&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, one of the fastest vector indexing algorithms on the market, is now generally available on Azure Database for PostgreSQL. With this release, developers can leverage the new product quantization optimizations in DiskANN to build high-performance, low-latency and scalable generative AI applications that outperform pgvector index types. Also coming soon, you can enable &lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/azure/postgresql/flexible-server/concepts-storage#premium-ssd-v2-preview" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;high availability (HA) with Azure Premium SSD v2 disks&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; while deploying Azure Database for PostgreSQL flexible server. We’ve rearchitected HA for SSD v2, and most workloads will typically see less performance impact with high availability on SSD v2 than on SSD v1.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
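&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Product quantization, the optimization mentioned above, compresses vectors by splitting them into sub-vectors and storing only the index of the nearest codebook centroid for each part. The toy sketch below shows the encode/decode round trip with fixed, hand-picked codebooks; DiskANN’s real implementation learns codebooks from the data and pairs them with a graph index.&lt;/SPAN&gt;&lt;/P&gt;

```python
# Toy sketch of product quantization: split each vector into sub-vectors and
# replace each part with the index of its nearest codebook centroid, so a
# long float vector is stored as a few small integers. The codebooks here
# are fixed and hand-picked purely for illustration.
def nearest(centroids, sub):
    # Index of the centroid with smallest squared distance to the sub-vector.
    dists = [sum((c - s) ** 2 for c, s in zip(cent, sub)) for cent in centroids]
    return dists.index(min(dists))

def pq_encode(vec, codebooks):
    # codebooks[i] holds the centroids for the i-th sub-vector.
    step = len(vec) // len(codebooks)
    return [nearest(codebooks[i], vec[i * step:(i + 1) * step])
            for i in range(len(codebooks))]

def pq_decode(codes, codebooks):
    # Reconstruct an approximation of the original vector from the codes.
    out = []
    for code, book in zip(codes, codebooks):
        out.extend(book[code])
    return out
```

&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Because distances can be computed against the compact codes instead of the full vectors, the index fits more data in memory, which is what makes the technique attractive for high-performance vector search.&lt;/SPAN&gt;&lt;/P&gt;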
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;P class="lia-align-center"&gt;&lt;SPAN data-contrast="auto"&gt;“Azure Database for PostgreSQL gives us greater performance capabilities and flexibility, enabling smaller teams to do more—especially when managing large numbers of databases.”&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P class="lia-align-center"&gt;&lt;SPAN data-contrast="auto"&gt;Mike Jasperson, VP, IoT Software Operations and Support, PTC&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Vector search is vital for generative AI but often misses semantic relationships in enterprise data. We’re introducing generative AI-powered reasoning in PostgreSQL with the preview of&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://aka.ms/semantic-postgres" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;semantic operators in Azure Database for PostgreSQL&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;.&amp;nbsp; By adding semantic understanding to the levels of operational data that are not visible to vector search, customers can gain deeper insights from their data, allowing applications to reason in ways that were not possible before. &lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Build no-compromise SaaS apps that scale&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;AI applications thrive on real-time, reliable, and always accessible data, making enterprise-grade availability and global scale foundational to their success. For multi-tenant SaaS platforms delivering intelligent experiences, even brief data outages or latency can disrupt AI model performance and degrade user trust. Azure Cosmos DB provides the distributed architecture, multi-region replication, low latency, and elastic scalability needed to meet these demands.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Today we’re announcing the public preview of &lt;/SPAN&gt;&lt;A href="https://aka.ms/cosmosdb-preview-dynamic-routing" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;per partition automatic failover&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; for Azure Cosmos DB. This new capability intelligently routes requests at a more granular partition level during localized or regional outages, helping you achieve active-active 5’9s availability for any workload type, any consistency level from eventual to strong. You’ll benefit from stronger, built-in resiliency, reduced downtime, and less operational overhead—helping to ensure that your applications remain consistently available and reliable. &lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;We’re further optimizing queries on Azure Cosmos DB with a &lt;/SPAN&gt;&lt;A href="https://aka.ms/cosmos-db-global-secondary-index" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;global secondary index&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, now in public preview. Previously known as materialized views, global secondary indexes are read-only containers that automatically sync data from a source container. They have an independent partition key, data model, and index policy, allowing you to fine-tune them for any query pattern. And now, you can create index containers using Azure portal.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Managing multi-tenant SaaS apps is complex—balancing diverse customer needs with performance, scalability, and security. With &lt;/SPAN&gt;&lt;A href="https://aka.ms/AzureCosmosDBFleet" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Azure Cosmos DB fleets&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, now in preview, you can streamline operations by pooling throughput (RU/s) across accounts, reducing overprovisioning while maintaining tenant-level isolation and security. It’s a simpler, more efficient way to scale and manage Azure Cosmos DB resources for multi-tenant scenarios.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Managing user access across tenants quickly becomes complex and error-prone without centralized identity control.&amp;nbsp; Today we announced the &lt;/SPAN&gt;&lt;A href="https://aka.ms/build25/cosmosdb/mongodb-vcore-entra-id" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;general availability of &lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Microsoft Entra ID authentication for &lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Azure Cosmos DB for MongoDB (vCore)&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, enhancing security and simplifying identity management. With this update, you can now add Entra ID accounts directly to your MongoDB vCore clusters and use them for secure database access.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Azure AI Foundry simplifies project management and deployments while integrating advanced AI capabilities. Developers can now use Azure Cosmos DB accounts to power AI solutions within the platform. Using the Azure AI Foundry SDK, customers can now securely store conversation threads between users and AI agents in Azure Cosmos DB accounts, enabling agents to recall and resume previous discussions. &lt;/SPAN&gt;&lt;A href="https://aka.ms/Build25/CosmosDBAIFoundry" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Thread&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;s&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt; storage&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; is now generally available. Also, developers will soon have the capability to utilize data stored in their Azure Cosmos DB accounts for powering AI solutions in Azure AI Foundry. Customers will be able to connect and access their Azure Cosmos DB data using Azure AI Foundry within application code.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;We’re also announcing the public preview of &lt;/SPAN&gt;&lt;A href="https://aka.ms/ADB-AIFoundry" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="auto"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Azure AI Foundry connection for Azure Databricks&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, enabling Foundry Agents to use AI/BI Genie and run Azure Databricks Jobs. This can enhance knowledge retrieval and broadens how Foundry Agents deliver contextual answers grounded in enterprise data. Foundry Agents can now reason across Microsoft’s entire ecosystem, including Microsoft 365, Dynamics 365, Microsoft Fabric, and Azure Databricks— providing a robust foundation for building intelligent AI applications.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;We’re making several announcements focused on enhancing the search capabilities of your applications.&amp;nbsp; Now generally available, &lt;/SPAN&gt;&lt;A href="https://aka.ms/Build25/CosmosDBFTS" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;native full-text and hybrid search&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; capabilities mean you can use Azure Cosmos DB for NoSQL to efficiently create sophisticated applications and GenAI solutions, reducing complexity and costs compared to using external search services.  We’re continuing to invest here with the previews of &lt;/SPAN&gt;&lt;A href="https://aka.ms/CosmosDB/FTSGA" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;fuzzy text search&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; and &lt;/SPAN&gt;&lt;A href="https://aka.ms/CosmosDB/FTSGA" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;full-text search and scoring in multiple languages&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; in Azure Cosmos DB for NoSQL.&amp;nbsp; These new features enable developers to deliver more flexible and forgiving search experiences alongside full-text indexing and BM25-based ranking across a growing set of languages.&amp;nbsp; Rounding out our search enhancements, &lt;/SPAN&gt;&lt;A href="https://aka.ms/CosmosDB/FTSGA" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;phrase search&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; is now generally available and makes your full-text search more precise and powerful by matching entire phrases rather than just individual keywords.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN 
data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
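As a rough sketch of what these search capabilities look like in practice, the helpers below compose a full-text query and a hybrid (RRF) query for Azure Cosmos DB for NoSQL. The system-function names (`FullTextContains`, `FullTextScore`, `VectorDistance`, `RRF`) follow the documented query syntax, but the container properties (`c.title`, `c.embedding`) and the parameter name are illustrative assumptions; executing the queries requires a container with full-text and vector indexing policies:

```python
# Sketch: composing full-text and hybrid queries for Azure Cosmos DB for
# NoSQL. Property names are illustrative; run the strings with the
# azure-cosmos SDK against a suitably indexed container.

def full_text_query(term: str, top: int = 10) -> str:
    """Keyword filter using the FullTextContains system function."""
    return (
        f"SELECT TOP {top} c.id, c.title FROM c "
        f"WHERE FullTextContains(c.title, '{term}')"
    )

def hybrid_query(term: str, top: int = 10) -> str:
    """Fuse BM25-style text ranking with vector similarity via RRF."""
    return (
        f"SELECT TOP {top} c.id, c.title FROM c "
        f"ORDER BY RANK RRF(FullTextScore(c.title, '{term}'), "
        f"VectorDistance(c.embedding, @queryVector))"
    )

print(full_text_query("invoice"))
print(hybrid_query("invoice"))
```

The hybrid form is what lets a single Cosmos DB query replace an external search service: text relevance and vector similarity are fused server-side rather than merged in application code.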
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;P class="lia-align-center"&gt;&lt;SPAN data-contrast="auto"&gt;“We found Cosmos DB built-in hybrid search to be 50-60 times cheaper than other search services we evaluated, providing us performance and resiliency Docusign needs.”&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P class="lia-align-center"&gt;&lt;SPAN data-contrast="auto"&gt;Kunal Mukerjee, VP of AI Technology Strategy, Docusign&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Finally,&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://aka.ms/diskANNonMongovCoreDoc" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;DiskANN&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; is now generally available in vCore-based Azure Cosmos DB for MongoDB, bringing powerful vector search capabilities directly into the database. With support for up to 4,000 dimensions, you can efficiently search across high-dimensional data like text embeddings and images—enabling low-latency, high-accuracy AI experiences without the need for a separate vector store.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;P class="lia-align-center"&gt;&lt;SPAN data-contrast="auto"&gt;“With our virtual assistant in particular, we have experienced the higher scale, higher availability, and faster time to market of using Azure Cosmos DB.”&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P class="lia-align-center"&gt;&lt;SPAN data-contrast="auto"&gt;Viju Chacko, Head of Digital Architecture, Air India&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Unify your data estate on Microsoft Fabric&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;With Microsoft Fabric, our vision is to simplify how you interact with data by bringing together all the tools you need on a single platform. &lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;We're expanding that vision with the unification of data estates with the preview of &lt;/SPAN&gt;&lt;A href="https://aka.ms/FabricCosmosDBNoSQLBlog" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Cosmos DB (NoSQL) in Fabric&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;. Based on Azure Cosmos DB, Cosmos DB in Fabric brings the same enterprise-grade dynamic scalability for unstructured data types that the world’s largest retailers and large-scale AI applications rely on today. Developers can now deploy Cosmos DB in Fabric in just a few clicks to build high-performance, distributed applications with ease.&amp;nbsp; You can &lt;/SPAN&gt;&lt;A href="https://aka.ms/fabric-cosmosdb" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;try Cosmos DB in Fabric today&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Organizations are under pressure to unify fragmented data estates and deliver insights faster than ever.&amp;nbsp; Whether you're modernizing on-premises systems or building AI-powered applications, mirroring into Microsoft Fabric simplifies integration, enhances resilience, and accelerates time-to-insight across your entire data estate.&amp;nbsp; We recently announced the preview of &lt;/SPAN&gt;&lt;A href="https://techcommunity.microsoft.com/blog/adforpostgresql/announcing-mirroring-for-azure-database-for-postgresql-in-microsoft-fabric-for-p/4396750" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;mirroring for Azure Database for PostgreSQL&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; and &lt;/SPAN&gt;&lt;A href="https://blog.fabric.microsoft.com/en/blog/supporting-database-mirroring-sources-behind-a-firewall?ft=Onelake:category" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Azure SQL Database&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt; behind a firewall&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;. 
We are now excited to announce &lt;/SPAN&gt;&lt;A href="https://aka.ms/IntroMirroringSQL" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;mirroring &lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;for all in-market &lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;versions of &lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;SQL Server&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; from SQL Server 2016 to SQL Server 2022. And alongside the preview announcement for SQL Server 2025, we’re also previewing mirroring for SQL Server 2025 in Fabric.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Azure databases are the foundation for AI innovation&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Scalable, resilient, and secure data is the cornerstone of breakthrough AI innovation and Azure databases are built with the enterprise-grade performance modern apps demand.&amp;nbsp;&amp;nbsp; Whether you're supporting millions of users across a global SaaS platform or training intelligent models on real-time data, Azure’s fully managed database services provide the foundation to move fast without compromising trust or uptime. With built-in high availability, advanced threat protection, and elastic scalability, Azure databases empower organizations to innovate confidently and deliver intelligent, data-driven experiences at enterprise scale.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Join us at &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/home" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Microsoft Build&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt; from May 19 to 22, 2025 to see all these announcements in action across the following sessions:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="16" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="1" data-aria-level="1"&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/BRK197" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;BRK&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt; 197&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;: Architecting highly resilient applications in Azure&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="16" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="2" data-aria-level="1"&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/BRK202?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;BRK 202&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;: Scale and secure MongoDB-compatible apps with Azure Cosmos DB&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="16" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="3" data-aria-level="1"&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/BRK203?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;BRK203&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;: Get faster LLM responses and low app latency with Azure Managed Redis&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="16" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="4" data-aria-level="1"&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/BRK204?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;BRK 204:&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt; What's New in Microsoft Databases: Empowering AI-Driven App Dev&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="16" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="5" data-aria-level="1"&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/BRK206?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;BRK206&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;: &lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Microsoft Fabric for Developers: Build Scalable Data &amp;amp; AI Solutions&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="16" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="6" data-aria-level="1"&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/BRK207?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;BRK207&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;: SQL Server 2025: The Database Developer Reimagined&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="16" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="7" data-aria-level="1"&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/BRK210?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;BRK210&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;: Build AI apps and unlock the power of your data with Azure Databricks&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="16" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="8" data-aria-level="1"&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/BRK211?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;BRK211&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;: Building advanced agentic apps with PostgreSQL on Azure&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="16" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="9" data-aria-level="1"&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/BRK212?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;BRK212&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;: Design scalable data layers for multi-tenant apps with Azure Cosmos DB&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="16" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="10" data-aria-level="1"&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/BRK213?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;BRK213:&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt; Enable Advanced AI Scenarios with Unified Data Estates in Microsoft Fabric&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="16" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="11" data-aria-level="1"&gt;&lt;A class="lia-external-url" href="https://build.microsoft.com/en-US/sessions/COMM410" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;COMM410&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240,&amp;quot;335559740&amp;quot;:240}"&gt;: &lt;SPAN data-teams="true"&gt;The Future of Databases &amp;amp; Developers: An AMA with Azure Database Leadership&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 20 May 2025 18:23:22 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/building-intelligent-ai-apps-with-microsoft-databases/ba-p/4413833</guid>
      <dc:creator>Shireesh_Thota</dc:creator>
      <dc:date>2025-05-20T18:23:22Z</dc:date>
    </item>
    <item>
      <title>Fuel AI Innovation with Microsoft Databases</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/fuel-ai-innovation-with-microsoft-databases/ba-p/4303476</link>
      <description>&lt;P&gt;If data is the fuel that powers AI, then AI is only as good as the data behind it. Right now, it’s never been more important to have a strong data analytics and management foundation in place.&lt;/P&gt;
&lt;P&gt;To help our customers achieve real transformation with AI, we’ve invested heavily across the Microsoft databases portfolio. That includes developing a comprehensive vision for our databases focused on one goal: enabling you to build the next generation of intelligent applications.&lt;/P&gt;
&lt;P&gt;This week at Microsoft Ignite 2024, you’ll hear about how Azure helps you create fast, secure, and scalable applications powered by the latest advances in AI. Let me share a quick summary of our top database announcements.&lt;/P&gt;
&lt;H2&gt;Delivering the best enterprise databases&lt;/H2&gt;
&lt;P&gt;At the heart of this vision is our commitment to providing the best enterprise databases, with a strong emphasis on reliability, resiliency, and security and the performance and availability needed to support modern, intelligent applications.&lt;/P&gt;
&lt;P&gt;We’re proud of the ground-to-cloud flexibility we offer for your workloads, and that’s especially true when it comes to our SQL databases—they’re enterprise-ready and built on the same proven, industry-leading SQL engine, so you have a consistent SQL experience whether you’re on-premises or in the cloud. The latest release, &lt;A href="https://aka.ms/ignite24/sql2025" target="_blank" rel="noopener"&gt;SQL Server 2025&lt;/A&gt;, is now in private preview and features built-in AI to simplify intelligent application development and RAG patterns.&lt;/P&gt;
&lt;P&gt;We also announced the general availability of &lt;A href="https://aka.ms/sqlmipools-ga" target="_blank" rel="noopener"&gt;instance pools&lt;/A&gt; in Azure SQL Managed Instance. Instance pools let you provision small, cost-effective 2-vCore instances within a pre-provisioned pool, helping you right-size your workloads when migrating or modernizing in the cloud.&lt;/P&gt;
&lt;P&gt;In our flagship NoSQL database Azure Cosmos DB, &lt;A class="lia-external-url" href="https://devblogs.microsoft.com/cosmosdb/announcing-the-ga-of-dynamic-scaling-per-region-and-per-partition-autoscale/" target="_blank" rel="noopener"&gt;dynamic autoscaling&lt;/A&gt; is now generally available, providing cost optimization for nonuniform workloads. With dynamic autoscaling, partitions and regions scale independently so you can scale all the data your AI applications are using in the most cost-efficient way.&lt;/P&gt;
&lt;P&gt;We’ve also invested in our fully managed open-source databases. This includes &lt;A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/adformysql/ignite-2024-new-innovations-in-azure-database-for-mysql/4295842" target="_blank" rel="noopener" data-lia-auto-title="new features" data-lia-auto-title-active="0"&gt;new features &lt;/A&gt;in Azure Database for MySQL such as zonal resiliency by default, which will be generally available in December 2024. This feature helps you ensure seamless server recovery and business continuity in the face of zonal outages. Also coming to public preview in December, you can use Azure Migrate to discover MySQL instances and their attributes within your environment, assess their readiness for migration, and obtain recommendations on suitable compute and storage options.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In addition, we’ve made recent investments in Azure Database for PostgreSQL, including the public preview of &lt;A href="https://aka.ms/preview-elastic-clusters" target="_blank" rel="noopener"&gt;elastic clusters&lt;/A&gt; on Azure Database for PostgreSQL – Flexible Server. This enables horizontal scaling through row-based and schema-based sharding, making it easy to build multitenant apps by offloading shard management and operations—such as tenant isolation, split, or online rebalancing of shards—to the service.&lt;/P&gt;
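To make the sharding idea concrete, here is a minimal Python sketch of row-based tenant routing under stated assumptions: the hash-and-modulo helper is hypothetical and illustrates only the co-location principle, not the service's actual shard placement or rebalancing logic.

```python
import hashlib

def shard_for_tenant(tenant_id: str, num_shards: int) -> int:
    """Row-based sharding sketch: hash a tenant key to a shard number,
    so all of a tenant's rows land on the same shard. Hypothetical
    helper for intuition only, not the managed service's logic."""
    digest = hashlib.sha256(tenant_id.encode("utf-8")).hexdigest()
    return int(digest, 16) % num_shards

# The same tenant key always routes to the same shard.
print(shard_for_tenant("contoso", 4) == shard_for_tenant("contoso", 4))  # True
```

Because the mapping is deterministic, tenant-scoped queries touch a single shard; rebalancing (which the managed service handles for you) amounts to changing this mapping and moving the affected rows.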
&lt;H2&gt;Unlocking the power of AI with SaaS-ified databases&lt;/H2&gt;
&lt;P&gt;Built on a SaaS foundation, &lt;A href="https://aka.ms/Ignite24/blog/Fabric" target="_blank" rel="noopener"&gt;Fabric Databases&lt;/A&gt; are a new class of cloud databases that bring together transactional and analytical workloads to create a truly unified data platform.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Now in preview, &lt;A href="https://aka.ms/announcingsqlfabric" target="_blank" rel="noopener"&gt;SQL database&lt;/A&gt; is the first database engine to come to Fabric, enabling customers to:&amp;nbsp;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Build intelligent AI applications faster with built-in vector search, RAG support, and Azure AI integration.&lt;/LI&gt;
&lt;LI&gt;Boost productivity with auto-optimizing and auto-scaling databases.&lt;/LI&gt;
&lt;LI&gt;Accelerate innovation with Copilot assistance.&lt;/LI&gt;
&lt;LI&gt;Support CI/CD using GitHub integration for source control.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;Ready to give SQL database in Fabric a try? Starting December 3rd, you can join live sessions with database experts and the Microsoft product team and see just how easy it is to get started. View the schedule and register for the series &lt;A class="lia-external-url" href="https://developer.microsoft.com/reactor/series/S-1431?ocid=ignite24_azdata" target="_blank" rel="noopener"&gt;here&lt;/A&gt;.&amp;nbsp; You can also &lt;A href="https://aka.ms/fabcon-vegas" target="_blank" rel="noopener"&gt;register today&lt;/A&gt; to join us from March 29 to April 3, 2025, at the Microsoft Fabric Community Conference in Las Vegas, Nevada to learn more.&lt;/P&gt;
&lt;P&gt;Rounding out our Fabric Databases news, we’re also pleased to announce the public preview of &lt;A class="lia-external-url" href="https://www.microsoft.com/security/blog/2024/09/25/activate-your-data-responsibly-in-the-era-of-ai-with-microsoft-purview/?msockid=044d759b6a406e2c1d8761ba6e4068eb" target="_blank" rel="noopener"&gt;Fabric integration with Microsoft Purview Information Protection&lt;/A&gt;, extending the benefits of central, policy-based governance to Fabric items, including the new SQL database in Fabric.&lt;/P&gt;
&lt;H2&gt;Meeting the needs of modern AI developers&lt;/H2&gt;
&lt;P&gt;Beyond a SaaS-ified experience, we also want to provide the best databases for AI developers. We have a lot of exciting developments to share that demonstrate our commitment to building an ecosystem that’s integrated with Azure AI services and tool support to make building AI applications even easier.&lt;/P&gt;
&lt;P&gt;Let’s start with a recent innovation in Azure SQL Database and now in SQL database in Fabric – native&amp;nbsp;&lt;A href="https://devblogs.microsoft.com/azure-sql/exciting-announcement-public-preview-of-native-vector-support-in-azure-sql-database/" target="_blank" rel="noopener"&gt;vector support&lt;/A&gt;. We’re excited to announce the public preview of a &lt;STRONG&gt;vector data type&lt;/STRONG&gt; that gives developers the ability to handle vector data, which is foundational when it comes to building scalable AI-enabled applications. This announcement also includes essential &lt;STRONG&gt;new vector functions&lt;/STRONG&gt;, like VECTOR_DISTANCE, VECTOR_NORM and VECTOR_NORMALIZE to support advanced operations, particularly when the embedding model does not return normalized vectors.&amp;nbsp; And, we’re working hard to make vector indexing on Azure SQL even faster with DiskANN, coming in the future. DiskANN is one of the fastest vector indexing algorithms on the market, and its performance and reliability characteristics are a great fit for our customers’ demanding requirements.&lt;/P&gt;
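As a rough illustration of the math behind these functions, the Python sketch below mirrors what a cosine VECTOR_DISTANCE and a VECTOR_NORMALIZE evaluate; it is a plain re-implementation for intuition, not the T-SQL engine's code.

```python
import math

def vector_distance_cosine(a, b):
    """Cosine distance = 1 - cosine similarity, the quantity a cosine
    VECTOR_DISTANCE call evaluates (illustrative re-implementation)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / (norm_a * norm_b)

def vector_normalize(a):
    """Scale a vector to unit length, as VECTOR_NORMALIZE would."""
    n = math.sqrt(sum(x * x for x in a))
    return [x / n for x in a]

# Same direction -> distance 0; orthogonal -> distance 1.
print(vector_distance_cosine([1.0, 0.0], [2.0, 0.0]))  # 0.0
print(vector_distance_cosine([1.0, 0.0], [0.0, 1.0]))  # 1.0
```

Normalizing embeddings up front (as the note about models that do not return normalized vectors suggests) makes the cosine computation reduce to a dot product, which is why functions like VECTOR_NORM and VECTOR_NORMALIZE are useful companions to the distance function.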
&lt;P&gt;While Azure Cosmos DB is already fueling some of the most powerful AI applications on the market, we’re committed to making it even better. We're making a number of &lt;A class="lia-external-url" href="https://devblogs.microsoft.com/cosmosdb/new-vector-search-full-text-search-and-hybrid-search-features-in-azure-cosmos-db-for-nosql/" target="_blank" rel="noopener"&gt;announcements&lt;/A&gt; to boost performance even further, including the general availability of the DiskANN vector index in Azure Cosmos DB for NoSQL. We’re also announcing two new search features available in public preview: full-text search, which enables efficient text searches and text-based ranking with BM25, and hybrid search, which combines the benefits of semantic vector search with the text-based relevance of BM25. These new capabilities help retrieve the most accurate data from the database to power generative AI applications.&lt;/P&gt;
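To illustrate how a hybrid search can combine the two signals, here is a small Python sketch that fuses a hypothetical vector-search ranking with a hypothetical BM25 ranking using reciprocal rank fusion; the fusion constant k=60 is a common convention and an assumption here, and the service's actual scoring may differ.

```python
def rrf_fuse(rankings, k=60):
    """Reciprocal rank fusion: merge several ranked result lists into
    one. Each document scores sum(1 / (k + rank)) over every list it
    appears in, rewarding items ranked highly by either signal."""
    scores = {}
    for ranked in rankings:
        for rank, doc_id in enumerate(ranked, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

vector_hits = ["d3", "d1", "d7"]   # hypothetical semantic-search order
bm25_hits   = ["d3", "d9", "d1"]   # hypothetical keyword-search order
print(rrf_fuse([vector_hits, bm25_hits]))  # ['d3', 'd1', 'd9', 'd7']
```

Note how d3, ranked first by both signals, tops the fused list, while documents found by only one signal still surface further down.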
&lt;P&gt;Azure Database for PostgreSQL is optimized for AI developers, and we continue to expand its capabilities with the addition of &lt;A href="https://techcommunity.microsoft.com/blog/adforpostgresql/introducing-diskann-vector-index-in-azure-database-for-postgresql/4261192" target="_blank" rel="noopener"&gt;DiskANN&lt;/A&gt;, now available in preview. The new &lt;A href="https://aka.ms/pg-ranker" target="_blank" rel="noopener"&gt;Semantic Ranker Solution Accelerator&lt;/A&gt;, now generally available, provides automated deployment scripts that can be used to provision a semantic ranker model as an Azure Machine Learning inference endpoint. We also introduced graph processing capabilities within Azure Database for PostgreSQL with the &lt;A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/adforpostgresql/introducing-support-for-graph-data-in-azure-database-for-postgresql-preview/4275628" target="_blank" rel="noopener" data-lia-auto-title="Apache AGE graph extension" data-lia-auto-title-active="0"&gt;Apache AGE graph extension&lt;/A&gt; and enhanced the accuracy of your generative AI applications with a solution accelerator that integrates &lt;A href="https://aka.ms/pg-graphrag" target="_blank" rel="noopener"&gt;GraphRAG&lt;/A&gt; with PostgreSQL graph query capabilities. Finally, automatic parameter tuning is coming soon to Azure Database for PostgreSQL: by using machine learning to optimize workload parameters, it will deliver significantly higher performance and scalability for your AI applications.&lt;/P&gt;
&lt;P&gt;We also have some exciting enhancements for MySQL developers. Azure Database for MySQL now supports the &lt;A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/adformysql/ignite-2024-new-innovations-in-azure-database-for-mysql/4295842" target="_blank" rel="noopener" data-lia-auto-title="MySQL 9.1 Innovation release" data-lia-auto-title-active="0"&gt;MySQL 9.1 Innovation release&lt;/A&gt;, which includes exciting new capabilities, such as JavaScript for stored procedures and vector datatype support, expanding your options for application development and advanced data processing.&lt;/P&gt;
&lt;H2&gt;Integrating your data estate&lt;/H2&gt;
&lt;P&gt;Microsoft Fabric is the foundation of an integrated data estate, bringing together everything from data science, engineering, and warehousing to real-time intelligence and operational databases in one environment. Mirroring provides a modern way of accessing and ingesting data continuously and seamlessly from any database into Microsoft OneLake in Fabric. This capability is now generally available for &lt;A class="lia-external-url" href="https://blog.fabric.microsoft.com/en-us/blog/announcing-mirroring-azure-sql-database-in-fabric-now-generally-available-ga" target="_blank" rel="noopener"&gt;Azure SQL Database&lt;/A&gt; and in preview for &lt;A class="lia-internal-link lia-internal-url lia-internal-url-content-type-blog" href="https://techcommunity.microsoft.com/blog/azuresqlblog/fabric-mirroring-for-azure-sql-managed-instance-now-in-public-preview/4290837?previewMessage=true" target="_blank" rel="noopener" data-lia-auto-title="Azure SQL Managed Instance" data-lia-auto-title-active="0"&gt;Azure SQL Managed Instance&lt;/A&gt;. Mirroring is also in private preview for SQL Server 2016-2022, and you can sign up to participate &lt;A class="lia-external-url" href="https://aka.ms/mirroring/sql-server-prpr-signup" target="_blank" rel="noopener"&gt;here&lt;/A&gt;.&lt;/P&gt;
&lt;H2&gt;The future of databases is now&lt;/H2&gt;
&lt;P&gt;Microsoft databases are empowering developers with essential tools to create groundbreaking AI applications. Whether you’re a large enterprise or a startup building your first application, Microsoft databases provide the performance, security, and reliability to help you make the most of your investments—and unlock valuable insights that drive business growth. Embracing these technologies today sets the stage for a more innovative, efficient, and secure tomorrow.&lt;/P&gt;</description>
      <pubDate>Tue, 19 Nov 2024 17:00:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/fuel-ai-innovation-with-microsoft-databases/ba-p/4303476</guid>
      <dc:creator>Shireesh_Thota</dc:creator>
      <dc:date>2024-11-19T17:00:00Z</dc:date>
    </item>
    <item>
      <title>Data for now and next: Powering a new era of AI apps</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/data-for-now-and-next-powering-a-new-era-of-ai-apps/ba-p/4146832</link>
      <description>&lt;P&gt;In this era of AI, the power to transform human interactions and propel businesses forward lies in the hands of organizations that are ready to learn, adapt and scale fast. And as business applications become more critical to engaging with customers, to optimizing processes and team productivity, and to driving innovation, the role of developers is more important than ever.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;To help set the foundation for innovating with AI, Microsoft is introducing a new way of managing data at any scale, anywhere: your on-premises investments and experiences extend to the cloud, while managed services do more on your behalf with modernized workloads for the next stage of growth. By harnessing Azure databases’ fully managed capabilities and integration with Microsoft proprietary and third-party solutions, developers can focus on what really matters: INNOVATION.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Azure cloud scale databases are designed to support any type and size of application, and direct integration into cloud scale analytics and AI helps accelerate actionable insights, while simplified and intelligent security solutions protect your data across all layers. Always updated and secure, Azure databases help streamline modern, intelligent application development, empowering developers to build incredible digital experiences with agility and ease.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;As AI applications become more mainstream, seamless database management is paramount. Trusted solutions that can scale limitlessly and autonomously, respond fast, and offer unparalleled flexibility and reliability will shape the future of coding.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Azure cloud scale databases go beyond simplifying database operations – they are built and tuned for transformative AI innovation.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;This week at Microsoft Build 2024, you’ll hear about breakthroughs in generative AI, building Copilots, securing applications and more. Let me offer you a glimpse of what’s in store for Azure databases and what sessions you won’t want to miss.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Building future-ready, scalable AI apps&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Azure’s vision for cloud-scale data is grounded in fueling AI innovation. This week, we are introducing important features that help developers build and transform modern, intelligent applications. Organizations looking to innovate with AI can now rely on native vector search capabilities in Azure Cosmos DB for NoSQL. With semantic search powered by DiskANN, a high-performance approximate nearest neighbor algorithm, Azure Cosmos DB becomes the first cloud database to offer ultra-low latency at scale with a built-in vector database.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Microsoft is also announcing the general availability of the &lt;A href="https://aka.ms/Build24/Blog/PostgreSQLAI" target="_blank" rel="noopener"&gt;Azure Database for PostgreSQL Azure AI extension&lt;/A&gt; to help unlock innovation with AI. Using this extension, developers can connect PostgreSQL in Azure directly to Azure OpenAI embedding models, helping to simplify development of RAG apps using data in PostgreSQL.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Azure Database for PostgreSQL is also the industry’s first cloud database to provide &lt;A href="https://aka.ms/Build24/Blog/PGembedding" target="_blank" rel="noopener"&gt;local embedding&lt;/A&gt; creation directly in the database server using a SQL interface. Vector embeddings enable developers to determine the similarity between text strings and are foundational for building generative AI apps. Local embedding generation is now in public preview and offers single-digit-millisecond latency, no throttling from remote embedding model APIs, and predictable costs, all while ensuring data remains within the security perimeter of your Postgres server.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://aka.ms/Build24/Blog/MySQL" target="_blank" rel="noopener"&gt;New Accelerated Logs in Azure Database for MySQL,&lt;/A&gt; now generally available, are designed to boost the performance of crucial workloads within the Business-Critical service tier. This dynamic solution offers a substantial increase in throughput, a significant reduction in latency, and optimized cost efficiency.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Want to learn more about these announcements? Be sure to check out these sessions:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;BRK167:&amp;nbsp;&amp;nbsp;&lt;A href="https://build.microsoft.com/en-US/sessions/e8c67bb9-4b32-43fe-8739-8b6769537111?source=sessions" target="_blank" rel="noopener"&gt;Power the next generation of AI apps with databases at scale anywhere&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;BRK160:&amp;nbsp;&amp;nbsp;&lt;A href="https://build.microsoft.com/en-US/sessions/69adc44f-78d4-460e-a85d-c35b4e7848d0?source=sessions" target="_blank" rel="noopener"&gt;Accelerate insights with real-time Azure SQL data and Microsoft Fabric&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;BRK166:&amp;nbsp;&amp;nbsp;&lt;A href="https://build.microsoft.com/en-US/sessions/b7e44f6e-ae57-4ef2-82b7-ff217ee14870?source=sessions" target="_blank" rel="noopener"&gt;Power AI apps and develop rich experiences with Azure SQL Database&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;BRK168:&amp;nbsp;&amp;nbsp;&lt;A href="https://build.microsoft.com/en-US/sessions/1c6a934d-ecee-4544-8ee7-491a04e4ede7?source=sessions" target="_blank" rel="noopener"&gt;Transform applications with AI and Azure Database for PostgreSQL&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;DEM730: &lt;A href="https://build.microsoft.com/en-US/sessions/130500dd-7e67-4d41-8964-33b19a2c8678?source=sessions" target="_blank" rel="noopener"&gt;Build an AI-powered app with PostgreSQL within 10 mins&lt;/A&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Simplifying application development &lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Microsoft is also applying AI directly in the database so you can benefit from assisted experiences that let you focus more on your application. This week at Microsoft Build, Azure Database is introducing, in public preview, &lt;A href="https://Aka.ms/Build24/HeroBlog/PoweringAI" target="_blank" rel="noopener"&gt;Copilot in Azure capabilities for Azure SQL Database&lt;/A&gt;. These AI-assisted experiences provide contextual self-help for management and operation of your database, including complex performance tuning scenarios, saving you time and effort. In addition, you can use natural language to create complex SQL queries based on your database schema, making database interactions more intuitive.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://aka.ms/Build24/Blog/CopilotMySQL" target="_blank" rel="noopener"&gt;Microsoft Copilot in Azure also extends capabilities to Azure Database for MySQL&lt;/A&gt;, which answers questions and troubleshoots conversationally by drawing on Microsoft Learn content and best practices. It empowers application developers with self-guided assistance to solve challenges without browsing through +100 public documentation. The new skills also enable database administrators to independently manage databases and resolve issues faster with Copilot assistance.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Attend these sessions to learn more:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;BRK171:&amp;nbsp;&amp;nbsp;&lt;A href="https://build.microsoft.com/en-US/sessions/7389d059-8f7e-4dd6-aa02-107e617df03d?source=sessions" target="_blank" rel="noopener"&gt;The power of AI and Copilot for Azure Databases&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;DEM735: &lt;A href="https://build.microsoft.com/en-US/sessions/8b969459-3c90-4bfb-8200-dda0425ee7ed?source=sessions" target="_blank" rel="noopener"&gt;Industry leading performance with Azure Database for MySQL&lt;/A&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Come see how Microsoft customers are powering apps with Azure Databases&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;We are happy to be joined on stage by organizations that have transformed the way they manage and innovate with data using Azure Databases. Attend one of the sessions below to see how TomTom is using Azure Cosmos DB to empower more modern, intelligent cars, how Scandinavian Airlines embarked on a modernization journey to Azure SQL, and how Kinectfy built multi-tenant SaaS apps with Azure Cosmos DB.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;BRK121:&amp;nbsp;&lt;A href="https://build.microsoft.com/en-US/sessions/516a7e95-e830-4c8b-a1e1-97708a340a8f?source=sessions" target="_blank" rel="noopener"&gt;TomTom brings AI-powered, talking cars to life with Azure&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;BRK161: &lt;A href="https://build.microsoft.com/en-US/sessions/76776c46-f59b-4982-b24c-dfd11465ee20?source=sessions" target="_blank" rel="noopener"&gt;Design and build multi-tenant SaaS apps at scale with Azure Cosmos DB&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;BRK170: &lt;A href="https://build.microsoft.com/en-US/sessions/f95c3e86-74de-4a31-b29b-a03d99a08424?source=sessions" target="_blank" rel="noopener"&gt;Vision to value - SAS accelerates modernization at scale with Azure&lt;/A&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Come chat with our data experts &lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;If you’re attending Build this week, stop by the Expert Zone and meet our experts! This is your chance to ask questions, share feedback, and explore the topics that matter most.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://azure.microsoft.com/en-us/products/category/databases/" target="_blank" rel="noopener"&gt;Learn more about Azure Databases&lt;/A&gt; and the&amp;nbsp;check below the journey that data goes on as it travels through the &lt;STRONG&gt;Microsoft Intelligent Data Platform.&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;div data-video-id="https://www.youtube.com/watch?v=vWrqIfP2etc&amp;amp;feature=youtu.be" data-video-remote-vid="https://www.youtube.com/watch?v=vWrqIfP2etc&amp;amp;feature=youtu.be" class="lia-video-container lia-media-is-center lia-media-size-small"&gt;&lt;iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FvWrqIfP2etc%3Ffeature%3Doembed&amp;amp;display_name=YouTube&amp;amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DvWrqIfP2etc&amp;amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FvWrqIfP2etc%2Fhqdefault.jpg&amp;amp;key=b0d40caa4f094c68be7c29880b16f56e&amp;amp;type=text%2Fhtml&amp;amp;schema=youtube" allowfullscreen="" style="max-width: 100%"&gt;&lt;/iframe&gt;&lt;/div&gt;&lt;/STRONG&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 21 May 2024 15:30:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/data-for-now-and-next-powering-a-new-era-of-ai-apps/ba-p/4146832</guid>
      <dc:creator>Shireesh_Thota</dc:creator>
      <dc:date>2024-05-21T15:30:00Z</dc:date>
    </item>
    <item>
      <title>Announcing the General Availability of AMD-based Confidential VMs on Azure Databricks</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/announcing-the-general-availability-of-amd-based-confidential/ba-p/3980969</link>
      <description>&lt;P&gt;We are excited to announce the general availability of AMD-based confidential virtual machines (VMs) for cluster nodes on Azure Databricks. Confidential VMs are part of the&amp;nbsp;&lt;A href="https://aka.ms/azurecc" target="_blank" rel="noopener"&gt;Azure confidential computing&lt;/A&gt;&amp;nbsp;(ACC) portfolio&amp;nbsp;and provide a trusted execution environment (TEE) for Azure Databricks clusters, protecting data while in use in memory. It is important to note that Azure already encrypts &lt;U&gt;data at rest&lt;/U&gt; and &lt;U&gt;in transit&lt;/U&gt;, and the introduction of confidential VMs provides an additional layer of security for sensitive &lt;U&gt;data in use&lt;/U&gt;, helping organizations meet compliance requirements and protect their most valuable data.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;By using Azure confidential computing on Azure Databricks, you gain the capability to encrypt your data end-to-end. This is valuable not only for confidential workloads but also for any scenario where you need to protect highly sensitive data residing in memory and prevent unauthorized access or tampering. The solution also supports &lt;A href="https://learn.microsoft.com/en-us/azure/key-vault/managed-hsm/overview" target="_blank" rel="noopener"&gt;Azure Managed HSM&lt;/A&gt;, a hardware security module that allows the customer to manage their own encryption keys for data at-rest, in-use, and in-transit.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;To use&amp;nbsp;&lt;A href="https://learn.microsoft.com/en-us/azure/confidential-computing/virtual-machine-solutions-amd" target="_blank" rel="noopener"&gt;confidential VMs&lt;/A&gt;&amp;nbsp;on Azure Databricks, customers need to select one of the confidential VM types when creating a cluster. This type of cluster can then be used for any workload that requires the protection of highly sensitive data in memory.&amp;nbsp;&lt;/P&gt;
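As a sketch of what that selection might look like through the Databricks Clusters API, the payload below pins a confidential VM size via node_type_id; the specific SKU (Standard_DC4as_v5), Spark version, and worker count are illustrative assumptions, so check availability in your region before use.

```python
# Hypothetical cluster spec for the Databricks Clusters "create" API.
# The VM size below (Standard_DC4as_v5, an AMD SEV-SNP confidential
# compute SKU) and the Spark version are illustrative assumptions.
confidential_cluster_spec = {
    "cluster_name": "confidential-analytics",
    "spark_version": "13.3.x-scala2.12",
    "node_type_id": "Standard_DC4as_v5",        # confidential VM workers
    "driver_node_type_id": "Standard_DC4as_v5",  # confidential VM driver
    "num_workers": 2,
}

# Choosing a confidential VM node type is what opts the cluster into a
# TEE: every node runs on SEV-SNP hardware, so data stays encrypted
# even while in use in memory.
print(confidential_cluster_spec["node_type_id"])
```

Beyond the node type, the workload itself needs no changes; any notebook or job attached to such a cluster gains the in-memory protection automatically.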
&lt;P&gt;For compute-optimized needs,&amp;nbsp;&lt;A href="https://techcommunity.microsoft.com/t5/azure-confidential-computing/azure-confidential-vms-using-sev-snp-dcasv5-ecasv5-are-now/ba-p/3573747" target="_blank" rel="noopener"&gt;DCasv5 confidential VMs&lt;/A&gt;&amp;nbsp;are available, and for memory-optimized needs, &lt;A href="https://techcommunity.microsoft.com/t5/azure-confidential-computing/azure-confidential-vms-using-sev-snp-dcasv5-ecasv5-are-now/ba-p/3573747" target="_blank" rel="noopener"&gt;ECasv5 confidential VMs&lt;/A&gt; can be used. These VMs are currently available in the following regions: East US, West US, North Europe, West Europe, Southeast Asia, Central India, East Asia, Switzerland North, Japan East and Italy North, and coming to additional regions soon.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;FONT size="4"&gt;&lt;SPAN&gt;&lt;STRONG&gt;Databricks partnership in Confidential Computing:&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE class=" lia-align-left" style="width: 900px; border-style: hidden;" width="900"&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD width="101"&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="271"&gt;
&lt;P&gt;&lt;SPAN&gt;"We are thrilled to have collaborated with Microsoft to introduce Azure Databricks support for Azure confidential computing,” said David Meyer, Senior Vice President of Product, Databricks. “With the Databricks Lakehouse, customers can build an end-to-end data platform with increased confidentiality and privacy by encrypting data in use thanks to AMD-based confidential virtual machines.”&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&lt;SPAN&gt;&lt;STRONG&gt;&amp;nbsp;&lt;div data-video-id="https://www.youtube.com/watch?v=SHnnUut_LBA" data-video-remote-vid="https://www.youtube.com/watch?v=SHnnUut_LBA" class="lia-video-container lia-media-is-center lia-media-size-large"&gt;&lt;iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FSHnnUut_LBA%3Ffeature%3Doembed&amp;amp;display_name=YouTube&amp;amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DSHnnUut_LBA&amp;amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FSHnnUut_LBA%2Fhqdefault.jpg&amp;amp;key=fad07bfa4bd747d3bdea27e17b533c0e&amp;amp;type=text%2Fhtml&amp;amp;schema=youtube" allowfullscreen="" style="max-width: 100%"&gt;&lt;/iframe&gt;&lt;/div&gt;&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;&lt;STRONG&gt;&amp;nbsp;&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;/P&gt;
&lt;TABLE class=" lia-align-left" style="width: 900px; border-style: hidden;"&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD width="101"&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="271"&gt;
&lt;P&gt;“Azure Databricks with confidential computing is our first choice for the robust protection of confidential customer data across multiple industries.&lt;/P&gt;
&lt;P&gt;Our successful collaboration with Microsoft and Databricks enables our customers not only to unlock significant value from their data.&amp;nbsp;It also emphasizes data privacy and ownership throughout the large-scale data analysis of sensitive information.”&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Tune into &lt;/STRONG&gt;&lt;A href="https://ignite.microsoft.com/en-US/sessions?search=confidential+computing&amp;amp;sortBy=relevance" target="_blank" rel="noopener"&gt;&lt;STRONG&gt;Microsoft Ignite&lt;/STRONG&gt;&lt;/A&gt;&lt;STRONG&gt; this week to learn more about the recent innovations with Azure confidential computing on Azure Databricks.&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Learn more: &lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Watch the demo: &lt;A href="https://aka.ms/ADB-ACC-demo" target="_blank" rel="noopener"&gt;Azure Databricks on Confidential VMs Overview and Demo&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;Read the documentation: &lt;A href="https://aka.ms/CVM-ADB-docs" target="_blank" rel="noopener"&gt;https://aka.ms/CVM-ADB-docs&lt;/A&gt;.&lt;/P&gt;</description>
      <pubDate>Wed, 15 Nov 2023 16:00:01 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/announcing-the-general-availability-of-amd-based-confidential/ba-p/3980969</guid>
      <dc:creator>saira_shariff</dc:creator>
      <dc:date>2023-11-15T16:00:01Z</dc:date>
    </item>
    <item>
      <title>A Data Professional's Guide to Build 2023</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/a-data-professional-s-guide-to-build-2023/ba-p/3827800</link>
      <description>&lt;P&gt;&lt;img /&gt;&lt;SPAN data-contrast="none"&gt;ChatGPT and Microsoft Copilot are stealing headlines worldwide—but data is the force that drives them both. &lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;That’s why this year at &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/home" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Microsoft Build&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, which begins on May 23&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;rd&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;, we’ll have lots of data and analytics-focused live segments, breakout sessions, Learn Live workshops, and more to help you learn about the latest advancements in data tech. This guide outlines all &lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;the data content&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt; at the event this year so you can get the most out of Build—and harness the newest insights to propel your organization forward.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On &lt;/SPAN&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Day 1&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-contrast="auto"&gt;, join our very own Satya Nadella for an opening keynote that explores the next frontier of tech: AI. Imagine a not-so-distant future powered by data-driven AI copilots to learn how Microsoft is exploring the next generation of apps and tools. On &lt;/SPAN&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Day 2&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-contrast="auto"&gt;, join Rajesh Jha and many others to discover how developers like you can leverage data to create the Microsoft 365 apps, Windows, and Microsoft Graph solutions of tomorrow. And if you’re joining us in person, close out Build on &lt;/SPAN&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Day 3 &lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-contrast="auto"&gt;with Mark Russinovich and Scott Hanselman to learn how to use data and AI to develop apps from scratch.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;And of course, there’s lots of events on data and analytics. Check out our complete list of on-demand and live technical sessions below and click the links to get the full details on each session’s homepage. For the full agenda and more information, check out the &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/home" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Microsoft Build&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; homepage—just make sure to register now to get the full attendee experience, like participating in live workshops, Q&amp;amp;A sessions, and more.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;I&gt;&lt;SPAN data-contrast="auto"&gt;(Please note: schedules and speakers are subject to change.)&lt;/SPAN&gt;&lt;/I&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Core sessions and breakout sessions&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;TABLE data-tablestyle="MsoTableGrid" data-tablelook="1184" aria-rowcount="9"&gt;
&lt;TBODY&gt;
&lt;TR aria-rowindex="1"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Date&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="89px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Type&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Start, PDT&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="406px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Session name&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="333px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Speaker(s)&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="2"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="89px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital, in person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;9:00 AM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="406px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Keynote:&lt;/SPAN&gt;&lt;/STRONG&gt; &lt;A href="https://build.microsoft.com/en-US/sessions/49e81029-20f0-485b-b641-73b7f9622656?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Microsoft Build Opening&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="333px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Satya Nadella, Yusuf Mehdi&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="3"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="89px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital, in person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;9:25 AM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="406px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Keynote:&lt;/SPAN&gt;&lt;/STRONG&gt; &lt;A href="https://build.microsoft.com/en-US/sessions/bb8f9d99-0c47-404f-8212-a85fffd3a59d?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;The era of the AI Copilot&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="333px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Kevin Scott, Greg Brockman&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="4"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="89px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital, in person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;10:05 AM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="406px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Keynote:&lt;/SPAN&gt;&lt;/STRONG&gt; &lt;A href="https://build.microsoft.com/en-US/sessions/b1445158-aff5-44e8-b336-e1550770f028?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Next generation AI for developers with the Microsoft Cloud&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="333px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Scott Guthrie, Thomas Dohmke, Sarah Bird, Seth Juarez&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="5"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="89px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital, in person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;9:00 AM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="406px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Keynote:&lt;/SPAN&gt;&lt;/STRONG&gt; &lt;A href="https://build.microsoft.com/en-US/sessions/8aab36d1-d27d-46dd-81ec-eb3f49cfee6a?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Shaping the future of work with AI&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="333px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Rajesh Jha, Panos Panay, Yina Arenas, Steven Bathiche, Cassie Breviu, Pavan Davuluri, Wamwitha Love, Shilpa Ranganathan, Archana Saseetharan&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="6"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/25/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="89px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;12:00 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="406px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Keynote:&lt;/SPAN&gt;&lt;/STRONG&gt; &lt;A href="https://build.microsoft.com/en-US/sessions/e9568a77-7cf7-451e-a14b-a347313b2494?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Scott and Mark learn to code&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="333px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Mark Russinovich, Scott Hanselman&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="7"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="89px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital, in person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;4:00 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="406px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Breakout session: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/852ccf38-b07d-4ddc-a9fe-2e57bdaeb613?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Analytics in the age of AI&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="333px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Arun Ulagaratchagan, Amir Netz, Patrick Baumgartner, Justyna Lucznik&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="8"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/25&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="89px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital, in person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;1:30 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="406px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Breakout session: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/6bea806c-624c-44d3-9e53-513b22eff3e4?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Build scalable, cloud-native apps with Kubernetes and Azure Cosmos DB&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="333px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Mark Brown, &lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;&lt;BR /&gt;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Pavneet Ahluwalia, Fabiano Fernandes&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="9"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/25/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="89px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;9:30 AM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="406px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Breakout session: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/a6bf2750-7af3-4e72-ba5c-6c68554f4311?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Increase developer velocity with Azure SQL Database, from data to API&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="333px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Davide Mauri&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;On demand&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE data-tablestyle="MsoTableGrid" data-tablelook="1184" aria-rowcount="11"&gt;
&lt;TBODY&gt;
&lt;TR aria-rowindex="1"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Date &amp;amp; Type&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Session name&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Speaker(s)&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="2"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand (digital)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/0ea5599b-e4f3-42d2-a103-6b008ee8c05c?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Using Spark to accelerate your lakehouse architecture&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Justyna Lucznik&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="3"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand (digital)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/423f41d4-815f-4744-bac0-53d121321cfb?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Accelerate your potential with Microsoft&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Swetha Manepalli&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="4"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand (digital)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/8b23c96e-7c35-463d-88b4-564d23dc14a5?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Empower every BI professional to do more with data&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Christian Wade, Zoe Douglas&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="5"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand (digital)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/aed14ea3-3356-4d15-92c7-2c24efd55f7c?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Modernize your enterprise data warehouse and generate value from data&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Priya Sathy&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="6"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand (digital)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/b35ec76d-c6ee-4c71-ad28-7b3210a7f6be?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Secure, govern, and manage your data at scale, in Power BI and Synapse&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Arthi Ramasubramanian Iyer, Adi Regev, Anton Fritz&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="7"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand (digital)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/d3bec9f4-b706-40f6-85b6-47e5b1194716?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Sense, analyze, and generate insights with real-time analytics&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;James Hutton, Kevin Lam, Tzvia Gitlin&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="8"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand (digital)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/e066311e-4232-49b2-9066-615a184bf749?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Modernize your data integration to enable petabyte scale analytics&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Wee Hyong Tok, Shireen Bahadur&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="9"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand (digital)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/f73d3a46-fa77-49bf-a704-be81e6c15aa2?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Models to outcomes with end-to-end data science workflows&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Nellie Gustafsson&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="10"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand (digital)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/7744f376-a7b9-4fe6-b133-2753ba60b9b3?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Transform productivity with AI experiences for analytics&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Arun Ulagaratchagan, Patrick Baumgartner&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="11"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand (digital)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;On-demand&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/62c4bdea-bee7-45dc-b4ff-730d0ecaa061?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Eliminate data siloes with the next gen data lake&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Joshua Caplan, Adi Regev&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Join lively discussions and product round tables after each breakout session to get the exact information you need from the experts themselves.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Discussions&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;TABLE data-tablestyle="MsoTableGrid" data-tablelook="1184" aria-rowcount="9"&gt;
&lt;TBODY&gt;
&lt;TR aria-rowindex="1"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Date&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Type&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Start, PDT&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Session name&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Speaker(s)&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="2"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital, in person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;11:30 AM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Discussion: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/6a8c5273-b43f-4867-adfc-aef849633214?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Build scalable and secure enterprise apps on OSS databases, Q&amp;amp;A&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Dinesh Madhusoodanan, Sunil Agarwal, Charles Feddersen&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="3"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital, in person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;12:30 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Discussion: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/94372ded-9820-4deb-afcf-0a4be8e0692f?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Database capabilities you need for your next awesome app, Q&amp;amp;A&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Andrew Liu, Ken Myers, Yoni Nijs, Nikisha Reyes-Grange&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="4"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital, in person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;1:30 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Discussion: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/d0f2d888-a79d-42f9-a062-f9c334daca9f?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Azure Cosmos DB for MongoDB: Choosing the right architecture, Q&amp;amp;A&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Gahl Levy, Jay Gordon, Thomas Kovarik&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="5"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital, in person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;2:45 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Discussion: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/9ac9e7cd-e615-4a72-afc0-ac01f957c91e?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Modernize your applications on Azure SQL Managed Instance Q&amp;amp;A&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Eric Hudson&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="6"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital, in person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;11:00 AM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Discussion: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/2184783a-5cef-4dc2-818b-01953de121df?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Synapse data warehouse, Q&amp;amp;A&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Priya Sathy, Faisal Mohamood, Bogdan Crivat&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="7"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;12:15 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Discussion: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/c28461cf-52f8-4db3-88c0-2cc13610d6a0?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Distributed NoSQL vs. D-SQL: What to use for your cloud-native app Q&amp;amp;A&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Charles Feddersen, Mark Brown, Estefani Arroyo&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="8"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;2:45 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Discussion: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/ed489151-dfa7-4328-a807-f89b31be6664?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Do more on Azure SQL Database Hyperscale, Q&amp;amp;A&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Morgan Oslake, Arvind Shyamsundar, Aditya Badramraju&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="9"&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital, in person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;4:00 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Discussion: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/1b03f355-d931-4a39-9b1c-dddfe643db36?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;All about data analytics, Q&amp;amp;A&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Nellie Gustafsson, Mark Kromer, Justyna Lucznik, Wee Hyong Tok&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Round Tables&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;TABLE data-tablestyle="MsoTableGrid" data-tablelook="1184" aria-rowcount="5"&gt;
&lt;TBODY&gt;
&lt;TR aria-rowindex="1"&gt;
&lt;TD width="70.9844px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Date&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="61.2188px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Type&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="70px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Start, PDT&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="453px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Session name&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="316px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Speaker(s)&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="2"&gt;
&lt;TD width="70.9844px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="61.2188px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="70px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;1:30 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="453px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Round table: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/c22cf799-21f6-4eac-a059-af75b7c4f2a8?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Azure Synapse Data Explorer and Delta Lake Integration&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="316px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Avner Aharoni, Slavik Neimer, Anshul Sharma&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="3"&gt;
&lt;TD width="70.9844px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="61.2188px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="70px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5:15 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="453px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Round table: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/bded6aec-aa6d-4a8d-9647-6665d1abe411?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Integrate your data with latest Data warehouse and Data Factory capabilities&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="316px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Bogdan Crivat, Miquella de Boer, Sid Jayadevan, Swetha Mannepalli, Mohan Sankaran, Priya Sathy, Wee Hyong Tok&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="4"&gt;
&lt;TD width="70.9844px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="61.2188px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="70px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;11:00 AM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="453px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Round table: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/9c5af76c-17cb-4d02-8ae9-f1b338ce9049?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Real-time analytics with Azure Synapse Data Explorer&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="316px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Uri Barash, Tzivia Gitlin&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="5"&gt;
&lt;TD width="70.9844px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="61.2188px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="70px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5:15 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="453px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Round table: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/5b6b2f31-3022-42e4-8e30-ca25d885fb4c?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Synapse Data Engineering, Data Science &amp;amp; OpenAI Roundtable&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="316px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Nellie Gustafsson, Chris Finlan, Misha Desai, Avinanda Chattapadday, Martin Lee, Piero Morano, Eren Orbey, Santhosh Kumar Ravindran, Raj Rikhy, Narmeen Samad, Alexander van Grootel, Ted Vilutis&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Practice makes perfect – so make sure you get as much of it as you need with Microsoft Build’s labs and demo sessions.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Labs&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;TABLE data-tablestyle="MsoTableGrid" data-tablelook="1184" aria-rowcount="9"&gt;
&lt;TBODY&gt;
&lt;TR aria-rowindex="1"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Date&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Type&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Start, PDT&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="433px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Session name&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="315px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Speaker(s)&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="2"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;1:30 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="433px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Learn Live: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/8036fabb-00b1-4fcf-bb61-6f2cc45c4008?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Model data for Azure Cosmos DB for PostgreSQL&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="315px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Cedric Derue, Gary Hope&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="3"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5:15 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="433px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Lab: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/b89d8d4c-a11c-4912-b705-b2abff11c864?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Democratize your data estate with the latest data warehouse&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt; (session one)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="315px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Priyanka Langade, Peri Rocha, Dharini Sundaram, Kevin Conan, Mariya Ali, Salil Kanade&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="4"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5:15 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="433px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Lab: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/e761cb4f-eaca-4dfb-b0e4-697671eca7fa?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Analytics in the Microsoft Intelligent Data Platform DREAM lab&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt; (session one)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="315px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Sanjay Soni, Thasmika Gokal&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="5"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;1:30 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="433px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Lab: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/05a7371d-c393-4908-ab33-db0d2e2bcf43?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Get started with lakehouses&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="315px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Paul DeCarlo, Graeme Malcolm&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="6"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;2:45 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="433px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Lab: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/aab57b06-0864-451c-95dd-c8132bb01513?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Democratize your data estate with the latest data warehouse&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt; (session two)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="315px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Mariya Ali, Kevin Conan, Salil Kanade, Priyanka Langade, Peri Rocha, Dharini Sundaram&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="7"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;4:00 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="433px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Lab: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/b1e960ac-e00b-486b-af92-a75a952e192c?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Learn how to build a modern data stack in an hour&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="315px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Emily Chen, Miguel Escobar, Estera Kot, Noelle Li, Abhishek Narain, Nikki Waghani&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="8"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Digital&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5:15 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="433px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Learn Live: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/9e4175d4-7549-4bff-b48d-864e9f5441da?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Migrate SQL workloads to Azure Managed Instances&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="315px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Ajith Krishnan, Javier Villegas&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="9"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="74px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5:15 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="433px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Lab: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/180c875c-3821-4d23-96a9-9d3275768e67?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Analytics in the Microsoft Intelligent Data Platform DREAM lab&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt; (session two)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="315px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Sanjay Soni, Thasmika Gokal&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Demos&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;TABLE data-tablestyle="MsoTableGrid" data-tablelook="1184" aria-rowcount="5"&gt;
&lt;TBODY&gt;
&lt;TR aria-rowindex="1"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Date&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Type&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="112px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Start, PDT&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="396px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Session name&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="314px" data-celllook="0"&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Speaker(s)&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="2"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="112px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;11:30 AM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="396px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Demo: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/7e40508e-e8bc-4541-bf36-9828230b6e5d?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Modernize MongoDB workloads with Azure Cosmos DB&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="314px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Gahl Levy&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="3"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/23/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="112px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;11:45 AM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="396px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Demo: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/8e5f1b28-175d-4353-b6f1-7521a448ac90?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Build next gen intelligent retail with Azure OpenAI &amp;amp; Azure Cosmos DB&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="314px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Mark Brown&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="4"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="112px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5:15 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="396px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Demo: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/6ffba46b-2059-4447-a4cb-9b0fd8d5d44c?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Further, faster with Azure Functions and Azure SQL integration&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="314px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Drew Skwiers-Koballa&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR aria-rowindex="5"&gt;
&lt;TD width="71px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5/24/23&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="81px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In person&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="112px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;5:30 PM PDT&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="396px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Demo: &lt;/SPAN&gt;&lt;A href="https://build.microsoft.com/en-US/sessions/5dd06b6d-afcc-4954-b51a-377246fb24ac?source=sessions" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Protect your data from tampering with ledger on Azure SQL&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="314px" data-celllook="0"&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Pieter Vanhove&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Learn More&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Get even more out of Microsoft Build with these additional resources.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="1" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559684&amp;quot;:-2,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="none"&gt;The &lt;/SPAN&gt;&lt;A href="https://aka.ms/buildcsc?ocid=build23_csc_collections_cnl" target="_self"&gt;&lt;SPAN data-contrast="none"&gt;Cloud Skills Challenge&lt;/SPAN&gt;&lt;/A&gt;&amp;nbsp;at Build begins May 23, 2023, at 4:00 PM PDT. Join for your chance to win a free Microsoft certification exam.&lt;/LI&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="1" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559684&amp;quot;:-2,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="2" data-aria-level="1"&gt;&lt;SPAN&gt;Check out Collections, our curated set of training, documentation, certification, and additional resources.&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="1" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559684&amp;quot;:-2,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Explore our data and analytics learning resources and documentation on &lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/training/topics/azure-data-and-ai" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;Microsoft Learn&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, designed to help you upskill and grow in your career.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{&amp;quot;201341983&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:259}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Tue, 23 May 2023 15:00:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/a-data-professional-s-guide-to-build-2023/ba-p/3827800</guid>
      <dc:creator>NatalieM</dc:creator>
      <dc:date>2023-05-23T15:00:00Z</dc:date>
    </item>
    <item>
      <title>Introducing the Microsoft Intelligent Data Platform Partner Ecosystem</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/introducing-the-microsoft-intelligent-data-platform-partner/ba-p/3640279</link>
      <description>&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In May, we introduced the &lt;A href="https://www.microsoft.com/en-us/microsoft-cloud/solutions/intelligent-data-platform?activetab=tabs-1:primaryr13" target="_blank"&gt;Microsoft Intelligent Data Platform&lt;/A&gt;, which deeply integrates our databases, analytics, BI, and data governance products into a unified platform. This platform, which is also integrated with the Microsoft Cloud, enables a seamless experience and intuitive collaboration for Developers, DBAs, Data Engineers, Data Scientists, Business Analysts, and Data Officers. The platform enables customers to “do more with less” by helping them add layers of intelligence to their applications, unlock predictive insights and drive relevant action, and govern their entire data estate.&lt;/P&gt;
&lt;P&gt;The customer response has been very positive, and we have received rich feedback on how we can further the value of this unified platform. A common theme is that customers want the Microsoft Intelligent Data Platform to extend its deep integration to the partner products they use.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Today at &lt;A href="https://ignite.microsoft.com/en-US/sessions/8c7bb7fe-330c-4989-ba62-099bf1fc9599" target="_blank"&gt;Microsoft Ignite&lt;/A&gt;, we announced the launch of a powerful new Partner Ecosystem for the Microsoft Intelligent Data Platform. Together with our partners, we deliver category-leading, cloud-native data and AI solutions integrated with the Microsoft Intelligent Data Platform. These solutions complement the platform’s capabilities to address the diverse scenarios and investments of our customers.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The Microsoft Intelligent Data Platform is the natural choice for independent software vendor (ISV) partners to build and deliver integrated solutions. This is all made possible through the platform’s open and governed foundation.&lt;/P&gt;
&lt;P&gt;The partner ecosystem enables customers across all industries to take advantage of the Microsoft Intelligent Data Platform capabilities. Together with our partners, we’ll build upon decades of experience in helping our customers digitally transform.&lt;/P&gt;
&lt;P&gt;Thanks to partner solutions that span every layer of the Microsoft Intelligent Data Platform, we ensure our wide range of customers get a seamless, enriched, and cost-optimized experience that meets their specific needs.&lt;/P&gt;
&lt;P&gt;The initial set of ISV partners aligned with the Microsoft Intelligent Data Platform partner ecosystem include:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Operational Databases&lt;/STRONG&gt;: &lt;STRONG&gt;MongoDB&lt;/STRONG&gt;, a leading NoSQL database, and &lt;STRONG&gt;Yugabyte&lt;/STRONG&gt;, a cloud-native NoSQL and distributed SQL database, enable integration into the broader enterprise data estate for analytics and ML/AI, and let customers govern their data estates with Microsoft Purview.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Analytics&lt;/STRONG&gt;: These solutions further enhance low-code/no-code experiences to automate building and operating complex, enterprise-grade pipelines, and they can govern data integration and transformation pipelines in Microsoft Purview. Examples of differentiated partner solution capabilities include:
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Informatica: &lt;/STRONG&gt;Cloud-native data integration for enterprise systems&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Qlik&lt;/STRONG&gt;: Data integrations for enterprise systems such as SAP, Oracle, and appliances&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Confluent&lt;/STRONG&gt;: Analytics for data in motion from Apache Kafka and IoT sources&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Fivetran&lt;/STRONG&gt;: Low code / No code, cloud-native, automated, reliable, and secure data pipelines&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Striim&lt;/STRONG&gt;: Real-time data access with continuous, streaming data integration to analytics&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;dbt Labs&lt;/STRONG&gt;: SQL-friendly modality for data transformation pipelines&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Data Governance&lt;/STRONG&gt;: Market-leading solutions in this layer include:
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Profisee&lt;/STRONG&gt; and &lt;STRONG&gt;CluedIn&lt;/STRONG&gt; for Master Data Management&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Delphix&lt;/STRONG&gt; data masking for pipeline data compliance&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;OneTrust&lt;/STRONG&gt; for privacy &amp;amp; security&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In tandem with leading a powerful ecosystem of ISV partners to engage with our customers, we are also committed to deepening our first-party relationship with Databricks. We’re pleased to share that Azure Databricks, which aligns to the open and governed paradigm of the Microsoft Intelligent Data Platform, also benefits from this partner ecosystem.&lt;/P&gt;
&lt;P&gt;And this is just the beginning as we add more ISV partners in the coming months. We believe the &lt;STRONG&gt;Microsoft Intelligent Data Platform partner ecosystem&lt;/STRONG&gt; approach is the first of its kind in the industry. Best of all, working with our valued partners, we set our customers up for success.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Learn more: &lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://ignite.microsoft.com/en-US/sessions/8c7bb7fe-330c-4989-ba62-099bf1fc9599" target="_blank"&gt;Innovate faster and achieve greater agility with the Microsoft Intelligent Data Platform&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://ignite.microsoft.com/en-US/sessions/b2dc8bf4-2ceb-4de5-bbf6-67c6f7db8563?source=sessions" target="_blank"&gt;Microsoft and Databricks Partnership&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 12 Oct 2022 16:00:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/introducing-the-microsoft-intelligent-data-platform-partner/ba-p/3640279</guid>
      <dc:creator>RohanKumar</dc:creator>
      <dc:date>2022-10-12T16:00:00Z</dc:date>
    </item>
    <item>
      <title>Microsoft and Databricks deepen partnership for modern, cloud-native analytics</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/microsoft-and-databricks-deepen-partnership-for-modern-cloud/ba-p/3640280</link>
      <description>&lt;P&gt;In May, we introduced the &lt;A href="https://www.microsoft.com/en-us/microsoft-cloud/solutions/intelligent-data-platform?activetab=tabs-1:primaryr13" target="_blank" rel="noopener"&gt;Microsoft Intelligent Data Platform&lt;/A&gt;, which deeply integrates our databases, analytics, BI, and data governance products into a unified platform. This platform, which is also integrated with the Microsoft Cloud, enables a seamless experience and intuitive collaboration for Developers, DBAs, Data Engineers, Data Scientists, Business Analysts, and Data Officers. The platform enables customers to “do more with less” by helping them add layers of intelligence to their applications, unlock predictive insights and drive relevant action, and govern their entire data estate.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Databricks has been a key partner for us in delivering a cloud-native platform for Analytics in the Microsoft Intelligent Data Platform. &lt;/SPAN&gt;&lt;SPAN&gt;Today, I am pleased to share that we have further strengthened our partnership with Databricks to evolve our Analytics platform to an Open and Governed Data Lakehouse foundation. This foundation will enable our customers to unify their most demanding Business Intelligence, Machine Learning, and Artificial Intelligence investments on a single data foundation for analytics: an open and governed data lakehouse. It will also enable customers to responsibly democratize their analytics data products, accelerating digital transformation across their organizations.&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;EM&gt;“Databricks is a key partner for Microsoft, and together, we are&amp;nbsp;delivering a modern, cloud-native data foundation&amp;nbsp;in&amp;nbsp;the Microsoft Intelligent Data Platform&amp;nbsp;for the most demanding analytics and machine learning applications,” - &lt;STRONG&gt;Scott Guthrie, EVP, Cloud + AI, Microsoft&lt;/STRONG&gt;.&amp;nbsp;&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Microsoft and Databricks have partnered to build this foundation in the Microsoft Intelligent Data Platform by integrating their hallmark capabilities to deliver an integrated solution for our customers.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Core to the foundation is the Open and Governed Data Lakehouse, a unified and cloud native data fabric capable of serving business intelligence, machine learning and artificial intelligence workloads at enterprise and cloud scales, without requiring organizations to invest in building, integrating, and operating point data fabrics for each.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;This foundation is built by integrating the hallmark analytics capabilities of Microsoft’s Azure Synapse Analytics and Databricks, as well as by integrating the governance foundations of Microsoft Purview and Databricks Unity Catalog to enable a single-pane experience for Data and Analytics Governance in Microsoft Purview. The components of the integrated analytics foundation include:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;SPAN&gt;Azure Data Factory in Azure Synapse Analytics for orchestrating data integration pipelines to integrate data from across hybrid, multi-cloud, SaaS, and legacy enterprise data source systems&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Azure Data Explorer in Azure Synapse Analytics for real-time streaming analytics&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Azure Databricks for building an open standard data lakehouse using the Delta format&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Azure Machine Learning, Synapse ML, and Azure Databricks for machine learning &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Azure Machine Learning, and Databricks MLflow for MLOps&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Azure Synapse SQL and Databricks SQL for serverless SQL analytics&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Azure Synapse SQL as a Data Warehouse modality for data serving&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Power BI for Business Intelligence&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Microsoft Purview and Databricks Unity Catalog Federation for Unified Data Governance in Purview spanning Operational, Analytics, and ML/AI data assets on the Microsoft Intelligent Data Platform&lt;/SPAN&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;EM&gt;“This announcement between Databricks and Microsoft is a significant step forward in helping customers adopt an open, well governed data lakehouse on Azure with the entirety of the Microsoft Intelligent Data Platform capabilities. More than ever, customers are able to quickly and easily unify their data, analytics, and AI with a simple and open approach.”&amp;nbsp;&lt;STRONG&gt;- Ali Ghodsi, Co-founder and CEO of Databricks&lt;/STRONG&gt;&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;Microsoft and Databricks are thrilled to partner in deeply integrating our complementary strengths and capabilities to deliver an open and governed analytics foundation in the Microsoft Intelligent Data Platform, one that serves our customers’ most demanding analytics investments with unparalleled productivity, performance, and cost efficiency, enabling them to achieve more with less.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN&gt;In a recent benchmark study, implementing an end-to-end analytics solution on Azure was up to 49% more cost efficient than on a top competitor cloud.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;&lt;STRONG&gt;&amp;nbsp;&lt;/STRONG&gt;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Learn more:&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://ignite.microsoft.com/en-US/sessions/b2dc8bf4-2ceb-4de5-bbf6-67c6f7db8563?source=sessions" target="_blank" rel="noopener"&gt;Microsoft and Databricks Partnership&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://ignite.microsoft.com/en-US/sessions/8c7bb7fe-330c-4989-ba62-099bf1fc9599" target="_blank" rel="noopener"&gt;Innovate faster and achieve greater agility with the Microsoft Intelligent Data Platform&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 17 Oct 2022 06:04:42 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/microsoft-and-databricks-deepen-partnership-for-modern-cloud/ba-p/3640280</guid>
      <dc:creator>RohanKumar</dc:creator>
      <dc:date>2022-10-17T06:04:42Z</dc:date>
    </item>
    <item>
      <title>Zero to Hero: Your Guide to Getting Skilled on Azure Data and AI</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/zero-to-hero-your-guide-to-getting-skilled-on-azure-data-and-ai/ba-p/3434622</link>
      <description>&lt;P&gt;&lt;img /&gt;We are excited to highlight the Zero to Hero guide, our very own journey to help developers and engineers get started on Azure. We have a large collection of resources to help you ramp up in the cloud, and these guides curate all our best-in-class content in one place, making it easier to find and navigate. Technology is changing fast. With these guides you will ramp-up and level up your skill set to keep the upper hand in this competitive market.&amp;nbsp;Our Zero to Hero guides ensure that you have a solid foundation as you begin to explore Azure.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The Zero to Hero guide will show you how to achieve expertise and prepare for certification in the following areas: &lt;STRONG&gt;data engineering&lt;/STRONG&gt;, &lt;STRONG&gt;data science&lt;/STRONG&gt;, &lt;STRONG&gt;artificial intelligence, and application development with MySQL&lt;/STRONG&gt;. Within 4 weeks, regardless of where you are in your journey, we can propel you to the next level.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Each week you'll watch videos, learn from step-by-step training, and try skills for yourself with self-guided exercises. Don't forget to also sign up for the associated 30-day learning challenges to earn 50 percent off Microsoft certification.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Download the guides below and start learning today!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;&lt;A href="https://aka.ms/data-engineer-learning-path" target="_blank" rel="noopener"&gt;&lt;STRONG&gt;Zero to Hero with Azure Data Engineering&lt;/STRONG&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;A href="https://aka.ms/mllearningjourney" target="_blank" rel="noopener"&gt;&lt;STRONG&gt;Zero to Hero with Azure Machine Learning&lt;/STRONG&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;A href="https://aka.ms/ailearningjourney " target="_blank" rel="noopener"&gt;&lt;STRONG&gt;&lt;SPAN&gt;Zero to Hero with&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;STRONG&gt;Azure Artificial Intelligence&lt;/STRONG&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;A href="https://azure.microsoft.com/en-us/resources/mysql-developer-learning-journey/?OCID=mysql_dev_pdf_MySQL_page" target="_blank" rel="noopener"&gt;&lt;STRONG&gt;Zero to Hero with Azure Database for MySQL&lt;/STRONG&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Are you a developer who builds apps with low-code techniques? Are you a solution architect defining the design and implementation of technology solutions in Azure? We have guides for you too!&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;A href="https://techcommunity.microsoft.com/t5/azure-infrastructure-blog/new-azure-skilling-guides/ba-p/3423689" target="_blank" rel="noopener"&gt;Azure Skills Navigator for System Administrators and Solution Architects&lt;/A&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;A href="https://techcommunity.microsoft.com/t5/azure-developer-community-blog/introducing-new-ramp-up-guide-for-developers-azure-skills/ba-p/3431731" target="_blank" rel="noopener"&gt;Azure Skills Navigator for Developers&amp;nbsp;&lt;/A&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Tue, 07 Jun 2022 18:55:41 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/zero-to-hero-your-guide-to-getting-skilled-on-azure-data-and-ai/ba-p/3434622</guid>
      <dc:creator>NatalieM</dc:creator>
      <dc:date>2022-06-07T18:55:41Z</dc:date>
    </item>
    <item>
      <title>A Data and AI guide to Build 2022</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/a-data-and-ai-guide-to-build-2022/ba-p/3389827</link>
      <description>&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;Microsoft Build will begin Tuesday, May 24 at 8:30 AM PDT with our annual keynote. Attendees can then participate in live segments, breakout sessions, Learn Live workshops, and more.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We are excited to bring you this guide as your go-to map for discovering all the best content for Data and AI product releases, learning opportunities, and resources.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P data-unlink="true"&gt;On &lt;STRONG&gt;Day 1&lt;/STRONG&gt; join Microsoft CEO Satya Nadella as he shares how Microsoft is creating new opportunities for developers across our platforms in our &lt;A href="https://aka.ms/Build22/KEY01" target="_blank" rel="noopener"&gt;Microsoft Build Opening&lt;/A&gt;. Join us later that morning to hear from Microsoft’s CVP of Data, AI, and MR Jessica Hawk along with Rohan Kumar, CVP of Azure Data Engineering, in their &lt;A href="https://mybuild.microsoft.com/en-US/sessions/9b919415-e250-44a4-b3ad-d462ba147341?source=https://mybuild.microsoft.com/en-US/theme/gain-agility-with-an-integrated-data-platform?f=%255B%257B%2522name%2522%253A%2522Global%2522%252C%2522facetName%2522%253A%2522event%2522%257D%255D&amp;amp;t=%257B%2522from%2522%253A%25222022-05-24T00%253A00%253A00-07%253A00%2522%252C%2522to%2522%253A%25222022-05-26T23%253A59%253A00-07%253A00%2522%257D" target="_self"&gt;core session&lt;/A&gt;: Accelerate innovation and achieve agility on a trusted integrated platform with hybrid and multicloud capabilities.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;On &lt;STRONG&gt;Day 2 &lt;/STRONG&gt;Microsoft CTO Kevin Scott will share how breakthrough research and ambitious AI models are empowering developers to do more during his keynote: &lt;A href="https://aka.ms/Build22/KEY05" target="_blank" rel="noopener"&gt;The Future of AI Developer Tools&lt;/A&gt;. Shortly after, join our &lt;A href="https://aka.ms/Build22KEY06" target="_blank" rel="noopener"&gt;Into Focus: AI session&lt;/A&gt; with Microsoft CVP Engineering Eric Boyd as he dives into the latest technology announcements and customer stories from Azure AI.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Be sure to visit the&amp;nbsp;&lt;A href="https://mybuild.microsoft.com/home" target="_blank" rel="noopener"&gt;Microsoft Build&amp;nbsp;&lt;/A&gt;homepage for the full agenda and additional details. If you haven’t already, &lt;A href="https://register.build.microsoft.com/" target="_blank" rel="noopener"&gt;register now&lt;/A&gt;&amp;nbsp;for the full event attendee experience so you can participate in live workshops and Q&amp;amp;A.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The complete line up of Azure Data and AI technical sessions is listed below for you to build a schedule that is unique to you and your interests. Many sessions will be available on-demand at the links below. Navigate to the session page to see full details.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;U&gt;&lt;STRONG&gt;Core Sessions + Breakout Sessions:&lt;/STRONG&gt;&lt;/U&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE border="1" width="99.9229188078109%"&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD width="11.188298038817624%" height="30px"&gt;
&lt;P&gt;&lt;STRONG&gt;Date&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.468133429764134%" height="30px"&gt;
&lt;P&gt;&lt;STRONG&gt;Start, PDT&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="49.665664293961655%" height="30px"&gt;
&lt;P&gt;&lt;STRONG&gt;Session name&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="28.60082304526749%" height="30px"&gt;&lt;STRONG&gt;Speaker(s)&lt;/STRONG&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="11.188298038817624%" height="57px"&gt;
&lt;P&gt;5/24/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.468133429764134%" height="57px"&gt;
&lt;P&gt;8:30 AM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="49.665664293961655%" height="57px"&gt;
&lt;P&gt;&lt;A href="https://aka.ms/Build22/KEY01" target="_blank" rel="noopener"&gt;Microsoft Build Opening Keynote&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="28.60082304526749%" height="57px"&gt;Satya Nadella&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="11.188298038817624%" height="140px"&gt;
&lt;P&gt;5/24/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.468133429764134%" height="140px"&gt;
&lt;P&gt;10:10 AM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="49.665664293961655%" height="140px"&gt;
&lt;P data-unlink="true"&gt;&lt;A href="https://mybuild.microsoft.com/en-US/sessions/9b919415-e250-44a4-b3ad-d462ba147341?source=https://mybuild.microsoft.com/en-US/theme/gain-agility-with-an-integrated-data-platform?f=%255B%257B%2522name%2522%253A%2522Global%2522%252C%2522facetName%2522%253A%2522event%2522%257D%255D&amp;amp;t=%257B%2522from%2522%253A%25222022-05-24T00%253A00%253A00-07%253A00%2522%252C%2522to%2522%253A%25222022-05-26T23%253A59%253A00-07%253A00%2522%257D" target="_self"&gt;Core session: Accelerate innovation and achieve agility&lt;/A&gt; on a trusted, integrated platform with hybrid and multicloud capabilities&amp;nbsp;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="28.60082304526749%" height="140px"&gt;
&lt;P&gt;Rohan Kumar, Jessica Hawk,&amp;nbsp;&lt;SPAN&gt;Rafiq Jalal (KPMG)&lt;/SPAN&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="11.188298038817624%" height="57px"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.468133429764134%" height="57px"&gt;
&lt;P&gt;9:00 AM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="49.665664293961655%" height="57px"&gt;
&lt;P&gt;&lt;A href="https://aka.ms/Build22/KEY05" target="_blank" rel="noopener"&gt;Keynote: The Future of AI Development Tools&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="28.60082304526749%" height="57px"&gt;Kevin Scott&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="11.188298038817624%" height="57px"&gt;
&lt;P&gt;5/24/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.468133429764134%" height="57px"&gt;
&lt;P&gt;12:30 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="49.665664293961655%" height="57px"&gt;
&lt;P&gt;&lt;A href="https://aka.ms/Build22KEY03" target="_blank" rel="noopener"&gt;Into Focus: Preparing for the metaverse&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="28.60082304526749%" height="57px"&gt;Sam George, Alysa Taylor, Nicole Herskowitz, Andy Pratt, Matt Fleckenstein&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="11.188298038817624%" height="30px"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.468133429764134%" height="30px"&gt;
&lt;P&gt;10:00 AM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="49.665664293961655%" height="30px"&gt;
&lt;P&gt;&lt;A href="https://aka.ms/Build22KEY06" target="_blank" rel="noopener"&gt;Into Focus: AI&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="28.60082304526749%" height="30px"&gt;Eric Boyd, Seth Juarez&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="11.188298038817624%" height="85px"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.468133429764134%" height="85px"&gt;
&lt;P&gt;11:00 AM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="49.665664293961655%" height="85px"&gt;
&lt;P&gt;Breakout Session: &lt;A href="https://aka.ms/Build22BRK21" target="_blank" rel="noopener"&gt;Scaling responsible MLOps with Azure Machine Learning&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="28.60082304526749%" height="85px"&gt;Daniel Moth, Abe Omorogbe, Beatriz Stollnitz, Minsoo Thigpen&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="11.188298038817624%" height="140px"&gt;
&lt;P&gt;5/24/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.468133429764134%" height="140px"&gt;
&lt;P&gt;1:30 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="49.665664293961655%" height="140px"&gt;
&lt;P&gt;Breakout Session: &lt;A href="https://aka.ms/Build22BRK22" target="_blank" rel="noopener"&gt;Build and deploy containerized applications and databases for hybrid and multicloud with Azure Arc&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="28.60082304526749%" height="140px"&gt;Travis Wright, Jason&amp;nbsp;Hansen,Jes&amp;nbsp;Schultz, Lior&amp;nbsp;Kamrat&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="11.188298038817624%" height="112px"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.468133429764134%" height="112px"&gt;
&lt;P&gt;2:00 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="49.665664293961655%" height="112px"&gt;
&lt;P&gt;Breakout Session: &lt;A href="https://aka.ms/Build22BRK13" target="_blank" rel="noopener"&gt;Building document intelligence applications with Applied AI and Cognitive Services&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="28.60082304526749%" height="112px"&gt;Ayşegül&amp;nbsp;Yönet, Krishna Doss Mohan, Marco&amp;nbsp;Casalaina, Neta&amp;nbsp;Haiby&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="11.188298038817624%" height="112px"&gt;
&lt;P&gt;5/24/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.468133429764134%" height="112px"&gt;
&lt;P&gt;11:30 AM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="49.665664293961655%" height="112px"&gt;
&lt;P&gt;Breakout Session: &lt;A href="https://aka.ms/Build22BRK20" target="_blank" rel="noopener"&gt;Modernize your application with new innovations across SQL Server 2022 and Azure SQL&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="28.60082304526749%" height="112px"&gt;Bob Ward, Chuck Heinzelman, Drew Skwiers-Koballa, Sunetra Virdi&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="11.188298038817624%" height="57px"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.468133429764134%" height="57px"&gt;
&lt;P&gt;1:00 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="49.665664293961655%" height="57px"&gt;
&lt;P&gt;Breakout Session: &lt;A href="https://aka.ms/Build22BRK23" target="_blank" rel="noopener"&gt;Democratize your data at scale with Power BI&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="28.60082304526749%" height="57px"&gt;Arun Ulgaratchagan, Priya Sathy&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Microsoft Build is, above all, an amazing opportunity to connect with Microsoft experts. After each breakout session, you can join our &lt;STRONG&gt;Ask The Expert&lt;/STRONG&gt; sessions and &lt;STRONG&gt;Product Roundtables&lt;/STRONG&gt; to dive deeper into the subject and engage in live discussions.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;Ask the Experts:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE border="1" width="99.94861253854059%"&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD width="10.277492291880781%"&gt;
&lt;P&gt;&lt;STRONG&gt;Date&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.894141829393634%"&gt;
&lt;P&gt;&lt;STRONG&gt;Start, PDT&lt;/STRONG&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="50.899280575539564%"&gt;&lt;STRONG&gt;Session name&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="27.877697841726622%"&gt;&lt;STRONG&gt;Speaker(s)&lt;/STRONG&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="10.277492291880781%"&gt;
&lt;P&gt;5/24/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.894141829393634%"&gt;
&lt;P&gt;12:30 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="50.899280575539564%"&gt;&lt;A href="https://aka.ms/Build22CATE20" target="_blank" rel="noopener"&gt;Ask the Experts: Modernize your applications with new features across SQL Server 2022 and Azure SQ&lt;/A&gt;&lt;/TD&gt;
&lt;TD width="27.877697841726622%"&gt;Bob Ward, Chuck Heinzelman, Drew Skwiers-Koballa, Miwa Monji, Sunetra Virdi&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="10.277492291880781%"&gt;
&lt;P&gt;5/24/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.894141829393634%"&gt;
&lt;P&gt;2:30 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="50.899280575539564%"&gt;&lt;A href="https://aka.ms/Build22CATE22" target="_blank" rel="noopener"&gt;Ask the Experts: Build and deploy containerized applications and databases for hybrid and multicloud with Azure Arc&lt;/A&gt;&lt;/TD&gt;
&lt;TD width="27.877697841726622%"&gt;Dhananjay Mahajan, Dinakar Nethi, Jes Schultz, Lior Kamrat&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="10.277492291880781%"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.894141829393634%"&gt;
&lt;P&gt;12:00 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="50.899280575539564%"&gt;&lt;A href="https://aka.ms/Build22CATE21" target="_blank" rel="noopener"&gt;Ask the Expert: Scaling responsible MLOps with Azure Machine Learning&lt;/A&gt;&lt;/TD&gt;
&lt;TD width="27.877697841726622%"&gt;Christian Gero, Minsoo Thigpen, Ilya Matiach, Richard Edgar&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="10.277492291880781%"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.894141829393634%"&gt;
&lt;P&gt;1:00 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="50.899280575539564%"&gt;&lt;A href="https://aka.ms/Build22CATE10" target="_blank" rel="noopener"&gt;Ask the Experts: Deploy modern containerized apps and cloud native databases at scale&lt;/A&gt;&lt;/TD&gt;
&lt;TD width="27.877697841726622%"&gt;Anthony Chu, Daria Grigoriu, Deborah Chen, Kendall Roden&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="10.277492291880781%"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.894141829393634%"&gt;
&lt;P&gt;2:00 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="50.899280575539564%"&gt;&lt;A href="https://aka.ms/Build22CATE23" target="_blank" rel="noopener"&gt;Ask the Experts: Democratizing your data at scale with Power BI&lt;/A&gt;&lt;/TD&gt;
&lt;TD width="27.877697841726622%"&gt;Charles Webb, Dharini Sundaram, Gautam Bharti, Priyanka Langade&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="10.277492291880781%"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.894141829393634%"&gt;
&lt;P&gt;12:00 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="50.899280575539564%"&gt;&lt;A href="https://mybuild.microsoft.com/en-US/sessions/c1d096d3-d162-4bf2-b16d-7381e4081ec1?source=sessions" target="_blank" rel="noopener"&gt;Roundtable: Azure Machine Learning – Responsible AI and MLOps&lt;/A&gt;&lt;/TD&gt;
&lt;TD width="27.877697841726622%"&gt;Steve Sweetman, Ahmet Gyger, Abe Omorogbe, Cody Peterson&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="10.277492291880781%"&gt;
&lt;P&gt;Post-event June 16&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="10.894141829393634%"&gt;
&lt;P&gt;9:30 AM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="50.899280575539564%"&gt;&lt;A href="https://aka.ms/ateonlearntv" target="_blank" rel="noopener"&gt;Ask the Expert: Building AI - infused document processing applications&lt;/A&gt;&lt;/TD&gt;
&lt;TD width="27.877697841726622%"&gt;Ayşegül&amp;nbsp;Yönet, Krishna Doss Mohan, Marco&amp;nbsp;Casalaina&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Take advantage of our guided, online learning workshops where our Microsoft experts will walk you through everything from analytics and databases to AI and Machine Learning.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;Learn Live sessions and Intro to Tech Skills:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE border="1" width="100%"&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD width="9.172661870503598%" height="30px"&gt;&lt;STRONG&gt;Date&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="11.12538540596095%" height="30px"&gt;&lt;STRONG&gt;Start, PDT&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="52.749229188078104%" height="30px"&gt;&lt;STRONG&gt;Session name&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="26.952723535457352%" height="30px"&gt;&lt;STRONG&gt;Speaker(s)&lt;/STRONG&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="9.172661870503598%"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="11.12538540596095%"&gt;
&lt;P&gt;2:00 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="52.749229188078104%"&gt;
&lt;P&gt;&lt;A href="https://aka.ms/build2022-CLL12" target="_blank" rel="noopener"&gt;Azure Cosmos DB: Learn how to enable analytics over real-time operational data with Azure Synapse Link&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="26.952723535457352%"&gt;
&lt;P&gt;Gary Hope and Rodrigo Souza&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="9.172661870503598%"&gt;
&lt;P&gt;5/24/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="11.12538540596095%"&gt;
&lt;P&gt;1:30 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="52.749229188078104%"&gt;
&lt;P&gt;&lt;A href="https://aka.ms/build2022-CLL07" target="_blank" rel="noopener"&gt;Develop applications with Azure Database for MySQL - Flexible Server&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="26.952723535457352%"&gt;
&lt;P&gt;Marko Hotti and Shreya R. Aithal&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="9.172661870503598%"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="11.12538540596095%"&gt;
&lt;P&gt;1:00 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="52.749229188078104%"&gt;
&lt;P&gt;&lt;A href="https://aka.ms/build2022-CLL10" target="_blank" rel="noopener"&gt;Introduction to Azure Arc enabled Kubernetes&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="26.952723535457352%"&gt;
&lt;P&gt;Orin Thomas, Vinicius Apolinario&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="9.172661870503598%"&gt;
&lt;P&gt;5/24/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="11.12538540596095%"&gt;
&lt;P&gt;12:30 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="52.749229188078104%"&gt;
&lt;P&gt;&lt;A href="https://aka.ms/build2022-CLL06" target="_blank" rel="noopener"&gt;Develop IoT solutions with Azure SQL Database&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="26.952723535457352%"&gt;
&lt;P&gt;Anna Hoffman and Silvano Coriani&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="9.172661870503598%"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="11.12538540596095%"&gt;
&lt;P&gt;6:00 AM (3:00 PM Central Europe Time)&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="52.749229188078104%"&gt;
&lt;P&gt;&lt;A href="https://mybuild.microsoft.com/en-US/sessions/f6a6d354-5a80-4d8c-a0e9-1305e87165c9?source=https://mybuild.microsoft.com/en-US/home/uk?replaceFacets=true&amp;amp;f=%255B%257B%2522name%2522%253A%2522Learning%2520Zone%2522%252C%2522facetName%2522%253A%2522topic%2522%257D%255D&amp;amp;t=%257B%2522from%2522%253A%25222022-05-24T00%253A00%253A00-07%253A00%2522%252C%2522to%2522%253A%25222022-05-26T23%253A59%253A00-07%253A00%2522%257D" target="_blank" rel="noopener"&gt;Extract data from forms with Form Recognizer&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="26.952723535457352%"&gt;
&lt;P&gt;David Glover and Vinod Kurpad&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="9.172661870503598%"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="11.12538540596095%"&gt;
&lt;P&gt;10:00 AM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="52.749229188078104%"&gt;
&lt;P&gt;&lt;A href="https://nam06.safelinks.protection.outlook.com/?url=https%3A%2F%2Fmybuild.microsoft.com%2Fsessions%2F1cc11fb0-a754-416a-9e28-f8ca72cd36df&amp;amp;data=05%7C01%7Cnmickey%40microsoft.com%7C919500b2dc1a4ffe300908da377066e8%7C72f988bf86f141af91ab2d7cd011db47%7C1%7C0%7C637883253150774465%7CUnknown%7CTWFpbGZsb3d8eyJWIjoiMC4wLjAwMDAiLCJQIjoiV2luMzIiLCJBTiI6Ik1haWwiLCJXVCI6Mn0%3D%7C3000%7C%7C%7C&amp;amp;sdata=XA4hSg2gBt0oII8oGh%2BmUUSGGgfRR8HEDG3z0JQIs8c%3D&amp;amp;reserved=0" target="_blank" rel="noopener"&gt;How to develop custom object models with data labeling tools and AutoML&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="26.952723535457352%"&gt;
&lt;P&gt;Paul DeCarlo and Cassie Dreviu&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="9.172661870503598%"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="11.12538540596095%"&gt;
&lt;P&gt;9:00 AM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="52.749229188078104%"&gt;
&lt;P&gt;&lt;A href="https://aka.ms/Build22-CTS05" target="_blank" rel="noopener"&gt;Intro to Tech Skills: A guided journey into AI&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="26.952723535457352%"&gt;
&lt;P&gt;Carlotta Castelluccio and Dmitry Shoshnikov&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="9.172661870503598%"&gt;
&lt;P&gt;5/25/2022&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="11.12538540596095%"&gt;
&lt;P&gt;2:00 PM&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="52.749229188078104%"&gt;
&lt;P&gt;&lt;A href="https://aka.ms/Build22-CTS10" target="_blank" rel="noopener"&gt;Intro to Tech Skills: The New Developer’s Guide to the Cloud&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="26.952723535457352%"&gt;
&lt;P&gt;Someleze Diko, Nitya Narasimhan, Christoffer Noring&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;To go more in depth into specific topics, explore our &lt;STRONG&gt;on-demand&lt;/STRONG&gt; recordings.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;On-Demand recordings:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;TABLE border="1" width="100%"&gt;
&lt;TBODY&gt;
&lt;TR&gt;
&lt;TD width="12.058924289140117%"&gt;&lt;STRONG&gt;Date&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="54.607742377526556%"&gt;&lt;STRONG&gt;Session name&lt;/STRONG&gt;&lt;/TD&gt;
&lt;TD width="33.333333333333336%"&gt;&lt;STRONG&gt;Speaker(s)&lt;/STRONG&gt;&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="12.058924289140117%"&gt;
&lt;P&gt;On-demand&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="54.607742377526556%"&gt;
&lt;P&gt;&lt;A href="https://mybuild.microsoft.com/en-US/sessions/2780a5c8-882d-470f-8208-d4824fccd2f8?source=sessions" target="_blank" rel="noopener"&gt;Analytics and Operational Data&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="33.333333333333336%"&gt;
&lt;P&gt;Chuck Heinzelman&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;TR&gt;
&lt;TD width="12.058924289140117%"&gt;
&lt;P&gt;On-demand&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="54.607742377526556%"&gt;
&lt;P&gt;&lt;A href="https://mybuild.microsoft.com/en-US/sessions/cc11390f-5711-405f-8bf8-92bc614aed9f?source=sessions" target="_blank" rel="noopener"&gt;Azure SQL and Azure Functions: Integration with SQL bindings&lt;/A&gt;&lt;/P&gt;
&lt;/TD&gt;
&lt;TD width="33.333333333333336%"&gt;
&lt;P&gt;Drew Skwiers-Koballa&lt;/P&gt;
&lt;/TD&gt;
&lt;/TR&gt;
&lt;/TBODY&gt;
&lt;/TABLE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Learn more&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;The &lt;A href="https://www.microsoft.com/en-US/cloudskillschallenge/build/registration/2022" target="_blank" rel="noopener"&gt;&lt;STRONG&gt;Cloud Skills Challenge&lt;/STRONG&gt;&lt;/A&gt;&lt;STRONG&gt; is back at Build!&lt;/STRONG&gt; Earn a free Microsoft Certification exam by completing a collection of interactive Microsoft Learn modules.&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Interested in becoming a Data or AI engineer?&lt;/STRONG&gt; Get started with one of our structured, self-paced courses to take you from zero to hero in less than an hour each day:
&lt;UL&gt;
&lt;LI&gt;&lt;A href="https://azure.microsoft.com/en-us/resources/azure-ai-learning-journey/" target="_blank" rel="noopener"&gt;Zero to hero in 4 weeks with Azure AI: A guide to achieving AI-900 Azure AI Fundamentals Certification&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;A href="https://azure.microsoft.com/en-us/resources/ml-learning-journey/" target="_blank" rel="noopener"&gt;Zero to hero in 4 weeks with Azure Machine Learning: A guide to achieving your DP-100 Azure Data Scientist Associate certification&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;A href="https://azure.microsoft.com/en-us/resources/data-engineer-learning-journey/" target="_blank" rel="noopener"&gt;Data Engineering on Microsoft Azure: A guide to achieving a DP-203 Azure Data Engineer Associate Certification&lt;/A&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Discover more Azure Data and AI learning resources &lt;/STRONG&gt;and documentation on &lt;A href="https://docs.microsoft.com/en-us/learn/topics/azure-data-and-ai" target="_blank" rel="noopener"&gt;Microsoft Learn&lt;/A&gt;&lt;/LI&gt;
&lt;/UL&gt;</description>
      <pubDate>Wed, 18 May 2022 22:47:05 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/a-data-and-ai-guide-to-build-2022/ba-p/3389827</guid>
      <dc:creator>NatalieM</dc:creator>
      <dc:date>2022-05-18T22:47:05Z</dc:date>
    </item>
    <item>
      <title>SQL DB maintenance activity using Automation accounts for Azure SQL DBs(Different scenarios)</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/sql-db-maintenance-activity-using-automation-accounts-for-azure/ba-p/3370081</link>
      <description>&lt;P&gt;&lt;STRONG&gt;Infrastructure:&lt;/STRONG&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Multiple Azure SQL DB instances&lt;/LI&gt;
&lt;LI&gt;Multiple databases on each instance&lt;/LI&gt;
&lt;LI&gt;Different subscriptions exist for different environments (Dev, UAT, Prod)&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;STRONG&gt;Requirement:&lt;/STRONG&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Schedule maintenance jobs for all instances&lt;/LI&gt;
&lt;LI&gt;All jobs should run in parallel&lt;/LI&gt;
&lt;LI&gt;No job should depend on another&lt;/LI&gt;
&lt;LI&gt;Only one Automation account can be created per subscription&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;STRONG&gt;Overview of Steps to be followed:&lt;/STRONG&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Create Azure automation account&lt;/LI&gt;
&lt;LI&gt;Import SQLServer module&lt;/LI&gt;
&lt;LI&gt;Add Credentials to access SQL DB&lt;/LI&gt;
&lt;LI&gt;Add a runbook to run the maintenance&lt;/LI&gt;
&lt;LI&gt;Schedule task&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;Create new automation account&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Log in to the Azure portal and search for “automation account” in the search bar.&lt;/P&gt;
&lt;P&gt;Choose Automation Accounts from the results:&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Click "create"&lt;/LI&gt;
&lt;LI&gt;Fill in the form: choose a name for your automation account and the resource group it will be placed in.&lt;/LI&gt;
&lt;LI&gt;Click "create" and wait for the account to be created.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;Import SQLServer module&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Click on "Modules" at the left options panel, and then click on "Browse Gallery" and search for "SQLServer"&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Choose "SqlServer" by matteot_msft&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Then click on "import" and the "OK"&lt;/LI&gt;
&lt;LI&gt;Wait for the import to complete&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;Add Credentials to access SQL DB&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;This stores the login name and password used to access the Azure SQL DB in a secure way.&lt;/P&gt;
&lt;P&gt;Note: You can skip this step and use the credentials as clear text instead; if you prefer clear text, skip to the next step.&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Under "Shared Resources" click on credentials&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Then click on "Add Credential"&lt;/LI&gt;
&lt;LI&gt;Type "SQLCred"(or whatever name you want to provide for your credentials) as the name of the credential.&lt;/LI&gt;
&lt;LI&gt;In the username field type the SQL Login that will be used for maintenance and its password.&lt;/LI&gt;
&lt;LI&gt;Click "Create"&lt;/LI&gt;
&lt;/UL&gt;
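&lt;P&gt;As an alternative to the portal, the credential can be created from PowerShell with the Az.Automation module. This is a hedged sketch; the account, resource group, and login names are placeholders:&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;# Build a PSCredential from the SQL login and a securely prompted password
$user = "sqlmaintlogin"   # example SQL login
$password = Read-Host -AsSecureString -Prompt "SQL login password"
$psCred = New-Object System.Management.Automation.PSCredential($user, $password)

# Store it in the Automation account under the name the runbook expects ("SQLCred")
New-AzAutomationCredential -AutomationAccountName "MyAutomationAccount" `
    -ResourceGroupName "MyResourceGroup" -Name "SQLCred" -Value $psCred&lt;/LI-CODE&gt;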
&lt;P&gt;&lt;STRONG&gt;Add a runbook to run the maintenance&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Click on "runbooks" at the left panel and then click "add a runbook"&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Choose "create a new runbook" and then give it a name and choose "Powershell" as the type of the runbook and then click on "create"&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;UL class="lia-list-style-type-square"&gt;
&lt;LI&gt;Copy and paste the following code into the new runbook (for the case of one Azure SQL server and one Azure SQL DB).&lt;/LI&gt;
&lt;LI&gt;Make sure you replace the server and database placeholders with your own values.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;(Thanks to my colleague Vasukhi K S for helping me test the code in different scenarios.)&lt;/P&gt;
&lt;LI-CODE lang="applescript"&gt;$AzureSQLServerName = "&amp;lt;ServerName&amp;gt;" 
$AzureSQLDatabaseName = "&amp;lt;DatabaseName&amp;gt;" 

$AzureSQLServerName = $AzureSQLServerName + ".database.windows.net" 
$Cred = Get-AutomationPSCredential -Name "SQLCred" 
$SQLOutput = $(Invoke-Sqlcmd -ServerInstance $AzureSQLServerName -Username $Cred.UserName -Password $Cred.GetNetworkCredential().Password -Database $AzureSQLDatabaseName -Query "exec [dbo].[AzureSQLMaintenance] @Operation='all' ,@LogToTable=1" -QueryTimeout 65535 -ConnectionTimeout 60 -Verbose) 4&amp;gt;&amp;amp;1 

Write-Output $SQLOutput
&lt;/LI-CODE&gt;
&lt;UL&gt;
&lt;LI&gt;Click on Publish and confirm.&lt;/LI&gt;
&lt;LI&gt;Create a new runbook again&lt;/LI&gt;
&lt;LI&gt;Copy and paste the following code into the new runbook (for the case of one Azure SQL server and multiple Azure SQL DBs).&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="applescript"&gt;$ServerInstance = "&amp;lt;ServerName&amp;gt;" ## instance name 
$Cred = Get-AutomationPSCredential -Name "SQLCred" 
$Databases = Invoke-SqlCmd -ServerInstance $ServerInstance -Username $Cred.UserName -Password $Cred.GetNetworkCredential().Password -Database master -Query "SELECT [name] AS [Database] FROM sys.databases WHERE name not in ('master') ORDER BY 1 DESC;"


foreach ($DB in $Databases)
{
    Write-Output "Processing $($DB.Database)..."

$SQLOutput = Invoke-SqlCmd -ServerInstance $ServerInstance -Username $Cred.UserName -Password $Cred.GetNetworkCredential().Password -Database $DB.Database -Query "EXECUTE dbo.IndexOptimize @Databases = $($DB.Database) ,@MinNumberOfPages = 1000,@FragmentationLow = NULL,@FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',@FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',@FragmentationLevel1 = 5,@FragmentationLevel2 = 30,@logToTable = 'Y';" -OutputSqlErrors:$true -Verbose 
}
Write-Output "Complete"
&lt;/LI-CODE&gt;
&lt;P&gt;Before running these PowerShell scripts against the databases, the required objects must first be created in each database. Refer to the following pointers for creating the objects:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;In case of 1 Server 1 DB&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;Create the stored procedure “AzureSQLMaintenance” as per the attached script below:&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp; &amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="sql"&gt;/****** Object:  StoredProcedure [dbo].[AzureSQLMaintenance]   ******/
SET ANSI_NULLS ON
GO

SET QUOTED_IDENTIFIER ON
GO

CREATE PROCEDURE [dbo].[AzureSQLMaintenance]
	(
		@operation nvarchar(10) = null,
		@mode nvarchar(10) = 'smart',
		@ResumableIndexRebuild bit = 0,
		@RebuildHeaps bit = 0,
		@LogToTable bit = 0,
		@debug nvarchar = 'none'
	)
as
begin
	set nocount on;
	
	---------------------------------------------
	--- Variables and preconditions check
	---------------------------------------------

	set quoted_identifier on;
	declare @idxIdentifierBegin char(1), @idxIdentifierEnd char(1);
	declare @statsIdentifierBegin char(1), @statsIdentifierEnd char(1);
	
	declare @msg nvarchar(max);
	declare @minPageCountForIndex int = 40;
	declare @OperationTime datetime2 = sysdatetime();
	declare @KeepXOperationInLog int =3;
	declare @ScriptHasAnError int = 0; 
	declare @ResumableIndexRebuildSupported int;
	declare @indexStatsMode sysname;
	declare @LowFragmentationBoundry int = 5;
	declare @HighFragmentationBoundry int = 30;

	/* make sure parameters selected correctly */
	set @operation = lower(@operation)
	set @mode = lower(@mode)
	set @debug = lower(@debug) /* any value at this time will produce the temp tables as permanent tables */
	
	if @mode not in ('smart','dummy')
		set @mode = 'smart'

	---------------------------------------------
	--- Begin
	---------------------------------------------

	if @operation not in ('index','statistics','all') or @operation is null
	begin
		raiserror('@operation (varchar(10)) [mandatory]',0,0)
		raiserror(' Select operation to perform:',0,0)
		raiserror('     "index" to perform index maintenance',0,0)
		raiserror('     "statistics" to perform statistics maintenance',0,0)
		raiserror('     "all" to perform indexes and statistics maintenance',0,0)
		raiserror(' ',0,0)
		raiserror('@mode(varchar(10)) [optional]',0,0)
		raiserror(' optionally you can supply a second parameter for operation mode: ',0,0)
		raiserror('     "smart" (Default) using smart decision about what index or stats should be touched.',0,0)
		raiserror('     "dummy" going through all indexes and statistics regardless of their modifications or fragmentation.',0,0)
		raiserror(' ',0,0)
		raiserror('@ResumableIndexRebuild(bit) [optional]',0,0)
		raiserror(' optionally you can choose to rebuild indexes as a resumable operation: ',0,0)
		raiserror('     "0" (Default) using non resumable index rebuild.',0,0)
		raiserror('     "1" using resumable index rebuild when it is supported.',0,0)
		raiserror(' ',0,0)
		raiserror('@RebuildHeaps(bit) [optional]',0,0)
		raiserror(' Rebuild HEAPS to fix forwarded records issue on tables with no clustered index',0,0)
		raiserror('     0 - (Default) do not rebuild heaps',0,0)
		raiserror('     1 - Rebuild heaps based on @mode parameter, @mode=dummy will rebuild all heaps',0,0)
		raiserror(' ',0,0)
		raiserror('@LogToTable(bit) [optional]',0,0)
		raiserror(' Logging option: log each command and its outcome to a table',0,0)
		raiserror('     0 - (Default) do not log operation to table',0,0)
		raiserror('     1 - log operation to table',0,0)
		raiserror('		for the logging option only the last 3 executions are kept by default. this can easily be changed in the procedure body.',0,0)
		raiserror('		Log table will be created automatically if it does not exist.',0,0)
		raiserror(' ',0,0)
		raiserror('Example:',0,0)
		raiserror('		exec  AzureSQLMaintenance ''all'', @LogToTable=1',0,0)

	end
	else 
	begin
		
		---------------------------------------------
		--- Prepare log table
		---------------------------------------------

		/* Prepare Log Table */
		if object_id('AzureSQLMaintenanceLog') is null and @LogToTable=1
		begin
			create table AzureSQLMaintenanceLog (id bigint primary key identity(1,1), OperationTime datetime2, command varchar(4000),ExtraInfo varchar(4000), StartTime datetime2, EndTime datetime2, StatusMessage varchar(1000));
		end

		---------------------------------------------
		--- Resume operation
		---------------------------------------------

		/*Check if there is an operation to resume*/
		if OBJECT_ID('AzureSQLMaintenanceCMDQueue') is not null 
		begin
			if 
				/*resume information exists*/ exists(select * from AzureSQLMaintenanceCMDQueue where ID=-1) 
			begin
				/*resume operation confirmed*/
				set @operation='resume' -- set operation to resume, this can only be done by the proc, cannot get this value as parameter

				-- restore operation parameters 
				select top 1
				@LogToTable = JSON_VALUE(ExtraInfo,'$.LogToTable')
				,@mode = JSON_VALUE(ExtraInfo,'$.mode')
				,@ResumableIndexRebuild = JSON_VALUE(ExtraInfo,'$.ResumableIndexRebuild')
				from AzureSQLMaintenanceCMDQueue 
				where ID=-1
				
				raiserror('-----------------------',0,0)
				set @msg = 'Resuming previous operation'
				raiserror(@msg,0,0)
				raiserror('-----------------------',0,0)
			end
			else
				begin
					-- table [AzureSQLMaintenanceCMDQueue] exists but resume information does not exist
					-- this might happen in case execution was interrupted between collecting index &amp;amp; statistics information and executing commands.
					-- to fix that we drop the table now, it will be recreated later 
					DROP TABLE [AzureSQLMaintenanceCMDQueue];
				end
		end


		---------------------------------------------
		--- Report operation parameters
		---------------------------------------------
		
		/*Write operation parameters*/
		raiserror('-----------------------',0,0)
		set @msg = 'set operation = ' + @operation;
		raiserror(@msg,0,0)
		set @msg = 'set mode = ' + @mode;
		raiserror(@msg,0,0)
		set @msg = 'set ResumableIndexRebuild = ' + cast(@ResumableIndexRebuild as varchar(1));
		raiserror(@msg,0,0)
		set @msg = 'set RebuildHeaps = ' + cast(@RebuildHeaps as varchar(1));
		raiserror(@msg,0,0)
		set @msg = 'set LogToTable = ' + cast(@LogToTable as varchar(1));
		raiserror(@msg,0,0)
		raiserror('-----------------------',0,0)
	end

	if @LogToTable=1 insert into AzureSQLMaintenanceLog values(@OperationTime,null,null,sysdatetime(),sysdatetime(),'Starting operation: Operation=' +@operation + ' Mode=' + @mode + ' Keep log for last ' + cast(@KeepXOperationInLog as varchar(10)) + ' operations' )	

	-- create command queue table; if the table already exists then we are resuming an operation from an earlier stage.
	if @operation!='resume'
		create table AzureSQLMaintenanceCMDQueue (ID int identity primary key,txtCMD nvarchar(max),ExtraInfo varchar(max))

	---------------------------------------------
	--- Check if engine supports resumable index operations
	---------------------------------------------
	if @ResumableIndexRebuild=1 
	begin
		if cast(SERVERPROPERTY('EngineEdition')as int)&amp;gt;=5 or cast(SERVERPROPERTY('ProductMajorVersion')as int)&amp;gt;=14
		begin
			set @ResumableIndexRebuildSupported=1;
		end
		else
		begin 
				set @ResumableIndexRebuildSupported=0;
				set @msg = 'Resumable index rebuild is not supported on this database'
				raiserror(@msg,0,0)
				if @LogToTable=1 insert into AzureSQLMaintenanceLog values(@OperationTime,null,null,sysdatetime(),sysdatetime(),@msg)	
		end
	end


	---------------------------------------------
	--- Index maintenance
	---------------------------------------------
	if @operation in('index','all')
	begin
		/**/
		if @mode='smart' and @RebuildHeaps=1 
			set @indexStatsMode = 'SAMPLED'
		else
			set @indexStatsMode = 'LIMITED'
	
		raiserror('Get index information...(wait)',0,0) with nowait;
		/* Get Index Information */
		/* using inner join - this eliminates indexes that we cannot maintain such as indexes on functions */
		select 
			idxs.[object_id]
			,ObjectSchema = OBJECT_SCHEMA_NAME(idxs.object_id)
			,ObjectName = object_name(idxs.object_id) 
			,IndexName = idxs.name
			,idxs.type
			,idxs.type_desc
			,i.avg_fragmentation_in_percent
			,i.page_count
			,i.index_id
			,i.partition_number
			,i.avg_page_space_used_in_percent
			,i.record_count
			,i.ghost_record_count
			,i.forwarded_record_count
			,null as OnlineOpIsNotSupported
			,null as ObjectDoesNotSupportResumableOperation
			,0 as SkipIndex
			,replicate('',128) as SkipReason
		into #idxBefore
		from sys.indexes idxs
		inner join sys.dm_db_index_physical_stats(DB_ID(),NULL, NULL, NULL ,@indexStatsMode) i  on i.object_id = idxs.object_id and i.index_id = idxs.index_id
		where idxs.type in (0 /*HEAP*/,1/*CLUSTERED*/,2/*NONCLUSTERED*/) 
		and (alloc_unit_type_desc = 'IN_ROW_DATA' /*avoid LOB_DATA or ROW_OVERFLOW_DATA*/ or alloc_unit_type_desc is null /*for ColumnStore indexes*/)
		and OBJECT_SCHEMA_NAME(idxs.object_id) != 'sys'
		and idxs.is_disabled=0
		order by i.avg_fragmentation_in_percent desc, i.page_count desc
				
		-- mark indexes XML,spatial and columnstore not to run online update 
		update #idxBefore set OnlineOpIsNotSupported=1 where [object_id] in (select [object_id] from #idxBefore where [type]=3 /*XML Indexes*/)

		-- mark clustered indexes for tables with 'text','ntext','image' to rebuild offline
		update #idxBefore set OnlineOpIsNotSupported=1 
		where index_id=1 /*clustered*/ and [object_id] in (
			select object_id
			from sys.columns c join sys.types t on c.user_type_id = t.user_type_id
			where t.name in ('text','ntext','image')
		)
	
		-- do all as offline for box edition that does not support online
		update #idxBefore set OnlineOpIsNotSupported=1  
			where /* Editions that do not support online operation in case this has been used with on-prem server */
				convert(varchar(100),serverproperty('Edition')) like '%Express%' 
				or convert(varchar(100),serverproperty('Edition')) like '%Standard%'
				or convert(varchar(100),serverproperty('Edition')) like '%Web%'
		
		-- Do non resumable operation when index contains computed column or timestamp data type
		update idx set ObjectDoesNotSupportResumableOperation=1
		from #idxBefore idx join sys.index_columns ic on idx.object_id = ic.object_id and idx.index_id=ic.index_id
		join sys.columns c on ic.object_id=c.object_id and ic.column_id=c.column_id
		where c.is_computed=1 or system_type_id=189 /*TimeStamp column*/
		
		-- set SkipIndex=1 if conditions for maintenance are not met
		-- this is used to identify if stats need to be updated or not. 
		-- Check#1 - if table is too small
		update #idxBefore set SkipIndex=1,SkipReason='Maintenance is not needed as table is too small'
		where (
					/*Table is small*/
					(page_count&amp;lt;=@minPageCountForIndex)
				)
				and @mode != 'dummy' /*for Dummy mode we do not want to skip anything */
		
		-- Check#2 - if table is not small and fragmentation % is too low 
		update #idxBefore set SkipIndex=1,SkipReason='Maintenance is not needed as fragmentation % is low'
		where (
					/*Table is big enough - but fragmentation is less than 5%*/
					(page_count&amp;gt;@minPageCountForIndex and avg_fragmentation_in_percent&amp;lt;@LowFragmentationBoundry)
				)
				and @mode != 'dummy' /*for Dummy mode we do not want to skip anything */
		
		-- Skip columnstore indexes
		update #idxBefore set SkipIndex=1,SkipReason='Columnstore index'
		where (
					type in (
								5/*Clustered columnstore index*/,
								6/*Nonclustered columnstore index*/
							)
				)
				and @mode != 'dummy' /*for Dummy mode we do not want to skip anything */

		raiserror('---------------------------------------',0,0) with nowait
		raiserror('Index Information:',0,0) with nowait
		raiserror('---------------------------------------',0,0) with nowait

		select @msg = count(*) from #idxBefore 
		set @msg = 'Total Indexes: ' + @msg
		raiserror(@msg,0,0) with nowait

		select @msg = avg(avg_fragmentation_in_percent) from #idxBefore where page_count&amp;gt;@minPageCountForIndex
		set @msg = 'Average Fragmentation: ' + @msg
		raiserror(@msg,0,0) with nowait

		select @msg = sum(iif(avg_fragmentation_in_percent&amp;gt;=@LowFragmentationBoundry and page_count&amp;gt;@minPageCountForIndex,1,0)) from #idxBefore 
		set @msg = 'Fragmented Indexes: ' + @msg
		raiserror(@msg,0,0) with nowait

				
		raiserror('---------------------------------------',0,0) with nowait


		/* Choose the identifier to be used based on existing object name 
			this came up from object that contains '[' within the object name
			such as "EPK[export].[win_sourceofwealthbpf]" as index name
			if we use '[' as identifier it will cause wrong identifier name	
		*/
		if exists(
			select 1
			from #idxBefore 
			where IndexName like '%[%' or IndexName like '%]%'
			or ObjectSchema like '%[%' or ObjectSchema like '%]%'
			or ObjectName like '%[%' or ObjectName like '%]%'
			)
		begin
			set @idxIdentifierBegin = '"'
			set @idxIdentifierEnd = '"'
		end
		else 
		begin
			set @idxIdentifierBegin = '['
			set @idxIdentifierEnd = ']'
		end

			
		/* create queue for indexes */
		insert into AzureSQLMaintenanceCMDQueue(txtCMD,ExtraInfo)
		select 
		txtCMD = 'ALTER INDEX ' + @idxIdentifierBegin + IndexName + @idxIdentifierEnd + ' ON '+ @idxIdentifierBegin + ObjectSchema + @idxIdentifierEnd +'.'+ @idxIdentifierBegin + ObjectName + @idxIdentifierEnd + ' ' +
		case when (
					avg_fragmentation_in_percent between @LowFragmentationBoundry and @HighFragmentationBoundry and @mode = 'smart')/* index fragmentation condition */ 
					or 
					(@mode='dummy' and type in (5,6)/* Columnstore indexes in dummy mode -&amp;gt; reorganize them */
				) then
			 'REORGANIZE;'
			when OnlineOpIsNotSupported=1 then
			'REBUILD WITH(ONLINE=OFF,MAXDOP=1);'
			when ObjectDoesNotSupportResumableOperation=1 or @ResumableIndexRebuildSupported=0 or @ResumableIndexRebuild=0 then
			'REBUILD WITH(ONLINE=ON,MAXDOP=1);'
			else
			'REBUILD WITH(ONLINE=ON,MAXDOP=1, RESUMABLE=ON);'
		end
		, ExtraInfo = 
			case when type in (5,6) then
				'Dummy mode, reorganize columnstore indexes'
			else 
				'Current fragmentation: ' + format(avg_fragmentation_in_percent/100,'p')+ ' with ' + cast(page_count as nvarchar(20)) + ' pages'
			end
		from #idxBefore
		where SkipIndex=0 and type != 0 /*Avoid HEAPS*/


		---------------------------------------------
		--- Index - Heaps 
		---------------------------------------------

		/* create queue for heaps */
		if @RebuildHeaps=1 
		begin
			insert into AzureSQLMaintenanceCMDQueue(txtCMD,ExtraInfo)
			select 
			txtCMD = 'ALTER TABLE ' + @idxIdentifierBegin + ObjectSchema + @idxIdentifierEnd +'.'+ @idxIdentifierBegin + ObjectName + @idxIdentifierEnd + ' REBUILD;' 
			, ExtraInfo = 'Rebuilding heap - forwarded records ' + cast(forwarded_record_count as varchar(100)) + ' out of ' + cast(record_count as varchar(100)) + ' record in the table'
			from #idxBefore
			where
				type = 0 /*heaps*/
				and 
					(
						@mode='dummy' 
						or 
						(forwarded_record_count/nullif(record_count,0)&amp;gt;0.3) /* 30% of record count */
						or
						(forwarded_record_count&amp;gt;105000) /* for tables with &amp;gt; 350K rows don't wait for 30%, just run the maintenance once we reach 100K forwarded records */
					)
		end /* create queue for heaps */
	end



	---------------------------------------------
	--- Statistics maintenance
	---------------------------------------------

	if @operation in('statistics','all')
	begin 
		/*Gets Stats for database*/
		raiserror('Get statistics information...',0,0) with nowait;
		select 
			ObjectSchema = OBJECT_SCHEMA_NAME(s.object_id)
			,ObjectName = object_name(s.object_id) 
			,s.object_id
			,s.stats_id
			,StatsName = s.name
			,sp.last_updated
			,sp.rows
			,sp.rows_sampled
			,sp.modification_counter
			, i.type
			, i.type_desc
			,0 as SkipStatistics
		into #statsBefore
		from sys.stats s cross apply sys.dm_db_stats_properties(s.object_id,s.stats_id) sp 
		left join sys.indexes i on sp.object_id = i.object_id and sp.stats_id = i.index_id
		where OBJECT_SCHEMA_NAME(s.object_id) != 'sys' and /*Modified stats or Dummy mode*/(isnull(sp.modification_counter,0)&amp;gt;0 or @mode='dummy')
		order by sp.last_updated asc

		/*Remove statistics if they are handled by index rebuild / reorganize.
		I am removing statistics based on the existence of the index in the list, because for indexes with &amp;lt;5% changes we do not apply
		any action - therefore we might decide to update statistics */
		if @operation= 'all'
		update _stats set SkipStatistics=1 
			from #statsBefore _stats
			join #idxBefore _idx
			on _idx.ObjectSchema = _stats.ObjectSchema
			and _idx.ObjectName = _stats.ObjectName
			and _idx.IndexName = _stats.StatsName 
			where _idx.SkipIndex=0

		/*Skip statistics for Columnstore indexes*/
		update #statsBefore set SkipStatistics=1
		where type in (5,6) /*Column store indexes*/

		/*Skip statistics if a resumable operation is paused on the same object*/
		if @ResumableIndexRebuildSupported=1
		begin
			update _stats set SkipStatistics=1
			from #statsBefore _stats join sys.index_resumable_operations iro on _stats.object_id=iro.object_id and _stats.stats_id=iro.index_id
		end
		
		raiserror('---------------------------------------',0,0) with nowait
		raiserror('Statistics Information:',0,0) with nowait
		raiserror('---------------------------------------',0,0) with nowait

		select @msg = sum(modification_counter) from #statsBefore
		set @msg = 'Total Modifications: ' + @msg
		raiserror(@msg,0,0) with nowait
		
		select @msg = sum(iif(modification_counter&amp;gt;0,1,0)) from #statsBefore
		set @msg = 'Modified Statistics: ' + @msg
		raiserror(@msg,0,0) with nowait
				
		raiserror('---------------------------------------',0,0) with nowait

		/* Choose the identifier to be used based on existing object name */
		if exists(
			select 1
			from #statsBefore 
			where StatsName like '%[%' or StatsName like '%]%'
			or ObjectSchema like '%[%' or ObjectSchema like '%]%'
			or ObjectName like '%[%' or ObjectName like '%]%'
			)
		begin
			set @statsIdentifierBegin = '"'
			set @statsIdentifierEnd = '"'
		end
		else 
		begin
			set @statsIdentifierBegin = '['
			set @statsIdentifierEnd = ']'
		end
		
		/* create queue for update stats */
		insert into AzureSQLMaintenanceCMDQueue(txtCMD,ExtraInfo)
		select 
		txtCMD = 'UPDATE STATISTICS '+ @statsIdentifierBegin + ObjectSchema + +@statsIdentifierEnd + '.'+@statsIdentifierBegin + ObjectName + @statsIdentifierEnd +' (' + @statsIdentifierBegin + StatsName + @statsIdentifierEnd + ') WITH FULLSCAN;'
		, ExtraInfo = '#rows:' + cast([rows] as varchar(100)) + ' #modifications:' + cast(modification_counter as varchar(100)) + ' modification percent: ' + format((1.0 * modification_counter/ rows ),'p')
		from #statsBefore
		where SkipStatistics=0;
	end

	if @operation in('statistics','index','all','resume')
	begin

		declare @SQLCMD nvarchar(max);
		declare @ID int;
		declare @ExtraInfo nvarchar(max);
	
		/*Print debug information in case debug is activated */
		if @debug!='none'
		begin
			drop table if exists idxBefore
			drop table if exists statsBefore
			drop table if exists cmdQueue
			if object_id('tempdb..#idxBefore') is not null select * into idxBefore from #idxBefore
			if object_id('tempdb..#statsBefore') is not null select * into statsBefore from #statsBefore
			if object_id('tempdb..AzureSQLMaintenanceCMDQueue') is not null select * into cmdQueue from AzureSQLMaintenanceCMDQueue
		end

		/*Save current execution parameters in case resume is needed */
		if @operation!='resume'
		begin
			set @ExtraInfo = (select top 1 LogToTable = @LogToTable, operation=@operation, operationTime=@OperationTime, mode=@mode, ResumableIndexRebuild = @ResumableIndexRebuild from sys.tables for JSON path, WITHOUT_ARRAY_WRAPPER)
			set identity_insert AzureSQLMaintenanceCMDQueue on
			insert into AzureSQLMaintenanceCMDQueue(ID,txtCMD,ExtraInfo) values(-1,'parameters to be used by resume code path',@ExtraInfo)
			set identity_insert AzureSQLMaintenanceCMDQueue off
		end
	
		---------------------------------------------
		--- Executing commands
		---------------------------------------------
		/*
		needed to rebuild indexes on computed columns
		if ANSI_WARNINGS is set to OFF we might get the following exception:
			Msg 1934, Level 16, State 1, Line 2
			ALTER INDEX failed because the following SET options have incorrect settings: 'ANSI_WARNINGS'. Verify that SET options are correct for use with indexed views and/or indexes on computed columns and/or filtered indexes and/or query notifications and/or XML data type methods and/or spatial index operations.
		*/
		SET ANSI_WARNINGS ON;

		raiserror('Start executing commands...',0,0) with nowait
		declare @T table(ID int, txtCMD nvarchar(max),ExtraInfo nvarchar(max));
		while exists(select * from AzureSQLMaintenanceCMDQueue where ID&amp;gt;0)
		begin
			update top (1) AzureSQLMaintenanceCMDQueue set txtCMD=txtCMD output deleted.* into @T where ID&amp;gt;0;
			select top (1) @ID = ID, @SQLCMD = txtCMD, @ExtraInfo=ExtraInfo from @T
			raiserror(@SQLCMD,0,0) with nowait
			if @LogToTable=1 insert into AzureSQLMaintenanceLog values(@OperationTime,@SQLCMD,@ExtraInfo,sysdatetime(),null,'Started')
			begin try
				exec(@SQLCMD)	
				if @LogToTable=1 update AzureSQLMaintenanceLog set EndTime = sysdatetime(), StatusMessage = 'Succeeded' where id=SCOPE_IDENTITY()
			end try
			begin catch
				set @ScriptHasAnError=1;
				set @msg = 'FAILED : ' + CAST(ERROR_NUMBER() AS VARCHAR(50)) + ERROR_MESSAGE();
				raiserror(@msg,0,0) with nowait
				if @LogToTable=1 update AzureSQLMaintenanceLog set EndTime = sysdatetime(), StatusMessage = @msg where id=SCOPE_IDENTITY()
			end catch
			delete from AzureSQLMaintenanceCMDQueue where ID = @ID;
			delete from @T
		end
		drop table AzureSQLMaintenanceCMDQueue;
	end
	
	---------------------------------------------
	--- Clean old records from log table
	---------------------------------------------
	if @LogToTable=1
	begin
		delete from AzureSQLMaintenanceLog 
		from 
			AzureSQLMaintenanceLog L join 
			(select distinct OperationTime from AzureSQLMaintenanceLog order by OperationTime desc offset @KeepXOperationInLog rows) F
				ON L.OperationTime = F.OperationTime
		insert into AzureSQLMaintenanceLog values(@OperationTime,null,cast(@@rowcount as varchar(100))+ ' rows purged from log table because number of operations to keep is set to: ' + cast( @KeepXOperationInLog as varchar(100)),sysdatetime(),sysdatetime(),'Cleanup Log Table')
	end

	if @ScriptHasAnError=0 	raiserror('Done',0,0)
	if @LogToTable=1 insert into AzureSQLMaintenanceLog values(@OperationTime,null,null,sysdatetime(),sysdatetime(),'End of operation')
	if @ScriptHasAnError=1 	raiserror('Script has errors - please review the log.',16,1)
end
GO


&lt;/LI-CODE&gt;
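&lt;P&gt;Once the procedure is created, you can run it manually from a query window to verify it before scheduling the runbook. The call below mirrors the one used in the runbook:&lt;/P&gt;
&lt;LI-CODE lang="sql"&gt;-- Run index and statistics maintenance, logging the results to AzureSQLMaintenanceLog
EXEC [dbo].[AzureSQLMaintenance] @operation = 'all', @LogToTable = 1;

-- Review the log written by the procedure
SELECT TOP (50) * FROM dbo.AzureSQLMaintenanceLog ORDER BY id DESC;&lt;/LI-CODE&gt;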
&lt;P&gt;&lt;STRONG&gt;&amp;nbsp;&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;In case of 1 Server multiple DBs&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;OL&gt;
&lt;LI&gt;Create stored procedure “IndexOptimize” as per Ola Hallengren database maintenance scripts.&lt;/LI&gt;
&lt;LI&gt;Create table “CommandLog” for storing output as per&amp;nbsp;Ola Hallengren database maintenance scripts.&lt;/LI&gt;
&lt;LI&gt;Create stored procedure “CommandExecute” to execute and log commands as per the Ola Hallengren database maintenance scripts.&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;STRONG&gt;Note: &lt;/STRONG&gt;You will have to create these objects for each database.&lt;/P&gt;
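&lt;P&gt;After the Ola Hallengren objects are created in a database, IndexOptimize can be tested directly from a query window before wiring it into the runbook. This sketch uses the same parameters as the runbook above; the database name is a placeholder:&lt;/P&gt;
&lt;LI-CODE lang="sql"&gt;-- Reorganize indexes between 5% and 30% fragmentation, rebuild above 30%,
-- and only touch indexes with at least 1000 pages
EXECUTE dbo.IndexOptimize
    @Databases = 'MyDatabase',   -- replace with the name of the current database
    @MinNumberOfPages = 1000,
    @FragmentationLow = NULL,
    @FragmentationMedium = 'INDEX_REORGANIZE,INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
    @FragmentationHigh = 'INDEX_REBUILD_ONLINE,INDEX_REBUILD_OFFLINE',
    @FragmentationLevel1 = 5,
    @FragmentationLevel2 = 30,
    @LogToTable = 'Y';&lt;/LI-CODE&gt;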
&lt;P&gt;&lt;STRONG&gt;Schedule task&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Click on Schedules&lt;/LI&gt;
&lt;LI&gt;Click on "Add a schedule" and follow the instructions to choose existing schedule or create a new schedule.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Choose a time when the application has little or no workload, as running the maintenance might impact performance while it is executing.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Monitoring&lt;/STRONG&gt;&lt;BR /&gt;&lt;BR /&gt;You can monitor the success of the job by reviewing the Automation overview page&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;You can click on a specific execution and get more details about it including the output of the script&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;To check the output, you can also query the log tables created by the maintenance scripts:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-30px"&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 13 Jun 2022 04:45:37 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/sql-db-maintenance-activity-using-automation-accounts-for-azure/ba-p/3370081</guid>
      <dc:creator>Swati_Srivastava-MSFT</dc:creator>
      <dc:date>2022-06-13T04:45:37Z</dc:date>
    </item>
    <item>
      <title>Azure Data Factory CI-CD using YAML template</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/azure-data-factory-ci-cd-using-yaml-template/ba-p/3107341</link>
      <description>&lt;P&gt;&lt;STRONG&gt;&lt;FONT size="6"&gt;Purpose of this Blog&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Defines the end-to-end Git integration and deployment flow using a YAML template from a project implementation perspective&lt;/LI&gt;
&lt;LI&gt;This can be adopted easily: you just need to include this YAML template and make a few changes, as it has already been implemented successfully for deployments across several projects.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;FONT size="6"&gt;&lt;STRONG&gt;Developer Workflow(CI/CD)&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;&lt;FONT size="5"&gt;&lt;STRONG&gt;Integration of Code from Data Factory UI(Continuous Integration)&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;1. A sandbox Data Factory is created for development of data pipelines with datasets and linked services.&lt;BR /&gt;The Data Factory is configured with Azure DevOps Git (collaboration and publish branch) and the root folder where the data factory code is committed.&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;2. A feature branch is created based on the main/collaboration branch for development. The branch in the Data Factory UI is changed to the feature branch.&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;3. After the developer tests the pipelines and is satisfied with the changes, a pull request is raised to merge&lt;BR /&gt;the changes into the main/collaboration branch.&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;4. After the PR is approved by the concerned product leads, the changes made in the feature branch are&lt;BR /&gt;merged into the main branch.&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;5. The branch is changed to the main branch in the Data Factory UI. The changes are published to the main branch by&lt;BR /&gt;clicking the Publish button.&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;6. The changes are reflected in the ARM template located in the adf_publish branch.&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;&lt;EM&gt;Note: If some parameters are missing in the ARM template, follow the steps below:&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;&lt;EM&gt;I. Go to the Manage hub&amp;nbsp;&lt;img /&gt;&amp;nbsp;in the ADF UI.&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;&lt;EM&gt;II. Click ARM template.&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;&lt;EM&gt;III. Click Edit parameter configuration and add the parameters that were missing from the ARM template to the JSON configuration file.&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;&lt;EM&gt;IV. Commit this JSON configuration file to the main branch, under the root folder configured in the Git configuration, for example 'src/DataFactory/{DataFactoryName}/arm-template-parameters-definition.json'.&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;EM&gt;- Follow this documentation on how to use custom parameters with the Resource Manager template:&amp;nbsp;&lt;A href="https://docs.microsoft.com/en-us/azure/data-factory/continuous-integration-delivery-resource-manager-custom-parameters" target="_blank" rel="noopener"&gt;Custom parameters in a Resource Manager template - Azure Data Factory | Microsoft Docs&lt;/A&gt;&amp;nbsp;&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
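&lt;P&gt;&lt;EM&gt;As a hypothetical illustration of steps III-IV (the property names here are examples only; match them to your own linked services), a minimal 'arm-template-parameters-definition.json' that surfaces linked-service connection strings and URLs as ARM template parameters could look like this:&lt;/EM&gt;&lt;/P&gt;

```json
{
    "Microsoft.DataFactory/factories/linkedServices": {
        "*": {
            "properties": {
                "typeProperties": {
                    "connectionString": "=",
                    "url": "="
                }
            }
        }
    }
}
```

&lt;P&gt;&lt;EM&gt;Here '*' applies the rule to every linked service, and '=' parameterizes the property while keeping the current value as the default, per the custom-parameter syntax in the documentation linked above.&lt;/EM&gt;&lt;/P&gt;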
&lt;P&gt;&lt;BR /&gt;7. A release pipeline is built in YAML to deploy the Data Factory to the Dev, QA, UAT, and Production environments using the ARM template (the&lt;BR /&gt;location of the ARM template file must be supplied).&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;FONT size="5"&gt;Automated publish of the ADF ARM Template:&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;SPAN&gt;The "&lt;/SPAN&gt;&lt;STRONG&gt;Automated publish&lt;/STRONG&gt;&lt;SPAN&gt;" improvement takes the&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;validate all&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;and&amp;nbsp;&lt;/SPAN&gt;&lt;STRONG&gt;export Azure Resource Manager (ARM) template&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;functionality&amp;nbsp;from the ADF UI and makes the logic consumable via a publicly available npm package,&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://www.npmjs.com/package/@microsoft/azure-data-factory-utilities" target="_blank" rel="noopener nofollow noreferrer"&gt;@microsoft/azure-data-factory-utilities&lt;/A&gt;&lt;SPAN&gt;. This allows you to trigger these actions programmatically instead of clicking Publish in the ADF UI, giving your CI/CD pipelines a truer continuous-integration experience. (Adapted from the blog linked below.)&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Please follow this tech community blog for more details :&amp;nbsp;&lt;A href="https://techcommunity.microsoft.com/t5/azure-data-factory-blog/automated-publish-improvement-in-adf-s-ci-cd-flow/ba-p/2117350" target="_blank" rel="noopener"&gt;Automated publish improvement in ADF's CI/CD flow - Microsoft Tech Community&lt;/A&gt;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;Please follow this Microsoft Documentation for the same :&amp;nbsp;&lt;A href="https://docs.microsoft.com/en-us/azure/data-factory/continuous-integration-delivery-improvements#create-an-azure-pipeline" target="_blank" rel="noopener"&gt;Automated publishing for continuous integration and delivery - Azure Data Factory | Microsoft Docs&lt;/A&gt;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
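&lt;P&gt;The build (CI) side of this flow can be sketched as an Azure Pipelines YAML that installs the npm package and runs its validate and export commands. This is a sketch following the documentation linked above: the folder layout (a 'build' folder containing a package.json that references @microsoft/azure-data-factory-utilities) and the subscription, resource group, and factory IDs are placeholders to adapt.&lt;/P&gt;

```yaml
# Build pipeline sketch for the automated-publish flow (placeholders in CAPS).
trigger:
- main

pool:
  vmImage: ubuntu-latest

steps:
- task: NodeTool@0
  displayName: Install Node.js
  inputs:
    versionSpec: '18.x'

- task: Npm@1
  displayName: Install azure-data-factory-utilities
  inputs:
    command: install
    workingDir: $(Build.Repository.LocalPath)/build   # folder containing package.json

- task: Npm@1
  displayName: Validate Data Factory resources
  inputs:
    command: custom
    workingDir: $(Build.Repository.LocalPath)/build
    customCommand: run build validate $(Build.Repository.LocalPath) /subscriptions/SUB_ID/resourceGroups/RG_NAME/providers/Microsoft.DataFactory/factories/FACTORY_NAME

- task: Npm@1
  displayName: Generate ARM template
  inputs:
    command: custom
    workingDir: $(Build.Repository.LocalPath)/build
    customCommand: run build export $(Build.Repository.LocalPath) /subscriptions/SUB_ID/resourceGroups/RG_NAME/providers/Microsoft.DataFactory/factories/FACTORY_NAME "ArmTemplate"

- task: PublishPipelineArtifact@1
  displayName: Publish ARM template artifact
  inputs:
    targetPath: $(Build.Repository.LocalPath)/build/ArmTemplate
    artifact: ArmTemplates
```

&lt;P&gt;With this in place, the release pipeline described next consumes the published artifact instead of the adf_publish branch, so no manual Publish click is required.&lt;/P&gt;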
&lt;P&gt;&lt;FONT size="5"&gt;&lt;STRONG&gt;YAML Release Pipeline (Continuous Deployment)&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Run the release pipeline for the specified target environment.&lt;/LI&gt;
&lt;LI&gt;This will download the previously generated ARM template, along with secure connection strings&lt;BR /&gt;from Azure Key Vault. Then it will deploy to your&lt;BR /&gt;target Data Factory using an ARM template deployment.&lt;/LI&gt;
&lt;LI&gt;The code below shows how to run the release job on a Microsoft-hosted agent&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;stages:
- stage: Release
  displayName: Release stage
  jobs:
    - job: Release
      displayName: Release job
      pool:
        vmImage: 'Windows-2019'
&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT size="4"&gt;Trigger:&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;This pipeline is triggered by changes pushed to the&amp;nbsp;&lt;STRONG&gt;adf_publish&lt;/STRONG&gt; branch&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;trigger:
  branches:
    include:
    - adf_publish&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;&lt;STRONG&gt;&lt;FONT size="4"&gt;Resources:&lt;/FONT&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;The branch where the ARM template is located is specified here.&lt;BR /&gt;As noted in the previous point, the ARM template will be picked up from the adf_publish (publish) branch&lt;/LI&gt;
&lt;LI&gt;Example:&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;resources:
  repositories:
  - repository: &amp;lt;repo name&amp;gt;
    type: git
    name: &amp;lt;repo name&amp;gt;
    ref: adf_publish&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;FONT size="4"&gt;Steps:&lt;/FONT&gt;&lt;/STRONG&gt;&lt;BR /&gt;1. &lt;STRONG&gt;Deployment Variables&lt;/STRONG&gt; (variables declared)&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt; #Deployment variables
variables:
  KeyVaultName: &amp;lt;keyvaultname&amp;gt;
  azureSubscription: &amp;lt;ServiceConnection&amp;gt;
  SourceDataFactoryName : &amp;lt;Source Data Factory name from which code is published&amp;gt;
  DeployDataFactoryName : &amp;lt;Target Data Factory Name&amp;gt;
  DeploymentResourceGroupName : &amp;lt;Target Resource Group Name&amp;gt;

&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;2. &lt;STRONG&gt;Key Vault task&lt;/STRONG&gt; to fetch the secrets (used to override ARM template parameters):&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;   - task: AzureKeyVault@1
     inputs:
       azureSubscription: '$(azureSubscription)'
       KeyVaultName: $(KeyVaultName)
       SecretsFilter: '&amp;lt;secrets needed for overriding ARM parameters&amp;gt;'
       RunAsPreJob: true

   - checkout: &amp;lt;Repo Name&amp;gt;&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;3. &lt;STRONG&gt;Stopping the triggers&lt;/STRONG&gt; of the target Data Factory before deployment (PowerShell task)&lt;BR /&gt;Script:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;- task: AzurePowerShell@5
  displayName: Stop Triggers
  inputs:
    azureSubscription: '$(azureSubscription)'
    ScriptType: 'InlineScript'
    Inline: |
      $triggersADF = Get-AzDataFactoryV2Trigger -DataFactoryName "$(DeployDataFactoryName)" -ResourceGroupName "$(DeploymentResourceGroupName)"
      $triggersADF | ForEach-Object { Stop-AzDataFactoryV2Trigger -ResourceGroupName "$(DeploymentResourceGroupName)" -DataFactoryName "$(DeployDataFactoryName)" -Name $_.name -Force }
    azurePowerShellVersion: 'LatestVersion'&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;4. &lt;STRONG&gt;Deploying the ARM template to the target Data Factory&lt;/STRONG&gt; in incremental mode, passing values to some of the&lt;BR /&gt;parameters. (PowerShell task)&lt;BR /&gt;Script:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;- task: AzurePowerShell@5
  displayName: Deploy ADF
  inputs:
    azureSubscription: '$(azureSubscription)'
    ScriptType: 'InlineScript'
    Inline: |
      # ARM template parameters can be overridden with Key Vault secrets here,
      # e.g. -&amp;lt;parameter-overridden&amp;gt; "&amp;lt;value-to-be-overridden&amp;gt;"
      New-AzResourceGroupDeployment `
        -ResourceGroupName "$(DeploymentResourceGroupName)" `
        -TemplateFile "$(System.DefaultWorkingDirectory)/$(SourceDataFactoryName)/ARMTemplateForFactory.json" `
        -TemplateParameterFile "$(System.DefaultWorkingDirectory)/$(SourceDataFactoryName)/ARMTemplateParametersForFactory.json" `
        -factoryName "$(DeployDataFactoryName)" `
        -Mode "Incremental"
    azurePowerShellVersion: 'LatestVersion'&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;5. &lt;STRONG&gt;Starting the triggers&lt;/STRONG&gt; after a successful deployment (PowerShell task)&lt;BR /&gt;Script:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;- task: AzurePowerShell@5
  displayName: Restart
  inputs:
    azureSubscription: '$(azureSubscription)'
    ScriptType: 'InlineScript'
    Inline: |
      $triggersADF = Get-AzDataFactoryV2Trigger -DataFactoryName   "$(DeployDataFactoryName)" -ResourceGroupName "$(DeploymentResourceGroupName)";
      $triggersADF | ForEach-Object { Start-AzDataFactoryV2Trigger -ResourceGroupName "$(DeploymentResourceGroupName)" -DataFactoryName "$(DeployDataFactoryName)" -Name $_.name -Force }
    azurePowerShellVersion: 'LatestVersion'&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;FONT size="5"&gt;&lt;STRONG&gt;Full ADF CD Template:&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;name: Release-$(rev:r)

trigger:
  branches:
    include:
    - adf_publish

resources:
  repositories:
  - repository: &amp;lt;repo name&amp;gt;
    type: git
    name: &amp;lt;repo name&amp;gt;
    ref: adf_publish


variables:
  KeyVaultName: &amp;lt;keyvaultname&amp;gt;
  azureSubscription: &amp;lt;ServiceConnection&amp;gt;
  SourceDataFactoryName : &amp;lt;Source Data Factory name from which code is published&amp;gt;
  DeployDataFactoryName : &amp;lt;Target Data Factory Name&amp;gt;
  DeploymentResourceGroupName : &amp;lt;Target Resource Group Name&amp;gt;

stages:
- stage: Release
  displayName: Release stage
  jobs:
    - job: Release
      displayName: Release job
      pool:
        vmImage: 'Windows-2019'
      steps:
        - task: AzureKeyVault@1
          inputs:
            azureSubscription: '$(azureSubscription)'
            KeyVaultName: $(KeyVaultName)
            SecretsFilter: '&amp;lt;secrets needed for overriding ARM parameters&amp;gt;'
            RunAsPreJob: true

        - checkout: &amp;lt;repo name&amp;gt;

        - task: AzurePowerShell@5
          displayName: Stop Triggers
          inputs:
            azureSubscription: '$(azureSubscription)'
            ScriptType: 'InlineScript'
            Inline: |
              $triggersADF = Get-AzDataFactoryV2Trigger -DataFactoryName "$(DeployDataFactoryName)" -ResourceGroupName "$(DeploymentResourceGroupName)"
              $triggersADF | ForEach-Object { Stop-AzDataFactoryV2Trigger -ResourceGroupName "$(DeploymentResourceGroupName)" -DataFactoryName "$(DeployDataFactoryName)" -Name $_.name -Force }
            azurePowerShellVersion: 'LatestVersion'
        - task: AzurePowerShell@5
          displayName: Deploy ADF
          inputs:
            azureSubscription: '$(azureSubscription)'
            ScriptType: 'InlineScript'
            Inline: |
              # ARM template parameters can be overridden with Key Vault secrets here,
              # e.g. -&amp;lt;parameter-overridden&amp;gt; "&amp;lt;value-to-be-overridden&amp;gt;"
              New-AzResourceGroupDeployment `
                -ResourceGroupName "$(DeploymentResourceGroupName)" `
                -TemplateFile "$(System.DefaultWorkingDirectory)/$(SourceDataFactoryName)/ARMTemplateForFactory.json" `
                -TemplateParameterFile "$(System.DefaultWorkingDirectory)/$(SourceDataFactoryName)/ARMTemplateParametersForFactory.json" `
                -factoryName "$(DeployDataFactoryName)" `
                -Mode "Incremental"
            azurePowerShellVersion: 'LatestVersion'
        - task: AzurePowerShell@5
          displayName: Restart Triggers
          inputs:
            azureSubscription: '$(azureSubscription)'
            ScriptType: 'InlineScript'
            Inline: |
              $triggersADF = Get-AzDataFactoryV2Trigger -DataFactoryName   "$(DeployDataFactoryName)" -ResourceGroupName "$(DeploymentResourceGroupName)";
              $triggersADF | ForEach-Object { Start-AzDataFactoryV2Trigger -ResourceGroupName "$(DeploymentResourceGroupName)" -DataFactoryName "$(DeployDataFactoryName)" -Name $_.name -Force }
            azurePowerShellVersion: 'LatestVersion'
       
  
&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
      <pubDate>Thu, 17 Mar 2022 10:54:09 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/azure-data-factory-ci-cd-using-yaml-template/ba-p/3107341</guid>
      <dc:creator>Akshay_Attota</dc:creator>
      <dc:date>2022-03-17T10:54:09Z</dc:date>
    </item>
    <item>
      <title>Azure Data Components Network Architecture with secure configurations</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/azure-data-components-network-architecture-with-secure/ba-p/3141331</link>
<description>&lt;H1&gt;Use Case:&lt;/H1&gt;
&lt;UL&gt;
&lt;LI&gt;This blog is useful when the code of the data components (ADF, ADB, and SQL Pool) must be promoted to higher environments securely, without public internet access&lt;/LI&gt;
&lt;LI&gt;For this use case, the data components are integrated into VNETs and public access is disabled&lt;/LI&gt;
&lt;LI&gt;The deployment (CI/CD) pipelines are built so that they can only deploy securely via a self-hosted agent that has access to the VNET&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;This blog will guide you through setting up the data components securely; a network diagram is included.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H1 id="user-content-network-architecture-for-azure-data-resources-in-v-net"&gt;Network Architecture for Azure Data Resources in V-NET&lt;/H1&gt;
&lt;UL&gt;
&lt;LI&gt;
&lt;P&gt;The network architecture diagram below shows the Azure data components (Azure Data Factory, Azure Databricks, Azure Synapse) in secure virtual networks.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;None of these Azure data components is accessible from the public internet, and they are connected to each other securely.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Virtual Networks and components in the Network Architecture Diagram:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Azure Synapse workspace and Azure Data Factory are provisioned with managed virtual networks (Azure Data Factory Managed VNET, Synapse Managed VNET)&lt;/LI&gt;
&lt;LI&gt;The Azure Databricks workspace is within a custom virtual network (Data VNET)&lt;/LI&gt;
&lt;LI&gt;The Azure Storage Accounts, Azure Key Vault, Azure Synapse workspace, and Azure Data Factory are connected to the Data VNET using private endpoints, so that data transfer between these components is secure.&lt;/LI&gt;
&lt;LI&gt;A virtual machine (within the Data VNET, on a separate subnet) is configured as the ADF SHIR (Self-Hosted Integration Runtime) to run Azure Databricks notebooks from Azure Data Factory.&lt;/LI&gt;
&lt;LI&gt;A virtual machine within the Data VNET is configured as an Azure DevOps self-hosted build agent to run the CI/CD (continuous integration, continuous deployment) pipelines, since these data components are not accessible from the public internet.&lt;/LI&gt;
&lt;LI&gt;A virtual machine used as a jumpbox with Bastion login is configured so that application code can be accessed securely, ONLY in the DEV environment. (This machine is not present in any higher environment)&lt;/LI&gt;
&lt;/OL&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H1 id="user-content-azure-data-components-secure-network-setup"&gt;Azure Data Components Secure Network Setup&lt;/H1&gt;
&lt;UL&gt;
&lt;LI&gt;This section explains how the data components are configured securely so that only components within the virtual networks can access them and the public internet access is restricted.&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2 id="user-content-synapse-secure-network-setup%3A"&gt;Synapse Secure Network setup:&lt;/H2&gt;
&lt;UL&gt;
&lt;LI&gt;The Synapse workspace is set up with a managed VNET&lt;/LI&gt;
&lt;LI&gt;The Synapse workspace is configured using a private link hub&lt;/LI&gt;
&lt;LI&gt;The Synapse workspace must be connected via private endpoints through the private link hub.&lt;BR /&gt;&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN style="font-family: inherit;"&gt;Public Network Access to workspace endpoints must be disabled&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;TDE (Transparent Data Encryption) needs to be enabled&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;A managed identity needs to be provisioned during creation&lt;BR /&gt;&lt;img /&gt;&lt;/LI&gt;
&lt;LI&gt;The SQL Active Directory admin should be set to a group, not to one specific user&lt;/LI&gt;
&lt;LI&gt;Azure resource locks should be turned on to prevent accidental deletion by users&lt;/LI&gt;
&lt;/UL&gt;
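&lt;P&gt;The resource-lock recommendation above can be scripted with the Azure CLI. The sketch below only composes the command (the resource names are placeholders); actually running it requires 'az login' and the correct scope:&lt;/P&gt;

```shell
# Compose an Azure CLI command that puts a CanNotDelete lock on the
# Synapse workspace (names below are placeholders, not real resources).
RG=rg-data-dev                # placeholder resource group
WS=syn-ws-dev                 # placeholder Synapse workspace name
CMD="az lock create --name lock-no-delete --lock-type CanNotDelete --resource-group $RG --resource-name $WS --resource-type Microsoft.Synapse/workspaces"
# Print the command for review; run it manually once logged in with 'az login'.
echo "$CMD"
```

&lt;P&gt;A CanNotDelete lock still allows reads and modifications; use ReadOnly only where configuration drift must also be blocked.&lt;/P&gt;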
&lt;H2 id="user-content-databricks-workspace-secure-network-setup%3A"&gt;Databricks Workspace Secure Network Setup:&lt;/H2&gt;
&lt;UL&gt;
&lt;LI&gt;
&lt;P&gt;Enable Databricks IP access list API in order to:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Restrict Portal access to Databricks workspace for specific IP addresses&lt;/LI&gt;
&lt;LI&gt;Restrict Databricks API calls to specific IP addresses&lt;BR /&gt;&lt;img /&gt;&lt;/LI&gt;
&lt;/OL&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Configure the Databricks workspace with VNET injection and with no public IP (NPIP) enabled&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Encrypt communication between Databricks nodes using global init scripts&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Choose an &amp;lt;init-script-folder&amp;gt; location where the init script will be placed.&lt;/LI&gt;
&lt;LI&gt;Run the notebook below to create the script enable-encryption.sh.&lt;/LI&gt;
&lt;LI&gt;Configure the Databricks workspace with the enable-encryption.sh global init script using the global init script REST API.&lt;/LI&gt;
&lt;LI&gt;Below is the notebook to create enable-encryption.sh:&lt;/LI&gt;
&lt;/OL&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="bash"&gt;dbutils.fs.put("dbfs:/&amp;lt;init-script-folder&amp;gt;/init/enable-encryption.sh", """
#!/bin/bash
 
keystore_file="$DB_HOME/keys/jetty_ssl_driver_keystore.jks"
keystore_password="gb1gQqZ9ZIHS"
sasl_secret=$(sha256sum $keystore_file | cut -d' ' -f1)
 
if [[ $DB_IS_DRIVER = "TRUE" ]]; then
  driver_conf=${DB_HOME}/driver/conf/spark-branch.conf
  echo "Configuring driver conf at $driver_conf"
  if [ ! -e $driver_conf ] ; then
    touch $driver_conf
  fi
    
  head -n 1 ${DB_HOME}/driver/conf/spark-branch.conf &amp;gt;&amp;gt; $driver_conf
 
  echo "  // Authenticate"&amp;gt;&amp;gt; $driver_conf
  echo "  \\"spark.authenticate\\" = true" &amp;gt;&amp;gt; $driver_conf
  echo "  \\"spark.authenticate.secret\\" = \\"$sasl_secret\\"" &amp;gt;&amp;gt; $driver_conf
 
  echo "  // Configure AES encryption"&amp;gt;&amp;gt; $driver_conf
  echo "  \\"spark.network.crypto.enabled\\" = true" &amp;gt;&amp;gt; $driver_conf
  echo "  \\"spark.network.crypto.saslFallback\\" = false" &amp;gt;&amp;gt; $driver_conf
 
  echo "  // Configure SSL"&amp;gt;&amp;gt; $driver_conf
  echo "  \\"spark.ssl.enabled\\" = true" &amp;gt;&amp;gt; $driver_conf
  echo "  \\"spark.ssl.keyPassword\\" = \\"$keystore_password\\"" &amp;gt;&amp;gt; $driver_conf
  echo "  \\"spark.ssl.keyStore\\" = \\"$keystore_file\\"" &amp;gt;&amp;gt; $driver_conf
  echo "  \\"spark.ssl.keyStorePassword\\" = \\"$keystore_password\\"" &amp;gt;&amp;gt; $driver_conf
  echo "  \\"spark.ssl.protocol\\" = \\"TLSv1.2\\"" &amp;gt;&amp;gt; $driver_conf
  echo "  \\"spark.ssl.standalone.enabled\\" = true" &amp;gt;&amp;gt; $driver_conf
  echo "  \\"spark.ssl.ui.enabled\\" = true" &amp;gt;&amp;gt; $driver_conf
  echo " }"  &amp;gt;&amp;gt; $driver_conf
  echo "Successfully configured driver conf at $driver_conf"
fi  
 
spark_defaults_conf="$DB_HOME/spark/conf/spark-defaults.conf"
echo "Configuring spark defaults conf at $spark_defaults_conf"
if [ ! -e $spark_defaults_conf ] ; then
  touch $spark_defaults_conf
fi
echo "spark.authenticate true" &amp;gt;&amp;gt; $spark_defaults_conf
echo "spark.authenticate.secret $sasl_secret" &amp;gt;&amp;gt; $spark_defaults_conf
 
echo "spark.network.crypto.enabled true" &amp;gt;&amp;gt; $spark_defaults_conf
echo "spark.network.crypto.saslFallback false" &amp;gt;&amp;gt; $spark_defaults_conf
 
echo "spark.ssl.enabled true" &amp;gt;&amp;gt; $spark_defaults_conf
echo "spark.ssl.keyPassword $keystore_password" &amp;gt;&amp;gt; $spark_defaults_conf
echo "spark.ssl.keyStore $keystore_file" &amp;gt;&amp;gt; $spark_defaults_conf
echo "spark.ssl.keyStorePassword $keystore_password" &amp;gt;&amp;gt; $spark_defaults_conf
echo "spark.ssl.protocol TLSv1.2" &amp;gt;&amp;gt; $spark_defaults_conf
echo "spark.ssl.standalone.enabled true" &amp;gt;&amp;gt; $spark_defaults_conf
echo "spark.ssl.ui.enabled true" &amp;gt;&amp;gt; $spark_defaults_conf
echo "Successfully configured spark defaults conf at $spark_defaults_conf"
""", True)&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Follow this documentation for further details to set up encryption as a global init script:&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;A class="" href="https://docs.microsoft.com/en-us/azure/databricks/security/encryption/encrypt-otw" target="_blank" rel="noopener noreferrer"&gt;https://docs.microsoft.com/en-us/azure/databricks/security/encryption/encrypt-otw&lt;/A&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
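&lt;P&gt;As a hedged sketch of the registration step, the generated script can be pushed through the Databricks Global Init Scripts REST API (POST /api/2.0/global-init-scripts), which expects the script body base64-encoded. The workspace URL and token below are placeholders, and a stub enable-encryption.sh is created only so the sketch is self-contained:&lt;/P&gt;

```shell
# Stand-in for the enable-encryption.sh produced by the notebook above;
# created here only so this sketch can run end to end.
printf '#!/bin/bash\n' | tee enable-encryption.sh

# The Global Init Scripts API expects the script body base64-encoded.
SCRIPT_B64=$(base64 enable-encryption.sh | tr -d '\n')

# Build the JSON payload for POST /api/2.0/global-init-scripts.
printf '{"name": "enable-encryption", "script": "%s", "enabled": true, "position": 0}\n' "$SCRIPT_B64" | tee payload.json

# Register it against a real workspace (placeholders in CAPS; do not run as-is):
# curl -X POST "https://WORKSPACE_HOST/api/2.0/global-init-scripts" -H "Authorization: Bearer YOUR_PAT" -d @payload.json
```

&lt;P&gt;The same registration can also be done from the admin console UI; the API route is preferable when the workspace is provisioned through automation.&lt;/P&gt;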
&lt;H2 id="user-content-azure-data-factory-secure-network-setup%3A"&gt;Azure Data Factory Secure Network Setup:&lt;/H2&gt;
&lt;UL&gt;
&lt;LI&gt;The ADF is provisioned with a managed VNET&lt;/LI&gt;
&lt;LI&gt;Network access to the ADF is set to private endpoints connected to the Data VNET (custom VNET)&lt;BR /&gt;&lt;img /&gt;&lt;/LI&gt;
&lt;LI&gt;Create a SHIR (Self Hosted Integration Runtime) for the Data Factory to access resources within the Data VNET.&lt;BR /&gt;&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;SHIR in Linked Services
&lt;UL&gt;
&lt;LI&gt;Data Factory is connected to Databricks via a SHIR that is in the same Databricks VNET, but on a separate subnet. The connection is authenticated via managed identity, which must have Contributor RBAC permissions on the Databricks workspace.&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI&gt;Example of a Databricks linked service:&lt;/LI&gt;
&lt;/UL&gt;
&lt;PRE class="hljs"&gt;&lt;CODE class="json"&gt;{
   &lt;SPAN class="hljs-attr"&gt;"name"&lt;/SPAN&gt;: &lt;SPAN class="hljs-string"&gt;"ls_databricks"&lt;/SPAN&gt;,
   &lt;SPAN class="hljs-attr"&gt;"properties"&lt;/SPAN&gt;: {
       &lt;SPAN class="hljs-attr"&gt;"description"&lt;/SPAN&gt;: &lt;SPAN class="hljs-string"&gt;"Linked Service for connecting to Databricks"&lt;/SPAN&gt;,
       &lt;SPAN class="hljs-attr"&gt;"annotations"&lt;/SPAN&gt;: [],
       &lt;SPAN class="hljs-attr"&gt;"type"&lt;/SPAN&gt;: &lt;SPAN class="hljs-string"&gt;"AzureDatabricks"&lt;/SPAN&gt;,
       &lt;SPAN class="hljs-attr"&gt;"typeProperties"&lt;/SPAN&gt;: {
           &lt;SPAN class="hljs-attr"&gt;"domain"&lt;/SPAN&gt;: &lt;SPAN class="hljs-string"&gt;"https://adb-XXXXX.net"&lt;/SPAN&gt;,
           &lt;SPAN class="hljs-attr"&gt;"authentication"&lt;/SPAN&gt;: &lt;SPAN class="hljs-string"&gt;"MSI"&lt;/SPAN&gt;,
           &lt;SPAN class="hljs-attr"&gt;"workspaceResourceId"&lt;/SPAN&gt;: &lt;SPAN class="hljs-string"&gt;"/subscriptions/XXXXXX/resourceGroups/rg-dev/providers/Microsoft.Databricks/workspaces/XXXXX"&lt;/SPAN&gt;,
           &lt;SPAN class="hljs-attr"&gt;"newClusterNodeType"&lt;/SPAN&gt;: &lt;SPAN class="hljs-string"&gt;"Standard_DS4_v2"&lt;/SPAN&gt;,
           &lt;SPAN class="hljs-attr"&gt;"newClusterNumOfWorker"&lt;/SPAN&gt;: &lt;SPAN class="hljs-string"&gt;"2:10"&lt;/SPAN&gt;,
           &lt;SPAN class="hljs-attr"&gt;"newClusterSparkEnvVars"&lt;/SPAN&gt;: {
               &lt;SPAN class="hljs-attr"&gt;"PYSPARK_PYTHON"&lt;/SPAN&gt;: &lt;SPAN class="hljs-string"&gt;"/databricks/python3/bin/python3"&lt;/SPAN&gt;
           },
           &lt;SPAN class="hljs-attr"&gt;"newClusterVersion"&lt;/SPAN&gt;: &lt;SPAN class="hljs-string"&gt;"8.2.x-scala2.12"&lt;/SPAN&gt;,
           &lt;SPAN class="hljs-attr"&gt;"newClusterInitScripts"&lt;/SPAN&gt;: []
       },
       &lt;SPAN class="hljs-attr"&gt;"connectVia"&lt;/SPAN&gt;: {
           &lt;SPAN class="hljs-attr"&gt;"referenceName"&lt;/SPAN&gt;: &lt;SPAN class="hljs-string"&gt;"selfHostedIr"&lt;/SPAN&gt;,
           &lt;SPAN class="hljs-attr"&gt;"type"&lt;/SPAN&gt;: &lt;SPAN class="hljs-string"&gt;"IntegrationRuntimeReference"&lt;/SPAN&gt;
       }
   }
}
&lt;/CODE&gt;&lt;/PRE&gt;
&lt;UL&gt;
&lt;LI&gt;Create managed private endpoints for accessing resources outside the ADF managed VNET that don't have public internet access. For example, the Synapse SQL pool cannot be accessed from the public internet and sits outside the ADF managed VNET, so a managed private endpoint needs to be created for Data Factory to access the Synapse SQL pool.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;H2 id="user-content-azure-key-vault"&gt;Azure Key Vault&lt;/H2&gt;
&lt;UL&gt;
&lt;LI&gt;Azure Key Vault should be configured with a private endpoint to prevent access from the public internet.&lt;/LI&gt;
&lt;LI&gt;In addition to being used for secret management, Azure Key Vault can be integrated with Azure Databricks as a Key Vault-backed secret scope.&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2 id="user-content-azure-datalake-storage-accounts"&gt;Azure Datalake Storage Accounts&lt;/H2&gt;
&lt;UL&gt;
&lt;LI&gt;Public Access to all Data Lakes should be disabled.&lt;/LI&gt;
&lt;LI&gt;Private Endpoint Access should be configured for all Data Lakes&lt;/LI&gt;
&lt;LI&gt;VNET access is configured where necessary for Azure Storage Explorer on VMs located in the custom VNET.&lt;/LI&gt;
&lt;LI&gt;ACL permissions to containers are programmatically handled via PowerShell code&lt;/LI&gt;
&lt;LI&gt;RBAC roles are granted to Azure resource managed identities only when specifically required, e.g. the Storage Blob Data Contributor role for Azure Data Factory.&lt;/LI&gt;
&lt;LI&gt;Along with the points above, here are the storage exceptions and network routing settings&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;H2 id="user-content-self-hosted-agent-installation-procedure%3A"&gt;Self Hosted Agent Installation Procedure:&lt;/H2&gt;
&lt;P&gt;&lt;U&gt;&lt;STRONG&gt;Purpose:&lt;/STRONG&gt;&lt;/U&gt;&lt;/P&gt;
&lt;P&gt;In order to run CI and CD pipelines through a secure VNET, we need to register a VM (connected to a VNET/subnet) as a self-hosted agent in Azure DevOps.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;U&gt;&lt;STRONG&gt;Installation Procedure:&lt;/STRONG&gt;&lt;/U&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Log on to the VM you want the self hosted agent installed on via Bastion
&lt;UL&gt;
&lt;LI&gt;&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI&gt;Within the Virtual Machine, navigate to the Azure Devops website using a web browser and log in.&lt;/LI&gt;
&lt;LI&gt;Create a new agent pool or use an existing pool
&lt;UL&gt;
&lt;LI&gt;Navigate to Project settings / Agent pools within Azure DevOps to create a new pool or view the existing pool that you want to use&lt;/LI&gt;
&lt;LI&gt;&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;Follow the add pool dialogue to create a new pool if needed&lt;/LI&gt;
&lt;LI&gt;&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI&gt;Within the pool you want to use, navigate to the agents tab and select new agent.
&lt;UL&gt;
&lt;LI&gt;&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI&gt;Follow the instructions outlined here:&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;A class="" href="https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/v2-windows?view=azure-devops" target="_blank" rel="noopener noreferrer"&gt;https://docs.microsoft.com/en-us/azure/devops/pipelines/agents/v2-windows?view=azure-devops&lt;/A&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;
&lt;UL&gt;
&lt;LI&gt;This will show you how to create a PAT token to authenticate the agent (this token is only used once at authentication time and never used again)&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Make sure to install the agent to run as a service&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/OL&gt;
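&lt;P&gt;For repeatable setups, the agent can also be configured unattended from the extracted agent folder on the VM. The sketch below only composes the command string (the organization URL, PAT, and pool name are placeholders); note the --runAsService flag, which corresponds to the run-as-a-service instruction above:&lt;/P&gt;

```shell
# Compose the unattended registration command for the Windows self-hosted
# agent (placeholders in CAPS; run the result from the agent folder on the VM).
ORG_URL="https://dev.azure.com/YOUR_ORG"
CMD=".\\config.cmd --unattended --url $ORG_URL --auth pat --token YOUR_PAT --pool YOUR_POOL --agent $(hostname) --runAsService"
# Print for review rather than executing, since this sketch has no real PAT.
echo "$CMD"
```

&lt;P&gt;Running as a service ensures the agent survives reboots of the VM without an interactive login.&lt;/P&gt;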
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H2 id="user-content-self-hosted-agent-dependencies-for-pipelines%3A"&gt;Self Hosted Agent Dependencies for Pipelines:&lt;/H2&gt;
&lt;P&gt;Because the agent is installed on a brand-new, blank Windows image, additional dependencies/packages must be installed on the virtual machine for our CI/CD pipelines to run.&lt;/P&gt;
&lt;P&gt;&lt;U&gt;&lt;STRONG&gt;Example:&lt;/STRONG&gt;&lt;/U&gt;&lt;/P&gt;
&lt;P&gt;Let's consider an example scenario in which we need to install some modules, as listed below.&lt;/P&gt;
&lt;P&gt;Here is a list of packages to install and where to install them.&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Log on to the VM using Bastion again and manually install these packages
&lt;UL&gt;
&lt;LI&gt;Packages:
&lt;UL&gt;
&lt;LI&gt;sqlpackage
&lt;UL&gt;
&lt;LI&gt;Link to .msi installation file:&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;A class="" href="https://go.microsoft.com/fwlink/?linkid=2157201" target="_blank" rel="noopener noreferrer"&gt;https://go.microsoft.com/fwlink/?linkid=2157201&lt;/A&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;Follow instructions detailed here:&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;A class="" href="https://docs.microsoft.com/en-us/sql/tools/sqlpackage/sqlpackage-download?view=sql-server-ver15" target="_blank" rel="noopener noreferrer"&gt;https://docs.microsoft.com/en-us/sql/tools/sqlpackage/sqlpackage-download?view=sql-server-ver15&lt;/A&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Make sure to add&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;sqlpackage&lt;/CODE&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;to the system PATH&lt;/STRONG&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;in order for Azure DevOps to recognize this as a capability of this machine&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI&gt;AZ CLI
&lt;UL&gt;
&lt;LI&gt;Link to installer:&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;A class="" href="https://aka.ms/installazurecliwindows" target="_blank" rel="noopener noreferrer"&gt;https://aka.ms/installazurecliwindows&lt;/A&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI&gt;PowerShell Modules:
&lt;UL&gt;
&lt;LI&gt;Az.Accounts&lt;/LI&gt;
&lt;LI&gt;Az.Resources&lt;/LI&gt;
&lt;LI&gt;Az.DataFactory&lt;/LI&gt;
&lt;LI&gt;For all the modules above, open a PowerShell session as an administrator and run&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;Install-Module 'name-of-module'&lt;/CODE&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/UL&gt;
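&lt;P&gt;As an illustration, the installs above can be scripted from an elevated PowerShell session on the agent VM. The &lt;CODE&gt;sqlpackage&lt;/CODE&gt; install folder and the agent service name pattern below are assumptions; adjust them to match your VM:&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;# Install the Az modules the pipelines need (run PowerShell as administrator)
Install-Module -Name Az.Accounts, Az.Resources, Az.DataFactory -Scope AllUsers -Force

# Add sqlpackage to the machine-level PATH so Azure DevOps registers it as a capability
# (this folder is the .msi installer's usual default and may differ on your VM)
$sqlPackageDir = 'C:\Program Files\Microsoft SQL Server\150\DAC\bin'
$machinePath = [Environment]::GetEnvironmentVariable('Path', 'Machine')
[Environment]::SetEnvironmentVariable('Path', $machinePath + ';' + $sqlPackageDir, 'Machine')

# Restart the agent service afterwards so it re-scans machine capabilities
# (service name pattern is an assumption; check Get-Service on your VM)
Restart-Service vstsagent* -ErrorAction SilentlyContinue&lt;/LI-CODE&gt;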
&lt;P&gt;&lt;FONT size="5"&gt;&lt;STRONG&gt;Adding Self-Hosted Agents to Our CI-CD YAML Deployment Pipelines:&lt;/STRONG&gt;&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;The code below shows how to run your release on a specific self-hosted agent (connected to the VNET):&lt;/P&gt;
&lt;P&gt;Take note of the pool and demands configuration.&lt;/P&gt;
&lt;P&gt;In this case, we are deploying Databricks notebooks.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;stages:
- stage: Release
  displayName: Release stage

  jobs:
  - deployment: DeployDatabricks
    displayName: Deploy Databricks Notebooks
    pool:
      name: DataPool
      demands:
        - agent.name -equals vm-ado
    environment: Data-SANDBOX&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In the continuous deployment pipeline, we must deploy the built artifacts to the Dev, QA, UAT, and Prod environments. We will have approval gates set up before the deployment to each environment (stage) starts.&lt;/P&gt;</description>
      <pubDate>Fri, 11 Feb 2022 12:04:32 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/azure-data-components-network-architecture-with-secure/ba-p/3141331</guid>
      <dc:creator>Akshay_Attota</dc:creator>
      <dc:date>2022-02-11T12:04:32Z</dc:date>
    </item>
    <item>
      <title>Data bricks Notebook Deployment using YAML code</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/data-bricks-notebook-deployment-using-yaml-code/ba-p/3046952</link>
      <description>&lt;P&gt;Purpose of this Blog:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;We have defined an end-to-end workflow for code check-in using two methods
&lt;UL&gt;
&lt;LI&gt;Notebook Revision History (the standard check-in process defined for checking in notebook code)&lt;/LI&gt;
&lt;LI&gt;Azure Databricks Repos (No Repo Size Limitation from May 13th)&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;LI&gt;Changes made externally to the Databricks notebook will not automatically sync with the Databricks Workspace. Due to this limitation, it is recommended that developers sync the entire git repository as detailed in the process below.&lt;/LI&gt;
&lt;LI&gt;We have written YAML templates for CI and CD with PowerShell code that can deploy notebooks from multiple folders; the PowerShell code in the pipeline will create a folder in the destination if it doesn't exist, unlike the Data Thirst extension, which can only deploy notebooks in a single folder.&lt;/LI&gt;
&lt;LI&gt;A one-stop reference for the whole Databricks deployment workflow, from code check-in to pipelines, with detailed explanations (used by stakeholders)&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H1 id="user-content-developer-workflow-(ci%2Fcd)"&gt;Developer Workflow (CI/CD)&lt;/H1&gt;
&lt;HR /&gt;
&lt;H2 id="user-content-**git-integration**%3A"&gt;&lt;STRONG&gt;Git Integration&lt;/STRONG&gt;:&lt;/H2&gt;
&lt;OL&gt;
&lt;LI&gt;
&lt;P&gt;Create a feature branch based on the main branch and link a work item to it.&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Log in to your Azure Databricks Dev/Sandbox workspace, click on the user icon (top right), and open User Settings.&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Click on the Git Integration tab and make sure you have selected Azure DevOps Services.&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;There are two ways to check in the code from the Databricks UI (described below):&lt;BR /&gt;1. Using Revision History after opening notebooks&lt;BR /&gt;2. Working with notebooks and folders in an Azure Databricks repo (Repos, a recent development - May 13th)&lt;/P&gt;
&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H2 id="user-content-code-check-in-into-the-git-repository-from-databricks-ui"&gt;Code Check-in into the Git repository from Databricks UI&lt;/H2&gt;
&lt;HR /&gt;
&lt;H3 id="user-content-**i.-notebook-revision-history**%3A"&gt;&lt;STRONG&gt;I. Notebook Revision History&lt;/STRONG&gt;:&lt;/H3&gt;
&lt;HR /&gt;
&lt;OL&gt;
&lt;LI&gt;
&lt;P&gt;Go to the notebook you want to change and deploy to another environment.&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&lt;EM&gt;&lt;STRONG&gt;Note: Developers need to make sure to maintain a shared/common folder for all the notebooks. You can make all required changes in your personal folder and then finally move these changes to the shared/common folder. The CI process will create the artifact from this shared/common folder.&lt;/STRONG&gt;&lt;/EM&gt;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Click on the Revision history on top right.&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;If it is a new notebook, you will notice that Git is not linked to the notebook, or Git might be linked to an older branch that no longer exists.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Click on the ‘Git:Not Linked’(New Notebook) or 'Git Synced'(Already existing notebook).&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;We need to configure the Git Repository. (screenshot below)&lt;/P&gt;
&lt;img /&gt;&lt;/LI&gt;
&lt;/OL&gt;
&lt;UL&gt;
&lt;LI&gt;Select feature branch from the list of branches in the drop down&lt;/LI&gt;
&lt;LI&gt;Mention the URL of the repository in the format&lt;BR /&gt;&lt;CODE&gt;&lt;A href="https://dev.azure.com/" target="_blank" rel="noopener"&gt;https://dev.azure.com/&lt;/A&gt;&amp;lt;organisationname&amp;gt;/&amp;lt;ProjectName&amp;gt;/_git/&amp;lt;Repo&amp;gt;&lt;/CODE&gt;&lt;/LI&gt;
&lt;LI&gt;Mention the path of the notebook in the repository. In our case it is&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;src/Databricks/ITDataEngineerADBDev2/notebooks/&amp;lt;folder name&amp;gt;/&amp;lt;notebook.py&amp;gt;&lt;/CODE&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;OL start="6"&gt;
&lt;LI&gt;
&lt;P&gt;Add a comment and click Save Notebook to integrate the code into the repository.&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;We need to create a PR after the changes are reflected in the feature branch.&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;After the PR is approved, the code is merged into the main branch, and the CI-CD process starts from there.&lt;/P&gt;
&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;EM&gt;Note: Linking individual notebooks has the following limitation&lt;/EM&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Changes made externally to the Databricks notebook (outside of the Databricks workspace) will not automatically sync with the Databricks Workspace. This issue is documented here:&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;A class="" href="https://forums.databricks.com/questions/6752/external-changes-to-git-in-synced-notebook.html*" target="_blank" rel="noopener noreferrer"&gt;https://forums.databricks.com/questions/6752/external-changes-to-git-in-synced-notebook.html*&lt;/A&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;Due to this limitation, it is recommended that developers sync the entire git repository as detailed below.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3 id="user-content-**ii.-azure-databricks-repos**%3A"&gt;&lt;STRONG&gt;II. Azure Databricks Repos&lt;/STRONG&gt;:&lt;/H3&gt;
&lt;HR /&gt;
&lt;P&gt;&lt;STRONG&gt;Introduction&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Repos is a newly introduced feature in Azure Databricks which is in Public Preview.&lt;/LI&gt;
&lt;LI&gt;This feature used to have a 100 MB limitation on the size of the linked repository, but as of May 13th it works with larger repositories.&lt;/LI&gt;
&lt;LI&gt;We can directly link a repository to the Databricks workspace to work on notebooks based on git branches.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;Repos Check-in Process:&lt;/STRONG&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;
&lt;P&gt;Click on the Repos tab, right-click on the folder you want to work in, and then select "Add Repos".&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Fill in the Repo URL from Azure Devops and select the Git provider as "Azure Devops Services" and click on create.&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;The repo is added with the repo name as the folder name (Data in the screenshot below) and a branch selector listing all branch names (the branch symbol with feature/ and a down arrow in the screenshot below). Click on the down arrow beside the branch name.&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;After clicking the down arrow (previous screenshot), search for and select your existing feature branch, OR create a new feature branch (as shown in the screenshot below).&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;All the folders in the branch are visible (refer to the screenshot below).&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Open the folder which contains the notebooks (refer to the screenshot below). Create a new notebook and write code (right-click on the folder and select "create"----&amp;gt;"Notebook" as in the screenshot below), or edit an existing notebook in the folder.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;After creating a new notebook or editing an existing one, click the feature branch name at the top left of the notebook. A new window will appear showing the changes. Add a Summary (mandatory) and Description (optional), then click on "commit and push".&lt;/P&gt;
&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;OL start="8"&gt;
&lt;LI&gt;A pop-up then appears with the message that the changes are "committed and pushed"; the user should then raise a PR to merge the feature branch into the "main" branch.&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;After a successful PR merge, the CI-CD pipeline is run to deploy the notebooks to the higher environments (QA/Prod).&lt;/LI&gt;
&lt;/UL&gt;
&lt;H1 id="user-content-ci-cd-process"&gt;CI-CD Process&lt;/H1&gt;
&lt;HR /&gt;
&lt;H2 id="user-content-**continuous-integration(ci)-pipeline**%3A"&gt;&lt;STRONG&gt;Continuous Integration(CI) pipeline&lt;/STRONG&gt;:&lt;/H2&gt;
&lt;P&gt;The CI pipeline builds the artifact by copying the notebooks from the main branch to the staging directory.&lt;/P&gt;
&lt;P&gt;It has two tasks:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;
&lt;P&gt;Copy Task - Copies from main branch to staging directory.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Publish Artifacts - publishes artifacts from $(build.stagingdirectory)&lt;/P&gt;
&lt;/LI&gt;
&lt;/OL&gt;
&lt;UL&gt;
&lt;LI&gt;YAML Template&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;name: Release-$(rev:r)

trigger: none

variables:
  workingDirectory: '$(System.DefaultWorkingDirectory)/&amp;lt;path&amp;gt;'

stages:
- stage: Build
  displayName: Build stage

  jobs:
  - job: Build
    displayName: Build
    steps:
    - task: CopyFiles@2
      displayName: 'Copy Files to:  $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(workingDirectory)'
        TargetFolder: '$(build.artifactstagingdirectory)'

    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: notebooks'
      inputs:
        ArtifactName: dev_release&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H2 id="user-content-**continuous-deployment(cd)-pipeline**%3A"&gt;&lt;STRONG&gt;Continuous Deployment(CD) pipeline&lt;/STRONG&gt;:&lt;/H2&gt;
&lt;P&gt;Deployment with a secure self-hosted agent&lt;/P&gt;
&lt;P&gt;a. Run the release pipeline for the specified target environment.&lt;BR /&gt;This will download the previously generated build artifacts. It will also download secure connection strings from Azure Key Vault. Make sure your self-hosted agent is configured properly as per&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;A class="" href="https://dev.azure.com/nbadev/DTC/_wiki/wikis/DTC.wiki?wikiVersion=GBwikiMaster&amp;amp;pagePath=/Work%20Stream%209%20%252D%20Data/Data%20Components%20CI%252DCD%20Process/Self%252DHosted%20Agent" target="_blank" rel="noopener"&gt;Self-Hosted Agent&lt;/A&gt;. Then it will deploy the notebooks to your target Azure Databricks workspace.&lt;/P&gt;
&lt;P&gt;The code below shows how to run your release on a specific self-hosted agent. Take note of the&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;pool&lt;/CODE&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;and&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;CODE&gt;demands&lt;/CODE&gt;&lt;SPAN&gt;&amp;nbsp;&lt;/SPAN&gt;configuration.&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;In the continuous deployment pipeline, we have to deploy the built artifacts to the Dev, QA, UAT, and Prod environments.&lt;/LI&gt;
&lt;LI&gt;We will have approval gates set up before the deployment to each environment (stage) starts.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;stages:
- stage: Release
  displayName: Release stage

  jobs:
  - deployment: DeployDatabricks
    displayName: Deploy Databricks Notebooks
    pool:
      name: OTT-DEV-FanDataPool
      demands:
        - agent.name -equals &amp;lt;agent-name&amp;gt;
    environment: &amp;lt;env-name&amp;gt;&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We have two steps for the deployment:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;
&lt;P&gt;Getting the PAT (Personal Access Token) and the target Databricks workspace URL from the Key Vault.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;Importing the notebooks into the target Databricks workspace using the Import REST API (PowerShell task with an inline script).&lt;/P&gt;
&lt;/LI&gt;
&lt;/OL&gt;
&lt;UL&gt;
&lt;LI&gt;
&lt;P&gt;We can deploy notebooks from multiple folders using this script.&lt;/P&gt;
&lt;/LI&gt;
&lt;LI&gt;
&lt;P&gt;We create a folder (matching the repository folder, similar to the Sandbox/Dev environment) if it does not exist in the target (Dev/QA/UAT/Prod) Databricks workspace, and then import the notebooks into it.&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;
&lt;P&gt;YAML template&lt;/P&gt;
&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;name: Release-$(rev:r)

trigger: none

resources:
  pipelines:
    - pipeline: notebooks
      source: Databricks-CI
      trigger:
        branches:
          - main

variables:
  - name: azureSubscription
    value: '&amp;lt;serviceConnectionName&amp;gt;'
  - name: workingDirectory_shared
    value: '$(Pipeline.Workspace)/&amp;lt;path&amp;gt;/'

stages:
- stage: Release
  displayName: Release stage

  jobs:
  - deployment: DeployDatabricks
    displayName: Deploy Databricks Notebooks
    environment: DEV
    strategy:
      runOnce:
        deploy:
          steps:
          - task: AzureKeyVault@1
            inputs:
              azureSubscription: '$(azureSubscription)'
              KeyVaultName: $(dev_keyvault)
              SecretsFilter: 'databricks-pat,databricks-url'
              RunAsPreJob: true
          - task: AzurePowerShell@5
            inputs:
              azureSubscription: '$(azureSubscription)'
              ScriptType: 'InlineScript'
              Inline: |

                #Create secret and headers
                $Secret = "Bearer " + "$(databricks-pat)";
                $headers = @{
                    Authorization = $Secret
                }

                #Clean workspace/delete existing folders
                $folderNames = Get-ChildItem $(setVars.notebooksDirectory) -dir
                $folderNames | ForEach-Object {
                    #Create API Request to Delete Folder
                    $folderpath = "/" + $_.Name + "/";
                    $DeleteFolderBody = @{
                        path      = $folderpath
                        recursive = $true
                    }
                    $DeleteFolderBodyText = $DeleteFolderBody | ConvertTo-Json
                    $FolderDeleteAPI = "https://" + "$(databricks-url)" + "/api/2.0/workspace/delete"

                    #Delete Folder
                    try {
                        $DeleteFolder = Invoke-RestMethod -Uri $FolderDeleteAPI -Method Post -Headers $headers -Body $DeleteFolderBodyText
                    }
                    catch [System.Net.WebException] {
                        Write-Host "Folder does not exist";
                    }
                }

                #Upload existing files/folders
                $filenames = get-childitem $(setVars.notebooksDirectory) -recurse | where { $_.extension -eq ".py" };
                $filenames | ForEach-Object {
                    #API Endpoints
                    $ImportNoteBookAPI = "https://" + "$(databricks-url)" + "/api/2.0/workspace/import";
                    $FolderCheckAPI = "https://" + "$(databricks-url)" + "/api/2.0/workspace/get-status";
                    $FolderCreateAPI = "https://" + "$(databricks-url)" + "/api/2.0/workspace/mkdirs"

                    #Open and Import the Notebook to Workspace
                    $BinaryContents = [System.IO.File]::ReadAllBytes($_.FullName);
                    $EncodedContents = [System.Convert]::ToBase64String($BinaryContents);

                    $notebookDirectory = "$(setVars.notebooksDirectory)".Replace('/','\');
                    $pathIndex=$_.FullName.IndexOf($notebookDirectory);
                    $notebookpath = "/" + $_.FullName.Substring($pathIndex+$notebookDirectory.Length).Replace('\','/');
                    $folderIndex = $notebookpath.IndexOf($_.Name);
                    $folderpath = $notebookpath.Substring(0,$folderIndex);

                    #API Body for Importing Notebooks
                    $ImportNoteBookBody = @{
                        content   = "$EncodedContents"
                        language  = "PYTHON"
                        overwrite = $true
                        format    = "SOURCE"
                        path      = $notebookpath
                    }
                    $ImportNoteBookBodyText = $ImportNoteBookBody | ConvertTo-Json

                    #API Body for Creating Folder
                    $CreateFolderBody = @{
                        path = $folderpath
                    }
                    $CreateFolderBodyText = $CreateFolderBody | ConvertTo-Json
                    $CheckPath = $FolderCheckAPI + "?path=" + $folderpath;

                    #Check if the folder exists, if not create folder
                    try {
                        $CheckFolder = Invoke-RestMethod -Uri $CheckPath -Method Get -Headers $headers;
                    }
                    catch [System.Net.WebException] {
                        Write-Host "Creating Folder $folderpath";
                        Invoke-RestMethod -Uri $FolderCreateAPI -Method Post -Headers $headers -Body $CreateFolderBodyText
                    }

                    #Importing a notebook to the Folder in target DataBricks workspace
                    Write-Host "Creating Notebook $notebookpath";
                    Invoke-RestMethod -Uri $ImportNoteBookAPI -Method Post -Headers $headers -Body $ImportNoteBookBodyText
                }
              azurePowerShellVersion: 'LatestVersion'&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
      <pubDate>Wed, 07 Sep 2022 15:34:42 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/data-bricks-notebook-deployment-using-yaml-code/ba-p/3046952</guid>
      <dc:creator>Akshay_Attota</dc:creator>
      <dc:date>2022-09-07T15:34:42Z</dc:date>
    </item>
    <item>
      <title>Azure Databricks Artifacts Deployment</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/azure-databricks-artifacts-deployment/ba-p/2913522</link>
      <description>&lt;P&gt;This article is intended for deploying Jar Files, XML Files, JSON Files, wheel files and Global Init Scripts in Databricks Workspace.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;Overview:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;In a Databricks workspace, we have notebooks, clusters, and data stores. The notebooks run on Databricks clusters and use data stores if they need to refer to any custom configuration in the cluster.&lt;/LI&gt;
&lt;LI&gt;Developers need environment specific configurations, mapping files and custom functions using packaging for running the notebooks in Databricks Workspace.&lt;/LI&gt;
&lt;LI&gt;Developers need a global Init script which runs on every cluster created in your workspace. Global Init scripts are&amp;nbsp;&lt;STRONG&gt;useful when you want to enforce organization-wide library configurations or security screens&lt;/STRONG&gt;.&lt;/LI&gt;
&lt;LI&gt;This pipeline can automate the process of deploying these artifacts in Databricks workspace.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;Purpose of this Pipeline:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;The purpose of this pipeline is to pick up the Databricks artifacts from the repository, upload them to the Databricks workspace DBFS location, and upload the global init scripts using REST APIs.&lt;/LI&gt;
&lt;LI&gt;The CI pipeline builds the wheel (.whl) file using setup.py and publishes the required files (whl file, global init scripts, jar files, etc.) as a build artifact.&lt;/LI&gt;
&lt;LI&gt;The CD pipeline uploads all the artifacts to the DBFS location and also uploads the global init scripts using the REST APIs.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;Pre-Requisites:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;&lt;STRONG&gt;Developers need to make sure that all the artifacts to be uploaded to the Databricks workspace are present in the repository (main branch). The location of the artifacts in the repository should be fixed (let us consider '/artifacts' as the location). The CI process will create the build artifact from this folder location.&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;OL start="2"&gt;
&lt;LI&gt;The Databricks PAT Token and Databricks Target Workspace URL should be present in the key vault.&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;STRONG&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; &lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;Continuous Integration (CI) pipeline:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;The CI pipeline builds a wheel (.whl) file using a setup.py file and also creates a build artifact from all the files in the&amp;nbsp;artifacts/&amp;nbsp;folder, such as configuration files (.json), packages (.jar and .whl), and shell scripts (.sh).&lt;/LI&gt;
&lt;LI&gt;It has the following Tasks:&lt;/LI&gt;
&lt;/UL&gt;
&lt;OL&gt;
&lt;LI&gt;Building the Wheel file using setup.py file (Subtasks below):
&lt;UL&gt;
&lt;LI&gt;Using the latest Python version&lt;/LI&gt;
&lt;LI&gt;Updating pip&lt;/LI&gt;
&lt;LI&gt;Installing wheel package&lt;/LI&gt;
&lt;LI&gt;Building the wheel file using command "python setup.py sdist bdist_wheel"&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;This setup.py file can be replaced with any Python file that is used to create .whl files&lt;/STRONG&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;OL start="2"&gt;
&lt;LI&gt;Copying all the artifacts (jar, JSON config, whl file, shell script) to the artifact staging directory&lt;/LI&gt;
&lt;LI&gt;Publishing the Artifacts from the staging directory.&lt;/LI&gt;
&lt;LI&gt;The CD Pipeline will then be triggered after a successful run.&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;The YAML code for this pipeline, with all the steps included, is on the next page.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;CI- Pipeline YAML Code:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;name: Release-$(rev:r)

trigger: none

variables:
  workingDirectory: '$(System.DefaultWorkingDirectory)/Artifacts'
  pythonVersion: 3.7

stages:
- stage: Build
  displayName: Build stage

  jobs:
  - job: Build
    displayName: Build
    steps:
    - task: UsePythonVersion@0
      displayName: 'Use Python version'
      inputs:
        versionSpec: $(pythonVersion)
    - task: CmdLine@2
      displayName: 'Upgrade Pip'
      inputs:
        script: 'python -m pip install --upgrade pip'
    - task: CmdLine@2
      displayName: 'Install wheel'
      inputs:
        script: 'python -m pip install wheel'
    - task: CmdLine@2
      displayName: 'Build wheel'
      inputs:
        script: 'python setup.py sdist bdist_wheel'
        workingDirectory: '$(workingDirectory)'
    - task: CopyFiles@2
      displayName: 'Copy Files to:  $(build.artifactstagingdirectory)'
      inputs:
        SourceFolder: '$(workingDirectory)'
        TargetFolder: '$(build.artifactstagingdirectory)'
    - task: PublishBuildArtifacts@1
      displayName: 'Publish Artifact: DatabricksArtifacts'
      inputs:
        ArtifactName: DatabricksArtifacts

&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;Continuous Deployment (CD) pipeline:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The CD pipeline uploads all the artifacts (Jar, Json Config, Whl file) built by the CI pipeline into the Databricks File System (DBFS). The CD pipeline will also update/upload any (.sh) files from the build artifact as Global Init Scripts for the Databricks Workspace.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;It has the following Tasks:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Key Vault task to fetch the Databricks secrets (PAT token, URL)&lt;/LI&gt;
&lt;LI&gt;Upload Databricks Artifacts
&lt;UL&gt;
&lt;LI&gt;This will run a PowerShell script that uses the DBFS API from Databricks -&amp;gt;&amp;nbsp;&lt;SPAN&gt;&lt;A href="https://docs.databricks.com/dev-tools/api/latest/dbfs.html#create" target="_blank" rel="noopener"&gt;https://docs.databricks.com/dev-tools/api/latest/dbfs.html#create&lt;/A&gt;&lt;/SPAN&gt;&amp;nbsp;&lt;/LI&gt;
&lt;LI&gt;Script Name: DBFSUpload.ps1&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/OL&gt;
&lt;P class="lia-indent-padding-left-90px"&gt;&lt;STRONG&gt;Arguments:&lt;/STRONG&gt;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-120px"&gt;Databricks PAT Token to access Databricks Workspace&lt;/P&gt;
&lt;P class="lia-indent-padding-left-120px"&gt;Databricks Workspace URL&lt;/P&gt;
&lt;P class="lia-indent-padding-left-120px"&gt;Pipeline Working Directory URL where the files((Jar, Json Config, Whl file) are present&lt;/P&gt;
&lt;P class="lia-indent-padding-left-120px"&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;3. Upload Global Init Scripts&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;This will run a PowerShell script that uses the Global Init Scripts API from Databricks -&amp;gt;&amp;nbsp;&lt;SPAN&gt;&lt;A href="https://docs.databricks.com/dev-tools/api/latest/global-init-scripts.html#operation/create-script" target="_blank" rel="noopener"&gt;https://docs.databricks.com/dev-tools/api/latest/global-init-scripts.html#operation/create-script&lt;/A&gt;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI&gt;Script Name : DatabricksGlobalInitScriptUpload.ps1&lt;/LI&gt;
&lt;/UL&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;STRONG&gt;Arguments&lt;/STRONG&gt;:&lt;/P&gt;
&lt;P class="lia-indent-padding-left-90px"&gt;Databricks PAT Token to access Databricks Workspace&lt;/P&gt;
&lt;P class="lia-indent-padding-left-90px"&gt;Databricks Workspace URL&lt;/P&gt;
&lt;P class="lia-indent-padding-left-90px"&gt;Pipeline Working Directory URL where the global init scripts are present&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;The YAML code for this CD pipeline, with all the steps included, and the scripts for uploading the artifacts are provided below.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;CD-YAML code:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="yaml"&gt;name: Release-$(rev:r)

trigger: none

resources:
  pipelines:
    - pipeline: DatabricksArtifacts
      source: DatabricksArtifacts-CI
      trigger:
        branches:
          - main

variables:
  - group: Sample-Variable-Group
  - name: azureSubscription
    value: 'Sample-Azure-Service-Connection'
  - name: workingDirectory_utilities
    value: '$(Pipeline.Workspace)/DatabricksArtifacts/DatabricksArtifacts'

stages:
  - stage: Release
    displayName: Release stage

    jobs:
      - deployment: DeployDatabricksArtifacts
        displayName: Deploy Databricks Artifacts
        strategy:
          runOnce:
            deploy:
              steps:
                - checkout: self
                - task: AzureKeyVault@1
                  inputs:
                    azureSubscription: "$(azureSubscription)"
                    KeyVaultName: $(keyvault_name)
                    SecretsFilter: "databricks-pat,databricks-url"
                    RunAsPreJob: true
                - task: AzurePowerShell@5
                  displayName: Upload Databricks Artifacts
                  inputs:
                    azureSubscription: '$(azureSubscription)'
                    ScriptType: 'FilePath'
                    ScriptPath: '$(System.DefaultWorkingDirectory)/Pipelines/Scripts/DatabricksArtifactsUpload.ps1'
                    ScriptArguments: '-databricksPat $(databricks-pat) -databricksUrl $(databricks-url) -workingDirectory $(workingDirectory_utilities)'
                    azurePowerShellVersion: 'LatestVersion'
                - task: AzurePowerShell@5
                  displayName: Upload Global Init Scripts
                  inputs:
                    azureSubscription: '$(azureSubscription)'
                    ScriptType: 'FilePath'
                    ScriptPath: '$(System.DefaultWorkingDirectory)/Pipelines/Scripts/DatabricksGlobalInitScriptUpload.ps1'
                    ScriptArguments: '-databricksPat $(databricks-pat) -databricksUrl $(databricks-url) -workingDirectory $(workingDirectory_utilities)'
                    azurePowerShellVersion: 'LatestVersion'

&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;DBFSUpload.ps1&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;param(
    [String] [Parameter (Mandatory = $true)] $databricksPat,
    [String] [Parameter (Mandatory = $true)] $databricksUrl,
    [String] [Parameter (Mandatory = $true)] $workingDirectory
)

Function UploadFile {
    param (
        [String] [Parameter (Mandatory = $true)] $sourceFilePath,
        [String] [Parameter (Mandatory = $true)] $fileName,
        [String] [Parameter (Mandatory = $true)] $targetFilePath
    )

    #Grab bytes of source file
    $BinaryContents = [System.IO.File]::ReadAllBytes($sourceFilePath);
    $enc = [System.Text.Encoding]::GetEncoding("ISO-8859-1");
    $fileEnc = $enc.GetString($BinaryContents);

    #Create body of request
    $LF = "`r`n";
    $boundary = [System.Guid]::NewGuid().ToString();
    $bodyLines = (
        "--$boundary",
        "Content-Disposition: form-data; name=`"path`"$LF",
        $targetFilePath,
        "--$boundary",
        "Content-Disposition: form-data; name=`"contents`";filename=`"$fileName`"",
        "Content-Type: application/octet-stream$LF",
        $fileEnc,
        "--$boundary",
        "Content-Disposition: form-data; name=`"overwrite`"$LF",
        "true",
        "--$boundary--$LF"
    ) -join $LF;

    #Create Request
    $params = @{
        Uri         = "$databricksUrl/api/2.0/dbfs/put"
        Body        = $bodyLines
        Method      = 'Post'
        Headers     = @{
            Authorization = "Bearer $databricksPat"
        }
        ContentType = "multipart/form-data; boundary=$boundary"
    }

    Invoke-RestMethod @params;
}

Function GetTargetFilePath {
    param (
        [System.IO.FileInfo] [Parameter (Mandatory = $true)] $sourceFile
    )

    switch ($sourceFile.extension)
    {
        ".json" {return "/FileStore/config/$($sourceFile.Name)"}
        ".jar" {return "/FileStore/jar/$($sourceFile.Name)"}
        ".whl" {return "/FileStore/whl/$($sourceFile.Name)"}
    }
}

#Loop through all files and upload to dbfs
$filenames = Get-ChildItem $workingDirectory -Recurse;
$filenames | ForEach-Object {
    if ( $_.extension -eq ".json" -OR $_.extension -eq ".whl" -OR $_.extension -eq ".jar") {
        $targetFilePath = GetTargetFilePath -sourceFile $_;
        Write-Host "Uploading $($_.FullName) to dbfs at $targetFilePath.";
        UploadFile -sourceFilePath $_.FullName -fileName $_.Name -targetFilePath $targetFilePath;
    }
}

&lt;/LI-CODE&gt;
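&lt;P&gt;For readers outside PowerShell, the multipart body that DBFSUpload.ps1 assembles by hand can be sketched in Python. This is an illustrative re-implementation, not part of the pipeline; the field names (path, contents, overwrite) follow the DBFS put API linked above.&lt;/P&gt;

```python
import uuid


def build_dbfs_put_body(target_path, file_name, contents):
    """Assemble a multipart/form-data body for the DBFS /api/2.0/dbfs/put
    endpoint, mirroring what DBFSUpload.ps1 builds by hand."""
    boundary = str(uuid.uuid4())
    crlf = "\r\n"
    # Latin-1 lets arbitrary bytes round-trip through a str, the same trick
    # the PowerShell script uses with ISO-8859-1.
    file_text = contents.decode("latin-1")
    parts = [
        "--" + boundary,
        'Content-Disposition: form-data; name="path"' + crlf,
        target_path,
        "--" + boundary,
        'Content-Disposition: form-data; name="contents";filename="%s"' % file_name,
        "Content-Type: application/octet-stream" + crlf,
        file_text,
        "--" + boundary,
        'Content-Disposition: form-data; name="overwrite"' + crlf,
        "true",
        "--" + boundary + "--" + crlf,
    ]
    body = crlf.join(parts)
    content_type = "multipart/form-data; boundary=" + boundary
    return body, content_type
```

&lt;P&gt;The returned body and content type would then be posted with a Bearer token, exactly as the &lt;STRONG&gt;Invoke-RestMethod&lt;/STRONG&gt; call in the script does.&lt;/P&gt;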
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;DatabricksGlobalInitScriptUpload.ps1&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="powershell"&gt;param(
    [String] [Parameter (Mandatory = $true)] $databricksPat,
    [String] [Parameter (Mandatory = $true)] $databricksUrl,
    [String] [Parameter (Mandatory = $true)] $workingDirectory
)

Function UploadFile {
    param (
        [String] [Parameter (Mandatory = $true)] $uri,
        [String] [Parameter (Mandatory = $true)] $restMethod,
        [String] [Parameter (Mandatory = $true)] $sourceFilePath,
        [String] [Parameter (Mandatory = $true)] $fileName
    )

    #Grab bytes of source file
    $base64string = [Convert]::ToBase64String([IO.File]::ReadAllBytes($sourceFilePath))

    #Create body of request
    $body = @{
        name     = $fileName
        script   = $base64string
        position = 1
        enabled  = "false"
    }

    #Create Request
    $params = @{
        Uri         = $uri
        Body        = $body | ConvertTo-Json
        Method      = $restMethod
        Headers     = @{
            Authorization = "Bearer $databricksPat"
        }
        ContentType = "application/json"
    }

    Invoke-RestMethod @params;
}

Function GetAllScripts {
    #Create Request
    $params = @{
        Uri         = "$databricksUrl/api/2.0/global-init-scripts"
        Method      = "GET"
        Headers     = @{
            Authorization = "Bearer $databricksPat"
        }
        ContentType = "application/json"
    }

    return Invoke-RestMethod @params;
}

#Loop through all files and upload to databricks global init
$scripts = GetAllScripts
$filenames = Get-ChildItem $workingDirectory -Recurse;
$filenames | ForEach-Object {
    if ( $_.extension -eq ".sh") {
        #Check if a script with the same name already exists in Databricks (exact name match, not a regex match)
        $initScriptName = $_.Name
        $scriptId = ($scripts.scripts | Where-Object { $_.name -eq $initScriptName }).script_id
        if (!$scriptId){
            #Create Global init script
            Write-Host "Uploading $($_.FullName) as a global init script with name $($_.Name) to databricks";
            UploadFile -uri "$databricksUrl/api/2.0/global-init-scripts" -restMethod "POST" -sourceFilePath $_.FullName -fileName $_.Name;
        } else{
            #Update Global init script
            Write-Host "Updating global init script with name $($_.Name) to databricks";
            UploadFile -uri "$databricksUrl/api/2.0/global-init-scripts/$scriptId" -restMethod "PATCH" -sourceFilePath $_.FullName -fileName $_.Name;
        }
    }
}

&lt;/LI-CODE&gt;
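&lt;P&gt;The create-or-update decision that the script makes can be summarized in a short Python sketch. This is illustrative only; the request paths follow the Global Init Scripts API linked above, and &lt;STRONG&gt;existing_scripts&lt;/STRONG&gt; stands in for the scripts array returned by the GET call.&lt;/P&gt;

```python
import base64
import json


def build_init_script_request(base_url, file_name, contents, existing_scripts):
    """Decide create-vs-update for a Databricks global init script, mirroring
    DatabricksGlobalInitScriptUpload.ps1: POST creates a new script, PATCH
    updates an existing one matched by exact name."""
    body = json.dumps({
        "name": file_name,
        # The API expects the script contents base64-encoded.
        "script": base64.b64encode(contents).decode("ascii"),
        "position": 1,
        "enabled": False,
    })
    match = next((s for s in existing_scripts if s["name"] == file_name), None)
    if match is None:
        return "POST", f"{base_url}/api/2.0/global-init-scripts", body
    return "PATCH", f"{base_url}/api/2.0/global-init-scripts/{match['script_id']}", body
```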
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H1&gt;End Result of Successful Pipeline Runs:&lt;/H1&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;H1&gt;Global Init Script Upload:&lt;/H1&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;H1&gt;Conclusion:&lt;/H1&gt;
&lt;P&gt;Using this CI/CD approach, we were able to successfully upload the artifacts to the Databricks file system.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;U&gt;References:&lt;/U&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;&amp;nbsp;&lt;SPAN&gt;&lt;A href="https://docs.databricks.com/dev-tools/api/latest/dbfs.html#create" target="_blank" rel="noopener"&gt;https://docs.databricks.com/dev-tools/api/latest/dbfs.html#create&lt;/A&gt;&lt;/SPAN&gt;&amp;nbsp;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN&gt;&lt;A href="https://docs.databricks.com/dev-tools/api/latest/global-init-scripts.html#operation/create-script" target="_blank" rel="noopener"&gt;https://docs.databricks.com/dev-tools/api/latest/global-init-scripts.html#operation/create-script&lt;/A&gt;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Wed, 02 Feb 2022 15:26:07 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/azure-databricks-artifacts-deployment/ba-p/2913522</guid>
      <dc:creator>poornamishra</dc:creator>
      <dc:date>2022-02-02T15:26:07Z</dc:date>
    </item>
    <item>
      <title>Get a 360-degree view of data sharing lineage with Azure Data Share and Purview</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/get-a-360-degree-view-of-data-sharing-lineage-with-azure-data/ba-p/2228501</link>
      <description>&lt;P&gt;As companies embark on the journey to be data-driven, they need access to data both within and outside of their organization. It is critical to be able to share data easily and securely with external business partners and internally between different departments. Azure Data Share enables easy and secure sharing of files, folders, and databases with just a few clicks. Through the integration with Azure Purview, you can now have a 360-degree view of data sharing including what data is shared with or received from other organizations, which helps you perform impact and root cause analysis on your datasets.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Scenario 1: 360-degree view of data sharing&lt;/H3&gt;
&lt;P&gt;Whether you are sharing data with business partners or within your organization, you can see bi-directional sharing relationships in the Purview asset lineage graph. You can discover all the datasets that are shared to or received from a specific organization.&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Scenario 2: Impact analysis on data shared with another organization&lt;/H3&gt;
&lt;P&gt;When you make a change to a dataset, such as changing the file format or the schema of a database table, it is important to know who is using that data and will be impacted by the change. Lineage lets you easily understand the impact on the downstream internal or external users with whom you have shared the data.&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Scenario 3: Root cause analysis for data received from other organizations&lt;/H3&gt;
&lt;P&gt;When troubleshooting a data issue, it is important to find out the source of the data. Using lineage, you can identify where the data originally came from, and whether it is from within or outside your organization.&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Get started&lt;/H3&gt;
&lt;P&gt;It just takes a few simple steps to get data sharing lineage in Purview.&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Connect your Azure Data Share account to a Purview account&lt;/LI&gt;
&lt;LI&gt;Trigger scheduled or on-demand snapshots in Azure Data Share&lt;/LI&gt;
&lt;LI&gt;Browse and select an asset in Purview&lt;/LI&gt;
&lt;LI&gt;View the asset lineage in Purview&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;This quick demo shows the steps:&lt;/P&gt;
&lt;P&gt;&lt;div data-video-id="https://youtu.be/rLhXiXD-gYY?vq=hd1080" data-video-remote-vid="https://youtu.be/rLhXiXD-gYY?vq=hd1080" class="lia-video-container lia-media-is-center lia-media-size-large"&gt;&lt;iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2FrLhXiXD-gYY%3Ffeature%3Doembed&amp;amp;display_name=YouTube&amp;amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3DrLhXiXD-gYY&amp;amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2FrLhXiXD-gYY%2Fhqdefault.jpg&amp;amp;type=text%2Fhtml&amp;amp;schema=youtube" allowfullscreen="" style="max-width: 100%"&gt;&lt;/iframe&gt;&lt;/div&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Resources&lt;/H3&gt;
&lt;P&gt;&lt;A href="https://docs.microsoft.com/en-us/azure/purview/how-to-link-azure-data-share" target="_blank" rel="noopener"&gt;Connect to Azure Data Share - Azure Purview | Microsoft Docs&lt;/A&gt;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://docs.microsoft.com/en-us/azure/data-share/" target="_blank" rel="noopener"&gt;Azure Data Share | Microsoft Docs&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 24 Mar 2021 17:22:02 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/get-a-360-degree-view-of-data-sharing-lineage-with-azure-data/ba-p/2228501</guid>
      <dc:creator>jiefeng</dc:creator>
      <dc:date>2021-03-24T17:22:02Z</dc:date>
    </item>
    <item>
      <title>Share IoT and Log data in real time using Azure Data Share and Azure Data Explorer</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/share-iot-and-log-data-in-real-time-using-azure-data-share-and/ba-p/1141419</link>
      <description>&lt;P&gt;In today's digital world, more and more data is generated from devices and software deployed at home and work. This could be data collected from applications, websites, IoT devices and equipment. Often, this data needs to be shared and enriched with 3rd party data in real-time to derive insights. Traditionally, data is shared by generating a data feed made available via file share, FTP, etc. This process requires organizations to invest in maintaining the data pipelines and introduces latency into the system.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Azure Data Explorer is a fully managed data analytics service for real-time analysis on large volumes of data streamed from applications and devices. You can use Azure Data Explorer to collect, store, and analyze diverse data on the fly to quickly identify patterns, anomalies, and trends. Azure Data Share enables easy and secure data sharing within or across organizations. You can share data code-free while maintaining full visibility into your data sharing relationships, including what data is shared and who it is shared with.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;With the integration between Azure Data Explorer and Azure Data Share, data can be easily and securely shared with partners, service providers or customers for near-real-time collaboration.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Here is an example of how organizations with manufacturing plants are using Azure Data Explorer and Azure Data Share for near-real-time analytics. Data is streamed from IoT Hub into Azure Data Explorer cluster and stored in databases. It is then shared through Azure Data Share to the service provider’s Azure Data Explorer cluster for analysis and dashboarding.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;H3&gt;&lt;BR /&gt;&lt;STRONG&gt;How data sharing works&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P&gt;Azure Data Share enables sharing of databases in-place from Azure Data Explorer clusters. The data provider creates a share by specifying the databases to share, who to share with, and the terms of use. The same data can be shared with multiple data consumers. The Azure Data Share service sends an email invitation to each data consumer, who can then accept the invitation and specify a target Azure Data Explorer cluster in the same Azure data center to access the data.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;BR /&gt;After the sharing relationship is established, Azure Data Share creates a symbolic link between the provider's and the consumer's Azure Data Explorer clusters. This enables the data consumer to read and query the data, but not write to the shared databases. Access to the data uses compute resources from the consumer's Azure Data Explorer cluster. The data consumer can also configure its own access and caching policies for the shared databases, as well as view logs of its own queries. When a caching policy is configured, data is cached on the consumer's Azure Data Explorer cluster for improved query performance, but it is not copied into the blob storage powering the consumer's Azure Data Explorer cluster.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;The data provider can stop sharing by revoking access. This removes the symbolic link and terminates access to the shared database(s).&lt;/P&gt;
&lt;H3&gt;&lt;BR /&gt;&lt;STRONG&gt;How the data provider shares data&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P&gt;To share data, the user needs write permission on, and permission to add role assignments to, the source Azure Data Explorer cluster. These permissions are typically included in the Owner role.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The following are the steps to share data:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Create an Azure Data Share resource&lt;/LI&gt;
&lt;LI&gt;Within the Azure Data Share resource, create a share by specifying the following information:
&lt;UL&gt;
&lt;LI&gt;Name&lt;/LI&gt;
&lt;LI&gt;Share type: In-place&lt;/LI&gt;
&lt;LI&gt;Description (optional)&lt;/LI&gt;
&lt;LI&gt;Terms of use (optional)&lt;/LI&gt;
&lt;LI&gt;Azure Data Explorer cluster or databases - When a cluster is shared, all databases on the cluster, including future databases created by the data provider, are made available to the data consumer. When a database is shared, only that individual database is available to the data consumer.&lt;/LI&gt;
&lt;LI&gt;Recipients’ Azure login email address&lt;/LI&gt;
&lt;/UL&gt;
&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;IFRAME src="https://www.youtube-nocookie.com/embed/QmsTnr90_5o?rel=0&amp;amp;vq=hd1080" width="1056px" height="594px" frameborder="0" allowfullscreen="allowfullscreen" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture"&gt;&lt;/IFRAME&gt;&lt;/P&gt;
&lt;H3&gt;&lt;BR /&gt;&lt;STRONG&gt;How the data consumer receives data&lt;/STRONG&gt;&lt;/H3&gt;
&lt;P&gt;To receive data, the user needs write permission on, and permission to add role assignments to, the target Azure Data Explorer cluster. These permissions are typically included in the Owner role.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The following are the steps to receive data:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;Receive email invitation and follow the link to log into Azure&lt;/LI&gt;
&lt;LI&gt;Agree to the terms of use and accept the invitation into your own Azure Data Share resource&lt;/LI&gt;
&lt;LI&gt;Specify an Azure Data Explorer cluster in the same Azure data center as the source to access data&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;IFRAME src="https://www.youtube-nocookie.com/embed/vBq6iFaCpdA?rel=0&amp;amp;vq=hd1080" width="1056px" height="594px" frameborder="0" allowfullscreen="allowfullscreen" allow="accelerometer; autoplay; encrypted-media; gyroscope; picture-in-picture"&gt;&lt;/IFRAME&gt;&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;The data consumer can now go to its Azure Data Explorer cluster to grant user permissions to the shared databases and access the data. Data ingested using batch mode into the source Azure Data Explorer cluster will show up on the target cluster within a few seconds to a few minutes.&lt;/P&gt;
&lt;H3&gt;&lt;BR /&gt;&lt;STRONG&gt;Additional resources&lt;/STRONG&gt;&lt;/H3&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;A title="Azure Data Share Documentation" href="https://docs.microsoft.com/en-us/azure/data-share/" target="_blank" rel="noopener"&gt;Azure Data Share documentation&lt;/A&gt;&amp;nbsp;&lt;/LI&gt;
&lt;LI&gt;&lt;A title="Azure Data Share videos" href="https://channel9.msdn.com/Tags/azure-data-share" target="_blank" rel="noopener"&gt;Azure Data Share videos&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;A title="Azure Data Share product web page" href="https://azure.microsoft.com/en-us/services/data-share/" target="_blank" rel="noopener"&gt;Azure Data Share product web page&lt;/A&gt;&amp;nbsp;&lt;/LI&gt;
&lt;LI&gt;&lt;A title="Azure Data Explorer Documentation" href="https://docs.microsoft.com/en-us/azure/data-explorer/" target="_blank" rel="noopener"&gt;Azure Data Explorer documentation&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;A title="Azure Data Explorer product web page" href="https://azure.microsoft.com/en-us/services/data-explorer/" target="_blank" rel="noopener"&gt;Azure Data Explorer product web page&lt;/A&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 02 Mar 2020 21:32:35 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/share-iot-and-log-data-in-real-time-using-azure-data-share-and/ba-p/1141419</guid>
      <dc:creator>jiefeng</dc:creator>
      <dc:date>2020-03-02T21:32:35Z</dc:date>
    </item>
    <item>
      <title>Scale your data sharing needs with the power of Azure Data Share’s .NET SDK</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/scale-your-data-sharing-needs-with-the-power-of-azure-data-share/ba-p/1061191</link>
      <description>&lt;P&gt;Azure Data Share, now a generally available service, makes it very simple to share your organization’s data securely with your partners. You may have already seen and tried it on the Azure Portal to share data swiftly, without writing any code. But what if you want to scale your sharing needs to thousands of customers spread across the world? Data Share offers a rich API/SDK for you to leverage and scale your sharing relationships seamlessly. Let’s jump right in and walk through a sample use case using the .NET SDK.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;FONT size="5"&gt;Why use the .NET SDK?&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Imagine the following situation: your organization provides data to some partners and consumes data from others, where these partners can be departments within your company or external organizations. Using &lt;A href="https://docs.microsoft.com/en-us/azure/data-share/share-your-data" target="_self"&gt;Data Share's portal experience&lt;/A&gt; for creating and managing the first few sharing relationships is quick and intuitive. However, as the number of sharing relationships grows, the process soon becomes tedious: managing potentially hundreds or even thousands of sharing relationships by hand simply does not scale. We've designed the Data Share SDK for ease of use, to facilitate the scaling of your organization's sharing needs.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H1&gt;&lt;FONT size="5"&gt;Scenario&lt;BR /&gt;&lt;BR /&gt;&lt;/FONT&gt;&lt;/H1&gt;
&lt;P&gt;Since we would like to demonstrate both the data provider’s and consumer’s perspectives, let’s try a common customer scenario of sharing between departments of the same organization. Specifically, suppose the provider department (say Marketing) has its data in a blob store and wants to share that with a different department (say Sales). To make this sharing more interesting, we’ll try to share the data to a different tenant. Let's see how the Data Share SDK can be used for this.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H2&gt;&lt;FONT size="5"&gt;Setting up the Console Application&lt;BR /&gt;&lt;BR /&gt;&lt;/FONT&gt;&lt;/H2&gt;
&lt;H3&gt;Getting a copy of the sample code&lt;BR /&gt;&lt;BR /&gt;&lt;/H3&gt;
&lt;P&gt;Start by cloning the sample &lt;A href="https://github.com/Azure-Samples/azure-data-share-dotnet-api-sample.git" target="_self"&gt;Git repository&lt;/A&gt; by typing at the command prompt:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="markup"&gt;git clone &lt;A href="https://github.com/Azure-Samples/azure-data-share-dotnet-api-sample.git" target="_blank"&gt;https://github.com/Azure-Samples/azure-data-share-dotnet-api-sample.git&lt;/A&gt;&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;&lt;BR /&gt;Creating a Service Principal for the Console Application&lt;BR /&gt;&lt;BR /&gt;&lt;/H3&gt;
&lt;P&gt;We'll use an Azure Active Directory (AAD) application ID and secret to authenticate the console application. For this, an AAD application must be created for both the provider and the consumer. Follow this &lt;A href="https://docs.microsoft.com/en-us/azure/active-directory/develop/howto-create-service-principal-portal" target="_self"&gt;tutorial to set up the AAD application&lt;/A&gt;.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Creating Storage Accounts&lt;BR /&gt;&lt;BR /&gt;&lt;/H3&gt;
&lt;P&gt;The console application will share data from the provider data share account to the consumer data share account, each of which points to an underlying data store. For this demo you will need to create one storage account each for the provider and the consumer. Please follow this &lt;A href="https://docs.microsoft.com/en-us/azure/storage/common/storage-quickstart-create-account?tabs=azure-portal" target="_self"&gt;tutorial to create the storage accounts&lt;/A&gt;. Also ensure that the Service Principal created in the previous section has the &lt;EM&gt;"Owner"&lt;/EM&gt; role on the storage accounts. To learn how to add role assignments to resources, please follow this &lt;A href="https://docs.microsoft.com/en-us/azure/role-based-access-control/role-assignments-portal" target="_self"&gt;tutorial&lt;/A&gt;.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Configuring the run-time settings&lt;BR /&gt;&lt;BR /&gt;&lt;/H3&gt;
&lt;P&gt;Once the repository has been cloned, navigate to the file &lt;STRONG&gt;DataShareSample.sln&lt;/STRONG&gt; and open it. By default, it should open in Visual Studio 2017. Build the solution and make sure everything compiles correctly.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Now we will go ahead and configure the run-time settings in the &lt;EM&gt;appSettings.json&lt;/EM&gt; file (shown in Snippet 1):&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Snippet 1: appSettings.json&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="python"&gt;{
    "configs": {
        "provider": {
            "tenantId": "",
            "clientId": "",
            "objectId": "",
            "secret": "",
            "subscriptionId": "",

            "dataShareResourceGroup": "",
            "dataShareAccountName": "",
            "dataShareShareName": "",
            "dataShareInvitation": "",
            "dataShareDataSetName": "",
            "dataShareDataSetMappingName": "",

            "storageResourceGroup": "",
            "storageAccountName": "",
            "storageContainerName": "",
            "storageBlobName": ""
          },
        "consumer": {
            "tenantId": "",
            "clientId": "",
            "objectId": "",
            "secret": "",
            "subscriptionId": "",

            "dataShareResourceGroup": "",
            "dataShareAccountName": "",
            "dataShareShareSubscriptionName": "",
            "dataShareInvitation": "",
            "dataShareDataSetName": "",
            "dataShareDataSetMappingName": "",

            "storageResourceGroup": "",
            "storageAccountName": "",
            "storageContainerName": "",
            "storageBlobName": ""
          }
      }
  }&lt;/LI-CODE&gt;
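&lt;P&gt;Every field in this file must be filled in before the sample will run, so it can save a debugging round-trip to check for empty values up front. A minimal sketch of such a check (in Python for brevity; the hypothetical &lt;STRONG&gt;missing_settings&lt;/STRONG&gt; helper is not part of the sample):&lt;/P&gt;

```python
def missing_settings(config):
    """Return (section, key) pairs for every empty value under 'configs',
    i.e. fields of appSettings.json that still need to be filled in."""
    missing = []
    for section, settings in config.get("configs", {}).items():
        for key, value in settings.items():
            if value == "":
                missing.append((section, key))
    return missing
```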
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;That's it! Now that you have everything configured, let’s run the code by debugging through the important lines.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H2&gt;Code walk-through and execution&lt;BR /&gt;&lt;BR /&gt;&lt;/H2&gt;
&lt;P&gt;&lt;EM&gt;Program.cs&lt;/EM&gt; looks similar to the code given below in Snippet 2. Let’s have a look within the Main method. First the configurations are read from the &lt;EM&gt;appSettings.json&lt;/EM&gt; that you have just filled. Following this, a Resource Group is created (for a logical grouping of the data share resources that we are about to create). Once the resource group is in order, the Data Share Account creation code is invoked, followed immediately by the Share creation.&amp;nbsp; An important step to enable the data sharing is to assign the Data Share Account the Blob Reader role on the underlying provider storage account. Finally on the provider side, Data Sets are created and an invitation is sent to the consumer.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;&lt;EM&gt;&lt;STRONG&gt;Note: the AAD application should have permission to create resources in the subscription configured and the Microsoft.DataShare resource provider should be registered in the subscriptions configured in appSettings.json.&lt;BR /&gt;&lt;BR /&gt;&lt;/STRONG&gt;&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;On the consumer side, a similar flow is followed. The consumer Data Share Account is created, and the invitation is accepted by creating a Share Subscription. The account's identity is then assigned the Blob Writer role on the underlying consumer data store. A Data Set Mapping is created to link the Data Set received on the consumer side to the consumer data store. Finally, a data synchronization is initiated and the result reported.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;Go ahead and execute the code or debug through it line by line to gain a better understanding. You should be able to track the resource creation on Azure Portal while the code executes. Further, the blob from the provider blob store would appear on the consumer blob store at the end of a successful synchronize call.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Snippet 2: Program.cs&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;LI-CODE lang="csharp"&gt;// -----------------------------------------------------------------------
//  &amp;lt;copyright file="Program.cs" company="Microsoft Corporation"&amp;gt;
//      Copyright (C) Microsoft Corporation. All rights reserved.
//  &amp;lt;/copyright&amp;gt;
// -----------------------------------------------------------------------

namespace DataShareSample
{
    using System;
    using System.IO;
    using System.Threading.Tasks;
    using Microsoft.Azure.Management.DataShare.Models;
    using Microsoft.Azure.Management.ResourceManager.Fluent;
    using Microsoft.Extensions.Configuration;

    public class Program
    {
        public static async Task Main(string[] args)
        {
            Console.WriteLine("\r\n\r\nReading the configurations...");
            IConfigurationRoot configurationRoot = new ConfigurationBuilder()
                .SetBasePath(Directory.GetCurrentDirectory()).AddJsonFile("AppSettings.json").Build();
            var configuration = configurationRoot.GetSection("configs").Get&amp;lt;Configuration&amp;gt;();

            Console.WriteLine("\r\n\r\nIdempotent creates for provider resources...");
            var providerContext = new UserContext(configuration.Provider);
            IResourceGroup providerResourceGroup = providerContext.IdempotentCreateResourceGroup();
            Account providerAccount = providerContext.IdempotentCreateAccount();
            Share share = providerContext.IdempotentCreateShare();

            Console.WriteLine($"\r\n\r\nAssign MSI of {providerAccount.Id} as the Blob Reader on the Provider Storage...");
            await providerContext.AssignRoleTaskAsync(
                configuration.Provider,
                providerAccount.Identity.PrincipalId,
                "2a2b9908-6ea1-4ae2-8e65-a410df84e7d1"); // Storage Blob Data Reader role definition id

            Console.WriteLine("\r\n\r\nCreate data set and send invitation");
            DataSet dataSet = providerContext.CreateIfNotExistDataSet(configuration.Provider);

            Invitation invitation = providerContext.CreateIfNotExistInvitation(configuration.Consumer);

            Console.WriteLine("\r\n\r\nIdempotent creates for consumer");
            var consumerContext = new UserContext(configuration.Consumer);
            IResourceGroup consumerResourceGroup = consumerContext.IdempotentCreateResourceGroup();
            Account consumerAccount = consumerContext.IdempotentCreateAccount();

            Console.WriteLine("\r\n\r\nTo accept the invitation create a share subscription/received share...");
            ShareSubscription shareSubscription = consumerContext.CreateIfNotExistShareSubscription(invitation);

            Console.WriteLine($"\r\n\r\nAssign MSI of {consumerAccount.Id} as the Blob Contributor on the consumer Storage...");
            await consumerContext.AssignRoleTaskAsync(
                configuration.Consumer,
                consumerAccount.Identity.PrincipalId,
                "ba92f5b4-2d11-453d-a403-e96b0029c9fe"); // Storage Blob Data Contributor role definition id

            Console.WriteLine("\r\n\r\nCreate data set mapping to setup storage for the consumer");
            ConsumerSourceDataSet consumerSourceDataSet = consumerContext.GetConsumerSourceDataSet();
            DataSetMapping dataSetMapping = consumerContext.CreateDataSetMapping(
                configuration.Consumer,
                consumerSourceDataSet);

            Console.WriteLine("\r\n\r\nInitiate a snapshot copy (duration depends on how large the data is)...");
            ShareSubscriptionSynchronization response = consumerContext.Synchronize();
            Console.WriteLine(
                $"Synchronization Status: {response.Status}. Check resource {consumerAccount.Id} on https://portal.azure.com for further details. \r\n\r\n Hit Enter to continue...");

            Console.ReadLine();
        }
    }
}&lt;/LI-CODE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The overall program flow can be summarized by Figure 1.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P style="text-align: center;"&gt;Figure 1: Sharing Model&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H1&gt;&lt;FONT size="5"&gt;Additional Capabilities: Scheduled Snapshots&lt;BR /&gt;&lt;BR /&gt;&lt;/FONT&gt;&lt;/H1&gt;
&lt;P&gt;In addition to triggering an on-demand synchronization as above, Data Share also provides native automation. You may write your own wrapper that schedules on-demand runs, or use the built-in scheduled synchronization feature. To enable it, the provider specifies a snapshot schedule at share-creation time, with a daily or hourly frequency and a schedule start time; the consumer then accepts the schedule by creating a Trigger on their side and receives automated snapshots per the schedule. The consumer, of course, always has the option to disable or re-enable the schedule. This is especially useful for daily or hourly reports and for non-real-time incremental updates. When the schedule is enabled, snapshots taken after the first one are incremental. You can find the API documentation at &lt;A href="https://docs.microsoft.com/bs-latn-ba/rest/api/datashare/synchronizationsettings/create?view=azurermps-6.8.1" target="_self"&gt;Synchronization Settings&lt;/A&gt; and &lt;A href="https://docs.microsoft.com/bs-latn-ba/rest/api/datashare/triggers/create?view=azurermps-6.8.1" target="_self"&gt;Triggers&lt;/A&gt;.&lt;/P&gt;
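&lt;P&gt;For illustration, the request body for the Synchronization Settings create call referenced above is a small JSON document. A sketch for a daily schedule might look like the following; the start time here is only a placeholder, and recurrenceInterval accepts "Hour" or "Day":&lt;/P&gt;
&lt;LI-CODE lang="json"&gt;{
  "kind": "ScheduleBased",
  "properties": {
    "recurrenceInterval": "Day",
    "synchronizationTime": "2019-12-12T08:00:00Z"
  }
}&lt;/LI-CODE&gt;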
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;FONT size="5"&gt;Conclusion&lt;BR /&gt;&lt;BR /&gt;&lt;/FONT&gt;&lt;/P&gt;
&lt;P&gt;With Azure Data Share’s easy-to-use .NET SDK, you can now take control of sharing big data across organizations and geographies, and create and manage all your sharing relationships at scale through a single pane of glass. For further reading, please refer to our &lt;A href="https://docs.microsoft.com/en-us/azure/data-share/" target="_self"&gt;public documentation&lt;/A&gt;.&lt;/P&gt;
      <pubDate>Thu, 12 Dec 2019 07:55:59 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/scale-your-data-sharing-needs-with-the-power-of-azure-data-share/ba-p/1061191</guid>
      <dc:creator>Akshat_Dave</dc:creator>
      <dc:date>2019-12-12T07:55:59Z</dc:date>
    </item>
    <item>
      <title>Azure Data Share is now in GA!</title>
      <link>https://techcommunity.microsoft.com/t5/azure-data-blog/azure-data-share-is-now-in-ga/ba-p/947046</link>
      <description>&lt;P&gt;Todays most successful organizations have one thing in common: they are tirelessly data driven at their core.&lt;SPAN style="display: inline !important; float: none; background-color: #ffffff; color: #333333; cursor: text; font-family: inherit; font-size: 16px; font-style: normal; font-variant: normal; font-weight: 300; letter-spacing: normal; line-height: 1.7142; orphans: 2; text-align: left; text-decoration: none; text-indent: 0px; text-transform: none; -webkit-text-stroke-width: 0px; white-space: normal; word-spacing: 0px;"&gt; The ability to easily and securely share and consume data from customers and partners is critical for organizations looking to unlock transformational insights and make critical business decisions.&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The Azure Data Share product team is pleased to announce the general availability of snapshot-based sharing for Azure Data Lake Store and Azure Blob Storage. In just a few clicks, organizations can share data stored in their data lakes with third party organizations outside their Azure tenancy. Data providers wanting to share data with their customers/partners can easily create a new share, populate it with data residing in a variety of stores and add recipients.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P style="margin: 0in; margin-bottom: .0001pt;"&gt;&lt;SPAN style="font-size: 12.0pt; font-family: &amp;amp;quot; color: #333333;"&gt;Data providers always stay in control of their data sharing relationships. They are able to track who they've shared with and whether the data consumers have started consuming their data. If they no longer want their data consumers to receive new data, they can revoke their access to future updates. Azure Data Share offers a hassle&lt;/SPAN&gt;&lt;SPAN style="font-size: 12.0pt; font-family: &amp;amp;quot;"&gt;-&lt;SPAN style="color: #333333;"&gt;free way to manage data &lt;/SPAN&gt;– organizations &lt;SPAN style="color: #333333;"&gt;pay &lt;/SPAN&gt;only for &lt;SPAN style="color: #333333;"&gt;what &lt;/SPAN&gt;they&lt;SPAN style="color: #333333;"&gt; share, &lt;/SPAN&gt;with &lt;SPAN style="color: #333333;"&gt;no code to write and no infrastructure to manage.&amp;nbsp;&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P style="margin: 0in; margin-bottom: .0001pt; box-sizing: border-box;"&gt;&lt;SPAN style="font-size: 12.0pt; font-family: &amp;amp;quot; color: #333333;"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Azure Data Share uses Managed Identities for Azure Services to authenticate to the underlying data store, ensuring that no exchange of credentials occurs between data providers and data consumers.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We're also excited to announce that snapshot-based sharing from Azure SQL-based sources, including Azure SQL Database and Azure SQL Data Warehouse, is now in public preview. Data providers that want to share data from SQL-based sources no longer need to be concerned about how their data consumers will receive the data. The ability to share from one source and receive in another is a capability many customers have been eagerly awaiting, and one that offers customers flexibility of choice. Providers can share tables and views from an Azure SQL Data Warehouse, and their data consumers can receive them in Parquet or CSV directly in their data lake. Alternatively, data consumers can choose to receive SQL-based data into a SQL-based source, another option which provides customers with data fluidity previously unseen in the market.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Customers that have been seeking out the ability to perform in-place sharing - sharing data without data movement - will be delighted to learn that we are now in limited public preview with in-place sharing from Azure Data Explorer. In-place sharing works by establishing a symbolic link to where the data resides, enabling customers to run their compute workloads against data in real-time. This feature is not to be missed - click &lt;A title="in-place sharing for Azure Data Explorer" href="https://aka.ms/azuredatasharepreviewsignup" target="_blank" rel="noopener"&gt;here&lt;/A&gt; to join! &amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;img /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Secure, code free, pay for what you share, no infra to manage - it's as simple as that. Start sharing today!&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Mon, 04 Nov 2019 13:00:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-data-blog/azure-data-share-is-now-in-ga/ba-p/947046</guid>
      <dc:creator>joannapod</dc:creator>
      <dc:date>2019-11-04T13:00:00Z</dc:date>
    </item>
  </channel>
</rss>

