<?xml version="1.0" encoding="UTF-8"?>
<rss xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:taxo="http://purl.org/rss/1.0/modules/taxonomy/" version="2.0">
  <channel>
    <title>Azure Databricks articles</title>
    <link>https://techcommunity.microsoft.com/t5/azure-databricks/bg-p/azure-databricks</link>
    <description>Azure Databricks articles</description>
    <pubDate>Wed, 29 Apr 2026 20:54:15 GMT</pubDate>
    <dc:creator>azure-databricks</dc:creator>
    <dc:date>2026-04-29T20:54:15Z</dc:date>
    <item>
      <title>Introducing Lakeflow Connect Free Tier, now available on Azure Databricks</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/introducing-lakeflow-connect-free-tier-now-available-on-azure/ba-p/4502755</link>
      <description>&lt;P&gt;We're excited to introduce the Lakeflow Connect Free Tier on Azure Databricks, so you can easily bring your enterprise data into your lakehouse to build analytics and AI applications faster.&lt;/P&gt;
&lt;P&gt;Modern applications require reliable access to operational data, especially for training analytics and AI agents, but connecting and gathering data across silos can be challenging. With this new release, you can seamlessly ingest all of your enterprise data from SaaS and database sources to unlock data intelligence for your AI agents.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H2&gt;Ingest millions of records per day, per workspace for free&lt;/H2&gt;
&lt;P&gt;This new Lakeflow Connect Free Tier provides 100 DBUs per day, per workspace, which allows you to ingest approximately 100 million records* from many popular data sources**, including SaaS applications and databases.&amp;nbsp;&lt;/P&gt;
&lt;H2&gt;Unlock your enterprise data for free with Lakeflow Connect&lt;/H2&gt;
&lt;P&gt;This new offering provides all the benefits of &lt;A href="https://www.databricks.com/product/data-engineering/lakeflow-connect" target="_blank" rel="noopener"&gt;Lakeflow Connect&lt;/A&gt;, eliminating the heavy lifting so your teams can focus on unlocking data insights instead of managing infrastructure.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In the past year, Databricks has continued rolling out several fully managed connectors, supporting popular data sources. The free tier supports popular SaaS applications (Salesforce, ServiceNow, Google Analytics, Workday, Microsoft Dynamics 365) and widely used databases (SQL Server, Oracle, Teradata, PostgreSQL, MySQL, Snowflake, Redshift, Synapse, and BigQuery).&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Lakeflow Connect benefits include:&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;Simple UI: Avoid complex setups and architectural overhead, these fully managed connectors provide a simple UI and API to democratize data access. Automated features also help simplify pipeline maintenance with minimal overhead.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Efficient ingestion: Increase efficiency and accelerate time to value. Optimized incremental reads and writes and data transformation help improve the performance and reliability of your pipelines, reduce bottlenecks, and reduce impact to the source data for scalability.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Unified with the Databricks Platform: Create ingestion pipelines with governance from Unity Catalog, observability from Lakehouse Monitoring, and seamless orchestration with Lakeflow Jobs for analytics, AI and BI.&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2&gt;Availability&lt;/H2&gt;
&lt;P&gt;The Lakeflow Connect Free Tier is available starting today on Azure Databricks.&lt;/P&gt;
&lt;P&gt;If you are at FabCon in Atlanta, join us for&amp;nbsp;&lt;A href="https://fabriccon.com/program/agenda?search=azure+databricks" target="_blank" rel="noopener"&gt;Accelerating Data and AI with Azure Databricks&lt;/A&gt; on Thursday, March 19th, 8:00–9:00 AM, in room C302 to see how these capabilities come together to accelerate performance, simplify architecture, and maximize value on Azure.&lt;/P&gt;
&lt;H2&gt;Getting Started Resources&amp;nbsp;&lt;/H2&gt;
&lt;P&gt;To learn more about the Lakeflow Connect Free Tier and Lakeflow Connect, review our &lt;A href="https://www.databricks.com/product/pricing/lakeflow-connect" target="_blank" rel="noopener"&gt;pricing page&lt;/A&gt; and &lt;A href="https://docs.databricks.com/aws/en/ingestion/overview" target="_blank" rel="noopener"&gt;documentation&lt;/A&gt;. Get started ingesting your data today for free: &lt;A href="https://login.databricks.com/signup?provider=DB_FREE_TIER&amp;amp;dbx_source=www&amp;amp;itm_data=dbx-web&amp;amp;l=en-EN&amp;amp;tuuid=7d3c1b33-e87c-4f65-8e12-04f5d9adc328&amp;amp;intent=SIGN_UP&amp;amp;rl_aid=d079708c-6872-4bcb-90d2-e4718607fb01" target="_blank" rel="noopener"&gt;sign up with an Azure free account&lt;/A&gt;.&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;&lt;A href="https://login.databricks.com/signup?provider=DB_FREE_TIER&amp;amp;dbx_source=www&amp;amp;itm_data=dbx-web&amp;amp;l=en-EN&amp;amp;tuuid=7d3c1b33-e87c-4f65-8e12-04f5d9adc328&amp;amp;intent=SIGN_UP&amp;amp;rl_aid=d079708c-6872-4bcb-90d2-e4718607fb01" target="_blank" rel="noopener"&gt;Get started with Azure Databricks for free&lt;/A&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Product tour: &lt;A href="https://www.databricks.com/resources/demos/tours/platform/discover-databricks-lakeflow-connect-demo?itm_data=demo_center&amp;amp;itm_source=www&amp;amp;itm_category=resources&amp;amp;itm_page=library&amp;amp;itm_location=body&amp;amp;itm_component=general-asset-card&amp;amp;itm_offer=discover-databricks-lakeflow-connect-demo" target="_blank" rel="noopener"&gt;Databricks Lakeflow Connect for Salesforce: Powering Smarter Selling with AI and Analytics&lt;/A&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Product tour: &lt;A href="https://www.databricks.com/resources/demos/tours/platform/servicenow-lakeflow-connect?itm_data=demo_center&amp;amp;itm_source=www&amp;amp;itm_category=resources&amp;amp;itm_page=library&amp;amp;itm_location=body&amp;amp;itm_component=general-asset-card&amp;amp;itm_offer=servicenow-lakeflow-connect" target="_blank" rel="noopener"&gt;Effortless ServiceNow Data Ingestion with Databricks Lakeflow Connect&lt;/A&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Product tour: &lt;A href="https://www.databricks.com/resources/demos/tours/platform/google-analytics-lakeflow-connect?itm_data=demo_center&amp;amp;itm_source=www&amp;amp;itm_category=resources&amp;amp;itm_page=library&amp;amp;itm_location=body&amp;amp;itm_component=general-asset-card&amp;amp;itm_offer=google-analytics-lakeflow-connect" target="_blank" rel="noopener"&gt;Simplify Data Ingestion with Lakeflow Connect: From Google Analytics to AI&lt;/A&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;On-demand video: &lt;A href="https://www.databricks.com/resources/demos/videos/lakeflow-connect-salesforce-connector?itm_data=demo_center&amp;amp;itm_source=www&amp;amp;itm_category=resources&amp;amp;itm_page=library&amp;amp;itm_location=body&amp;amp;itm_component=general-asset-card&amp;amp;itm_offer=lakeflow-connect-salesforce-connector" target="_blank" rel="noopener"&gt;Use Lakeflow Connect for Salesforce to predict customer churn&lt;/A&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;On-demand video: &lt;A href="https://www.databricks.com/resources/demos/tours/appdev/lakeflow-workday-connect?itm_data=demo_center&amp;amp;itm_source=www&amp;amp;itm_category=resources&amp;amp;itm_page=library&amp;amp;itm_location=body&amp;amp;itm_component=general-asset-card&amp;amp;itm_offer=lakeflow-workday-connect" target="_blank" rel="noopener"&gt;Databricks Lakeflow Connect for Workday Reports: Connect, Ingest, and Analyze Workday Data Without Complexity&lt;/A&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;On-demand video: &lt;A href="https://www.databricks.com/resources/demos/videos/data-ingestion-with-lakeflow-connect?itm_data=demo_center&amp;amp;itm_source=www&amp;amp;itm_category=resources&amp;amp;itm_page=library&amp;amp;itm_location=body&amp;amp;itm_component=general-asset-card&amp;amp;itm_offer=data-ingestion-with-lakeflow-connect" target="_blank" rel="noopener"&gt;Data Ingestion With Lakeflow Connect&lt;/A&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;HR /&gt;
&lt;P&gt;&lt;EM&gt;* Your actual ingestion capacity will vary based on specific workload characteristics, record sizes, and source types.&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;&lt;EM&gt;** Excludes Zerobus Ingest, Auto Loader, and other self-managed connectors. Customers will continue to incur charges for underlying infrastructure consumption from the cloud vendor.&lt;/EM&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 18 Mar 2026 16:00:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/introducing-lakeflow-connect-free-tier-now-available-on-azure/ba-p/4502755</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2026-03-18T16:00:00Z</dc:date>
    </item>
    <item>
      <title>Near–Real-Time CDC to Delta Lake for BI and ML with Lakeflow on Azure Databricks</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/near-real-time-cdc-to-delta-lake-for-bi-and-ml-with-lakeflow-on/ba-p/4502750</link>
      <description>&lt;H2 aria-level="2"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 2"&gt;The Challenge: Too Many Tools, Not Enough Clarity&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:360,&amp;quot;335559739&amp;quot;:80}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H2&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Modern data teams on Azure often stitch together separate orchestrators, custom streaming consumers, hand-rolled transformation notebooks, and third-party connectors — each with its own monitoring UI, credential system, and failure modes. The result is observability gaps, weeks of work per new data source, disconnected lineage, and governance bolted on as an afterthought.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;A href="https://www.databricks.com/product/data-engineering" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Lakeflow&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;,&amp;nbsp;Databricks’ unified data engineering solution,&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;solves this by&amp;nbsp;consolidating&amp;nbsp;ingestion, transformation, and orchestration natively inside Azure Databricks — governed end-to-end by&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://www.databricks.com/product/unity-catalog" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Unity Catalog&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;DIV class="styles_lia-table-wrapper__h6Xo9 styles_table-responsive__MW0lN"&gt;&lt;table border="1" style="width: 78.6111%; border-width: 1px;"&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Component&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;What It Does&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/ingestion/lakeflow-connect/" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Lakeflow Connect&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Point-and-click connectors for databases (&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;using&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;CDC), SaaS apps, files, streaming, and&amp;nbsp;Zerobus&amp;nbsp;for direct telemetry&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/ldp/concepts" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Lakeflow Spark Declarative Pipelines&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Declarative ETL with&amp;nbsp;AutoCDC, data quality enforcement, and automatic incremental processing&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/jobs/" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Lakeflow Jobs&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Managed orchestration with 99.95% uptime, a visual task DAG, and repair-and-rerun&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;colgroup&gt;&lt;col style="width: 37.3295%" /&gt;&lt;col style="width: 62.6292%" /&gt;&lt;/colgroup&gt;&lt;/table&gt;&lt;/DIV&gt;
&lt;H2&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 2"&gt;Architecture&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:360,&amp;quot;335559739&amp;quot;:80}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H2&gt;
&lt;H3&gt;Step 1: Stream Application Telemetry with Zerobus Ingest&lt;/H3&gt;
&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/ingestion/zerobus-overview" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Zerobus Ingest&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;,&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;part of&amp;nbsp;Lakeflow&amp;nbsp;Connect, lets your application push events directly to a Delta table over&amp;nbsp;gRPC&amp;nbsp;— no message bus, no Structured Streaming job. Sub-5-second latency, up to 100 MB/sec per connection,&amp;nbsp;immediately&amp;nbsp;queryable&amp;nbsp;in&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Unity Catalog&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Prerequisites&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="20" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="none"&gt;Azure Databricks workspace with Unity Catalog enabled and serverless compute on&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="20" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;A service principal with write access to the target table&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Setup&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;First, create the target table in a SQL notebook:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;PRE class="lia-indent-padding-left-60px"&gt;&lt;SPAN data-contrast="none"&gt;CREATE CATALOG IF NOT EXISTS&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod;&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;CREATE SCHEMA IF NOT EXISTS&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.bronze;&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;CREATE TABLE IF NOT EXISTS&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.bronze.telemetry_events&amp;nbsp;(&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;event_id&amp;nbsp;&amp;nbsp;&amp;nbsp; STRING,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;user_id&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; STRING,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;event_type&amp;nbsp; STRING,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;session_id&amp;nbsp; STRING,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;ts&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;BIGINT&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp; page&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp; STRING,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;duration_ms&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;INT&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;);&lt;/SPAN&gt; &lt;/PRE&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;1. Go to&amp;nbsp;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Settings → Identity and Access → Service Principals → Add service principal&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;2. Open the service principal →&amp;nbsp;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Secrets&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;tab →&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Generate&amp;nbsp;secret&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;. Save the Client ID and secret.&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;3. In a SQL notebook, grant access:&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;PRE class="lia-indent-padding-left-60px"&gt;&lt;SPAN data-contrast="none"&gt;GRANT USE CATALOG ON CATALOG&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;TO&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;`&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;lt;client-id&amp;gt;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;`&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;;&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;GRANT USE SCHEMA ON SCHEMA&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.bronze&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;TO&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;`&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;lt;client-id&amp;gt;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;`&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;;&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;GRANT MODIFY&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;,&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;SELECT ON TABLE&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.bronze.telemetry_events&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;TO&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;`&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;lt;client-id&amp;gt;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;`&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;;&lt;/SPAN&gt; &lt;/PRE&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;4. Derive your Zerobus endpoint from your workspace URL:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;lt;workspace-id&amp;gt;.zerobus.&amp;lt;region&amp;gt;.azuredatabricks.net&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;(The workspace ID is the number in your workspace URL,&amp;nbsp;e.g.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;adb-**1234567890**.12.azuredatabricks.net&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;)&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;5. Install the SDK:&amp;nbsp;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;pip install&amp;nbsp;databricks-zerobus-ingest-sdk&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;6. In your application, open a stream and push records:&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;PRE class="lia-indent-padding-left-60px"&gt;&lt;SPAN data-contrast="none"&gt;from&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;zerobus.sdk.sync&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;import&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;ZerobusSdk&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;from&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;zerobus.sdk.shared&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;import&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;RecordType,&amp;nbsp;StreamConfigurationOptions,&amp;nbsp;TableProperties&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;sdk&amp;nbsp;&amp;nbsp;&amp;nbsp; =&amp;nbsp;ZerobusSdk(&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;"&amp;lt;workspace-id&amp;gt;.zerobus.&amp;lt;region&amp;gt;.azuredatabricks.net"&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;"&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;https&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;://&amp;lt;workspace-&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;url&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;gt;"&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;)&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;stream =&amp;nbsp;sdk.create_stream(&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;"&amp;lt;client-id&amp;gt;"&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;,&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;"&amp;lt;client-secret&amp;gt;"&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;TableProperties(&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;"prod.bronze.telemetry_events"&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;),&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;StreamConfigurationOptions(record_type=RecordType.JSON)&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;)&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;stream.ingest_record({&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;"event_id"&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;"e1"&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;,&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;"user_id"&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;"u42"&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;"event_type"&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;"page_view"&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;,&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;"ts"&lt;/SPAN&gt;&lt;SPAN 
data-contrast="none"&gt;:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;1700000000000&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;})&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;stream.close()&lt;/SPAN&gt; &lt;/PRE&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;7. Verify in&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Catalog → prod → bronze →&amp;nbsp;telemetry_events&amp;nbsp;→ Sample Data&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H3&gt;Step 2: Ingest from On-Premises SQL Server via CDC&lt;/H3&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Lakeflow&amp;nbsp;Connect reads SQL Server's transaction log incrementally — no full table scans, no custom extraction software. Connectivity to your on-prem server is over Azure ExpressRoute.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Prerequisites&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="13" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="none"&gt;SQL Server reachable from Databricks over ExpressRoute (TCP port 1433)&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="13" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;CDC enabled on the source database and tables (see setup below)&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="13" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;A SQL login with CDC read permissions on the source database&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:210,&amp;quot;335559739&amp;quot;:210}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="13" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Databricks: &lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;CREATE CONNECTION&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;privilege on the&amp;nbsp;metastore;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;USE CATALOG&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;,&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;CREATE TABLE&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;on the destination catalog&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;Setup&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;PRE class="lia-indent-padding-left-30px"&gt;&lt;SPAN data-contrast="none"&gt;Enable CDC on SQL Server:&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;USE&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;YourDatabase;&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;EXEC&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;sys.sp_cdc_enable_db;&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;EXEC&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;sys.sp_cdc_enable_table&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp; @source_schema =&amp;nbsp;N&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;dbo&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;, @source_name =&amp;nbsp;N&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;orders&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;,&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;@role_name =&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;NULL&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;;&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;EXEC&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;sys.sp_cdc_enable_table&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp; @source_schema =&amp;nbsp;N&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;dbo&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;, @source_name =&amp;nbsp;N&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;customers&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;, @role_name =&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;NULL&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/PRE&gt;
&lt;/BLOCKQUOTE&gt;
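&lt;P&gt;Before configuring the connector, it can help to confirm on the SQL Server side that CDC is actually active; a minimal sketch, assuming the database and tables named above:&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;PRE class="lia-indent-padding-left-30px"&gt;-- Verify CDC flags on the database and the captured tables
SELECT name, is_cdc_enabled FROM sys.databases WHERE name = 'YourDatabase';
SELECT name, is_tracked_by_cdc FROM sys.tables WHERE name IN ('orders', 'customers');&lt;/PRE&gt;
&lt;/BLOCKQUOTE&gt;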
&lt;P&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Configure the connector in Databricks:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="12" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="none"&gt;Click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Data Ingestion&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;in the sidebar (or&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;+ New → Add Data&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;)&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="12" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Select &lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;SQL Server&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;from the connector list&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="12" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Ingestion Gateway page&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;— enter a gateway name, select staging catalog/schema, click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Next&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="12" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Ingestion Pipeline page&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;— name the pipeline, click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Create connection&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;:&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="12" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Host: your on-prem IP (e.g. &lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;10.0.1.50&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;) · Port:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;1433&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;· Database:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;YourDatabase&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="12" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Enter credentials, click &lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Create&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;, then&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Create pipeline and continue&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="12" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Source page&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;— expand the database tree, check&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;dbo.orders&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;and&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;dbo.customers&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;; optionally enable&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;History tracking&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;(SCD Type 2) per table.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Set&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Destination table name&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;to&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;orders_raw&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;and&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;customers_raw&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;respectively.&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="12" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Destination page&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;— set catalog:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;prod&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;, schema:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;bronze&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;, click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Save and continue&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="12" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Settings page&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;— set a sync schedule (e.g.&amp;nbsp;every 5 minutes), click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Save and run pipeline&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/OL&gt;
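&lt;P&gt;After the first sync completes, the replicated tables can be inspected directly; a minimal sketch, assuming the destination names configured above:&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;PRE class="lia-indent-padding-left-30px"&gt;-- Check row counts of the CDC-replicated bronze tables
SELECT COUNT(*) AS order_rows    FROM prod.bronze.orders_raw;
SELECT COUNT(*) AS customer_rows FROM prod.bronze.customers_raw;&lt;/PRE&gt;
&lt;/BLOCKQUOTE&gt;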
&lt;H3&gt;Step 3: Transform with Spark Declarative Pipelines&lt;/H3&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;The&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/ldp/multi-file-editor" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Lakeflow Pipelines Editor&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;is an IDE built for developing pipelines in&amp;nbsp;Lakeflow&amp;nbsp;Spark Declarative Pipelines (SDP), and&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;lets you define Bronze → Silver → Gold in SQL. SDP then handles incremental execution, schema evolution, and lineage automatically.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Prerequisites&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="11" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="none"&gt;Bronze tables populated (from Steps 1 and 2)&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="11" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;CREATE TABLE&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;and&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;USE SCHEMA&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;privileges on&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;prod.silver&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;and&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;prod.gold&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Setup&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;1. In the sidebar, click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Jobs &amp;amp; Pipelines → ETL pipeline → Start with an empty file → SQL&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;2. Rename the pipeline (click the name at top) to&amp;nbsp;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;lakeflow-demo-pipeline&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;3. Paste the following SQL:&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;PRE class="lia-indent-padding-left-30px"&gt;&lt;SPAN data-contrast="none"&gt;-- Silver: latest order state (SCD Type 1)&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;CREATE OR REFRESH&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;STREAMING&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;TABLE&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.silver.orders;&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;APPLY CHANGES&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;INTO&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.silver.orders&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;FROM&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;STREAM(prod.bronze.orders_raw)&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;KEYS&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;(order_id)&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;SEQUENCE BY&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;updated_at&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;STORED AS&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;SCD&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;TYPE&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;1&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;;&lt;/SPAN&gt; &lt;BR /&gt;&lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;-- Silver: full customer history (SCD Type 2)&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;CREATE OR REFRESH&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;STREAMING&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;TABLE&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.silver.customers;&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;APPLY CHANGES&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;INTO&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.silver.customers&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;FROM&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;STREAM(prod.bronze.customers_raw)&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;KEYS&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;(customer_id)&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;SEQUENCE BY&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;updated_at&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;STORED AS&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;SCD&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;TYPE&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;2&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;;&lt;/SPAN&gt; &lt;BR /&gt;&lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;-- Silver: telemetry with data quality check&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;CREATE OR REFRESH&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;STREAMING&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;TABLE&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.silver.telemetry_events&amp;nbsp;(&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559685&amp;quot;:360,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;BR /&gt;&lt;SPAN data-contrast="none"&gt; &amp;nbsp;&lt;/SPAN&gt;&lt;SPAN 
data-contrast="none"&gt;CONSTRAINT&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;valid_event_type&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp; EXPECT (event_type&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;IN&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;(&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'page_view'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;,&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'add_to_cart'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;,&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'purchase'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;))&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;ON&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;VIOLATION&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;DROP ROW&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;)&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;AS SELECT&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;*&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;FROM&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;STREAM(prod.bronze.telemetry_events);&lt;/SPAN&gt; &lt;BR /&gt;&lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;-- Gold: materialized view joining all three&amp;nbsp;Silver&amp;nbsp;tables&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;CREATE OR REFRESH MATERIALIZED VIEW&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.gold.customer_activity&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;AS&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;SELECT&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;o.order_id,&amp;nbsp;o.customer_id,&amp;nbsp;c.customer_name,&amp;nbsp;c.email,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;o.order_amount,&amp;nbsp;o.order_status,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;COUNT&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;(e.event_id)&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;AS&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;total_events,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;SUM&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;(&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;CASE WHEN&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;e.event_type&amp;nbsp;=&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'purchase'&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;THEN&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;1&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;ELSE&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;0&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;END&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;)&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;AS&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;purchase_events&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;FROM&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.silver.orders&amp;nbsp;o&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;LEFT JOIN&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.silver.customers&amp;nbsp;c&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN 
data-contrast="none"&gt;ON&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;o.customer_id&amp;nbsp;=&amp;nbsp;c.customer_id&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;LEFT JOIN&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.silver.telemetry_events&amp;nbsp;e&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;ON&amp;nbsp;CAST&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;(o.customer_id&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;AS&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;STRING) =&amp;nbsp;e.user_id&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;--&amp;nbsp;user_id&amp;nbsp;in telemetry is string&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;GROUP BY&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;o.order_id,&amp;nbsp;o.customer_id,&amp;nbsp;c.customer_name,&amp;nbsp;c.email,&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;o.order_amount,&amp;nbsp;o.order_status;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559685&amp;quot;:360,&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/PRE&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;4. Click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Settings&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;(gear icon) → set&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Pipeline mode: Continuous&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;→&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Target catalog: prod&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;→&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Save&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;5. Click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Start&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;— the editor switches to the live Graph view&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H3&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 2"&gt;Step 4: Govern with Unity Catalog&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:360,&amp;quot;335559739&amp;quot;:80}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H3&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;All tables from Steps 1–3 are automatically registered in&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Unity Catalog&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;, Databricks’ built-in governance and security offering, with full lineage. No manual registration&amp;nbsp;needed.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;View lineage&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="8" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="none"&gt;Go to&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Catalog → prod → gold →&amp;nbsp;customer_activity&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="8" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Click the &lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Lineage&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;tab →&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;See Lineage Graph&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="8" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Click the expand icon on each upstream node to reveal the full chain: Bronze sources → Silver → Gold&lt;/LI&gt;
&lt;/OL&gt;
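&lt;P&gt;If you prefer SQL to the UI, the same lineage is also recorded in Unity Catalog system tables. A minimal sketch follows; the system.access.table_lineage table and its column names are assumptions here, so verify them in your workspace:&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;PRE class="lia-indent-padding-left-60px"&gt;-- Upstream tables feeding the Gold materialized view (column names assumed)
SELECT source_table_full_name, target_table_full_name, event_time
FROM system.access.table_lineage
WHERE target_table_full_name = 'prod.gold.customer_activity'
ORDER BY event_time DESC;&lt;/PRE&gt;
&lt;/BLOCKQUOTE&gt;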
&lt;P&gt;&lt;STRONG&gt;Set Permissions&lt;/STRONG&gt;&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;PRE class="lia-indent-padding-left-60px"&gt;&lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;-- Grant analysts read access to the&amp;nbsp;Gold&amp;nbsp;layer only&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;GRANT SELECT ON TABLE&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.gold.customer_activity&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;TO&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://encoded-592c9deb-987b-4562-aa3c-9fa3d37d83e9.uri/mailto%3a%2560analysts%40contoso.com%2560" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;`analysts@contoso.com`&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;;&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;-- Mask PII for non-privileged users&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;CREATE FUNCTION&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.security.mask_email(email STRING)&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;RETURNS&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;STRING&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;RETURN CASE WHEN&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;is_account_group_member(&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'data-engineers'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;)&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;THEN&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;email&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;ELSE&amp;nbsp;CONCAT&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;(&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;LEFT&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;(email,&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;2&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;),&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;'&lt;/SPAN&gt;&lt;A href="https://encoded-592c9deb-987b-4562-aa3c-9fa3d37d83e9.uri/mailto%3a***%40***.com" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;***@***.com&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;'&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;)&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;END&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;;&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;ALTER TABLE&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;prod.silver.customers&lt;/SPAN&gt; &lt;BR /&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;ALTER COLUMN&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;email&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;SET&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;MASK&amp;nbsp;prod.security.mask_email;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/PRE&gt;
&lt;/BLOCKQUOTE&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H3 aria-level="2"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 2"&gt;Step 5: Orchestrate and Monitor with&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 2"&gt;Lakeflow&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 2"&gt;&amp;nbsp;Jobs&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:360,&amp;quot;335559739&amp;quot;:80}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H3&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Wire the Connect pipeline and SDP pipeline into a single job with dependencies, scheduling, and alerting, all from the UI with&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;Lakeflow&amp;nbsp;Jobs.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Prerequisites&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="7" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="none"&gt;Pipelines from Steps 2 and 3 saved in the workspace&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Setup&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="6" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="none"&gt;Go to&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Jobs &amp;amp; Pipelines → Create → Job&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="6" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Task 1:&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;click the Pipeline tile → name it&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;ingest_sql_server_cdc&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;→ select your&amp;nbsp;Lakeflow&amp;nbsp;Connect pipeline →&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Create task&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="6" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Task 2:&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;+ Add task → Pipeline&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;→ name it&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;transform_bronze_to_gold&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;→ select&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;lakeflow-demo-pipeline&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;→ set&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Depends on:&amp;nbsp;ingest_sql_server_cdc&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;→&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Create task&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="6" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;In the &lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Job details&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;panel on the right: click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Add schedule&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;→ set frequency → add email notification on failure →&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Save&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="6" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Click &lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Run now&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;to trigger a run, then click the run ID to open the Run detail view&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;For health monitoring across all jobs, query system tables in any notebook or SQL warehouse:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;SPAN data-contrast="none"&gt;SELECT&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;job_name,&amp;nbsp;result_state,&amp;nbsp;DATEDIFF(second,&amp;nbsp;start_time,&amp;nbsp;end_time)&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;AS&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;duration_sec&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;SPAN data-contrast="none"&gt;FROM&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;system.lakeflow.job_run_timeline&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;SPAN data-contrast="none"&gt;WHERE&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;start_time&amp;nbsp;&amp;gt;=&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;CURRENT_TIMESTAMP&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;-&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;INTERVAL&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;24&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;HOURS&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;&lt;SPAN data-contrast="none"&gt;ORDER BY&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;start_time&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;DESC&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px" aria-level="2"&gt;&amp;nbsp;&lt;/P&gt;
&lt;/BLOCKQUOTE&gt;
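&lt;P&gt;For example, to spot jobs that failed in the last day you can aggregate the same timeline table. This is a sketch reusing the columns from the query above; the 'FAILED' result_state value is an assumption, so check the documented values for this table:&lt;/P&gt;
&lt;BLOCKQUOTE&gt;
&lt;PRE class="lia-indent-padding-left-60px"&gt;-- Count failed runs per job over the last 24 hours
SELECT job_name, COUNT(*) AS failed_runs
FROM system.lakeflow.job_run_timeline
WHERE result_state = 'FAILED'
  AND start_time &amp;gt;= CURRENT_TIMESTAMP - INTERVAL 24 HOURS
GROUP BY job_name
ORDER BY failed_runs DESC;&lt;/PRE&gt;
&lt;/BLOCKQUOTE&gt;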
&lt;H3&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;&lt;SPAN data-ccp-parastyle="heading 2"&gt;Step 6: Visualize with AI/BI Dashboards and Genie&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:360,&amp;quot;335559739&amp;quot;:80}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/H3&gt;
&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/ai-bi/" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;AI/BI Dashboard&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;helps you create AI-powered, low-code dashboards.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="5" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="none"&gt;Click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;+ New → Dashboard&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="5" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Click &lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Add a visualization&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;, connect to&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;prod.gold.customer_activity&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;, and build charts&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="%1." data-font="DM Sans" data-listid="5" data-list-defn-props="{&amp;quot;335552541&amp;quot;:0,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769242&amp;quot;:[65533,0],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;%1.&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Click &lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Publish&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;— viewers see data under their own Unity Catalog permissions automatically&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/genie/" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Genie&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;allows you to interact with their data using natural language&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;1. In the sidebar, click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;Genie → New&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;2. On&amp;nbsp;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Choose data sources&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;, select&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;prod.gold.customer_activity&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;→&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Create&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;3. Add context in the&amp;nbsp;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Instructions&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;box (e.g., table relationships, business definitions)&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;4. Switch to the&amp;nbsp;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;Chat&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt; tab and ask a question:&lt;/SPAN&gt;&lt;/P&gt;
&lt;P class="lia-indent-padding-left-60px"&gt;"Which customers have the highest total events and what were their order amounts?"&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559685&amp;quot;:1320,&amp;quot;335559737&amp;quot;:600,&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;5. Genie generates and executes SQL, returning a result table. Click&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;View SQL&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;to inspect the query.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;H2 aria-level="2"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 2"&gt;Everything in One Platform&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:360,&amp;quot;335559739&amp;quot;:80}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H2&gt;
&lt;DIV class="styles_lia-table-wrapper__h6Xo9 styles_table-responsive__MW0lN"&gt;&lt;table border="1" style="border-width: 1px;"&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Capability&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Lakeflow&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Previously Required&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335551550&amp;quot;:2,&amp;quot;335551620&amp;quot;:2,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Telemetry ingestion&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Zerobus&amp;nbsp;Ingest&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Message bus + custom consumer&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Database CDC&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Lakeflow&amp;nbsp;Connect&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Custom scripts or 3rd-party tools&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Transformation +&amp;nbsp;AutoCDC&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Spark Declarative Pipelines&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Hand-rolled MERGE logic&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Data quality&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;SDP Expectations&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Separate validation tooling&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Orchestration&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Lakeflow&amp;nbsp;Jobs&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;External schedulers (Airflow, etc.)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Governance&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Unity Catalog&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Disconnected ACLs and lineage&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Monitoring&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Job UI + System Tables&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Separate APM tools&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;BI + NL Query&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;AI/BI Dashboards + Genie&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;External BI tools&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;colgroup&gt;&lt;col style="width: 33.33%" /&gt;&lt;col style="width: 33.33%" /&gt;&lt;col style="width: 33.33%" /&gt;&lt;/colgroup&gt;&lt;/table&gt;&lt;/DIV&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Customers seeing results on Azure Databricks:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:120}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="2" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="none"&gt;Ahold Delhaize&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;— 4.5x faster deployment and 50% cost reduction running 1,000+ ingestion jobs daily&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="2" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;Porsche Holding&lt;SPAN style="color: rgb(30, 30, 30);" data-contrast="none"&gt;&amp;nbsp;— 85% faster ingestion pipeline development vs. a custom-built solution&lt;/SPAN&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;H2 aria-level="2"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 2"&gt;Next Steps&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;335559738&amp;quot;:360,&amp;quot;335559739&amp;quot;:80}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/H2&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="1" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;A href="https://www.databricks.com/product/data-engineering" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Lakeflow product page&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="1" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;A style="font-style: normal; font-weight: 400; background-color: rgb(255, 255, 255);" href="https://docs.databricks.com/ingestion/lakeflow-connect/index.html" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Lakeflow Connect documentation&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="1" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;A style="font-style: normal; font-weight: 400; background-color: rgb(255, 255, 255);" href="https://databricks.com/demos" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Live demos on Demo Center&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt; &lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="1" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;A style="font-style: normal; font-weight: 400; background-color: rgb(255, 255, 255);" href="https://azure.microsoft.com/products/databricks" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Get started with Azure Databricks&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN style="color: rgb(30, 30, 30);" data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;</description>
      <pubDate>Wed, 18 Mar 2026 16:00:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/near-real-time-cdc-to-delta-lake-for-bi-and-ml-with-lakeflow-on/ba-p/4502750</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2026-03-18T16:00:00Z</dc:date>
    </item>
    <item>
      <title>Azure Databricks Lakebase is now generally available</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/azure-databricks-lakebase-is-now-generally-available/ba-p/4498779</link>
      <description>&lt;P&gt;Modern applications are real-time, intelligent, and increasingly powered by AI agents that need fast, reliable access to operational data, without sacrificing governance, scale, or simplicity.&lt;/P&gt;
&lt;P&gt;To solve for this, Azure Databricks Lakebase introduces a serverless Postgres database architecture that separates compute from storage and integrates natively with the Databricks Data Intelligence Platform on Azure. Lakebase is now generally available in Azure Databricks, enabling you and your team to start building and validating real-time and AI-driven applications directly on your lakehouse foundation.&lt;/P&gt;
&lt;H3&gt;Why Azure Databricks Lakebase?&lt;/H3&gt;
&lt;P&gt;Lakebase was created to support modern workloads and reduce silos. By decoupling compute from storage, Lakebase treats infrastructure as an on-demand service that scales automatically with workload needs and scales to zero when idle.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Key capabilities include:&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Serverless Postgres for Production Workloads: Lakebase delivers a managed Postgres experience with predictable performance and built-in reliability features suitable for production applications, while abstracting away infrastructure management.&lt;/LI&gt;
&lt;LI&gt;Instant Branching and Point-in-Time Recovery: Teams can create zero-copy branches of production data in seconds for testing, debugging, or experimentation, and restore databases to precise points in time to recover from errors or incidents.&lt;/LI&gt;
&lt;LI&gt;Unified Governance with Unity Catalog: Operational data in Lakebase can be governed using the same Unity Catalog policies that secure analytics and AI workloads, enabling consistent access control, auditing, and compliance across the platform.&lt;/LI&gt;
&lt;LI&gt;Built for AI and Real-Time Applications: Lakebase is designed to support AI-native patterns such as real-time feature serving, agent memory, and low-latency application state—while keeping data directly connected to the lakehouse for analytics and learning workflows.&lt;/LI&gt;
&lt;LI&gt;Operate Directly on Lake-Backed Data: Lakebase allows applications to operate directly on governed, lake-backed data, reducing the need for pipeline synchronization or duplicated storage.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;On Azure Databricks, this unlocks new scenarios such as:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Real-time applications built on lakehouse data&lt;/LI&gt;
&lt;LI&gt;AI agents with persistent, governed memory&lt;/LI&gt;
&lt;LI&gt;Faster release cycles with safe, isolated database branches&lt;/LI&gt;
&lt;LI&gt;Simplified architectures with fewer moving parts&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;All while using familiar Postgres interfaces and tools.&lt;/P&gt;
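&lt;P&gt;As a simple illustration, standard PostgreSQL works as-is against a Lakebase instance. The sketch below uses a hypothetical table for agent session memory; the table and column names are illustrative, not part of any Lakebase API:&lt;/P&gt;
&lt;PRE&gt;-- Hypothetical agent-memory table; any standard Postgres DDL/DML applies
CREATE TABLE IF NOT EXISTS agent_memory (
  session_id   UUID        NOT NULL,
  turn_number  INT         NOT NULL,
  role         TEXT        NOT NULL,
  content      TEXT        NOT NULL,
  created_at   TIMESTAMPTZ NOT NULL DEFAULT now(),
  PRIMARY KEY (session_id, turn_number)
);

-- Record a conversation turn, then read back the most recent ones
INSERT INTO agent_memory (session_id, turn_number, role, content)
VALUES (gen_random_uuid(), 1, 'user', 'What were my top customers last week?');

SELECT role, content FROM agent_memory ORDER BY created_at DESC LIMIT 5;&lt;/PRE&gt;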
&lt;H3&gt;Get Started with Azure Databricks Lakebase&lt;/H3&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Lakebase&amp;nbsp;is integrated into the Azure Databricks experience and can be provisioned directly within Azure Databricks workspaces.&amp;nbsp;For Azure Databricks customers building intelligent, real-time applications, it offers a new foundation—one designed for the pace and complexity of modern data-driven systems.&amp;nbsp;We’re&amp;nbsp;excited to see what you build,&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/oltp/projects/" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;get started today&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;!&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
</description>
      <pubDate>Tue, 03 Mar 2026 03:34:11 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/azure-databricks-lakebase-is-now-generally-available/ba-p/4498779</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2026-03-03T03:34:11Z</dc:date>
    </item>
    <item>
      <title>Serverless Workspaces are generally available in Azure Databricks</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/serverless-workspaces-are-generally-available-in-azure/ba-p/4491314</link>
      <description>&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Recently, we&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://techcommunity.microsoft.com/blog/analyticsonazure/serverless-workspaces-are-live-in-azure-databricks/4474712" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;announced&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;that Serverless Workspaces in public preview.&amp;nbsp;Today, we are excited to share that Serverless Workspaces are&amp;nbsp;generally available&amp;nbsp;in Azure Databricks.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;Azure Databricks now offers&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;two workspace models&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;: Serverless and Classic.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:210,&amp;quot;335559739&amp;quot;:210,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;With Serverless, Azure Databricks&amp;nbsp;operates&amp;nbsp;and maintains the entire environment on your behalf. You still configure governance elements like Unity Catalog, identity federation, and workspace-level policies, but the heavy lifting of infrastructure setup disappears. As a result, teams can start building&amp;nbsp;immediately&amp;nbsp;instead of waiting on networking or compute provisioning.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:210,&amp;quot;335559739&amp;quot;:210,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Classic workspaces take the opposite approach. When creating a Classic workspace, you must design and deploy the full networking layout, determine how compute should be managed, establish storage patterns, and configure all inbound and outbound connectivity. These decisions are critical and have benefits in heavily regulated or secure industries, but they may create overhead for teams who simply want to start working with data.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:210,&amp;quot;335559739&amp;quot;:210,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;A&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;Serverless Workspace&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;eliminates&amp;nbsp;that overhead entirely. Once created,&amp;nbsp;it’s&amp;nbsp;ready for use—no&amp;nbsp;virtual network&amp;nbsp;design, no storage configuration, and no cluster management.&amp;nbsp;Serverless workspaces use serverless&amp;nbsp;compute&amp;nbsp;and default storage.&amp;nbsp;Unity Catalog&amp;nbsp;is&amp;nbsp;automatically provisioned so that the same governance model applies.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:210,&amp;quot;335559739&amp;quot;:210,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P aria-level="3"&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;Key capabilities&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;&amp;nbsp;and consideration&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;of Serverless workspaces&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:246,&amp;quot;335559739&amp;quot;:246,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;U&gt;&lt;SPAN data-contrast="auto"&gt;Storage:&lt;/SPAN&gt;&lt;/U&gt;&lt;BR /&gt;&lt;SPAN data-contrast="auto"&gt;Each Serverless workspace includes fully managed object storage&amp;nbsp;called default storage. You can build managed catalogs, volumes, and tables without supplying your own storage accounts or credentials. Features like multi-key projection and restricted object-store access ensure that only authorized users can work with the data.&amp;nbsp;Classic&amp;nbsp;compute&amp;nbsp;cannot interact with data assets in default storage.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:210,&amp;quot;335559739&amp;quot;:210,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;If you already have&amp;nbsp;an Azure Blob Storage account (likely with&amp;nbsp;hierarchical namespace enabled)&amp;nbsp;or if your organization requires using your own storage due to security&amp;nbsp;or compliance&amp;nbsp;requirements,&amp;nbsp;you can also&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/cloud-storage/" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;create a connection between your storage account and Serverless Workspace&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:210,&amp;quot;335559739&amp;quot;:210,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;U&gt;&lt;SPAN data-contrast="auto"&gt;Compute:&lt;/SPAN&gt;&lt;/U&gt;&lt;BR /&gt;&lt;SPAN data-contrast="auto"&gt;Workloads run on automatically provisioned serverless&amp;nbsp;compute—no need to build or&amp;nbsp;maintain&amp;nbsp;clusters. Azure Databricks handles scaling and resource&amp;nbsp;optimization&amp;nbsp;so users can focus purely on data and analytics.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:210,&amp;quot;335559739&amp;quot;:210,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;U&gt;&lt;SPAN data-contrast="auto"&gt;Network:&lt;/SPAN&gt;&lt;/U&gt;&lt;BR /&gt;&lt;SPAN data-contrast="auto"&gt;There’s&amp;nbsp;no requirement to deploy NAT gateways, firewalls, or Private Link endpoints. Instead, you define serverless egress rules and serverless Private Link controls that apply uniformly to all workloads in the workspace.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:210,&amp;quot;335559739&amp;quot;:210,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;U&gt;&lt;SPAN data-contrast="auto"&gt;Unity Catalog access:&lt;/SPAN&gt;&amp;nbsp;&lt;/U&gt;&lt;BR /&gt;&lt;SPAN data-contrast="auto"&gt;Governed data&amp;nbsp;remains&amp;nbsp;accessible from the new workspace with existing permissions intact. Your data estate stays consistent and secure.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:210,&amp;quot;335559739&amp;quot;:210,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P aria-level="3"&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-parastyle="heading 3"&gt;Choosing between Serverless and Classic&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;134245418&amp;quot;:true,&amp;quot;134245529&amp;quot;:true,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:246,&amp;quot;335559739&amp;quot;:246,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Azure Databricks supports both workspace&amp;nbsp;types&amp;nbsp;so organizations can select what best matches their needs.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:210,&amp;quot;335559739&amp;quot;:210,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="11" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559683&amp;quot;:0,&amp;quot;335559684&amp;quot;:-2,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="1" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Use Serverless&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;when rapid workspace creation, minimal configuration, and reduced operational overhead are priorities.&amp;nbsp;It’s&amp;nbsp;the fastest path to a fully governed environment.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="11" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559683&amp;quot;:0,&amp;quot;335559684&amp;quot;:-2,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="2" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Use Classic&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;when you&amp;nbsp;require&amp;nbsp;a custom&amp;nbsp;VNet&amp;nbsp;design, specific network topologies,&amp;nbsp;finer&amp;nbsp;grain security controls,&amp;nbsp;or features that are not yet available in the&amp;nbsp;serverless model. Some organizations also simply prefer to manage Azure resources directly, making Classic workspaces a suitable&amp;nbsp;option.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:0,&amp;quot;335551620&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0,&amp;quot;335559740&amp;quot;:300}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Note that Azure Databricks serverless workspaces are only available in&amp;nbsp;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/resources/supported-regions" target="_blank" rel="noopener"&gt;regions that support serverless compute&lt;/A&gt;. To learn get started&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;create your Serverless workspace&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;&amp;nbsp;today!&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Mon, 02 Feb 2026 17:15:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/serverless-workspaces-are-generally-available-in-azure/ba-p/4491314</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2026-02-02T17:15:00Z</dc:date>
    </item>
    <item>
      <title>Serverless Workspaces are live in Azure Databricks</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/serverless-workspaces-are-live-in-azure-databricks/ba-p/4474712</link>
      <description>&lt;P&gt;As customers continue to deploy large-scale data and AI projects in Azure Databricks, the complexity of deploying and managing them increases. To help you with this, we’re excited to announce the public preview of Serverless workspaces in Azure Databricks. With Serverless workspaces, workspace creation is dramatically simplified: you can quickly create a workspace with serverless compute and default storage, and your teams can focus on gathering insights from the data instead of managing workspace infrastructure.&lt;/P&gt;
&lt;H2&gt;What’s new?&lt;/H2&gt;
&lt;P&gt;Azure Databricks now has Serverless workspaces in addition to Classic workspaces. Serverless workspaces are managed and run by the Azure Databricks service. You can still use Unity Catalog, identity federation, and your workspace settings, but without the effort of building and maintaining the underlying infrastructure. This means your teams get immediate access to the workspace, preventing delays in spinning up projects.&lt;/P&gt;
&lt;P&gt;In contrast, when deploying Classic workspaces you need to make all the decisions: how to set up the Virtual Network (VNet), how to deploy and manage compute, where data should land, and how to ensure inbound and outbound connectivity to the workspace is properly configured. While essential, this is not something that an end user of the workspace should need to worry about.&lt;/P&gt;
&lt;H2&gt;What are Serverless Workspaces?&lt;/H2&gt;
&lt;P&gt;A Serverless Workspace is one where the overhead of deploying and managing the infrastructure is removed from the end user. The workspace is ready to use right away once created. Unity Catalog and storage are set up automatically, so the same governance model stays intact without the setup overhead.&lt;/P&gt;
&lt;H2&gt;Why should I use Serverless Workspaces?&lt;/H2&gt;
&lt;P&gt;The main reason to use Serverless workspaces is to reduce the configuration and setup needed. These workspaces are managed for you, so they can be created almost instantaneously and handed to your teams to start working on projects.&lt;/P&gt;
&lt;P&gt;Serverless workspaces also provide features you can benefit from in the following areas:&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;&lt;STRONG&gt;Storage:&lt;/STRONG&gt; You can create managed catalogs, tables, and volumes without needing to bring your own Azure Blob storage account, provide storage credentials, or configure the locations up front since each Serverless Workspace provides fully managed object storage by default. Multi-key projection and no direct access to the object storage ensure only you can access your data.&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Compute: &lt;/STRONG&gt;You can run workloads without provisioning a cluster. Azure Databricks automatically deploys and manages the needed serverless compute efficiently. This means end users can focus on analyzing the data without needing to manage clusters.&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Network security:&lt;/STRONG&gt;&lt;STRONG&gt; &lt;/STRONG&gt;You no longer have to deploy NAT Gateways, Firewalls, or Private Link endpoints and instead define the serverless egress policies and &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/security/network/serverless-network-security/pl-to-internal-network" target="_blank"&gt;serverless Private Link rules&lt;/A&gt; that then are applied to all serverless workloads in the workspace &lt;A href="https://www.databricks.com/trust/security-features/serverless-egress-controls" target="_blank"&gt;serverless egress policies&lt;/A&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Unity Catalog data estate&lt;/STRONG&gt;: Because all of your existing governed data is available in the new workspace existing permissions allow access.&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Bring your own data when needed:&lt;/STRONG&gt; Similar to Classic Workspaces, you can connect to your existing Azure Blob Storage account through Unity Catalog credentials and external locations. As a result, lineage and permissions are consistent while avoiding duplication&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Cost management: &lt;/STRONG&gt;You can set attribution tags via budget policies on workloads to analyze spend without needing end users to tag jobs continually.&lt;/LI&gt;
&lt;/OL&gt;
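&lt;P&gt;To make the storage and compute points above concrete, here is a minimal, hedged sketch of a notebook cell in a serverless workspace. The catalog, schema, and table names are illustrative placeholders, not from this announcement; the point is that nothing below references a storage account, credential, external location, or cluster.&lt;/P&gt;
&lt;PRE&gt;# Hedged sketch: a notebook cell in a serverless workspace.
# All catalog/schema/table names are illustrative placeholders.
# Managed default storage means no storage account, credential, or
# external location is configured anywhere in this cell.

spark.sql("CREATE CATALOG IF NOT EXISTS field_ops")
spark.sql("CREATE SCHEMA IF NOT EXISTS field_ops.bronze")
spark.sql("""
    CREATE TABLE IF NOT EXISTS field_ops.bronze.sensor_readings (
        device_id STRING,
        reading DOUBLE,
        recorded_at TIMESTAMP
    )
""")

# The cell runs on serverless compute that Azure Databricks provisions
# automatically; there is no cluster to create, size, or shut down.
spark.sql("""
    INSERT INTO field_ops.bronze.sensor_readings
    VALUES ('dev-001', 21.7, current_timestamp())
""")
display(spark.table("field_ops.bronze.sensor_readings"))&lt;/PRE&gt;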
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;H2&gt;When should I use Serverless Workspaces?&lt;/H2&gt;
&lt;P&gt;Azure Databricks supports both Serverless and Classic workspaces.&lt;/P&gt;
&lt;P&gt;Choose Serverless when time to value matters: it is the fastest way to get a governed environment up and running with minimal configuration and management.&lt;/P&gt;
&lt;P&gt;Choose Classic when you need your own custom VNet design, specific network patterns, or other functionality not supported by serverless. Your organization may also prefer to manage the underlying Azure resources directly, in which case Classic workspaces are a good choice.&lt;/P&gt;
&lt;H2&gt;Things to note&lt;/H2&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Region availability:&lt;/STRONG&gt; Serverless workspaces are available only in &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/resources/supported-regions" target="_blank"&gt;Azure Databricks regions that support serverless compute&lt;/A&gt;.&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Feature surface:&lt;/STRONG&gt; A serverless workspace inherits serverless compute constraints (e.g., Python/SQL focus, unsupported legacy APIs). &lt;A class="lia-external-url" href="https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/limitations" target="_blank"&gt;Validate critical workloads&lt;/A&gt; before migrating your workspace.&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Billing for default storage is not yet enabled in Azure Databricks serverless workspaces&lt;/STRONG&gt;. During this time, Azure Databricks will not charge for default storage use in serverless workspaces. Customers will be notified 30 days before billing for default storage usage is enabled.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;For other considerations on Serverless Workspaces, see the &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/compute/serverless/limitations" target="_blank"&gt;Azure Databricks documentation&lt;/A&gt;.&lt;/P&gt;
&lt;P&gt;In summary, with Azure Databricks Serverless Workspaces, you can:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Create&lt;/STRONG&gt; new Azure Databricks workspaces &lt;STRONG&gt;quickly&lt;/STRONG&gt; without manual setup&lt;/LI&gt;
&lt;LI&gt;&lt;STRONG&gt;Ensure &lt;/STRONG&gt;consistent &lt;STRONG&gt;data governance&lt;/STRONG&gt; and security right from setup&lt;/LI&gt;
&lt;LI&gt;Support your analytics and AI projects with &lt;STRONG&gt;clear cost visibility&lt;/STRONG&gt; and built-in budget guardrails&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;You can learn how to create an Azure Databricks serverless workspace and get started today &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/admin/workspace/serverless-workspaces" target="_blank"&gt;here&lt;/A&gt;.&lt;/P&gt;</description>
      <pubDate>Tue, 02 Dec 2025 23:08:45 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/serverless-workspaces-are-live-in-azure-databricks/ba-p/4474712</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2025-12-02T23:08:45Z</dc:date>
    </item>
    <item>
      <title>Azure Databricks Genie integration with Copilot Studio and Microsoft Foundry is now live!</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/azure-databricks-genie-integration-with-copilot-studio-and/ba-p/4471087</link>
      <description>&lt;P&gt;&lt;EM&gt;&lt;SPAN data-contrast="none"&gt;This blog was co-authored by &lt;A class="lia-external-url" href="https://www.linkedin.com/in/toussaint-webb-b8b425173" target="_blank"&gt;Toussaint Webb&lt;/A&gt;, Databricks&lt;/SPAN&gt;&lt;/EM&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;We’re excited to announce the Public Preview availability of &lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/genie/" target="_blank"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;AI/BI Genie&amp;nbsp;&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;in Microsoft&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://www.microsoft.com/en/microsoft-copilot/microsoft-copilot-studio?market=af" target="_blank"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Copilot Studio&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;and&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://azure.microsoft.com/en-us/products/ai-foundry" target="_blank"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Microsoft Foundry&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;via MCP. This makes it easier than ever for organizations to unlock and scale the power of their Genie spaces across the Microsoft ecosystem (e.g., Teams), ultimately democratizing trusted data and insights to business users.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;AI/BI Genie opens the power of conversational analytics to everyone in the organization. A user can ask a question such as “What is my revenue growth this month?” and Genie interprets the intent, generates the appropriate query, and returns the data insight. Users can also review the underlying logic for transparency. By supporting iterative questioning, Genie enables users to investigate their data directly and build confidence in their understanding without requiring code or specialist intervention.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;The Challenge Before: Complex setup via Custom Code&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Previously, connecting Genie to the Microsoft ecosystem was challenging. Organizations had to develop custom connections to manage API flows, which added architectural overhead. This complexity limited organizations’ ability to distribute Genie’s trusted insights efficiently across Microsoft platforms.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;What’s Now Possible:&amp;nbsp; Unlock the value of Microsoft Ecosystem&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:120,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;The new integrations between Genie and Copilot Studio, as well as Genie and Microsoft Foundry, solve these challenges by providing easy and secure ways to connect each platform. Additionally, by leveraging MCP, updates to the underlying Genie APIs are seamlessly managed for users, eliminating the need to modify your integration.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:120,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Genie + Copilot Studio&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:120,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Connect a Genie space to a Copilot Studio Agent as a tool (over MCP)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="auto"&gt;Instantly make Genie’s trusted insights available in Teams or M365 Copilot by publishing Genie enabled Copilot Studio agents&amp;nbsp;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="auto"&gt;Genie’s complete context of a customer’s data estate will enable Copilot Studio agents to deliver richer, more accurate responses&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Genie + Microsoft Foundry&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:120,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Connect a Genie space to Foundry as a tool (over MCP)&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Instantly make Genie’s trusted insights available to developers building agents using Foundry&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;How It Works: Genie + Copilot Studio&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Create a connection to your Azure Databricks workspace in Power Apps, using OAuth or a Microsoft Entra ID Service Principal&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Open Copilot Studio&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Select an existing Copilot Studio&amp;nbsp; agent or create a new one&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Open ‘Tools’, click ‘+Add a tool’, and search for “Azure Databricks Genie” or find it within the Model Context Protocol section&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Select the Genie space to connect and configure the connection&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Note:&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;It’s important to give your Genie space a clear title and description so the Copilot Studio agent can effectively orchestrate requests.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559685&amp;quot;:720,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;After completing the steps above, you are ready to go and can publish the agent to channels, such as Microsoft Teams, to easily distribute the value of Genie to your organization.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;How It Works: Genie + Microsoft Foundry Portal &lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;OL&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Open Microsoft Foundry&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Go to the tool catalog within the Discover tab&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Select ‘Azure Databricks Genie’&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Configure the connection your Azure Databricks Genie space and click ‘Connect’&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;Click ‘Use in an agent’ and select the desired agent to connect your Genie space to&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Once completed, Genie is available for use in your Foundry agent!&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Try It Out: Get started with Genie in Copilot Studio and Microsoft Foundry&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;To get started with Genie + Copilot Studio, check out our&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/integrations/msft-power-platform" target="_blank"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;technical documentation&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;To get started with Genie + Microsoft Foundry, check out the&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/integrations/azure-ai-foundry" target="_blank"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;documentation&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;LI&gt;&lt;SPAN data-contrast="none"&gt;To learn more about the Generally Available Azure Databricks Power Platform connector explore this&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://www.databricks.com/blog/introducing-azure-databricks-power-platform-connector-unleashing-governed-real-time-data-power" target="_blank"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;blog&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:220,&amp;quot;335559739&amp;quot;:220}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;</description>
      <pubDate>Tue, 18 Nov 2025 18:48:18 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/azure-databricks-genie-integration-with-copilot-studio-and/ba-p/4471087</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2025-11-18T18:48:18Z</dc:date>
    </item>
    <item>
      <title>SAP Business Data Cloud Connect with Azure Databricks is now generally available</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/sap-business-data-cloud-connect-with-azure-databricks-is-now/ba-p/4459490</link>
      <description>&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;We are excited to share that&amp;nbsp;&lt;A class="lia-external-url" href="https://www.sap.com/products/data-cloud.html" target="_blank" rel="noopener"&gt;SAP Business Data Cloud (SAP BDC)&lt;/A&gt; Connect for &lt;A class="lia-external-url" href="https://azure.microsoft.com/en-us/products/databricks/" target="_blank" rel="noopener"&gt;Azure Databricks&lt;/A&gt; is&amp;nbsp;generally available. With this announcement, Azure Databricks customers&amp;nbsp;like you,&amp;nbsp;can connect&amp;nbsp;your SAP BDC&amp;nbsp;environment to&amp;nbsp;your&amp;nbsp;existing&amp;nbsp;Azure Databricks instance – without copying the data – to enable bi-directional, live data sharing.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Connecting SAP data with other enterprise data prevents governance risk, compliance gaps, and data silos. In addition, maintenance costs are also&amp;nbsp;reduced&amp;nbsp;and manual building of semantics is no longer needed.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;SAP data products can now be shared directly via Delta Sharing into your existing Azure Databricks instances ensuring complete context for your business. &lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;You can now unify your data estate across Azure Databricks and SAP BDC This makes it easier for you to:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="1" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="2" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Enforce governance&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI aria-setsize="-1" data-leveltext="" data-font="Symbol" data-listid="1" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" data-aria-posinset="3" data-aria-level="1"&gt;&lt;SPAN data-contrast="auto"&gt;Power&amp;nbsp;analytics, data warehousing, BI and AI&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Connecting SAP BDC to Azure Databricks is simple, secure, and fast. The connection is trusted and requires approval from both platforms to enable bi-directional sharing of data products. Once approved, data products in SAP BDC can be directly mounted into Azure Databricks Unity Catalog and are treated like other assets shared using Delta sharing. As a result, your teams can query, analyze, and gather insights on SAP data in addition to your existing business data in one unified way. Instead of spending time gathering the data in once place, your teams can instead focus on unlocking insights from this unified data quickly and securely.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;This launch complements SAP Databricks in SAP BDC running on Azure, which enables AI, ML, data engineering, and data warehousing capabilities directly inside your SAP environment. We have expanded the list of &lt;A class="lia-external-url" href="https://docs.databricks.com/sap/en/#supported-azure-regions" target="_blank" rel="noopener"&gt;supported regions&lt;/A&gt; for SAP Databricks on SAP BDC running on Azure.&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;To learn more about SAP BDC Connect with Azure Databricks, &lt;A class="lia-external-url" href="https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/" target="_blank" rel="noopener"&gt;review the documentation&lt;/A&gt;&amp;nbsp;and &lt;A class="lia-external-url" href="https://learn.microsoft.com/en-us/azure/databricks/delta-sharing/sap-bdc/create-connection" target="_blank" rel="noopener"&gt;get started today.&lt;/A&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 15 Oct 2025 16:00:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/sap-business-data-cloud-connect-with-azure-databricks-is-now/ba-p/4459490</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2025-10-15T16:00:00Z</dc:date>
    </item>
    <item>
      <title>General Availability: Automatic Identity Management (AIM) for Entra ID on Azure Databricks</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/general-availability-automatic-identity-management-aim-for-entra/ba-p/4452206</link>
      <description>&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;In February, we announced that Automatic Identity Management in public preview and loved to hear your overwhelmingly positive feedback. Prior to public preview, you either had to set up an Entra Enterprise Application or involve an Azure Databricks account admin to import the appropriate groups. This required manual steps whether it was adding or removing users with organizational changes, maintaining scripts, or requiring additional Entra or SCIM configuration. Identity management was thus cumbersome and required management overhead.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Today, we are excited to announce that Automatic Identity management (AIM) for Entra ID on Azure Databricks is &lt;STRONG&gt;generally available&lt;/STRONG&gt;. This means no manual user setup is needed and you can instantly add users to your workspace(s). Users, groups, and service principals from Microsoft Entra ID are automatically available within Azure Databricks, including support for nested groups and dashboards.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;This native integration is one of the many reasons &lt;/SPAN&gt;&lt;A href="https://azure.microsoft.com/en-us/blog/databricks-runs-best-on-azure/" target="_blank"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Databricks runs best on Azure.&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Here are some addition ways AIM could benefit you and your organization:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Seamlessly share dashboards&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;You can share AI/BI dashboards with any user, service principal, or group in Microsoft Entra ID immediately as these users are automatically added to the Azure Databricks account upon login. Members of Microsoft Entra ID who do not have access to the workspace are granted access to a view-only copy of a dashboard published with embedded credentials. This enables you to share dashboards with users outside your organization, too. To learn more, see&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/dashboards/share" target="_blank"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;s&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;hare a dashboard&lt;/SPAN&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;.&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233118&amp;quot;:false,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Updated defaults for new accounts&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335559685&amp;quot;:0,&amp;quot;335559737&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;All new Azure Databricks accounts have AIM enabled – no opt in or additional configuration required. &amp;nbsp;For existing accounts, you can enable AIM with a single click in the Account Admin Console. Soon, we will also make this the default for existing accounts.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335559685&amp;quot;:0,&amp;quot;335559737&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Automation at scale enabled via APIs&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335559685&amp;quot;:0,&amp;quot;335559737&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;You can also register users, groups, or service principles in Microsoft Entra ID via APIs. Being able to do this programmatically enables the enterprise scale most of our customers need. You can also enable automation via scripts leveraging these APIs.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;201341983&amp;quot;:0,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335559685&amp;quot;:0,&amp;quot;335559737&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:160,&amp;quot;335559740&amp;quot;:279}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Read the Databricks blog &lt;/SPAN&gt;&lt;A href="https://www.databricks.com/blog/automatic-identity-management-entra-id-now-generally-available-azure-databricks" target="_blank"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;here&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; and get started via &lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/admin/users-groups/automatic-identity-management" target="_blank"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;documentation&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; today!&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 10 Sep 2025 16:00:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/general-availability-automatic-identity-management-aim-for-entra/ba-p/4452206</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2025-09-10T16:00:00Z</dc:date>
    </item>
    <item>
      <title>Closing the loop: Interactive write-back from Power BI to Azure Databricks</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/closing-the-loop-interactive-write-back-from-power-bi-to-azure/ba-p/4442999</link>
      <description>&lt;P&gt;&lt;EM&gt;This is a collaborative post from Microsoft and Databricks. We thank&amp;nbsp;&lt;A href="https://www.linkedin.com/in/toussaint-webb-b8b425173/" target="_blank" rel="noopener" data-stringify-link="https://www.linkedin.com/in/toussaint-webb-b8b425173/" data-sk="tooltip_parent"&gt;Toussaint Webb&lt;/A&gt;, Product Manager at Databricks, for his contributions.&lt;/EM&gt;&lt;BR /&gt;&lt;BR /&gt;We're excited to announce that the &lt;A href="https://www.databricks.com/product/azure" target="_blank" rel="noopener"&gt;Azure Databricks&lt;/A&gt; connector for &lt;A href="https://www.microsoft.com/en-us/power-platform" target="_blank" rel="noopener"&gt;Power Platform&lt;/A&gt; is now Generally Available. With this integration, organizations can seamlessly build Power Apps, Power Automate flows, and Copilot Studio agents with secure, governed data and no data duplication.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;A key functionality unlocked by this connector is the ability to write data back from Power BI to Azure Databricks. Many organizations want to not only analyze data but also act on insights quickly and efficiently. Power BI users, in particular, have been seeking a straightforward way to “close the loop” by writing data back from Power BI into Azure Databricks. This capability is now here: real-time updates and streamlined operational workflows with the new &lt;A href="https://www.databricks.com/blog/introducing-azure-databricks-power-platform-connector-unleashing-governed-real-time-data-power" target="_blank" rel="noopener"&gt;Azure Databricks connector for Power Platform&lt;/A&gt;.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;With this connector, users can now read from and write to Azure Databricks data warehouses in real time, all from within familiar interfaces — no custom connectors, no data duplication, and no loss of governance.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;How It Works: Write-backs from Power BI through Power Apps&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Enabling write-backs from Power BI to Azure Databricks is seamless. Follow these steps:&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;Open Power Apps and create a connection to Azure Databricks (&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/integrations/msft-power-platform" target="_blank" rel="noopener"&gt;documentation&lt;/A&gt;).&lt;/LI&gt;
&lt;LI aria-level="1"&gt;In Power BI (desktop or service), add a Power Apps visual to your report (purple Power Apps icon).&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Add data to connect to your Power App via the visualization pane.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Create a new Power App directly from the Power BI interface, or choose an existing app to embed.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Start writing records to Azure Databricks!&lt;/LI&gt;
&lt;/UL&gt;
&lt;img /&gt;
&lt;P&gt;With this integration, users can make real-time updates directly within Power BI using the embedded Power App, instantly writing changes back to Azure Databricks. Think of all the workflows that this can unlock, such as warehouse managers monitoring performance and flagging issues on the spot, or store owners reviewing and adjusting inventory levels as needed. The seamless connection between Azure Databricks, Power Apps, and Power BI lets you close the loop on critical processes by uniting reporting and action in one place.&lt;/P&gt;
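&lt;P&gt;Under the covers, each write-back amounts to a small INSERT against a Databricks SQL warehouse; the connector and the embedded Power App handle this for you. For readers who want to see the conceptual equivalent, here is a hedged Python sketch using the open-source databricks-sql-connector. The hostname, HTTP path, token, and table name are placeholders.&lt;/P&gt;
&lt;PRE&gt;from databricks import sql

# Hedged sketch of what a single write-back conceptually performs.
# server_hostname, http_path, access_token, and the table are placeholders.
with sql.connect(
    server_hostname="adb-1234567890123456.7.azuredatabricks.net",
    http_path="/sql/1.0/warehouses/abcdef1234567890",
    access_token="dapi-example-token",
) as connection:
    with connection.cursor() as cursor:
        cursor.execute(
            "INSERT INTO main.operations.inventory_adjustments "
            "VALUES ('store-042', 'SKU-1187', 12, current_timestamp())"
        )&lt;/PRE&gt;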
&lt;P&gt;&lt;BR /&gt;&lt;STRONG&gt;T&lt;SPAN data-olk-copy-source="MessageBody"&gt;ry It Out: Get started with Azure Databricks Power Platform Connector&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;The Power Platform Connector is now Generally Available for all Azure Databricks customers. Explore more in the deep dive blog &lt;A href="https://www.databricks.com/blog/introducing-azure-databricks-power-platform-connector-unleashing-governed-real-time-data-power" target="_blank" rel="noopener"&gt;here&lt;/A&gt; and to get started, check out our &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/integrations/msft-power-platform" target="_blank" rel="noopener"&gt;technical documentation&lt;/A&gt;. Coming soon we will add the ability to execute existing Azure Databricks Jobs via Power Automate.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;If your organization is looking for an even more customizable end-to-end solution, check out &lt;A href="https://www.databricks.com/product/databricks-apps" target="_blank" rel="noopener"&gt;Databricks Apps&lt;/A&gt; in Azure Databricks!&amp;nbsp; No extra services or licenses required.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Fri, 15 Aug 2025 00:12:37 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/closing-the-loop-interactive-write-back-from-power-bi-to-azure/ba-p/4442999</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2025-08-15T00:12:37Z</dc:date>
    </item>
    <item>
      <title>Announcing the Azure Databricks connector in Power Platform</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/announcing-the-azure-databricks-connector-in-power-platform/ba-p/4422647</link>
      <description>&lt;img /&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;We are ecstatic to announce the public preview of the&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;Azure Databricks Connector for Power Platform. This native connector is specifically for Power Apps, Power Automation, and Copilot Studio within Power Platform and enables seamless, single click connection.&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;With this connector, your organization can build data-driven, intelligent conversational experiences that leverage the full power of your data within Azure Databricks without any additional custom configuration or scripting – it's all fully built in!&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;The Azure Databricks connector in power platform enables you to:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559685&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="7" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559683&amp;quot;:0,&amp;quot;335559684&amp;quot;:-2,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="1" data-aria-level="1"&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Maintain governance&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-contrast="none"&gt;: All access controls for data you set up in Azure Databricks are maintained in Power Platform&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="7" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559683&amp;quot;:0,&amp;quot;335559684&amp;quot;:-2,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="2" data-aria-level="1"&gt;&lt;SPAN data-contrast="none"&gt;&lt;STRONG&gt;Prevent data copy&lt;/STRONG&gt;:&lt;/SPAN&gt;&lt;SPAN data-contrast="none"&gt; Read and write to your data without data duplication&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="7" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559683&amp;quot;:0,&amp;quot;335559684&amp;quot;:-2,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="3" data-aria-level="1"&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Secure your connection&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-contrast="none"&gt;: Connect Azure Databricks to Power Platform using Microsoft Entra user-based OAuth or service principals&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="7" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559683&amp;quot;:0,&amp;quot;335559684&amp;quot;:-2,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="4" data-aria-level="1"&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Have&amp;nbsp;real time updates&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-contrast="none"&gt;: Read and write data and see updates in Azure Databricks in near real time&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="7" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559683&amp;quot;:0,&amp;quot;335559684&amp;quot;:-2,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="5" data-aria-level="1"&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Build agents with contex&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-contrast="none"&gt;&lt;STRONG&gt;t&lt;/STRONG&gt;: Build agents with Azure Databricks as grounding knowledge&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt; with all the context of your data&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Instead of spending time copying or moving data and building custom connections which require additional manual maintenance, you can&amp;nbsp;now seamlessly connect and focus on what matters – getting rich insights from your data – without worrying about security or governance.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Let’s see how this connector can be beneficial across Power Apps, Power Automate, and Copilot Studio:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Azure Databricks Connector for Power Apps&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-contrast="none"&gt; – You can seamlessly connect to Azure Databricks from Power Apps to enable read/write access to your data directly within canvas apps enabling your organization to build data-driven experiences in real time. For example, our retail customers are using this connector to visualize different placements of items within the store and how they impact revenue.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559685&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Azure Databricks Connector for Power Automate&lt;/SPAN&gt;&lt;/STRONG&gt; &lt;SPAN data-contrast="none"&gt;– You can&amp;nbsp;execute SQL commands against your data within Azure Databricks with the rich context of your business use case. For example, one of our global retail customers is using automated workflows to track safety incidents, which plays a crucial role in keeping employees safe.&lt;/SPAN&gt; &lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559685&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="none"&gt;Azure Databricks as a Knowledge Source in Copilot Studio&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-contrast="none"&gt; – You can add Azure Databricks as a primary knowledge source for your agents, enabling them to understand, reason over, and respond to user prompts based on data from Azure Databricks.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559685&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;To get started, all you need to do in Power Apps or Power Automate is add a new connection – that's how simple it is! &amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559685&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;BR /&gt;&lt;BR /&gt;&lt;/SPAN&gt;&lt;/P&gt;
&lt;div data-video-id="https://www.youtube.com/watch?v=-_KoJ67hecA/1749621189101" data-video-remote-vid="https://www.youtube.com/watch?v=-_KoJ67hecA/1749621189101" class="lia-video-container lia-media-is-center lia-media-size-large"&gt;&lt;iframe src="https://cdn.embedly.com/widgets/media.html?src=https%3A%2F%2Fwww.youtube.com%2Fembed%2F-_KoJ67hecA%3Fstart%3D152%26feature%3Doembed&amp;amp;display_name=YouTube&amp;amp;url=https%3A%2F%2Fwww.youtube.com%2Fwatch%3Fv%3D-_KoJ67hecA&amp;amp;image=https%3A%2F%2Fi.ytimg.com%2Fvi%2F-_KoJ67hecA%2Fhqdefault.jpg&amp;amp;type=text%2Fhtml&amp;amp;schema=youtube" allowfullscreen="" style="max-width: 100%"&gt;&lt;/iframe&gt;&lt;/div&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559685&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="none"&gt;Check out our demo &lt;/SPAN&gt;&lt;A href="https://www.youtube.com/watch?v=-_KoJ67hecA&amp;amp;t=152s" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;here&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="none"&gt; and get started using our &lt;/SPAN&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/databricks/integrations/msft-power-platform" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;documentation&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; today! This connector is available in all public cloud regions. You can also learn more about customer use cases in this &lt;/SPAN&gt;&lt;A href="http://www.databricks.com/blog/introducing-azure-databricks-power-platform-connector-unleashing-governed-real-time-data-power" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;blog&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;134233117&amp;quot;:false,&amp;quot;134233118&amp;quot;:false,&amp;quot;335551550&amp;quot;:1,&amp;quot;335551620&amp;quot;:1,&amp;quot;335557856&amp;quot;:16777215,&amp;quot;335559685&amp;quot;:0,&amp;quot;335559738&amp;quot;:0,&amp;quot;335559739&amp;quot;:0}"&gt; You can also review the connector reference &lt;A class="lia-external-url" href="https://learn.microsoft.com/en-us/connectors/databricks/" target="_blank" rel="noopener"&gt;here&lt;/A&gt;&lt;/SPAN&gt;&lt;/P&gt;</description>
      <pubDate>Wed, 11 Jun 2025 13:00:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/announcing-the-azure-databricks-connector-in-power-platform/ba-p/4422647</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2025-06-11T13:00:00Z</dc:date>
    </item>
    <item>
      <title>Announcing the availability of Azure Databricks connector in Azure AI Foundry</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/announcing-the-availability-of-azure-databricks-connector-in/ba-p/4415263</link>
      <description>&lt;img /&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;At Microsoft, Databricks&amp;nbsp;&lt;/SPAN&gt;&lt;A href="https://www.databricks.com/product/data-intelligence-platform" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Data Intelligence Platform&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt; is available as a fully managed, native, first party Data and AI solution called &lt;/SPAN&gt;&lt;A href="https://azure.microsoft.com/en-us/products/databricks/" target="_blank" rel="noopener"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;Azure Databricks. &lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;This makes Azure the optimal cloud for running Databricks workloads. Because of our unique partnership, we can bring you seamless integrations leveraging the power of the entire Microsoft ecosystem to do more with your data.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Azure AI Foundry is an integrated platform for Developers and IT Administrators to design, customize, and manage AI applications and agents.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Today we are excited to announce the public preview of the Azure Databricks connector in Azure AI Foundry.&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt; With this launch you can build enterprise-grade AI agents that reason over real-time Azure Databricks data while being governed by Unity Catalog. These agents will also be enriched by the responsible AI capabilities of Azure AI Foundry.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt;Here are a few ways this can benefit you and your organization:&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI data-leveltext="" data-font="Symbol" data-listid="1" data-list-defn-props="{&amp;quot;335552541&amp;quot;:1,&amp;quot;335559685&amp;quot;:720,&amp;quot;335559991&amp;quot;:360,&amp;quot;469769226&amp;quot;:&amp;quot;Symbol&amp;quot;,&amp;quot;469769242&amp;quot;:[8226],&amp;quot;469777803&amp;quot;:&amp;quot;left&amp;quot;,&amp;quot;469777804&amp;quot;:&amp;quot;&amp;quot;,&amp;quot;469777815&amp;quot;:&amp;quot;hybridMultilevel&amp;quot;}" aria-setsize="-1" data-aria-posinset="1" data-aria-level="1"&gt;&lt;STRONG&gt;&lt;SPAN data-contrast="auto"&gt;Native Integration&lt;/SPAN&gt;&lt;/STRONG&gt;&lt;SPAN data-contrast="auto"&gt;: Connect to Azure Databricks AI/BI Genie from Azure AI Foundry&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559739&amp;quot;:0}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/LI&gt;
&lt;/UL&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Contextual Answers&lt;/STRONG&gt;: Genie agents provide answers grounded in your unique data&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Supports Various LLMs&lt;/STRONG&gt;: Secure, authenticated data access&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Streamlined Process&lt;/STRONG&gt;: Real-time data insights within GenAI apps&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Seamless Integration&lt;/STRONG&gt;: Simplifies AI agent management with data governance&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Multi-Agent workflows&lt;/STRONG&gt;: Leverages Azure AI agents and Genie Spaces for faster insights&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Enhanced Collaboration&lt;/STRONG&gt;: Boosts productivity between business and technical users&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;To further democratize the use of data for those in your organization who aren't directly interacting with Azure Databricks, you can also take it one step further with Microsoft Teams and AI/BI Genie. AI/BI Genie enables you to get deep insights from your data using natural language without needing to access Azure Databricks.&lt;/P&gt;
&lt;P&gt;Here is an example of what an agent built in AI Foundry, using data from Azure Databricks and surfaced in Microsoft Teams, looks like:&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;We'd love to hear your feedback as you use the Azure Databricks connector in AI Foundry. Try it out today – to help you get started, we’ve put together some samples &lt;A href="https://github.com/Azure-Samples/AI-Foundry-Connections" target="_blank" rel="noopener"&gt;here&lt;/A&gt;. Read more on the &lt;A class="lia-external-url" href="http://databricks.com/blog/announcing-azure-databricks-native-connector-azure-ai-foundry" target="_blank"&gt;Databricks blog&lt;/A&gt;, too.&lt;/P&gt;</description>
      <pubDate>Mon, 19 May 2025 16:00:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/announcing-the-availability-of-azure-databricks-connector-in/ba-p/4415263</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2025-05-19T16:00:00Z</dc:date>
    </item>
    <item>
      <title>Announcing the availability of Azure Databricks connector in Azure AI Foundry</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/announcing-the-availability-of-azure-databricks-connector-in/ba-p/4414889</link>
      <description>&lt;img /&gt;
&lt;P&gt;&lt;BR /&gt;At Microsoft, the Databricks &lt;A href="https://www.databricks.com/product/data-intelligence-platform" target="_blank" rel="noopener"&gt;Data Intelligence Platform&lt;/A&gt; is available as a fully managed, native, first-party data and AI solution called &lt;A href="https://azure.microsoft.com/en-us/products/databricks/" target="_blank" rel="noopener"&gt;Azure Databricks&lt;/A&gt;. This makes Azure the optimal cloud for running Databricks workloads. Because of our unique partnership, we can bring you seamless integrations leveraging the power of the entire Microsoft ecosystem to do more with your data.&lt;/P&gt;
&lt;P&gt;Azure AI Foundry is an integrated platform for Developers and IT Administrators to design, customize, and manage AI applications and agents.&lt;/P&gt;
&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;Today we are excited to announce the public preview of the Azure Databricks connector in Azure AI Foundry.&lt;/SPAN&gt;&lt;SPAN data-contrast="auto"&gt; With this launch you can build enterprise-grade AI agents that reason over real-time Azure Databricks data while being governed by Unity Catalog. These agents will also be enriched by the responsible AI capabilities of Azure AI Foundry.&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{&amp;quot;335559738&amp;quot;:240,&amp;quot;335559739&amp;quot;:240}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&lt;BR /&gt;Here are a few ways this seamless integration can benefit you and your organization:&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Native Integration&lt;/STRONG&gt;: Connect to Azure Databricks AI/BI Genie from Azure AI Foundry&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Contextual Answers&lt;/STRONG&gt;: Genie agents provide answers grounded in your unique data&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Supports Various LLMs&lt;/STRONG&gt;: Secure, authenticated data access&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Streamlined Process&lt;/STRONG&gt;: Real-time data insights within GenAI apps&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Seamless Integration&lt;/STRONG&gt;: Simplifies AI agent management with data governance&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Multi-Agent workflows&lt;/STRONG&gt;: Leverages Azure AI agents and Genie Spaces for faster insights&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Enhanced Collaboration&lt;/STRONG&gt;: Boosts productivity between business and technical users&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;To further democratize the use of data for those in your organization who aren't directly interacting with Azure Databricks, you can also take it one step further with Microsoft Teams and AI/BI Genie. AI/BI Genie enables you to get deep insights from your data using natural language without needing to access Azure Databricks.&lt;BR /&gt;&lt;BR /&gt;Here is an example of what an agent built in AI Foundry, using data from Azure Databricks and surfaced in Microsoft Teams, looks like:&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;We'd love to hear your feedback as you use the Azure Databricks connector in AI Foundry. Try it out today – to help you get started, we’ve put together some samples &lt;A href="https://github.com/Azure-Samples/AI-Foundry-Connections" target="_blank" rel="noopener"&gt;here&lt;/A&gt;.&lt;/P&gt;</description>
      <pubDate>Mon, 19 May 2025 16:00:00 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/announcing-the-availability-of-azure-databricks-connector-in/ba-p/4414889</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2025-05-19T16:00:00Z</dc:date>
    </item>
    <item>
      <title>Llama 4 is now available in Azure Databricks</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/llama-4-is-now-available-in-azure-databricks/ba-p/4401277</link>
      <description>&lt;img /&gt;
&lt;P&gt;We are excited to announce the availability of Meta's Llama 4 in Azure Databricks.&lt;/P&gt;
&lt;P&gt;As you know, enterprises all over the world already use Llama models in Azure Databricks to power AI enterprise agents, workflows, and applications. Now with &lt;STRONG&gt;Llama 4 and Azure Databricks&lt;/STRONG&gt;, you can get higher quality, faster inference, and lower cost than previous models.&lt;BR /&gt;&lt;BR /&gt;Llama 4 Maverick, the highest-quality and largest Llama model from today's announcement, is built for developers building the next generation of AI products that combine multilingual fluency, image understanding precision, and security. With &lt;STRONG&gt;Maverick on Azure Databricks&lt;/STRONG&gt;, you can:&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;Build domain-specific AI agents with your data&lt;/LI&gt;
&lt;LI&gt;Run scalable inference with your data pipeline&lt;/LI&gt;
&lt;LI&gt;Fine-tune for accuracy and alignment&lt;/LI&gt;
&lt;LI&gt;Govern AI usage with Mosaic AI Gateway&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;The Azure Databricks Data Intelligence Platform makes it easy for you to securely connect Llama 4 to your enterprise data using Unity Catalog governed tools to build agents with contextual awareness.&lt;/P&gt;
&lt;P&gt;Enterprise data needs enterprise scale, whether you are summarizing documents or analyzing support tickets, but without the infrastructure overhead. With Azure Databricks Workflows and Llama 4, you can use SQL or Python to run LLMs at scale without managing that infrastructure yourself.&lt;/P&gt;
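&lt;P&gt;To make that concrete, here is a minimal sketch of batch inference from a Databricks notebook using the ai_query SQL function. The catalog, schema, table, and column names are placeholders, and the serving endpoint name for Llama 4 Maverick is an assumption - check the supported models page in your workspace for the exact endpoint name.&lt;/P&gt;
&lt;PRE&gt;# Minimal sketch: summarize support tickets with Llama 4 via ai_query (Databricks notebook).
# Assumptions: a Unity Catalog table main.support.tickets with ticket_id and ticket_text columns,
# and a serving endpoint named databricks-llama-4-maverick (verify the name in your workspace).
summaries = spark.sql("""
  SELECT
    ticket_id,
    ai_query(
      'databricks-llama-4-maverick',
      CONCAT('Summarize this support ticket in one sentence: ', ticket_text)
    ) AS summary
  FROM main.support.tickets
""")

# Persist the results back to Unity Catalog for downstream analytics.
summaries.write.mode("overwrite").saveAsTable("main.support.ticket_summaries")&lt;/PRE&gt;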
&lt;P&gt;You can tune Llama 4 to your custom use case for accuracy and alignment such as assistant behavior or summarization.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;All of this comes with built-in security controls and compliant model usage via the Azure Databricks Mosaic AI Gateway, including PII detection, logging, and policy guardrails.&lt;/P&gt;
&lt;P&gt;&lt;BR /&gt;Llama 4 is available now in Azure Databricks. More models will become available in phases. &lt;A class="lia-external-url" href="https://www.databricks.com/blog/introducing-metas-llama-4-databricks-data-intelligence-platform" target="_blank" rel="noopener"&gt;Llama 4 Scout is coming soon&lt;/A&gt; and you'll be able to pick the model that fits your workload best. Learn more about Llama 4 and supported models in Azure Databricks &lt;A class="lia-external-url" href="https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/supported-models#meta-llama-4-maverick" target="_blank" rel="noopener"&gt;here&lt;/A&gt; and &lt;A class="lia-external-url" href="https://portal.azure.com/#create/Microsoft.Databricks" target="_blank" rel="noopener"&gt;get started today&lt;/A&gt;.&lt;/P&gt;
</description>
      <pubDate>Mon, 07 Apr 2025 05:18:30 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/llama-4-is-now-available-in-azure-databricks/ba-p/4401277</guid>
      <dc:creator>AnaviNahar</dc:creator>
      <dc:date>2025-04-07T05:18:30Z</dc:date>
    </item>
    <item>
      <title>Power BI &amp; Azure Databricks: Smarter Refreshes, Less Hassle</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/power-bi-azure-databricks-smarter-refreshes-less-hassle/ba-p/4398388</link>
      <description>&lt;img /&gt;
&lt;P&gt;We are excited to extend the deep integration between Azure Databricks and &lt;A href="https://www.microsoft.com/en-us/power-platform/products/power-bi" target="_blank" rel="noopener"&gt;Microsoft Power BI&lt;/A&gt; with the Public Preview of the Power BI task type in &lt;A href="https://www.databricks.com/product/data-engineering/workflows" target="_blank" rel="noopener"&gt;Azure Databricks Workflows&lt;/A&gt;. This new capability allows users to update and refresh Power BI &lt;A href="https://learn.microsoft.com/en-us/power-bi/connect-data/service-datasets-understand" target="_blank" rel="noopener"&gt;semantic models&lt;/A&gt; directly from their Azure Databricks workflows, ensuring real-time data updates for reports and dashboards. By leveraging orchestration and triggers within Azure Databricks Workflows, organizations can improve efficiency, reduce refresh costs, and enhance data accuracy for Power BI users.&lt;/P&gt;
&lt;P&gt;Power BI tasks seamlessly integrate with &lt;A href="https://www.databricks.com/product/unity-catalog" target="_blank" rel="noopener"&gt;Unity Catalog&lt;/A&gt;&lt;A href="https://www.databricks.com/product/unity-catalog" target="_blank" rel="noopener"&gt; in Azure Databricks&lt;/A&gt;, enabling automated updates to tables, views, materialized views, and streaming tables across multiple schemas and catalogs. With support for Import, DirectQuery, and Dual Storage modes, Power BI tasks provide flexibility in managing performance and security. This direct integration eliminates manual processes, ensuring Power BI models stay synchronized with underlying data without requiring context switching between platforms.&lt;/P&gt;
&lt;P&gt;Built into Azure Databricks Lakeflow, Power BI tasks benefit from enterprise-grade orchestration and monitoring, including task dependencies, scheduling, retries, and notifications. This streamlines workflows and improves governance by utilizing Microsoft Entra ID authentication and the Unity Catalog suite of security and governance offerings.&lt;/P&gt;
&lt;img /&gt;
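&lt;P&gt;For context, the refresh a Power BI task performs corresponds to the Power BI REST API refresh operation on a semantic model. The sketch below is not the new task type itself - it is a minimal example of calling that REST operation directly from Python, with the workspace ID, dataset ID, and Microsoft Entra ID access token left as placeholders you supply yourself.&lt;/P&gt;
&lt;PRE&gt;# Minimal sketch: queue a Power BI semantic model refresh via the Power BI REST API.
# Assumptions: workspace_id, dataset_id, and a valid Microsoft Entra ID access token
# with Power BI API permissions are supplied by you.
import requests

workspace_id = "your-power-bi-workspace-id"
dataset_id = "your-semantic-model-id"
access_token = "your-entra-id-access-token"

url = (
    "https://api.powerbi.com/v1.0/myorg/groups/"
    f"{workspace_id}/datasets/{dataset_id}/refreshes"
)
response = requests.post(url, headers={"Authorization": f"Bearer {access_token}"})

# A 202 Accepted response means the refresh was queued successfully.
response.raise_for_status()
print("Refresh queued:", response.status_code)&lt;/PRE&gt;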
&lt;P&gt;We invite you to explore the new Power BI tasks today and experience seamless data integration—get started by visiting the [ADB Power BI task documentation].&lt;/P&gt;</description>
      <pubDate>Mon, 31 Mar 2025 20:59:40 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/power-bi-azure-databricks-smarter-refreshes-less-hassle/ba-p/4398388</guid>
      <dc:creator>LindseyAllen</dc:creator>
      <dc:date>2025-03-31T20:59:40Z</dc:date>
    </item>
    <item>
      <title>Anthropic State-of-the-Art Models Available to Azure Databricks Customers</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/anthropic-state-of-the-art-models-available-to-azure-databricks/ba-p/4397140</link>
      <description>&lt;img /&gt;
&lt;P&gt;Our customers now have greater model choice with the arrival of Anthropic Claude 3.7 Sonnet in Azure Databricks. Databricks is announcing a partnership with Anthropic to integrate their state-of-the-art models into the Databricks Data Intelligence Platform as a native offering, starting with Claude 3.7 Sonnet (see the &lt;A href="http://databricks.com/blog/anthropic-claude-37-sonnet-now-natively-available-databricks" target="_blank" rel="noopener"&gt;Databricks announcement&lt;/A&gt;). With this announcement, Azure customers can use Claude models directly in Azure Databricks; see the &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/machine-learning/foundation-model-apis/api-reference" target="_blank" rel="noopener"&gt;Foundation model REST API reference&lt;/A&gt; for details.&lt;/P&gt;
&lt;P&gt;With Anthropic models available in Azure Databricks, customers can use the Claude "think" tool with prompts optimized for their business data to guide Claude through complex tasks efficiently. With Claude models in Azure Databricks, enterprises can deliver domain-specific, high-quality AI agents more efficiently. As an integrated component of the Azure Databricks Data Intelligence Platform, Anthropic Claude models benefit from comprehensive end-to-end governance and monitoring throughout the entire data and AI lifecycle with Unity Catalog.&lt;/P&gt;
&lt;P&gt;With Claude models, we remain committed to providing customers with model flexibility. Through the Azure Databricks Data Intelligence Platform, customers can securely connect to any model provider and select the most suitable model for their needs. They can further enhance these models with enterprise data to develop domain-specific, high-quality AI agents, supported by built-in custom evaluation and governance across both data and models.&lt;/P&gt;
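&lt;P&gt;As a minimal sketch of what querying Claude from Azure Databricks can look like, the example below uses the OpenAI-compatible Python client against a workspace serving endpoint. The workspace URL, token, and endpoint name are placeholders and assumptions - confirm the exact endpoint name under Serving in your workspace.&lt;/P&gt;
&lt;PRE&gt;# Minimal sketch: call Claude 3.7 Sonnet on Azure Databricks through the
# OpenAI-compatible serving endpoint. Workspace URL, token, and endpoint name
# are assumptions - check the Serving page in your workspace for exact values.
from openai import OpenAI

client = OpenAI(
    api_key="your-databricks-personal-access-token",
    base_url="https://adb-1234567890123456.7.azuredatabricks.net/serving-endpoints",
)

response = client.chat.completions.create(
    model="databricks-claude-3-7-sonnet",  # assumed Foundation Model endpoint name
    messages=[{"role": "user", "content": "Summarize last quarter's support themes."}],
    max_tokens=256,
)
print(response.choices[0].message.content)&lt;/PRE&gt;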
&lt;img /&gt;
</description>
      <pubDate>Thu, 27 Mar 2025 02:27:17 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/anthropic-state-of-the-art-models-available-to-azure-databricks/ba-p/4397140</guid>
      <dc:creator>LindseyAllen</dc:creator>
      <dc:date>2025-03-27T02:27:17Z</dc:date>
    </item>
    <item>
      <title>Part 2: Performance Configurations for Connecting PBI to a Private Link ADB Workspace</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/part-2-performance-configurations-for-connecting-pbi-to-a/ba-p/4396060</link>
<description>
&lt;P&gt;This blog was written in conjunction with &lt;A class="lia-external-url" href="https://www.linkedin.com/in/leoafurlongiv/" target="_blank" rel="noopener"&gt;Leo Furlong&lt;/A&gt;, Lead Solutions Architect at Databricks.&lt;/P&gt;
&lt;P&gt;In &lt;A href="https://techcommunity.microsoft.com/blog/analyticsonazure/part-1-power-bi-service-connections-to-azure-databricks-with-private-networking/4384391" target="_blank" rel="noopener"&gt;Part 1&lt;/A&gt;, we discussed networking options for connecting Power BI to an Azure Databricks workspace with a Public Endpoint protected with a workspace IP Access List.&amp;nbsp; &amp;nbsp;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In Part 2, we continue our discussion and elaborate on private networking options for an &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link" target="_blank" rel="noopener"&gt;Azure Databricks Private Link&lt;/A&gt; workspace. When using Azure Databricks Private Link with &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard#create-the-workspace-and-private-endpoints-in-the-azure-portal-ui" target="_blank" rel="noopener"&gt;Allow Public Network Access&lt;/A&gt; setting set to Disabled, all connections to the workspace must go through Private Endpoints.&amp;nbsp; For one of the private networking options, we’ll also discuss how to configure your On-Premise Data Gateway VM to get good performance.&lt;/P&gt;
&lt;H2&gt;Connecting Power BI to a Private Link Azure Databricks Workspace&lt;/H2&gt;
&lt;P&gt;As covered in &lt;A href="https://techcommunity.microsoft.com/blog/analyticsonazure/part-1-power-bi-service-connections-to-azure-databricks-with-private-networking/4384391" target="_blank" rel="noopener"&gt;Part 1&lt;/A&gt;, Power BI offers two primary methods for secure connections to data sources with private networking:&lt;/P&gt;
&lt;P&gt;1. &lt;A href="https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-onprem" target="_blank" rel="noopener"&gt;On-premises data gateway&lt;/A&gt;: An application that gets installed on a Virtual Machine that has a direct networking connection to a data source. It allows Power BI to connect to data sources that don’t allow public connections. The &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link-standard#diagrams" target="_blank" rel="noopener"&gt;general flow&lt;/A&gt; of this setup entails:&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; a. Create or leverage a set of Private Endpoints to the Azure Databricks workspace - both sub-resources for databricks_ui_api and browser_authentication are required&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; b. Create or leverage a Private DNS Zone for privatelink.azuredatabricks.net&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; c. Deploy an Azure VM into a VNet/subnet&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; d. The VM’s VNet/subnet should have access to the Private Endpoints (PEs), either by being in the same VNet or by being peered with the VNet where they reside&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; e. &lt;A class="lia-external-url" href="https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-install" target="_blank" rel="noopener"&gt;Install&lt;/A&gt; and &lt;A href="https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-app" target="_blank" rel="noopener"&gt;configure&lt;/A&gt; the on-premise data gateway software on the VM&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; f. Create a &lt;A href="https://learn.microsoft.com/en-us/power-bi/connect-data/service-gateway-data-sources" target="_blank" rel="noopener"&gt;connection&lt;/A&gt; in the Power BI Service via the Settings -&amp;gt; Manage Connections and Gateways UI&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; g. &lt;A class="lia-external-url" href="https://learn.microsoft.com/en-us/power-bi/connect-data/refresh-data#using-an-enterprise-data-gateway" target="_blank" rel="noopener"&gt;Configure&lt;/A&gt; the Semantic Model to use the connection under the Semantic Model’s settings, in the gateway and cloud connections sub-section&lt;/P&gt;
&lt;P&gt;2. &lt;A href="https://learn.microsoft.com/en-us/data-integration/vnet/overview" target="_blank" rel="noopener"&gt;Virtual Network Data Gateway&lt;/A&gt;: A fully managed data gateway that gets created and managed by the Power BI service.&amp;nbsp; Connections work by allowing Power BI to delegate into a VNet for secure connectivity to the data source.&amp;nbsp; The &lt;A href="https://learn.microsoft.com/en-us/data-integration/vnet/create-data-gateways" target="_blank" rel="noopener"&gt;general flow&lt;/A&gt; of this setup entails: &lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; a. Create or leverage a set of Private Endpoints (PEs) to the Azure Databricks workspace - both sub-resources for databricks_ui_api and browser_authentication are required&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; b. Create or leverage a Private DNS Zone for privatelink.azuredatabricks.net&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; c. Create a subnet in a VNet that has access to the Private Endpoints (PEs), either by being in the same VNet or by being peered with the VNet where they reside. Delegate the subnet to Microsoft.PowerPlatform/vnetaccesslinks&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; d. Create a virtual network data gateway in the Power BI Service via the Settings -&amp;gt; Manage Connections and Gateways UI&lt;/P&gt;
&lt;P&gt;&amp;nbsp; &amp;nbsp; &amp;nbsp; e. &lt;A class="lia-external-url" href="https://learn.microsoft.com/en-us/power-bi/connect-data/refresh-data#using-an-enterprise-data-gateway" target="_blank" rel="noopener"&gt;Configure&lt;/A&gt; the Semantic Model to use the connection under the Semantic Model’s settings, in the gateway and cloud connections sub-section&lt;/P&gt;
&lt;P&gt;The documentation for both options is fairly extensive, and this blog post will not focus on breaking down the configurations further.&amp;nbsp; Instead, this post is about configuring your private connections to get the best Import performance.&amp;nbsp;&lt;/P&gt;
&lt;H2&gt;On-Premise Data Gateway Performance Testing&lt;/H2&gt;
&lt;P&gt;In order to provide configuration guidance, a series of Power BI Import tests were performed using various configurations and a testing dataset.&lt;/P&gt;
&lt;H3&gt;Testing Data&lt;/H3&gt;
&lt;P&gt;The testing dataset used was a TPC-DS scale factor 10 dataset (you can create your own using this &lt;A href="https://github.com/databricks/tpcds-kit" target="_blank" rel="noopener"&gt;Repo&lt;/A&gt;).&amp;nbsp; A scale factor of 10 in TPC-DS generates about 10 gigabytes (GB) of data.&amp;nbsp; The TPC-DS dataset was loaded into Unity Catalog and the &lt;A href="https://github.com/yati1002/Power-BI-DatabricksSQL-QuickStart-Samples/blob/09gateway2/09.%20Private%20Connections/tpc-ds%20PKs" target="_blank" rel="noopener"&gt;primary&lt;/A&gt; and &lt;A href="https://github.com/yati1002/Power-BI-DatabricksSQL-QuickStart-Samples/blob/09gateway2/09.%20Private%20Connections/tpc-ds%20FKs" target="_blank" rel="noopener"&gt;foreign&lt;/A&gt; keys were created between the tables.&amp;nbsp; A model was then created in the Power BI Service using the &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi#publish" target="_blank" rel="noopener"&gt;Publish to Power BI&lt;/A&gt; capabilities in Unity Catalog; the primary and foreign keys were used to automatically create relationships between the tables in the Power BI semantic model.&amp;nbsp; Here’s an overview of the tables used in this dataset:&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
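&lt;P&gt;For readers recreating the setup, here is a minimal sketch of how the informational primary and foreign key constraints can be declared in Unity Catalog from a notebook. The catalog and schema (main.tpcds) are placeholders, and the columns shown are the standard TPC-DS surrogate keys.&lt;/P&gt;
&lt;PRE&gt;# Minimal sketch: declare informational PK/FK constraints on TPC-DS tables in Unity Catalog
# so that Publish to Power BI can derive relationships. main.tpcds is a placeholder location.
spark.sql("ALTER TABLE main.tpcds.customer ALTER COLUMN c_customer_sk SET NOT NULL")
spark.sql("""
  ALTER TABLE main.tpcds.customer
  ADD CONSTRAINT customer_pk PRIMARY KEY (c_customer_sk)
""")
spark.sql("""
  ALTER TABLE main.tpcds.store_sales
  ADD CONSTRAINT store_sales_customer_fk
  FOREIGN KEY (ss_customer_sk) REFERENCES main.tpcds.customer (c_customer_sk)
""")&lt;/PRE&gt;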
&lt;H3&gt;Fabric Capacity&lt;/H3&gt;
&lt;P&gt;An F64 Fabric Capacity was used in the West US region.&amp;nbsp; The F64 was the &lt;A href="https://learn.microsoft.com/en-us/power-bi/enterprise/service-premium-what-is#semantic-model-sku-limitation" target="_blank" rel="noopener"&gt;smallest size&lt;/A&gt; available (in terms of RAM) for refreshing the model without getting capacity errors - the compressed Semantic Model size is 5,244 MB.&amp;nbsp;&lt;/P&gt;
&lt;H3&gt;Azure Databricks SQL Warehouse&lt;/H3&gt;
&lt;P&gt;An Azure Databricks workspace using Unity Catalog was deployed in the East US 2 and West US regions for the performance tests.&amp;nbsp; A &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/compute/sql-warehouse/warehouse-behavior#serverless-sizing" target="_blank" rel="noopener"&gt;Medium Databricks SQL Warehouse&lt;/A&gt; was used.&amp;nbsp; For Imports, generally speaking, the size of the SQL Warehouse isn’t very important.&amp;nbsp; Using an aggressive &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/compute/sql-warehouse/create#configure-sql-warehouse-settings" target="_blank" rel="noopener"&gt;Auto Stop&lt;/A&gt; configuration of 5 minutes is ideal to minimize compute charges (1 minute can be used if the SQL Warehouse is deployed via an API).&lt;/P&gt;
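&lt;P&gt;As a sketch of the API route mentioned above, the snippet below creates a Medium SQL Warehouse with a 1 minute Auto Stop using the Databricks SDK for Python. The warehouse name is a placeholder, and authentication is assumed to come from your environment variables or a Databricks configuration profile.&lt;/P&gt;
&lt;PRE&gt;# Minimal sketch: create a Medium serverless SQL Warehouse with an aggressive Auto Stop
# via the Databricks SDK for Python. Assumes DATABRICKS_HOST/DATABRICKS_TOKEN (or a
# .databrickscfg profile) are already configured.
from databricks.sdk import WorkspaceClient

w = WorkspaceClient()

warehouse = w.warehouses.create(
    name="pbi-import-medium",      # placeholder name
    cluster_size="Medium",
    min_num_clusters=1,
    max_num_clusters=1,
    auto_stop_mins=1,              # 1 minute is only settable via the API/SDK
    enable_serverless_compute=True,
).result()

print("Created SQL warehouse:", warehouse.id)&lt;/PRE&gt;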
&lt;H3&gt;Testing Architecture&lt;/H3&gt;
&lt;P&gt;The following diagram summarizes a simplified Azure networking architecture for the performance tests.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;OL&gt;
&lt;LI aria-level="1"&gt;A Power BI Semantic Model is connected to a Power BI On-Premise Data Gateway Connection&lt;BR /&gt;&lt;BR /&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;The On-Premise Data Gateway Connection connects to the Azure Databricks workspace using Private Endpoints. &lt;BR /&gt;&lt;BR /&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Azure Databricks provisions up a Serverless SQL Warehouse in ~5 seconds within the Serverless Data Plane within Azure.&amp;nbsp; SQL queries are executed on the Serverless SQL Warehouse. &lt;BR /&gt;&lt;BR /&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Unity Catalog gives the Serverless SQL Warehouse a read-only, down-scoped, and pre-signed URL to ADLS. &lt;BR /&gt;&lt;BR /&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Data is fetched from ADLS and placed on the Azure Databricks workspace’s managed storage account via a capability called &lt;A href="https://www.databricks.com/blog/2021/08/11/how-we-achieved-high-bandwidth-connectivity-with-bi-tools.html" target="_blank" rel="noopener"&gt;Cloud Fetch&lt;/A&gt;.&lt;BR /&gt;&lt;BR /&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Arrow Files are pulled from Cloud Fetch and delivered to the Power BI Service through the Data Gateway. &lt;BR /&gt;&lt;BR /&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Data in the Semanic Model is compressed and stored in Vertipaq In-Memory storage.&lt;/LI&gt;
&lt;/OL&gt;
&lt;H3&gt;Testing Results&lt;/H3&gt;
&lt;P&gt;The following grid outlines the scenarios tested and the results for each test.&amp;nbsp; We’ll review the different configurations tested below in specific sections.&lt;/P&gt;
&lt;img /&gt;
&lt;DIV class="styles_lia-table-wrapper__h6Xo9 styles_table-responsive__MW0lN"&gt;&lt;table border="1" style="border-width: 1px;"&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;Scenario&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;Gateway Scenario&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;Avg Refresh Duration Minutes&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;A&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;East US 2, Public Endpoint&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;17:01&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;B&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;West US, Public Endpoint&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;12:21&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;C&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;West US, Public Endpoint via IP Access List&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;15:19&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;D&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;West US, E VM Gateway Base&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;12:14&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;E&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;West US, E VM StreamBeforeRequestCompletes&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;07:46&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;F&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;West US, E VM StreamBeforeRequestCompletes + Logical Partitions&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;07:31&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;G&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;West US, E VM Spooler (D)&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;12:57&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;H&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;West US, E VM Spooler (E)&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;13:32&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;I&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;West US, D VM Gateway Base&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;16:47&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;J&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;West US, D VM StreamBeforeRequestCompletes&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;12:19&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;K&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;West US, PBI Managed Vnet&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;27:04&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;/DIV&gt;
&lt;P&gt;&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;DIV class="styles_lia-table-wrapper__h6Xo9 styles_table-responsive__MW0lN"&gt;&lt;table border="1" style="border-width: 1px;"&gt;&lt;tbody&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;Scenario&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;VM Configuration&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;D&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;Standard E8bds v5 (8 vcpus, 64 GiB memory) [NVMe, Accelerated Networking], C Drive default (Premium SSD LRS 127 GiB)&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;E&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;Standard E8bds v5 (8 vcpus, 64 GiB memory) [NVMe, Accelerated Networking], C Drive default (Premium SSD LRS 127 GiB)&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;F&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;Standard E8bds v5 (8 vcpus, 64 GiB memory) [NVMe, Accelerated Networking], C Drive default (Premium SSD LRS 127 GiB)&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;G&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;Standard E8bds v5 (8 vcpus, 64 GiB memory) [NVMe, Accelerated Networking], D drive&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;H&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;Standard E8bds v5 (8 vcpus, 64 GiB memory) [NVMe, Accelerated Networking], E Drive (Premium SSD LRS 600 GiB)&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;I&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;Standard D8s v3 (8 vcpus, 32 GiB memory), C Drive default (Premium SSD LRS 127 GiB)&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;tr&gt;&lt;td&gt;
&lt;P&gt;J&lt;/P&gt;
&lt;/td&gt;&lt;td&gt;
&lt;P&gt;Standard D8s v3 (8 vcpus, 32 GiB memory), C Drive default (Premium SSD LRS 127 GiB)&lt;/P&gt;
&lt;/td&gt;&lt;/tr&gt;&lt;/tbody&gt;&lt;/table&gt;&lt;/DIV&gt;
&lt;H2&gt;Performance Configurations&lt;/H2&gt;
&lt;H3&gt;1. Regional Alignment&lt;/H3&gt;
&lt;P&gt;Aligning your Power BI Premium/Fabric Capacity to the same region as your Azure Databricks deployment and your On-Premise Data Gateway VM helps reduce the overall network latency and data transfer duration. It should also eliminate cross-region networking charges. &lt;BR /&gt;&lt;BR /&gt;In scenario A, the Azure Databricks deployment was in East US 2 while the Fabric Capacity and On-Premise Data Gateway VM were in West US. The Import processing time when using the public endpoint between the regions was 17:01 minutes. In scenario B, while still using the public endpoint, there is complete regional alignment in the West US region and the Import times averaged 12:21 minutes, which is a 27.4% decrease.&lt;/P&gt;
&lt;H3&gt;2. Configure a Gateway Cluster&lt;/H3&gt;
&lt;P&gt;A Power BI Data &lt;A href="https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-high-availability-clusters" target="_blank" rel="noopener"&gt;Gateway Cluster&lt;/A&gt; configuration is highly recommended for production Power BI environments, although it was not performance tested during this experiment. Data Gateway clusters can help with data refresh redundancy and with the overall volume and throughput of data transfer.&lt;/P&gt;
&lt;H3&gt;3. VM Family Selection&lt;/H3&gt;
&lt;P&gt;The Power BI documentation &lt;A href="https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-install#recommended" target="_blank" rel="noopener"&gt;recommends&lt;/A&gt; a VM with 8 cores, 8 GB of RAM, and an SSD for the On-Premise Data Gateway. Testing shows that using a VM with good performance characteristics can provide immense value in Import times.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;In scenario D, data gateway tests were run using a Standard E8bds v5 with 8 cores and 64 GB RAM that also included NVMe, Accelerated Networking, and a C drive using a Premium SSD. The import times for this scenario averaged 12:14 minutes, which was slightly faster than the regionally aligned public endpoint test in scenario B.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;In scenario I, data gateway tests were run using a Standard D8s v3 with 8 cores and 32 GB RAM and a C drive using a Premium SSD. The import times for this scenario averaged 16:47 minutes, noticeably slower than the regionally aligned public endpoint in scenario B - a 35.96% performance degradation.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;More tests could certainly be done to determine which VM characteristics help the most with Import performance, but it is clear that certain features are helpful, such as:&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/virtual-machines/disks-types#disk-type-comparison" target="_blank" rel="noopener"&gt;Premium SSDs&lt;/A&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/virtual-network/accelerated-networking-overview?tabs=redhat" target="_blank" rel="noopener"&gt;Accelerated Networking&lt;/A&gt;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/virtual-machines/nvme-overview" target="_blank" rel="noopener"&gt;NVMe&lt;/A&gt; controller&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;A href="https://learn.microsoft.com/en-us/azure/virtual-machines/sizes/memory-optimized/e-family?tabs=epsv6%2Ceasv6%2Cev5%2Cedv5%2Ceasv5%2Cepsv5" target="_blank" rel="noopener"&gt;Memory optimized&lt;/A&gt; instances&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;And while the better &lt;A href="https://instances.vantage.sh/azure/vm/e8bds-v5" target="_blank" rel="noopener"&gt;E8bds v5&lt;/A&gt; Azure VM costs ~$820 per month in West US at list and the &lt;A href="https://instances.vantage.sh/azure/vm/d8s-v3" target="_blank" rel="noopener"&gt;D8s v3&lt;/A&gt; costs ~$610 per month at list (making the E8bds v5 roughly 34% more expensive), this feels like a scenario where you pay the premium to get better performance and optimize costs through Azure VM &lt;A href="https://learn.microsoft.com/en-us/azure/cost-management-billing/reservations/save-compute-costs-reservations" target="_blank" rel="noopener"&gt;reservations&lt;/A&gt;.&lt;/P&gt;
&lt;H3&gt;4. StreamBeforeRequestCompletes&lt;/H3&gt;
&lt;P&gt;By default, the on-premise data gateway spools data to disk before sending it to Power BI. Setting &lt;A href="https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-performance#optimize-performance-by-streaming-data" target="_blank" rel="noopener"&gt;StreamBeforeRequestCompletes&lt;/A&gt; to True can significantly improve gateway refresh performance because it allows data to be streamed directly to the Power BI Service without first being spooled to disk.&lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;In scenario E, with StreamBeforeRequestCompletes set to True and the gateway restarted, the average Import time improved significantly to 07:46 minutes, a 54% improvement compared to scenario A and a 36% improvement over the base VM configuration in scenario D.&lt;/P&gt;
&lt;H3&gt;5. Spooler Location&amp;nbsp;&lt;/H3&gt;
&lt;P&gt;As discussed above, when using the default setting for StreamBeforeRequestCompletes as False, Power BI spools the data to the data gateway spool directory before sending it to the Power BI Service.&amp;nbsp; &lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;In scenarios D, G, and H, StreamBeforeRequestCompletes is False and the &lt;A href="https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-configure-disk-space#gateway-spooling-data" target="_blank" rel="noopener"&gt;Spooler directory&lt;/A&gt; has been mapped to the C, D, and E drives respectively, which all correspond to an SSD (of varying configuration) on the Azure VMs. &lt;BR /&gt;&lt;BR /&gt;In all three scenarios the times are similar - 12:14, 12:57, and 13:32 minutes, respectively - and all tests were performed with SSDs on the E series VM configured with NVMe. Using this configuration mix, the Spooler directory location does not appear to provide significant performance improvements. Since the C drive configuration gave the best performance, it seems prudent to keep the C drive default configuration. However, it is possible that the Spooler directory setting might provide more value on different VM configurations.&lt;/P&gt;
&lt;H3&gt;6. Logical Partitioning&lt;/H3&gt;
&lt;P&gt;As outlined in the &lt;A href="https://github.com/yati1002/Power-BI-DatabricksSQL-QuickStart-Samples/tree/main/03.%20Logical%20Partitioning" target="_blank" rel="noopener"&gt;QuickStart samples guide&lt;/A&gt;, logical partitioning can often help with Power BI Import performance as multiple logical partitions in the Semantic Model can be processed at the same time. &lt;BR /&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In scenario F, logical partitions were created for the inventory and store_sales tables, with 5 partitions each. When combined with the StreamBeforeRequestCompletes setting, the benefit from adding logical partitions was negligible (a 15 second improvement), even though the parallelization settings were increased to 30 (Max Parallelism Per Refresh and Data Source Default Max Connections).&lt;BR /&gt;&lt;BR /&gt;While logical partitions are usually a very valuable strategy, combining them with StreamBeforeRequestCompletes, the E series VM configuration, and a Fabric F64 capacity yielded diminishing returns. It is probably worth more testing at some point in the future.&lt;/P&gt;
&lt;H2&gt;Virtual Network Data Gateway Performance Testing&lt;/H2&gt;
&lt;P&gt;The configuration and performance of a Virtual Network Data Gateway was briefly tested.&amp;nbsp; A Power BI subnet was created in the same VNet as the Azure Databricks workspace and delegated to the Power BI Service.&amp;nbsp; A virtual network data gateway was created in the UI with 2 gateways (12 queries can run in parallel) and assigned to the Semantic Model.&amp;nbsp; &lt;BR /&gt;&lt;BR /&gt;&lt;/P&gt;
&lt;P&gt;In scenario K, an Import test was performed through the Virtual Network Data Gateway that took 27:04 minutes. More time was not spent trying to tune the Virtual Network Data Gateway as it was not the primary focus of this blog post.&amp;nbsp;&amp;nbsp;&lt;/P&gt;
&lt;H2&gt;The Best Configuration: Region Alignment + Good VM + StreamBeforeRequestCompletes&lt;/H2&gt;
&lt;P&gt;&lt;BR /&gt;While the Import testing performed for this blog post isn’t definitive, it does provide good directional value in forming an opinion on how you can configure your Power BI On-Premise Data Gateway on an Azure Virtual Machine to get good performance. &lt;BR /&gt;&lt;BR /&gt;When looking at the tests performed for this blog, an Azure Virtual Machine, in the same region as the Azure Databricks Workspace and the Fabric Capacity, with Accelerated networking, an SSD, NVMe, and memory optimized compute provided performance that was faster than just using the public endpoint of the Azure Databricks Workspace alone.&amp;nbsp; Using this configuration, we improved our Import performance from 17:01 to 07:46 minutes which is a 54% performance improvement.&lt;/P&gt;</description>
      <pubDate>Mon, 24 Mar 2025 19:19:58 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/part-2-performance-configurations-for-connecting-pbi-to-a/ba-p/4396060</guid>
      <dc:creator>katiecummiskey</dc:creator>
      <dc:date>2025-03-24T19:19:58Z</dc:date>
    </item>
    <item>
      <title>6 critical phases to prepare for a successful Azure Databricks migration</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/6-critical-phases-to-prepare-for-a-successful-azure-databricks/ba-p/4386984</link>
      <description>&lt;P&gt;As organizations adopt advanced analytics and AI to drive decision-making, moving data applications to Azure Databricks has become a strategic and significant endeavor. This transition requires careful planning and execution to succeed. Based on numerous successful implementations, we’ve identified six critical phases that can help you prepare for a smooth migration.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Phase 1: Infrastructure and workload assessment&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Starting with a thorough analysis of your current environment prevents unexpected issues during migration. Many organizations face setbacks by rushing ahead without a complete picture of their data estate.&lt;/P&gt;
&lt;P&gt;A comprehensive assessment includes:&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Data source and workload cataloging:&lt;/STRONG&gt; Use automated assessment tools to create a detailed inventory of your data assets. Track data volumes, update frequencies, and usage patterns.&amp;nbsp;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;ETL process analysis:&lt;/STRONG&gt; Record the business logic, scheduling dependencies, and performance characteristics of each ETL process. Focus on custom transformations that may need redesign in the Databricks environment.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;SQL code dependency mapping: &lt;/STRONG&gt;Build a dependency graph of SQL objects, including stored procedures, views, and user-defined functions. This identifies which elements need to migrate together and shows potential improvements.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Application interdependency analysis: &lt;/STRONG&gt;Monitor how applications interact with your data systems, including read/write patterns, API dependencies, and real-time processing needs.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Performance baseline: Document current performance metrics and SLA requirements to set a clear performance baseline and identify areas where Databricks can improve efficiency.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;Best practice: &lt;/STRONG&gt;Engage various tools that can speed up an assessment by automatically mapping your data estate.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Phase 2: Strategic migration planning&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;With clear insights into your environment, develop an approach that balances risk management with business value. This phase helps secure stakeholder support and set realistic expectations.&lt;/P&gt;
&lt;P&gt;Your migration strategy should include:&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Workload prioritization framework:&lt;/STRONG&gt; Create a scoring system based on business impact, technical complexity, and resource needs. High-value, low-complexity workloads make excellent candidates for initial migration phases.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Timeline development:&lt;/STRONG&gt; Build a realistic schedule that considers dependencies, resource availability, and business cycles. Include extra time for addressing challenges and learning new processes.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Success criteria definition:&lt;/STRONG&gt; Set specific, measurable KPIs aligned with business goals, such as performance improvements, cost reductions, or new analytical capabilities.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Resource allocation planning: &lt;/STRONG&gt;Specify the skills and staff needed for each migration phase, including whether specific components might benefit from external expertise.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;Best practice:&lt;/STRONG&gt; Start with a pilot project using noncritical workloads to learn and refine processes before moving to business-critical applications.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Phase 3: Technical preparation&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Technical preparation creates a foundation for successful migration through proper configuration and security. This phase needs attention to detail and collaboration between infrastructure, security, and development teams.&lt;/P&gt;
&lt;P&gt;Key preparation steps include:&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Environment configuration:&lt;/STRONG&gt; Create separate Azure Databricks environments for development, testing, and production. Configure cluster sizes, runtime versions, and autoscaling policies.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Security implementation: &lt;/STRONG&gt;Set up security controls, including network isolation, access management, and data encryption.&amp;nbsp;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Delta Lake implementation:&lt;/STRONG&gt; Use Delta Lake format for ACID compliance and features like time travel and schema enforcement to maintain data quality and consistency.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Connectivity setup: Create and test secure connections between Azure Databricks and source systems with sufficient bandwidth and minimal latency.&lt;/LI&gt;
&lt;/UL&gt;
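&lt;P&gt;As a minimal sketch of the Delta Lake features called out in the list above (catalog, schema, and table names are placeholders), the following notebook snippet writes a small Delta table and then reads an earlier version back with time travel:&lt;/P&gt;
&lt;PRE&gt;# Minimal sketch: Delta Lake schema enforcement and time travel on Azure Databricks.
# main.migration_poc.tickets is a placeholder Unity Catalog table name.
from pyspark.sql import Row

df = spark.createDataFrame([Row(id=1, status="open"), Row(id=2, status="closed")])
df.write.format("delta").mode("overwrite").saveAsTable("main.migration_poc.tickets")

# Schema enforcement: appending rows with a mismatched schema fails fast instead of
# silently corrupting the table, because Delta validates the write against the schema.

# Time travel: query an earlier version of the table for audits or rollback checks.
v0 = spark.sql("SELECT * FROM main.migration_poc.tickets VERSION AS OF 0")
v0.show()&lt;/PRE&gt;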
&lt;P&gt;&lt;STRONG&gt;Best practice:&lt;/STRONG&gt; Use &lt;A href="https://www.databricks.com/product/unity-catalog" target="_blank" rel="noopener"&gt;Azure Databricks Unity Catalog&lt;/A&gt; for precise access control and data governance.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Phase 4: Data and code migration planning&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Moving data and code requires careful planning to maintain business operations and data integrity. This phase has two main components:&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;ETL migration strategy:&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Workflow mapping: &lt;/STRONG&gt;Map existing ETL processes to Azure Databricks equivalents, using native capabilities to improve efficiency.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Transformation logic conversion: &lt;/STRONG&gt;Convert legacy transformation logic to Spark SQL or PySpark to use Databricks’ distributed processing.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Data quality framework:&lt;/STRONG&gt; Add automated testing to verify data quality and completeness during migration.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Performance optimization: &lt;/STRONG&gt;Create strategies for optimizing workflows through proper partitioning, caching, and resource allocation.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;SQL code migration approach:&lt;/STRONG&gt;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Code conversion process: &lt;/STRONG&gt;Create a systematic method for working with SQL stored procedures, handling vendor-specific SQL syntax.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Query optimization: &lt;/STRONG&gt;Apply best practices for Spark SQL performance with proper join strategies and partition pruning.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Version control integration: &lt;/STRONG&gt;Implement version control with Git integration for collaborative development and change tracking.&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&lt;STRONG&gt;Best practice:&lt;/STRONG&gt; Monitor the migration using Azure-native tools (such as Azure Monitor and Azure Databricks Workflows) to identify and resolve bottlenecks in real time.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Phase 5: Validation and testing&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Complete testing ensures migration success. Create a testing strategy that includes:&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Data accuracy validation:&lt;/STRONG&gt; Compare migrated data to source systems using automated tools.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Performance validation: &lt;/STRONG&gt;Validate performance under various loads to ensure meeting or exceeding SLAs and previously established performance baseline.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Integration testing:&lt;/STRONG&gt; Check that all system components work together, including external applications.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;User acceptance testing: &lt;/STRONG&gt;Verify with business users that migrated systems meet their needs.&lt;/LI&gt;
&lt;/UL&gt;
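&lt;P&gt;One lightweight way to automate the data accuracy checks is to reconcile row counts and column-level aggregates between the source extract and the migrated table. The sketch below assumes both are readable as tables; all names are placeholders.&lt;/P&gt;
&lt;PRE&gt;
# Minimal sketch: reconcile row counts and a numeric checksum between the
# source extract and the migrated Delta table. Table names are placeholders.
from pyspark.sql import functions as F

source_df = spark.read.table("staging.orders_source_extract")
target_df = spark.read.table("silver.orders")

checks = {
    "row_count": (source_df.count(), target_df.count()),
    "amount_sum": (
        source_df.agg(F.sum("order_amount")).first()[0],
        target_df.agg(F.sum("order_amount")).first()[0],
    ),
}

for name, (expected, actual) in checks.items():
    status = "OK" if expected == actual else "MISMATCH"
    print(f"{name}: source={expected} target={actual} {status}")
&lt;/PRE&gt;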
&lt;P&gt;&lt;STRONG&gt;Phase 6: Team enablement and governance&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Success requires more than technical implementation. Prepare your organization by:&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Role-based training: &lt;/STRONG&gt;Create specific training programs for each user type, from data engineers to business analysts.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Governance framework: &lt;/STRONG&gt;Apply comprehensive governance with Unity Catalog for data classification, access controls, and audit logging.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Support structure: &lt;/STRONG&gt;Define support channels and procedures for addressing issues after migration.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;STRONG&gt;Monitoring framework: &lt;/STRONG&gt;Add proactive monitoring to identify and fix potential issues before they affect operations.&lt;/LI&gt;
&lt;/UL&gt;
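&lt;P&gt;For the governance framework, access policies in Unity Catalog can be expressed as standard SQL GRANT statements. In the hedged sketch below, the catalog, schema, table, and group names are hypothetical.&lt;/P&gt;
&lt;PRE&gt;
# Minimal sketch: Unity Catalog access controls via SQL GRANT statements.
# Catalog, schema, table, and group names are hypothetical.
spark.sql("GRANT USE CATALOG ON CATALOG analytics TO `data-analysts`")
spark.sql("GRANT USE SCHEMA, SELECT ON SCHEMA analytics.gold TO `data-analysts`")
spark.sql("GRANT MODIFY ON TABLE analytics.gold.sales_daily TO `data-engineers`")
&lt;/PRE&gt;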
&lt;P&gt;&lt;STRONG&gt;Best practice: &lt;/STRONG&gt;Schedule regular reviews of compliance and security measures to address evolving risks.&lt;/P&gt;
&lt;P&gt;&lt;STRONG&gt;Measuring success and future optimization&lt;/STRONG&gt;&lt;/P&gt;
&lt;P&gt;Success means delivering clear business value. Monitor key metrics:&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;Query performance improvements&lt;/LI&gt;
&lt;LI aria-level="1"&gt;ETL processing time reduction/data freshness improvement&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Resource utilization efficiency&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Cost savings versus previous systems&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;After migration, focus on ongoing improvements using Azure Databricks features:&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;Automated performance optimization&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Resource management for cost control&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Integration of advanced analytics and AI&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Improved real-time processing&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;A successful Azure Databricks migration requires careful planning across all six phases. This approach minimizes risk while maximizing the benefits of your modernized data platform. The goal extends beyond moving workloads: it is to transform your organization’s data capabilities.&lt;/P&gt;
&lt;P&gt;Want more information about planning your migration? Get our detailed e-book for in-depth guidance on strategies, governance, and business impact measurement. See how organizations improve their data infrastructure and prepare for advanced analytics.&lt;/P&gt;
&lt;P&gt;&lt;A class="lia-external-url" href="https://www.databricks.com/resources/ebook/modernize-your-data-estate-migrating-azure-databricks" target="_blank"&gt;Download the e-book&lt;/A&gt;.&lt;/P&gt;</description>
      <pubDate>Thu, 20 Mar 2025 21:03:46 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/6-critical-phases-to-prepare-for-a-successful-azure-databricks/ba-p/4386984</guid>
      <dc:creator>katiecummiskey</dc:creator>
      <dc:date>2025-03-20T21:03:46Z</dc:date>
    </item>
    <item>
      <title>Part 1: Power BI Service Connections to Azure Databricks with Private Networking</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/part-1-power-bi-service-connections-to-azure-databricks-with/ba-p/4384391</link>
      <description>&lt;H6&gt;This blog was written in conjunction with &lt;A class="lia-external-url" href="https://www.linkedin.com/in/leoafurlongiv/" target="_blank" rel="noopener"&gt;Leo Furlong&lt;/A&gt;, Lead Solutions Architect at Databricks.&lt;/H6&gt;
&lt;H2&gt;Enhancing Security and Connectivity: Azure Databricks SQL, Unity Catalog, and Power BI Integration&lt;/H2&gt;
&lt;P&gt;The combination of Azure &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/sql/" target="_blank"&gt;Databricks SQL&lt;/A&gt;, &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/data-governance/unity-catalog/" target="_blank"&gt;Unity Catalog&lt;/A&gt;, and &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/partners/bi/power-bi" target="_blank"&gt;Power BI&lt;/A&gt; offers an unparalleled set of capabilities for modern data analytics. However, as organizations increasingly prioritize security, many Azure Databricks customers deploy their workspaces with private networking, which requires additional configuration to allow connections from BI tools like Power BI. This blog post explores the options available for secure Azure Databricks deployments and how to maintain Power BI connectivity in these scenarios.&lt;/P&gt;
&lt;H2&gt;Private Networking Options for Azure Databricks&lt;/H2&gt;
&lt;P&gt;When deploying Azure Databricks with enhanced security, customers can choose from three main private networking configurations:&lt;/P&gt;
&lt;OL&gt;
&lt;LI aria-level="1"&gt;Public Endpoint with an &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/ip-access-list" target="_blank"&gt;IP Access List&lt;/A&gt; for the &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/security/network/front-end/ip-access-list-workspace" target="_blank"&gt;Workspace&lt;/A&gt;: This option exposes a public endpoint for the Azure Databricks workspace but restricts access to specific IP ranges.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Azure &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/security/network/classic/private-link" target="_blank"&gt;Databricks Private Link&lt;/A&gt;: Front-end private link provides fully private connectivity, routing all traffic through private endpoints.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Hybrid Deployment: Combines front-end private link with a public endpoint protected by a Workspace IP Access List which is typically used for SaaS service connections.&lt;/LI&gt;
&lt;/OL&gt;
&lt;H2&gt;Connecting Power BI to a Private Azure Databricks Workspace&lt;/H2&gt;
&lt;P&gt;While private networking enhances security, it can require additional connection configurations from SaaS services like Power BI. Power BI offers two primary methods for secure connections to data sources with private networking:&lt;/P&gt;
&lt;OL&gt;
&lt;LI aria-level="1"&gt;&lt;A href="https://learn.microsoft.com/en-us/data-integration/gateway/service-gateway-onprem" target="_blank"&gt;On-premises data gateway&lt;/A&gt;: an application that gets installed on a Virtual Machine that has a direct networking connection to the data source. It allows Power BI to connect to data sources that don’t allow public connections&lt;/LI&gt;
&lt;LI aria-level="1"&gt;&lt;A href="https://learn.microsoft.com/en-us/data-integration/vnet/overview" target="_blank"&gt;Virtual Network Data Gateway&lt;/A&gt;: a managed (virtual/serverless) data gateway that gets created and managed by the Power BI service.&amp;nbsp; Connections work by allowing Power BI to delegate into a VNet for secure connectivity to the data source.&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;While Power BI offers these two options, many customers prefer not to manage additional infrastructure or configurations required for these gateways. In such cases, Power BI can be allowed to access the private Azure Databricks workspace through the IP Access List.&lt;/P&gt;
&lt;H2&gt;Implementing Power BI Connectivity via IP Access List&lt;/H2&gt;
&lt;P&gt;To enable the Power BI Service connectivity to a private Azure Databricks workspace using an IP Access List:&lt;/P&gt;
&lt;OL&gt;
&lt;LI aria-level="1"&gt;Obtain the Power BI Public IPs:&lt;BR /&gt;Download the latest &lt;A href="https://www.microsoft.com/en-us/download/details.aspx?id=56519" target="_blank"&gt;Azure IP Ranges and Service Tags&lt;/A&gt; file from the Microsoft Download Center. This file is updated weekly and contains IP ranges for various Azure services, including Power BI.&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Add Power BI IPs to Azure Databricks Workspace IP Access List:&lt;BR /&gt;Extract the Power BI IP ranges from the downloaded file and add them to the Azure Databricks IP Access List using the &lt;A href="https://docs.databricks.com/api/workspace/ipaccesslists" target="_blank"&gt;API&lt;/A&gt; or &lt;A href="https://databricks-sdk-py.readthedocs.io/en/latest/workspace/settings/ip_access_lists.html" target="_blank"&gt;SDK&lt;/A&gt;.&amp;nbsp;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Regular Updates:&lt;BR /&gt;Since Power BI public IPs can change frequently, it's crucial to update the Workspace IP Access List regularly. This can be automated using a Databricks Job that periodically downloads the latest IP ranges and updates the Workspace IP Access List. The Job will need to be run by a Workspace Admin in order to set the configurations. You can run the Databricks Job as a Service Principal to make the updates. If you use the Databricks SDK from within a notebook in the Databricks Workspace, &lt;A href="https://databricks-sdk-py.readthedocs.io/en/latest/authentication.html#notebook-native-authentication" target="_blank"&gt;authentication&lt;/A&gt; is handled for you.&lt;/LI&gt;
&lt;/OL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The following sample code can be used to &lt;A href="https://github.com/yati1002/Power-BI-DatabricksSQL-QuickStart-Samples/blob/main/09.%20Private%20Connections/Turn%20on%20Workspace%20IP%20Access%20List.py" target="_blank"&gt;turn on your Workspace IP Access List&lt;/A&gt; which is more of a one-time operation. The &lt;A href="https://github.com/yati1002/Power-BI-DatabricksSQL-QuickStart-Samples/blob/main/09.%20Private%20Connections/Power%20BI%20IPs%20for%20IP%20Access%20List.py" target="_blank"&gt;Power BI IPs for IP Access List&lt;/A&gt; sample code can be used to refresh your Power BI IPs from a Databricks Workflow.&lt;/P&gt;
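&lt;P&gt;For orientation, here is a minimal, hedged sketch of that refresh step using the documented IP access list REST API. The download URL, workspace URL, token, service tag name, and list ID are placeholder assumptions; the linked samples above remain the recommended starting point.&lt;/P&gt;
&lt;PRE&gt;
# Illustrative sketch only: pull Power BI ranges from the Azure IP Ranges file
# and replace an existing Databricks workspace IP access list.
# URLs, token, and the list ID are placeholders you must supply.
import requests

ip_ranges_url = "https://download.microsoft.com/.../ServiceTags_Public.json"  # placeholder
workspace_url = "https://adb-1234567890123456.7.azuredatabricks.net"          # placeholder
token = "dapi..."                                                              # placeholder
access_list_id = "your-ip-access-list-id"                                      # placeholder

service_tags = requests.get(ip_ranges_url).json()
power_bi_ips = next(
    tag["properties"]["addressPrefixes"]
    for tag in service_tags["values"]
    if tag["name"] == "PowerBI"
)

requests.put(
    f"{workspace_url}/api/2.0/ip-access-lists/{access_list_id}",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "label": "power-bi-service",
        "list_type": "ALLOW",
        "ip_addresses": power_bi_ips,
        "enabled": True,
    },
)
&lt;/PRE&gt;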
&lt;H2&gt;Conclusion&lt;/H2&gt;
&lt;P&gt;By leveraging IP Access Lists, organizations can maintain the security benefits of private Azure Databricks deployments while ensuring seamless connections from Power BI. This approach offers a balance between security and functionality with low maintenance overhead.&lt;/P&gt;</description>
      <pubDate>Fri, 21 Feb 2025 21:50:07 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/part-1-power-bi-service-connections-to-azure-databricks-with/ba-p/4384391</guid>
      <dc:creator>katiecummiskey</dc:creator>
      <dc:date>2025-02-21T21:50:07Z</dc:date>
    </item>
    <item>
      <title>Workarounds for Maven Json-smart 2.5.2 Release Breaking Azure Databricks Job Dependencies</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/workarounds-for-maven-json-smart-2-5-2-release-breaking-azure/ba-p/4377517</link>
      <description>&lt;P&gt;&lt;SPAN data-contrast="auto"&gt;A recent Maven library release corrupted json-smart’s central &lt;/SPAN&gt;&lt;A href="https://nam06.safelinks.protection.outlook.com/?url=https%3A%2F%2Frepo.maven.apache.org%2Fmaven2%2Fnet%2Fminidev%2Fjson-smart%2Fmaven-metadata.xml&amp;amp;data=05%7C02%7Ckumardivyesh%40microsoft.com%7Cf28ba350adf94365234008dd4b5377ce%7C72f988bf86f141af91ab2d7cd011db47%7C1%7C0%7C638749544163497429%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;amp;sdata=LFeZdqyrKahGFIZ6nh3nAls8PwSCXtY%2BifsdZSWUH98%3D&amp;amp;reserved=0" target="_blank"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;metadata&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;, ended up removing all previous versions of Maven library other than 2.5.2. This results in DBR ivy resolution failures if customers’ job has transitive dependency on previous version of json-smart. The impact to Azure Databricks customers is jobs failure and in some cases the job clusters could fail to start. More details and updates of the issue can be found here: &lt;/SPAN&gt;&lt;A href="https://nam06.safelinks.protection.outlook.com/?url=https%3A%2F%2Fgithub.com%2Fnetplex%2Fjson-smart-v2%2Fissues%2F240&amp;amp;data=05%7C02%7Ckumardivyesh%40microsoft.com%7Cf28ba350adf94365234008dd4b5377ce%7C72f988bf86f141af91ab2d7cd011db47%7C1%7C0%7C638749544163480765%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;amp;sdata=6ARnkWNMXVf5wi7SbnCTJQny3T3SNyc5eyGYA35XkJg%3D&amp;amp;reserved=0" target="_blank"&gt;&lt;SPAN data-contrast="none"&gt;&lt;SPAN data-ccp-charstyle="Hyperlink"&gt;https://github.com/netplex/json-smart-v2/issues/240&lt;/SPAN&gt;&lt;/SPAN&gt;&lt;/A&gt;&lt;SPAN data-contrast="auto"&gt;.&amp;nbsp;&lt;/SPAN&gt;&lt;SPAN data-ccp-props="{}"&gt;&amp;nbsp;&lt;/SPAN&gt;&lt;/P&gt;
&lt;P&gt;This also affects&amp;nbsp;&lt;A href="https://nam06.safelinks.protection.outlook.com/?url=https%3A%2F%2Fmaven-central.storage.googleapis.com%2Fmaven2%2Fnet%2Fminidev%2Fjson-smart%2Fmaven-metadata.xml&amp;amp;data=05%7C02%7Ckumardivyesh%40microsoft.com%7Cf28ba350adf94365234008dd4b5377ce%7C72f988bf86f141af91ab2d7cd011db47%7C1%7C0%7C638749544163505875%7CUnknown%7CTWFpbGZsb3d8eyJFbXB0eU1hcGkiOnRydWUsIlYiOiIwLjAuMDAwMCIsIlAiOiJXaW4zMiIsIkFOIjoiTWFpbCIsIldUIjoyfQ%3D%3D%7C0%7C%7C%7C&amp;amp;sdata=7UFyZpgYetDJ13F7zO%2Bz3luPdnv9u2emUBQPEtMXcyY%3D&amp;amp;reserved=0" target="_blank"&gt;google’s maven mirror&lt;/A&gt;, which is used by DBR 11+ to resolve maven libraries, and maven central is used as the backup for google’s maven mirror.&lt;/P&gt;
&lt;P&gt;To mitigate the issue, you can use the following workaround to install your requested library and json-smart separately.&lt;/P&gt;
&lt;P&gt;Step 1: Install the requested library (using azure-eventhubs-spark as an example) but exclude json-smart.&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;Step 2: Install net.minidev:json-smart:2.3 (or any other version that was needed)&lt;/P&gt;
&lt;img /&gt;
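&lt;P&gt;If you prefer to script these two installs instead of using the Libraries UI, the cluster Libraries API accepts Maven coordinates together with exclusions. In the hedged sketch below, the workspace URL, token, cluster ID, and library versions are placeholders.&lt;/P&gt;
&lt;PRE&gt;
# Illustrative sketch: install the requested library while excluding json-smart,
# then pin json-smart separately, via the cluster Libraries API.
# Workspace URL, token, cluster ID, and versions are placeholders.
import requests

workspace_url = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
token = "dapi..."                                                      # placeholder

requests.post(
    f"{workspace_url}/api/2.0/libraries/install",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "cluster_id": "0212-123456-abcdefgh",
        "libraries": [
            {
                "maven": {
                    "coordinates": "com.microsoft.azure:azure-eventhubs-spark_2.12:2.3.22",
                    "exclusions": ["net.minidev:json-smart"],
                }
            },
            {"maven": {"coordinates": "net.minidev:json-smart:2.3"}},
        ],
    },
)
&lt;/PRE&gt;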
&lt;P&gt;&lt;STRONG&gt;Workaround for runtime:&lt;/STRONG&gt; if you have a preferred Maven mirror that is not affected by this issue, or if you host a private Maven mirror, you can configure your Databricks environment to use it.&lt;/P&gt;
&lt;UL&gt;
&lt;LI&gt;&lt;STRONG&gt;Step:&lt;/STRONG&gt; Set the Spark configuration parameter spark.databricks.driver.preferredMavenCentralMirrorUrl to your preferred mirror repository URL, as sketched below.&lt;/LI&gt;
&lt;/UL&gt;
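&lt;P&gt;For example, the mirror URL can be supplied as part of the cluster’s Spark configuration; the repository URL below is a placeholder for your own mirror.&lt;/P&gt;
&lt;PRE&gt;
# Illustrative sketch: point the cluster at a preferred Maven mirror through
# its Spark configuration. The mirror URL is a placeholder.
cluster_spark_conf = {
    "spark.databricks.driver.preferredMavenCentralMirrorUrl":
        "https://my-maven-mirror.example.com/maven2/",
}
# Supply cluster_spark_conf as the "spark_conf" field when creating or editing
# the cluster (via the Clusters API or the cluster UI Spark config box).
&lt;/PRE&gt;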
      <pubDate>Wed, 12 Feb 2025 14:47:32 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/workarounds-for-maven-json-smart-2-5-2-release-breaking-azure/ba-p/4377517</guid>
      <dc:creator>LindseyAllen</dc:creator>
      <dc:date>2025-02-12T14:47:32Z</dc:date>
    </item>
    <item>
      <title>Replicating Azure Cosmos DB into Azure Databricks using CDC</title>
      <link>https://techcommunity.microsoft.com/t5/azure-databricks/replicating-azure-cosmos-db-into-azure-databricks-using-cdc/ba-p/4302079</link>
      <description>&lt;P&gt;This blog was written in conjunction with &lt;A class="lia-external-url" href="https://www.linkedin.com/in/dpoulet/" target="_blank" rel="noopener"&gt;David Poulet&lt;/A&gt;, Senior Solutions Architect at Databricks.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Microsoft’s NoSQL database-as-a-service, Azure Cosmos DB, is a key platform in Azure for storing non-relational, transactional data and vectors for applications with high throughput and availability requirements. This data often holds valuable business insights, and the ability to analyze this data at scale with Azure Databricks is a key requirement for many customers. Azure Cosmos DB is optimized for fast reads and writes of individual items. However, in common with other data stores of this type, it is not optimized for analytical workloads, which can make it challenging to analyze stored data in a performant and cost-effective way.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Microsoft’s solution to this problem is the Analytical Store, which stores a copy of the Azure Cosmos DB data in a columnar format and keeps it up-to-date. However, until recently this feature stored the data in a proprietary format and a hidden location that could not be accessed except via Azure Synapse and was subject to a number of restrictions around the types of data structures and query types that it could handle.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;But there is now a flexible and open solution to this problem! Microsoft has a feature in Azure Data Factory that enables users to replicate the Azure Cosmos DB Analytical Store into their lakehouse in Delta format, automatically inserting/updating/deleting records as the source transactional database changes. The incremental nature of this offers significant cost savings vs pulling data directly from the transactional store and dealing with complex incremental ingestion logic in code. In this article, I’ll show how we can leverage this feature to create a simple process to continuously ingest operational data in Azure Cosmos DB into Azure Databricks’ powerful analytics and AI platform.&amp;nbsp;&lt;/P&gt;
&lt;H2&gt;Architecture Overview&lt;/H2&gt;
&lt;P&gt;The architecture we’ll discuss in this article will use the CDC capability for Azure Cosmos DB within Azure Data Factory to process changes in an Azure Cosmos DB container and then merge them into a Delta Lake table in the lakehouse. See the diagram below:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;Azure Data Factory (ADF) will read a container from Azure Cosmos DB (via the analytical store) and periodically replicate any changes from that container into a Delta Lake table in Azure Databricks. This incremental replication process will operate on a schedule that is defined within ADF.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;There are a couple of possibilities for how we ingest these changes into Azure Databricks: we could move the data to a staging area, and ingest into Bronze from there using a workflow or Delta Live Tables, but for simplicity we’ll write directly to a table in the Bronze layer of our medallion architecture from ADF.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Once the data is in our Bronze layer standard Azure Databricks patterns can be used to cleanse and transform the data into Silver/Gold layers.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The bulk of the activities happen in Azure Data Factory, but there are some prerequisites. Before we can create the CDC pipeline it’s assumed the following already exist:&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;UL&gt;
&lt;LI aria-level="1"&gt;A Azure Cosmos DB for NoSQL container, with Analytical Store enabled.&amp;nbsp;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;Azure Data Factory instance in which to create a CDC pipeline.&amp;nbsp;&lt;/LI&gt;
&lt;LI aria-level="1"&gt;An ADLS storage container to act as our staging area.&amp;nbsp;&lt;/LI&gt;
&lt;/UL&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;With these in place, we can create the CDC pipeline from ADF.&amp;nbsp;&amp;nbsp;&lt;/P&gt;
&lt;H2&gt;Setting Up The Pipeline&lt;/H2&gt;
&lt;P&gt;The feature in ADF that consumes the Azure Cosmos DB changes is in the Data Flows area, so we start by launching the ADF studio and creating a new data flow:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The new data flow needs a Source and a Sink. The source will be our Azure Cosmos DB container and the Sink will be our Delta Table in Bronze.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;First we’ll create and configure the Source to consume from our Azure Cosmos DB container. Click to Add Source in the new Dataflow. In the source settings we have to set the Source Type to Inline and the Inline Dataset Type to Azure Cosmos DB for NoSQL. The Store Type should be set to Analytical.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;The Linked Service should be set to a linked service for Azure Cosmos DB that has been set up to connect to our source container. For details on how to create an ADF Linked Service see the &lt;A href="https://learn.microsoft.com/en-us/azure/data-factory/connector-azure-cosmos-db?tabs=data-factory#create-a-linked-service-to-azure-cosmos-db-using-ui" target="_blank" rel="noopener"&gt;getting started documentation for Azure Cosmos DB&lt;/A&gt;.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In the Source Options for the Data Flow, there are some settings that are important to control the behavior of the reads from the source feed.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;The Container name field is where we select the Azure Cosmos DB container we are interested in. In this example we have a container with some simple customer-related data.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;The Start from field allows us to synchronize ALL the data in the container from the start of its life, or to sync only changes from now on (or from a given timestamp).&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;You have the option to capture intermediate updates if you want to maintain a history of all the changes, but we are just going to capture the latest state, so this is left unselected. Capture Deletes ensures that items deleted from the source are also deleted in our Bronze table. Capture Transactional store TTLs means that if items are expired from the Azure Cosmos DB transactional store by the Time-To-Live function, they will also be deleted from our copy of the data. This is enabled by default, but many people may not want this behavior: TTL is often used to reduce the size of the transactional store at the cost of losing historical data, and in the analytics world that historical data is often important. We’ll leave it at the default for now.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Next we’ll add a Sink to publish the change data to. Click the + button next to the source icon and search for the Sink option.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We then need to configure the Sink to point to our Bronze table in the lakehouse.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;In the Sink settings we select our incoming source stream (there is only one in this case, the one we just created). We again select Inline for the Sink type, and the Inline dataset type is Delta. Once again the Linked service is an ADF linked service that points to a blob container/folder that will store our Bronze table. You can read the documentation for creating an ADF blob linked service (or ADLS, either will work) &lt;A href="https://learn.microsoft.com/en-us/azure/data-factory/connector-azure-blob-storage?tabs=data-factory#create-an-azure-blob-storage-linked-service-using-ui" target="_blank" rel="noopener"&gt;on this page&lt;/A&gt;.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Next, the Settings page for our Sink has some important options to control the behavior of the table we are creating.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;First we need to select the correct Folder path for the folder in the blob container that will store our Bronze table data. Here we have a simple folder called customer where ADF will put the Delta Lake files.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We also need to think about the Update method field. In this case we will allow Insert (to put new rows in the table as they are added in the source), Delete (to remove rows in the table as they are deleted in the source) and Update (updating existing rows to match changes in the source). To do this ADF needs a unique field in the source that it can match in the target table - so we select List of columns and put {_rid} in the column field. _rid is a system field in Azure Cosmos DB that uniquely identifies a data item.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;At this point we are actually ready to run this Data Flow to start syncing Azure Cosmos DB changes to our Bronze table. To do this we need to create a Pipeline in ADF to run the Data Flow defined above.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In the ADF studio resources section, under Pipelines create a new pipeline, and in that pipeline drag a single action onto the pipeline edit canvas - a Data Flow action.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Once we’ve created a pipeline with a Data Flow action, we will edit the Data Flow action settings to trigger the CDC Data Flow we created above. Here all we need to do is select our data flow in the Data Flow drop-down.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Then, like all ADF pipelines we need a trigger to start the pipeline and we’re ready to start ingesting data. From the pipeline editor menu select&amp;nbsp; Add Trigger and then New/Edit - this will bring up the trigger menu below.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We’ll set our trigger to run on creation and then run every 5 minutes after that. This means that every 5 minutes the pipeline will get the latest changes from Azure Cosmos DB and push them into our Bronze table.&lt;/P&gt;
&lt;H2&gt;Using The Target Table&lt;/H2&gt;
&lt;P&gt;With the pipeline running, we should start to see data flowing into our target Delta Lake table. I have created a simple customer data set for this example, with three items in the container. After the pipeline has run these items are pushed into a Delta Lake table in our target ADLS container.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;In a notebook in Azure Databricks, we can load that Delta Lake table and see its contents:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
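&lt;P&gt;The notebook cell shown above essentially reads the Delta files straight from the staging path. A minimal sketch, with placeholder storage account and container names, looks like this:&lt;/P&gt;
&lt;PRE&gt;
# Minimal sketch: read the replicated Delta files directly from ADLS.
# Storage account and container names are placeholders.
bronze_path = "abfss://bronze@mystorageaccount.dfs.core.windows.net/customer"

customer_df = spark.read.format("delta").load(bronze_path)
display(customer_df)
&lt;/PRE&gt;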
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;We can already access the data in the target Delta table from Azure Databricks. Each time the pipeline in ADF runs, it will update this table with whatever inserts/updates/deletes have happened in the source container.&amp;nbsp;&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;To really make the best use of this as a Bronze table in Azure Databricks, we’re going to create an external table in Unity Catalog to integrate this data with the rest of our UC resources and in this way make it securely accessible to all our Azure Databricks users.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;First in the Catalog view in Azure Databricks we create a new external location:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Then we configure the external location to point to our target ADLS folder.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;In the new external location dialog, we give the location a name, we select the storage credential that we’ll use to access the external container (in this case the managed identity that is assigned to my Azure Databricks workspace), and the URL to the actual storage container itself. Note that if you have not already done so you will have to ensure that the managed identity for your Azure Databricks workspace has been assigned the relevant permissions to access the storage container. For more information on configuring external locations in Azure Databricks &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/connect/unity-catalog/external-locations" target="_blank" rel="noopener"&gt;see this documentation&lt;/A&gt;.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;P&gt;Finally we can create an external table over our target storage container location so that we can access the table in UC. Inside an Azure Databricks notebook we can do this very simply:&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
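&lt;P&gt;The notebook cell above boils down to a single CREATE TABLE statement over the external location. A minimal sketch, with an assumed table name and placeholder storage path, looks like this:&lt;/P&gt;
&lt;PRE&gt;
# Minimal sketch: register the replicated Delta files as an external table in
# Unity Catalog. The table name and storage path are placeholder assumptions.
spark.sql("""
    CREATE TABLE IF NOT EXISTS cdc_catalog.cdc_demo.customer_bronze
    USING DELTA
    LOCATION 'abfss://bronze@mystorageaccount.dfs.core.windows.net/customer'
""")
&lt;/PRE&gt;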
&lt;P&gt;In the above example this creates the bronze table in the cdc_demo schema of my cdc_catalog catalog. Once this is done we can query this table like any other table in Unity Catalog, and view the data that’s being replicated from Azure Cosmos DB by our ADF pipeline. We can then continue to enrich, clean, and merge this data downstream using standard Azure Databricks processes, for example &lt;A href="https://learn.microsoft.com/en-us/azure/databricks/delta-live-tables/cdc" target="_blank" rel="noopener"&gt;as shown in the documentation here&lt;/A&gt;.&lt;/P&gt;
&lt;P&gt;&amp;nbsp;&lt;/P&gt;
&lt;img /&gt;
&lt;P&gt;So we can see that with a simple pipeline in ADF, we have created a robust way of opening up our Azure Cosmos DB transactional data to whatever complex analytical processes we want to use in Azure Databricks without reading the transactional data store itself, thus reducing cost and “noisy neighbor” risks.&amp;nbsp;&lt;/P&gt;</description>
      <pubDate>Thu, 21 Nov 2024 15:11:37 GMT</pubDate>
      <guid>https://techcommunity.microsoft.com/t5/azure-databricks/replicating-azure-cosmos-db-into-azure-databricks-using-cdc/ba-p/4302079</guid>
      <dc:creator>katiecummiskey</dc:creator>
      <dc:date>2024-11-21T15:11:37Z</dc:date>
    </item>
  </channel>
</rss>

