Part 2: Migrate Azure Analysis Services to Power BI Premium using Azure Databricks - How-To
Published May 29 2024

This post is authored in conjunction with Leo Furlong, Senior Solutions Architect at Databricks.

 

Many customers choose to migrate their Azure Analysis Services (AAS) semantic models to Power BI Premium for the benefits and enhanced functionality described in the Power BI documentation. As customers migrate their semantic models to Power BI, native connections to Azure Databricks SQL become available through the built-in Databricks SQL connector in Power BI. Databricks SQL Serverless combined with Power BI semantic models offers a number of benefits: separation of compute and storage; instant, elastic compute; data refreshes charged by the second; and enterprise data warehouse functionality, query performance, and concurrency at any scale. The remainder of this article focuses on the ins and outs of how to accomplish this migration.

 

Requirements

 

Limitations

  • Per the documented limitations, AAS models migrated to Power BI Premium can't be edited using web authoring; however, as of this writing, web authoring on migrated models does appear to work without issues or errors 
  • AAS models migrated to Power BI can’t be downloaded to Power BI Desktop
  • Parallel connection configuration is only available in model Compatibility Level 1600+

 

Assumptions

  • Your data model in Databricks SQL is in the same structure as your current Azure Analysis Services model in terms of table/column names and column data types.  
  • You are using the same Power BI storage modes as the previous model. 

 

Migrate AAS Models to Power BI Premium

You can migrate your Azure Analysis Services models to Power BI Premium in two primary ways: using the migration utility built into the Power BI Service or using Tabular Editor projects. The purpose of this blog post is not to focus on the migration itself but on the post-migration conversion of the model to use Databricks SQL. For that reason, we'll only briefly describe the AAS migration to Power BI Premium below.

 

Migrate AAS Models using the Power BI Migration Utility 

Microsoft provides a utility in the Power BI Service for migrating Azure Analysis Services models to Power BI Premium. For instructions on how to use the migration utility, see the following documentation page.

  

Migrate AAS Models using Tabular Editor

Tabular models can also be migrated and deployed to Power BI Premium using Tabular Editor. 

 

  1. If you have a Visual Studio project, open the model.bim file using Tabular Editor and deploy it to Power BI Premium using the XMLA Endpoint. 
  2. If you don't have a Visual Studio project, open the Azure Analysis Services model directly using Tabular Editor and save it to a file. 
  3. Open and deploy the semantic model file to Power BI Premium using the XMLA Endpoint.

Instructions on how to perform these steps are in the Ongoing Maintenance and Deployment using Tabular Editor section below.

 

Converting your post-migration Power BI Premium Semantic Model to use Databricks SQL Serverless

After you've completed your migration to Power BI Premium, you'll want to convert your semantic model to use Databricks SQL. This requires altering the Power Query code that forms the bedrock of your semantic model tables. The sections below describe how to perform this conversion using Tabular Editor; this is the primary focus of this post, and we'll review the conversion steps in detail. These steps could easily be automated using the Tabular Object Model, but that will be the topic of a future post.

 

Update M Code using Tabular Editor

 

Tabular Editor 3 Flow

1) Post Azure Analysis Services migration to Power BI Premium, obtain the “Workspace connection” from the Power BI Workspace settings Premium tab. Instructions here.

 

2) After opening the Tabular Editor 3 application, go to File -> Open -> Model from DB…


3) In the Load Semantic Model from Database window, enter the workspace connection string obtained from step 1 into the Server field. Keep Authentication selected to Integrated and click OK.  


4) Authenticate to the Power BI Service using your Microsoft Entra ID credentials.

 

5) Select the row for the model you want to open and click OK.


6) Once the model is open, click the Model icon in the TOM Explorer window. In the Properties section, expand the Database section, change the Compatibility Level to 1600 or higher, and save the model.


Save the model by clicking the “save changes back to the currently connected database” button.


7) While still in the model properties, optionally change the parallelism settings for the model, which are explained in more detail in the Power BI documentation and PBI blog post. This step is recommended because Databricks SQL can handle query parallelism from Power BI well beyond the default configurations. Possible values are in the grid below, but your mileage may vary and you may need to test.

 

Make sure to save the model after the configuration change.

 

Model Properties for Parallelism

  • Data Source Default Max Connections: value between 1 and 30
  • Max Parallelism Per Query: value between 0 and 30
  • Max Parallelism Per Refresh: value between -1 and 30
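As a quick sanity check before deploying, the allowed ranges above can be encoded in a small validation helper. This is a Python sketch; the property names are written out informally to mirror the grid and are not an API.

```python
# Allowed ranges for the model parallelism properties, as listed in the
# grid above (verify against the Power BI documentation for your version).
PARALLELISM_RANGES = {
    "DataSourceDefaultMaxConnections": (1, 30),
    "MaxParallelismPerQuery": (0, 30),
    "MaxParallelismPerRefresh": (-1, 30),
}

def validate_parallelism(settings: dict) -> list:
    """Return (property, value) pairs that fall outside their allowed range."""
    errors = []
    for prop, value in settings.items():
        lo, hi = PARALLELISM_RANGES[prop]
        if not (lo <= value <= hi):
            errors.append((prop, value))
    return errors
```

For example, `validate_parallelism({"MaxParallelismPerRefresh": -2})` flags the value because -2 is below the minimum of -1.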

 


8) Create Shared Expressions (also known as Power BI Parameters in the UI) for the Databricks SQL connection information. For each expression, set the Kind property to M and set the expression values using the appropriate M formulas (examples below). Get the Server Hostname and HTTP Path for your SQL Warehouse using these steps. 

 


  • Server_hostname (Kind: M): "adb-5343834423590926.6.azuredatabricks.net" meta [IsParameterQuery=true, Type="Text", IsParameterQueryRequired=true]
  • HTTP_path (Kind: M): "/sql/1.0/warehouses/66d1c1444dg06346" meta [IsParameterQuery=true, Type="Text", IsParameterQueryRequired=true]
  • Catalog (Kind: M): "adventureworks" meta [IsParameterQuery=true, Type="Text", IsParameterQueryRequired=true]
  • Schema (Kind: M): "adventureworksdw" meta [IsParameterQuery=true, Type="Text", IsParameterQueryRequired=true]
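The expression bodies above all follow one pattern, so they can be generated from plain values. The following is a minimal Python sketch; the helper name is hypothetical and the example values are the same placeholders used above, so substitute your own workspace details.

```python
def shared_expression(value: str) -> str:
    """Build the M body for a required text parameter (shared expression)."""
    return (f'"{value}" meta [IsParameterQuery=true, Type="Text", '
            'IsParameterQueryRequired=true]')

# Placeholder connection details; replace with your own SQL Warehouse values.
expressions = {
    "Server_hostname": shared_expression("adb-5343834423590926.6.azuredatabricks.net"),
    "HTTP_path": shared_expression("/sql/1.0/warehouses/66d1c1444dg06346"),
    "Catalog": shared_expression("adventureworks"),
    "Schema": shared_expression("adventureworksdw"),
}
```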

 


9) For each semantic model table and each partition in the table, change the M expression and Mode to their appropriate values. An example M expression that references the created shared expressions and leverages the native Databricks SQL connector in Power BI is shown below. For Mode, choose the correct Power BI storage mode for your semantic model table/partition based on your use case.

 

Save the model after each table/partition modification.  

 


M Expression Example for the Databricks SQL Connector using Expressions

let
    Source = Databricks.Catalogs(Server_hostname, HTTP_path, null),
    Database = Source{[Name=Catalog,Kind="Database"]}[Data],
    Schema = Database{[Name=Schema,Kind="Schema"]}[Data],
    Data = Schema{[Name="<your table or view name>",Kind="Table"]}[Data]
in
    Data
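If your model has many tables, the same M expression can be templated per table rather than edited by hand. A minimal Python sketch follows; the function name is hypothetical, and only the table name changes between partitions.

```python
# Template for the per-partition M expression shown above; double braces
# are literal braces in the generated M code.
TABLE_M_TEMPLATE = """let
    Source = Databricks.Catalogs(Server_hostname, HTTP_path, null),
    Database = Source{{[Name=Catalog,Kind="Database"]}}[Data],
    Schema = Database{{[Name=Schema,Kind="Schema"]}}[Data],
    Data = Schema{{[Name="{table}",Kind="Table"]}}[Data]
in
    Data"""

def partition_expression(table_name: str) -> str:
    """Return the M expression for one semantic model table/partition."""
    return TABLE_M_TEMPLATE.format(table=table_name)
```

For example, `partition_expression("DimCustomer")` yields the expression above with DimCustomer substituted for the table name.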

 

10) Navigate to the Power BI Workspace and open Settings for the migrated semantic model using the … menu. Click Edit and configure the data source credentials, using the following documentation for reference.

 

11) Navigate back to Tabular Editor and right-click each table in the semantic model to select "Update table schema…". If there are any metadata changes between the old and new data sources, Tabular Editor will fix the table definitions. Save the model.

 


Example of TE3 detecting a new column added to the table in Databricks SQL.  


12) Delete the old data sources under the Data Sources folder in Tabular Editor. Save the model. Note that Databricks SQL will not show up as a data source in this folder.

 

Using Tabular Editor 2 and Limitations

All UI options and screens in Tabular Editor 2 are similar to the Tabular Editor 3 steps above. Note that Tabular Editor 2 doesn't support automatic table metadata updates for Power Query data sources (new column data types, added or dropped columns, etc.). For information on how to update metadata manually, see the TE2 docs on Power Query data sources.

 

Ongoing Maintenance and Deployment using Tabular Editor

Post-migration, both one-time and ongoing maintenance of your semantic model should be performed using Tabular Editor. Developers and organizations can choose to manage the lifecycle of their semantic model projects in their chosen Git repository tool. Development teams can work with their DevOps teams to implement CI/CD workflows within their DevOps tooling as required.

 

Manual Deployment

Tabular Editor provides a Deployment wizard via the UI for manual deployments.

 

1) Obtain the “Workspace connection” from the Power BI Workspace settings Premium tab. Instructions here.

 

2) Open your semantic model project from File -> Open -> From File…


3) After making changes to your semantic model that you’d like to deploy, enter into the Deployment wizard from Model -> Deploy…


4) Enter the workspace connection (XMLA endpoint) string from your Power BI Workspace. Select Azure AD login authentication and click Next >. Authenticate using your Microsoft Entra ID credentials and MFA.


5) Select the existing Database you want to deploy into and Click Next >.


6) Select your deployment options and click Next >.


7) In the final step, you can choose to deploy directly from the UI by clicking Deploy. You can also export your deployment to a TMSL Script and execute it from SQL Server Management Studio or another compatible IDE.  
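For orientation, an exported TMSL script wraps the full model definition in a createOrReplace envelope. The sketch below builds that envelope in Python; the database name is a placeholder, and the empty model stub stands in for the complete model definition that a real export embeds.

```python
import json

def tmsl_create_or_replace(database_name: str, model: dict) -> str:
    """Build a minimal TMSL createOrReplace script as a JSON string.

    A real export from Tabular Editor embeds the full model definition
    (tables, partitions, expressions) where the model stub appears here.
    """
    script = {
        "createOrReplace": {
            "object": {"database": database_name},
            "database": {
                "name": database_name,
                "compatibilityLevel": 1600,
                "model": model,
            },
        }
    }
    return json.dumps(script, indent=2)
```

The resulting script can be executed against the XMLA endpoint from SQL Server Management Studio.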


8) If you deploy from the UI, a message will be displayed in the bottom left-hand corner of Tabular Editor if the deployment was successful.


All UI options and screens in Tabular Editor 2 will be similar to the Tabular Editor 3 steps above. 

 

Automated Deployment and CI/CD Integration 

Tabular Editor provides automated deployment options, including CI/CD integration with DevOps tools like Azure DevOps or GitHub Actions via command-line deployments. Please refer to the following blog posts and GitHub Repo for more information and examples of these deployment options. 
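As a rough illustration, a Tabular Editor 2 command-line deployment can be assembled in a pipeline step. The Python sketch below only builds the command; the switch names (-D, -O, -C, -P, -R, -M) are taken from the Tabular Editor 2 command-line documentation as best understood, so verify them against your installed version before use.

```python
def te2_deploy_command(bim_path: str, xmla_endpoint: str, database: str) -> list:
    """Assemble a Tabular Editor 2 CLI deployment command (switches assumed
    from the TE2 docs; confirm for your version before running)."""
    return [
        "TabularEditor.exe",
        bim_path,
        "-D", xmla_endpoint, database,
        "-O",  # allow overwriting an existing database
        "-C",  # deploy connections / data sources
        "-P",  # deploy partitions
        "-R",  # deploy roles
        "-M",  # deploy role members
    ]
```

In a pipeline, the returned list could be passed to `subprocess.run` after authentication is configured for the XMLA endpoint.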
