Querying multiple Log Analytics workspaces at once
Published Nov 06 2019 08:51 PM

Hello folks,


We’ve been in Orlando all week for Microsoft Ignite, and it has been a busy week.  Today, I met with a sysadmin who wanted to know the best option to query multiple Azure Log Analytics workspaces.


Here is the scenario he was looking at.


“Our company deploys a solution to different subscriptions, one per customer.  So a new customer is on-boarded by creating a new subscription, deploying the solution in it, and providing the new URL of the service we provide to the customer.”


They decided to do this to cleanly separate the billing per subscription.  Their first idea was to ingest all the data from all the Log Analytics workspaces into a “Master” workspace.




From there, they would write all the queries they need for their dashboards and alerts without having to run them in each workspace.  While this is possible, it’s not the most efficient way of doing it, and it could become costly because they would then be ingesting the data twice, and this would affect the pricing. (Remember that the first 5 GB of data ingested per month is free in the Pay-As-You-Go model.)


What we came up with was to start using cross-resource log queries.  This allows them to query not only across multiple Log Analytics workspaces, but also data from Application Insights in the same resource group, another resource group, or another subscription.
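To give you an idea of what a cross-resource query looks like, here is a minimal sketch that combines Heartbeat records from the current workspace with two others using the union operator.  The workspace names are placeholders, not the customer’s real resources:

```kusto
// Combine heartbeat data from the local workspace and two other
// workspaces (names below are illustrative placeholders).
union
    Heartbeat,
    workspace("customer-a-workspace").Heartbeat,
    workspace("customer-b-workspace").Heartbeat
| where TimeGenerated > ago(1h)
| summarize HeartbeatCount = count() by Computer
```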


This is acceptable to them, but if you’re considering this solution for yourself, remember that there are some limitations:


Cross-resource query limits (Excerpt from Docs.Microsoft.com)

  • The number of Application Insights resources and Log Analytics workspaces that you can include in a single query is limited to 100.
  • The cross-resource query is not supported in View Designer. You can author a query in Log Analytics and pin it to an Azure dashboard to visualize a log query.
  • Cross-resource query in log alerts is supported in the new scheduledQueryRules API. By default, Azure Monitor uses the legacy Log Analytics Alert API for creating new log alert rules from the Azure portal, unless you switch from the legacy Log Alerts API. After the switch, the new API becomes the default for new alert rules in the Azure portal, and it lets you create cross-resource query log alert rules. You can create cross-resource query log alert rules without making the switch by using the Azure Resource Manager template for the scheduledQueryRules API – but such an alert rule is manageable through the scheduledQueryRules API only, and not from the Azure portal.


For the sysadmin I was speaking with, those limitations were not an issue.  But this is a stop-gap measure until they can figure out a permanent solution.  (They really hope to have more than 100 customers…)


To query multiple workspaces, you need to reference each workspace in your query using a workspace identifier; for an app from Application Insights, use the app identifier.


The identifiers can be of multiple types:

  • Resource name or component name. The human-readable name of the resource. Since component names may not be unique, this is the least precise option.
  • Qualified name. The fully qualified name, in the format “subscriptionName/resourceGroup/componentName”.  Considering that component names may not be unique, this is a good option.
  • The Workspace ID. The unique identifier assigned to each workspace, represented as a globally unique identifier (GUID). This is a better option since it is unique, but in my opinion it can be confusing, since very few of us can actually remember a GUID.
  • The Azure Resource ID. The Azure-defined unique identity of the workspace. This is the best option since it’s unique and easy to recognize.
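As a sketch, here is the same workspace referenced each of those ways inside the workspace() expression.  All the names, the GUID, and the subscription placeholder are illustrative, not real values:

```kusto
// Resource name (may not be unique):
workspace("contosoretail").Update | count

// Qualified name ("subscriptionName/resourceGroup/componentName"):
workspace("Contoso/ContosoResourceGroup/contosoretail").Update | count

// Workspace ID (a GUID; this one is made up):
workspace("b438b4f6-912a-46d5-9cb1-b44069212abc").Update | count

// Azure Resource ID (<subscription-id> is a placeholder):
workspace("/subscriptions/<subscription-id>/resourcegroups/ContosoResourceGroup/providers/microsoft.operationalinsights/workspaces/contosoretail").Update | count
```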




If they wanted to query Application Insights instead of Log Analytics, the query would start with “app()” instead of “workspace().”
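For example, a query against an Application Insights app could look like the following sketch, where “fabrikamapp” is a placeholder app name:

```kusto
// Query requests from an Application Insights app instead of a
// Log Analytics workspace ("fabrikamapp" is a placeholder name).
app("fabrikamapp").requests
| where timestamp > ago(1h)
| summarize RequestCount = count() by resultCode
```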


In the end, all their queries will still need to be modified to add the proper cross-resource query information, like the example below.
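Here is a sketch of that kind of modification, using a simple Perf query and placeholder workspace names (not their actual customer workspaces):

```kusto
// Before: a query run separately in each customer workspace.
// Perf
// | where CounterName == "% Processor Time"
// | summarize avg(CounterValue) by Computer

// After: the same query spanning several customer workspaces at once.
union
    workspace("customer-a-workspace").Perf,
    workspace("customer-b-workspace").Perf
| where CounterName == "% Processor Time"
| summarize avg(CounterValue) by Computer, TenantId
```

Grouping by TenantId in the cross-workspace version keeps the results attributable to each customer workspace.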




You can find more info in the Azure Data Explorer reference here.  As for the query language, there is a detailed reference which you can find here, or you can visit this page for a tutorial.


In the end, they might still end up ingesting all the data into a master Log Analytics workspace. But that’s the subject of my next post.




Pierre Roman

