Azure Observability Blog

View and query Log Analytics with Kibana and Azure Data Explorer

OREN_SALZBERG
Microsoft
Jan 28, 2021

View and query Log Analytics in a Kibana dashboard using Azure Data Explorer

This experience enables you to query Azure Log Analytics from Kibana, using the Azure Data Explorer and Kibana integration together with the cross-service query capability between Azure Data Explorer and Azure Log Analytics (see more info here), so you can join and analyze all your data in one place.
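
For example, a cross-service query run from Azure Data Explorer can join an Azure Data Explorer table with a Log Analytics table through the ade.loganalytics.io proxy. The sketch below is illustrative only: MyAdxTable is a hypothetical ADX table assumed to have a Computer column, Heartbeat is a standard Log Analytics table, and the <subscription-id>, <resource-group-name>, and <workspace-name> placeholders must be replaced with your own values.

// A minimal cross-service join sketch; MyAdxTable is a hypothetical ADX table
// with a Computer column, and Heartbeat is a standard Log Analytics table.
MyAdxTable
| join kind=inner (
    cluster('https://ade.loganalytics.io/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.operationalinsights/workspaces/<workspace-name>').database('<workspace-name>').Heartbeat
    | project Computer, TimeGenerated
) on Computer
| take 10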

 

 

Follow the instructions to set up the integration between your Kibana instance and your Azure Data Explorer cluster, then create a function in Azure Data Explorer with the following pattern:

 

cluster('https://ade.loganalytics.io/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.operationalinsights/workspaces/<workspace-name>').database('<workspace-name>').<tablename>
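
For example, you could wrap that pattern in a stored function so the integration can surface it to Kibana. This is a sketch only: the function name LogAnalytics_Heartbeat and the Heartbeat table are illustrative choices, and the placeholders must be replaced with your own subscription, resource group, and workspace values.

.create-or-alter function with (docstring = 'Log Analytics Heartbeat table', folder = 'LogAnalytics')
LogAnalytics_Heartbeat() {
    cluster('https://ade.loganalytics.io/subscriptions/<subscription-id>/resourcegroups/<resource-group-name>/providers/microsoft.operationalinsights/workspaces/<workspace-name>').database('<workspace-name>').Heartbeat
}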

 

 

The function is now visible in Kibana, and you can configure an index pattern and query your Log Analytics data.
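
As a quick sanity check (a sketch, reusing the illustrative LogAnalytics_Heartbeat function from above), you can run the function in the Azure Data Explorer web UI before creating the index pattern; if it returns rows, Kibana should be able to query it as well.

// Run against the Azure Data Explorer database the Kibana integration points to.
LogAnalytics_Heartbeat()
| where TimeGenerated > ago(1h)
| take 10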

 

Note that the ability to query Log Analytics from Azure Data Explorer is in public preview. For any questions, please contact the ADXProxy team.

Updated Apr 08, 2022
Version 2.0
  • Hi marco_italy and zwang716,

    You should grant the service principal access to the Log Analytics workspaces.
    To do so, go to your Log Analytics resource in the Azure portal.
    Click Access control (IAM), then Add > "Add role assignment."
    Set the Role to Reader.
    In the "Select" dropdown, select the service principal created earlier, and make sure it appears under Selected members.
    Click Save.

  • marco_italy
    Copper Contributor

    Hi,

    thanks for the article.

    I've created the Data Explorer cluster, the function to connect to the Log Analytics cluster, the service principal to grant access to Elasticsearch, and successfully deployed the containers on my Azure AKS cluster.

     

    Kibana can see the Data Explorer cluster and is also listing the functions that I have created to map the Log Analytics tables.

     

    I am stuck at the point where Kibana can't find the time field, even though the table I am trying to import does contain it with the correct data type (datetime).

     

    I have also tried to import the index definition manually through a JSON file:

    [
      {
        "_id": "kubeventsindex",
        "_type": "index-pattern",
        "_source": {
          "title": "DBNAME:LogAnalytics_KubeEvents",
          "timeFieldName": "TimeGenerated",
          "fields": "[{\"name\":\"Message\",\"type\":\"string\",\"count\":0,\"scripted\":false,\"searchable\":true,\"aggregatable\":true,\"readFromDocValues\":true},{\"name\":\"TimeGenerated\",\"type\":\"date\",\"count\":0,\"scripted\":false,\"searchable\":true,\"aggregatable\":true,\"readFromDocValues\":true}]"
        }
      }
    ]

    Kibana creates the index correctly

     

    but then the query crashes with this message:

     

    It looks like a missing authorization, but I can't see where to add it.

    Do I also need to add the Log Analytics scope to the service principal I use to access Data Explorer?

     

    thanks

    Marco

  • zwang716
    Copper Contributor

    Hi OREN_SALZBERG and @Noa Kuperberg,

     

    First of all, this is a great article. I really appreciate that you have taken the time to do the testing and post the information.

    I've created a Log Analytics workspace (it contains data) and a Data Explorer cluster, and deployed K2Bridge and the Kibana instances. Unfortunately, I'm having issues seeing data in Kibana or new indices in K2Bridge. I do see the data in Data Explorer.

     

    I wonder what the culprit could be. Would this solution be supported by Microsoft support?

     

    Thanks