SOLVED

Skipping Records via Log Analytics API

Copper Contributor

Good Morning,


I've been working with Power BI and Log Analytics to visualise the data that we are storing.

I'd finally got Power BI doing everything I need it to when I realised that there appears to be no "skip" operator to let me paginate through the records.


I believe this function used to exist, certainly in OMS, so I'm perplexed that it has been left out of Log Analytics.


Is there an alternative operator that I could use to do the same thing, or some workaround to perform record skipping?


Thanks for any and all assistance!


Chris

3 Replies

Hi,

I am not sure exactly what your scenario is, but the official answer on this is in the take operator documentation:

https://docs.microsoft.com/en-us/azure/kusto/query/takeoperator

A note on paging through a large resultset (or: the lack of a skip operator)

Kusto does not support the complementary skip operator. This is intentional, as take and skip together are mainly used for thin client paging, and have a major performance impact on the service. Application builders that want to support result paging are advised to query for several pages of data (say, 10,000 records at a time) and then display a page of data at a time to the user.


With that said, paging is still possible, but you will have to use take and a slightly different approach. It is probably also good to sort on a time column and take 10,000 records. If 10,000 records are returned, get the time of the last record and shorten the time range for the next query. If fewer than 10,000 records are returned, you've got all the records for that time frame.
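A minimal sketch of that pattern in KQL (the table name, time window, boundary timestamp, and page size here are illustrative placeholders, not values from this thread):

// Page 1: sort newest-first over the full time range and take one page
MyTable_CL
| where TimeGenerated between (datetime(2018-06-01) .. datetime(2018-07-01))
| order by TimeGenerated desc
| take 10000

// Page 2: move the upper bound to the oldest TimeGenerated returned by
// page 1, then repeat until a query returns fewer than 10,000 rows
MyTable_CL
| where TimeGenerated between (datetime(2018-06-01) .. datetime(2018-06-14T09:30:00Z))
| order by TimeGenerated desc
| take 10000

One caveat: rows that share the exact boundary TimeGenerated can be duplicated (with an inclusive bound, as above) or skipped (with a strict one), so it is worth de-duplicating on the client side.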

Thanks Stanislav, and apologies for the delay in responding.


I seem to remember reading this now that you've highlighted it for me. However, I've seen so many contradictory examples of doing this (i.e., from when skip still seemed to be an acceptable option) that it's hard to keep up with the "current" revision.


To provide more information, I am looking at collecting roughly 50 million results to analyse further in Power BI.

This is seemingly something that Log Analytics does not want or intend for me to do.


So I presume the ways forward would be:

A) Look into splitting this down into time-based segments (still pulling 50 million records!), as sketched after this list,

B) Store the logs somewhere else that does not have such restrictions/limits,

C) Do more filtering on the Log Analytics side and run multiple (smaller) queries to get what I want.
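For option A, a rough sketch of one fixed segment's query in KQL (the table name and dates are assumptions for illustration):

// One fixed segment, here a single day; step the window forward one
// segment per query until the whole range is covered
MyTable_CL
| where TimeGenerated >= datetime(2018-06-01) and TimeGenerated < datetime(2018-06-02)
| order by TimeGenerated asc

Using half-open windows (>= start, < end) means consecutive segments neither overlap nor drop rows that fall exactly on a boundary.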


I'm interested in what others have done for datasets this vast.

best response confirmed by Chris Tout (Copper Contributor)
Solution

OK, I get the scenario better now. I'm not sure I can give much more advice, as I haven't done much data export to Power BI, and I would assume you are not using the default export option. My honest opinion is that Log Analytics is meant to be the end point for your data, or at least the point where you can summarize some results and send them to other services like Power BI. I hope you find a solution to your problem.
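As a hedged illustration of that summarize-first idea (the table name, grouping column, and bin size are assumptions, not something specified in this thread), the aggregation might look like:

// Export hourly counts per computer rather than tens of millions of raw rows
MyTable_CL
| summarize RecordCount = count() by bin(TimeGenerated, 1h), Computer

A pre-aggregated result like this is usually small enough for Power BI to refresh comfortably, while the raw records stay in Log Analytics.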
