Forum Discussion
Skipping Records via Log Analytics API
- Feb 07, 2019
Ok, I get the scenario better now. Not sure I can give any more advice as I haven't done much data export to PowerBI, and I assume you are not using the default export option. My honest opinion is that Log Analytics is meant to be the end point for your data, or at least the point where you can summarize some results and send them to other services like PowerBI. I hope you find a solution to your problem.
Thanks Stanislav, and apologies for the delay in responding.
I seem to remember reading this now that you've highlighted it for me. However, I've seen so many contradictory examples of doing this (i.e. from when skip still seemed to be an acceptable option) that it's hard to keep up with the "current" revision.
To provide more information, I am looking at collecting roughly 50 million results to further analyse in PowerBI.
This is seemingly something that Log Analytics does not want/intend for me to do.
So I presume the ways forward would be:
A) Split the export into time-based segments (still pulling 50 million records!).
B) Store the logs somewhere else that does not have such restrictions/limits.
C) Do more filtering on the Log Analytics side and run multiple, smaller queries to get what I need.
I'm interested in what others have done for datasets this vast.
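For what it's worth, option A can be sketched without touching the API at all: generate non-overlapping time windows and run one query per window, filtering on TimeGenerated so no single result set exceeds the service limits. This is a minimal Python sketch, assuming a fixed window size; the `time_windows` helper and the KQL filter shown in the comment are illustrative, not part of any official SDK.

```python
from datetime import datetime, timedelta

def time_windows(start, end, step):
    """Yield (window_start, window_end) pairs covering [start, end)
    in non-overlapping chunks of at most `step`."""
    cur = start
    while cur < end:
        nxt = min(cur + step, end)
        yield (cur, nxt)
        cur = nxt

# Example: split one day into 6-hour segments. Each segment would be
# queried separately with a KQL time filter along the lines of:
#   | where TimeGenerated >= datetime(<start>) and TimeGenerated < datetime(<end>)
windows = list(time_windows(datetime(2019, 2, 1),
                            datetime(2019, 2, 2),
                            timedelta(hours=6)))
for w_start, w_end in windows:
    print(w_start.isoformat(), "->", w_end.isoformat())
```

Using half-open windows (>= start, < end) means the segments never overlap and never drop a record on the boundary, so the union of the per-window results is exactly the full export.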