Latest Discussions
How to use multiple sliding_window_counts in a single query
Hi Team, I am using a query like the one below:

```kql
let Start = startofday(ago(60d));
let End = startofday(now());
let window = 30d; // MAU 30d, WAU 7d, DAU 1d
let bin = 1d;
customEvents
| where timestamp >= Start and timestamp <= End
| project user_Id, timestamp
| evaluate sliding_window_counts(user_Id, timestamp, Start, End, window, bin)
| project timestamp, mau = Dcount
| render timechart
```

By changing the `window` value, this query gives monthly active users (30d), weekly active users (7d), or daily active users (1d). I want to plot all three resulting graphs in a single chart. Is that possible?

Solved · ankasani · Feb 15, 2023 · Microsoft · 925 Views · 0 likes · 2 Comments
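One possible approach to the multi-window question above (a sketch, not verified against Application Insights): wrap the query in a let-bound function parameterized by window size and series label, then union the three invocations so all series render on one timechart. The `activeUsers`, `ActiveUsers`, and `Label` names are mine; if your cluster does not allow the plugin inside a let-bound function, inline the three queries and union them directly.

```kql
let Start = startofday(ago(60d));
let End = startofday(now());
let bin = 1d;
// One series per window size; Label distinguishes the lines on the chart.
let activeUsers = (window: timespan, label: string) {
    customEvents
    | where timestamp >= Start and timestamp <= End
    | project user_Id, timestamp
    | evaluate sliding_window_counts(user_Id, timestamp, Start, End, window, bin)
    | project timestamp, ActiveUsers = Dcount, Label = label
};
union activeUsers(30d, "MAU"), activeUsers(7d, "WAU"), activeUsers(1d, "DAU")
| render timechart
```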
Azure AD User attributes from KQL

Is it possible to query Azure AD using KQL? For example, to fetch group membership or other user attributes within a KQL query.

Solved · ratoor · Nov 10, 2022 · Copper Contributor · 6.1K Views · 0 likes · 1 Comment
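On the Azure AD question: KQL itself cannot call Microsoft Graph, but if the workspace is a Microsoft Sentinel workspace with UEBA enabled, Azure AD user attributes (including group membership) are synchronized into the `IdentityInfo` table and can be queried directly. A minimal sketch, assuming that table is present (the UPN is a placeholder):

```kql
// Latest synchronized AAD attributes for one user.
IdentityInfo
| where AccountUPN =~ "user@contoso.com"
| summarize arg_max(TimeGenerated, *) by AccountUPN
| project AccountUPN, AccountDisplayName, Department, GroupMembership, AssignedRoles
```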
Kusto: Complex query for streaming IoT data

Hi, I have spent hours trying to solve a query in Kusto; I think this requires an expert. I have a simplified IoT dataset for the example. Suppose there are two physical locations (Warehouse1 and Warehouse2), each with a temperature and humidity sensor. In addition, there is a pushbutton that, when pressed, sends a signal with the value "1". I need to know the last temperature and humidity values received at the time the button is pressed, differentiated by location. This is an example of the dataset:

```kql
let iotdata = datatable(location: string, sensorName: string, value: real, timestamp: datetime)
[
    // Warehouse1 - Temperature
    "Warehouse1", "Temperature", 19.1, "2022-09-17 08:01:32",
    "Warehouse1", "Temperature", 19.3, "2022-09-17 08:20:32",
    "Warehouse1", "Temperature", 19.5, "2022-09-17 08:28:32",
    "Warehouse1", "Temperature", 19.7, "2022-09-17 08:45:32",
    "Warehouse1", "Temperature", 20.2, "2022-09-17 09:01:25",
    "Warehouse1", "Temperature", 20.2, "2022-09-17 09:15:32",
    "Warehouse1", "Temperature", 20.3, "2022-09-17 09:20:17",
    "Warehouse1", "Temperature", 20.4, "2022-09-17 09:22:12",
    "Warehouse1", "Temperature", 20.7, "2022-09-17 09:45:32",
    "Warehouse1", "Temperature", 20.6, "2022-09-17 09:46:32",
    "Warehouse1", "Temperature", 20.7, "2022-09-17 09:57:32",
    "Warehouse1", "Temperature", 21.1, "2022-09-17 10:01:32",
    "Warehouse1", "Temperature", 21.2, "2022-09-17 10:10:32",
    // Warehouse1 - Humidity
    "Warehouse1", "Humidity", 60, "2022-09-17 08:01:34",
    "Warehouse1", "Humidity", 59, "2022-09-17 08:20:34",
    "Warehouse1", "Humidity", 59, "2022-09-17 08:28:34",
    "Warehouse1", "Humidity", 58, "2022-09-17 08:45:34",
    "Warehouse1", "Humidity", 58, "2022-09-17 09:01:27",
    "Warehouse1", "Humidity", 57, "2022-09-17 09:15:34",
    "Warehouse1", "Humidity", 57, "2022-09-17 09:20:19",
    "Warehouse1", "Humidity", 56, "2022-09-17 09:22:14",
    "Warehouse1", "Humidity", 55, "2022-09-17 09:45:34",
    "Warehouse1", "Humidity", 54, "2022-09-17 09:46:34",
    "Warehouse1", "Humidity", 53, "2022-09-17 09:57:34",
    "Warehouse1", "Humidity", 52, "2022-09-17 10:01:34",
    "Warehouse1", "Humidity", 51, "2022-09-17 10:10:34",
    // Warehouse1 - Button
    "Warehouse1", "Button", 0, "2022-09-17 7:10:34",
    "Warehouse1", "Button", 1, "2022-09-17 9:00:01",
    "Warehouse1", "Button", 0, "2022-09-17 9:00:03",
    "Warehouse1", "Button", 1, "2022-09-17 10:00:01",
    // Warehouse2 - Temperature
    "Warehouse2", "Temperature", 19.0, "2022-09-17 08:00:32",
    "Warehouse2", "Temperature", 19.1, "2022-09-17 08:19:32",
    "Warehouse2", "Temperature", 19.2, "2022-09-17 08:27:32",
    "Warehouse2", "Temperature", 19.5, "2022-09-17 08:46:32",
    "Warehouse2", "Temperature", 20.0, "2022-09-17 09:05:25",
    "Warehouse2", "Temperature", 20.1, "2022-09-17 09:13:32",
    "Warehouse2", "Temperature", 20.2, "2022-09-17 09:21:17",
    "Warehouse2", "Temperature", 20.3, "2022-09-17 09:23:12",
    "Warehouse2", "Temperature", 20.4, "2022-09-17 09:46:32",
    "Warehouse2", "Temperature", 20.5, "2022-09-17 09:47:32",
    "Warehouse2", "Temperature", 20.5, "2022-09-17 09:58:32",
    "Warehouse2", "Temperature", 20.8, "2022-09-17 10:02:32",
    "Warehouse2", "Temperature", 21.2, "2022-09-17 10:11:32",
    // Warehouse2 - Humidity
    "Warehouse2", "Humidity", 61, "2022-09-17 08:01:34",
    "Warehouse2", "Humidity", 59, "2022-09-17 08:20:34",
    "Warehouse2", "Humidity", 58, "2022-09-17 08:28:34",
    "Warehouse2", "Humidity", 59, "2022-09-17 08:45:34",
    "Warehouse2", "Humidity", 58, "2022-09-17 09:01:27",
    "Warehouse2", "Humidity", 57, "2022-09-17 09:15:34",
    "Warehouse2", "Humidity", 56, "2022-09-17 09:20:19",
    "Warehouse2", "Humidity", 56, "2022-09-17 09:22:14",
    "Warehouse2", "Humidity", 54, "2022-09-17 09:45:34",
    "Warehouse2", "Humidity", 53, "2022-09-17 09:46:34",
    "Warehouse2", "Humidity", 52, "2022-09-17 09:57:34",
    "Warehouse2", "Humidity", 51, "2022-09-17 10:01:34",
    "Warehouse2", "Humidity", 50, "2022-09-17 10:10:34",
    // Warehouse2 - Button
    "Warehouse2", "Button", 0, "2022-09-17 7:10:34",
    "Warehouse2", "Button", 1, "2022-09-17 9:00:02",
    "Warehouse2", "Button", 0, "2022-09-17 9:00:03",
    "Warehouse2", "Button", 1, "2022-09-17 10:00:01",
]
```

Thank you in advance to whoever manages to solve this challenge.

Solved · amolinamartinez · Sep 17, 2022 · Copper Contributor · 1.2K Views · 0 likes · 2 Comments
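One way to approach the button-press question (a sketch against the `iotdata` sample; run it after the `let iotdata = datatable(...)` definition): take each press, join it with the sensor readings for the same location, keep only readings at or before the press, and pick the latest reading per sensor with `arg_max`. The `presses` and `pressTime` names are mine.

```kql
let presses =
    iotdata
    | where sensorName == "Button" and value == 1
    | project location, pressTime = timestamp;
presses
| join kind=inner (
    iotdata
    | where sensorName in ("Temperature", "Humidity")
  ) on location
| where timestamp <= pressTime
// Latest reading of each sensor type before each press, per location.
| summarize arg_max(timestamp, value) by location, pressTime, sensorName
| order by location asc, pressTime asc, sensorName asc
```

For large tables, a time-window join (bounding how far back a reading may be) would be cheaper than this unconstrained join, but the shape of the query is the same.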
KQL - Convert one object with different key value pairs to a list

Hi! I have the following KQL:

```kql
tableName
| where TimeGenerated > ago(1h)
| project-rename State=State_s, Status=Status_s, startTime=startTime_t, itemsFailed=itemsFailed_d, itemsProcessed=itemsProcessed_d, name=name_s, durationInSec=durationInSec_d
| project TimeGenerated, Status, startTime, State, itemsFailed, itemsProcessed, durationInSec, name
| where itemsProcessed != 0
| summarize AvgItemsProccessed = round(avg(itemsProcessed)), MinItemsProccessed = round(min(itemsProcessed)), MaxItemsProccessed = round(max(itemsProcessed))
```

It returns a single row with three columns. How can I get a list instead? I would like something like:

AvgItemsProcessed 17,369
MinItemsProcessed 16,900
MaxItemsProcessed 17,861

I experimented with make_list and mv_expand, but with no luck. Thanks in advance! Ruben

Solved · RubenJZ · Aug 10, 2022 · Copper Contributor · 2K Views · 0 likes · 2 Comments
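One idiom that can pivot a single wide row into name/value rows is `pack_all()` plus `mv-expand` (a sketch; the `datatable` stands in for the summarize output above, and the `Pairs`/`Metric`/`Value` names are mine):

```kql
datatable(AvgItemsProccessed: real, MinItemsProccessed: real, MaxItemsProccessed: real)
[ 17369, 16900, 17861 ]
| extend Pairs = pack_all()          // one bag holding every column of the row
| mv-expand Pairs                    // expands into one single-key bag per row
| extend Metric = tostring(bag_keys(Pairs)[0])
| extend Value = todouble(Pairs[Metric])
| project Metric, Value
```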
Tooling for managing ADX deployment

Is there tooling planned for managing ADX deployments, as there is for Azure SQL Database? I am not talking about deploying the ADX Azure resource, but about deploying new versions of a function, updating a mapping, and so on. For Azure SQL Database there is Microsoft and third-party tooling for state-based or migration-based deployment (using sqlproj, for instance); I did not find anything equivalent for ADX. I have seen there is an Azure DevOps task to deploy KQL scripts, but that's all, and it implies that we must take care of making the KQL scripts idempotent to avoid mistakes when the pipeline is replayed for another deployment. How do you advise handling an ADX project: store .kql files describing the "structure" of the database in a git repository and use an Azure DevOps pipeline to deploy them regularly? Do you have samples of how you handle this?

Solved · alexandrenedelec · Feb 01, 2021 · Brass Contributor · 720 Views · 0 likes · 1 Comment
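Pending first-class tooling, one common pattern for the replayable-pipeline concern above is to keep .kql scripts in git that use only idempotent control commands, so re-running the Azure DevOps task is safe. A sketch (table, function, and mapping names are illustrative):

```kql
// Safe to re-run: creates the table if missing, merges new columns if present.
.create-merge table Errors (Timestamp: datetime, Message: string)

// Safe to re-run: creates or overwrites, never fails with "already exists".
.create-or-alter function with (folder = "Deploy", docstring = "Daily error rollup") DailyErrors() {
    Errors
    | summarize Count = count() by bin(Timestamp, 1d)
}

.create-or-alter table Errors ingestion json mapping "ErrorsMapping"
  '[{"column":"Timestamp","path":"$.ts"},{"column":"Message","path":"$.msg"}]'
```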
Update Existing data in ADX

Hello, I'm new to the community and to Azure Data Explorer. I'm trying to find a way to update existing records in a Kusto table. I came across some blog posts suggesting that it is not possible. Did anyone find a way around this problem? Best, Prateek

Solved · PrateekNCR · Jan 28, 2021 · Copper Contributor · 3.5K Views · 0 likes · 1 Comment
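ADX tables are append-only by design, so in-place updates are traditionally emulated by appending a new version of the record and reading only the latest one per key. A sketch of the read side (`MyTable` and `RecordId` are illustrative; `ingestion_time()` relies on the IngestionTime policy, which is enabled by default):

```kql
// Append-only "update": ingest the new version of the record,
// then always read the most recently ingested row per key.
MyTable
| summarize arg_max(ingestion_time(), *) by RecordId
```

A materialized view over the same `arg_max` expression can precompute this latest-version view instead of paying for it at query time.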
Why doesn't datetime(tostring(now())) work?

I ask because I am using parameters like

```kql
let start_time = datetime({startTime});
let end_time = datetime({endTime});
```

and since datetime(datetime) is not supported, so I cannot pass, say, startTime=now(), I tried passing tostring(now()), but that doesn't work either. What is the proper way to pass a parameter and construct a datetime out of the parameter value when I am testing in Kusto Explorer and want to drive my query manually? The query must remain as-is, accepting the datetime({startTime}) style, since that pattern is used throughout our team's parameterized queries in dashboards, etc.

Solved · dudeed · Dec 16, 2020 · Former Employee · 997 Views · 0 likes · 1 Comment
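The likely cause: `datetime(...)` is a literal syntax, so its argument must be a constant, never an expression such as `tostring(now())`; `todatetime()` is the run-time conversion function. When driving the query manually, substituting an ISO 8601 constant for the dashboard parameter keeps the `datetime({startTime})` pattern intact. A sketch (timestamps are illustrative):

```kql
// Manual run: substitute literal timestamps for the dashboard parameters.
let start_time = datetime(2020-12-15T00:00:00Z);   // stands in for {startTime}
let end_time   = datetime(2020-12-16T00:00:00Z);   // stands in for {endTime}
// For computed values, use a run-time expression instead of the literal form:
let relative_start = now(-1d);
print start_time, end_time, relative_start
```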
How to configure continuous export of data from Azure Data Explorer to Azure Data Lake v1

Hi, is it possible to configure continuous export of data from Azure Data Explorer to Azure Data Lake v1 in the background? For example, I have created an external table and a continuous export:

```kql
.create external table ExternalTableADL01 (name:string, age:int, [date]:datetime)
kind=adl
dataformat=csv
(
    h@'adl://xxxxx.azuredatalakestore.net/folder;impersonate'
)

.create-or-alter continuous-export ContinuousExportDemo01
to table ExternalTableADL01
with (intervalBetweenRuns=5m)
<| TestCursor
```

But I get this error:

An admin command cannot be executed due to an invalid state: State='External table 'ExternalTableADL01' cannot be used for continuous export as it uses impersonate authentication type'

Based on the official docs (https://docs.microsoft.com/en-us/azure/data-explorer/kusto/api/connection-strings/storage) there is only one alternative: append ;token=AadToken to the URI, with AadToken being a base-64 encoded AAD access token (making sure the token is for the resource https://management.azure.com/). But that is effectively interactive, because the token expires.

Solved · alexander tikhomirov · Sep 15, 2020 · Brass Contributor · 2.6K Views · 0 likes · 2 Comments
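Continuous export runs unattended, so it cannot use impersonation; the external table needs a non-interactive credential embedded in its connection string. One workaround is a sketch under the assumption that exporting to Azure Blob storage with a storage account key (rather than ADLS Gen1 with impersonation) is acceptable; the account, container, and key are placeholders:

```kql
.create external table ExternalTableBlob01 (name: string, age: int, [date]: datetime)
kind=blob
dataformat=csv
(
    // ';<account key>' appends a non-interactive credential to the URI.
    h@'https://mystorageaccount.blob.core.windows.net/export;<account key>'
)

.create-or-alter continuous-export ContinuousExportDemo01
to table ExternalTableBlob01
with (intervalBetweenRuns=5m)
<| TestCursor
```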
Tags
- Azure Data Explorer (Kusto) (66 Topics)
- Kusto language (36 Topics)
- AMA (16 Topics)
- Ingestion (8 Topics)
- Azure Data Explorer (6 Topics)
- announcements (5 Topics)
- Azure Data Explorer AMA (5 Topics)
- microsoft fabric (5 Topics)
- Cluster Management (4 Topics)