Latest Discussions
Unable to retrieve query data using Log Analytics API
I have been trying to query Azure Log Analytics (KQL) data through the Log Analytics REST API. The request succeeds with a 200 response, but I only get the table headers back, with no rows of data. Does anyone know how to resolve this?

Code snippet:

import os
import requests
import urllib3
import certifi
from azure.identity import DefaultAzureCredential
from datetime import datetime, timedelta, timezone

os.environ["REQUESTS_CA_BUNDLE"] = certifi.where()
verify_cert = certifi.where()

credential = DefaultAzureCredential()

# Set the start and end time for the query
end_time = datetime.now(timezone.utc)
start_time = end_time - timedelta(hours=6)

# Set the query string
query = '''
KubePodInventory
| take 5
'''

# Set the workspace ID
workspace_id = "XXXXXXXXXXXXXXXXXXXXXXXX"

# Set the API endpoint
api_endpoint = f"https://api.loganalytics.io/v1/workspaces/{workspace_id}/query"

# Set the request payload. These datetimes are timezone-aware, so
# isoformat() already ends in "+00:00"; appending an extra "Z" would
# make the timespan invalid.
payload = {
    "query": query,
    "timespan": f"{start_time.isoformat()}/{end_time.isoformat()}"
}

# Set the request headers
headers = {
    "Content-Type": "application/json"
}

# Disable SSL certificate verification warnings
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

# Authenticate the request using the Azure credential
access_token = credential.get_token("https://api.loganalytics.io/.default").token
headers["Authorization"] = f"Bearer {access_token}"

# Send the POST request
response = requests.post(api_endpoint, json=payload, headers=headers, verify=False)

# Check the response status
if response.status_code == 200:
    data = response.json()
    tables = data.get('tables', [])
    if tables:
        table = tables[0]  # Assuming there is only one table returned
        columns = table.get('columns', [])
        rows = table.get('rows', [])
        if columns and rows:
            for row in rows:
                for i, column in enumerate(columns):
                    column_name = column['name']
                    column_type = column['type']
                    row_value = row[i]
                    print(f"Column name: {column_name}, Data type: {column_type}, Value: {row_value}")
        else:
            print("Empty table or no data in table")
    else:
        print("No tables found in the response")
else:
    print(f"Request failed with status code: {response.status_code}")
    print(f"Error message: {response.text}")

Krishna1994 · Jan 24, 2025
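For comparison, a minimal sketch of the same query going through the azure-monitor-query SDK, which acquires the token and formats the timespan itself; the workspace ID stays a placeholder, as in the post:

from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

# The SDK wraps the same api.loganalytics.io endpoint and handles
# authentication and timespan encoding internally.
client = LogsQueryClient(DefaultAzureCredential())

response = client.query_workspace(
    workspace_id="XXXXXXXXXXXXXXXXXXXXXXXX",  # placeholder
    query="KubePodInventory | take 5",
    timespan=timedelta(hours=6),  # the last 6 hours
)

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        # table.columns is a list of column names in azure-monitor-query 1.x
        for row in table.rows:
            print(dict(zip(table.columns, row)))

If this returns rows for the same window, the hand-built timespan string in the REST request is the likely culprit.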
Data archiving of delta table in Azure Databricks
Hi all, I am currently researching data archiving for Delta table data on the Azure platform, since my company has a data retention policy. I have studied the Databricks documentation on archival support (https://docs.databricks.com/en/optimizations/archive-delta.html), which says: "If you enable this setting without having lifecycle policies set for your cloud object storage, Databricks still ignores files based on this specified threshold, but no data is archived." So I am now looking at how to configure the lifecycle policy on the Azure storage account, per the Microsoft documentation (https://learn.microsoft.com/en-us/azure/storage/blobs/lifecycle-management-overview).

Say the Delta table data is stored in "test-container/sales", which holds many "part-xxxx.snappy.parquet" data files. Should I simply specify "tierToArchive", "daysAfterCreationGreaterThan: 1825", and "prefixMatch: ["test-container/sales"]"? However, I am worried about two things: will this archive mechanism interfere with normal Delta table operations, and what if a parquet file moved to the archive tier contains both data created before the 5-year cutoff and data created after it — is that possible? Could it end up moving data to the archive tier earlier than 5 years? I would highly appreciate it if someone could help me with these questions. Thanks in advance.
Brian_169 · Jan 05, 2025
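For reference, a sketch of the lifecycle rule described above as an azure-mgmt-storage payload; the subscription, resource group, and account names are hypothetical placeholders, while the prefix and the 1825-day threshold come straight from the question:

from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

subscription_id = "<subscription-id>"   # placeholder
resource_group = "<resource-group>"     # placeholder
storage_account = "<storage-account>"   # placeholder

client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

# Archive block blobs under test-container/sales once the file itself
# is older than 1825 days (5 years).
policy = {
    "policy": {
        "rules": [
            {
                "enabled": True,
                "name": "archive-sales-after-5-years",
                "type": "Lifecycle",
                "definition": {
                    "actions": {
                        "baseBlob": {
                            "tierToArchive": {"daysAfterCreationGreaterThan": 1825}
                        }
                    },
                    "filters": {
                        "blobTypes": ["blockBlob"],
                        "prefixMatch": ["test-container/sales"],
                    },
                },
            }
        ]
    }
}

client.management_policies.create_or_update(
    resource_group, storage_account, "default", policy
)

On the second worry: lifecycle rules act on blob age, not on the age of the records inside a blob, so a parquet file is archived 1825 days after the file was written, regardless of the event dates of the rows it contains. The Databricks page linked above pairs the storage rule with a matching table setting (delta.timeUntilArchived) precisely so Delta knows which files it must no longer read.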
Understanding Data Ingestion for Log Analytics and Sentinel Workspace
I'm trying to understand how data ingestion works for both Log Analytics and Microsoft Sentinel. Every time we notice a spike in data ingestion costs for Log Analytics, we see a similar increase in Sentinel costs. It looks as if data is being ingested into both workspaces, doubling the ingestion and driving up our costs. Is this expected behavior, or is there a way to optimize and avoid duplicate data ingestion between Log Analytics and Sentinel?
ram512 · Dec 20, 2024
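For what it's worth, a Sentinel-enabled workspace is billed on two meters for the same gigabytes by design — once for Log Analytics ingestion and once for Sentinel analysis — rather than ingesting the data twice. A sketch of a query that attributes billable volume by table, reusing the LogsQueryClient pattern from the first post (workspace ID is a placeholder):

from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Break billable volume down by table for the last 30 days; Quantity
# in the Usage table is reported in MB.
usage_query = """
Usage
| where TimeGenerated > ago(30d)
| where IsBillable == true
| summarize IngestedGB = sum(Quantity) / 1024 by DataType
| order by IngestedGB desc
"""

response = client.query_workspace(
    workspace_id="XXXXXXXXXXXXXXXXXXXXXXXX",  # placeholder
    query=usage_query,
    timespan=timedelta(days=30),
)

for table in response.tables:
    for row in table.rows:
        print(dict(zip(table.columns, row)))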
Finetuning Sentinel alert for user added to Azure Active Directory Privileged Groups
I am here to ask for help with finetuning an alert in Sentinel. The alert is called "User added to Azure Active Directory Privileged Groups", and it is triggered when a user is added to a privileged group in Azure Active Directory. The problem is that this alert also fires when IAM team members activate their PIM roles for day-to-day activities. Those activations are not security incidents, so we get a lot of false positives. I am trying to finetune the alert so that it only fires for actual security incidents, and I am looking for ways or ideas to proceed. If you have any suggestions, please let me know.
Monkey_D_Luffy · Dec 16, 2024
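One common direction is excluding PIM activations in the analytics rule's KQL. A hedged sketch in the Python-string style of the first post — the OperationName value and the sample account list are assumptions to verify against your own AuditLogs before deploying:

# Exclusion clause that could be appended to the rule's KQL so that
# routine PIM role activations do not raise incidents.
pim_exclusion = """
// Azure AD audit logs typically record PIM activations under this
// OperationName; confirm against your AuditLogs data first.
| where OperationName != "Add member to role completed (PIM activation)"
// Optionally also exclude the IAM team's accounts (hypothetical UPNs).
| where tostring(InitiatedBy.user.userPrincipalName) !in (
    "iam.admin1@contoso.com",
    "iam.admin2@contoso.com"
)
"""

Excluding named accounts also suppresses real detections for those users, so scoping the exclusion to specific roles, or driving it from a watchlist, is usually safer.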
How do you prepare for Azure certifications or other exams while working a full-time job?
I am 29 (M) and have been working in cloud for four years now, mostly on Azure. I think it is time for me to look for a job at another organization, as my salary has stayed flat for a long time. I feel that getting certified will open up more opportunities and improve the odds of my resume being shortlisted. Please share any hacks or tips you have.
Nithin_khanna · Nov 18, 2024
Azure Log Analytics workspace
Hi All, I have a requirement to keep a Log Analytics workspace in standalone mode: I want to remove, or break, the communication between Azure resources and the workspace so that the previously collected logs are protected, and I want to retain those logs for one year. Is there any way to achieve this? Please suggest. Note: this workspace is integrated with Sentinel as well.
veerakumare22 · Nov 07, 2024
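A sketch of the retention half of this requirement, assuming the azure-mgmt-loganalytics package (resource names are placeholders): retention can be raised to one year on the workspace itself, while "standalone mode" is a separate step of removing the diagnostic settings and data collection rule associations that still send data to it.

from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient

subscription_id = "<subscription-id>"   # placeholder
resource_group = "<resource-group>"     # placeholder
workspace_name = "<workspace-name>"     # placeholder

client = LogAnalyticsManagementClient(DefaultAzureCredential(), subscription_id)

# Read the workspace, set retention to one year, and write it back.
workspace = client.workspaces.get(resource_group, workspace_name)
workspace.retention_in_days = 365
poller = client.workspaces.begin_create_or_update(
    resource_group, workspace_name, workspace
)
print(poller.result().retention_in_days)

Because the workspace is Sentinel-enabled, Sentinel's own data connectors would also have to be disconnected for ingestion to actually stop.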
Tags
- AMA (18 topics)
- azure (6 topics)
- Log Analytics (6 topics)
- azure monitor (3 topics)
- Synapse (3 topics)
- Azure Log Analytics (2 topics)
- Azure Synapse Analytics (2 topics)
- Application Insights (2 topics)
- Databricks (2 topics)
- Log Analytics Workspace (2 topics)