Latest Discussions
Understanding Data Ingestion for Log Analytics and Sentinel Workspace
I'm trying to understand how data ingestion works for both Log Analytics and Microsoft Sentinel. Every time we notice a spike in data ingestion costs for Log Analytics, we see a similar increase in Sentinel costs as well. It seems like data is being ingested into both workspaces, potentially doubling the ingestion and driving up our costs. Can someone explain if this is expected behavior, or if there's a way to optimize and avoid duplicate data ingestion between Log Analytics and Sentinel?
Deleted · Oct 11, 2024 · 118 Views · 0 likes · 0 Comments
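One way to investigate is to break billable ingestion down by solution and table using the workspace's Usage table. A minimal sketch, assuming the azure-monitor-query package and a placeholder workspace GUID; in a Sentinel-enabled workspace, Sentinel-billed data generally shows up under the "SecurityInsights" solution, which is worth verifying in your own workspace:

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

# Billable ingestion by solution and table over the last 30 days.
ingestion_query = """
Usage
| where IsBillable == true
| summarize BillableGB = sum(Quantity) / 1024.0 by Solution, DataType
| order by BillableGB desc
"""

# "<workspace-id>" is a placeholder for the Log Analytics workspace GUID.
response = client.query_workspace("<workspace-id>", ingestion_query,
                                  timespan=timedelta(days=30))
for table in response.tables:
    for row in table.rows:
        print(row)
```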
Use Azure Screenshot for my thesis
Dear Microsoft Team, in my bachelor's thesis I plan to implement a BI process and would like to use screenshots from the development environments of Microsoft Fabric, Azure Data Factory, Azure Data Lake Storage Gen2, Azure Synapse Analytics, and Power BI. I would therefore like to ask whether I am allowed to use these kinds of screenshots in my thesis free of charge. Best regards
marius1106 · Jul 15, 2024 · Copper Contributor · 266 Views · 0 likes · 0 Comments
Unable to retrieve query data using Log Analytics API
I have been trying to access Azure KQL data via the Log Analytics REST API. The connection is successful, returning a 200 response, but I am only getting the table headers and no row data. Does anyone know how to resolve this? Code snippet:

```python
import os
from datetime import datetime, timedelta, timezone

import certifi
import requests
import urllib3
from azure.identity import DefaultAzureCredential

os.environ["REQUESTS_CA_BUNDLE"] = certifi.where()

credential = DefaultAzureCredential()

# Set the start and end time for the query
end_time = datetime.now(timezone.utc)
start_time = end_time - timedelta(hours=6)

# Set the query string
query = '''
KubePodInventory
| take 5
'''

# Set the workspace ID
workspace_id = "XXXXXXXXXXXXXXXXXXXXXXXX"

# Set the API endpoint
api_endpoint = f"https://api.loganalytics.io/v1/workspaces/{workspace_id}/query"

# Set the request payload. Note: datetime.now(timezone.utc).isoformat()
# already ends in "+00:00", so no trailing "Z" should be appended to the
# timestamps; an invalid timespan can come back as headers with no rows.
payload = {
    "query": query,
    "timespan": f"{start_time.isoformat()}/{end_time.isoformat()}"
}

# Set the request headers
headers = {
    "Content-Type": "application/json"
}

# Silence the warning that requests emits because verify=False below
# disables SSL certificate verification
urllib3.disable_warnings(urllib3.exceptions.InsecureRequestWarning)

# Authenticate the request using the Azure credential
access_token = credential.get_token("https://api.loganalytics.io/.default").token
headers["Authorization"] = f"Bearer {access_token}"

# Send the POST request
response = requests.post(api_endpoint, json=payload, headers=headers, verify=False)

# Check the response status
if response.status_code == 200:
    data = response.json()
    tables = data.get('tables', [])
    if tables:
        table = tables[0]  # Assuming there is only one table returned
        columns = table.get('columns', [])
        rows = table.get('rows', [])
        if columns and rows:
            for row in rows:
                for i, column in enumerate(columns):
                    column_name = column['name']
                    column_type = column['type']
                    row_value = row[i]
                    print(f"Column name: {column_name}, Data type: {column_type}, Value: {row_value}")
        else:
            print("Empty table or no data in table")
    else:
        print("No tables found in the response")
else:
    print(f"Request failed with status code: {response.status_code}")
    print(f"Error message: {response.text}")
```

Krishna1994 · Jun 20, 2024 · Copper Contributor · 332 Views · 0 likes · 0 Comments
Customizing Stakeholders' settings so that they can only see the shared dashboard in Azure DevOps!
Hi everybody, how can I customize the access settings for Stakeholders so that they can NOT see the backlog/board details, especially the "Assigned To" field, and can follow the project only through the dashboards and analytics views, or through the board/backlog without details?
rebornlately · Jun 15, 2024 · Copper Contributor · 178 Views · 0 likes · 0 Comments
ADF Pipeline Data flow issue
I've created a data flow where the source is ADLS and the sink is an ADLS Delta file. I attempted to run the data flow, but I encountered the following issue:

Job failed due to reason: com.microsoft.dataflow.Issues: DF-EXPR-200 - Function 'outputs' applies to only sink transformation with right saveOrder - [418 464 564 737],[207 319 418 464 564 737],EXE-0001, surrogateKey1 derive( ) ~> derivedColumn1,Dataflow cannot be analyzed as a graph,[62 207 319 418 464 564 737] DF-EXPR-200 - Function 'outputs' applies to only sink transformation with right saveOrder - [418 464 564 737],EXE-0001, surrogateKey1 derive( ) ~> derivedColumn1,Dataflow cannot be analyzed as a graph,[209 319 418 464 564 737]

Sheeraz27 · Apr 18, 2024 · Copper Contributor · 469 Views · 0 likes · 0 Comments
How to trigger an Azure Synapse pipeline after a file is dropped into an Azure file share
I am currently looking for ideas on how to trigger an Azure Synapse pipeline after a file is dropped into an Azure file share; Azure Synapse pipelines don't natively support this at the moment. I feel that Azure Functions might be a suitable candidate for implementing it. Microsoft does offer a custom events trigger extension capability; however, so far I have found very little evidence demonstrating how to leverage that capability to trigger my Synapse pipeline. Any assistance with approaches to solving this would be greatly appreciated. Thanks.
mikejminto · Apr 05, 2024 · Copper Contributor · 264 Views · 0 likes · 0 Comments
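One possible pattern is a timer-triggered Azure Function that polls the file share and starts the pipeline through the Synapse REST API. The sketch below is a starting point, not a full solution: the workspace endpoint, share name, pipeline name, and parameter are all hypothetical placeholders, and a real implementation would also need to track or move files it has already processed.

```python
# A minimal sketch, assuming the azure-identity, azure-storage-file-share,
# and azure-functions packages; all <angle-bracket> names are placeholders.
import requests
import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.storage.fileshare import ShareClient

SYNAPSE_ENDPOINT = "https://<workspace-name>.dev.azuresynapse.net"
PIPELINE_NAME = "<pipeline-name>"


def main(timer: func.TimerRequest) -> None:
    share = ShareClient.from_connection_string(
        "<storage-connection-string>", share_name="<share-name>")
    token = DefaultAzureCredential().get_token(
        "https://dev.azuresynapse.net/.default").token

    for item in share.list_directories_and_files():
        if item["is_directory"]:
            continue
        # Start a pipeline run for each file, passing the file name as a
        # pipeline parameter. Deduplication (e.g. moving processed files
        # to an archive directory) is omitted for brevity.
        resp = requests.post(
            f"{SYNAPSE_ENDPOINT}/pipelines/{PIPELINE_NAME}/createRun",
            params={"api-version": "2020-12-01"},
            headers={"Authorization": f"Bearer {token}"},
            json={"fileName": item["name"]},
        )
        resp.raise_for_status()
```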
Azure Synapse Analytics: understand how to find cost per pipeline
We use Synapse pipelines to integrate data from multiple sources, as per project requirements. Since we deal with multiple customers, we need a price breakdown for each pipeline execution. Is there any way to achieve this?
kavitha_eswarappa · Mar 13, 2024 · Copper Contributor · 433 Views · 0 likes · 0 Comments
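Azure Cost Management does not appear to break charges down per pipeline run, but if the workspace's diagnostic settings send pipeline-run logs to Log Analytics, run durations can be summarized per pipeline as a rough apportioning key. A hedged sketch, with table and column names assumed from the Synapse diagnostic-log schema (verify them in your workspace); the query can be executed with a LogsQueryClient as in the ingestion sketch earlier on this page:

```python
# Hedged sketch: approximates relative pipeline usage from run durations;
# this is not an official per-pipeline bill.
per_pipeline_usage_query = """
SynapseIntegrationPipelineRuns
| where Status == "Succeeded"
| extend DurationMinutes = datetime_diff("minute", End, Start)
| summarize TotalMinutes = sum(DurationMinutes), Runs = count() by PipelineName
| order by TotalMinutes desc
"""
# Multiplying each pipeline's share of TotalMinutes by the invoice total
# gives a defensible, if approximate, per-customer split.
```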
AI-900: Microsoft Azure AI Fundamentals Study Guide
With recent announcements about ChatGPT and OpenAI being integrated into services such as Bing and Microsoft Edge, and app development assistant solutions like Copilot from GitHub, Artificial Intelligence (AI) and Machine Learning (ML) are rapidly transforming the software development landscape, making them essential skills for students who want to be at the forefront of this change. The "AI-900: Microsoft Azure AI Fundamentals" exam provides candidates with an excellent opportunity to demonstrate their understanding of AI and ML concepts, as well as their familiarity with associated Microsoft Azure services. Students with a basic understanding of cloud computing and client-server applications will have an advantage, although experience in data science and software engineering is not essential. This study guide will help students understand what to expect on the exam, the topics covered, and extra resources.
Study Resources
To help students prepare for the AI-900 exam, Microsoft provides a number of resources, including:
- Microsoft Learn self-paced curriculum: Microsoft Azure AI Fundamentals: Get started with artificial intelligence. Artificial intelligence (AI) enables incredible new ideas and experiences, and Microsoft Azure provides simple services to get you started.
- GitHub Study Materials 2024
lewisharvey01102 · Mar 11, 2024 · Copper Contributor · 1.6K Views · 0 likes · 0 Comments
Azure application insights workspace based migration related questions
Hello, we have successfully migrated our classic Application Insights instances to workspace-based resources. After migration, I see that new data is stored in the linked Log Analytics workspace (which is expected), but new data is also still being stored in the classic Application Insights resource. Per the documentation, after migration old data should remain in the classic Application Insights resource and new data should be stored in the linked Log Analytics workspace: Migrate an Application Insights classic resource to a workspace-based resource - Azure Monitor | Microsoft Learn. Questions:
1. Why is new data still being stored in the old classic App Insights resource after migration? This is not mentioned in the documentation. Assuming it is stored to support backward compatibility, for how many days is this supported after migration?
2. We have existing Power BI reports that pull data from classic Application Insights. After migration, suppose I want some data from the old App Insights and some from the new one; in that case, I have to write two separate queries and combine the results. Is my understanding correct?
Kiran_Holi · Feb 21, 2024 · Copper Contributor · 402 Views · 0 likes · 0 Comments
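On the second question: if the two sources do need to be stitched together in a single query, one option is to query both sides and union the results. A minimal sketch, assuming a hypothetical classic resource name and migration cutoff date (the cross-resource app() function reads the classic side; the workspace-based AppRequests table holds the new side):

```python
# A minimal sketch; adjust the resource name, cutoff date, and tables to
# whatever the Power BI report actually uses.
combined_requests_query = """
union withsource = SourceTable
    (app('<classic-app-insights-name>').requests
     | where timestamp < datetime(2024-01-01)),
    (AppRequests
     | where TimeGenerated >= datetime(2024-01-01)
     | extend timestamp = TimeGenerated)
| order by timestamp desc
"""
```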
Logic Flow name in Azure Log Analytics

```
dependencies
| where type == "Ajax"
| where success == "False"
| where name has "logicflows"
| project timestamp, name, resultCode, duration, type, target, data, operation_Name, appName
| order by timestamp desc
```

This KQL query in Azure Application Insights / Azure Log Analytics is used to get errors for logic flows. It returns the data, but I cannot see the logic flow name or ID anywhere. Is there any way to fetch the logic flow ID? The Azure App Insights resource is registered for a Power App, where we are using Power Automate flows to call APIs. We need the flow's name in analytics. I looked through the database and there is no field for the logic flow's name or ID, although under Users > Sessions the name does show up in requestHeaders.
oseverma9 · Feb 15, 2024 · Copper Contributor · 311 Views · 0 likes · 0 Comments
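If the flow's GUID is embedded in the dependency URL (which the `name has "logicflows"` filter suggests), one option is to extract it inside the query. A hedged sketch: the regex assumes the ID follows a "logicflows/" path segment in the name column, and mapping that GUID to a display name would still require joining against another source, such as an exported flow inventory.

```python
# Hedged sketch: extracts a GUID that follows "logicflows/" in the
# dependency name; verify against a real record before relying on it.
flow_errors_query = """
dependencies
| where type == "Ajax" and success == "False" and name has "logicflows"
| extend flowId = extract("logicflows/([0-9a-fA-F-]{36})", 1, name)
| project timestamp, flowId, name, resultCode, duration, target, operation_Name, appName
| order by timestamp desc
"""
```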
Tags
- AMA (18 topics)
- azure (6 topics)
- Log Analytics (6 topics)
- Synapse (3 topics)
- azure monitor (3 topics)
- Azure Synapse Analytics (2 topics)
- Azure Log Analytics (2 topics)
- Log Analytics Workspace (2 topics)
- Application Insights (2 topics)
- Contact (2 topics)