Latest Discussions
Should I ingest AADNonInteractiveUserSignInLogs from Entra ID to a LAW
As the title says, I am interested in expert opinions on whether I should include the AADNonInteractiveUserSignInLogs table from Entra ID in a Log Analytics workspace (LAW), as this table dwarfs SignInLogs in data volume (by a factor of roughly 8x) and therefore creates higher costs. Secondly, I am curious whether there are ways to reduce the number of non-interactive sign-in logs that are generated in the first place.
CSI · Mar 20, 2025 · Copper Contributor · 52 Views · 0 Likes · 0 Comments

Use Azure Screenshot for my thesis
Dear Microsoft Team, in my bachelor's thesis I plan to implement a BI process and would like to use screenshots from the development environments of Microsoft Fabric, Azure Data Factory, Azure Data Lake Storage Gen2, Azure Synapse Analytics, and Power BI. May I ask whether I am allowed to use these kinds of screenshots in my thesis free of charge? Best regards
marius1106 · Jul 15, 2024 · Copper Contributor · 295 Views · 0 Likes · 0 Comments

Customizing Stakeholders' settings so that they can only see the shared dashboard in Azure DevOps
Hi everybody, how can I customize the access settings for Stakeholders so that they can NOT see backlog/board details, especially the "Assigned To" field, and can follow the project only through the dashboards and analytics views, or through a backlog/board stripped of details?
rebornlately · Jun 15, 2024 · Copper Contributor · 193 Views · 0 Likes · 0 Comments

Azure Synapse Analytics: understand how to find cost per pipeline
We use Synapse pipelines to integrate data from multiple sources as per project requirements. Since we deal with multiple customers, we need a price breakdown for each pipeline execution. Is there any way to achieve this?
kavitha_eswarappa · Mar 13, 2024 · Copper Contributor · 465 Views · 0 Likes · 0 Comments

AI-900: Microsoft Azure AI Fundamentals Study Guide
With recent announcements about ChatGPT and OpenAI being integrated into services such as Bing and Microsoft Edge, and app-development assistants like GitHub Copilot, Artificial Intelligence (AI) and Machine Learning (ML) are rapidly transforming the software development landscape, making them essential skills for students who want to be at the forefront of this change. The "AI-900: Microsoft Azure AI Fundamentals" exam gives candidates an excellent opportunity to demonstrate their understanding of AI and ML concepts, as well as their familiarity with the associated Microsoft Azure services. Students with a basic understanding of cloud computing and client-server applications will have an advantage, although experience in data science and software engineering is not essential. This study guide will help students understand what to expect on the exam, the topics covered, and extra resources.
Study Resources: To help students prepare for the AI-900 exam, Microsoft provides a number of resources, including the Microsoft Learn self-paced curriculum "Microsoft Azure AI Fundamentals: Get started with artificial intelligence" (artificial intelligence enables incredible new ideas and experiences, and Microsoft Azure provides simple services to get you started) and GitHub study materials (2024).
lewisharvey01102 · Mar 11, 2024 · Copper Contributor · 1.8K Views · 0 Likes · 0 Comments

Azure Application Insights workspace-based migration related questions
Hello, we have successfully migrated our classic Application Insights instances to workspace-based resources. After migration, I see that new data is stored in the linked Log Analytics workspace (which is expected), but new data is also still being stored in the classic Application Insights resource. Per your documentation, after migration old data should remain in classic Application Insights while new data is stored in the linked Log Analytics workspace: Migrate an Application Insights classic resource to a workspace-based resource - Azure Monitor | Microsoft Learn.
Questions:
- Why is new data still being stored in the old classic Application Insights resource after migration? This is not mentioned in the documentation. Let us assume it is stored to support backward compatibility — for how many days is this supported after migration?
- We have existing Power BI reports that pull data from classic Application Insights. After migration, if I want some data from the old Application Insights resource and some from the new one, I have to write two separate queries and combine the results. Is my understanding correct?
Kiran_Holi · Feb 21, 2024 · Copper Contributor · 443 Views · 0 Likes · 0 Comments

Logic Flow name in Azure Log Analytics
```kusto
dependencies
| where type == "Ajax"
| where success == "False"
| where name has "logicflows"
| project timestamp, name, resultCode, duration, type, target, data, operation_Name, appName
| order by timestamp desc
```
This KQL query in Azure Application Insights > Azure Log Analytics is used to get errors for logic flows. It returns the data, but I cannot see the logic flow name or ID anywhere. Is there any way to fetch the logic flow ID? The Application Insights resource is registered for a Power App, where we are using Power Automate flows to call APIs. We need the flow's name in analytics. I looked through the schema; there is no field for the logic flow's name or ID, though when viewed under Users > Sessions, the name does appear in requestHeaders.
oseverma9 · Feb 15, 2024 · Copper Contributor · 332 Views · 0 Likes · 0 Comments

Identify classic Application Insights for migration to workspaces
Hello everyone, as many of you may already be aware, the classic version of Application Insights will be deprecated on February 29, 2024. To assist with this transition, I've written a PowerShell script that identifies classic Application Insights resources on a per-subscription basis. I believe this script could be beneficial to others.
```powershell
# Connect-AzAccount
Set-AzContext -Subscription "SUBSCRIPTIONNAME" | Out-Null
$AppInsights = Get-AzApplicationInsights
$ClassicAppInsights = @()
ForEach ($AppInsight in $AppInsights) {
    # This try/catch is needed, since .IngestionMode sometimes references nothing:
    # "Object reference not set to an instance of an object."
    try {
        if ($null -eq $AppInsight.IngestionMode -or $AppInsight.IngestionMode -eq "" -or $AppInsight.IngestionMode -ne "LogAnalytics") {
            $ClassicAppInsights += $AppInsight
        }
    }
    catch {
        $ClassicAppInsights += $AppInsight
    }
}
$ClassicAppInsights | Format-Table
```
Kudos to Azure PowerShell - Find Classic Application Insights | IT Should Just Work (isjw.uk) for providing the initial inspiration for this script. Although their approach didn't list all the affected Application Insights resources in my case, it served as an excellent starting point for my work.
ved-leachim · Jan 30, 2024 · Copper Contributor · 701 Views · 0 Likes · 0 Comments

F&O Dataverse Limitation?
Hello Azure Community, a quick question: if a customer has significant data volumes and feels they may consume large amounts of data with F&O Dataverse, will they need to purchase additional Dataverse database capacity to use Synapse? I reviewed the following Microsoft Learn docs and did not find information on limitations: Azure Synapse Link - Power Apps | Microsoft Learn; Create an Azure Synapse Link for Dataverse with your Azure Synapse Workspace - Power Apps | Microsoft Learn; Frequently asked questions about exporting Microsoft Dataverse table data to Azure Synapse Analytics and Azure Data Lake - Power Apps | Microsoft Learn. Thank you in advance for your assistance. LC
LicensingConcierge1 · Jan 11, 2024 · Former Employee · 446 Views · 0 Likes · 0 Comments

Kusto Query to Show VM NIC Bandwidths?
A very basic traditional metric for VMs outside of Azure is the bandwidth, in bits per second, of each of the VM's NICs. I'm trying to replicate this in Azure (we have an NVA where we need to know which NICs have what kind of bandwidth utilization). The metrics that the NIC's diagnostic settings write into a Log Analytics workspace look like this (there are other columns too, but these are the relevant ones):

| TimeGenerated | Resource | MetricName | Average | TimeGrain |
| --- | --- | --- | --- | --- |
| <timestamp> | NVA NIC1 | BytesSentRate | 123 | PT1M |
| <timestamp> | NVA NIC1 | BytesReceivedRate | 123 | PT1M |
| <timestamp> | NVA NIC2 | BytesSentRate | 123 | PT1M |
| <timestamp> | NVA NIC2 | BytesReceivedRate | 123 | PT1M |

I'm trying to wrap my head around a Kusto query that would let me build a single time series from all four of those metrics. Currently the best I've found is to use "let" to create four data sets, filter each for the resource and specific metric name, project the generic metric names (like "BytesSentRate") to names like "nic1BytesSent" and "nic2BytesReceived", join all four, and then summarize into a time series that can be rendered. This seems like an absurdly complicated and resource-intensive way of doing it, and one that's a pain to integrate into workbooks and the like. Is there a better way to get a graph of the network traffic in bps from a NIC into a Log Analytics workbook or Azure dashboard? The metric documentation: https://learn.microsoft.com/en-us/azure/azure-monitor/reference/supported-metrics/microsoft-network-networkinterfaces-metrics
Toivo · Jan 11, 2024 · Copper Contributor · 399 Views · 0 Likes · 0 Comments
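The four-way let/join described in the question above can usually be collapsed into a single query by composing the series name on the fly. This is a sketch, not an authoritative answer: it assumes the rows shown land in the standard AzureMetrics table and that the Resource and MetricName values are exactly as in the example; adjust the table and column names to match your workspace schema.

```kusto
// Sketch: one time series per NIC/direction pair, converted from bytes/s to bits/s.
// Assumes AzureMetrics as the destination table and the Resource/MetricName
// values from the example above — verify both against your workspace.
AzureMetrics
| where Resource in ("NVA NIC1", "NVA NIC2")
| where MetricName in ("BytesSentRate", "BytesReceivedRate")
| extend SeriesName = strcat(Resource, " ", MetricName), BitsPerSecond = Average * 8
| summarize avg(BitsPerSecond) by bin(TimeGenerated, 5m), SeriesName
| render timechart
```

Because the series key is built with strcat, summarize produces one series per NIC/direction combination in a single pass, and render timechart plots them together — no let blocks or joins needed, which also makes the query easier to drop into a workbook.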
Tags
- AMA (18 topics)
- Log Analytics (6 topics)
- azure (6 topics)
- Synapse (3 topics)
- azure monitor (3 topics)
- Log Analytics Workspace (3 topics)
- Stream Analytics (2 topics)
- azure databricks (2 topics)
- Azure Synapse Analytics (2 topics)
- Azure Log Analytics (2 topics)