Latest Discussions
Looking for an Azure solution for time series validation
Hi folks, I am planning to replace one of my old on-prem time-series process validation tools with an Azure-based solution. The required solution should be able to validate time series coming from various sources such as databases or CSV files (ultimately the data comes from different plants into those DBs or CSV files), and it should be able to perform certain calculations (based on defined mathematical functions) on those time series. After that, it should send the results to a centralized data warehouse where reporting can be performed on the gathered data with tools like Power BI or Grafana. I found one similar sort of product, Azure Time Series Insights, but it is going to be out of support in 2025.

Prashant_2405 · May 07, 2026 · Copper Contributor · 330 Views · 0 likes · 1 Comment

Integrate Jenkins with Azure Databricks & GitHub into VSCode
Hello Team, greetings of the day! Hope you have a great day ahead! We have installed the Azure Databricks, GitHub, and Jenkins extensions in VSCode. Now the configuration part comes into the picture: we have configured Azure Databricks and logged in to GitHub in VSCode. Now it is Jenkins' turn. We want to know how we can configure Jenkins with GitHub. All notebooks from Azure Databricks will be version-controlled in GitHub, and to do that we want to use Jenkins. There is no documentation on how to do so. Can you guide us on how to do it? Reference link: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/ci-cd-jenkins Thank you in advance for any support or suggestions :) Looking forward to your valuable input. Regards, Niral Dave.

N-SPEC · May 04, 2026 · Copper Contributor · 480 Views · 0 likes · 1 Comment

Send custom data to Log Analytics in Azure Functions
When I send custom business data to Azure Log Analytics from an Azure Function (C# code), I want to use an OperationalInsightsDataCollector, but I can't find the class in the Microsoft.Azure.OperationalInsights package from NuGet. Does anyone know whether it is obsolete?

eugene1710 · May 03, 2026 · Copper Contributor · 490 Views · 0 likes · 1 Comment

Hi everyone!
Hi everyone! 👋 I'm new to this community and currently learning Azure Analytics. I'm really excited to be here and connect with people who have experience in this field. I believe the discussions and knowledge shared by members here are very valuable, and I'm looking forward to learning from all of you. If you have any advice, resources, or tips for someone starting with Azure Analytics, I'd really appreciate it. Happy to be part of this community! 😊

Thinuka99 · Mar 13, 2026 · Copper Contributor · 56 Views · 2 likes · 0 Comments

Log Analytics query the logs that are not in IP range
Hi All, I'm struggling with writing a query that will find sign-ins in logs that are not in our IP ranges. We have a Log Analytics workspace which is collecting sign-in logs, and we want to trigger an alert when an account signs in from an IP that is not in one of our IP ranges. We have a lot of known network ranges, and we have to use an external repository like GitHub with a txt file of those ranges. I've tried to use the function "ipv4_is_match()", but from my understanding it compares one value against one pattern; it does not check each entry of a list. That being said, I've tried something like this, but it doesn't work. Can anyone experienced here help with writing such a query, or even answer whether it's possible?

let ipList = externaldata (IPAddress:string) [ @"https://raw.githubusercontent.com/NameOfRepository/IPv4Range.txt" ];
SigninLogs
| where UserPrincipalName contains "email address removed for privacy reasons"
| where IsInteractive == true
| where not (ipv4_is_match(IPAddress, ipList))

Denys_bezshkuryi · Mar 06, 2026 · Copper Contributor · 704 Views · 0 likes · 1 Comment

API Query Results Different from Azure Portal
Hello Team, I'm running a query through an API I have connected to Excel. The results, for example for a specific user on a specific day, for a conditional access policy that blocks legacy authentication, are more numerous than the results I'm getting from the Azure portal. Which results should I trust?

ankilo2011 · Feb 23, 2026 · Copper Contributor · 454 Views · 0 likes · 1 Comment

Copy data to Oracle destination
We are trying to copy data to an Oracle DWH, and we are facing an issue when trying different setups of the "Write Batch Size" parameter. The copy activity works when we set "Write Batch Size" to 1, but of course performance is bad: it writes about 10,000 rows in 5 minutes. To speed up the copy we are trying to set the parameter to the default value of 10,000, but in this case the copy fails with the following error:

Failure happened on 'Sink' side. ErrorCode=UserErrorOdbcOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ERROR [HY000] [Microsoft][ODBC Oracle Wire Protocol driver][Oracle]ORA-00604: error occurred at recursive SQL level 1 ORA-01031: insufficient privileges Error in parameter 1.,Source=Microsoft.DataTransfer.ClientLibrary.Odbc.OdbcConnector,''Type=Microsoft.DataTransfer.ClientLibrary.Odbc.Exceptions.OdbcException,Message=ERROR [HY000] [Microsoft][ODBC Oracle Wire Protocol driver][Oracle]ORA-00604: error occurred at recursive SQL level 1 ORA-01031: insufficient privileges Error in parameter 1.,Source=msora28.dll,'

So far we have INSERT privileges on the Oracle schema (in fact writing works with the parameter set to 1 and using direct SQL), but it looks like something different is used with the default value of Write Batch Size. We don't want to focus on the error message; it is obvious that it was raised on the Oracle side. But we need more information in order to understand what's causing the issue. It looks like ADF is using two different ways to copy data depending on the value of the parameter. Any help would be greatly appreciated. Thanks in advance, Alessandro

alessandro_horsa · Feb 23, 2026 · Copper Contributor · 833 Views · 0 likes · 2 Comments

Azure application insights workspace based migration related questions
Hello, we have successfully migrated our classic Application Insights instances to workspace-based. After migration, I see that new post-migration data is getting stored in the linked Log Analytics workspace (which is expected), but new data is also still getting stored in the classic Application Insights. As per your documentation, after migration, old data should remain in classic Application Insights and new data should be stored in the linked Log Analytics workspace: https://learn.microsoft.com/en-us/azure/azure-monitor/app/convert-classic-resource

Questions:
- Why is new data still getting stored in the old classic App Insights after migration? This is not mentioned in https://learn.microsoft.com/en-us/azure/azure-monitor/app/convert-classic-resource.
- Let us assume it is getting stored to support backward compatibility. For how many days is this supported after migration?
- We have existing Power BI reports which pull data from classic Application Insights. After migration, suppose I want some data from the old App Insights and some from the new one; in that case, I have to write two separate queries and combine the results. Is my understanding correct?

Kiran_Holi · Feb 20, 2026 · Copper Contributor · 592 Views · 0 likes · 2 Comments

Azure Synapse Analytics Pricing Meters - VCore
Hi, I am currently trying to understand the billing meters for Azure Synapse Analytics, and here are my questions:
- Is there any API or website where we can deep-dive into each meter's information?
- Can anyone explain to me what the VCore meter is for Azure Synapse Analytics?
Regards, Marc-André

Rgimbald2015 · Feb 20, 2026 · Copper Contributor · 508 Views · 0 likes · 1 Comment
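On the "Send custom data to Log Analytics" question above: the class name suggests the legacy HTTP Data Collector API, whose authentication is an HMAC-SHA256 "SharedKey" signature that can be built in any language, without a helper class. A minimal Python sketch of just the signature step, assuming that API; the workspace ID and key below are placeholders, not real credentials, and the newer Azure Monitor Logs Ingestion API is the long-term replacement:

```python
import base64
import hashlib
import hmac
from datetime import datetime, timezone

def build_signature(workspace_id: str, shared_key: str, body: bytes, date_rfc1123: str) -> str:
    """Build the 'SharedKey' Authorization header value for the Data Collector API."""
    # The string-to-sign covers the verb, body length, content type, date, and path.
    string_to_sign = (
        f"POST\n{len(body)}\napplication/json\n"
        f"x-ms-date:{date_rfc1123}\n/api/logs"
    )
    decoded_key = base64.b64decode(shared_key)  # the workspace key is base64-encoded
    digest = hmac.new(decoded_key, string_to_sign.encode("utf-8"), hashlib.sha256).digest()
    return f"SharedKey {workspace_id}:{base64.b64encode(digest).decode()}"

# Placeholder credentials and payload (illustrative only):
body = b'[{"Temperature": 21.5, "Plant": "A"}]'
date = datetime.now(timezone.utc).strftime("%a, %d %b %Y %H:%M:%S GMT")
auth = build_signature("00000000-0000-0000-0000-000000000000",
                       base64.b64encode(b"placeholder-key").decode(), body, date)
print(auth)
```

The resulting header goes on a POST to the workspace's `/api/logs` endpoint along with `x-ms-date` and a `Log-Type` header naming the custom table.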
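On the IP-range question above: `ipv4_is_match()` compares one address against one pattern, and checking one address against a whole list of ranges is what KQL's `ipv4_is_in_any_range()` is for. The underlying logic can be sketched with Python's stdlib `ipaddress` module; the ranges below are made-up stand-ins for the GitHub-hosted txt file:

```python
import ipaddress

# Hypothetical known ranges (stand-ins for the externally hosted IPv4Range.txt).
known_ranges = [ipaddress.ip_network(r) for r in ("10.0.0.0/8", "203.0.113.0/24")]

def in_known_ranges(ip: str) -> bool:
    """Return True if ip falls inside any of the known network ranges."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in known_ranges)

# A sign-in would trigger the alert when this returns False:
print(in_known_ranges("10.1.2.3"))  # True  (inside 10.0.0.0/8)
print(in_known_ranges("8.8.8.8"))   # False (outside all ranges)
```

The same any-range-matches test, applied per log row and then negated, is the shape the KQL query needs.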
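On the Synapse pricing-meters question above: one programmatic option is the public Azure Retail Prices REST API, which needs no authentication and can be filtered down to a single service's meters. A small sketch of building such a request; the filter value is just an example, and the actual fetch is left commented out since it needs network access:

```python
from urllib.parse import urlencode

# Public, unauthenticated price-sheet endpoint (Azure Retail Prices API).
base = "https://prices.azure.com/api/retail/prices"
params = {"$filter": "serviceName eq 'Azure Synapse Analytics'"}
url = f"{base}?{urlencode(params)}"
print(url)

# To actually fetch the meters (requires network access):
# import json, urllib.request
# items = json.load(urllib.request.urlopen(url))["Items"]
# for it in items[:5]:
#     print(it["meterName"], it["unitOfMeasure"], it["retailPrice"])
```

Each returned item carries the meter name, unit of measure, and retail price per region, which is the per-meter detail the question asks about.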
Tags
- AMA (18 Topics)
- Log Analytics (6 Topics)
- azure (6 Topics)
- Synapse (3 Topics)
- azure monitor (3 Topics)
- Log Analytics Workspace (3 Topics)
- Stream Analytics (2 Topics)
- azure databricks (2 Topics)
- Azure Synapse Analytics (2 Topics)
- Azure Log Analytics (2 Topics)