Latest Discussions
Log Analytics: query for logs that are not in an IP range
Hi all, I'm struggling to write a query that finds sign-ins from IPs that are not in our IP ranges. We have a Log Analytics workspace collecting sign-in logs, and we want to trigger an alert when an account signs in from an IP outside one of our ranges. We have a lot of known network ranges, so we have to pull them from an external repository, such as a txt file on GitHub. I've tried the function ipv4_is_match(), but from my understanding it compares one value against another, not one value against a list. That said, I've tried something like the following, but it doesn't work. Can anyone experienced here help with writing such a query, or at least tell me whether it's possible?

let ipList = externaldata (IPAddress:string) [ @"https://raw.githubusercontent.com/NameOfRepository/IPv4Range.txt" ];
SigninLogs
| where UserPrincipalName contains "email address removed for privacy reasons"
| where IsInteractive == true
| where not (ipv4_is_match(IPAddress, ipList))

— Denys_bezshkuryi, Mar 06, 2026

API Query Results Different from Azure Portal
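One note on the approach: in KQL, ipv4_is_match() compares two individual values (optionally with a prefix mask), so passing a whole table as the second argument will not work; the ipv4_lookup plugin is the usual way to match a column against a table of CIDR ranges. The per-IP logic the query needs can be sketched with Python's standard ipaddress module — the CIDR ranges below are made-up placeholders for the GitHub-hosted list, not anything from the original post:

```python
import ipaddress

# Hypothetical CIDR ranges standing in for the GitHub-hosted txt file.
KNOWN_RANGES = [ipaddress.ip_network(c) for c in ("10.0.0.0/8", "192.168.1.0/24")]

def is_known(ip: str) -> bool:
    """Return True when ip falls inside any known CIDR range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in KNOWN_RANGES)

print(is_known("10.1.2.3"))     # True: inside 10.0.0.0/8
print(is_known("203.0.113.7"))  # False: outside every range
```

Sign-ins for which this check is False are the ones that should raise the alert; the KQL equivalent would filter on the unmatched output of the lookup rather than on ipv4_is_match().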
Hello team, I'm running a query through an API that I have connected to Excel. For a specific user on a specific day, a conditional access policy that blocks legacy authentication returns more results through the API than I get from the Azure portal. Which results should I trust?

— ankilo2011, Feb 23, 2026

Copy data to Oracle destination
We are trying to copy data to an Oracle DWH, and we are facing an issue with different settings of the "Write Batch Size" parameter. The copy activity works when we set "Write Batch Size" to 1, but performance is of course bad: it writes about 10,000 rows in 5 minutes. To speed up the copy we tried the default value of 10,000, but in that case the copy fails with the following error:

Failure happened on 'Sink' side. ErrorCode=UserErrorOdbcOperationFailed,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=ERROR [HY000] [Microsoft][ODBC Oracle Wire Protocol driver][Oracle]ORA-00604: error occurred at recursive SQL level 1 ORA-01031: insufficient privileges Error in parameter 1.,Source=Microsoft.DataTransfer.ClientLibrary.Odbc.OdbcConnector,''Type=Microsoft.DataTransfer.ClientLibrary.Odbc.Exceptions.OdbcException,Message=ERROR [HY000] [Microsoft][ODBC Oracle Wire Protocol driver][Oracle]ORA-00604: error occurred at recursive SQL level 1 ORA-01031: insufficient privileges Error in parameter 1.,Source=msora28.dll,'

So far we have INSERT privileges on the Oracle schema (writing does work with the parameter set to 1 and with direct SQL), but it looks like something different is used with the default Write Batch Size. We don't want to focus on the error message itself; it was obviously raised on the Oracle side. But we need more information to understand what's causing the issue. It looks like ADF uses two different ways to copy data depending on the value of the parameter. Any help would be greatly appreciated. Thanks in advance, Alessandro

— alessandro_horsa, Feb 23, 2026

Azure application insights workspace based migration related questions
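The suspicion in the post is plausible: with a batch size of 1 a sink typically issues one plain INSERT per row, while a larger batch size usually switches the driver to array binding against a single prepared statement, a different code path that may touch different Oracle privileges. The two modes can be sketched with Python's built-in sqlite3 (sqlite stands in here purely for illustration; it is not the Oracle driver ADF uses):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")
rows = [(i, f"row-{i}") for i in range(10_000)]

# "Write Batch Size = 1": one statement per row, one round trip each.
for row in rows[:5]:
    conn.execute("INSERT INTO t VALUES (?, ?)", row)

# "Write Batch Size = N": an array of rows bound to one prepared statement.
conn.executemany("INSERT INTO t VALUES (?, ?)", rows[5:])
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM t").fetchone()[0]
print(count)  # 10000
```

Both paths insert the same data, which is why the privilege error only surfacing in batched mode points at the driver's bulk path rather than at the INSERT grant itself.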
Hello, we have successfully migrated our classic Application Insights instances to workspace-based. After migration, I see that new data is stored in the linked Log Analytics workspace (which is expected), but new data is also still being stored in the classic Application Insights resource. Per the documentation, after migration old data remains in classic Application Insights and new data is stored in the linked Log Analytics workspace: https://learn.microsoft.com/en-us/azure/azure-monitor/app/convert-classic-resource

Questions:
1. Why is new data still being stored in the old classic App Insights after migration? This is not mentioned in the documentation linked above.
2. Assuming it is stored there for backward compatibility, for how many days after migration is this supported?
3. We have existing Power BI reports that pull data from classic Application Insights. After migration, if I want some data from the old App Insights and some from the new, I will have to write two separate queries and combine the results. Is my understanding correct?

— Kiran_Holi, Feb 20, 2026

Azure Synapse Analytics Pricing Meters - VCore
Hi, I am currently trying to understand the billing meters for Azure Synapse Analytics, and I have two questions:
- Is there an API or website where we can dig into the details of each meter?
- Can anyone explain what the vCore meters for Azure Synapse Analytics are?
Regards, Marc-André

— Rgimbald2015, Feb 20, 2026

Sentinel and limitation of 100 Sentinel instances per page as MSSP
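On the first question: Azure exposes a public, unauthenticated Retail Prices API that returns one record per billing meter and accepts OData-style filters, which makes it a reasonable place to drill into individual meters. A minimal sketch of building such a request (the filter value here is an assumption about how the service is named in the price sheet; verify the exact serviceName against the API's own output):

```python
from urllib.parse import urlencode

# Public endpoint of the Azure Retail Prices API; no SDK or auth needed.
BASE = "https://prices.azure.com/api/retail/prices"
params = {
    "$filter": "serviceName eq 'Azure Synapse Analytics'",
    "currencyCode": "USD",
}
url = f"{BASE}?{urlencode(params)}"
print(url)
```

Fetching that URL (e.g. with urllib.request.urlopen) returns JSON whose items carry fields such as meterName, unitOfMeasure, retailPrice, and armRegionName, so the vCore-based meters can be inspected one by one.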
Hi, are there any plans to increase the limit of 100 Sentinel instances per page? If you are an MSSP with more than 100 customers using Sentinel and you monitor alerts/incidents, you are limited to 100 customers per view/page.

— SorenAndersen, Nov 14, 2025

Fetching JSON object from Cosmos DB in Azure Synapse Studio fails
Hi everyone! I'm facing an issue while fetching a document from a linked Cosmos DB in Azure Synapse Studio. I've tried:
- fetching the object as VARCHAR(MAX) and as NVARCHAR(MAX)
- specifying the full path in the WITH clause: failing_complex_object VARCHAR(MAX) '$.failing_complex_object'
- decomposing it in the SELECT clause: JSON_QUERY([target_table].[failing_complex_object]) and JSON_VALUE(failing_complex_object, '$.failing_complex_object')

It always returns NULL. The complex objects that fetch without issue are bigger; one of them is a JSON object and the other is a JSON array. The complex object with the issue is a valid JSON object. The question is: why do two complex objects fetch fine while this one fails? Data examples are below. Any help would be greatly appreciated.

P.S. Due to an NDA I cannot share the actual DB data, but I assure you the schema is exactly the same.
P.P.S. Sorry for the formatting of the JSON and SQL examples; the "Insert code sample" tool didn't work.

Document example:

{
  "field_01": "title",
  "field_02": "12345678",
  "complex_object_01": [
    { "field_01": 1, "field_02": "data" },
    { "field_01": 2, "field_02": "data" }
  ],
  "complex_object_02": {
    "rows": [
      { "field_01": 0.1, "field_02": 0.0, "field_03": 0.1, "rowIndex": 0 }
    ]
  },
  "failing_complex_object": {
    "data_01": { "field_01": 0, "field_02": 0 },
    "data_02": { "field_01": 0, "field_02": 0 }
  }
}

Script example:

SELECT field_01, field_02, complex_object_01, complex_object_02, failing_complex_object
FROM OPENROWSET (
    PROVIDER = 'CosmosDB',
    CONNECTION = '',
    OBJECT = 'target_table'
) WITH (
    field_01 VARCHAR(MAX),
    field_02 VARCHAR(MAX),
    complex_object_01 VARCHAR(MAX),
    complex_object_02 VARCHAR(MAX),
    failing_complex_object VARCHAR(MAX)
) AS [target_table]
WHERE field_01 = 'title' AND field_02 = '12345678'

— kamiljalil, Nov 14, 2025

ADF Copy activity runs for too long
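One quick sanity check outside Synapse is to confirm that the document itself parses and that the path resolves, which rules out malformed JSON and narrows the problem to how the serverless SQL pool maps the Cosmos DB column. A minimal check with Python's json module, using the example document from the post (reduced to the relevant fields):

```python
import json

# The example document from the post, reduced to the failing field.
doc_text = '''{
  "field_01": "title",
  "failing_complex_object": {
    "data_01": {"field_01": 0, "field_02": 0},
    "data_02": {"field_01": 0, "field_02": 0}
  }
}'''

doc = json.loads(doc_text)           # would raise ValueError on invalid JSON
obj = doc["failing_complex_object"]  # path '$.failing_complex_object'
print(sorted(obj))  # ['data_01', 'data_02']
```

If the document passes a check like this, the NULL is coming from the Synapse side of the mapping (for example, from how the analytical store represents that property's schema) rather than from the JSON itself.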
My ADF copy activity takes 4 hours to run and ultimately times out after 2 hours of write time from interim tables. Source: MariaDB on-prem; destination: Azure SQL. I am running it on a self-hosted IR on a VM with the maximum disk size. What could the possible reasons be, and how can I increase the write time in ADF?

— Abhijsrwala, Nov 11, 2025

Azure Pipeline Template times out when trying to create a table that already exists.
Hi all, I have set up an Azure pipeline using the template 'Copy Dataverse data into Azure SQL using Synapse Link'. I set this up successfully on another database/environment. This time, the pipeline fails on the 'Create table' script: it is trying to create a table that already exists. Any ideas?

— lewis_carr, Nov 11, 2025
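A common workaround for non-idempotent DDL like this is to guard the CREATE with an existence check (in T-SQL, something along the lines of IF OBJECT_ID('dbo.MyTable') IS NULL before the CREATE TABLE); whether the template's 'Create table' script can be edited that way depends on the template. The idea, sketched with Python's built-in sqlite3, where the guard is part of the syntax:

```python
import sqlite3

conn = sqlite3.connect(":memory:")

# Running the same DDL twice: without a guard the second run would fail
# with "table already exists"; the guard makes the script re-runnable.
ddl = "CREATE TABLE IF NOT EXISTS account (id INTEGER PRIMARY KEY, name TEXT)"
conn.execute(ddl)
conn.execute(ddl)  # second run is a no-op instead of an error

tables = [r[0] for r in conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'")]
print(tables)  # ['account']
```

Azure SQL has no CREATE TABLE IF NOT EXISTS, hence the OBJECT_ID guard idiom there; sqlite is used here only to make the pattern runnable.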
Tags
- AMA (18 topics)
- Log Analytics (6 topics)
- azure (6 topics)
- Synapse (3 topics)
- azure monitor (3 topics)
- Log Analytics Workspace (3 topics)
- Stream Analytics (2 topics)
- azure databricks (2 topics)
- Azure Synapse Analytics (2 topics)
- Azure Log Analytics (2 topics)