Latest Discussions
Cost-effective alternatives for a control table for processed files in Azure Synapse
Hello, good morning. In Azure Synapse Analytics, I want to have a control table for the files that have already been processed by the bronze or silver layers. For this, I planned to create a dedicated SQL pool, but even at the minimum performance level it costs 1.51 USD per hour (as I show in the image), so I would like to know what more economical alternatives I have, since I will need to do inserts and updates on this control table, and that is not possible with the serverless option.

JuanMahecha · Jun 25, 2025 · Copper Contributor · 82 Views · 0 Likes · 2 Comments
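One economical direction, sketched below rather than offered as a definitive answer: keep the control table as a Delta table in the data lake and upsert it from a pay-per-use Spark pool instead of a dedicated SQL pool. The storage path, columns, and file name below are hypothetical placeholders, not the asker's actual setup.

```python
# Minimal sketch: a Delta-format control table upserted from a Synapse Spark
# pool. Path and column names are hypothetical placeholders.
from delta.tables import DeltaTable
from pyspark.sql import Row, SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical location of the control table in the lake
control_path = (
    "abfss://bronze@<storage-account>.dfs.core.windows.net/_control/processed_files"
)

# One file's processing record: insert if new, update if already tracked
updates = spark.createDataFrame(
    [Row(file_name="sales_2025_06_25.parquet", layer="bronze", status="processed")]
)

if DeltaTable.isDeltaTable(spark, control_path):
    (
        DeltaTable.forPath(spark, control_path)
        .alias("t")
        .merge(updates.alias("u"), "t.file_name = u.file_name")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute()
    )
else:
    # First run: create the control table
    updates.write.format("delta").save(control_path)
```

Other low-cost options in the same spirit would be an Azure SQL Database serverless tier or Azure Table Storage, both of which support inserts and updates without a per-hour dedicated pool charge.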

Should I ingest AADNonInteractiveUserSignInLogs from Entra ID to a LAW?
As the title says, I am interested in expert opinions on whether I should include AADNonInteractiveUserSignInLogs from Entra ID in a Log Analytics workspace (LAW), as this table dwarfs SignInLogs in the amount of data (by a factor of 8x) and therefore creates higher costs. Secondly, I am curious whether there are ways to reduce the number of non-interactive sign-in logs that are generated in the first place.

CSI · Mar 20, 2025 · Copper Contributor · 52 Views · 0 Likes · 0 Comments
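Before deciding, it can help to measure what the table actually ingests. A minimal sketch, assuming the azure-monitor-query package and a workspace ID you supply, that reports the daily billable volume of AADNonInteractiveUserSignInLogs:

```python
# Sketch: query the Usage table for the daily ingestion volume of the
# non-interactive sign-in logs. The workspace ID is a placeholder.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
Usage
| where DataType == "AADNonInteractiveUserSignInLogs"
| summarize IngestedGB = sum(Quantity) / 1024 by bin(TimeGenerated, 1d)
| order by TimeGenerated asc
"""

response = client.query_workspace(
    workspace_id="<workspace-id>",  # hypothetical placeholder
    query=query,
    timespan=timedelta(days=30),
)
for time_generated, ingested_gb in response.tables[0].rows:
    print(f"{time_generated}: {ingested_gb:.2f} GB")
```

On reducing the volume at the source: the usual levers are deselecting the non-interactive category in the Entra ID diagnostic settings, or applying an ingestion-time transformation to drop rows or columns, with the trade-off that dropped events are unavailable for detections.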

Data archiving of a Delta table in Azure Databricks
Hi all, I am currently researching data archiving for Delta table data on the Azure platform, as there is a data retention policy within the company. I have studied the official Databricks documentation on archival support (https://docs.databricks.com/en/optimizations/archive-delta.html). It says: "If you enable this setting without having lifecycle policies set for your cloud object storage, Databricks still ignores files based on this specified threshold, but no data is archived." Therefore, I am thinking about how to configure the lifecycle policy in the Azure storage account, and I have read the Microsoft documentation as well (https://learn.microsoft.com/en-us/azure/storage/blobs/lifecycle-management-overview). Say the Delta table data is stored in "test-container/sales" and there are lots of "part-xxxx.snappy.parquet" data files in that folder. Should I simply specify "tierToArchive", "daysAfterCreationGreaterThan: 1825", and "prefixMatch: ["test-container/sales"]"? However, I am worried about two things: will this archive mechanism impact normal Delta table operations, and what if a Parquet file moved to the archive tier contains both data created more than 5 years ago and data created less than 5 years ago? Is that possible, and could it move some data to the archive tier before it is 5 years old? I would highly appreciate it if someone could help me with the questions above. Thanks in advance.

Brian_169 · Jan 03, 2025 · Copper Contributor · 251 Views · 0 Likes · 1 Comment
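On the mechanics: Azure Storage lifecycle rules act on whole blobs using blob-level properties such as creation time, so a Parquet file is archived only once the file itself passes the age threshold, regardless of the dates inside its rows; note that operations like OPTIMIZE rewrite files and reset that clock. A minimal sketch of the rule discussed above, assuming the azure-mgmt-storage package and placeholder resource names:

```python
# Sketch: lifecycle rule that archives blobs under "test-container/sales"
# once the blob itself is older than 1825 days. Resource names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

client = StorageManagementClient(DefaultAzureCredential(), "<subscription-id>")

policy = {
    "policy": {
        "rules": [
            {
                "enabled": True,
                "name": "archive-sales-after-5-years",
                "type": "Lifecycle",
                "definition": {
                    "filters": {
                        "blobTypes": ["blockBlob"],
                        "prefixMatch": ["test-container/sales"],
                    },
                    "actions": {
                        "baseBlob": {
                            "tierToArchive": {"daysAfterCreationGreaterThan": 1825}
                        }
                    },
                },
            }
        ]
    }
}

# The management policy name must be "default"
client.management_policies.create_or_update(
    "<resource-group>", "<storage-account>", "default", policy
)
```

Per the linked Databricks doc, the table's archival threshold should be set to match the lifecycle rule's 1825 days so that Delta reads skip exactly the files the policy has tiered away.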

How do you prepare for Azure certifications or other exams with a full-time job?
I am 29 (M) and have been working in cloud for 4 years now, mostly on Azure, but I think it is time to look for a job at another organization, as my salary has been flat for a long time. I feel that getting certified will open up more opportunities and give me a better chance of getting my resume shortlisted. Please share any hacks or tips you have.

Nithin_khanna · Nov 16, 2024 · Copper Contributor · 79 Views · 0 Likes · 2 Comments

Azure Log Analytics workspace
Hi all, I have a requirement to keep a Log Analytics workspace in standalone mode. I want to remove or break the communication between Azure resources and the workspace, keeping it standalone to protect the logs that have already been collected, and I want to retain those logs for 1 year. Is there any way to achieve this? Please suggest. Note: this workspace is integrated with Sentinel as well.

veerakumare22 · Nov 06, 2024 · Copper Contributor · 48 Views · 0 Likes · 1 Comment
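For the retention half of the requirement, the workspace retention can be set to 365 days so logs already collected stay queryable for a year; stopping new ingestion is then a matter of removing the diagnostic settings, data connectors, or agents that feed the workspace. A minimal sketch, assuming the azure-mgmt-loganalytics package and placeholder names:

```python
# Sketch: raise the workspace retention to one year. Names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.loganalytics import LogAnalyticsManagementClient

client = LogAnalyticsManagementClient(DefaultAzureCredential(), "<subscription-id>")

ws = client.workspaces.get("<resource-group>", "<workspace-name>")
ws.retention_in_days = 365  # keep collected logs queryable for one year
client.workspaces.begin_create_or_update(
    "<resource-group>", "<workspace-name>", ws
).result()
```

Since the workspace is Sentinel-enabled, be aware that retention beyond the included 90 days is billed, so keeping a year of logs has a cost even after ingestion stops.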

Understanding Data Ingestion for Log Analytics and Sentinel Workspace
I'm trying to understand how data ingestion works for both Log Analytics and Microsoft Sentinel. Every time we notice a spike in data ingestion costs for Log Analytics, we see a similar increase in Sentinel costs as well. It seems as if data is being ingested into both, potentially doubling the ingestion and driving up our costs. Can someone explain whether this is expected behavior, or whether there is a way to optimize and avoid duplicate data ingestion between Log Analytics and Sentinel?

ram512 · Oct 11, 2024 · Copper Contributor · 185 Views · 0 Likes · 1 Comment
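One point worth confirming before hunting for duplicates: enabling Sentinel on a workspace does not ingest a second copy of the data; under the classic pricing tiers, the same ingested gigabytes are billed on both the Log Analytics meter and the Sentinel meter, so correlated spikes are expected. A minimal sketch, again assuming azure-monitor-query and a placeholder workspace ID, that ranks billable tables so a spike can be traced to whichever data type grew:

```python
# Sketch: rank the workspace's billable tables by ingested volume over 30 days.
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient

client = LogsQueryClient(DefaultAzureCredential())

query = """
Usage
| where IsBillable == true
| summarize IngestedGB = sum(Quantity) / 1024 by DataType
| order by IngestedGB desc
| take 10
"""

response = client.query_workspace(
    workspace_id="<workspace-id>",  # hypothetical placeholder
    query=query,
    timespan=timedelta(days=30),
)
for data_type, ingested_gb in response.tables[0].rows:
    print(f"{data_type}: {ingested_gb:.2f} GB")
```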