Latest Discussions
Sentinel to Detect Storage Account Created
Hi everyone, I'm trying to write a query that shows storage accounts when they are created, but I'm having no luck seeing them in Sentinel. The KQL query I have is:

AzureActivity
| where ResourceProviderValue == "MICROSOFT.STORAGE"
| where OperationNameValue == "Microsoft.Storage/storageAccounts/write"

When the query runs, it generates no output. Even if I take away | where OperationNameValue == "Microsoft.Storage/storageAccounts/write", I can see storage account activity, but not specifically the test storage account I spun up to detect. I'd appreciate it if anyone can help me get this query to work. Side note: I have an Azure Monitor alert enabled and I get the email, but I want those alerts to show up in Sentinel as incidents.

cybersal82 · Aug 13, 2025 · Copper Contributor · 51 Views · 0 likes · 2 Comments

How to archive diagnostic logs sent to storage account
I need help understanding storage append blobs created by diagnostic settings. When a diagnostic setting is configured to log to a storage account, the logs are created as append blobs. I have compliance requirements that mean I need to retain these blobs in immutable storage for 6 years; however, it seems I cannot use the blob lifecycle management feature to change the access tier of the append blobs to the archive tier, since that is only supported for block blobs. This page states "Setting the access tier is only allowed on Block Blobs. They are not supported for Append and Page Blobs.": https://learn.microsoft.com/en-au/azure/storage/blobs/access-tiers-overview I feel like the lifecycle management feature is often touted as the answer for moving data to the archive tier in long-term storage scenarios, but it seems that it does not even work with diagnostic logs, which is pretty baffling. How does Microsoft recommend moving diagnostic logs in a storage account to the archive tier? The only answer I can see would be to implement an Azure Function or Logic App to read each blob as it's written and write it back to another storage account as a block blob. But then, how do you know when the new file has finished being written to? Never mind the fact that this violates my immutability requirement.

reavop · Jul 10, 2025 · Copper Contributor · 160 Views · 0 likes · 3 Comments

Unable to upload .exe files to Azure blob container in Azure Storage Account
Hi, I am trying to upload an .exe file to my Azure blob container, but I am getting a "forbidden" error. When I tried with .txt, .ppt, and .ps1 extensions, I was able to upload those. Is there any restriction Microsoft has for .exe files? I have owner permission for the storage account resource. Kindly help. Thanks and regards, Hirdesh

Lancelotbaghel30 · Mar 03, 2025 · Copper Contributor · 79 Views · 0 likes · 1 Comment

Enable ADDS authentication fails
I am trying to enable ADDS authentication using the AzFilesHybrid module. When I execute the command

Join-AzStorageAccount `
    -ResourceGroupName $ResourceGroupName `
    -StorageAccountName $StorageAccountName `
    -SamAccountName $SamAccountName `
    -DomainAccountType $DomainAccountType `
    -OrganizationalUnitDistinguishedName $OuDistinguishedName

it keeps progressing with a warning, but the progress gets stuck without completing or showing any error message.

KartikDogra · Feb 26, 2025 · Brass Contributor · 70 Views · 0 likes · 1 Comment

Azure Table Storage not accessible
Since yesterday, 6 February 2025, access from Excel to Azure Tables with an Entra user is no longer possible. The storage account logs show AuthenticationErrors. In Excel, access to the storage account is done via https://saname.table.core.windows.net with an Entra user, and the users have the Storage Table Data Reader permission. We tested various storage accounts and locations but encountered the same issue. Access via Azure Storage Explorer is still possible without any issues.

lboe · Feb 07, 2025 · Copper Contributor · 204 Views · 0 likes · 1 Comment

Azure file share performance issue
Hi team, I recently migrated an on-premises file server to an Azure file share and set up AD authentication. It uses a public endpoint, but when I copied about 6 GB of data it took more than 2 hours, even though my internet bandwidth is very good. I have set up the hot tier, but performance remains the same. The support team is also unable to provide a proper resolution.

pk_Tech · Jan 31, 2025 · Copper Contributor · 394 Views · 0 likes · 3 Comments

How can we give access to a specific folder only within a blob container in an Azure storage account
I am trying to grant access to a specific folder and its contents within a container. I have tried using ACLs and SAS URLs, but they give access to the whole container.

Zer0cool114 · Jan 16, 2025 · Copper Contributor · 1.1K Views · 0 likes · 3 Comments

Azure NetApp Files | Azure CLI command to suspend/resume backup at volume level
I'm looking for the Azure CLI equivalent of the following action: https://learn.microsoft.com/en-us/azure/azure-netapp-files/backup-manage-policies#suspend-a-backup-policy-for-a-specific-volume I do see a CLI command at the policy level, az netappfiles account backup-policy update, which has the following parameter:

[--enabled {0, 1, f, false, n, no, t, true, y, yes}]
The property to decide whether the policy is enabled or not. Accepted values: 0, 1, f, false, n, no, t, true, y, yes

But this works at the NetApp account > policy level, and I'm unable to find an Azure CLI command for a specific volume. Is there a CLI equivalent, at the volume level, of the Configure Backups dialog box > Policy State > Suspend/Resume action? How can we do this step programmatically?

DurgalakshmiSivadas · Nov 04, 2024 · Copper Contributor · 133 Views · 0 likes · 3 Comments

Advice on upload processing
Hello, I am here to seek some advice regarding the architecture of an application. It is written in .NET. The application has a part where tenants can upload photos into their account (blob storage). They can do that either via a web UI or by connecting an uploader application (like Azure Storage Explorer). Each tenant basically has some kind of image inbox, and the uploaded images need to be processed afterwards. It is worth mentioning that we use Aspire and Azurite when developing.

My issue starts with the processing of those images. First I used Azure Functions, triggered by storage blob events, to process them. This worked but was kind of slow. When we installed Aspire it was no longer even possible to use Azure Functions, so we removed them and created our own "worker" application. This application polled Azure Blob Storage and, if there were images, started a sub-worker per tenant to process those files. This was already much faster and more reliable. But after deploying to production we quickly found out that the cost skyrocketed due to the polling.

I then looked into the new GetChangeFeedClient approach, which seems to be a less expensive way to get informed about new data in blob storage. I developed that, just to find out that it is not supported in Azurite, so we cannot even run it locally. At this point I do not understand why Azurite does not support all of the Azure features; how can anyone develop anything without an emulator? Anyway, this seems to be a dead end. I also thought about Azure Event Grid, but that is also not supported by Azurite.

At this point I am not sure what to do. Should I wait for .NET 9 and go back to Azure Functions, because they will be supported, and live with the slow processing and overhead? Or should I ditch Azure Blob Storage and find another storage solution?
Any advice is appreciated...

andreas1415 · Nov 03, 2024 · Copper Contributor · 134 Views · 0 likes · 2 Comments
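For the upload-processing question above, the event-driven route the poster is weighing (blob-created events instead of polling) can be sketched as an Event Grid-triggered Azure Function in the .NET isolated worker model. This is only a sketch under assumptions: the class, function name, and container layout are hypothetical, it presumes an Event Grid subscription on the storage account filtered to blob-created events, and Azurite will not deliver these events locally, so local runs would need the events posted to the function by hand.

```csharp
// Hypothetical sketch: react to blob uploads via Event Grid instead of polling.
// Assumes the Microsoft.Azure.Functions.Worker.Extensions.EventGrid package.
using Azure.Messaging.EventGrid;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Logging;

public class ImageInboxProcessor
{
    private readonly ILogger<ImageInboxProcessor> _logger;

    public ImageInboxProcessor(ILogger<ImageInboxProcessor> logger) => _logger = logger;

    [Function("ProcessUploadedImage")]
    public void Run([EventGridTrigger] EventGridEvent blobEvent)
    {
        // Only blob-created events are of interest here.
        if (blobEvent.EventType != "Microsoft.Storage.BlobCreated")
            return;

        // For these events the subject has the form
        // "/blobServices/default/containers/<container>/blobs/<name>",
        // so a per-tenant container name identifies whose inbox was written to.
        _logger.LogInformation("New blob: {subject}", blobEvent.Subject);

        // Download and process the image here; since Event Grid fires once per
        // finished blob write, there is no need for a polling loop or change feed.
    }
}
```

The design trade-off versus the poller is that cost scales with uploads rather than with elapsed time, at the price of a harder local-development story under Azurite.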
Resources
Tags
- azure blob storage (22 topics)
- azure blob (13 topics)
- azure files (9 topics)
- azure backup (7 topics)
- storage explorer (6 topics)
- Azure Data Lake (4 topics)
- queue storage (3 topics)
- Azure Archive (3 topics)
- updates (2 topics)
- azure disks (1 topic)