Latest Discussions
Copy from on-prem to Azure Archive Tier
Hi all, we have an on-prem NetApp file share and are thinking of copying its data to the Azure Archive tier (blob) for long-term backup. The files in Azure Archive will not be accessed unless there is some special need, and we are OK with the up-to-15-hour rehydration time. If we need to, we want to be able to rehydrate the files from Archive to a hot/cold blob and access them from NetApp. As I understand it, we can use the CLI in batch mode to upload a large number of files to Azure Archive. However, how easy or difficult is it going to be to rehydrate the files into a hot/cold blob and read them through the NetApp file share? Essentially (if I am correct) the rehydrated files will need to be mounted to NetApp. Is there any solution for this problem? Thanks, Pallav
Pallav415 · Nov 05, 2025 · Copper Contributor · 359 Views · 0 likes · 1 Comment
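The bulk upload and later rehydration described in the question above can be sketched with the Azure CLI. This is only a sketch: the account, container, and path names are placeholders, and getting the rehydrated files back into the NetApp share would still need a separate copy or mount step.

```shell
# Bulk-upload a locally mounted NetApp share straight into the Archive tier.
# Account, container, and path names below are placeholders.
az storage blob upload-batch \
  --account-name mystorageacct \
  --destination backup-container \
  --source /mnt/netapp-share \
  --tier Archive \
  --auth-mode login

# Later, rehydrate a single blob to the Hot tier. Standard priority can take
# up to ~15 hours; High priority is faster but costs more.
az storage blob set-tier \
  --account-name mystorageacct \
  --container-name backup-container \
  --name path/to/file.dat \
  --tier Hot \
  --rehydrate-priority Standard \
  --auth-mode login
```

Progress can be tracked by inspecting the blob's archive/rehydration status with `az storage blob show` until the tier change completes.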
SFTP storage event trigger stopped in live Azure Synapse mode
My Azure Synapse workspace is linked with Git and I am working on my feature branch. I created a storage event trigger that gets files from an SFTP server when I run the pipeline manually. In the master and feature branches its status is Started, but when I publish it into live mode its status becomes Stopped, and while publishing I get this error message: Forbidden. Role based access check failed for resource /subscriptions Does anyone know this issue?
junaid5 · Nov 05, 2025 · Copper Contributor · 313 Views · 0 likes · 1 Comment
JohnTM · Nov 04, 2025 · Brass Contributor · 30 Views · 0 likes · 2 Comments
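On the Synapse trigger question: a "Role based access check failed" error during publish usually means the identity doing the publish (or the workspace's managed identity) cannot create the Event Grid subscription that backs the storage event trigger. One commonly suggested fix, with all IDs as placeholders, is to grant a role that carries `Microsoft.EventGrid/eventSubscriptions/write` at the storage-account scope:

```shell
# Grant the publishing user (or the Synapse workspace managed identity)
# permission to create Event Grid subscriptions on the storage account.
# Every ID below is a placeholder.
az role assignment create \
  --assignee "<user-or-workspace-object-id>" \
  --role "EventGrid EventSubscription Contributor" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"
```

Whether this exact role is what your tenant policy expects is worth confirming; the point is that publishing to live mode performs the role check that manual debug runs in Git mode do not.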
Differences between PowerShell and browser when uploading a file
Hi all, has anybody noticed similar behavior? Uploading a file into the storage account via the browser works fine, but on the same workstation, doing the same with the PowerShell command Set-AzStorageBlobContent fails with ErrorCode: AuthorizationPermissionMismatch. Here is the longer trace:
$sa = Get-AzStorageAccount -ResourceGroupName RG01 -Name storage01
$strCTX = New-AzStorageContext -StorageAccountName $sa.StorageAccountName
$strCTX | Set-AzStorageBlobContent -File C:\temp\test.txt -Container delate -Blob test.txt -Verbose
VERBOSE: Performing the operation "Set" on target "test.txt".
Set-AzStorageBlobContent: This request is not authorized to perform this operation using this permission. HTTP Status Code: 403 - HTTP Error Message: This request is not authorized to perform this operation using this permission. ErrorCode: AuthorizationPermissionMismatch RequestId: 3150eeb6-761e-0096-2edd-56e8bc000000 Time: Tue, 30 Sep 2025 10:25:51 GMT
VERBOSE: Transfer Summary -------------------------------- Total: 1. Successful: 0. Failed: 1.
What makes this a bit more odd is that when I look at the roles and their data access, they both look like the following (screenshot not included). So I'm not even sure how I have access to that SA.
Petri-X · Oct 01, 2025 · Bronze Contributor · 41 Views · 0 likes · 1 Comment
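A likely explanation for the mismatch above: the portal can fall back to access-key authentication, while an OAuth-based PowerShell context is checked against data-plane RBAC, and control-plane roles such as Owner do not by themselves grant blob data access. A commonly suggested remedy (placeholder object ID; resource names taken from the trace) is to assign a data-plane role and create the context with `-UseConnectedAccount`:

```shell
# Control-plane roles (Owner, Contributor) do not grant blob data access
# over OAuth. A data-plane role typically resolves
# AuthorizationPermissionMismatch. In PowerShell, also build the context
# with: New-AzStorageContext -StorageAccountName storage01 -UseConnectedAccount
az role assignment create \
  --assignee "<user-object-id>" \
  --role "Storage Blob Data Contributor" \
  --scope "/subscriptions/<sub-id>/resourceGroups/RG01/providers/Microsoft.Storage/storageAccounts/storage01"
```

Role assignments can take a few minutes to propagate before the 403 disappears.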
Sentinel to Detect Storage Account Created
Hi everyone, when trying to write a query to show when storage accounts are created, I'm having no luck seeing it in Sentinel. The KQL query I have is: AzureActivity | where ResourceProviderValue == "MICROSOFT.STORAGE" | where OperationNameValue == "Microsoft.Storage/storageAccounts/write" When the query runs, it generates no output. Even if I take away the line with OperationNameValue, I can see storage accounts, but not specifically the test storage account I spun up to detect. I'd appreciate it if anyone can help me get this query to work. Side note: I have an Azure Monitor alert enabled and I get emails, but I want those alerts to be shown in Sentinel as incidents.
cybersal82 · Aug 18, 2025 · Copper Contributor · 82 Views · 0 likes · 2 Comments
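Two things worth checking for the Sentinel query above: KQL `==` is case-sensitive and `OperationNameValue` in `AzureActivity` is often stored upper-cased, so a case-insensitive `=~` match is usually safer; and `AzureActivity` is only populated if the Azure Activity connector (a subscription-level diagnostic setting) is actually streaming to the Sentinel workspace. A quick check, shown here as a sketch:

```shell
# If no subscription diagnostic setting points at the Sentinel workspace,
# the AzureActivity table stays empty no matter how the query is written.
az monitor diagnostic-settings subscription list --output table

# Suggested case-insensitive form of the query (KQL, shown as a comment):
#   AzureActivity
#   | where ResourceProviderValue =~ "MICROSOFT.STORAGE"
#   | where OperationNameValue =~ "Microsoft.Storage/storageAccounts/write"
```

Activity-log ingestion can also lag by several minutes, so a freshly created test account may simply not have arrived yet when the query is run.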
How to archive diagnostic logs sent to storage account
I need help understanding storage append blobs created by diagnostic settings. When a diagnostic setting is configured to log to a storage account, the logs are created as append blobs. I have compliance requirements that mean I need to retain these blobs in immutable storage for 6 years; however, it seems I cannot use the blob lifecycle management feature to change the access tier of the append blobs to the Archive tier, as it is only supported for block blobs. This page states "Setting the access tier is only allowed on Block Blobs. They are not supported for Append and Page Blobs." https://learn.microsoft.com/en-au/azure/storage/blobs/access-tiers-overview I feel like the lifecycle management feature is often touted as the answer for changing the access tier in long-term storage scenarios, but it seems it does not even work with diagnostic logs, which is pretty baffling. How does Microsoft recommend moving diagnostic logs in a storage account to Archive-tier storage? The only answer I can see would be to implement an Azure Function or Logic App to read each blob as it's written and write it back to another storage account as a block blob. But then, how do you know when the new file has finished being written? Never mind the fact that this violates my immutability requirement.
reavop · Jul 11, 2025 · Copper Contributor · 193 Views · 0 likes · 3 Comments
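One workaround sometimes used for the append-blob problem above: a plain Copy Blob preserves the source blob type, but AzCopy can re-write an append blob as a block blob and set the destination tier in the same pass. This is a sketch with placeholder URLs and SAS tokens, not an endorsement of any particular retention design:

```shell
# Re-write diagnostic-log append blobs as block blobs in the Archive tier.
# Copy Blob alone keeps the source blob type, so azcopy's --blob-type
# conversion is the key flag here. URLs and SAS tokens are placeholders.
azcopy copy \
  "https://logsacct.blob.core.windows.net/insights-logs/<path>?<SAS>" \
  "https://archiveacct.blob.core.windows.net/archive-container/?<SAS>" \
  --blob-type BlockBlob \
  --block-blob-tier Archive \
  --recursive
```

On the "when is the file finished" question: diagnostic logs are typically written into hourly blobs, so a common (hedged) heuristic is to only copy blobs whose last-modified time is safely in the past, e.g. older than a day.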
Azure NetApp Files | Azure CLI command to suspend/resume backup at volume level
I'm looking for the Azure CLI equivalent of the following action: https://learn.microsoft.com/en-us/azure/azure-netapp-files/backup-manage-policies#suspend-a-backup-policy-for-a-specific-volume I do see a CLI command (at the policy level), az netappfiles account backup-policy update, which has the following parameter: [--enabled {0, 1, f, false, n, no, t, true, y, yes}] The property to decide whether the policy is enabled or not. But this is at the NetApp account > policy level. I'm unable to find the Azure CLI command to do this for a specific volume, i.e. the Configure Backups dialog box > Policy State (Suspend/Resume). Is there a CLI command for this action at the volume level? How shall we achieve this if we have to do this step programmatically?
DurgalakshmiSivadas · May 14, 2025 · Copper Contributor · 150 Views · 0 likes · 3 Comments
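If no first-class volume-level parameter exists in your CLI version, one fallback for the question above is to PATCH the volume resource directly through the REST API with `az rest`. The property path used here (`dataProtection.backup.policyEnforced`) and the api-version are assumptions to verify against the current "Volumes - Update" REST reference; all IDs are placeholders:

```shell
# Hypothetical sketch: suspend the backup policy on a single volume by
# patching the volume resource. Verify the property name and api-version
# against the Azure NetApp Files "Volumes - Update" REST documentation
# before relying on this.
az rest --method patch \
  --url "https://management.azure.com/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.NetApp/netAppAccounts/<account>/capacityPools/<pool>/volumes/<volume>?api-version=2024-03-01" \
  --body '{"properties": {"dataProtection": {"backup": {"policyEnforced": true}}}}'
```

The same pattern with the flag flipped back would correspond to Resume; the portal's Suspend/Resume toggle is ultimately just such a PATCH on the volume.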
Azure Storage Account feedback
Hey, after having a support case with the engineering team for Azure Storage, I was encouraged to leave feedback here as well. Given the current state of Azure storage accounts, the website for Azure Storage should be updated. An Azure storage account does not in any way replace the old-fashioned file server we have been using for shared drives for all users. The storage account does not handle metadata well, so it won't work well for file shares whose files carry metadata, which is just about all of them. Users will see delays when opening files and browsing file shares. I know a metadata caching add-on is on the way, but for now that is only for premium accounts, and paying for a premium account just for Excel, Word, PowerPoint, etc., is way too expensive and unnecessary.
mracket · Apr 13, 2025 · Copper Contributor · 441 Views · 0 likes · 1 Comment
Unable to upload .exe files to Azure blob container in Azure Storage Account
Hi, I am trying to upload an .exe file to my Azure blob container but I am getting a Forbidden error. When I tried with .txt, .ppt, and .ps1 extensions I was able to upload those. Is there any restriction Microsoft has for .exe files? I have the Owner permission on the Storage Account resource. Kindly help. Thanks and regards, Hirdesh
Lancelotbaghel30 · Mar 04, 2025 · Copper Contributor · 118 Views · 0 likes · 1 Comment
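On the .exe question above: Blob Storage itself has no built-in file-extension filter, so a Forbidden response only on .exe usually points to something in the path inspecting the content, e.g. a corporate proxy, firewall, or WAF rule, or a security product attached to the account. If Microsoft Defender for Storage might be in the picture, one quick check of its plan at the subscription scope is:

```shell
# Blob Storage does not block extensions by itself; check whether the
# Defender for Storage plan is enabled on the subscription, since
# attached security tooling is a common cause of blocked executables.
az security pricing show --name StorageAccounts --output table
```

If Defender is off, the block is more likely coming from the network path between the workstation and the storage endpoint than from Azure Storage.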
Tags
- azure blob storage (23 Topics)
- azure blob (14 Topics)
- azure files (9 Topics)
- azure backup (7 Topics)
- storage explorer (6 Topics)
- Azure Data Lake (4 Topics)
- queue storage (3 Topics)
- Azure Archive (3 Topics)
- updates (2 Topics)
- azure disks (1 Topic)