Latest Discussions
Inherited VM data disk setup query
Hi there, I've inherited a VM in Azure: an OS disk (127 GB Premium SSD) and 4 x 4095 GB Standard HDD data disks. In the VM the data drive totals 10.2 TB with 7.6 TB free. I want to reduce the size but realize I can't; I'd have to create new, smaller disks and then move the data to a new drive letter in Windows. My confusion is: why are there four identically sized disks? And how do they total only 10.2 TB when the Windows partition manager shows no unallocated space? Any suggestions if I've missed anything would be great. Thank you. — SlevinKelevra, Feb 05, 2026
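A unit-conversion aside may help here. Azure sizes disks in GiB while drive vendors and some tools use decimal TB, and 4095 GB was historically a per-disk ceiling in Azure, which is the usual reason a VM ends up with several identical disks striped together. Assuming the portal's 4095 figure is GiB (everything in this sketch beyond the source's numbers is an assumption), the raw total of the four disks is about 16 TiB either way, noticeably more than the reported 10.2 TB, which would be consistent with some capacity going to resiliency overhead (e.g. a parity Storage Spaces pool) rather than sitting unallocated:

```python
# Raw capacity of four 4095 GiB data disks, expressed both in decimal
# terabytes and in binary tebibytes (which Windows labels "TB").
GIB = 2**30   # bytes in one gibibyte
TB = 10**12   # bytes in one decimal terabyte
TIB = 2**40   # bytes in one tebibyte

disks = 4
disk_size_gib = 4095
raw_bytes = disks * disk_size_gib * GIB

print(f"raw capacity: {raw_bytes / TB:.1f} TB (decimal)")   # ~17.6 TB
print(f"raw capacity: {raw_bytes / TIB:.1f} TiB (binary)")  # ~16.0 TiB
```

Checking the VM's Storage Spaces configuration (if any) would show whether the missing capacity is parity or mirror overhead.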
Unable to change the custom name in an Azure File Share URL

Hi all, I am trying to change the custom name in my Azure File Share URL from its current format (//<Storage Account Name>.file.core.windows.net/<file share name>) to a desired format (//AFS/<file share name>) without making any changes on the on-premises side, but I have been unable to do so. I would like to access the Azure file share from an on-premises system's file explorer using the (//AFS/<file share name>) URL. If there is any possibility, please let me know your feedback. Thanks. — Rajesh605, Jan 29, 2026
Azure File Sync: avoid duplicate files

Hello guys, I have a share with two server endpoints with cloud tiering enabled in an Azure File Sync environment. Unfortunately, duplicate files are being created on this share with the name of the server endpoint appended, which means the same file is being opened by users on both endpoints. My question is whether it's possible to set a priority on these endpoints so that users connect only to the first endpoint server unless it is unavailable, in which case the second endpoint can be used. Or is there another method to avoid these duplicate files? Thanks. — Aventis, Jan 14, 2026
Azure Logic App creating multiple containers

Hi guys, I have a simple workflow in Logic Apps that reads data from a SharePoint library and then creates the file inside a blob container. The problem is that a container is being created each time the file is updated: the Logic App creates the file normally, but it also creates a new container with an ID code as its name. Has anyone experienced this before? Thanks. — FabData00, Jan 12, 2026
Status of an uploading file

Hi, is it possible to get the status of a file being uploaded to blob storage? I have a case where user A is uploading a large file to storage via SAS to myContainer/uploadingBlob. My server knows that someone intends to upload there, but there is no way to tell directly that the user is actually uploading anything. Is it possible to get that information from storage? And what if the user cancels the upload? — Dominik_Golebiowski, Dec 29, 2025
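For context on why this is awkward: Azure block blobs are uploaded by staging individual blocks and then committing a block list, and the blob only appears once the commit happens, so a half-finished upload is invisible to a plain existence check. The staged (uncommitted) blocks can, however, be inspected via the Get Block List operation. A toy model of those semantics (pure Python, not the real SDK; all names are illustrative):

```python
# Toy model of Azure block-blob upload semantics: blocks are staged
# one at a time, and the blob only "exists" after the block list is
# committed. Progress of an in-flight upload is visible only through
# the uncommitted block list.
class ToyBlockBlob:
    def __init__(self):
        self.uncommitted = {}   # block_id -> data, staged but not yet visible
        self.committed = None   # final blob content, None until commit

    def stage_block(self, block_id, data):
        """Analogue of Put Block: upload one chunk."""
        self.uncommitted[block_id] = data

    def commit(self, block_ids):
        """Analogue of Put Block List: assemble staged blocks into the blob."""
        self.committed = b"".join(self.uncommitted[b] for b in block_ids)
        self.uncommitted.clear()

    def exists(self):
        return self.committed is not None

    def upload_progress(self):
        """Bytes staged so far -- what Get Block List ('uncommitted') exposes."""
        return sum(len(d) for d in self.uncommitted.values())

blob = ToyBlockBlob()
blob.stage_block("0001", b"x" * 1024)
blob.stage_block("0002", b"y" * 1024)
print(blob.exists(), blob.upload_progress())  # False 2048
blob.commit(["0001", "0002"])
print(blob.exists(), len(blob.committed))     # True 2048
```

So a server could poll the uncommitted block list of myContainer/uploadingBlob to estimate progress. If the user cancels, the staged blocks simply remain uncommitted, and as I understand the service's behavior they are garbage-collected after about a week if never committed.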
Enable Performance Plus for Azure Disk Storage

We want to take advantage of this additional performance and wanted to check whether it has reached GA (General Availability). I see that it has been almost a year since the feature was introduced, but it is still in public preview. We want to enable this for our production workload. Do we need to wait for GA, or can we go ahead with it? We need your guidance at the earliest. — Venkata2340, Dec 28, 2025
Manage/restore metadata when blob is updated
I'm using an Azure Storage container of block blobs as a data source for an Azure AI Search index, and I'm using blob metadata key-value pairs for some custom data. But the metadata gets wiped when a blob is updated. How are folks managing that? For reference, I've also got a Cosmos DB set up for now with a cross-reference I can restore from, but the process is manual. I considered using Cosmos as my data source instead, but I also need a place to store and serve the media files related to these records. — kathyhurchla, Nov 19, 2025
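One relevant detail: blob metadata is set atomically by the write that carries it, so an overwrite upload that supplies no metadata leaves the new version with none. The usual workaround is read-merge-write: fetch the existing metadata first and re-supply it alongside the new content (with the real SDK this is roughly `get_blob_properties().metadata` followed by `upload_blob(..., metadata=...)`). A dict-based sketch of the pattern (all names here are illustrative, not an Azure API):

```python
# Sketch of the read-merge-write pattern for preserving custom metadata
# across a blob overwrite. A dict stands in for the blob service.
store = {}  # blob_name -> {"content": bytes, "metadata": dict}

def upload(name, content, metadata=None):
    """Overwrite upload: metadata is replaced wholesale, never merged."""
    store[name] = {"content": content, "metadata": dict(metadata or {})}

def upload_preserving_metadata(name, content, extra=None):
    """Fetch existing metadata first, then re-supply it with the new content."""
    existing = store.get(name, {}).get("metadata", {})
    merged = {**existing, **(extra or {})}
    upload(name, content, merged)

upload("doc.txt", b"v1", {"owner": "kathy"})
upload("doc.txt", b"v2")                      # plain overwrite: metadata wiped
print(store["doc.txt"]["metadata"])           # {}

upload("doc.txt", b"v1", {"owner": "kathy"})
upload_preserving_metadata("doc.txt", b"v2")  # metadata survives the overwrite
print(store["doc.txt"]["metadata"])           # {'owner': 'kathy'}
```

The race between the read and the write can matter under concurrent updates; conditional headers (ETags) are the usual guard there.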
Copy from on-prem to Azure Archive tier

Hi all, we have an on-prem NetApp file share. We are thinking of copying data from it to the Azure archive tier (blob) for long-term backup. The files in archive will not be accessed unless there is some special need, and we are OK with the roughly 15-hour rehydration time. If needed, we want to be able to rehydrate the files from archive to a hot/cool blob and access them from NetApp. As I understand it, we can use the CLI in batch mode to upload a large number of files to the archive tier. However, how easy or difficult is it going to be to rehydrate the files into a hot/cool blob and read them through the NetApp file share? Essentially (if I am correct) the rehydrated files will need to be mounted to NetApp. Is there any solution for this problem? Thanks, Pallav — Pallav415, Nov 04, 2025
SFTP storage event trigger stopped in live Azure Synapse mode

My Azure Synapse workspace is linked with Git, and I am working on my feature branch. I created a storage event trigger that gets files from an SFTP server when I run the pipeline manually. In the master and feature branches its status is Started, but when I publish to live mode its status becomes Stopped, and while publishing I get this error message: "Forbidden. Role based access check failed for resource /subscriptions". Does anyone know this issue? — junaid5, Nov 04, 2025
Tags
- azure blob storage (24 topics)
- azure blob (15 topics)
- azure files (9 topics)
- azure backup (7 topics)
- storage explorer (7 topics)
- Azure Data Lake (4 topics)
- queue storage (3 topics)
- Azure Archive (3 topics)
- updates (2 topics)
- azure disks (1 topic)