Azure Blob Storage
Storage Account, grant SAS tokens but not Entra ID
Hi there, I was playing with Entra and a storage account. I do have permissions in my subscription to generate SAS tokens for sharing access, but when I try to grant Entra ID access, that seems to be blocked. Just wondering why I can hand out SAS tokens but not grant access. What could be the role I'm missing?
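One likely explanation: creating a SAS only needs rights to the storage account keys (Microsoft.Storage/storageAccounts/listKeys/action), while assigning an Entra ID role needs Microsoft.Authorization/roleAssignments/write, which only roles such as Owner, User Access Administrator, or Role Based Access Control Administrator carry. A minimal sketch for checking this, using hypothetical subscription and account names:

$scope = "/subscriptions/<sub-id>/resourceGroups/rg-demo/providers/Microsoft.Storage/storageAccounts/stdemo"

# See which roles you currently hold at the storage account scope
Get-AzRoleAssignment -SignInName "me@contoso.com" -Scope $scope

# What a grant would look like once you hold a role carrying roleAssignments/write
New-AzRoleAssignment -SignInName "colleague@contoso.com" -RoleDefinitionName "Storage Blob Data Reader" -Scope $scope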
Disaster Recovery options for Premium Block Blob HNS storage

Looking to implement a Data Lakehouse in Azure, and DR is becoming a large focus. Ideally we'd use an Azure Storage Account with Hierarchical Namespace enabled, on the Premium tier rather than Standard for better read performance, with the block blob option. However, there is no GRS option for any Premium Azure Storage Account. Are there any DR options to get this data to another region in Azure, besides a dual write (two-phase commit) into each storage account?
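Since premium block blob accounts only offer LRS/ZRS, and object replication isn't available once HNS is enabled, cross-region DR usually comes down to a scheduled copy job (AzCopy, Data Factory, or a script). A minimal sketch of the script route, assuming hypothetical account names srcpremium/dstpremium and a container named data:

$srcKey = (Get-AzStorageAccountKey -ResourceGroupName rg-lake -Name srcpremium)[0].Value
$dstKey = (Get-AzStorageAccountKey -ResourceGroupName rg-lake-dr -Name dstpremium)[0].Value
$srcCtx = New-AzStorageContext -StorageAccountName srcpremium -StorageAccountKey $srcKey
$dstCtx = New-AzStorageContext -StorageAccountName dstpremium -StorageAccountKey $dstKey

# Asynchronous server-side copy of every blob in the container to the DR region
Get-AzStorageBlob -Container data -Context $srcCtx | ForEach-Object {
    Start-AzStorageBlobCopy -SrcContainer data -SrcBlob $_.Name -Context $srcCtx -DestContainer data -DestBlob $_.Name -DestContext $dstCtx
}

For data at this scale, an azcopy sync run on a schedule is usually the more practical form of the same idea.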
Securing Terraform State in Storage Account

Hey guys, we need to push our Terraform state file into an Azure remote backend (storage container). While we have read many resources like here, we are curious to know if there are any tutorials and explanations for securing access to the storage account from outside. Basically, we wanted to disallow any public access and only allow specific IP addresses, but I guess that's a bit difficult to manage when running CI/CD pipelines. We also wanted to allow only a specific virtual network, but then the question is how to connect from our local machine to this virtual network; the same goes for private endpoints. Are there any suggestions and resources? Many thanks!
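One common pattern, sketched below with hypothetical names: default-deny the account firewall, allowlist the static IPs you control, and have the pipeline temporarily add the agent's IP before terraform init and remove it afterwards (Microsoft-hosted agents don't have stable addresses):

# Deny everything by default, then allowlist known addresses
Update-AzStorageAccountNetworkRuleSet -ResourceGroupName rg-tfstate -Name sttfstate -DefaultAction Deny
Add-AzStorageAccountNetworkRule -ResourceGroupName rg-tfstate -Name sttfstate -IPAddressOrRange "203.0.113.10"

# In the pipeline: open a hole for the current agent, run terraform, close it again
$agentIp = Invoke-RestMethod -Uri "https://api.ipify.org"
Add-AzStorageAccountNetworkRule -ResourceGroupName rg-tfstate -Name sttfstate -IPAddressOrRange $agentIp
# ... terraform init/plan/apply ...
Remove-AzStorageAccountNetworkRule -ResourceGroupName rg-tfstate -Name sttfstate -IPAddressOrRange $agentIp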
Using Azure Storage Blob as a CDN using Azure CDN - Anonymous Public Access

Hi all, I am wondering if anyone here has any answers to help solve this problem. We have a CDN which is used by a public website. The CDN is using Azure CDN / Front Door with a custom domain and is connected to the Azure storage blob using a VPN. The storage blob is set up to only accept connections from the VPN and specified IP addresses, and to add new content to the storage blob you need to request elevated permissions via PIM to an RBAC role specific to this storage blob. The problem is the company I am working for wants to enforce disabling anonymous public access to the storage blob, and we can't use SAS tokens from the website. Our tests show that doing so stops anyone accessing the website from being able to load assets from our CDN. Is there some way to configure the storage blob to accept anonymous public access only from the CDN while still turning off anonymous public access on the storage blob?
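One pattern that may fit: turn anonymous access off and have the CDN itself attach a read-only SAS when it fetches from the origin (the classic Azure CDN rules engine could append it; check what your Front Door SKU offers), so the public site never handles a token. A sketch of the storage-side pieces, with hypothetical resource names:

# Disable anonymous access account-wide
Set-AzStorageAccount -ResourceGroupName rg-web -Name stcdnassets -AllowBlobPublicAccess $false

# Mint a long-lived, read-only container SAS for the CDN origin configuration
$keys = Get-AzStorageAccountKey -ResourceGroupName rg-web -Name stcdnassets
$ctx = New-AzStorageContext -StorageAccountName stcdnassets -StorageAccountKey $keys[0].Value
New-AzStorageContainerSASToken -Context $ctx -Name assets -Permission r -ExpiryTime (Get-Date).AddYears(1)

Front Door Premium can additionally reach the origin over Private Link, which would replace the VPN/IP allowlist on the network side.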
Azure Storage Container - Soft Delete Monitoring

Hi All, Can someone let me know if there is a way to export all the soft-deleted items in my container on a daily basis into a CSV, or any file format I could connect Power BI with? I want to monitor all active and soft-deleted items in a Power BI report I have created, and at the moment I can't seem to find a way to get a list of all the items that have been deleted. Thanks
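A minimal sketch of the export side, assuming hypothetical account and container names; Get-AzStorageBlob can include soft-deleted blobs, which you can then write to CSV on a schedule (an Azure Automation runbook, for example):

$keys = Get-AzStorageAccountKey -ResourceGroupName rg-data -Name stmonitored
$ctx = New-AzStorageContext -StorageAccountName stmonitored -StorageAccountKey $keys[0].Value

# -IncludeDeleted returns soft-deleted blobs alongside active ones
Get-AzStorageBlob -Container documents -Context $ctx -IncludeDeleted |
    Select-Object Name, IsDeleted, Length, LastModified |
    Export-Csv -Path "blob-inventory-$(Get-Date -Format yyyy-MM-dd).csv" -NoTypeInformation

Blob inventory policies, which emit scheduled reports Power BI can read, may also be worth a look.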
Create and use a SAS (Shared Access Signature) with PowerShell in Azure!

Hi Azure friends, I used the PowerShell ISE for this configuration, but you are also very welcome to use Visual Studio Code, just as you wish. Please start with the following steps to begin the deployment (the lines starting with # are comments):

#The first two lines have nothing to do with the configuration, but make some space below in the blue part of the ISE
Set-Location C:\Temp
Clear-Host

#So that you can carry out the configuration, you need the necessary cmdlets; these are contained in the module Az (the umbrella module for a number of submodules)
Install-Module -Name Az -Force -AllowClobber -Verbose

#Log into Azure
Connect-AzAccount

#Select the correct subscription
Get-AzContext
Get-AzSubscription
Get-AzSubscription -SubscriptionName "your subscription name" | Select-AzSubscription

#Variables
$location = "westeurope"
$rgname = "twstoragedemo"

#A file we use later
$today = Get-Date
New-Item -ItemType file -Path C:\Temp\test.txt -Force -Value $today

#Create a Resource Group
New-AzResourceGroup -Name $rgname -Location $location

#Create a Storage Account
New-AzStorageAccount -Location $location -ResourceGroupName $rgname -Name twstorage75 -SkuName Standard_LRS

#We need at least one Storage Account Key
$keys = Get-AzStorageAccountKey -Name twstorage75 -ResourceGroupName $rgname

#Now we need to create a Storage context
$context = New-AzStorageContext -StorageAccountName twstorage75 -StorageAccountKey $keys[0].Value

#Once we have it, let's create a storage container
New-AzStorageContainer -Context $context -Name bilder

#Now we have the required prerequisites to create a SAS
$token = New-AzStorageContainerSASToken -Context $context -Name bilder -Permission rwd

#Now we need to create a Storage Container context
$containercontext = New-AzStorageContext -SasToken $token -StorageAccountName twstorage75

#Let's upload a file to the Storage Container
Set-AzStorageBlobContent -Context $containercontext -Container bilder -File C:\Temp\test.txt

#List the blobs in the container
Get-AzStorageBlob -Container bilder -Context $context | Select-Object Name, BlobType, LastModified

Now you have used PowerShell to create an Azure Storage Account and a Shared Access Signature! Congratulations!

#Delete all resources (when you no longer need them)
Remove-AzResourceGroup -Name $rgname -Force

I hope this article was useful. Best regards, Tom Wechsler

P.S. All scripts (#PowerShell, Azure CLI, #Terraform, #ARM) that I use can be found on GitHub! https://github.com/tomwechsler
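One small addition worth considering: the tutorial's token is created without an explicit lifetime. New-AzStorageContainerSASToken accepts -StartTime and -ExpiryTime, so a deliberately short-lived token would look like this (a sketch reusing the tutorial's $context):

#Create a read/write/delete SAS that expires after four hours
$token = New-AzStorageContainerSASToken -Context $context -Name bilder -Permission rwd -StartTime (Get-Date) -ExpiryTime (Get-Date).AddHours(4)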
Can an azure blob with malware infect other blobs in the storage account?

Hi, I need to know whether malware could spread to other blobs or files if I accidentally uploaded it to an Azure storage account, or whether all blobs are isolated from each other. If it can spread, would it infect blobs/files only within the same container, or across the entire storage account?
Read large volume of files from Blob Storage

Hi! I have a Blob Storage containing approximately 1 billion documents, split across different folders. I'm currently facing challenges in finding an ideal solution to retrieve the files, read their metadata, and update a tag to indicate whether the document has been processed or not. Could you provide any suggestions on how to address this problem? Currently, I am using the blobContainerClient.listBlobsByHierarchy(delimiter, options, null) method to retrieve the files. However, there are cases where I am unable to retrieve all the documents within a folder. Full implementation (Java):

...
List<Map<String, Map<String, String>>> listOfMetadata = new ArrayList<>();
String delimiter = "/";

log.info("Setting details");
BlobListDetails blobListDetails = new BlobListDetails().setRetrieveMetadata(true).setRetrieveTags(true);

log.info("Setting options");
ListBlobsOptions options = new ListBlobsOptions().setPrefix(prefix).setDetails(blobListDetails).setMaxResultsPerPage((int) listLimit);

blobContainerClient.listBlobsByHierarchy(delimiter, options, null).stream().parallel().limit(listLimit).forEach(blob -> {
    Map<String, String> mapOfTags = blob.getTags();
    if (mapOfTags.containsKey("wasRead") && mapOfTags.get("wasRead").equals("false")) {
        Map<String, Map<String, String>> mapOfBlobs = new HashMap<>();
        mapOfBlobs.put(blob.getName(), blob.getMetadata());
        listOfMetadata.add(mapOfBlobs);
    }
});
return listOfMetadata;
...
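Two observations that may help. First, the snippet combines .parallel() with a non-thread-safe ArrayList, which can silently drop results; a concurrent collection (or a sequential stream) would rule that out. Second, since the blobs already carry a wasRead index tag, the service's Find Blobs by Tags API queries the tag index server-side and returns only matching blobs, avoiding enumerating a billion names client-side; the Java SDK exposes it as BlobServiceClient.findBlobsByTags. The same call sketched in PowerShell, with hypothetical names:

$keys = Get-AzStorageAccountKey -ResourceGroupName rg-docs -Name stdoclake
$ctx = New-AzStorageContext -StorageAccountName stdoclake -StorageAccountKey $keys[0].Value

# Server-side query over the blob index: only unprocessed blobs come back
Get-AzStorageBlobByTag -Context $ctx -TagFilterSqlExpression "`"wasRead`" = 'false'"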