Azure Data Lake
Disaster Recovery options for Premium Block Blob HNS storage
We're looking to implement a Data Lakehouse in Azure, and disaster recovery is becoming a major focus. Ideally we'd use an Azure Storage account with hierarchical namespace enabled, on the Premium tier rather than Standard for better read performance, with the block blob option. However, no Premium storage account SKU offers GRS. Are there any DR options to get this data to another region in Azure, aside from a dual write (two-phase commit) into each storage account?
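For illustration, one alternative to the dual write mentioned above is a scheduled copy job into a secondary-region account. Below is a minimal PowerShell sketch assuming the Az.Storage module; the account and container names are hypothetical, and this is a sketch of the pattern rather than a complete DR design.

# A minimal sketch, assuming the Az.Storage module; account and container names are hypothetical.
# Copies every blob in a container from the Premium block blob (HNS) account to a Standard GRS
# account in another region, as a scheduled replication job rather than a dual write.
Import-Module Az.Storage

$srcCtx  = New-AzStorageContext -StorageAccountName "primarypremiumhns" -UseConnectedAccount
$destCtx = New-AzStorageContext -StorageAccountName "drstandardgrs" -UseConnectedAccount
$container = "lakehouse"

Get-AzStorageBlob -Container $container -Context $srcCtx | ForEach-Object {
    # Start-AzStorageBlobCopy performs a server-side asynchronous copy.
    Start-AzStorageBlobCopy -SrcContainer $container -SrcBlob $_.Name -Context $srcCtx `
        -DestContainer $container -DestBlob $_.Name -DestContext $destCtx -Force
}

Because the copy is server-side and asynchronous, a loop like this can run on a schedule (for example from Azure Automation) without pulling data through the client; the trade-off versus a dual write is a recovery point equal to the copy interval.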
How to change LRS to GRS for Azure Data Lake Storage storage
Hi all, I would like to change an existing storage account from LRS to GRS. However, when I try to do it via the Azure portal -> Data management -> Redundancy, I get the message below:
"Failed to update redundancy for storage account 'tplrepository'. Error: Storage account XXXX is LFS enabled and may not be converted to a georeplication enabled sku"
Is there a way to disable LFS for existing storage ... or change to GRS with LFS enabled? Thank you.
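For reference, the account's current SKU and its large file shares state can be inspected, and the redundancy change attempted, from PowerShell. This is a minimal sketch assuming the Az.Storage module; the resource group name is a placeholder, and "LFS" is assumed here to refer to the large file shares setting, which may not be what the error means.

# A minimal sketch, assuming the Az.Storage module; the resource group name is a placeholder,
# and "LFS" is assumed here to mean the large file shares setting.
Import-Module Az.Storage

$rg      = "my-resource-group"
$account = "tplrepository"

# Inspect the current SKU and the large file shares state on the account.
$sa = Get-AzStorageAccount -ResourceGroupName $rg -Name $account
$sa.Sku.Name
$sa.LargeFileSharesState

# Attempt the redundancy change from PowerShell rather than the portal.
Set-AzStorageAccount -ResourceGroupName $rg -Name $account -SkuName Standard_GRS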
How to Get ADLS Gen1 Instance Size Using any Script
Hi Team, I want to fetch the ADLS Gen1 instance size using an API or script. Can you please suggest whether any metadata about the instance size is available? I know how to get each folder's size and sum them to get the instance size, which is what I'm doing currently, but calculating every folder takes around 10 hours because my ADLS instance is fairly large and has too many nested folders. Is there any way to calculate the ADLS Gen1 size without walking all the subfolders? I mean, is there any metadata available? I am using the PowerShell script below, but it is not optimal.

# Recursively summarize the whole account from the root; slow on large, deeply nested trees.
$rootdir = '/'
$adlsactname = "**********"
Get-AzDataLakeStoreChildItemSummary -Account $adlsactname -Path $rootdir -Concurrency 128

The Azure portal shows the ADLS instance size on a daily basis, so can we export this information using any metadata, script, or API?
Thanks,
Brahma
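The daily size figure in the portal appears to come from Azure Monitor metrics, so one option may be to query that metric directly instead of recursing the folder tree. Below is a minimal sketch assuming the Az.Monitor module and that the Gen1 account exposes a "TotalStorage" metric; the subscription, resource group, and account names are placeholders.

# A minimal sketch, assuming the Az.Monitor module and that the Gen1 account exposes a
# "TotalStorage" metric; the subscription, resource group, and account names are placeholders.
Import-Module Az.Monitor

$resourceId = "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.DataLakeStore/accounts/<adls-account>"

# Pull the daily total-storage values (in bytes) for the last two days.
$metric = Get-AzMetric -ResourceId $resourceId -MetricName "TotalStorage" `
    -StartTime (Get-Date).AddDays(-2) -EndTime (Get-Date) `
    -TimeGrain (New-TimeSpan -Days 1) -AggregationType Maximum

$metric.Data | Select-Object TimeStamp, Maximum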