Latest Discussions
Can't add IP address to storage account
I can no longer add an IP address to a storage account's networking rules using the portal. This started within the last week. I get the message "Invalid argument: 'The prefix must be between 0 and 30.'" I've tried a plain IP address and CIDR notation (71.59.0.15 or 71.59.0.15/32). Is this a problem with the Azure portal?
Tom-S-2060 · Mar 30, 2021 · Copper Contributor · 33K views · 2 likes · 20 comments
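For reference, individual addresses are normally entered without a /31 or /32 suffix (the firewall only accepts CIDR prefixes up to /30), and the same rule can also be added outside the portal. A minimal sketch using the azure-mgmt-storage SDK; the subscription ID, resource group, and account name below are placeholders, and this is not a confirmed fix for the portal error above:

    # Sketch only: placeholder names; assumes azure-identity and azure-mgmt-storage are installed.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.storage import StorageManagementClient
    from azure.mgmt.storage.models import IPRule, StorageAccountUpdateParameters

    SUBSCRIPTION_ID = "<subscription-id>"    # placeholder
    RESOURCE_GROUP = "<resource-group>"      # placeholder
    ACCOUNT_NAME = "<storage-account-name>"  # placeholder

    client = StorageManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    # Read the current rule set so existing firewall entries are preserved.
    account = client.storage_accounts.get_properties(RESOURCE_GROUP, ACCOUNT_NAME)
    rule_set = account.network_rule_set

    # Single addresses are passed without a /31 or /32 suffix.
    rule_set.ip_rules = (rule_set.ip_rules or []) + [IPRule(ip_address_or_range="71.59.0.15")]

    client.storage_accounts.update(
        RESOURCE_GROUP,
        ACCOUNT_NAME,
        StorageAccountUpdateParameters(network_rule_set=rule_set),
    )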
Can't open a link with 'blob.core.windows.net'
Hello folks, I am running a Graph query to export Endpoint Manager reports into a CSV file. The export gives me the CSV inside a zip file, and that zip file is only accessible through the link below, which does not open in the browser: https://amsub0102repexpstorage.blob.core.windows.net/19cc0b0a-e24d-4cab-8cf8-c9fc8a981ff1/#Device_Id_Here#.zip?sv=2019-07-07&sr=b&sig=7v%2Fae.......................................... (#Device_Id_Here# stands in for the actual device ID, which I have hidden for security reasons.) Can anybody please help me with how I can access the CSV file?
17K views · 0 likes · 0 comments
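If the full SAS URL is valid, the archive can also be fetched programmatically instead of through the browser. A minimal sketch with the requests library; the URL and output filename below are placeholders for the actual SAS link returned by the export:

    # Sketch: download a blob via its SAS URL (placeholder values).
    import requests

    sas_url = "https://<account>.blob.core.windows.net/<container>/<device-id>.zip?<sas-token>"

    response = requests.get(sas_url, timeout=120)
    response.raise_for_status()  # surface a 403/404 instead of writing an error page to disk

    with open("endpoint_manager_report.zip", "wb") as f:
        f.write(response.content)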
Azure Files Private Endpoint Private IP Address / Pingable?
Hello, I'm setting up some Azure File Shares and using on-premises Active Directory authentication over SMB. I'm looking to make these Azure File Shares accessible using only a private IP address, which means setting up a "private endpoint" on the Azure storage account. Is anyone familiar with Microsoft's DirectAccess (DAS) for remote access into your network? If someone connects using DirectAccess and needs to access a network resource, the DAS server first has to be able to "ping" the network resource that the user is attempting to access. If it pings, the user can access the network resource; if it doesn't ping, the user can't access it. The million-dollar question: can the Azure storage account private endpoint's private IP address be set up to be pinged and return a ping reply? Any feedback appreciated.
J_Bush · Jun 30, 2020 · Copper Contributor · 11K views · 0 likes · 1 comment
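If ICMP turns out not to be an option against the private endpoint, a TCP connection test against the SMB port can serve as a reachability check instead of ping. A minimal sketch, assuming the private endpoint's private IP is known (the address below is a placeholder); Test-NetConnection <ip> -Port 445 from PowerShell gives the same signal without any code:

    # Sketch: check TCP reachability of the SMB port instead of relying on ICMP.
    import socket

    def can_reach_smb(host: str, port: int = 445, timeout: float = 3.0) -> bool:
        """Return True if a TCP connection to host:port succeeds within the timeout."""
        try:
            with socket.create_connection((host, port), timeout=timeout):
                return True
        except OSError:
            return False

    print(can_reach_smb("10.0.1.4"))  # placeholder private endpoint IP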
How to increase size of unmanaged disk attached to classic VM
Greetings. I have a classic VM (using the old Service Manager model) that I need to expand the storage on. The disk in question is a data disk, not an OS disk, and it is an unmanaged disk of type Premium (SSD). What are my options for extending the size of this drive? Or would it be easier to just add a new drive to this VM, assuming the VM has capacity for it? I've searched this site with no meaningful hits, and Googling isn't enormously helpful either. I did see https://docs.microsoft.com/en-us/azure/virtual-machines/windows/expand-os-disk on docs.microsoft.com, but it seems to only speak to Resource Manager VMs. I would appreciate any thoughts. Thanks, Brian
Solved · AzureBrian · Mar 30, 2020 · Brass Contributor · 11K views · 0 likes · 4 comments
How to see what resources are associated with a storage account?
Is there a quick way to see what resources are associated with a storage account? Thx
Solved · Jeff Walzer · Mar 18, 2020 · Iron Contributor · 11K views · 0 likes · 2 comments
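One rough starting point, if nothing better turns up, is to list everything that lives in the same resource group as the account. A minimal sketch with azure-mgmt-resource; the subscription ID and resource group name below are placeholders, and this only shows neighbours, not proven dependencies:

    # Sketch: list every resource in the storage account's resource group.
    from azure.identity import DefaultAzureCredential
    from azure.mgmt.resource import ResourceManagementClient

    SUBSCRIPTION_ID = "<subscription-id>"  # placeholder
    RESOURCE_GROUP = "<resource-group>"    # placeholder

    client = ResourceManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

    for resource in client.resources.list_by_resource_group(RESOURCE_GROUP):
        print(resource.type, resource.name)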
Azure Files - Map file share on Azure AD joined machine
Hello, we know the Microsoft article says there are restrictions on mapping an Azure file share on an Azure AD joined machine: "Neither Azure AD DS authentication nor on-premises AD DS authentication is supported against Azure AD-joined devices or Azure AD-registered devices." Is there any workaround? Or does anyone know if that feature is in development or coming soon as a preview? Thanks!
Amit_Trivedi112214 · Jun 19, 2020 · Copper Contributor · 10K views · 2 likes · 11 comments
How to view logs of blob storage reads and writes?
We want to see what is reading and writing from/to our blob storage, which in our case means two classic storage accounts. I've been into the diagnostics setup and logging seems to be active, and I've attached the two storage accounts to a Log Analytics workspace, but I can't see anything there. Is this the right way to go about viewing logs?
trevsk1 · Aug 05, 2020 · Copper Contributor · 9.3K views · 0 likes · 1 comment
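One thing worth checking: the classic Storage Analytics logging enabled from the diagnostics blade writes its log files to a hidden $logs container inside the storage account itself rather than to Log Analytics. A minimal sketch for listing those log blobs with the azure-storage-blob package (the connection string is a placeholder):

    # Sketch: list classic Storage Analytics log blobs in the hidden $logs container.
    from azure.storage.blob import ContainerClient

    CONNECTION_STRING = "<storage-account-connection-string>"  # placeholder

    logs = ContainerClient.from_connection_string(CONNECTION_STRING, container_name="$logs")

    for blob in logs.list_blobs():
        print(blob.name)  # log blobs are organised by service and timestamp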
Unable to write csv to azure blob storage using Pyspark
Hi there, I am trying to write a CSV to Azure Blob Storage using PySpark but am receiving the following error:

Caused by: com.microsoft.azure.storage.StorageException: One of the request inputs is not valid.
    at com.microsoft.azure.storage.StorageException.translateException(StorageException.java:89)
    at com.microsoft.azure.storage.core.StorageRequest.materializeException(StorageRequest.java:305)
    at com.microsoft.azure.storage.core.ExecutionEngine.executeWithRetry(ExecutionEngine.java:175)
    at com.microsoft.azure.storage.blob.CloudBlob.startCopy(CloudBlob.java:883)
    at com.microsoft.azure.storage.blob.CloudBlob.startCopyFromBlob(CloudBlob.java:825)
    at org.apache.hadoop.fs.azure.StorageInterfaceImpl$CloudBlobWrapperImpl.startCopyFromBlob(StorageInterfaceImpl.java:399)
    at org.apache.hadoop.fs.azure.AzureNativeFileSystemStore.rename(AzureNativeFileSystemStore.java:2449)
    ... 22 more

The code used is:

def put_data_to_azure(self, df, fs_azure, fs_account_key, destination_path, file_format, repartition):
    self.code_log.info('in put_data_to_azure')
    try:
        self.sc._jsc.hadoopConfiguration().set("fs.azure", fs_azure)
        self.sc._jsc.hadoopConfiguration().set("fs.wasbs.impl", "org.apache.hadoop.fs.azure.NativeAzureFileSystem")
        self.sc._jsc.hadoopConfiguration().set("fs.azure.account.key.%s.blob.core.windows.net" % fs_azure, fs_account_key)
        df.repartition(repartition).write.format(file_format).save(destination_path)
    except Exception as e:
        error1 = str(e).splitlines()[:2]
        exception = "Exception in put_data_to_azure: " + ''.join(error1)
        raise ExceptionHandler(exception)

The destination path is 'wasbs://<container>@<storage account>.blob.core.windows.net/folder'. The jars used are hadoop-azure-2.7.0.jar and azure-storage-2.0.0.jar. I chose these because I found them to be stable versions and could read from Azure successfully, but I could not write. I have also tried newer versions.
Ashwini_Akula · Aug 05, 2020 · Copper Contributor · 8.4K views · 0 likes · 5 comments
Azure Files VS Azure NetApp Files
Hi, has anyone tested both as a replacement for on-prem file servers? Any feedback? (Azure Files / Azure File Sync / Azure NetApp Files.) We run 7 servers on 3 continents, with 13TB of data. Our on-prem internet speeds: A- 2GB (~200 users), B- 500GB (~30 users), C- 140GB (~100 users). We are wondering:
1- Does NetApp offer an on-prem "Sync" option as Azure Files does?
2- Which service would better answer our needs as stated above?
SamSONACA · Jan 26, 2021 · Copper Contributor · 8.1K views · 0 likes · 1 comment
Lift and Shift with NTFS permissions
I've been tasked with looking into Azure Files to gradually move our file server to the cloud. This file server has well over 2 million files and close to 5TB of data, and the NTFS permissions are a mess in terms of broken inheritance. I've gone through and set up a file share and had a look at File Sync, and AzCopy appears to be for blob storage only right now. File Sync doesn't appear to carry over the NTFS permissions from my file server. With the file share, I went through the process of setting up the Azure share to use AD but ran into the issue of port 445 being blocked by my ISP, so I will have to look into the alternatives (Azure P2S VPN, Azure S2S VPN, or ExpressRoute) to tunnel SMB traffic over a different port. My question is simply: will either option maintain the NTFS permissions from my file server in Azure when I eventually map that share?
JWJ · Oct 05, 2020 · Copper Contributor · 8.1K views · 0 likes · 2 comments
Tags
- azure blob storage (22 topics)
- azure blob (13 topics)
- azure files (9 topics)
- azure backup (7 topics)
- storage explorer (6 topics)
- Azure Data Lake (4 topics)
- queue storage (3 topics)
- Azure Archive (3 topics)
- updates (2 topics)
- azure disks (1 topic)