data + storage
Data Factory v2 HTTP connector timeout
I have an activity that makes an HTTP request to an Azure Functions app. The function does some processing and returns the result. Processing can take up to 15 minutes, but the Data Factory activity fails with the error "The remote server returned an error: (500) Internal Server Error" whenever the request takes longer than about 225 seconds, even though the "Request timeout" property is set to 30 minutes and the Azure function shows that it ran successfully. It doesn't depend on the destination connector (I tried Azure SQL DB, Data Lake Store, etc.). Are there any undocumented timeouts or tricks to avoid this?
Azure Storage Account Larger File Shares

Microsoft has announced the general availability of larger file shares in storage accounts. Azure Files is secure, fully managed public cloud file storage with a full range of data redundancy options and hybrid capabilities using Azure File Sync. All premium file shares are available with 100 TiB capacity. Visit the Azure Files scale limits documentation for more details.

What’s new? Large file shares now have:

- Ability to upgrade existing general-purpose storage accounts and existing file shares.
- Ability to opt in to larger file shares at the storage account level instead of the subscription level.
- Expanded regional coverage.
- Support for both locally redundant and zone-redundant storage.
- Improvements in the performance and scale of sync to work better with larger file shares. Visit the Azure File Sync scalability targets to stay informed of the latest scale limits.

New storage account: Create a new general-purpose storage account in one of the supported regions with a supported redundancy option. While creating the storage account, go to the Advanced tab and enable the Large file shares feature. See the detailed steps on how to enable large file shares support on a new storage account. All new shares created under this new account will, by default, have 100 TiB capacity with increased scale.

Existing storage account: On an existing general-purpose storage account that resides in one of the supported regions, go to Configuration, enable the Large file shares feature, and hit Save. You can now update the quota for existing shares under this upgraded account to more than 5 TiB. All new shares created under this upgraded account will, by default, have 100 TiB capacity with increased scale.
Azure Private Endpoint (Azure Private Link)

Azure Private Endpoint (Azure Private Link) – in preview – is a network interface that connects you privately and securely to a service powered by Azure Private Link. A private endpoint uses a private IP address from your VNet, effectively bringing the service into your VNet. The service could be an Azure service such as Azure Storage, SQL, etc.

Azure Private Link enables you to access Azure PaaS services (for example, Azure Storage and SQL Database) and Azure-hosted customer/partner services over a private endpoint in your virtual network. Traffic between your virtual network and the service traverses the Microsoft backbone network, eliminating exposure to the public Internet. You can also create your own Private Link service in your virtual network (VNet) and deliver it privately to your customers. The setup and consumption experience using Azure Private Link is consistent across Azure PaaS, customer-owned, and shared partner services.

Access to a private link resource using an approval workflow. You can connect to a private link resource using the following connection approval methods:

- Automatically approved when you own or have permission on the specific private link resource. The permission required is based on the private link resource type in the following format: Microsoft.<Provider>/<resource_type>/privateEndpointConnectionApproval/action
- Manual request when you don’t have the required permission and would like to request access. An approval workflow will be initiated. The private endpoint and subsequent private endpoint connection will be created in a “Pending” state. The private link resource owner is responsible for approving the connection. After it’s approved, the private endpoint can send traffic normally, as shown in the approval workflow diagram.

Configuration steps: In this example I am selecting an existing storage account and creating a private endpoint – enter the Name and Region.
Click Next. On the Resource tab, choose the connection method – connect to an Azure resource in my directory, or connect by resource ID. Then choose the resource type (currently only Storage, Network, and SQL are available), select the resource from the list, and pick the target sub-resource. Click Next. On the Configuration tab, select the VNet and subnet, and choose whether or not to create private DNS integration. Click Next, add tags, and once validation has passed, click Create.
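Once the private endpoint and DNS integration are in place, a quick sanity check is that the service's FQDN resolves to a private VNet address rather than a public one. A minimal sketch of that check follows; the addresses and the storage account name are hypothetical, and in practice you would resolve the FQDN with socket.getaddrinfo from a VM inside the VNet:

```python
import ipaddress

def resolves_privately(ip: str) -> bool:
    """True if the resolved address is a private (RFC 1918) address,
    which is what you expect once private DNS integration is working."""
    return ipaddress.ip_address(ip).is_private

# Before the private endpoint: the storage FQDN resolves to a public address.
print(resolves_privately("52.239.152.10"))
# After: mystorageacct.privatelink.blob.core.windows.net -> VNet address.
print(resolves_privately("10.0.1.5"))
```

If the name still resolves publicly from inside the VNet, the private DNS zone link to the VNet is the usual suspect.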
Securing developer machines with Azure Bastion and DevTest Labs

Introduction. Recently I have seen multiple enterprise customers struggling with the same issue: how to give developers the right tools and connectivity to do their development work against Azure resources. Luckily, most Azure services are reachable over HTTPS, which in most cases is allowed, so a request to allow the URLs in the web proxy should be sufficient. But what about non-HTTPS connections like RDP, SSH, SQL, Redis, etc.? Most of the time, non-HTTPS outbound traffic is blocked by the corporate firewall. When you initiate the same connection from another location – outside the corporate network, or from a mobile phone – these connections work without any issue, assuming of course that IP ACLs and service endpoints are correctly configured on the Azure resources to accept traffic from that location. I have seen this very specific issue – how to allow TCP/1433 to an Azure SQL Database – generate friction and frustration for developers, reopening the traditional discussions between digital teams and infrastructure teams.

I considered several options to solve this problem, but each had pros and cons:

- TCP/1433 over the Internet. Pro: easiest solution. Con: security teams have refused this every time.
- Query Editor in the portal. Pro: integrated in the portal. Con: reduced feature set compared to SSMS, and traffic is still initiated on port 1433.
- Go for Managed Instances. Pro: Azure SQL available on a private IP address, no hassle with outbound connectivity. Con: high consumption costs.
- Microsoft peering on the ExpressRoute. Pro: no outbound connectivity. Con: challenging to get properly configured (ExpressRoute, ISP, route filters, BGP announcements, etc.).
- Azure Bastion in combination with DevTest Labs. With the announcement of the preview of Microsoft Azure Bastion, it's possible to create a seamless RDP and SSH session over HTTPS (TCP/443).
As you can see, there is no real solution for traffic other than RDP and SSH, but we can use the Bastion host as a go-between to access Azure SQL from a developer virtual machine running in Azure. The first feedback from customers is that there is again an IaaS component (DevTest Labs) present, which I can fully understand, but for now there is no alternative for giving a developer a full SSMS console other than hosting it on a VM. Managing golden images like they used to do is not something they plan to keep doing when adopting PaaS services. I'm convinced Azure DevTest Labs is a good alternative:

- Large list of available base images that can be customized with formulas (SKU, VNet, private IP, etc.)
- Artifacts can be used to install developer tools (Git, Visual Studio Code, SQL Server Management Studio, etc.)
- Machine auto-start and auto-stop integrated (what is turned off is less vulnerable)

In combination with DevTest Labs, I decided to configure a service endpoint on the subnet where the DevTest Labs VMs reside. This way, only the machines running in this VNet can access Azure SQL. Another service endpoint can be created for other subnets that need access too. Additionally, an NSG can be applied on the AzureBastionSubnet to limit access to the Bastion host. In my case, I limited it to the IP addresses of the customer's web proxy. This covers connections from on-premises clients, clients connecting from home over VPN, and any other traffic leaving the corporate proxy.

Setup: I hope you find this solution helpful.
Slow file server write speeds over the network

Setup: a pretty simple three-server system: domain controller, file server, and XenApp server on a single virtual network, with a virtual gateway to allow remote access via XenApp.

Problem: users on XenApp are reporting very slow file access, especially in Sage, and it has got noticeably worse.

Diagnostics carried out – copying a 512 MB test file between the various servers:

- XenApp server to file server: horrible 600 KB/s (so writes across the virtual network are slow)
- File server to XenApp server: acceptable 50 MB/s (so reads are fine)
- File server to file server local copy: 20 MB/s, which is expected due to simultaneous read/write
- Download of a 1 GB file from a web browser on the file server: very slow, around 1.5 hours
- XenApp server to domain controller: acceptable 40 MB/s

So I surmise that everything is fine except data coming in over the network. Write speeds on the file server itself are okay, and reads from the file server to the others are fine. Any ideas how I go about isolating the fault?
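A repeatable way to separate disk performance from network performance in a case like this is to time a fixed-size write on the file server's local disk first, so you have a baseline to compare the network copies against. A small sketch of that measurement (sizes are kept small here; scale them up for a realistic test):

```python
import os
import tempfile
import time

def write_throughput_mb_s(path: str, total_mb: int = 64, chunk_mb: int = 4) -> float:
    """Write total_mb of data to path in chunk_mb chunks and return MB/s.
    os.fsync ensures we time actual disk writes, not just page-cache fills."""
    chunk = os.urandom(chunk_mb * 1024 * 1024)
    start = time.perf_counter()
    with open(path, "wb") as f:
        for _ in range(total_mb // chunk_mb):
            f.write(chunk)
        f.flush()
        os.fsync(f.fileno())
    elapsed = time.perf_counter() - start
    return total_mb / elapsed

if __name__ == "__main__":
    with tempfile.TemporaryDirectory() as d:
        rate = write_throughput_mb_s(os.path.join(d, "testfile.bin"), total_mb=16)
        print(f"local write throughput: {rate:.1f} MB/s")
```

If the local baseline is healthy (tens of MB/s) while the XenApp-to-file-server copy crawls at 600 KB/s, the bottleneck is the network path rather than the disk; common suspects at that point include virtual NIC offload settings, SMB signing overhead, and MTU mismatches.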
Is it possible to have no data loss (RPO=0)?

Is it possible to have no data loss (RPO = 0) in Azure? Azure paired regions are usually set at a distance, so I'm not sure whether RPO = 0 is still possible. (Azure regions: Australia East and Australia Southeast.) Note: currently there is no Availability Zone in the Australia region. Business requirement: to host SAP services in Azure with zero data loss.
A Complete Guide to Azure Database Migration Strategies, Tools, and Best Practices

As organizations increasingly adopt cloud-first strategies, migrating databases to the cloud has become a crucial step toward scalability, security, and cost-efficiency. Microsoft Azure, with its powerful ecosystem, offers a range of services and tools to simplify and streamline this process. In this blog, we’ll explore Azure Database Migration, including its benefits, strategies, tools, and best practices. https://dellenny.com/a-complete-guide-to-azure-database-migration-strategies-tools-and-best-practices/
Storage Accounts - Networking

Hi All, this seems like a basic issue; however, I cannot resolve it. In a nutshell, a number of storage accounts (and other resources) were created with Public Network Access set as below. I would like to change them all to "Enabled from selected virtual networks and IP addresses", or even "Disabled". However, when I change to "Enabled from selected virtual networks and IP addresses", connectivity from, for example, Power BI to the storage account fails. I have added the VPN IPs, my local IP, etc., but all continue to fail connection or authentication. Once it is changed back to "Enabled from all networks", everything works, i.e. Power BI can access the Azure Blob Storage and refresh successfully.

I have also enabled "Allow Azure services on the trusted services list to access this storage account", but Power BI still fails to access the data: a data source credentials error, whether using a key, service principal, etc. As soon as I switch back to "Enabled from all networks", it authenticates straight away. One more idea I had was to add all of the resource instances, as this would allowlist more Azure services, although Power BI should already be covered by the trusted services setting; I thought I might give it a try anyway. Also, I created an NSG and used the ServiceTags file to create an inbound rule to allow Power BI from UK South, and I have created a private endpoint. This should all have worked, but I still can't set the account to restricted networks. I must be missing something fundamental, or there is something fundamentally off with this tenant. When either of the two restrictive options is selected, do they also block various Microsoft services? Any help would be gratefully appreciated.
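The downloadable Azure Service Tags JSON (the "ServiceTags" file mentioned above) lists the address prefixes behind each tag, so one useful check is whether the caller's IP actually falls inside the tag's prefixes you allowlisted. A small sketch follows, using an inline sample in the shape of that file; the prefixes shown are placeholders, and the tag of interest in the real file would be "PowerBI" or a regional variant:

```python
import ipaddress
import json

def prefixes_for_tag(service_tags: dict, tag_name: str) -> list:
    """Extract the address prefixes for one tag from the Service Tags JSON."""
    for entry in service_tags["values"]:
        if entry["name"] == tag_name:
            return entry["properties"]["addressPrefixes"]
    return []

def ip_in_prefixes(ip: str, prefixes: list) -> bool:
    """Check whether an address falls inside any of the given CIDR prefixes."""
    addr = ipaddress.ip_address(ip)
    return any(addr in ipaddress.ip_network(p) for p in prefixes)

# Inline sample mimicking the downloaded file's structure; real prefixes differ.
sample = json.loads("""
{"values": [
  {"name": "PowerBI",
   "properties": {"addressPrefixes": ["13.73.248.4/31", "20.36.120.128/27"]}}
]}
""")

pbi = prefixes_for_tag(sample, "PowerBI")
print(ip_in_prefixes("20.36.120.130", pbi))  # inside the sample ranges
print(ip_in_prefixes("203.0.113.7", pbi))    # outside
```

Worth noting: storage firewall IP rules only evaluate public source addresses, and traffic from some Microsoft services reaches storage over internal paths that IP rules never see, which is why the resource-instance rules and the trusted-services toggle exist as separate mechanisms.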
PowerShell Script to remove all Blobs from Storage account

With a large number of blobs in a storage account, manual cleanup from the portal is complicated and time-consuming, as it works in sets of 10,000. This script is simple and can be executed in the background to delete all items from a given blob container. You have to specify the storage account connection string and the blob container name.

[string]$myConnectionString = "DefaultEndpointsProtocol=https;AccountName=YourStorageAccountName;AccountKey=YourKeyFromStorageAccountConnectionString;EndpointSuffix=core.windows.net"
[string]$ContainerName = "YourBlobContainerName"
[int]$blobCountBefore = 0
[int]$blobCountAfter = 0

$context = New-AzStorageContext -ConnectionString $myConnectionString

$blobCountBefore = (Get-AzStorageBlob -Container $ContainerName -Context $context).Count
Write-Host "Total number of blobs in the container before deletion: $blobCountBefore" -ForegroundColor Yellow

Get-AzStorageBlob -Container $ContainerName -Context $context | ForEach-Object {
    # Remove-AzStorageBlob is the Az-module cmdlet (the old AzureRM name was Remove-AzureStorageBlob)
    Remove-AzStorageBlob -ICloudBlob $_.ICloudBlob -Context $context
}

$blobCountAfter = (Get-AzStorageBlob -Container $ContainerName -Context $context).Count
Write-Host "Total number of blobs in the container after deletion: $blobCountAfter" -ForegroundColor Green

It was used on a large blob container with more than 5 million blob items.

Sources:
https://learn.microsoft.com/en-us/powershell/module/az.storage/new-azstoragecontext?view=azps-13.0.0#examples
https://learn.microsoft.com/en-us/answers/questions/1637785/what-is-the-easiest-way-to-find-the-total-number-o
https://stackoverflow.com/questions/57119087/powershell-remove-all-blobs-in-a-container

Fab
Whitepaper: Achieving Compliant Data Residency and Security with Azure

Introduction. Security and compliance – basic elements of the trusted cloud – are top priorities for organizations today. This paper is designed to help customers ensure that their data is handled in a manner that meets their data protection, regulatory, and sovereignty requirements on the global cloud architecture of Microsoft Azure. Transparency and control are also essential to establishing and maintaining trust in cloud technology. Microsoft recognizes that restricted and regulated industries require additional details for their risk management and to ensure compliance at all times.

Microsoft provides an industry-leading security and compliance portfolio. Security is built into the Azure platform, beginning with the development process, which is conducted in accordance with the Security Development Lifecycle (SDL), and includes technologies, controls, and tools that address data management and governance, Active Directory identity and access controls, network and infrastructure security technologies and tools, threat protection, and encryption to protect data in transit and at rest. Microsoft also provides customers with choices to select and limit the types and locations of data storage on Azure. With the innovation of the security and compliance frameworks, customers in regulated industries can successfully run mission-critical workloads in the cloud and leverage all the advantages of the Microsoft hyperscale cloud.

This simple approach can assist customers in meeting the data protection requirements of government regulations or company policies by helping them to:

- Understand data protection obligations.
- Understand the services and controls that Azure provides to help its customers meet those obligations.
- Understand the evidence that customers need to assert compliance.

The paper is structured into these three sections, with each diving deeper into the security and technologies that help Microsoft customers to meet data protection requirements.
The final section discusses specific requirements to which industries and organizations in selected European markets are subject. Download this whitepaper, “Achieving compliant data residency and security with Azure,” to learn more about compliance, trust, security, and responsibilities.