Azure Data Lake Storage
13 Topics

TLS 1.0 and 1.1 support will be removed for new & existing Azure storage accounts starting Feb 2026
To meet evolving technology and regulatory needs and align with security best practices, we are removing support for Transport Layer Security (TLS) 1.0 and 1.1 for both existing and new storage accounts in all clouds. TLS 1.2 will be the minimum supported TLS version for Azure Storage starting February 2026. Azure Storage currently supports TLS 1.0 and 1.1 (for backward compatibility) and TLS 1.2 on public HTTPS endpoints. TLS 1.2 is more secure and faster than the older TLS versions, which do not support modern cryptographic algorithms and cipher suites. Many Azure Storage customers are already using TLS 1.2, and we are sharing this guidance to expedite the transition for customers still on TLS 1.0 and 1.1. Customers must secure their infrastructure by using TLS 1.2+ with Azure Storage by Jan 31, 2026. The older TLS versions (1.0 and 1.1) are being deprecated and removed to meet evolving standards (FedRAMP, NIST) and to provide improved security for our customers. This change will impact both existing and new storage accounts using TLS 1.0 and 1.1. To avoid disruptions to your applications connecting to Azure Storage, you must migrate to TLS 1.2 and remove dependencies on TLS 1.0 and 1.1 by Jan 31, 2026. Learn more about how to migrate to TLS 1.2. As a best practice, we also recommend using Azure Policy to enforce a minimum TLS version. Learn more here about how to enforce a minimum TLS version for all incoming requests. If you already use Azure Policy to enforce a TLS version, the minimum supported version after this change rolls out will be TLS 1.2.

Help and Support
If you have questions, get answers from community experts in Microsoft Q&A. If you have a support plan and you need technical help, create a support request:
- For Issue type, select Technical.
- For Subscription, select your subscription.
- For Service, select My services.
- For Service type, select Blob Storage.
- For Resource, select the Azure resource you are creating a support request for.
- For Summary, type a description of your issue.
- For Problem type, select Connectivity.
- For Problem subtype, select Issues using TLS.
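For reference, here is a minimal sketch of setting a storage account's minimum TLS version programmatically, which is the account property the Azure Policy mentioned above audits or enforces. It assumes the azure-identity and azure-mgmt-storage Python packages; the subscription, resource group, and account names are placeholders.

```python
# Sketch: set TLS 1.2 as the minimum permitted version on a storage
# account ahead of the Feb 2026 cutoff. Assumes the azure-identity and
# azure-mgmt-storage packages; all names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.storage import StorageManagementClient

subscription_id = "<subscription-id>"    # placeholder
resource_group = "<resource-group>"      # placeholder
account_name = "<storage-account>"       # placeholder

client = StorageManagementClient(DefaultAzureCredential(), subscription_id)

# Patch only the minimumTlsVersion property; other account settings
# are left unchanged by this update call.
client.storage_accounts.update(
    resource_group,
    account_name,
    {"minimum_tls_version": "TLS1_2"},
)
```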
The Dremio Open Lakehouse Platform and Microsoft Provide a Solution for Cloud Data Analytics

Cloud data lakes represent the primary storage destination for a growing volume and variety of data. For Microsoft customers, Azure Data Lake Storage (ADLS) provides a flexible, scalable, cost-effective, secure, cloud-native analytics file system for a variety of data sources. The challenge for many organizations is making that data available for Business Intelligence (BI) and reporting. In this article, I'll share how the Dremio Open Lakehouse Platform simplifies data architectures, accelerates access to insights on ADLS, and enables ad hoc analysis and exploration with Power BI.
Copy Dataverse data from ADLS Gen2 to Azure SQL DB leveraging Azure Synapse Link

A new template has been added to the ADF and Azure Synapse Pipelines template gallery. This template allows you to copy data from an Azure Data Lake Storage (ADLS) Gen2 account to an Azure SQL Database.
Dremio Cloud on Microsoft Azure enables customers to drive value from their data more easily

Dremio Cloud on Microsoft Azure helps customers overcome the challenges of organically grown data lake and database landscapes. Particularly in hybrid environments, it shields the business from change while tightening security and easing application integration.
Azure Function with Blob Output Binding returning 404 on GetProperties check before writing the Blob

Hi. This question is similar: https://stackoverflow.com/questions/64546302/how-to-disable-blob-existence-check-in-azure-function-output-binding But I'm wondering if there are other, more recent answers or comments out there. I have an Azure Function with an HTTP Trigger input binding and a Blob Storage output binding. For every execution, the output binding appears to get the blob properties first, resulting in a 404. Quite rightly, as the data to be written is going to a new blob. But this check will always fail, and in this case it is redundant. It takes time to go through these steps - admittedly milliseconds, but still. Presumably it's also logging somewhere, so that's a storage cost - it might be negligible now, but it's not something to ignore. I'm not 100% sure where that logging would be stored, either, so I can't go and manage it. The positive is that the overall function execution is fine. But it's still recording all these failures, and we're getting tens of thousands of executions a day. Is there a way to use the concise output binding code but skip this prior if-exists/get-properties check? My options seem to be to live with it, or to rewrite the function to use BlobContainerClient, BlobClient and so on instead of the Blob attribute output binding. Anyone got some clever ideas?
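As a point of reference for the rewrite option mentioned in the post, here is a minimal sketch of writing the blob directly with the storage SDK instead of the output binding, so no GetProperties/exists check is issued first. It is shown in Python with the azure-storage-blob package; the connection string, container, and blob names are placeholders, and the same pattern applies to the .NET BlobClient.

```python
# Sketch of the rewrite option above: write the blob directly with the
# SDK so no GetProperties/exists check runs before the upload.
# Assumes the azure-storage-blob package; all names are placeholders.
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(
    conn_str="<storage-connection-string>",  # placeholder
    container_name="output",                 # placeholder
    blob_name="result.json",                 # placeholder
)

# upload_blob sends a single Put Blob request; overwrite=True replaces
# any existing blob rather than probing for one beforehand.
blob.upload_blob(b'{"status": "ok"}', overwrite=True)
```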
Microsoft Purview Protection Policies for Azure Data Lake & Blob Storage Available in All Regions

Organizations today face a critical challenge: ensuring consistent and automated data governance across rapidly expanding data estates. Driven by the growth of AI and the increasing reliance on vast data volumes for model training, Chief Data Officers (CDOs) and Chief Information Security Officers (CISOs) must prevent unintentional exposure of sensitive data (PII, credit card information) while adhering to data and legal regulations.

Many organizations rely on Azure Blob Storage and ADLS for storing vast amounts of data, as they offer scalable, secure, and highly available cloud storage. While solutions like RBAC (role-based access control), ABAC (attribute-based access control), and ACLs (access control lists) offer secure ways to manage data access, they operate on metadata such as file paths, tags, or container names. These mechanisms are effective for implementing restrictive data governance by controlling who can access specific files or containers. However, there are scenarios where implementing automatic access controls based on the sensitivity of the content itself is necessary. For example, identifying and protecting sensitive information like credit card numbers within a blob requires more granular control. Ensuring that sensitive content is restricted to specific roles and applications across the organization is crucial, especially as enterprises build new applications and infuse AI into current solutions. This is where integrated solutions like Microsoft Purview Information Protection (MIP) come into play. MIP protection policies enable organizations to scan and label data based on the content stored in the blob, so access controls tied directly to data asset content can be applied across storage accounts. By eliminating the need for in-house scanning and labeling, MIP streamlines compliance and helps apply consistent data governance through a centralized solution.

The Solution: Microsoft Purview Information Protection (MIP) Protection Policies for Governance & Compliance

Microsoft Purview Information Protection (MIP) provides an efficient and centralized approach to data protection by automatically restricting access to storage data assets based on sensitivity labels discovered through automated scanning, leveraging protection policies (learn more). This feature builds upon Microsoft Purview's existing capability (learn more) to scan and label sensitive data assets, ensuring robust data protection. This not only enhances data governance but also ensures that data is managed in a way that protects sensitive information, reducing the risk of unauthorized access and maintaining the security and trust of customers.

Enhancing Data Governance with MIP Protection Policies

Contoso, a multinational corporation, handles large volumes of data stored in Azure Storage (Blob/ADLS). Different users, such as financial auditors, legal advisors, compliance officers, and data analysts, need access to different blobs in the storage account. These blobs are updated daily with new content, and there can be sensitive data across them. Given the diverse nature of the stored data, Contoso needed an access control method that could restrict access based on data asset sensitivity. For instance, data analysts access the blob named "logs" where log files are uploaded.
If these files contain PII or financial data that should only be accessed by financial officers, the access permissions need to be updated dynamically as the sensitivity of the stored data changes. MIP protection policies address this challenge efficiently by automatically limiting access to data based on sensitivity labels found through automated scanning.

Key Benefits:
- Auto-labelling: Automatically apply sensitivity labels to Azure Storage based on detection of sensitive information types.
- Automated Protection: Automatically restrict access to data with specific sensitivity labels, ensuring consistent data protection. Storage data owners can selectively enable specific storage accounts for policy enforcement, providing flexibility and control. For example, a protection policy can restrict access to data labeled "Highly Confidential" to only specific groups or users - for instance, blobs labeled "logs" accessible only to data analysts. With MIP, labels are updated as content changes, and the protection policy can deny access if any "Highly Confidential" data is identified.
- Enterprise-level Control: Information protection policies are applied to blobs and resource sets, ensuring that only authorized Microsoft Entra ID users or M365 user groups can access sensitive data. Unauthorized users will be prevented from reading the blob or resource set.
- Centralized Policy Management: Create, manage, and enforce protection policies across Azure Storage from a single, unified interface in Microsoft Purview. Enterprise admins have granular control over which storage accounts enforce protection coverage based on the account's sensitivity label.

By using Microsoft Purview Information Protection (MIP) protection policies, Contoso was able to achieve secure and consistent data governance and centralized policy management, effectively addressing its data security challenges.

Prerequisites
- Microsoft 365 E5 licenses and setup of the pay-as-you-go billing model. To understand pay-as-you-go billing by assets protected, see the pay-as-you-go billing model. For information about the specific licenses required, see this information on sensitivity labels. Microsoft 365 E5 trial licenses can be obtained for your tenant by navigating here from your environment.

Getting Started
The public preview of protection policies supports the following Azure Storage services:
- Azure Blob Storage
- Azure Data Lake Storage

To enable protection policies for your Azure Storage accounts:
1. Navigate to the Microsoft Purview portal > Information Protection card > Policies.
2. Configure or use an existing sensitivity label in Microsoft Purview Information Protection that's scoped to "Files & other data assets".
3. Create an auto-labelling policy to apply a specific sensitivity label to scoped assets in Azure Storage based on the Microsoft out-of-the-box sensitive info types detected.
4. Run scans on assets for auto-labelling to apply.
5. Create a protection policy and associate it with your desired sensitivity labels.
6. Apply the policy to your Azure Blob Storage or ADLS Gen2 accounts.

Limitations
During the public preview, please note the following limitations:
- Currently a maximum of 10 storage accounts are supported in one protection policy, and they must be selected under Edit for them to be enabled.
- Changing pattern rules will re-apply labels on all storage accounts.
- There might be delays in label synchronization, which could prevent MIP policies from functioning effectively.
- If a storage account has customer-managed keys (CMK) enabled, the MIP policy will not work for that account.

Next Steps
With the public preview, MIP protection policies are now available in all regions, and any storage account registered in the Microsoft Purview Data Map can have protection policies created and applied to implement a consistent data governance strategy across data in Azure Storage. We encourage you to try out this feature and provide feedback. Your input is crucial in shaping this feature as we work towards general availability.
Unable to load large delta table in azure ml studio

I am writing to report an issue that I am currently experiencing while trying to read a delta table from Azure ML. I have already created data assets to register the delta table, which is located at an ADLS location. However, when attempting to load the data, I have noticed that for large data sizes it takes an exceedingly long time to load. I have confirmed that for small data sizes the data is returned within a few seconds, which leads me to believe that there may be an issue with the scalability of the data loading process. I would greatly appreciate it if you could investigate this issue and provide me with any recommendations or solutions. I can provide additional details such as the size of the data, the steps I am taking to load the data, and any error messages if required. I'm following this document: https://learn.microsoft.com/en-us/python/api/mltable/mltable.mltable?view=azure-ml-py#mltable-mltable-from-delta-lake I am using this command to read the delta table via the data asset URI:

```python
from mltable import from_delta_lake

mltable_ts = from_delta_lake(
    delta_table_uri="<DATA ASSET URI>",
    timestamp_as_of="2999-08-26T00:00:00Z",
    include_path_column=True,
)
```
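One way to narrow down where the time goes is to materialize only a small sample before loading the full table - a minimal sketch assuming the mltable package's take() and to_pandas_dataframe() methods; the URI below is a placeholder. If the sample also takes a long time, the cost is likely in scanning the Delta log and metadata rather than in the data volume itself.

```python
# Sketch: materialize only a small sample first, to separate Delta
# metadata/scan time from full data-load time. Assumes the mltable
# package; the URI below is a placeholder.
from mltable import from_delta_lake

tbl = from_delta_lake(
    delta_table_uri="abfss://<container>@<account>.dfs.core.windows.net/<path>",
    timestamp_as_of="2999-08-26T00:00:00Z",
)

sample = tbl.take(20)               # lazy: narrows the table to 20 rows
df = sample.to_pandas_dataframe()   # data is only read at this point
print(df.shape)
```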