Azure Storage
How to configure directory level permission for SFTP local user
SFTP is a feature supported for Azure Blob Storage with a hierarchical namespace (ADLS Gen2 storage accounts). As documented, the permission system used by the SFTP feature is different from the normal permission system of an Azure Storage account: it uses a form of identity management called local users. Normally, the permissions you can set on local users while creating them are at the container level. In practice, however, it is common to need multiple local users where each local user only has permission on one specific directory. In this scenario, using ACLs (access control lists) for local users is a great solution. In this blog, we'll set up an environment that uses ACLs for local users and see how it meets that goal.

Attention! As mentioned in the Caution part of the documentation, ACLs for local users are supported but still in preview. Please do not use this for your production environment.

Preparation
Before configuring local users and ACLs, the following things are already prepared:
- One ADLS Gen2 storage account (in this example, it's called zhangjerryadlsgen2).
- A container (testsftp) with two directories (dir1 and dir2).
- One file uploaded into each directory (test1.txt and test2.txt).
The file system in this blog looks like this: (screenshot)

Aim
The aim is to have user1, which can only list files saved in dir1, and user2, which can only list files saved in dir2. Both of them should be unable to do any other operations in their matching directory (dir1 for user1 and dir2 for user2) and should be unable to do any operations in the root directory or in the other directory.

Configuring local users
From the Azure portal, it's easy to enable the SFTP feature and create local users. Besides user1 and user2, one additional user is necessary; it will be used as the administrator to assign ACLs for user1 and user2. In this blog, it's called admin. While creating admin, its landing directory should be the root directory of the container and all permissions should be granted. While creating user1 and user2, because their permissions will be controlled by ACLs, the containers and permissions should be left empty and the Allow ACL authorization option should be checked. The landing directory should be set to the directory the user will later have permission on (in this blog, user1 lands on dir1 and user2 lands on dir2).
User1: (screenshot)
User2: (screenshot)
After the local users are created, one more step is needed before configuring ACLs: note down the user IDs of user1 and user2. Clicking a created local user opens an edit page that includes the user ID. In this blog, the user ID of user1 is 1002 and the user ID of user2 is 1003.

Configuring ACLs
Before starting to configure ACLs, it's necessary to clarify which permissions to assign. As explained in this document, ACLs contain three different permissions: Read (R), Write (W), and Execute (X). The "Common scenarios related to ACL permissions" part of the same document has a table listing most operations and their corresponding required permissions. Since the aim of this blog is to allow user1 only to list dir1, according to that table the correct permissions for user1 are X on the root directory and R and X on dir1 (for user2, it's X on the root directory and R and X on dir2). After clarifying the needed permissions, the next step is to assign the ACLs.
The first step is to connect to the storage account over SFTP as admin. (In this blog, a PowerShell session with the OpenSSH client is used, but that's not the only way; any SFTP client can be used to connect to the storage account.) Since ACLs for local users cannot be assigned to one specific named user, and the owner of the root directory is a built-in user controlled by Azure, the easiest approach here is to grant the X permission to all "other" users. (For the concept of other users, please refer to this document.)

The next step is to assign the R and X permissions. For the same reason, granting R and X to all other users is not an option: if that were done, user1 would also get R and X permissions on dir2, which does not match the aim. The best way here is to change the owner of each directory: change the owner of dir1 to user1 and the owner of dir2 to user2. (This way, user1 has no permission to touch dir2.)

After the above configuration, when connecting to the storage account over SFTP as user1 or user2, only the list operation under the corresponding directory is allowed.
User1: (screenshot)
User2: (The following test result shows that only the list operation under /dir2 is allowed; all other operations return a permission denied or not found error.)

About the landing directory
What happens if all other configurations are correct but the landing directory is set to the root directory for user1 or user2? The answer is quite simple: the configuration still works, but the user experience suffers. To show the result of that case, one more local user called user3, with user ID 1005, is created, but its landing directory is configured like admin's, i.e. the root directory. The ACL permission assigned to it is the same as for user2 (the owner of dir2 is changed to user3). When connecting to the storage account over SFTP as user3, the session lands in the root directory. Per the ACL configuration, user3 only has permission to list files in dir2, so operations in the root directory and dir1 are expected to fail. To do anything further, the user needs to prefix commands with dir2/ or run cd dir2 first.
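To recap the ACL setup above in concrete terms, the whole admin-side configuration boils down to a short SFTP session. The transcript below is only a sketch: the host and user names follow this blog's example, the octal mode is an assumption (the part that matters is the trailing execute bit for "other" users), and the OpenSSH sftp client's chmod and chown commands expect an octal mode and a numeric user ID respectively. The lines starting with # are annotations, not commands.

```
# Connect as the admin local user (username format: <storage-account>.<container>.<local-user>)
sftp zhangjerryadlsgen2.testsftp.admin@zhangjerryadlsgen2.blob.core.windows.net

# Grant execute (X) to "other" users on the root directory of the container.
# The trailing 1 (--x) is the execute bit for other users; the owner/group digits are assumptions.
sftp> chmod 711 /

# Make user1 (ID 1002) the owner of dir1 and user2 (ID 1003) the owner of dir2,
# so each user ends up with R and X only on its own directory.
sftp> chown 1002 /dir1
sftp> chown 1003 /dir2
sftp> exit
```

Connecting afterwards as user1 (for example sftp zhangjerryadlsgen2.user1@zhangjerryadlsgen2.blob.core.windows.net) and running ls should succeed inside dir1 and fail everywhere else, matching the test results described above.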
SSMS 21/22 Error Upload BACPAC file to Azure Storage

Hello All,
In SSMS 20 I can use "Export Data-tier Application" to export a BACPAC file of an Azure SQL database and upload it to Azure Storage from the same machine. SSMS 21 gives an error message when doing the same export: it creates the BACPAC file but fails on the last step, "Uploading BACPAC file to Microsoft Azure Storage". The error message is "Could not load file or assembly 'System.IO.Hashing, Version=6.0.0.0, Culture=neutral, PublicKeyToken=cc7b13ffcd2ddd51' or one of its dependencies. The system cannot find the file specified. (Azure.Storage.Blobs)". I tried a fresh installation of SSMS 21 on a brand-new machine (Windows 11) and hit the same issue. Can anyone advise? Thanks

Exclude Prefix in Azure Storage Action: Smarter Blob Management
Azure Storage Actions is a powerful platform for automating data management tasks across Blob and Data Lake Storage. Among its many features, Exclude Prefix stands out as a subtle yet critical capability that helps fine-tune task assignments.

What Is the "Exclude Prefix" Feature?
The Exclude Prefix option allows users to omit specific blobs or folders from being targeted by Azure Storage Actions. This is particularly useful when applying actions such as:
- Moving blobs to a cooler tier
- Deleting blobs
- Rehydrating archived blobs
- Triggering workflows based on blob changes
For example, if you're running a task to archive blobs older than 30 days but want to exclude logs or config files, you can define a prefix like logs/ or config/ in the exclusion list.

How to Use It — Example Scenario
In the following example, blobs across the entire storage account were deleted based on a condition: if a blob's access tier was set to Hot, it was deleted, except for those blobs or paths explicitly listed under the Exclude blob prefixes property.

Create a task:
- Navigate to the Azure portal and search for Storage tasks. Then, under Services, click Storage tasks – Azure Storage Actions.
- On the Azure Storage Actions | Storage Tasks page, click Create to begin configuring a new task.
- Complete all the required fields, then click Next to proceed to the Conditions page.
- To configure blob deletion, add the following conditions on the Conditions page.

Add the assignment:
- Click Add assignment. In the Select scope section, choose your subscription and storage account, then provide a name for the assignment.
- In the Role assignment section, select Storage Blob Data Owner from the Role drop-down list to assign this role to the system-assigned managed identity of the storage task.
- In the Filter objects section, specify the Exclude Blob Prefix filter if you want to exclude specific blobs or folders from the task. In the example above, blobs will be deleted except for those under the path "excludefiles" listed in the Exclude blob prefixes property.
- In the Trigger details section, choose when the task runs and then select the container where you'd like to store the execution reports. Select Add.
- On the Tags tab, select Next, and on the Review + Create tab, select Review + create.
- When the task is deployed, the "Your deployment is complete" page appears; select Go to resource to open the Overview page of the storage task.

Enable the task assignment:
In the Trigger details section there is an Enable task assignment checkbox, which is checked by default. If the Enable task assignments checkbox is unchecked, you can still enable assignments manually from the Assignments page. To do this, go to Assignments, select the relevant assignment, and then click Enable. The task assignment is queued and will run at the specified time.

Monitoring the runs:
After the task completes running, you can view the results of the run. With the Assignments page still open, select View task runs. Select the View report link to download a report. You can also view these comma-separated reports in the container that you specified when you configured the assignment.
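After a run, one quick way to sanity-check the exclusion is to list what is still sitting under the excluded prefix. The following is only an illustrative Azure CLI sketch; the account, container, and prefix names mirror this example and should be replaced with your own.

```bash
# List the blobs that remain under the excluded prefix after the task run.
# If blobs elsewhere were deleted but these remain, the Exclude Prefix filter worked as intended.
az storage blob list \
    --account-name <storage-account-name> \
    --container-name <container-name> \
    --prefix "excludefiles/" \
    --auth-mode login \
    --query "[].{name:name, tier:properties.blobTier}" \
    --output table
```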
Conclusion
The Exclude Prefix feature in Azure Storage Actions provides enhanced control and flexibility when managing blob data at scale. By allowing you to exclude specific prefixes from actions like delete or tier changes, it helps you safeguard critical data, reduce mistakes, and fine-tune automation workflows. This targeted approach not only improves operational efficiency but also supports more granular data management in Azure Blob Storage.

Note: Azure Storage Actions are generally available in the following public regions: https://learn.microsoft.com/en-us/azure/storage-actions/overview#supported-regions
You can also exclude certain blobs using the "Not" operator when building task conditions. Blobs may be excluded based on specific blob or container attributes on the task conditions side as well, not just through task assignments. In the screenshot below, the Not operator (!) is used to exclude blobs whose name equals "Test". Please refer to: https://learn.microsoft.com/en-us/azure/storage-actions/storage-tasks/storage-task-conditions#multiple-clauses-in-a-condition

Reference Links:
About Azure Storage Actions - Azure Storage Actions | Microsoft Learn
Storage task best practices - Azure Storage Actions | Microsoft Learn
Known issues and limitations with storage tasks - Azure Storage Actions | Microsoft Learn
Granting List-Only permissions for users in Azure Storage Account using ABAC
In this blog, we'll explore how to configure list-only permissions for specific users in Azure Storage, allowing them to view the structure of files and directories without accessing or downloading their contents. Granting list-only permissions to specific users for an Azure Storage container path allows them to list files and directories without reading or downloading their contents. While RBAC manages access at the container or account level, ABAC offers more granular control by leveraging attributes like resource metadata, user roles, or environmental factors, enabling customized access policies to meet specific requirements.

Disclaimer: Please test this solution before implementing it for your critical data.

Pre-requisites:
- An Azure Storage GPv2 / ADLS Gen2 storage account.
- Make sure you have enough permissions (Microsoft.Authorization/roleAssignments/write) to assign roles to users, such as Owner or User Access Administrator.

Note: If you want to grant list-only permission on a particular container, ensure that the permission is applied specifically to that container. This limits the scope of access to just the intended container and enhances security by minimizing unnecessary permissions. However, in this example, I am demonstrating how to implement this role for the entire storage account. This setup allows users to list files and directories across all containers within the storage account, which might be suitable for scenarios requiring broader access.

Action: You can follow the steps below to create a Storage Blob Data Reader role assignment with specific conditions using the Azure portal.

Step 1: Sign in to the Azure portal with your credentials. Go to the storage account where you would like the role to be implemented/scoped. Select Access Control (IAM) -> Add -> Add role assignment.

Step 2: On the Roles tab, select (or search for) Storage Blob Data Reader and click Next. On the Members tab, select User, group, or service principal to assign the selected role to one or more Azure AD users, groups, or service principals. Click Select members. Find and select the users, groups, or service principals; you can type in the Select box to search the directory by display name or email address. Select the user and continue with Step 3 to configure conditions.

Step 3: Storage Blob Data Reader provides access to list and read/download blobs. However, we need to add appropriate conditions to restrict the read/download operations. On the Conditions tab, click Add condition. The Add role assignment condition page appears. In the Add action section, click Add action. The Select an action pane appears; this pane is a filtered list of data actions based on the role assignment that will be the target of your condition. Check the box next to Read a blob, then click Select.

Step 4: Add the build expression in such a way that it evaluates to false, so that the result depends entirely on the action condition above. Save. On the Review + assign tab, click Review + assign to assign the role with the condition. After a few moments, the security principal is assigned the role.
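For reference, the condition built in Steps 3 and 4 looks roughly like the following when viewed in the condition editor's code view. This is a sketch rather than the exact expression from the screenshots: the second clause compares the blob path attribute to a value that will never match, so it always evaluates to false and the "Read a blob" action is effectively denied, while listing (a different data action on the container) is unaffected.

```
(
 (
  !(ActionMatches{'Microsoft.Storage/storageAccounts/blobServices/containers/blobs/read'})
 )
 OR
 (
  @Resource[Microsoft.Storage/storageAccounts/blobServices/containers/blobs:path] StringEquals 'no-blob-will-ever-have-this-path'
 )
)
```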
Please note: Along with the above permission, I have given the user the Reader role at the storage account level. You could also give the Reader role at the resource level, resource group level, or subscription level. When granting permissions to a user, there are two planes to consider: the management plane and the data plane. The management plane consists of operations related to the storage account itself, such as listing the storage accounts in a subscription or retrieving and regenerating the storage account keys. Data plane access refers to the ability to read, write, or delete data inside the containers. For more info, please refer to: https://docs.microsoft.com/en-us/azure/role-based-access-control/role-definitions#management-and-dat... To learn about the built-in roles available for Azure resources, please refer to: https://docs.microsoft.com/en-us/azure/role-based-access-control/built-in-roles Hence, it is important to grant at least the 'Reader' role at the management plane level to test this out in the Azure portal.

Step 5: Test the condition (ensure that the authentication method is set to Azure AD User Account and not Access key). The user can list the blobs inside the container, while downloading/reading a blob fails.

Related documentation:
What is Azure attribute-based access control (Azure ABAC)? | Microsoft Learn
Azure built-in roles - Azure RBAC | Microsoft Learn
Tutorial: Add a role assignment condition to restrict access to blobs using the Azure portal - Azure ABAC - Azure Storage | Microsoft Learn
Add or edit Azure role assignment conditions using the Azure portal - Azure ABAC | Microsoft Learn

Why is the Azure Monitor chart showing dashed lines for the Availability metric?
Have you ever noticed the dashed lines on the "Availability" Azure Monitor metric for your Storage Account? What's that about? Is your Storage Account down? Why do we see sections with a dashed line and other sections with a solid line?

Well, it turns out that Azure metrics charts use a dashed line style to indicate that there is a missing value (also known as a "null value") between two known time grain data points. In other words, this behavior is by design, and it is useful for identifying missing data points. The line chart is a superior choice for visualizing trends of high-density metrics but may be difficult to interpret for metrics with sparse values, especially when correlating values with the time grain is important. The dashed line makes reading these charts easier, but if your chart is still unclear, consider viewing your metrics with a different chart type. For example, a scatter plot chart for the same metric clearly shows each time grain by only drawing a dot when there is a value and skipping the data point altogether when the value is missing. In my case, this is what I get right after selecting the scatter chart: (screenshot) With the scatter chart it is now a lot clearer where the missing data points are.

But why would there be any missing data points for the "Availability" Azure Monitor metric in the first place? It turns out that the Azure Storage Resource Provider only reports Availability data to Azure Monitor when there is ongoing activity, meaning when the Storage Resource Provider is processing requests on any of its services (blob, queue, file, table). If you are not sending any requests to your Storage Account, you should expect the "Availability" Azure Monitor metric to show big sections of dashed lines and understand that this is by design. This is critical, because if you don't know it, you may think that the following chart shows your Storage Account has been "unavailable" for several hours: (screenshot) However, by looking at the scatter chart, we now know that there was a lot of inactivity on this Storage Account, that at some point only one data point showed "no availability", and that after another gap reflecting a period of no activity the Storage Account started receiving requests again and the "Availability" metric went back to 100%: (screenshot)
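If you would rather look at the raw numbers than the chart, you can pull the metric values directly; time grains with no requests simply come back with no value, which is exactly what the portal renders as a dashed segment. The following is a hedged Azure CLI sketch; the resource ID is a placeholder for your own Storage Account.

```bash
# List hourly Availability datapoints for the storage account.
# <storage-account-resource-id> is a placeholder for the full ARM resource ID of the account.
az monitor metrics list \
    --resource "<storage-account-resource-id>" \
    --metric Availability \
    --interval PT1H \
    --aggregation Average \
    --output table
```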
References
Chart shows dashed line: https://docs.microsoft.com/en-us/azure/azure-monitor/essentials/metrics-troubleshoot#chart-shows-dashed-line

Converting Page or Append Blobs to Block Blobs with ADF

In certain scenarios, a storage account may contain a significant number of page blobs classified under the hot access tier that are infrequently accessed or retained solely for backup purposes. To optimise costs, it is desirable to transition these page blobs to the archive tier. However, as indicated in the following documentation - https://learn.microsoft.com/en-us/azure/storage/blobs/access-tiers-overview - the ability to set the access tier is only available for block blobs; this functionality is not supported for append or page blobs. The Azure Blob Storage connector in Azure Data Factory can copy blobs from block, append, or page blobs, but can copy data only to block blobs: https://learn.microsoft.com/en-us/azure/data-factory/connector-azure-blob-storage?tabs=data-factory#supported-capabilities
Note: No extra configuration is required to set the blob type on the destination. By default, the ADF copy activity creates blobs as block blobs.
In this blog, we will see how to use Azure Data Factory to copy page blobs to block blobs. Please note that this is applicable to append blobs as well. Let's take a look at the steps.

Step 1: Creating the ADF instance
Create an Azure Data Factory resource in the Azure portal, referring to the following document - https://learn.microsoft.com/en-us/azure/data-factory/quickstart-create-data-factory
After creation, click on "Launch Studio" as shown below.

Step 2: Creating datasets
Create two datasets by navigating to Author -> Datasets -> New dataset. These datasets are used as the source and sink for the ADF copy activity. Select "Azure Blob Storage" -> click Continue -> select "Binary" and continue.

Step 3: Creating the linked service
Create a new linked service and provide the name of the storage account that contains the page blobs. Provide the file path where the page blobs are located. You also need to create another dataset for the destination: repeat steps 3 to 6 to create a destination dataset for copying the blobs to the storage account as block blobs.
Note: You can use the same or a different storage account for the destination dataset. Set it as per your requirements.

Step 4: Configuring a Copy data pipeline
Once the two datasets are created, create a new pipeline and, under the "Move and Transform" section, drag and drop the "Copy data" activity as shown below. Under the Source and Sink sections, select from the drop-downs the source and destination datasets created in the previous steps. Select the "Recursively" option and publish the changes.
Source: (screenshot)
Sink: (screenshot)
Note: You can configure the filters and copy behaviour as per your requirements.

Step 5: Debugging and validating
Now that the configuration is complete, click on "Debug". If the pipeline activity ran successfully, you should see a "Succeeded" status in the output section as below. Verify the blob type of the blobs in the destination storage account; it should show as block blob with the access tier as Hot.

After converting the blobs to block blobs, several methods are available to change their access tier to archive. These include implementing a blob lifecycle management policy, utilizing storage actions, or using Az CLI or PowerShell scripts, for example as sketched below.
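As an illustration of the CLI route mentioned above, a single converted block blob can be moved to the archive tier like this. It is only a sketch: the account, container, and blob names are placeholders, and for bulk archiving you would loop over a blob listing or rely on a lifecycle management policy instead.

```bash
# Move one converted block blob in the destination account to the Archive tier.
az storage blob set-tier \
    --account-name <destination-account> \
    --container-name <destination-container> \
    --name <blob-name> \
    --tier Archive \
    --auth-mode login
```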
Conclusion
Utilising ADF enables the conversion of page or append blobs to block blobs, after which any standard method such as an LCM policy or storage actions may be used to change the access tier to archive. This strategy offers a more streamlined and efficient solution compared to developing custom code or scripts.

Reference links:
https://learn.microsoft.com/en-us/azure/storage/blobs/access-tiers-overview
https://learn.microsoft.com/en-us/azure/data-factory/connector-azure-blob-storage?tabs=data-factory#supported-capabilities
https://learn.microsoft.com/en-us/azure/data-factory/copy-activity-overview
https://learn.microsoft.com/en-us/azure/storage-actions/storage-tasks/storage-task-quickstart-portal
https://learn.microsoft.com/en-us/azure/storage/blobs/lifecycle-management-overview
https://learn.microsoft.com/en-us/azure/storage/blobs/archive-blob?tabs=azure-powershell#bulk-archive

Purview Webinars
REGISTER FOR ALL WEBINARS HERE

Upcoming Microsoft Purview Webinars
JULY 15 (8:00 AM) Microsoft Purview | How to Improve Copilot Responses Using Microsoft Purview Data Lifecycle Management
Join our non-technical webinar and hear the unique, real-life case study of how a large global energy company successfully implemented Microsoft automated retention and deletion across the entire M365 landscape. You will learn how the company used Microsoft Purview Data Lifecycle Management to achieve a step up in information governance and retention management across a complex matrix organization, paving the way for the safe introduction of Gen AI tools such as Microsoft Copilot.

2025 Past Recordings
JUNE 10 Unlock the Power of Data Security Investigations with Microsoft Purview
MAY 8 Data Security - Insider Threats: Are They Real?
MAY 7 Data Security - What's New in DLP?
MAY 6 What's New in MIP?
APR 22 eDiscovery New User Experience and Retirement of Classic
MAR 19 Unlocking the Power of Microsoft Purview for ChatGPT Enterprise
MAR 18 Inheriting Sensitivity Labels from Shared Files to Teams Meetings
MAR 12 Microsoft Purview AMA - Data Security, Compliance, and Governance
JAN 8 Microsoft Purview AMA | Blog Post

📺 Subscribe to our Microsoft Security Community YouTube channel for ALL Microsoft Security webinar recordings, and more!

Rehydrating Archived Blobs via Storage Task Actions
Azure Storage Actions is a fully managed platform designed to automate data management tasks for Azure Blob Storage and Azure Data Lake Storage. You can use it to perform common data operations on millions of objects across multiple storage accounts without provisioning extra compute capacity and without requiring you to write code. Storage task actions can be used to rehydrate archived blobs to any tier as required. Please note that there is currently no option to set the rehydration priority; it defaults to Standard.

Note: Azure Storage Actions are generally available in the following public regions: https://learn.microsoft.com/en-us/azure/storage-actions/overview#supported-regions
Azure Storage Actions is currently in preview in the following regions: https://learn.microsoft.com/en-us/azure/storage-actions/overview#regions-supported-at-the-preview-level

Below are the steps to achieve the rehydration.

Create a task:
In the Azure portal, search for Storage tasks. Under Services, select Storage tasks - Azure Storage Actions. On the Azure Storage Actions | Storage Tasks page, select Create. Fill in all the required details and click Next to open the Conditions page. Add the conditions as below if you want to rehydrate to the Cool tier.

Add the assignment:
Select Add assignment. In the Select scope section, select your subscription and storage account and name the assignment. In the Role assignment section, in the Role drop-down list, select Storage Blob Data Owner to assign that role to the system-assigned managed identity of the storage task. In the Filter objects section, specify a filter if you want the task to run on specific objects rather than the whole storage account. In the Trigger details section, choose when the task runs and then select the container where you'd like to store the execution reports. Select Add. On the Tags tab, select Next. On the Review + Create tab, select Review + create. When the task is deployed, the "Your deployment is complete" page appears. Select Go to resource to open the Overview page of the storage task.

Enable the task assignment:
Storage task assignments are disabled by default. Enable assignments from the Assignments page: select Assignments, select the assignment, and then select Enable. The task assignment is queued and will run at the specified time.

Monitoring the runs:
After the task completes running, you can view the results of the run. With the Assignments page still open, select View task runs. Select the View report link to download a report. You can also view these comma-separated reports in the container that you specified when you configured the assignment.
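As a side note, if you need control over the rehydration priority for a handful of blobs (the storage task currently defaults to Standard, as mentioned above), a per-blob alternative is to change the tier directly. This is only a sketch with placeholder names.

```bash
# Rehydrate one archived blob to the Cool tier with High priority.
az storage blob set-tier \
    --account-name <storage-account-name> \
    --container-name <container-name> \
    --name <archived-blob-name> \
    --tier Cool \
    --rehydrate-priority High \
    --auth-mode login
```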
Reference Links:
About Azure Storage Actions - Azure Storage Actions | Microsoft Learn
Storage task best practices - Azure Storage Actions | Microsoft Learn
Known issues and limitations with storage tasks - Azure Storage Actions | Microsoft Learn

Azure Storage Options - A Guide to Choosing the right storage option

At the heart of this post is Kairos IMS, an innovative Impact Management System designed to empower human-serving nonprofits and social impact organizations. Co-developed by the Urban League of Broward County and our trusted technology partner, Impactful, Kairos IMS reduces administrative burdens, enhances holistic care, and enables organizations to leverage data for increased agility and seamless service delivery. In this blog series, we'll take a closer look at the powerful technologies that fuel Kairos IMS, from Azure services to security frameworks, offering insight into how modern infrastructure supports mission-driven impact. Click here to learn more.

Provided in this guide is a nonprofit-friendly breakdown of the main Azure Storage types, what they're good for, and how to choose based on your needs and budget.

The 4 Main Types of Azure Storage
Azure offers four primary types of storage:

| Storage Type | What It Stores | Best For |
|---|---|---|
| Blob Storage | Unstructured data: images, videos, PDFs | Media files, documents, backups |
| File Storage | Shared files accessible via SMB protocol | Team file shares, legacy apps, migrations |
| Table Storage | NoSQL key-value data | Lightweight data like logs or sensor data |
| Queue Storage | Messages for task automation | Background tasks, app-to-app communication |

Let's break them down in more detail, with nonprofit use cases.

🟣 1. Azure Blob Storage (Binary Large Object)
What it is: A flexible place to store unstructured data, like documents, images, and videos.
Use cases for nonprofits:
- Uploading program videos or workshop recordings for your community
- Storing scanned forms, reports, or grant applications
- Keeping secure backups of sensitive files
Cost tip: You can save money using Cool or Archive tiers for files you rarely access.

🔵 2. Azure File Storage
What it is: A cloud-based shared file system that acts like a network drive.
Use cases for nonprofits:
- Replacing on-premise file servers
- Collaborating across teams in remote or hybrid environments
- Making legacy nonprofit software cloud-accessible
Bonus: It integrates easily with Windows using standard SMB protocols, so your team won't need to learn anything new.

🟢 3. Azure Table Storage
What it is: A NoSQL storage option for simple key-value pairs.
Use cases for nonprofits:
- Storing lightweight data like newsletter sign-ups or app usage logs
- When you need a low-cost alternative to a full database
Note: It's not for complex queries; this is basic storage, great for lightweight scenarios.

🟡 4. Azure Queue Storage
What it is: A messaging system that lets apps send and receive messages asynchronously.
Use cases for nonprofits:
- Automating tasks, like sending thank-you emails after an online donation
- Managing volunteer registration workflows
You probably won't use this directly, but if your IT team or a consultant is building an app for you, it might be part of the backend.

How to Choose: A Quick Guide for Nonprofits

| Need | Best Option |
|---|---|
| Store and access documents, images, or videos | Blob Storage |
| Share files across staff or locations | File Storage |
| Store structured data (like a simple database) | Table Storage |
| Automate tasks between services | Queue Storage |
| Long-term storage or backups (low cost) | Blob Storage (Archive Tier) |
| Replacing an on-site file server | File Storage |

💡 Cost-Saving Tips for Nonprofits
- Use your Azure credits: Eligible nonprofits get $3,500 in free Azure credits annually via Microsoft for Nonprofits.
- Pick the right tier: Blob storage offers Hot, Cool, and Archive tiers based on how often you access data.
- Turn on auto-delete or lifecycle rules: Save money by setting old files to auto-delete or move to a cheaper tier (see the example rule below).
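To make that last tip concrete, here is an example of what a lifecycle management rule can look like. The rule name and the 30/365-day thresholds are illustrative values only; you would apply something like this through the storage account's Lifecycle management blade (code view) or with the Azure CLI/PowerShell.

```json
{
  "rules": [
    {
      "enabled": true,
      "name": "tier-then-delete-old-files",
      "type": "Lifecycle",
      "definition": {
        "filters": { "blobTypes": [ "blockBlob" ] },
        "actions": {
          "baseBlob": {
            "tierToCool": { "daysAfterModificationGreaterThan": 30 },
            "delete": { "daysAfterModificationGreaterThan": 365 }
          }
        }
      }
    }
  ]
}
```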
Final Thoughts
Azure Storage offers powerful tools to help your nonprofit stay secure, organized, and scalable. Choosing the right option ensures your team has access to the files and data they need, without overspending. Whether you're working with an IT volunteer, a cloud consultant, or just learning it yourself, knowing the basics of Azure Storage puts your organization in a stronger position to grow and serve your community.