Azure NetApp Files
Hybrid File Tiering Addresses Top CIO Priorities of Risk Control and Cost Optimization
This article describes how you can leverage Komprise Intelligent Tiering for Azure with any on-premises file storage platform and Azure Blob Storage to reduce your costs by 70% and shrink your ransomware attack surface.

Note: This article has been co-authored by Komprise and Microsoft.

Unstructured data plays a big role in today's IT budgets and risk factors

Unstructured data, which is any data that does not fit neatly into a database or tabular format, has been growing exponentially and is now projected by analysts to be over 80% of business information. Unstructured data is commonly referred to as file data, the terminology used for the rest of this article. File data has caught some IT leaders by surprise because it now consumes a significant portion of IT budgets, with no sign of slowing down. File data is expensive to manage and retain because it is typically stored and protected by replication to an identical storage platform, which can be very expensive at scale. This article reviews how you can easily identify hot and cold data and transparently tier cold files to Azure to cut costs and shrink ransomware exposure with Komprise.

Why file data is factoring into CIO priorities

CIOs rank cost optimization, risk management and revenue improvement as key priorities for their data; 56% chose cost optimization as their top priority according to the 2024 Komprise State of Unstructured Data Management survey. This is because file data is often retained for decades, grows at double-digit rates, and can easily reach petabytes. Keeping a primary copy, a backup copy and a DR copy means three or more copies of this large volume of file data, which becomes prohibitively expensive. On the other hand, file data has largely been untapped in terms of value, but businesses are now realizing its importance for training and fine-tuning AI models. Smart solutions are required to balance these competing requirements.

Why file data is vulnerable to ransomware attacks

File data is arguably the most difficult data to protect against ransomware attacks because it is open to many different users, groups and applications. This increases risk because a single user's or group's mistake can lead to a ransomware infection. If an infected file is shared and accessed again, the infection can quickly spread across the network undetected, and the longer ransomware lurks, the greater the risk. For these reasons, you cannot ignore file data when creating a ransomware defense strategy.

How to leverage Azure to cut the cost and inherent risk of file data retention

You can cut costs and shrink the ransomware attack surface of file data using Azure even when you still require on-premises access to your files. The key is reducing the amount of file data that is actively accessed and thus exposed to ransomware attacks. Since 80% of file data is typically cold and has not been accessed in months (see Demand for cold data storage heats up | TechTarget), transparently offloading these files to immutable storage through hybrid tiering cuts both costs and risks. Hybrid tiering offloads entire files from the data storage, snapshot, backup and DR footprints while your users continue to see and access the tiered files without any change to your application processes or user behavior. Unlike storage tiering, which is typically offered by the storage vendor and places blocks of files, still controlled by the storage filesystem, in Azure, hybrid tiering operates at the file level and transparently offloads the entire file to Azure while leaving behind a link that looks and behaves like the file itself.
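Komprise's Transparent Move Technology is proprietary, so the exact mechanics are not public, but the general stub-file pattern is easy to illustrate. The following is a minimal Python sketch, assuming the azure-storage-blob package, a hypothetical share mounted at /mnt/nas/share, and a connection string in an environment variable; it offloads files untouched for 180 days and leaves a plain-text stub behind (a real tiering product leaves a link that is recalled transparently on access).

```python
# Minimal illustration of the stub-file pattern behind hybrid tiering.
# Assumes: pip install azure-storage-blob; AZURE_STORAGE_CONNECTION_STRING is set.
# Komprise's Transparent Move Technology is proprietary -- this only sketches the concept.
import os
import time
from pathlib import Path

from azure.storage.blob import BlobServiceClient

COLD_AFTER_DAYS = 180                # tiering policy: files untouched this long are "cold"
SHARE_ROOT = Path("/mnt/nas/share")  # hypothetical on-premises share
CONTAINER = "tiered-files"

service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
container = service.get_container_client(CONTAINER)

cutoff = time.time() - COLD_AFTER_DAYS * 86400
for path in SHARE_ROOT.rglob("*"):
    if not path.is_file() or path.stat().st_atime > cutoff:
        continue  # skip directories and recently accessed files
    blob_name = str(path.relative_to(SHARE_ROOT))
    with path.open("rb") as data:
        container.upload_blob(blob_name, data, overwrite=True)  # offload the whole file
    # Replace the file with a tiny stub recording where the content went; a real
    # tiering product's link would rehydrate the file transparently when opened.
    path.write_text(f"tiered://{CONTAINER}/{blob_name}\n")
```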
Hybrid tiering offloads cold files to Azure to cut costs and shrink the ransomware attack surface:

Cut 70%+ costs: By offloading cold files rather than blocks, hybrid tiering can shrink the amount of data you are storing and backing up by 80%, which cuts costs proportionately. As the example below shows, hybrid tiering can cut file storage and backup costs by 70% (a simplified version of this arithmetic appears in the first Python sketch after the benefits list below).

Assumptions
  Amount of Data on NAS (TB)                    1024
  % Cold Data                                   80%
  Annual Data Growth Rate                       30%
  On-Prem NAS Cost/GB/Mo                        $0.07
  Backup Cost/GB/Mo                             $0.04
  Azure Blob Cool Cost/GB/Mo                    $0.01
  Komprise Intelligent Tiering for Azure/GB/Mo  $0.008

                                    On-Prem NAS   On-Prem NAS + Azure Intelligent Tiering
  Data in On-Premises NAS (TB)      1024          205
  Snapshots                         30%           30%
  Cost of On-Prem NAS Primary Site  $1,064,960    $212,992
  Cost of On-Prem NAS DR Site       $1,064,960    $212,992
  Backup Cost                       $460,800      $42,598
  Data on Azure Blob Cool (TB)      0             819
  Cost of Azure Blob Cool           $0            $201,327
  Cost of Komprise                  --            $100,000
  Total Cost for 1 PB per Year      $2,590,720    $769,909
  SAVINGS/PB/Yr                                   $1,820,811 (70%)

Shrink the ransomware attack surface by 80%: Offloading cold files to immutable Azure Blob removes them from the active attack surface, eliminating 80% of the storage, DR and backup costs while also providing a potential recovery path if the cold files get infected. Because Komprise tiers to immutable Azure Blob with versioning, even if someone tried to infect a cold file, the write would be saved as a new version, enabling recovery from an older version. Learn more about Azure Immutable Blob storage here.

In addition to cost savings and improved ransomware defense, the benefits of hybrid cloud tiering using Komprise and Azure are:

- Leverage Existing Storage Investment: You can continue to use your existing NAS storage and Komprise to tier cold files to Azure. Users and applications continue to see and access the files as if they were still on-premises.
- Leverage Azure Data Services: Komprise maintains file-object duality with its patented Transparent Move Technology (TMT), which means the tiered files can be viewed and accessed in Azure as objects, allowing you to use Azure Data Services natively. This enables you to leverage the full power of Azure with your enterprise file data.
- Works Across Heterogeneous Vendor Storage: Komprise works across all your file and object storage to analyze and transparently tier data to Azure file and object tiers.
- Ongoing Lifecycle Management in Azure: Komprise continues to manage the data lifecycle in Azure, so as data gets colder it can move from Azure Blob Cool to Cold to Archive tiers based on policies you control (the second sketch below shows the underlying tier transition).
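To make the savings arithmetic concrete, here is a deliberately simplified sketch of the model behind the table. It uses only the unit prices from the assumptions and ignores the snapshot and growth factors that the published figures fold in, so its totals differ somewhat from the table while still landing at roughly 70% savings.

```python
# Simplified version of the savings arithmetic from the table above.
# Uses only the stated unit prices; the published table also folds in
# snapshot overhead and growth assumptions, so its figures differ slightly.
GB_PER_TB = 1024
nas_tb, cold_fraction = 1024, 0.80
nas_price, backup_price = 0.07, 0.04       # $/GB/month
blob_cool_price, komprise_price = 0.01, 0.008

def annual(tb: float, price_per_gb_month: float) -> float:
    """Annual cost of tb terabytes at a given $/GB/month rate."""
    return tb * GB_PER_TB * price_per_gb_month * 12

# Baseline: primary NAS + DR NAS + backup, all holding the full 1 PB.
baseline = 2 * annual(nas_tb, nas_price) + annual(nas_tb, backup_price)

# Tiered: only hot data stays on NAS (and its DR/backup); cold data sits in Blob Cool.
hot_tb, cold_tb = nas_tb * (1 - cold_fraction), nas_tb * cold_fraction
tiered = (2 * annual(hot_tb, nas_price) + annual(hot_tb, backup_price)
          + annual(cold_tb, blob_cool_price) + annual(nas_tb, komprise_price))

print(f"baseline ${baseline:,.0f}  tiered ${tiered:,.0f}  "
      f"savings {1 - tiered / baseline:.0%}")  # roughly 71% under this simple model
```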
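The ongoing lifecycle bullet rests on a documented Blob Storage capability: a blob's access tier can be changed in place as data ages. Komprise drives this by policy; the sketch below, again assuming azure-storage-blob and the hypothetical tiered-files container, shows the raw tier transition it would perform.

```python
# Sketch of age-based tier transitions on blobs that have already been tiered.
# Assumes azure-storage-blob and the container from the earlier sketch;
# Komprise applies such policies automatically -- this shows the raw mechanism.
import os
from datetime import datetime, timedelta, timezone

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string(os.environ["AZURE_STORAGE_CONNECTION_STRING"])
container = service.get_container_client("tiered-files")

now = datetime.now(timezone.utc)
for blob in container.list_blobs():
    age = now - blob.last_modified
    if age > timedelta(days=365):
        new_tier = "Archive"   # rarely needed: cheapest, offline tier
    elif age > timedelta(days=180):
        new_tier = "Cool"      # cold but still online
    else:
        continue
    if blob.blob_tier != new_tier:
        container.get_blob_client(blob.name).set_standard_blob_tier(new_tier)
```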
Azure and Komprise customers are already using hybrid tiering to improve their ransomware posture while reducing costs; a great example is Katten.

Global law firm saves $900,000 per year and achieves resilient ransomware defense with Komprise and Azure

Katten Muchin Rosenman LLP (Katten) is a full-service law firm delivering legal services across more than a dozen practice areas and sectors, including Aviation, Construction, Energy, Education, Entertainment, Healthcare and Real Estate. Like many other large law firms, Katten has seen file data storage grow an average of 20% annually, requiring it to add on-premises storage capacity every 12 to 18 months. With a focus on managing data storage costs in an environment where data grows exponentially but cannot be deleted, Katten needed a solution that could provide deep data insights and the ability to move file data as it ages to immutable object storage in the cloud for greater cost savings and ransomware protection. Katten implemented hybrid tiering with Komprise Intelligent Tiering to Azure and leveraged Immutable Blob storage, not only saving $900,000 annually but also improving its ransomware defense posture. Read how Katten does hybrid tiering to Azure using Komprise.

Summary: Hybrid tiering helps CIOs optimize file costs and cut ransomware risks

Cost optimization and risk management are top CIO priorities, and file data is a major contributor to both costs and ransomware risks. Organizations are leveraging Komprise to tier cold files to Azure while continuing to use their on-premises NAS file storage. This provides a low-risk approach with no disruption to users and apps while cutting costs by 70% and shrinking the ransomware attack surface by 80%.

Next steps

To learn more and get a customized assessment of your savings, visit the Azure Marketplace listing or contact azure@komprise.com.

How to Save 70% on File Data Costs
In the final entry in our series on lowering file storage costs, DarrenKomprise shares how Komprise can help lower on-premises and Azure-based file storage costs. Komprise and Azure offer you a means to optimize unstructured data costs now and in the future!

SAP System Refresh and Cloning operations on Azure NetApp Files with SnapCenter
Discover the power of SAP HANA on Azure NetApp Files with seamless system refresh and cloning operations using SnapCenter. This solution leverages Azure NetApp Files snapshot and volume cloning capabilities to provide end-to-end workflows for data protection and SAP system refresh operations. Whether you need to create system copies for testing, address logical corruption, or perform disaster recovery failover tests, SnapCenter ensures quick and efficient processes, saving you time and resources.
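SnapCenter orchestrates these workflows end to end, including database-consistent quiescing; underneath, a system refresh rests on two Azure NetApp Files primitives: snapshot a source volume, then create a new volume from that snapshot. A rough sketch with the azure-mgmt-netapp SDK follows; all resource names are placeholders and the exact model fields can vary by SDK version, so treat it as an outline rather than a SnapCenter equivalent.

```python
# Sketch of the two ANF primitives behind a system refresh: snapshot a source
# volume, then create a new volume from that snapshot (a clone).
# Assumes: pip install azure-mgmt-netapp azure-identity; names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.netapp import NetAppManagementClient
from azure.mgmt.netapp.models import Snapshot, Volume

SUB, RG, ACCOUNT, POOL = "<subscription-id>", "sap-rg", "anf-account", "premium-pool"
client = NetAppManagementClient(DefaultAzureCredential(), SUB)

# 1. Snapshot the production HANA data volume.
snap = client.snapshots.begin_create(
    RG, ACCOUNT, POOL, "hana-data-prd", "refresh-snap",
    Snapshot(location="westeurope"),
).result()

# 2. Clone: a new volume whose content comes from the snapshot, for the QA system.
src = client.volumes.get(RG, ACCOUNT, POOL, "hana-data-prd")
client.volumes.begin_create_or_update(
    RG, ACCOUNT, POOL, "hana-data-qa",
    Volume(
        location=src.location,
        creation_token="hana-data-qa",        # export/mount path name
        usage_threshold=src.usage_threshold,  # volume quota in bytes
        subnet_id=src.subnet_id,
        snapshot_id=snap.snapshot_id,         # this is what makes it a clone
    ),
).result()
```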
SAP HANA data protection on Azure NetApp Files with SnapCenter

Efficient and fast data protection and recovery of SAP HANA running on Azure NetApp Files with SnapCenter: introduction, configuration, backup and restore. Ensure the highest level of data protection for your SAP HANA systems with Azure NetApp Files. Leveraging the robust capabilities of Azure NetApp Files, NetApp SnapCenter offers seamless snapshot-based backup and restore operations. This solution can efficiently offload snapshots to an Azure storage account for longer-term retention, ensuring fast data protection and recovery for SAP applications.

Azure NetApp Files - Error creating the volume
I'm encountering an issue while trying to create a new volume on my NetApp system, which is integrated with Azure AD (Entra ID). I don't have a traditional on-premises Active Directory or Domain Controller and am relying entirely on Azure AD. When I attempt to create the volume, I receive the following error:

Error when creating - Failed to create the Active Directory machine account "NETAPP-B213". Reason: LDAP Error: Local error occurred
Details: Error: Machine account creation procedure failed
  [ 76] Loaded the preliminary configuration.
  [ 79] Successfully connected to ip 10.0.8.4, port 88 using TCP
  [ 111] Successfully connected to ip 10.0.8.4, port 389 using TCP
  [ 111] Entry for host-address: 10.0.8.4 not found in the current source: FILES. Ignoring and trying next available source
  [ 122] Successfully connected to ip 10.0.8.4, port 88 using TCP
  [ 129] FAILURE: Unable to SASL bind to LDAP server using GSSAPI: Local error
  [ 132] Unable to connect to LDAP (Active Directory) service on evri3ba830eo2hg.migramer.com (Error: Local error)
  [ 132] Unable to make a connection (LDAP (Active Directory):MIGRAMER.COM), Result: RESULT_ERROR_LDAPSERVER_LOCAL_ERROR. (Code: ErrorFromNFSaaSErrorState)

Has anyone encountered this issue or have any insights on resolving this LDAP error with Azure AD and NetApp? Any assistance would be greatly appreciated!

Thanks,
Mauricio

Optimize HANA deployments with Azure NetApp Files application volume group for SAP HANA
This article describes the value and use cases for Azure NetApp Files (ANF) volume provisioning using the Azure NetApp Files application volume group (AVG) for SAP HANA. The article refers to a series of videos that explain the prerequisites as well as the different deployment scenarios in detail.

Microsoft Ignite 2023: What is new in file shares for enterprise workloads?
We are announcing various new capabilities for Azure Files and Azure NetApp Files to continue to improve file share experiences for enterprise workloads. You can now enjoy more flexibility, security, and performance for your data storage needs.

SAP on Azure NetApp Files Sizing Best Practices
This article provides guidelines for properly sizing SAP deployments on Azure NetApp Files (ANF) to achieve optimal performance and cost-effectiveness. It emphasizes the importance of accurate capacity and performance estimation to avoid overprovisioning or underutilization of resources. The article introduces the "SAP on Azure NetApp Files Sizing Estimator" tool, which automates the sizing process and helps customers and partners to make informed decisions. By following these sizing guidelines and leveraging the tool, customers and partners can optimize their SAP landscapes on ANF for improved efficiency, reliability, and cost-effectiveness.
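One core rule from the sizing guidance can be shown in a few lines: an Azure NetApp Files volume's throughput scales with its provisioned capacity (per the documented service levels, roughly 16 MiB/s per TiB for Standard, 64 for Premium and 128 for Ultra), so a volume must be sized for whichever is larger, its capacity need or its throughput need. Below is a minimal sketch of that calculation; the sizing estimator tool naturally covers much more (growth, snapshots, multiple volumes, cost).

```python
# Core ANF sizing rule of thumb: a volume's throughput scales with its
# provisioned capacity, so size for whichever is larger -- capacity or MiB/s.
# Throughput per TiB taken from the Azure NetApp Files service-level documentation.
SERVICE_LEVELS = {"Standard": 16, "Premium": 64, "Ultra": 128}  # MiB/s per TiB

def required_tib(capacity_tib: float, throughput_mibs: float, level: str) -> float:
    """Provisioned TiB needed to satisfy both the capacity and throughput targets."""
    return max(capacity_tib, throughput_mibs / SERVICE_LEVELS[level])

# Hypothetical example: a HANA data volume needing 3 TiB of space and 400 MiB/s.
for level in SERVICE_LEVELS:
    print(f"{level:>8}: provision {required_tib(3, 400, level):.1f} TiB")
# Standard would need 25 TiB just to reach 400 MiB/s; Premium and Ultra far less,
# which is why the right service level matters as much as raw capacity.
```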