data + storage
Mine your Azure backup data, it could save you 💰💡
Your data has a story to tell. Mine it, decipher it, and turn it into actionable outcomes. 📊🔍

Azure backups can become orphaned in several ways (I'll dive into that in a future post). But here's a key point: orphaned doesn't always mean useless, hence the word "Potential" in the title of my Power BI report. Each workload needs to be assessed individually. If a backup is no longer needed, you might be paying for it - unnecessarily and unknowingly. 🕵️‍♂️💸

To uncover these hidden costs, I combined data from the Azure Business Continuity Center with a PowerShell script I wrote to extract LastBackupTime and other metadata. This forms the foundation of my report, helping visualize and track backup usage over time. This approach helped me identify forgotten one-time backups, VMs deleted without stopping the backup, workloads excluded due to policy changes, and backups left behind after resource migrations. If you delete unneeded backups and have soft-delete enabled, the backup size drops to zero and Azure stops charging for it. ✅🧹

💡 Do your Azure backups have their own untold story to tell?

📸 Here's a snapshot of my report that helped me uncover these insights 👇
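A minimal sketch of the kind of metadata extraction described above - not the author's actual script. It assumes the Az.RecoveryServices module, Reader access to the vaults, and covers only Azure VM workloads; the output path is a placeholder.

    # Enumerate every Recovery Services vault in the current subscription
    $items = foreach ($vault in Get-AzRecoveryServicesVault) {
        # List Azure VM backup items in this vault; other workload types can be queried the same way
        Get-AzRecoveryServicesBackupItem -BackupManagementType AzureVM -WorkloadType AzureVM -VaultId $vault.ID |
            Select-Object @{ n = 'Vault'; e = { $vault.Name } }, Name, ProtectionState, LastBackupStatus, LastBackupTime
    }
    # Export the metadata so it can be joined with Azure Business Continuity Center data in Power BI
    $items | Export-Csv -Path .\backup-items.csv -NoTypeInformation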
Azure Storage

You've asked for something about Azure Storage, and there's a wealth of information both from your own work and from trusted external sources. Here's a tailored overview that blends your contributions with broader context:

🔹 What Is Azure Storage?

Azure Storage [1] is Microsoft's cloud-based solution for storing modern data workloads. It offers:

High availability and durability through redundancy and geo-replication.
Massive scalability to support everything from small apps to enterprise-scale systems.
Security with built-in encryption and fine-grained access control.
Global accessibility via REST APIs, SDKs, Azure CLI, and Azure Storage Explorer.

Azure Storage supports multiple data services:

Azure Blob Storage: For unstructured data like images, videos, and backups.
Azure Files: Managed file shares accessible via SMB or NFS.
Azure Queues: For message-based communication between components.
Azure Tables: A NoSQL store for structured data.
Azure Disks: Persistent block storage for Azure VMs.
Azure Elastic SAN and Azure Container Storage: For advanced storage orchestration and container-native workloads.

🔹 Your Contributions on Azure Storage

You've made significant contributions to the Azure community, especially through your blog and event series. In your MVP Contributions tracker, you've documented multiple sessions and blog posts covering:

Types of Azure Storage (e.g., Blob, File, Queue, Table) [2]
Cost optimization strategies for Azure Storage [2]
Azure Files and file sharing capabilities [2]

Your presentation A COMPREHENSIVE GUIDE TO AZURE COST MANAGEMENT dives into how Azure Storage fits into broader cloud cost strategies, emphasizing budget control, resource allocation, and ROI [3].

🔹 Practical Use Cases

Azure Storage is ideal for:

Backup and disaster recovery with geo-redundant storage.
Big data analytics using Data Lake Storage.
Web and mobile app content delivery via Blob Storage.
Enterprise file sharing with Azure Files.
IoT and telemetry ingestion using Queues and Tables.

Would you like help turning this into a blog post, presentation, or training module? I can also summarize your past Azure Storage sessions or help you prepare new ones.

References
[1] Introduction to Azure Storage - Cloud storage on Azure
[2] MVP Contributions
[3] A COMPREHENSIVE GUIDE TO AZURE COST MANAGEMENT
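As a quick, hedged illustration of the Blob Storage service listed above, here is a minimal Az PowerShell sketch that creates a container and uploads a file. The account, container, and file names are placeholders, and the signed-in identity would need a data-plane role such as Storage Blob Data Contributor.

    Connect-AzAccount
    # Hypothetical storage account name; -UseConnectedAccount authenticates the data plane with the signed-in Entra ID identity
    $ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -UseConnectedAccount
    New-AzStorageContainer -Name "demo" -Context $ctx
    # Upload a local file as a block blob
    Set-AzStorageBlobContent -File ".\report.pbix" -Container "demo" -Blob "report.pbix" -Context $ctx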
A Complete Guide to Azure Database Migration Strategies, Tools, and Best Practices

As organizations increasingly adopt cloud-first strategies, migrating databases to the cloud has become a crucial step toward scalability, security, and cost-efficiency. Microsoft Azure, with its powerful ecosystem, offers a range of services and tools to simplify and streamline this process. In this blog, we'll explore Azure Database Migration, including its benefits, strategies, tools, and best practices.

https://dellenny.com/a-complete-guide-to-azure-database-migration-strategies-tools-and-best-practices/
Scaling Smart with Azure: Architecture That Works

Hi Tech Community! I'm Zainab, currently based in Abu Dhabi and serving as Vice President of Finance & HR at Hoddz Trends LLC, a global tech solutions company headquartered in Arkansas, USA. While I lead on strategy, people, and financials, I also roll up my sleeves when it comes to tech innovation.

In this discussion, I want to explore the real-world challenges of scaling systems with Microsoft Azure. From choosing the right architecture to optimizing performance and cost, I'll be sharing insights drawn from experience, and I'd love to hear yours too. Whether you're building from scratch, migrating legacy systems, or refining deployments, let's talk about what actually works.
Storage Accounts - Networking

Hi All,

This seems like a basic issue, but I cannot seem to resolve it. In a nutshell, a number of storage accounts (and other resources) were created with Public Network Access set as below:

I would like to change them all to "Enabled from selected virtual networks and IP addresses", or even "Disabled". However, when I change to "Enabled from selected virtual networks and IP addresses", connectivity from, for example, Power BI to the storage account fails. I have added the VPN IPs, my local IP, etc., but every connection continues to fail or fail authentication. Once it is changed back to "Enabled from all networks", everything works, i.e. Power BI can access the Azure Blob Storage and refresh successfully.

I have also enabled 'Allow Azure services on the trusted services list to access this storage account', but Power BI still fails to access the data. I get a Data Source Credentials error whether using Key, Service Principal, etc. As soon as I switch it back to "Enabled from all networks", it authenticates straight away.

One more idea I had was to add ALL of the Resource Instances, as this would whitelist more Azure services, although Power BI should already be covered by 'Allow Azure services on the trusted services list to access this storage account'. I thought I might give it a try. I also created an NSG and used the ServiceTags file to create an inbound rule to allow Power BI from UK South, and I have created a Private Endpoint. This should all have worked, but I still can't set the account to restricted networks.

I must be missing something fundamental, or there is something fundamentally off with this tenant. When either of the two restrictive options is selected, do they also block various Microsoft services?

Any help would be gratefully appreciated.
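For reference, a hedged sketch of the configuration pattern being discussed (resource names, addresses, and IDs are placeholders, not values from this post), using the Az.Storage module to restrict a storage account to selected networks while still allowing specific public IPs and a specific Azure resource instance:

    # Default to Deny, but keep the "trusted Azure services" bypass on
    Update-AzStorageAccountNetworkRuleSet -ResourceGroupName "rg-data" -Name "mystorageacct" -DefaultAction Deny -Bypass AzureServices

    # Allow a specific public IP or CIDR range (e.g. a VPN egress address)
    Add-AzStorageAccountNetworkRule -ResourceGroupName "rg-data" -Name "mystorageacct" -IPAddressOrRange "203.0.113.10"

    # Allow a specific Azure resource instance (resource access rule) by its resource ID
    Add-AzStorageAccountNetworkRule -ResourceGroupName "rg-data" -Name "mystorageacct" -TenantId "<tenant-guid>" -ResourceId "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/<provider>/<type>/<name>"

Note that IP network rules only admit traffic arriving from public, routable addresses; traffic arriving over a private endpoint is governed by the private endpoint connection itself rather than these rules.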
Azure support team not responding to support request

I am posting here because I have not received a response to my support request despite my plan stating that I should hear back within 8 hours. It has now gone a day beyond that limit, and I am still waiting for assistance with this urgent matter. This issue is critical for my operations, and the delay is unacceptable.

The ticket/reference number for my original support request was 2410100040000309, and I have created a brand new service request with ID 2412160040010160. I need this addressed immediately.
Azure network security perimeter with storage accounts and Runbooks

I know this is a preview feature, and I don't know if it will be fixed in the future. The problem arises when you try to secure traffic between Azure serverless runbooks and a storage account. No matter what configuration you use, the runbook accesses the storage account from a 10.x.x.x IP. That means you can't secure the traffic using storage account firewall rules, since private IPs are not allowed in those rules.

I thought Azure network security perimeter would fix this, since you can put your storage account inside the perimeter and specify that only resources from the subscription are allowed to access it. But no, it still doesn't work. Is Microsoft aware of this issue?

I know you can use hybrid workers to get a public IP and so on, but that destroys the power of runbooks if you can't use the serverless option. Thanks for your time!
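To make the scenario concrete, here is a hedged sketch of the kind of serverless runbook access that hits this limitation. It assumes a system-assigned managed identity on the Automation account and uses a hypothetical storage account name.

    # Runs inside an Azure Automation serverless (sandbox) runbook
    Connect-AzAccount -Identity   # authenticate with the Automation account's managed identity
    $ctx = New-AzStorageContext -StorageAccountName "mystorageacct" -UseConnectedAccount
    # With the storage firewall's default action set to Deny, this call is rejected (403 AuthorizationFailure),
    # because the sandbox egress address is a private 10.x.x.x IP that IP network rules cannot match
    Get-AzStorageContainer -Context $ctx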
Azure ADF ServiceNow connector can't retrieve table columns but same login can do in REST API

I have tried to create a pipeline using a copy activity to extract data from a table in our ServiceNow dev platform. I first used the latest version of the ServiceNow connector; however, it didn't work. When I tried to import the schema, it showed the error message below:

Failed to load. The API request to ServiceNow failed. Request Url: https://airtrunkautemp.service-now.com/api/now/table/sys_dictionary?sysparm_query=name%3dfacilities_request^ORname%3dsm_order^ORname%3dtask, Status Code: Forbidden, Error message: {"error":{"message":"Insufficient rights to query records","detail":"Field(s) present in the query do not have permission to be read"},"status":"failure"} Activity ID: 5a99e871-893d-4426-809e-0b22654248f8

I then tried the legacy version of the ServiceNow connector and extracted the full table data using a query. After I executed the pipeline, only one column, sys_id, was returned.

I contacted ServiceNow support about the issue; they checked and got back to me that the access login has no issue. I then wrote Python to use the REST API to retrieve data from the same table, and it works - I could extract all table columns without the insufficient rights issue.

Does anyone have experience with this? How did you solve it?
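One hedged way to narrow this down (instance URL, table name, and credentials below are placeholders) is to call the ServiceNow Table API against sys_dictionary directly, since that is the same query the connector issues when importing the schema, as shown in the error above. This sketch uses PowerShell rather than the Python the author used, but the request is equivalent.

    # Build a basic-auth header for the same integration account used by the ADF linked service
    $user = "<integration-user>"; $pass = "<password>"
    $headers = @{ Authorization = "Basic " + [Convert]::ToBase64String([Text.Encoding]::UTF8.GetBytes("$user`:$pass")) }

    # The connector reads column metadata from sys_dictionary; query it directly to test read access
    $uri = "https://<instance>.service-now.com/api/now/table/sys_dictionary?sysparm_query=name=facilities_request&sysparm_limit=10"
    Invoke-RestMethod -Uri $uri -Headers $headers -Method Get

If this call returns the same "Insufficient rights to query records" error, the account can read the business table but lacks read ACLs on sys_dictionary, which would explain why schema import (and column discovery in the legacy connector) fails while plain row reads over REST succeed.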
Unable to process AAS model connecting to Azure SQL with Service Account

Hello,

I have built a demo SSAS model that I am hosting on an Azure Analysis Services server. The model connects to an Azure SQL database in my tenant (the database is the default AdventureWorks provided by Azure when creating your first DB). To connect to Azure SQL, I have created an App (service principal) and granted it reader access to my Azure SQL DB. If I log in to the Azure SQL DB from SSMS with this account, using Microsoft Entra Service Principal authentication and providing ClientId@TenantID as the username and the secret value as the password, I am able to log in and SELECT from the tables.

However, when I try to process the SSAS model, I get an error. For reference, below is the TMSL script that sets the DataSource part of the model after deployment via YAML pipelines (variables are replaced when running). I think the issue lies in the "AuthenticationKind" value I have provided in the credential, but I can't figure out what to use.

When I create the data source like this and process, I get the error: Failed to save modifications to the server. Error returned: '<ccon>Windows authentication has been disabled in the current context.</ccon>'. I don't understand why, since I am not using the Windows authentication kind. Every other keyword I used in the "AuthenticationKind" part returns the error "AuthenticationKind not supported". Any help on how to change this script would be useful.

    {
      "createOrReplace": {
        "object": {
          "database": "$(AAS_DATABASE)",
          "dataSource": "$(AZSQLDataSourceName)"
        },
        "dataSource": {
          "type": "structured",
          "name": "$(AZSQLDataSourceName)",
          "connectionDetails": {
            "protocol": "tds",
            "address": {
              "server": "$(AZSQLServer)"
            },
            "initialCatalog": "$(AZSQLDatabase)"
          },
          "credential": {
            "AuthenticationKind": "ServiceAccount",
            "username": "$(AZSQL_CLIENT_ID)@$(AZSQL_TENANT_ID)",
            "password": "$(AZSQL_CLIENT_SECRET)"
          }
        }
      }
    }