blob storage
Microsoft Data + AI Hackathon held in partnership with the University of Oxford and Observable HQ
How have major cities like London and New York been affected by Covid-19, and how can we use historical data to explore how these effects might play out in a post-Covid world? This was the question posed at a Microsoft Data + AI Hackathon held in partnership with the University of Oxford and Observable HQ at the New York City Reactor Meetup space.
Azure Function with Blob Output Binding returning 404 on GetProperties check before writing the Blob

Hi. This question is similar: https://stackoverflow.com/questions/64546302/how-to-disable-blob-existence-check-in-azure-function-output-binding but I'm wondering whether there are other, more recent answers or comments out there. I have an Azure Function with an HTTP trigger input binding and a Blob Storage output binding. On every execution, the output binding appears to fetch the blob's properties first, resulting in a 404. Quite rightly, as the data is being written to a new blob - but that means the check will always fail, and in this case it is redundant. It takes time to go through these steps - admittedly milliseconds, but still. Presumably these failures are also being logged somewhere, which is a storage cost - perhaps negligible now, but not something to ignore - and I'm not even sure where that logging would be stored so that I can go and manage it. The positive is that the overall function execution is fine, but it's still recording all these failures, and we're putting tens of thousands of executions through it a day. Is there a way to keep the concise output binding code but skip this prior if-exists/get-properties check? My options seem to be to live with it, or to rewrite using BlobContainerClient, BlobClient and so on instead of the Blob attribute output binding. Anyone got some clever ideas?
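One way to avoid the check (not from the thread itself) is exactly the rewrite the asker mentions: drop the declarative output binding and write the blob directly with the storage SDK, which issues a single upload request with no prior existence probe. Below is a minimal sketch of that approach in the Python programming model; the route, container name and default blob name are illustrative assumptions, not from the original question:

```python
import os

import azure.functions as func
from azure.storage.blob import BlobClient

app = func.FunctionApp()

@app.route(route="upload", auth_level=func.AuthLevel.FUNCTION)
def upload(req: func.HttpRequest) -> func.HttpResponse:
    # Write the blob directly: upload_blob(..., overwrite=True) issues a
    # single Put Blob request, with no GetProperties/exists check first.
    blob = BlobClient.from_connection_string(
        conn_str=os.environ["AzureWebJobsStorage"],
        container_name="uploads",  # hypothetical container name
        blob_name=req.params.get("name", "payload.json"),
    )
    blob.upload_blob(req.get_body(), overwrite=True)
    return func.HttpResponse(status_code=201)
```

The trade-off is a few extra lines of client setup in exchange for full control over exactly which storage requests the function makes.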
Lesson Learned #25: Export/Import Azure SQL Database using Azure File Service?

First published on MSDN on Apr 03, 2017. In some situations we need to import or export an Azure SQL Database using SqlPackage, but unfortunately neither the source nor the destination file can be specified as a blob storage location, even when that is where we want to keep the file.
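The workaround the lesson's title points at is Azure File Service: mount the file share as a drive on the machine running SqlPackage, then aim the export at a path on that drive. A minimal sketch, assuming the share is already mounted as Z: and using placeholder server, database and account names:

```python
import subprocess

# Assumes the Azure file share was mounted beforehand, e.g. on Windows:
#   net use Z: \\mystorageacct.file.core.windows.net\backups /u:AZURE\mystorageacct <storage-key>
subprocess.run(
    [
        "SqlPackage",
        "/Action:Export",
        "/SourceConnectionString:Server=tcp:myserver.database.windows.net,1433;"
        "Database=mydb;User ID=sqladmin;Password=<password>;Encrypt=True;",
        r"/TargetFile:Z:\exports\mydb.bacpac",  # lands directly on the file share
    ],
    check=True,
)
```

The same pattern works in reverse for an import, with /Action:Import and /SourceFile pointing at the mounted share.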
Azure function not launched

Scenario:
- a user connects to a web app
- they upload an Excel file containing a list of videos to analyse
- the web app transfers this Excel file into an Azure blob
- the transfer is supposed to launch an Azure Function, which orchestrates the analysis via Azure Batch

Issue: something in the pipeline is malfunctioning: nothing happens when the file is transferred into the blob. Can someone help me? Thank you a lot
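One thing worth checking first (a sketch under assumed names, since the original function isn't shown) is the trigger wiring: a blob trigger only fires if its path matches the container the web app actually writes to, and if its connection setting names an app setting pointing at that same storage account. In the Python programming model the trigger looks like this:

```python
import logging

import azure.functions as func

app = func.FunctionApp()

# "uploads" is a hypothetical container name: it must match the container
# the web app writes to, and "AzureWebJobsStorage" must be an app setting
# whose connection string points at that same storage account.
@app.blob_trigger(arg_name="blob", path="uploads/{name}",
                  connection="AzureWebJobsStorage")
def on_upload(blob: func.InputStream):
    logging.info("New blob: %s (%d bytes)", blob.name, blob.length)
    # Submit the Azure Batch analysis job here (job submission omitted).
```

Also note that the classic blob trigger works by polling and can lag by several minutes on a Consumption plan, which can look like "nothing happens"; an Event Grid-based trigger is the usual fix when the function must react promptly.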
How to save Azure Vision AI Devkit stream to Blob storage?

The WebStreamModule streams the camera output, but I need to save that footage to blob storage when the camera detects motion, so I can then run some computer vision AI models over it. Any idea how I could implement this? Many thanks, Raul
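The upload half of this is straightforward once you can capture a clip from the device; a minimal sketch, assuming a motion-detection callback that hands you the clip as bytes (the connection string, container name and callback are all hypothetical):

```python
from datetime import datetime, timezone

from azure.storage.blob import BlobServiceClient

service = BlobServiceClient.from_connection_string("<storage-connection-string>")
container = service.get_container_client("motion-clips")  # hypothetical container

def on_motion_detected(clip_bytes: bytes) -> None:
    # Timestamped names keep successive clips from colliding.
    name = datetime.now(timezone.utc).strftime("clip-%Y%m%dT%H%M%SZ.mp4")
    container.upload_blob(name, clip_bytes)
```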
Access Firewall Protected (Select Vnet) Azure Storage from Azure SQL Database

I have a firewall-protected storage account and an Azure SQL database, both in the same tenant/subscription/resource group. I am unable to access the blob from Azure SQL (bulk import). I have tried the "Resource instances" feature, but it's not working. Can anyone guide me on how to solve this?
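No answer appears in the thread, but the documented pattern for bulk import from a firewalled account is a database-scoped credential backed by the logical server's managed identity, which must also be granted the Storage Blob Data Reader role on the storage account. A sketch of that setup driven from Python; every name below is a placeholder:

```python
import pyodbc

# Placeholder connection string; adjust driver, server and credentials.
conn = pyodbc.connect(
    "Driver={ODBC Driver 18 for SQL Server};"
    "Server=tcp:myserver.database.windows.net,1433;Database=mydb;"
    "Uid=sqladmin;Pwd=<password>;Encrypt=yes;",
    autocommit=True,
)

# A database master key may need to exist before creating the credential.
conn.execute(
    "CREATE DATABASE SCOPED CREDENTIAL msi_cred WITH IDENTITY = 'Managed Identity';")
conn.execute("""
    CREATE EXTERNAL DATA SOURCE firewalled_blob WITH (
        TYPE = BLOB_STORAGE,
        LOCATION = 'https://mystorageacct.blob.core.windows.net/imports',
        CREDENTIAL = msi_cred);
""")
conn.execute("""
    BULK INSERT dbo.MyTable FROM 'data.csv'
    WITH (DATA_SOURCE = 'firewalled_blob', FORMAT = 'CSV', FIRSTROW = 2);
""")
```

If the resource instance rule still doesn't take effect, check that it targets the logical server (Microsoft.Sql/servers) rather than the database, and that the server actually has a system-assigned managed identity enabled.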
Transforming Enterprise AKS: Multi-Tenancy at Scale with Agentic AI and Semantic Kernel

In this post, I'll show how you can deploy an AI agent on Azure Kubernetes Service (AKS) using a multi-tenant approach that maximizes both security and cost efficiency. By isolating each tenant's agent instance within the cluster and ensuring that every agent has access only to its designated Azure Blob Storage container, cross-tenant data leakage risks are eliminated. This model allows you to allocate compute and storage resources per tenant, optimizing usage and spending while maintaining strong data segregation and operational flexibility - key requirements for scalable, enterprise-grade AI applications.
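The storage-isolation half of that model reduces to making sure a tenant's agent can only ever construct a client for its own container. A minimal sketch of that scoping (the account URL and naming convention are assumptions; the hard boundary comes from giving each tenant's pod its own workload identity whose RBAC is scoped to a single container):

```python
from azure.identity import DefaultAzureCredential
from azure.storage.blob import ContainerClient

def container_for_tenant(tenant_id: str) -> ContainerClient:
    # Each tenant's agent pod runs under its own workload identity, which
    # only holds data-plane RBAC on the matching "tenant-<id>" container,
    # so even a misbehaving agent cannot reach another tenant's blobs.
    return ContainerClient(
        account_url="https://mystorageacct.blob.core.windows.net",
        container_name=f"tenant-{tenant_id}",
        credential=DefaultAzureCredential(),
    )
```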