Azure Blob Storage
How to Automate Cross-OS File Fixes with Azure Automation and PowerShell
Build a serverless file fixer in Azure using Automation, PowerShell, Blob Storage, and Event Grid. Learn how to set up the necessary resources, configure permissions, and automatically detect and correct cross-OS file issues—such as CRLF vs LF line endings and file permission mismatches. This streamlined approach saves time and eliminates manual fixes, ensuring smoother, error-free workflows for developers working across different operating systems.
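To make the kind of fix the post automates concrete, here is a small C# sketch that normalizes CRLF line endings to LF for a single blob. The container and blob names are placeholders, and the original article performs this step with a PowerShell runbook triggered by Event Grid rather than with C#.

```csharp
using Azure.Storage.Blobs;

// Placeholder names: the article itself does this in a PowerShell runbook.
var blob = new BlobClient(
    "<storage-connection-string>",
    "incoming-files",
    "build-script.sh");

// Download the blob text, convert Windows (CRLF) line endings to Unix (LF), and re-upload.
var downloaded = await blob.DownloadContentAsync();
var text = downloaded.Value.Content.ToString();
var normalized = text.Replace("\r\n", "\n");

await blob.UploadAsync(BinaryData.FromString(normalized), overwrite: true);
```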
[pt1] Choosing the right Data Storage Source (Generally available) for Azure AI Search

When integrating Azure AI Search into your solutions, choosing the right data storage and data sources is crucial for efficient and scalable indexing. This blog dives into three primary data source connectors for Azure AI Search: Azure Blob Storage, Azure Cosmos DB for NoSQL, and Azure SQL Database. Each data source type offers distinct advantages and use cases depending on the structure of your data and the desired search functionality.
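As a concrete example of the Blob Storage option, the sketch below registers a blob container as an Azure AI Search data source connection using the Azure.Search.Documents SDK. The endpoint, admin key, and resource names are placeholders; the post itself compares this connector with the Cosmos DB for NoSQL and Azure SQL options.

```csharp
using Azure;
using Azure.Search.Documents.Indexes;
using Azure.Search.Documents.Indexes.Models;

// Placeholder endpoint, key, and resource names.
var indexerClient = new SearchIndexerClient(
    new Uri("https://<your-search-service>.search.windows.net"),
    new AzureKeyCredential("<admin-api-key>"));

// Register a blob container as a data source connection (name, type, connection string, container).
var dataSource = new SearchIndexerDataSourceConnection(
    "products-blob-datasource",
    SearchIndexerDataSourceType.AzureBlob,
    "<storage-account-connection-string>",
    new SearchIndexerDataContainer("product-docs"));

await indexerClient.CreateOrUpdateDataSourceConnectionAsync(dataSource);
```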
Seamless File Management in ASP.NET Core: Azure Blob Storage with Configurable Local Mode

Setting up the Project

Before we dive into the implementation, let’s set up the basic structure:

1. Create an ASP.NET Core Web API project.
2. Add the Azure.Storage.Blobs NuGet package for Azure Blob Storage integration.
3. Update appsettings.json to include the storage configuration:

```json
"Storage": {
  "Mode": "Local",
  "BlobStorage": {
    "ConnectionString": "YourAzureBlobStorageConnectionString",
    "ContainerName": "YourContainerName"
  }
}
```

- Mode: determines whether to use local storage or Azure Blob Storage.
- ConnectionString and ContainerName: the Azure Blob Storage details.

Implementing the Storage Service

Here is a custom StorageService that handles file uploads, deletions, and downloads. It decides at runtime where to store files based on the configured mode.

```csharp
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

public class StorageService : IStorageService
{
    private readonly IConfiguration _configuration;
    private readonly IHttpContextAccessor _httpContextAccessor;
    private readonly string _storageMode;

    public StorageService(IConfiguration configuration, IHttpContextAccessor httpContextAccessor)
    {
        _configuration = configuration;
        _httpContextAccessor = httpContextAccessor;
        _storageMode = _configuration["Storage:Mode"];
    }

    public async Task<string> UploadFile(IFormFile file)
    {
        if (_storageMode == "Local")
        {
            // Save the file under wwwroot/uploads and return an absolute URL to it.
            var uploads = Path.Combine(Directory.GetCurrentDirectory(), "wwwroot", "uploads");
            var fileName = $"{DateTime.Now:MMddss}-{file.FileName}";
            var filePath = Path.Combine(uploads, fileName);

            if (!Directory.Exists(uploads))
                Directory.CreateDirectory(uploads);

            using (var fileStream = new FileStream(filePath, FileMode.Create))
            {
                await file.CopyToAsync(fileStream);
            }

            var host = string.Format("{0}{1}",
                _httpContextAccessor.HttpContext!.Request.IsHttps ? "https://" : "http://",
                _httpContextAccessor.HttpContext.Request.Host);

            return $"{host}/uploads/{fileName}";
        }
        else
        {
            // Upload to the configured blob container and return the blob URI.
            var container = new BlobContainerClient(
                _configuration["Storage:BlobStorage:ConnectionString"],
                _configuration["Storage:BlobStorage:ContainerName"]);
            await container.CreateIfNotExistsAsync(PublicAccessType.Blob);

            var blob = container.GetBlobClient(file.FileName);
            using (var fileStream = file.OpenReadStream())
            {
                await blob.UploadAsync(fileStream, new BlobHttpHeaders { ContentType = file.ContentType });
            }
            return blob.Uri.ToString();
        }
    }

    public async Task<bool> DeleteFile(string fileName)
    {
        try
        {
            if (_storageMode == "Local")
            {
                var uploads = Path.Combine(Directory.GetCurrentDirectory(), "wwwroot", "uploads");
                var filePath = Path.Combine(uploads, fileName);
                if (File.Exists(filePath))
                {
                    File.Delete(filePath);
                    return true;
                }
            }
            else
            {
                var container = new BlobContainerClient(
                    _configuration["Storage:BlobStorage:ConnectionString"],
                    _configuration["Storage:BlobStorage:ContainerName"]);
                var blob = container.GetBlobClient(fileName);
                await blob.DeleteIfExistsAsync();
                return true;
            }
            return false; // Local mode and the file was not found.
        }
        catch
        {
            return false;
        }
    }

    public byte[] DownloadFile(string fileName)
    {
        // Downloads are served from Blob Storage regardless of the configured mode.
        var blobServiceClient = new BlobServiceClient(_configuration["Storage:BlobStorage:ConnectionString"]);
        var container = blobServiceClient.GetBlobContainerClient(_configuration["Storage:BlobStorage:ContainerName"]);
        var blobClient = container.GetBlobClient(fileName);

        using (var memoryStream = new MemoryStream())
        {
            blobClient.DownloadTo(memoryStream);
            return memoryStream.ToArray();
        }
    }
}
```

Static File Middleware

Add app.UseStaticFiles(); in Program.cs so that files saved locally under wwwroot/uploads can be served over HTTP.
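The post does not show how the service is wired up, so here is a minimal Program.cs sketch, assuming StorageService is registered behind an IStorageService interface; the interface name and the scoped lifetime are assumptions rather than part of the original article.

```csharp
var builder = WebApplication.CreateBuilder(args);

builder.Services.AddControllers();
builder.Services.AddHttpContextAccessor();                     // StorageService depends on IHttpContextAccessor
builder.Services.AddScoped<IStorageService, StorageService>(); // assumed interface name and lifetime

var app = builder.Build();

app.UseStaticFiles();   // serves files written to wwwroot/uploads in Local mode
app.MapControllers();
app.Run();
```

Without the static file middleware, the URLs returned in Local mode would return 404 even though the files exist on disk.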
Usage Example

Here’s how to use the UploadFile method in a Web API:

```csharp
public async Task<ServiceResponse<List<ProductPicture>>> UploadImages(List<IFormFile> images)
{
    ServiceResponse<List<ProductPicture>> response = new();
    try
    {
        List<ProductPicture> pictures = new();
        foreach (var file in images)
        {
            var imageUrl = await _storageService.UploadFile(file);
            pictures.Add(new ProductPicture
            {
                Url = imageUrl,
                FileName = Path.GetFileName(imageUrl)
            });
        }
        response.Data = pictures;
    }
    catch (Exception ex)
    {
        response.Success = false;
        response.Message = ex.Message;
    }
    return response;
}
```

Conclusion

By integrating Azure Blob Storage and providing a configurable local storage mode, this approach ensures flexibility for both development and production environments. It reduces development friction while leveraging Azure’s scalability when deployed. For further details, explore the Azure Blob Storage documentation and Quickstart: Azure Blob Storage.
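For completeness, a controller endpoint that drives this flow might look like the sketch below. The ProductsController name, the route, and the IProductService interface assumed to expose the UploadImages method above are illustrative assumptions, not code from the original post.

```csharp
using Microsoft.AspNetCore.Mvc;

[ApiController]
[Route("api/[controller]")]
public class ProductsController : ControllerBase
{
    // Hypothetical service interface assumed to expose the UploadImages method shown above.
    private readonly IProductService _productService;

    public ProductsController(IProductService productService)
    {
        _productService = productService;
    }

    // POST api/products/images
    [HttpPost("images")]
    public async Task<ActionResult<ServiceResponse<List<ProductPicture>>>> UploadImages(
        [FromForm] List<IFormFile> images)
    {
        if (images is null || images.Count == 0)
            return BadRequest("No image files were provided.");

        var response = await _productService.UploadImages(images);
        return Ok(response);
    }
}
```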
Using Azure's AI Language Service to Summarize and Extract Themes from Interview Transcripts

Imagine you’re a program evaluator or a qualitative researcher tasked with analyzing hundreds of interview transcripts. Each transcript is filled with valuable information, but the sheer volume and time-consuming nature of the task can be overwhelming. You find yourself buried in a sea of words, desperately seeking a way to extract meaningful insights efficiently. In our latest blog post, we dive into the world of program evaluations and qualitative research. Learn how Azure’s AI/ML tools revolutionize transcript analysis, making it faster, more accurate, and less labor-intensive.
Teach ChatGPT to Answer Questions: Using Azure AI Search & Azure OpenAI (Lang Chain)

In this two-part series, we will explore how to build an intelligent service using Azure. In Series 1, we'll use Azure AI Search to extract keywords from unstructured data stored in Azure Blob Storage. In Series 2, we'll create a feature to answer questions based on PDF documents using Azure OpenAI.
Teach ChatGPT to Answer Questions: Using Azure AI Search & Azure OpenAI (Semantic Kernel)

In this two-part series, we will explore how to build an intelligent service using Azure. In Series 1, we'll use Azure AI Search to extract keywords from unstructured data stored in Azure Blob Storage. In Series 2, we'll create a feature to answer questions based on PDF documents using Azure OpenAI.
Harnessing Retail Data with Azure: Integrating Blob Storage and Databricks for Advanced Analytics

Learn how a retail company leverages Azure Blob Storage and Azure Databricks to store, process, and analyze its massive sales data. You will see how the company uses PySpark to transform data into insights that help it optimize its product strategy and marketing campaigns. You will also find some learning resources to help you get started with data engineering on Microsoft Azure.