storage explorer

Building a Scalable Web Crawling and Indexing Pipeline with Azure Storage and AI Search
In the ever-evolving world of data management, keeping search indexes up to date with dynamic data can be challenging. Traditional approaches, such as manual or scheduled indexing, are resource-intensive, delay-prone, and difficult to scale. An Azure Blob trigger combined with an AI Search indexer offers a solution to these challenges, enabling real-time, scalable, and enriched data indexing. This blog explores how the Blob trigger, integrated with Azure Cognitive Search, transforms the indexing process by automating workflows and enriching data with AI capabilities. It walks through configuring Blob Storage, creating Azure Functions for triggers, and connecting to an AI-powered search index. The approach leverages Azure's event-driven architecture for efficient and cost-effective data management.
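The event-driven flow described above can be sketched in plain Python. This is an illustrative stand-in, not the Azure SDK: `handle_blob_created` plays the role of the blob-trigger function, `SearchIndex` stands in for an Azure AI Search index, and `enrich` stands in for an AI enrichment step — all names here are hypothetical.

```python
class SearchIndex:
    """In-memory stand-in for a search index that supports upserts."""
    def __init__(self):
        self.docs = {}

    def upsert(self, doc_id, document):
        # Merge-or-create semantics, similar in spirit to mergeOrUpload.
        self.docs[doc_id] = document

def enrich(text):
    """Stand-in for an AI enrichment step (e.g., key-phrase extraction)."""
    return {"content": text, "word_count": len(text.split())}

def handle_blob_created(index, blob_name, blob_bytes):
    """Runs once per uploaded blob, mirroring a blob-trigger function."""
    document = enrich(blob_bytes.decode("utf-8"))
    index.upsert(blob_name, document)

index = SearchIndex()
handle_blob_created(index, "pages/home.html", b"Welcome to our site")
print(index.docs["pages/home.html"]["word_count"])  # → 4
```

The point of the shape: the trigger fires per blob, so indexing cost scales with change volume rather than with a full periodic re-crawl.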
Azure Storage Container - Soft Delete Monitoring

Hi All, Can someone let me know if there is a way to export all the soft-deleted items in my container on a daily basis into a CSV (or any file format I could connect Power BI to)? I want to monitor all active and soft-deleted items in a Power BI report I have created, and at the moment I can't find a way to get a list of all the items that have been deleted. Thanks
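One possible approach, sketched below: in the Azure SDK for Python, `ContainerClient.list_blobs(include=["deleted"])` returns soft-deleted blobs alongside active ones (each with a `name` and a `deleted` flag), and the results can be written to a CSV that Power BI ingests. The listing call is shown only in a comment (it needs a real connection string); the runnable part uses sample tuples standing in for those results.

```python
import csv
import io

# In practice the rows would come from the Azure SDK for Python, roughly:
#     container = ContainerClient.from_connection_string(conn_str, "mycontainer")
#     rows = [(b.name, b.deleted) for b in container.list_blobs(include=["deleted"])]
# (verify attribute names against the azure-storage-blob version you use).

def write_blob_report(rows, stream):
    """Write (name, is_deleted) rows as CSV with a header Power BI can ingest."""
    writer = csv.writer(stream)
    writer.writerow(["blob_name", "status"])
    for name, is_deleted in rows:
        writer.writerow([name, "SoftDeleted" if is_deleted else "Active"])

# Sample data standing in for the listing results:
sample = [("reports/jan.csv", False), ("reports/feb.csv", True)]
buf = io.StringIO()
write_blob_report(sample, buf)
print(buf.getvalue())
```

A timer-triggered Azure Function (or any daily scheduler) could run this and drop the CSV into a container Power BI reads from. Note that soft-deleted blobs only appear while they are within the retention period.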
Can an Azure blob with malware infect other blobs in the storage account?

Hi, I need to know whether other blobs or files can be infected if I accidentally upload malware to an Azure storage account, or whether all blobs are isolated from each other. If infection is possible, is it limited to the blobs/files within the same container, or can it spread to the entire storage account?
Read contents of multiple blobs at once using a single API call

I have a list of multiple blob names whose contents I need to read at once. From the documentation, I could only find an API that reads one blob at a time. This approach takes a lot of time and is an expensive operation, as it involves multiple REST API calls. Is there an API that lets me pass the list of blob names and access their contents in one call? Thanks Satya
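To my knowledge there is no single download call that returns several blobs' contents (the Blob Batch API covers operations like delete and set-tier, not reads), so a common workaround is to issue the per-blob reads in parallel. In the sketch below, `fetch_blob` is a stand-in for the real SDK call — roughly `container.download_blob(name).readall()` in azure-storage-blob — and the demo uses an in-memory dict so the concurrency logic is runnable on its own.

```python
from concurrent.futures import ThreadPoolExecutor

def read_blobs(fetch_blob, names, max_workers=8):
    """Download many blobs concurrently; returns {name: content}."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        contents = pool.map(fetch_blob, names)
        return dict(zip(names, contents))

# Demo with an in-memory stand-in for the storage account:
store = {"a.txt": b"alpha", "b.txt": b"beta"}
result = read_blobs(store.get, ["a.txt", "b.txt"])
print(result["b.txt"])  # → b'beta'
```

Parallelism hides per-request latency but does not reduce the number of billed REST operations; if the blobs are small and always read together, packing them into a single archive blob may be the cheaper design.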
Error while trying to write in Azure Tables ('' is not a valid value for a partition key or row key)

Hi everyone, I am a beginner in Azure. I'm trying to add a row to Azure Table storage using Python. I followed the simple example in the Microsoft documentation: https://docs.microsoft.com/en-us/azure/azure-functions/functions-bindings-storage-table-output?tabs=in-process%2Cstorage-extension&pivots=programming-language-python#example However, I get this error: "Microsoft.Azure.WebJobs.Extensions.Tables: '' is not a valid value for a partition key or row key", and I don't see the table created in my Microsoft Azure Storage Explorer. I already have a working Azure Function with a queueTrigger input binding, so I added a Table output binding to the configuration file "function.json":

    {
      "scriptFile": "__init__.py",
      "bindings": [
        {
          "name": "dataSetId",
          "type": "queueTrigger",
          "direction": "in",
          "queueName": "local-toprep",
          "connection": "AzureWebJobsStorage"
        },
        {
          "type": "table",
          "direction": "out",
          "name": "message",
          "tableName": "messages",
          "partitionKey": "message",
          "connection": "AzureWebJobsStorage"
        }
      ]
    }

In my main function, I added the table binding "message" to the arguments and added these lines:

    rowKey = str(uuid.uuid4())
    data = {
        "Name": "Output binding message",
        "PartitionKey": "message",
        "RowKey": rowKey
    }
    message.set(json.dumps(data))

Thank you in advance for your help 🙂Solved
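The error message quoted above is what the Tables extension raises when the entity it receives has an empty or missing PartitionKey/RowKey. A small pre-flight check before calling `message.set(...)` can show exactly which key is the problem; the `validate_entity` helper below is a debugging aid of my own, not part of the Functions API.

```python
import json
import uuid

def validate_entity(data):
    """Raise ValueError if PartitionKey or RowKey is missing or empty."""
    for key in ("PartitionKey", "RowKey"):
        if not data.get(key):
            raise ValueError(
                f"{key!r} must be a non-empty string, got {data.get(key)!r}"
            )
    return data

entity = {
    "Name": "Output binding message",
    "PartitionKey": "message",
    "RowKey": str(uuid.uuid4()),
}
# If this passes, the payload handed to message.set(...) carries both keys:
payload = json.dumps(validate_entity(entity))
print("valid entity" if payload else "empty")  # → valid entity
```

If the check passes yet the error persists, the keys are likely being lost between the Python payload and the binding (for example, a mismatch between the JSON handed to `set()` and what the extension deserializes), which narrows where to look.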
Data Migration from OpenText Content Server to Azure Blob Storage or other services

Hi all, We want to migrate data (~40 terabytes) from OpenText Content Server to Azure Blob Storage (or other similar/better services, if any). About 80% of the data (closed projects) will be archived, and the other 20% (active projects) will be moved to SharePoint Online so business users can access it. Another business requirement is to be able to search the data in Azure Blob Storage effectively and efficiently via Azure Cognitive Services or other similar/better services. I'd greatly appreciate any help, guidance, or lessons learnt from anyone who has done a similar migration from OpenText Content Server to Azure Blob Storage. It would be great to hear what techniques were used, how successful the migration was, and what reconciliation methods (e.g. size and number of files/folders) were used to ensure all the data was migrated successfully. Thanks for your time and effort in advance.
Virtual Machine OS Disk Size Reduction

Hi, I am looking to create a Windows 10 Pro VM with an OS disk size of 32 GB. Unfortunately, I couldn't find any workaround except using a PowerShell script. Is there an alternative way, please? I am a newbie, so a simpler method would be highly appreciated! Thanks