Latest Discussions
Azure NetApp Files | Azure CLI command to suspend/resume backup at volume level
I'm looking for the Azure CLI equivalent of the following action: suspend a backup policy for a specific volume. I do see a policy-level command, az netappfiles account backup-policy update, which has the following parameter:

--enabled    The property to decide whether the policy is enabled or not. Accepted values: 0, 1, f, false, n, no, t, true, y, yes

But this operates at the NetApp account > policy level. I'm unable to find an Azure CLI command that does this for a specific volume, i.e. the Policy State (Suspend/Resume) setting in the volume's Configure Backups dialog. Is there a CLI command for this action at the volume level? If we have to perform this step programmatically, how should we achieve it?

DurgalakshmiSivadas, Dec 10, 2024
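For reference, a rough sketch of both levels as shell commands. The policy-level command and its --enabled flag are the ones quoted above; the volume-level flag (--backup-enabled) and the exact spellings of the resource-name parameters are assumptions about current az netappfiles releases, so verify them with --help before scripting against them. All resource names are placeholders.

```bash
# Policy level (the command quoted above): suspends the policy for every volume that uses it.
az netappfiles account backup-policy update \
  --resource-group myRG \
  --account-name myNetAppAccount \
  --backup-policy-name myBackupPolicy \
  --enabled false

# Volume level (assumption): newer CLI versions expose backup settings on the volume itself.
# Confirm the flag names with: az netappfiles volume update --help
az netappfiles volume update \
  --resource-group myRG \
  --account-name myNetAppAccount \
  --pool-name myCapacityPool \
  --name myVolume \
  --backup-enabled false
```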
Why is element in the queue deleted even if the function throws an exception?
I have written an Azure Function with a queue trigger, and I want to send the data to a backend service if that service is available; if it is not available, the element should remain in the queue. My question is: how can I achieve this? My code and host.json look like this:

```csharp
[Function("QueueCancellations")]
public async Task<IActionResult> QueueCancellation(
    [QueueTrigger("requests", Connection = "ConnectionStrings:QUEUE_CONNECTION_STRING")] string message)
{
    try
    {
        using (var httpClient = new HttpClient())
        {
            var content = new StringContent(message, Encoding.UTF8, "application/json");
            var httpResponse = await httpClient.PostAsync(_configuration["LOCAL_SERVICE_URL_CANCELL"], content);

            if (httpResponse.IsSuccessStatusCode)
            {
                return new OkObjectResult("Data sent to backend");
            }
            else
            {
                return new BadRequestObjectResult("Backend not available");
            }
        }
    }
    catch (Exception ex)
    {
        _logger.LogError(ex.Message);
        return new BadRequestObjectResult("Backend not available");
    }
}
```

```json
{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": false,
        "excludedTypes": "Request"
      },
      "enableLiveMetricsFilters": true
    }
  },
  "logLevel": {
    "default": "Information",
    "Host.Results": "Information",
    "functions": "Information",
    "Host.Aggregator": "Information"
  },
  "extensions": {
    "queues": {
      "maxPollingInterval": "00:00:02",
      "visibilityTimeout": "00:00:30",
      "batchSize": 16,
      "maxDequeueCount": 5,
      "newBatchThreshold": 8,
      "messageEncoding": "base64"
    }
  }
}
```

akin_k, Nov 04, 2024
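One reading of the behaviour described above: a queue-triggered function only leaves the message on the queue when the invocation fails, and catching the exception and returning a BadRequestObjectResult counts as a successful invocation, so the runtime deletes the message. A minimal sketch, assuming the isolated worker model and the same configuration keys as the post, that lets the failure propagate instead so the retry and poison-queue settings in host.json take effect:

```csharp
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.Configuration;

public class QueueCancellationFunction
{
    private readonly IConfiguration _configuration;

    public QueueCancellationFunction(IConfiguration configuration) => _configuration = configuration;

    [Function("QueueCancellations")]
    public async Task QueueCancellation(
        [QueueTrigger("requests", Connection = "ConnectionStrings:QUEUE_CONNECTION_STRING")] string message)
    {
        using var httpClient = new HttpClient();
        var content = new StringContent(message, Encoding.UTF8, "application/json");

        var httpResponse = await httpClient.PostAsync(_configuration["LOCAL_SERVICE_URL_CANCELL"], content);

        // Throwing (instead of catching and returning a result) marks the invocation as
        // failed, so the message becomes visible on the queue again after visibilityTimeout
        // and is retried up to maxDequeueCount times before being moved to the poison queue.
        httpResponse.EnsureSuccessStatusCode();
    }
}
```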
Advise on upload processing
Hello, I am here to seek some advice regarding the architecture of an application written in .NET. Part of the application lets tenants upload photos into their account (blob storage), either via the web UI or by connecting an uploader application such as Azure Storage Explorer. Each tenant basically has a kind of image inbox, and the uploaded images need to be processed afterwards. It is worth mentioning that we use Aspire and Azurite during development.

My issue is the processing of those images. At first I used Azure Functions triggered by storage blob events. That worked but was rather slow, and once we adopted Aspire it was no longer possible to use Azure Functions at all. At that point we removed Azure Functions and created our own "Worker" application, which polled Azure Blob Storage and, when it found images, started a sub-worker per tenant to process the files. This was much faster and more reliable, but after deploying to production we quickly found that costs skyrocketed because of the polling. I then looked into the newer change feed approach (GetChangeFeedClient), which seems to be a less expensive way to be informed about new data in blob storage, only to find that it is not supported by Azurite, so we cannot even run it locally. At this point I do not understand why Azurite does not support all of the Azure features; how can anyone develop without a simulator? Anyway, this seems to be a dead end. I also considered Azure Event Grid, but that is not supported on Azurite either.

I am not sure what to do. Should I wait for .NET 9 and go back to Azure Functions, since they will then be supported, and live with the slow processing and overhead? Or should I ditch Azure Blob Storage and find another storage solution? Any advice is appreciated.

andreas1415, Nov 03, 2024
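For what it's worth, a minimal sketch of the change feed polling the post refers to, using BlobChangeFeedClient from the Azure.Storage.Blobs.ChangeFeed package. The environment variable name is a placeholder and, as noted in the post, this only runs against a real storage account because Azurite does not emulate the change feed:

```csharp
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.ChangeFeed;

// Placeholder: a real storage account connection string (Azurite does not emulate the change feed).
string connectionString = Environment.GetEnvironmentVariable("STORAGE_CONNECTION_STRING")!;

var serviceClient = new BlobServiceClient(connectionString);
BlobChangeFeedClient changeFeed = serviceClient.GetChangeFeedClient();

await foreach (var page in changeFeed.GetChangesAsync().AsPages())
{
    foreach (BlobChangeFeedEvent changeFeedEvent in page.Values)
    {
        if (changeFeedEvent.EventType == BlobChangeFeedEventType.BlobCreated)
        {
            // Subject looks like "/blobServices/default/containers/<container>/blobs/<name>"
            Console.WriteLine($"New blob: {changeFeedEvent.Subject}");
        }
    }

    // Persist page.ContinuationToken between runs and pass it back into
    // GetChangesAsync(continuationToken) to resume instead of re-reading the whole feed.
    Console.WriteLine($"Cursor after this page: {page.ContinuationToken}");
}
```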
disk size on azure wrong size in ubuntu
I created an Azure VM running Ubuntu and added permanent storage of 64 GB:

Size: 64 GiB
Storage type: Premium SSD LRS
IOPS: 240
Throughput (MBps): 50
Disk tier: P6

However, from within Linux it shows as 8 GB:

```
$ df -h /dev/sdb1
Filesystem      Size  Used Avail Use% Mounted on
/dev/sdb1       7.8G   28K  7.4G   1% /mnt
```

Any ideas what is wrong? Thanks, Peter

wpyung, Oct 24, 2024
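A sketch of the usual checks, under the assumption that /dev/sdb1 mounted at /mnt is the VM's temporary resource disk and the 64 GiB data disk appears under another device name (often /dev/sdc, but device names vary, so confirm with lsblk before touching anything):

```bash
# List all block devices with size, type and mount point; look for the 64G entry.
lsblk -o NAME,SIZE,TYPE,MOUNTPOINT

# If the data disk (assumed /dev/sdc here) is brand new and unformatted:
sudo mkfs.ext4 /dev/sdc            # creates a filesystem - destroys any existing data
sudo mkdir -p /datadrive
sudo mount /dev/sdc /datadrive     # mount it somewhere other than /mnt

# If the disk already has a partition that was later resized in Azure:
sudo growpart /dev/sdc 1           # grow partition 1 to fill the disk (cloud-guest-utils)
sudo resize2fs /dev/sdc1           # then grow the ext4 filesystem to match
```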
Azure Premium disk host caching
I have a Zabbix application that writes to and reads from a PostgreSQL database. The PostgreSQL DB is installed on an Azure VM with a Premium SSD data disk. Should I change the host caching on the data disk to Read/Write, and will doing so cause any issues? Currently it is set to read-only.

emiljj, Oct 14, 2024
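For reference, host caching on an attached data disk can also be inspected and changed from the CLI; a sketch using the generic --set form of az vm update, where the resource names and the disk index are placeholders:

```bash
# Show the current caching mode of each data disk on the VM.
az vm show -g myRG -n myVm \
  --query "storageProfile.dataDisks[].{lun:lun, name:name, caching:caching}" -o table

# Switch the first data disk to ReadWrite host caching (placeholder names and index).
az vm update -g myRG -n myVm --set storageProfile.dataDisks[0].caching=ReadWrite
```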
Azure File Share : Unable to mount Azure file share when executing docker-compose
I am trying to set up a container service locally that mounts an Azure file share as its extended storage. However, I am encountering an error despite verifying the keys and storage name. Here is the error message I am receiving:

```
[error] Error response from daemon: failed to populate volume: error while mounting volume
'/mnt/cloudstor/ipfs-docker_ipfs-azure': VolumeDriver.Mount: mount failed: exit status 32
output="mount error(13): Permission denied
Refer to the mount.cifs(8) manual page (e.g. man mount.cifs)
mount error(13): Permission denied
Refer to the mount.cifs(8) manual page (e.g. man mount.cifs)
mount: mounting //fileshare.file.core.windows.net/ipfs-share on /mnt/cloudstor/ipfs-docker_ipfs-azure failed: Permission denied
```

jamesterzzzzzz, Oct 14, 2024
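One common way to express this kind of mount without the cloudstor plugin is a CIFS volume on the built-in local driver; a minimal sketch for the compose file, with the storage account name and key as placeholders (the mount options shown are typical starting points, not values verified against this particular setup):

```yaml
volumes:
  ipfs-azure:
    driver: local
    driver_opts:
      type: cifs
      device: "//<storage-account>.file.core.windows.net/ipfs-share"
      # Azure Files authenticates with the storage account name as the user
      # and one of the account keys as the password.
      o: "vers=3.0,username=<storage-account>,password=<storage-account-key>,dir_mode=0777,file_mode=0777,serverino,nosharesock"
```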
Azure NetApp Files - Error creating the volume
I'm encountering an issue while trying to create a new volume on my NetApp system, which is integrated with Azure AD (Entra ID). I don't have a traditional on-premises Active Directory or domain controller and am relying entirely on Azure AD. When I attempt to create the volume, I receive the following error:

```
Error when creating - Failed to create the Active Directory machine account "NETAPP-B213".
Reason: LDAP Error: Local error occurred
Details: Error: Machine account creation procedure failed
  [  76] Loaded the preliminary configuration.
  [  79] Successfully connected to ip 10.0.8.4, port 88 using TCP
  [ 111] Successfully connected to ip 10.0.8.4, port 389 using TCP
  [ 111] Entry for host-address: 10.0.8.4 not found in the current source: FILES.
         Ignoring and trying next available source
  [ 122] Successfully connected to ip 10.0.8.4, port 88 using TCP
  [ 129] FAILURE: Unable to SASL bind to LDAP server using GSSAPI: Local error
  [ 132] Unable to connect to LDAP (Active Directory) service on evri3ba830eo2hg.migramer.com (Error: Local error)
  [ 132] Unable to make a connection (LDAP (Active Directory):MIGRAMER.COM),
         Result: RESULT_ERROR_LDAPSERVER_LOCAL_ERROR. (Code: ErrorFromNFSaaSErrorState)
```

Has anyone encountered this issue, or does anyone have insights on resolving this LDAP error with Azure AD and NetApp? Any assistance would be greatly appreciated!

Thanks, Mauricio

mauricio1480, Oct 13, 2024
Generating a SAS Token for Azure Blob Storage in C# using APIs
Hello Community, I am trying to generate a SAS token for a Blob Storage account using C#. However, I'm encountering an error when trying to authenticate the request:

```xml
<Error>
  <Code>AuthenticationFailed</Code>
  <Message>Server failed to authenticate the request. Make sure the value of Authorization header is formed correctly including the signature.
RequestId:0ead0723-901e-002c-7b88-037068000000
Time:2024-09-10T13:52:02.2658050Z</Message>
  <AuthenticationErrorDetail>Signature fields not well formed.</AuthenticationErrorDetail>
</Error>
```

To clarify, I am trying to generate the SAS token without using the NuGet package "Azure.Storage.Blobs". I managed to generate the SAS token successfully using the library, but now I want to achieve the same result purely with APIs. Below is the code I'm using to generate the SAS token:

```csharp
private string GenerateSasToken(string accountName, string accountKey, string containerName, string blobName)
{
    // Define SAS parameters
    string signedPermissions = "r"; // Read permissions
    string signedStart = DateTime.UtcNow.AddMinutes(-5).ToString("yyyy-MM-ddTHH:mm:ssZ"); // Start time is 5 minutes ago to account for clock skew
    string signedExpiry = DateTime.UtcNow.AddHours(8).ToString("yyyy-MM-ddTHH:mm:ssZ"); // Expiry time 8 hours from now
    string signedResource = "b"; // Blob resource type
    string signedVersion = "2022-11-02"; // Storage service version
    string signedProtocol = "https"; // HTTPS only
    string signedIp = ""; // No IP restriction

    // Canonicalized resource: "/blob/account/container/blob"
    string canonicalizedResource = $"/blob/{accountName}/{containerName}/{blobName}";

    // Construct the string-to-sign
    string stringToSign = $"{signedPermissions}\n" +
                          $"{signedStart}\n" +
                          $"{signedExpiry}\n" +
                          $"{canonicalizedResource}\n" +
                          $"\n" + // signedIdentifier (optional, left empty)
                          $"{signedIp}\n" +
                          $"{signedProtocol}\n" +
                          $"{signedVersion}\n";

    // Decode the account key from Base64
    byte[] keyBytes = Convert.FromBase64String(accountKey);

    // Create HMAC-SHA256 hash
    using (HMACSHA256 hmac = new HMACSHA256(keyBytes))
    {
        byte[] signatureBytes = hmac.ComputeHash(Encoding.UTF8.GetBytes(stringToSign));
        string signature = Convert.ToBase64String(signatureBytes);

        // Construct the SAS token
        var sasToken = HttpUtility.ParseQueryString(string.Empty);
        sasToken["sp"] = signedPermissions;
        sasToken["st"] = signedStart;
        sasToken["se"] = signedExpiry;
        sasToken["spr"] = signedProtocol;
        sasToken["sv"] = signedVersion;
        sasToken["sr"] = signedResource;
        sasToken["sig"] = HttpUtility.UrlEncode(signature); // URL-encoded signature

        // Return the full blob URL with the SAS token
        return $"https://{accountName}.blob.core.windows.net/{containerName}/{blobName}?{sasToken}";
    }
}
```

Problem: I am getting the error that the signature fields are not well formed, which seems to indicate an issue with how the string-to-sign or the signature itself is being constructed. Does anyone have experience with this error, or can you spot anything wrong with my code? Any help or advice would be greatly appreciated!

mohammedbaba, Sep 11, 2024
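For comparison, a hedged sketch of the string-to-sign layout documented for a blob service SAS at service versions 2020-12-06 and later: it has more fields than the one above (signed resource, snapshot time, encryption scope and the five response-header overrides), empty fields still contribute their newline, there is no trailing newline after the last field, and the signature should go into the ParseQueryString collection unencoded, since that collection URL-encodes values when rendered. The snippet below is a drop-in for the string-to-sign construction inside the method above; verify the field list against the current service SAS documentation before relying on it.

```csharp
// Assumed field order for a blob service SAS (sv >= 2020-12-06); verify against the docs.
string stringToSign = string.Join("\n",
    signedPermissions,      // sp
    signedStart,            // st
    signedExpiry,           // se
    canonicalizedResource,  // /blob/{account}/{container}/{blob}
    "",                     // signedIdentifier (stored access policy, unused here)
    signedIp,               // sip
    signedProtocol,         // spr
    signedVersion,          // sv
    signedResource,         // sr
    "",                     // signedSnapshotTime
    "",                     // signedEncryptionScope
    "",                     // rscc  (Cache-Control override)
    "",                     // rscd  (Content-Disposition override)
    "",                     // rsce  (Content-Encoding override)
    "",                     // rscl  (Content-Language override)
    "");                    // rsct  (Content-Type override) - no trailing newline

// ...then let ParseQueryString do the URL encoding exactly once:
sasToken["sig"] = signature; // not HttpUtility.UrlEncode(signature)
```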
🚀 Share Your Experience for a Graph Connector of Azure File Share! 🚀
Hello Azure Tech Community! I'm Danny, the Product Manager at Microsoft responsible for Graph Connectors. We're excited to explore the opportunity to develop a Graph Connector for Azure File Share, which would allow your users to easily search and interact with files stored in AFS. If you're interested in this new connector, please fill out our survey to share your experience.

Fill out our survey form here

We're also looking for customers who use Azure File Share to participate in an interview with us. If you're interested, please leave a comment or reach out to me at dannyyao@microsoft.com.

dannyyao, Aug 15, 2024
SFTP enabled Storage Account Behind Nginx Reverse Proxy
I am trying to put an SFTP-enabled storage account behind an nginx proxy. I tried the following configuration in nginx.conf:

```nginx
stream {
    upstream backend {
        server <<storageAccount_private_ip>>:22;
    }

    server {
        listen 22;
        proxy_pass backend;
    }
}
```

The nginx service won't restart because port 22 is already in use by SSH. Can someone help me?

sachinratnawat, Jul 30, 2024
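Since sshd already owns port 22 on the proxy host, one straightforward workaround is to have the nginx stream listener use a different front-side port (2222 below is an arbitrary choice) while still forwarding to port 22 on the storage account's private endpoint; alternatively, move sshd itself to another port. A minimal sketch of the first option:

```nginx
stream {
    upstream sftp_backend {
        server <<storageAccount_private_ip>>:22;
    }

    server {
        # 2222 is arbitrary - any port not already bound by sshd on this host works.
        listen 2222;
        proxy_pass sftp_backend;
    }
}
```

Clients would then connect to the proxy on that port, e.g. `sftp -P 2222 <sftp-username>@<proxy-host>`.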