Azure Storage
9 Topics

Azure Logic Apps: HTTP Request or Custom Connector?
Hello,

As far as I know, we use the HTTP request action when consuming a first-party or third-party API, so when should we use a custom connector instead? What are the business cases for calling an API with an HTTP request in Power Automate and Power Apps, versus building a custom connector and using it in Power Apps and Power Automate? What are the pros and cons of an HTTP request versus a custom connector?

Thanks and regards,
-Sri
Azure SQL Database: Can I use the same primary key column as a foreign key column in multiple tables?

-- Table1ID is declared as a column in each table; in Table2 and Table3 it serves
-- as both the primary key and a foreign key back to Table1 (a shared-primary-key pattern).
CREATE TABLE Table1 (
    Table1ID int NOT NULL PRIMARY KEY,
    Column2 int
);

CREATE TABLE Table2 (
    Table1ID int NOT NULL PRIMARY KEY,
    Column2 int,
    FOREIGN KEY (Table1ID) REFERENCES Table1 (Table1ID)
);

CREATE TABLE Table3 (
    Table1ID int NOT NULL PRIMARY KEY,
    Column2 int,
    FOREIGN KEY (Table1ID) REFERENCES Table1 (Table1ID)
);
Azure Logic App workflow (Standard) Resubmit and Retry

Hello Experts,

A workflow is scheduled to run daily at a specific time and retrieves data from different systems using REST API calls (8-9 calls). The data is then sent to another system through API calls using multiple child workflows. We receive more than 1,500 input records, and an API call needs to be made for each one. During the API invocation there is a possibility of failure due to server errors (5xx) and client errors (4xx). To handle this, we have implemented a retry policy with a fixed interval. However, there is still a chance that the flow fails for other reasons. Although a "Resubmit" feature is available at the action level, I cannot apply it in this case because we use multiple child workflows and the response is passed back from one workflow to another.

Is it necessary to use the "Resubmit" functionality? The retry policy is designed to handle server errors (5xx) from connectors (both custom and standard), as well as the client errors 408 and 429. In this specific scenario it seems reasonable to retry or resubmit the API call from the Azure Logic Apps workflow, but there are other situations where retrying or resubmitting would simply reproduce the same error. Is it acceptable to rely on the retry functionality alone in this scenario? Guidance on the appropriate approach would be highly appreciated.

Thanks,
-Sri
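For reference, the fixed-interval retry behavior described above (retry on 5xx, 408, and 429; fail fast on other 4xx errors) can be sketched in plain Python. This is only an illustration of the logic, not the Logic Apps implementation itself; the endpoint URL, retry count, and interval below are made-up values.

import time
import requests

RETRYABLE_STATUS = {408, 429, 500, 502, 503, 504}  # transient errors worth retrying

def call_with_fixed_interval_retry(url, payload, retries=4, interval_seconds=30):
    """Call an API and retry transient failures at a fixed interval,
    mirroring a fixed-interval retry policy."""
    for attempt in range(1, retries + 1):
        response = requests.post(url, json=payload, timeout=60)
        if response.status_code < 400:
            return response                      # success, stop retrying
        if response.status_code not in RETRYABLE_STATUS:
            # Other 4xx errors are not transient; retrying or resubmitting
            # would reproduce the same error, so fail fast instead.
            response.raise_for_status()
        if attempt < retries:
            time.sleep(interval_seconds)          # fixed interval between attempts
    response.raise_for_status()                   # retries exhausted

# Hypothetical usage; the URL and payload are placeholders.
# call_with_fixed_interval_retry("https://example.com/api/items", {"id": 1})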
Step-by-Step Guide to Creating a Cosmos DB with Private DNS in Azure

Introduction:
In this blog post, we will walk through the process of creating a Cosmos DB instance with Private DNS in the Azure cloud environment. Private DNS allows you to resolve the Cosmos DB endpoint using a custom domain name within your virtual network, enhancing security and network management. Follow these steps to set up your Cosmos DB with Private DNS.

Prerequisites:
- Azure subscription
- Virtual network created
- Custom domain name

Step 1: Create a Cosmos DB Instance
1.1. Log in to the Azure portal (https://portal.azure.com/).
1.2. Click on "Create a resource" and search for "Azure Cosmos DB."
1.3. Click "Create" to start the Cosmos DB creation process.

Step 2: Configure Basics
2.1. Choose the appropriate subscription and resource group.
2.2. Enter a unique name for your Cosmos DB instance.
2.3. Choose the desired API (e.g., Core SQL for SQL API).
2.4. Select the desired location for your Cosmos DB.

Step 3: Networking
3.1. Under the "Networking" section, select "Enable virtual network."
3.2. Choose the virtual network and subnet where you want to enable Private DNS.
3.3. Click "Next: Advanced" to proceed.

Step 4: Advanced
4.1. Under the "Advanced" section, select "Enable automatic failover" if needed.
4.2. Enter a unique DNS prefix for your Cosmos DB.
4.3. Configure other advanced settings as necessary.
4.4. Click "Review + Create."

Step 5: Review and Create
5.1. Review your configuration settings.
5.2. Click "Create" to start the deployment of your Cosmos DB instance.

Step 6: Create a Private DNS Zone
6.1. In the Azure portal, navigate to "Create a resource" and search for "Private DNS Zone."
6.2. Select "Private DNS Zone" and click "Create."
6.3. Choose the subscription and resource group.
6.4. Enter the name of your custom domain (e.g., cosmosprivatedns.local).
6.5. Associate the private DNS zone with the virtual network where your Cosmos DB resides.

Step 7: Add a Virtual Network Link
7.1. Inside the Private DNS Zone you created, go to "Virtual network links" and click "+ Add."
7.2. Select the virtual network where your Cosmos DB is located.
7.3. Choose the subnet associated with your Cosmos DB.

Step 8: Update the DNS Configuration in Cosmos DB
8.1. Go back to your Cosmos DB instance's settings.
8.2. Under "Connection strings," update the hostname with the custom domain name you created in the Private DNS Zone (e.g., mycosmosdb.cosmosprivatedns.local).

Step 9: Test Private DNS Resolution
9.1. Set up a test application within the same virtual network.
9.2. Use the custom domain name when connecting to the Cosmos DB instance.
9.3. Verify that the connection is successful, indicating that Private DNS resolution is working.
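For Step 9, one simple way to check resolution from a test VM or application inside the virtual network is a short Python script. This is a minimal sketch that assumes the hypothetical hostname used in the example above; it only confirms that the custom name resolves to a private IP, not that the Cosmos DB connection itself succeeds.

import ipaddress
import socket

# Hypothetical custom hostname from the private DNS zone created in Step 6.
HOSTNAME = "mycosmosdb.cosmosprivatedns.local"

# Resolve the name using whatever DNS the virtual network is configured with.
resolved_ip = socket.gethostbyname(HOSTNAME)
print(f"{HOSTNAME} resolves to {resolved_ip}")

# A private (RFC 1918) address indicates the private DNS zone answered,
# rather than a public endpoint.
if ipaddress.ip_address(resolved_ip).is_private:
    print("Private DNS resolution looks correct.")
else:
    print("Warning: the name resolved to a public IP; check the virtual network link.")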
Scaling storage over multiple regions

Hi,

I was recently experimenting with optimizing WordPress using a static front end hosted on an Azure Storage static website. Now that I have a super-fast frontend up and running, I was wondering how to best scale these storage accounts with a CDN. Here's my simple architecture of the layout. What would be the best approach to ensure availability across the globe?

I've also blogged about this setup at https://www.in2it.be/2020/02/scale-to-the-clouds/ if you want more background information. The site itself is available at https://wpazure.in2it.dev if you want to see the blazing-fast result of a WordPress site.
Use Azure Storage Table REST API with AAD token via Postman

You can refer to the steps below for scenarios in which your application has a special requirement and you need to call the raw Storage Table REST API from your dev environment via Postman. It consists of two main HTTP requests: first, authenticate directly with an Azure AD security principal to get an access token; second, make an authenticated Storage REST API call against Table Storage.

Related documentation:
- Query Entities REST API: https://docs.microsoft.com/en-us/rest/api/storageservices/query-entities
- Authorize access to tables using Azure Active Directory: https://docs.microsoft.com/en-us/azure/storage/tables/authorize-access-azure-active-directory

Prerequisites
To follow the steps in this article you must have:
- An Azure subscription
- An Azure AD tenant
- A registered application (AD service principal)

Steps to reproduce this scenario:

Acquire an OAuth 2.0 token:
1. Create a security principal for the application (Azure portal > Azure AD > App registrations). Documentation reference: https://docs.microsoft.com/en-us/rest/api/servicebus/get-azure-active-directory-token#register-your-app-with-azure-ad
2. Assign the Storage Table Data Reader role at the storage account level to the service principal created in step 1 (allow up to 30 minutes for the role assignment to propagate).
3. Use Postman to get the Azure AD token:
   - Launch Postman.
   - For the method, select POST (the token endpoint expects a POST with a form body).
   - For the URI, enter https://login.microsoftonline.com/<TENANT ID>/oauth2/token, replacing <TENANT ID> with the tenant ID value you copied earlier.
   - On the Headers tab, add the Content-Type key with application/x-www-form-urlencoded as the value.
   - Switch to the Body tab, select form-data, and add the following keys and values:
     - grant_type: client_credentials
     - client_id: the client ID you noted down earlier
     - client_secret: the client secret you noted down earlier
     - resource: https://storage.azure.com/
   - Select Send to send the request and get the token. You see the token in the result; save it (excluding the double quotes) for the next request.

Call the Query Entities storage REST API and pass the OAuth 2.0 token from the previous step:
1. In Postman, open a new tab and select GET for the method.
2. Enter the URI in the following format: https://<account>.table.core.windows.net/<table>(). Replace <account> with the storage account name and <table> with the name of the table.
3. On the Headers tab, add the following headers:
   - Authorization: Bearer <TOKEN from Azure AD>. When you copy/paste the token, don't copy the enclosing double quotes.
   - x-ms-version: 2017-11-09 or later (required when authorizing with Azure AD).
   - Accept: application/json;odata=nometadata (to receive the entities as JSON).
4. Select Send to get the entities from the table. You see the status OK with the code 200.
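For readers who prefer to script the same two calls, here is a rough Python sketch using the requests library. The tenant, client, secret, account, and table values are placeholders, and the x-ms-version value is simply one example of a version at or above 2017-11-09.

import requests

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<client-id>"
CLIENT_SECRET = "<client-secret>"
ACCOUNT = "<storage-account-name>"
TABLE = "<table-name>"

# Request 1: acquire an OAuth 2.0 token with the client_credentials grant.
token_response = requests.post(
    f"https://login.microsoftonline.com/{TENANT_ID}/oauth2/token",
    data={
        "grant_type": "client_credentials",
        "client_id": CLIENT_ID,
        "client_secret": CLIENT_SECRET,
        "resource": "https://storage.azure.com/",
    },
)
token_response.raise_for_status()
access_token = token_response.json()["access_token"]

# Request 2: call Query Entities with the bearer token.
entities_response = requests.get(
    f"https://{ACCOUNT}.table.core.windows.net/{TABLE}()",
    headers={
        "Authorization": f"Bearer {access_token}",
        "x-ms-version": "2019-02-02",                 # any version >= 2017-11-09
        "Accept": "application/json;odata=nometadata",
    },
)
entities_response.raise_for_status()
print(entities_response.json()["value"])              # list of entity dicts

In day-to-day code you would more likely use the azure-data-tables SDK with azure-identity credentials, but the raw REST calls above are handy for isolating authorization problems.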
Azure Developer Templates

Dear Azure Developers,

I would like to share an updated version of the Azure Developer Templates documentation, to which new samples have been added. You will find examples of how to use the latest version of the Azure SDK libraries: https://daniel-krzyczkowski.github.io/AzureDeveloperTemplates/

I hope you will find it helpful.