web apps
68 Topics

Azure App Service Flask Deployment issues with Error "didn't respond to HTTP pings on port: 8000"
Hello Everyone, I am deploying a Flask web app with Immersive Reader. I have tried deploying it as a ZIP file, and also with Visual Studio Code, following the steps in the link below: https://learn.microsoft.com/en-us/azure/app-service/quickstart-python?tabs=flask%2Cwindows%2Cazure-cli%2Czip-deploy%2Cdeploy-instructions-azportal%2Cterminal-bash%2Cdeploy-instructions-zip-azcli I have gone through each step mentioned there and applied it, and it shows me the result below. I have gone through the diagnostic resources, but didn't find any solution for the following error in the logs.
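On the "didn't respond to HTTP pings" error itself: it usually means that by the time App Service probed the container, nothing was listening on the expected port. A minimal sketch of a deployable app, under the assumption that this deployment uses the default Python setup (gunicorn looking for a module-level `app` object in app.py):

```python
# A minimal sketch, assuming the entry file is app.py. App Service's default
# Python startup command looks for a module-level Flask object named "app";
# if it can't find one, nothing binds the probed port and the container is
# killed with "didn't respond to HTTP pings on port: 8000".
import os

from flask import Flask

app = Flask(__name__)  # must be module-level and named "app"


@app.route("/")
def index():
    return "Hello from App Service"


if __name__ == "__main__":
    # Local runs only; on App Service gunicorn binds the port itself.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", "8000")))
```

If the Flask object lives in a different module, a custom startup command such as `gunicorn --bind=0.0.0.0:8000 <module>:app` is the usual fix.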
Introducing Microsoft Playwright Testing private preview
Explore Microsoft Playwright Testing, a new service built for running Playwright tests easily at scale. Playwright is a fast-growing, open-source framework that enables reliable end-to-end testing and automation for modern web apps. Microsoft Playwright Testing uses the cloud to enable you to run Playwright tests with much higher parallelization across different operating system-browser combinations simultaneously. This means getting tests done faster, which can help speed up delivery of features without sacrificing quality. The service is currently in private preview and needs your feedback to help shape it! To get started, join the waitlist, and check out the full blog post for more information. How do you think Microsoft Playwright Testing can help you in your app development?

Azure Function App Http Javascript render simple html file to replicate jsrsasign sign certificate
Good day, please help.

1. In Power BI I'm trying to render the JavaScript sign certificate of jsrsasign; I only got it working via an HTML file. So I'm trying to read the HTML file, a simple hello to start off with. Am I better off going directly to jsrsasign?

2. Locally in VS I got the simple function to return Hello Azure, but trying to read the simple HTML file executes with no error, yet if I copy the URL into Postman I just get a 401 and no content found. I'm not sure how to debug further, as in VS I get an OK status and nothing in the console. Does anybody have an example or links, please?

const { app } = require('@azure/functions');
const fs = require('fs');
const path = require('path');

app.http('IC5', {
    methods: ['GET', 'POST'],
    authLevel: 'anonymous',
    handler: async (request, context) => {
        context.log(`Http function processed request for url "${request.url}"`);

        // const name = request.query.get('name') || await request.text() || 'world';
        // return { body: `Hello, ${name}!` };

        fs.readFile(path.resolve('./test3.html'), 'UTF-8', (err, htmlContent) => {
            context.res = {
                status: 200,
                headers: { 'Content-Type': 'text/html' },
                body: htmlContent
            };
        });

        // Earlier attempts (all commented out):
        // var res = { body: "", headers: { "Content-Type": "text/html" } };
        // readFile = require('../SharedCode/readFile.js');
        // filepath = __dirname + '/test3.html';
        // await fs.readFile(filepath, function (error, content) {
        // if (request.query.name || (request.body && request.body.name)) {
        //     res.body = "<h1>Hello " + (request.query.name || request.body.name) + "</h1>";
        // } else {
        //     fs.readFile(path.resolve(__dirname, 'test3.html'), 'UTF-8', (err, htmlContent) => {
        //         res.body = htmlContent;
        //         context.res = res;
        //     });
        // }
    }
});

// TEST IN POSTMAN: http://localhost:7071/api/IC5?name=hurry

We have just re-deployed our app to ASEv3 and it's failing
I am getting several issues since I migrated our ASP.NET Framework 4.8 app from ASEv2 to ASEv3. It should be noted that the code is identical, but we deployed it this time using Terraform, so deployment of the apps (we have multiple apps for different customers using isolated databases and subdomain names) is identical, and the creation of storage accounts and all other services is the same. I have checked that the configuration and environment variables of the web app, and the config of the storage accounts (v2), are identical between the old service and the new one.

Response Streaming

Firstly, we can't stream files of particular sizes to the client anymore; we just get a weird URL Rewrite error (we don't have any URL rewrite rules in our web.config). Small files work fine, but larger files (2 MB text files) from a file share on Azure Storage do not. Initially we couldn't stream small files either, but disabling dynamic compression in web.config fixed that; then we started getting the above issue with larger files:

<urlCompression doStaticCompression="true" doDynamicCompression="false" />

What we get is a "URL Rewrite Module Error":

Module: RewriteModule
Notification: ReleaseRequestState
Handler: System.Web.Http.WebHost.HttpControllerHandler
Error Code: 0x80004004

Request Issues

We have an integration test that PUTs an empty model to an authenticated endpoint while supplying no authentication (Basic Auth); previously this returned a 401 Unauthorized; now it returns a 411 Length Required error. This must be something to do with the validation order of the request, which has changed; perhaps ASEv3 has a slightly newer/different IIS version?

Remote Debugging

I cannot connect Visual Studio to my web app to remote debug; I presume they have killed off ASP.NET Framework apps and only support .NET Core now?
When I select my web app (with remote debugging enabled) from Visual Studio's Attach to Process dialog, it just presents an "Operation not supported 0x80aa0001" error.

The response issue is the most important, as I just cannot work out what is going wrong or what has changed. This all works fine locally, even connecting to the same (live/prod) database and storage account, so it's an issue within ASEv3 that was never an issue with ASEv2.
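On the 411 issue, one client-side workaround worth sketching (this rests on an assumption, not confirmed ASEv3 behavior: that the newer front end rejects bodyless PUTs before authentication is evaluated) is to send an explicitly empty body so the request carries a Content-Length header:

```python
# Hedged sketch of a client-side workaround for 411 Length Required on empty
# PUTs (assumption: the front end rejects PUTs that omit Content-Length
# before authentication runs).
import http.client


def empty_put(host, path, port=443, use_tls=True):
    """Send a PUT whose body is explicitly empty rather than absent."""
    conn_cls = http.client.HTTPSConnection if use_tls else http.client.HTTPConnection
    conn = conn_cls(host, port)
    # body=b"" makes http.client emit "Content-Length: 0" instead of omitting
    # the header entirely, which is exactly what a 411 complains about.
    conn.request("PUT", path, body=b"")
    status = conn.getresponse().status
    conn.close()
    return status
```

If the integration test's HTTP client allows it, forcing `Content-Length: 0` on the empty request should restore the old 401 path.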
Error loading add-in ... while manifest is ok
I developed an Excel web add-in with an ASP.NET Core 6.0 API and published it on a dedicated server on Azure. Excel raises an "Error loading add-in" error when the add-in is installed in Excel. Please note that:
- the XML manifest has been validated (the common error in this case)
- CORS is enabled
- the API has been tested online with Swagger
Does anyone have a suggestion? Thanks

Best Practices for API Error Handling: A Comprehensive Guide
APIs (Application Programming Interfaces) play a critical role in modern software development, allowing different systems to communicate and interact with each other. However, working with APIs comes with its challenges, one of the most crucial being error handling. When an API encounters an issue, it's essential to handle errors gracefully to maintain system reliability and ensure a good user experience. In this article, we'll discuss best practices for API error handling that can help developers manage errors effectively.

Why is API Error Handling Important?

API error handling is crucial for several reasons:

Maintaining System Reliability: Errors are inevitable in any system. Proper error handling ensures that when errors occur, they are handled in a way that prevents them from cascading and causing further issues.
Enhancing User Experience: Clear, informative error messages can help users understand what went wrong and how to resolve the issue, improving overall user satisfaction.
Security: Proper error handling helps prevent sensitive information from being exposed in error messages, reducing the risk of security breaches.
Debugging and Monitoring: Effective error handling makes it easier to identify and debug issues, leading to quicker resolutions and improved system performance.

Best Practices for API Error Handling

1. Use Standard HTTP Status Codes

HTTP status codes provide a standard way to communicate the outcome of an API request. Use status codes such as 200 (OK), 400 (Bad Request), 404 (Not Found), and 500 (Internal Server Error) to indicate the result of the request. Choosing the right status code helps clients understand the nature of the error without parsing the response body.

2. Provide Descriptive Error Messages

Along with HTTP status codes, include descriptive error messages in your API responses. Error messages should be clear, concise, and provide actionable information to help users understand the problem and how to fix it.
Avoid technical jargon and use language that is understandable to your target audience.

3. Use Consistent Error Response Formats

Maintain a consistent format for your error responses across all endpoints. This makes it easier for clients to parse and handle errors consistently. A typical error response may include fields like status, error, message, code, and details, providing a structured way to convey error information.

4. Avoid Exposing Sensitive Information

Ensure that error messages do not expose sensitive information such as database details, API keys, or user credentials. Use generic error messages that do not reveal internal system details to potential attackers.

5. Implement Retry Logic for Transient Errors

For errors that are likely to be transient, such as network timeouts or service disruptions, consider implementing retry logic on the client side. However, retries should be implemented judiciously to avoid overwhelming the server with repeated requests.

6. Document Common Errors

Provide comprehensive documentation that includes common error codes, messages, and their meanings. This helps developers quickly identify and troubleshoot common issues without needing to contact support.

7. Use Logging and Monitoring

Implement logging and monitoring to track API errors and performance metrics. Logging helps you understand the root cause of errors, while monitoring allows you to proactively identify and address issues before they impact users.

8. Handle Rate Limiting and Throttling

Implement rate limiting and throttling to protect your API from abuse and ensure fair usage. Return appropriate error codes (e.g., 429 Too Many Requests) when rate limits are exceeded, and provide guidance on how users can adjust their requests to comply with rate limits.

9. Provide Support for Localization

If your API serves a global audience, consider providing support for localization in your error messages.
This allows users to receive error messages in their preferred language, improving the user experience for non-English speakers.

10. Test Error Handling

Finally, thoroughly test your API's error handling capabilities to ensure they work as expected. Test various scenarios, including valid requests, invalid requests, and edge cases, to identify and address potential issues.

Conclusion

Effective error handling is essential for building reliable and user-friendly APIs. By following these best practices, you can ensure that your API handles errors gracefully, provides meaningful feedback to users, and maintains high availability and security. Implementing robust error handling practices will not only improve the reliability of your API but also enhance the overall user experience.
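Point 3 above can be made concrete with a small helper; this is an illustrative sketch, and the field names (status, error, message, code, details) follow the article's example — they are a convention, not a standard:

```python
# Illustrative sketch of a consistent error envelope. The field names mirror
# the article's example; they are a convention, not a standard.
import json


def error_response(status, error, message, code=None, details=None):
    """Build a JSON error body with the same shape for every endpoint."""
    body = {
        "status": status,          # HTTP status, duplicated for clients that drop headers
        "error": error,            # short machine-readable name, e.g. "not_found"
        "message": message,        # human-readable, with no internal details leaked
        "code": code,              # optional application-specific error code
        "details": details or {},  # optional structured context, e.g. per-field errors
    }
    return json.dumps(body)
```

Routing every failure path through one such helper is what keeps the format consistent across endpoints.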
AZURE KEY VAULT: SECURING YOUR DIGITAL ASSETS

Unveiling the Power of Azure Key Vault with PIN Authentication

Intro

What is Azure Key Vault? Azure Key Vault is a cloud service for securely storing and accessing secrets. A secret is anything that you want to tightly control access to, such as API keys, passwords, certificates, or cryptographic keys. The Key Vault service supports two types of containers: vaults and managed hardware security module (HSM) pools. Vaults support storing software and HSM-backed keys, secrets, and certificates. Managed HSM pools only support HSM-backed keys. Azure Key Vault enables Microsoft Azure applications and users to store and use several types of secret/key data: keys, secrets, and certificates. Keys, secrets, and certificates are collectively referred to as "objects".

Workshop

What better than a little hands-on to understand the how-to? For example, how do we get our secrets? How do we manage Key Vault? Let's build our workshop as traditional Azure enthusiasts and share our experience throughout this post!

We need:
Azure Subscription
VSCode or any IDE of our choice
Patience

Our approach is simple, yet it grasps the usage and power of Azure Key Vault. We are going to create an Azure Key Vault instance and store specific key-value pairs. These are smartly crafted; even the name plays its role, since we can integrate our SDK with every aspect of the Key Vault objects. Each pair is named after a user PIN number, and the credentials are stored as a single value. For example, we are going to create:

secret-1234, value = user1:p@ssw0rd1
secret-4567, value = user2:p@ssw0rd2
secret-1212, value = user3:p@ssw0rd3

Can you see the dynamics? We have a PIN system where we store PIN numbers, so the user won't have to remember or even know the username and password! Impressive, right? Our web app is simple, yet it verifies the correct PIN via a form and allows the specific user to log in and proceed! Let's see that in action!
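The PIN-to-secret convention above can be sketched as two small helpers (a hedged sketch of the naming scheme, not code from the workshop; note the split on only the first colon, so passwords may themselves contain colons):

```python
# Hedged sketch of the PIN-to-secret convention described above.
def secret_name_for_pin(pin):
    """A PIN maps to a Key Vault secret named 'secret-<PIN>'."""
    return f"secret-{pin}"


def parse_secret_value(value):
    """Secret values pack credentials as 'username:password'.

    Split only on the first colon so the password itself may contain colons.
    """
    username, password = value.split(":", 1)
    return username, password
```

Keeping these two rules in one place means the web app and any seeding script can never drift apart on the naming scheme.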
Let's create a Resource Group and an Azure Key Vault instance:

# Get the subscription ID and store it in a variable
sub_id=$(az account show --query id -o tsv)

# Create a resource group
az group create --name rg-mygroup --location northeurope

# Generate a random name for the Key Vault
key_vault_name=$(cat /dev/urandom | tr -dc 'a-z0-9' | fold -w 5 | head -n 1)
key_vault_name="kv$key_vault_name"

# Create the Key Vault with RBAC authorization
az keyvault create --name $key_vault_name --resource-group rg-mygroup --location northeurope --sku standard --enabled-for-deployment true --enabled-for-template-deployment true --enabled-for-disk-encryption true --enable-rbac-authorization true

# Assign the Key Vault Administrator role to yourself
az role assignment create --role "Key Vault Administrator" --assignee $(az ad signed-in-user show --query id --output tsv) --scope /subscriptions/$sub_id/resourceGroups/rg-mygroup/providers/Microsoft.KeyVault/vaults/$key_vault_name

Assigning the Key Vault Administrator role to yourself (or any role that allows you to edit secrets) lets us add our secret pairs now; we have already added secret-1234 for reference. Before we move on, let's grab the vault URI. Go to Overview and see the URI on the right, or run:

az keyvault show --name $key_vault_name --query properties.vaultUri

Our example uses VSCode, but if you are comfortable with other IDEs or Visual Studio you can just get the code!
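For completeness, here is a hedged sketch of seeding the remaining demo pairs. `store` stands for any callable taking (name, value) — in practice you would pass `client.set_secret` from an azure.keyvault.secrets SecretClient, which is deliberately not imported here so the sketch stays dependency-free:

```python
# Hedged sketch: seed the demo PIN secrets through any callable(name, value),
# e.g. SecretClient.set_secret (assumes you hold the Key Vault Administrator
# role assigned above).
DEMO_SECRETS = {
    "1234": "user1:p@ssw0rd1",
    "4567": "user2:p@ssw0rd2",
    "1212": "user3:p@ssw0rd3",
}


def seed_demo_secrets(store, secrets=DEMO_SECRETS):
    """Write each PIN's credentials under the 'secret-<PIN>' name."""
    for pin, creds in secrets.items():
        store(f"secret-{pin}", creds)
```

The same secrets can equally be created in the portal or with `az keyvault secret set`; the helper just mirrors the naming scheme the app expects.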
So we start as follows. Edit the files accordingly:

from flask import Flask, render_template, request, redirect, url_for, flash
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
import os

app = Flask(__name__)
app.secret_key = 'edF32ert44rfgSAv2'  # Change to a strong secret key

# Azure Key Vault setup
key_vault_name = os.environ["KEY_VAULT_NAME"]
kv_uri = f"https://{key_vault_name}.vault.azure.net"

credential = DefaultAzureCredential()
client = SecretClient(vault_url=kv_uri, credential=credential)

@app.route('/', methods=['GET', 'POST'])
def index():
    if request.method == 'POST':
        pin = request.form['pin']
        secret_name = f"secret-{pin}"
        try:
            retrieved_secret = client.get_secret(secret_name)
            # The secret value is in 'username:password' format; split only on
            # the first colon so the password itself may contain colons.
            username, password = retrieved_secret.value.split(':', 1)
            # Redirect to success page or pass username/password to the template
            return render_template('success.html', username=username, password=password)
        except Exception:
            # Handle error (e.g., secret not found)
            flash('Invalid PIN or secret not found.')
            return redirect(url_for('index'))
    return render_template('index.html')

@app.route('/success')
def success():
    return render_template('success.html')

if __name__ == '__main__':
    app.run(debug=True)

Time to build our web app. From VSCode or from the Azure Portal, create a new Web App and a new App Service Plan (S1 is fine) for Python 3.10. Activate the system-assigned managed identity and add the Key Vault Secrets User role to that identity:

az webapp identity assign -n WEBAPPNAME -g rg-mygroup
# Get the principalId from the output
az role assignment create --role "Key Vault Secrets User" --assignee PRINCIPALID --scope /subscriptions/$sub_id/resourceGroups/rg-mygroup/providers/Microsoft.KeyVault/vaults/$key_vault_name

Remember to also add an app setting for KEY_VAULT_NAME in the Web App configuration.
From VSCode deploy to Azure Web Apps, or run the relevant Azure CLI command:

az webapp up --name <web_app_name> --sku <app_service_plan_sku> --runtime "PYTHON|<python_version>"

Allow some time for the build, then browse the web app. We are presented with the welcome page. Once we enter the correct PIN, we are logged in with the matching username and password!

Closing

We have seen a small fraction of the amazing features of Azure Key Vault: a pretty cool setup where we use a PIN to authenticate while Key Vault securely holds all of our credentials. We can take this much further, and also secure the whole architecture with VNet integration and Private Endpoints, as well as Front Door! Our project stands as a testament to how cloud services like Azure Key Vault can be seamlessly integrated into web applications to implement secure authentication mechanisms. The resulting application is a fine example of combining modern web development practices with the powerful security features offered by Azure, achieving a highly functional and secure user authentication system based solely on PINs. This approach not only simplifies the user experience but also maintains a high standard of security, demonstrating the effectiveness of Azure Key Vault in contemporary web application development.

Cannot add Apple Business Manager from Azure AD admin center
I cannot add Apple Business Manager from the Azure AD admin center > Enterprise applications. I can find the app, but if I click on it, there is only the button "Sign up for Apple Business Manager". After clicking on it, a site from Apple opens, and I cannot add the app there. The prerequisites from the article below are fulfilled: Tutorial: Configure Apple Business Manager for automatic user provisioning with Azure Active Directory | Microsoft Docs

A look into App Service: Backup and Restore over Azure Virtual Network
There is a new preview feature that provides a secure way to handle backups for web applications. This feature, "App Service: Backup and Restore over Azure Virtual Network", provides an additional layer of security by allowing backups to be stored in a firewall-protected storage account. The primary advantage I see in this feature is the enhanced security it offers: by storing backups in a firewall-protected storage account, it ensures that your data is safe from unauthorized access. Additionally, this feature allows for custom backups, giving more control over what data is backed up and when.

There are a few prerequisites:
The app must be integrated with a virtual network or be in a v3 App Service Environment.
The storage account must have granted access from the virtual network that the app is integrated with.

I am sharing a couple of scenarios where this feature could be beneficial:

A healthcare company runs a web app that handles sensitive patient data. If there is a requirement to ensure that data is securely backed up and protected from unauthorized access, they can back up the web app's files and configuration data to a firewall-protected storage account, ensuring that their data is secure.

Software development projects following a DTAP deployment strategy have multiple environments such as Development, Testing, Staging, and Production, each isolated with its own set of resources. With this feature, the pipelines can back up each web app's files and configuration data to a firewall-protected storage account in the same virtual network. This aligns with the backup policy and adds an extra layer of security, as the backups are not exposed to the public internet.

This new preview feature offers enhanced security and flexibility for backing up web app data. It should be part of your backup and disaster recovery strategy.
It's worth checking out if you're looking for a secure and customizable backup solution for your web apps.