This sample application demonstrates how to implement various AI scenarios on Azure App Service using Azure AI Foundry. It provides production-ready code that you can integrate into your existing Flask applications by copying the AIPlaygroundCode package and following the integration steps.
Ideal for: Developers looking to add AI capabilities to existing Flask applications, learn Azure AI Foundry integration patterns, and implement enterprise-grade AI features with minimal setup effort.
Key Scenarios Covered
Conversational AI: Natural language processing with context awareness and session management
Reasoning Models: Advanced problem-solving capabilities with step-by-step analytical thinking
Structured Output: JSON-formatted responses with schema validation for system integration
Multimodal Processing: Image analysis and audio transcription using vision and audio models
Enterprise Chat: Retail-optimized AI assistant with customer service and business intelligence scenarios
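As a flavor of the structured-output scenario above, the sketch below shows one common way to request schema-constrained JSON with the OpenAI Python SDK. It is illustrative only: the endpoint, API version, environment variable names (including AZURE_AI_API_KEY), deployment name, and schema are placeholders, the json_schema response format requires a recent SDK/API version, and the sample's own implementation may differ.
# Illustrative sketch: schema-constrained JSON output (all names and values are placeholders).
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],  # your endpoint
    api_key=os.environ["AZURE_AI_API_KEY"],                 # hypothetical env var; Managed Identity is used when deployed
    api_version="2024-08-01-preview",                       # placeholder; json_schema needs a recent API version
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # your chat deployment name
    messages=[{"role": "user", "content": "Summarize order 12345: shipped yesterday, 3 items."}],
    response_format={
        "type": "json_schema",
        "json_schema": {
            "name": "order_summary",
            "schema": {
                "type": "object",
                "properties": {
                    "order_id": {"type": "string"},
                    "status": {"type": "string"},
                    "item_count": {"type": "integer"},
                },
                "required": ["order_id", "status", "item_count"],
                "additionalProperties": False,
            },
        },
    },
)
print(response.choices[0].message.content)  # JSON conforming to the schema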
Quick Start - Azure Deployment (Recommended)
Prerequisites
- Azure CLI installed
- Azure Developer CLI (azd) installed
- Azure subscription with contributor access
- Optional: Azure AI Foundry access with existing model deployments if you want to use existing endpoints:
- Chat/Reasoning/Image model: gpt-4o-mini or similar for conversational AI, reasoning, and image analysis
- Audio model: gpt-4o-mini-audio-preview or similar for audio transcription and processing
- Model Compatibility: Use recommended models (gpt-4o-mini, gpt-35-turbo) to avoid deployment errors. Advanced models like gpt-4o may not be available in all regions
One-Command Deployment
- Clone and Deploy
git clone https://github.com/Azure-Samples/azure-app-service-ai-scenarios-integrated-sample.git
cd azure-app-service-ai-scenarios-integrated-sample
azd up
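If azd reports that you are not signed in, authenticate first and then re-run the deployment (standard sign-in commands):
azd auth login
az login
azd up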
What to Expect During Deployment
When you run azd up, you'll see these prompts:
- Enter a unique environment name: e.g., myai-demo
- Subscription Selection: use the arrow keys to select your subscription
- Resource Group: create new or use existing
- Select a location to create the resource group in: e.g. East US
- Choose new (create AI services) or existing [new/existing]: e.g. existing
- Enter your project endpoint URL: e.g. https://your-project.services.ai.azure.com/models
- Enter your deployment name (gpt-4o-mini, gpt-4, gpt-35-turbo): e.g. gpt-4o-mini
- Enter your audio deployment name (gpt-4o-mini-audio-preview): e.g. gpt-4o-mini-audio-preview
- Select an Azure location to use: e.g. East US
Automatic steps: Package building, RBAC setup, provisioning, deployment
Expected deployment time: 4-5 minutes for complete provisioning and deployment
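After deployment you can review the values azd stored for the environment (a standard azd command; the exact variable names depend on the template's outputs):
# Print the environment values captured during provisioning, such as the AI endpoint and model deployment names
azd env get-values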
- Azure AI Foundry Integration During Deployment
During azd up, you'll be prompted to configure the AI setup:
Prompt: "Do you have an existing Azure AI Foundry endpoint?"
- Answer "existing": If you have existing Azure AI Foundry resources
- You'll be asked to provide your endpoint URL (e.g., https://your-project.services.ai.azure.com/models)
- Managed Identity role permissions will be automatically configured on your endpoint
- You'll specify your existing model deployment names (chat model name e.g. gpt-4o-mini and audio model name e.g. gpt-4o-mini-audio-preview)
- Answer "new": Creates new Azure AI Foundry resources automatically
- Provisions new Azure AI Foundry project with required dependencies
- Deploys chat and audio models with configurable names (defaults: gpt-4o-mini and gpt-4o-mini-audio-preview)
- Configures Managed Identity integration and updates all required settings
Configuration is automatic - all environment variables and permissions are set up during deployment!
Note: You no longer need to manually configure AI settings or manage API keys - everything is handled automatically through Managed Identity integration.
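For reference, the snippet below is a rough sketch of what key-free authentication looks like from application code: DefaultAzureCredential picks up the App Service managed identity and supplies a token to the client. The API version is a placeholder and the sample's own client code may differ; the environment variable names match the App Service settings described later in this README.
# Sketch: authenticate to the AI endpoint with Managed Identity (no API key).
import os
from azure.identity import DefaultAzureCredential, get_bearer_token_provider
from openai import AzureOpenAI

# DefaultAzureCredential resolves to the App Service system-assigned identity when deployed.
token_provider = get_bearer_token_provider(
    DefaultAzureCredential(), "https://cognitiveservices.azure.com/.default"
)
client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_INFERENCE_ENDPOINT"],
    azure_ad_token_provider=token_provider,
    api_version="2024-06-01",  # placeholder API version
)
reply = client.chat.completions.create(
    model=os.environ["AZURE_AI_CHAT_DEPLOYMENT_NAME"],
    messages=[{"role": "user", "content": "Hello!"}],
)
print(reply.choices[0].message.content)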
- Test Your Application
- Click "๐งช Test Config" to verify connection, then start using AI scenarios!
- Refer to the Test Core Features section below to test manually with sample scenarios
What Gets Deployed
- Azure App Service (Basic B2 SKU) with Python 3.11
- Azure AI Foundry resources (if you chose "new" at the endpoint prompt):
- AI Hub with cognitive services multi-service account
- AI Project workspace with model deployments (gpt-4o-mini, gpt-4o-mini-audio-preview)
- Storage account for AI project data and model artifacts
- Managed Identity Configuration (automatic for both new and existing endpoints):
- System-assigned managed identity enabled on App Service
- "Cognitive Services OpenAI User" role assigned to access AI endpoints
- "AI Developer" role assigned for Azure AI Foundry project access
- All necessary environment variables configured automatically in App Service
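If you want to confirm the identity and role assignments after deployment, standard Azure CLI commands can show them (names in angle brackets are placeholders):
# Show the App Service system-assigned identity and its principalId
az webapp identity show --name <your-app-name> --resource-group <your-rg>

# List the roles granted to that identity
az role assignment list --assignee <managed-identity-principal-id> --output table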
Local Development Setup
Prerequisites
- Python 3.8+
- Check version: python --version
- Download from: https://python.org/downloads/
- Azure AI Foundry with deployed models
- Setup Guide: Create and deploy models in Azure AI Foundry
- Required models: gpt-4o-mini (chat/reasoning/image) and gpt-4o-mini-audio-preview (audio)
- Alternative: Use azd up deployment method above for automatic setup
Step-by-Step Installation & Setup
Step 1: Install and Launch Application
- Install Dependencies
pip install -r requirements.txt
- Launch Application
python app.py
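Optionally, you can do this inside a virtual environment to keep dependencies isolated (standard Python tooling, nothing specific to this sample):
# Create and activate a virtual environment, then install and run
python -m venv .venv
source .venv/bin/activate   # On Windows: .venv\Scripts\activate
pip install -r requirements.txt
python app.py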
Verify Application is Running
- Open http://localhost:5000 in your browser
- You should see the application homepage
Step 2: Configure AI Settings
- Open Settings Page
- Navigate to http://localhost:5000/settings
- Fill in Configuration Fields
- Azure AI Foundry Endpoint: https://your-project.services.ai.azure.com/models
- API Key: Your Azure AI Foundry API key (Note: For production deployment, Managed Identity is automatically configured)
- Chat Model Name: Your deployed chat model name (e.g., gpt-4o-mini)
- Audio Model Name: Your audio model deployment (e.g., gpt-4o-mini-audio-preview)
- Save and Test Configuration
- Click Save to store your settings
- Click "Test Config" to verify the connection
- Wait for the success confirmation message
Step 3: Test Core Features
Refer to the Test Core Features section below for detailed testing instructions with sample scenarios.
Success Indicators:
- Configuration test shows success message
- Application loads at http://localhost:5000
- Chat interface is accessible and functional
Test Core Features
Note: These tests work for both local development and Azure deployment setups.
Step-by-Step Testing Guide
1. Access Chat Interface
- Go to http://localhost:5000
- Click the floating AI chat button (bottom-right corner)
- Verify the chat popup opens correctly
2. Test Basic Conversational AI
Test these exact messages (copy and paste):
- Identity Test:
- Message: "Who are you and what can you help with?"
- Expected: AI identifies as Enterprise AI Assistant, explains capabilities
- Product Inquiry Test:
- Message: "Tell me about features and price for Pro Gaming X1"
- Expected: Relevant product information response
- Customer Service Test:
- Message: "What is the return policy and how do I process a customer refund?"
- Expected: Helpful customer service guidance
3. Test Multimodal Features
Image Analysis Testing:
- Type message: "Analyze this laptop and tell me its specifications"
- Click the file upload button
- Select file: tests/test_inputs/laptop.jpeg
- Send message
- Expected: AI describes laptop specifications from the image
Audio Processing Testing:
- Type message: "Transcribe this customer service call"
- Click the file upload button
- Select file: tests/test_inputs/test_customer_service_audio.mp3
- Send message
- Expected: AI provides transcription of the audio content
4. Test Advanced Reasoning Capabilities
Complex Business Analysis Testing:
Message: "Zava's sales have dropped 15% this quarter. Walk me through a systematic approach to identify root causes and develop an action plan."
Expected: Structured analytical approach with step-by-step reasoning process
Success Indicators:
- All chat responses are relevant and helpful
- Image analysis identifies laptop features accurately
- Audio transcription provides readable text from audio file
- Reasoning responses show structured analytical thinking
- No error messages in chat interface
- File uploads complete successfully
- Responses may show a truncation message for long content (this is normal)
- Configuration test shows green success message
Sample Test Files: Browse the tests/test_inputs/ folder for additional sample images and audio files to test multimodal capabilities.
Integration with Existing Applications
This section provides guidance for integrating AI capabilities into your existing Flask applications. Note that this requires additional Azure resource setup and dependency management.
Prerequisites for Integration
- Azure AI Foundry endpoint with model deployments:
- Chat/reasoning/image model (e.g., gpt-4o-mini)
- Audio model (e.g., gpt-4o-mini-audio-preview)
Step 1: Set Up Azure Resources
Configure App Service Managed Identity for Azure AI Foundry:
# Enable system-assigned managed identity for your App Service
az webapp identity assign --name <your-app-name> --resource-group <your-rg>
# Grant required roles to your App Service for Azure AI Foundry access
az role assignment create \
--role "Cognitive Services OpenAI User" \
--assignee <managed-identity-principal-id> \
--scope <your-ai-foundry-resource-id>
# Grant AI Developer role for Azure AI Foundry project access
az role assignment create \
--role "Azure AI Developer" \
--assignee <managed-identity-principal-id> \
--scope <your-ai-foundry-resource-id>
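If you need the principal ID used in the role assignments above, you can read it from the identity you just enabled (standard az command):
# Capture the managed identity's principal ID for use in the role assignments
az webapp identity show --name <your-app-name> --resource-group <your-rg> --query principalId -o tsv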
Step 2: Copy and Merge Files
Copy the AIPlaygroundCode folder:
cp -r AIPlaygroundCode/ /path/to/your-existing-app/
Merge Dependencies (Important!):
- requirements.txt: Merge the dependencies from this sample's requirements.txt with your existing requirements file
- wsgi.py: If you have an existing wsgi.py, ensure it properly references your Flask app instance
- app.py routes: Copy the AI-related routes from the sample app.py to your existing Flask application
Example requirements.txt merge:
# Your existing dependencies
flask==2.3.3
# ... your other dependencies
# Add these AI-related dependencies from the sample
azure-identity==1.15.0
openai==1.35.13
pillow==10.0.0
pydub==0.25.1
Step 3: Update App Service Configuration
Add AI configuration as App Service environment variables:
# Add Azure AI Foundry configuration as App Service environment variables
az webapp config appsettings set --name <your-app-name> --resource-group <your-rg> --settings \
AZURE_INFERENCE_ENDPOINT="https://your-project-name.region.models.ai.azure.com/models" \
AZURE_AI_CHAT_DEPLOYMENT_NAME="gpt-4o-mini" \
AZURE_AI_AUDIO_DEPLOYMENT_NAME="gpt-4o-mini-audio-preview" \
AZURE_CLIENT_ID="system-assigned-managed-identity"
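You can verify that the settings landed on the App Service with a quick listing (standard az command):
# Confirm the AI-related app settings are present
az webapp config appsettings list --name <your-app-name> --resource-group <your-rg> --output table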
Step 4: Add AI Settings Route to Your Flask App
Add the settings route to your existing Flask application:
# Add these imports to your existing app.py
from AIPlaygroundCode.config import get_model_config, update_model_config, is_configured

# Add AI configuration routes
@app.route('/settings')
def settings():
    from flask import render_template
    config = get_model_config()
    return render_template('AIPlaygroundCode/templates/settings.html', config=config)

@app.route('/settings', methods=['POST'])
def update_settings():
    # Copy the complete settings update logic from the sample app.py
    # This includes form handling, validation, and configuration updates
    pass
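For orientation, here is one possible shape for update_settings (replacing the placeholder pass above). It is only a sketch: the form field names are hypothetical, and update_model_config's actual signature should be checked against the sample's config module before use.
# Sketch only - the real validation/update logic lives in the sample app.py.
from flask import request, redirect, url_for

@app.route('/settings', methods=['POST'])
def update_settings():
    # Hypothetical form field names; match them to your settings.html template.
    new_config = {
        "endpoint": request.form.get("endpoint", "").strip(),
        "chat_model": request.form.get("chat_model", "").strip(),
        "audio_model": request.form.get("audio_model", "").strip(),
    }
    if new_config["endpoint"]:
        update_model_config(new_config)  # assumed to accept a dict; verify the real signature
    return redirect(url_for('settings'))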
Step 5: Add AI Interface to Your Application
Integrate the AI chat interface into your existing application pages by copying specific sections from AIPlaygroundCode/templates/retail_home.html:
Look for these markers in retail_home.html and copy to your templates:
- HTML Structure (copy the section marked with <!-- AI Chat Interface Start -->):
<!-- AI Chat Interface Start -->
<div id="chat-popup" class="chat-popup">
<!-- Complete chat popup structure -->
</div>
<!-- AI Chat Interface End -->
- CSS Styles (copy the section marked with /* AI Chat Styles Start */):
<style>
/* AI Chat Styles Start */
.chat-popup { /* ... */ }
/* AI Chat Styles End */
</style>
- JavaScript Functions (copy the section marked with // AI Chat JavaScript Start):
<script>
// AI Chat JavaScript Start
function toggleChat() { /* ... */ }
// AI Chat JavaScript End
</script>
Integration Steps:
- Copy each marked section from retail_home.html to your main application template
- Ensure the chat button appears on your pages (floating bottom-right)
- Test the chat interface opens and can send messages
- Verify file upload functionality works for multimodal features
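If you prefer not to paste the three sections into every page, a common Flask/Jinja approach is to keep them in a partial template and include it from your base layout; the file name below is hypothetical:
<!-- In your base template: pull in the chat widget markup, styles, and scripts -->
<!-- templates/ai_chat_widget.html is a hypothetical partial containing the copied sections -->
{% include "ai_chat_widget.html" %}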
What You Get After Integration:
- New Route: /settings for AI configuration
- Settings Page: Self-service configuration interface for Azure AI Foundry endpoints
- AI Chat Interface: Integrated chat functionality within your application pages
- Secure Configuration: Managed Identity authentication with Azure AI Foundry (no API keys required)
Resource Clean-up
To prevent incurring unnecessary charges, it's important to clean up your Azure resources after completing your work with the application.
When to Clean Up
- After you have finished testing or demonstrating the application
- If the application is no longer needed or you have transitioned to a different project
- When you have completed development and are ready to decommission the application
Removing Azure Resources
To delete all associated resources and shut down the application, execute the following command:
azd down
Please note that this process may take up to 10 minutes to complete.
Alternative: You can delete the resource group directly from the Azure Portal to clean up resources.
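Deleting the resource group from the CLI works as well (standard az command; this permanently removes everything in the group):
az group delete --name <your-resource-group> --yes --no-wait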
Guidance
Costs
Pricing varies by region and usage, so exact costs for your deployment can't be predicted in advance. The majority of Azure resources used in this infrastructure are on usage-based pricing tiers.
You can try the Azure pricing calculator for the resources:
Core Resources (always deployed):
- Azure App Service: Basic B2 tier with 3.5 GB RAM, 10 GB storage. Pricing
Azure AI Foundry Resources (deployed when "new" is chosen at the endpoint prompt):
- Azure AI Services: Multi-service account with consumption-based pricing for model usage (tokens). Pricing
- Azure AI Hub: Management layer for AI projects (minimal cost)
- Azure Storage Account: Standard LRS for AI project data and model artifacts (minimal cost). Pricing
Cost-saving tip: Choose "existing" when prompted about an Azure AI Foundry endpoint to reuse existing resources and avoid duplicate charges.
Cost Management: To avoid unnecessary costs, remember to clean up your resources when no longer needed by running azd down or deleting the resource group in the Azure Portal.
Security Guidelines
This template uses Managed Identity for secure authentication between Azure services.
Additional Security Measures to Consider:
- Enable Microsoft Defender for Cloud to secure your Azure resources
- Implement network security with Virtual Networks for App Service
- Configure Azure Web Application Firewall for additional protection
- Enable GitHub secret scanning in your repository
Important Security Notice
This template has been built to showcase Azure AI services and tools. We strongly recommend implementing additional security features before using this code in production environments.
Support and Feedback
- Issues: Report bugs or request features via GitHub Issues
- Questions: Use GitHub Discussions for implementation questions
- Rate the Sample: Star the repository if this helped your project
References
Implementation Guides
- GitHub Repository - Complete implementation guide and source code
- Project Structure - Learn the high-level constructs and architecture
- Configuration Guide - Understand configuration options and environment setup
- Testing Guide - Learn how to test the application locally and on Azure
- FAQ & Troubleshooting - Frequently asked questions and troubleshooting guide
Azure AI Foundry Documentation
- Chat Completions - Basic conversational AI implementation
- Reasoning Models - Advanced reasoning capabilities
- Multimodal AI - Image and audio processing
- Structured Outputs - JSON schema validation