We are happy to introduce Azure Container Apps, a Preview feature that enables you to run microservices and containerized applications on a serverless platform.
Because microservices are distributed by nature, you need to account for failures, retries, and timeouts in any system composed of them. While Container Apps provides the building blocks for running microservices, Dapr (Distributed Application Runtime) offers an even richer microservices programming model.
Azure Container Apps offers a fully managed version of the Dapr APIs when building microservices. When you use Dapr in Azure Container Apps, you can enable sidecars that run next to your microservices and provide a rich set of capabilities. Available Dapr APIs include service-to-service invocation, pub/sub, event bindings, state stores, and actors.
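To give a quick taste of the programming model before we dive into the sample: an application talks to its Dapr sidecar over plain HTTP. The snippet below is a minimal sketch of invoking an output binding through the sidecar's bindings API (mybinding is an illustrative component name, and the call assumes a Dapr sidecar is listening on DAPR_HTTP_PORT):
# Minimal sketch: invoke a Dapr output binding via the sidecar's HTTP bindings API.
# "mybinding" is an illustrative component name; a Dapr sidecar must be running for this to succeed.
import os
import requests

dapr_port = os.getenv("DAPR_HTTP_PORT", "3500")
resp = requests.post(
    "http://localhost:" + dapr_port + "/v1.0/bindings/mybinding",
    json={"operation": "create", "data": "hello"},
)
print(resp.status_code)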
In this blog, we demonstrate a sample Dapr application deployed in Azure Container Apps, which can:
- Read an input message from an Azure Storage Queue using the Dapr input binding feature
- Process the message with a Python application running inside a Docker container
- Output the result to an Azure Storage Blob using the Dapr output binding feature
Prerequisites
- Azure account with an active subscription
- Azure CLI
Ensure you're running the latest version of the CLI via the upgrade command: az upgrade
- Docker Hub account to store your Docker image
Deployment Steps Overview
The complete deployment can be separated into four steps:
- Create required Azure resources
- Build a Docker image that contains the Python application and Dapr component files
- Deploy Azure Container App with the Docker image and enable Dapr sidecar
- Run a test to confirm your Container App can read messages from the Storage Queue and push the processed result to the Storage Blob
Create required Azure resources
Create a PowerShell script named CreateAzureResource.ps1 with the following Azure CLI commands:
This script will:
- Install the Azure CLI Container Apps extension
- Create a Resource Group
- Create a Container Apps environment (the platform will also create a Log Analytics workspace)
- Create a Storage Account
- Create a Storage Queue (Dapr input binding source)
- Create a Storage Container (Dapr output binding target)
Note: Define your own values for the following variables at the top of the script.
For $LOCATION, only northeurope and canadacentral can be used during the Preview stage.
$RESOURCE_GROUP
$LOCATION
$CONTAINERAPPS_ENVIRONMENT
$AZURE_STORAGE_ACCOUNT
$STORAGE_ACCOUNT_QUEUE
$STORAGE_ACCOUNT_CONTAINER
$RESOURCE_GROUP=""
$LOCATION=""
$CONTAINERAPPS_ENVIRONMENT=""
$AZURE_STORAGE_ACCOUNT=""
$STORAGE_ACCOUNT_QUEUE=""
$STORAGE_ACCOUNT_CONTAINER=""
az login
Write-Host "====Install Extension===="
az extension add --name containerapp --upgrade
az provider register --namespace Microsoft.App
Write-Host "====Create Resource Group===="
az group create `
--name $RESOURCE_GROUP `
--location "$LOCATION"
Write-Host "====Create Container App ENV===="
az containerapp env create `
--name $CONTAINERAPPS_ENVIRONMENT `
--resource-group $RESOURCE_GROUP `
--location "$LOCATION"
Write-Host "====Create Storage Account===="
az storage account create `
--name $AZURE_STORAGE_ACCOUNT `
--resource-group $RESOURCE_GROUP `
--location "$LOCATION" `
--sku Standard_RAGRS `
--kind StorageV2
$AZURE_STORAGE_KEY=(az storage account keys list --resource-group $RESOURCE_GROUP --account-name $AZURE_STORAGE_ACCOUNT --query '[0].value' --out tsv)
echo $AZURE_STORAGE_KEY
Write-Host "====Create Storage Container===="
az storage queue create -n $STORAGE_ACCOUNT_QUEUE --fail-on-exist --account-name $AZURE_STORAGE_ACCOUNT --account-key $AZURE_STORAGE_KEY
Write-Host "====Create Stroge Queue===="
az storage container create -n $STORAGE_ACCOUNT_CONTAINER --fail-on-exist --account-name $AZURE_STORAGE_ACCOUNT --account-key $AZURE_STORAGE_KEY
Run the PowerShell script
PS C:\> Set-ExecutionPolicy RemoteSigned -Scope CurrentUser
PS C:\> .\CreateAzureResource.ps1
Build a Docker image that contains the Python application and Dapr components
All the source code is saved in the following structure:
|-Base Folder
|- Dockerfile
|- startup.sh
|- input.yaml
|- output.yaml
|- requirements.txt
|- app.py
Dockerfile
FROM python:3.9-slim-buster
WORKDIR /app
RUN apt-get update && apt-get -y install gcc g++
RUN pip3 install --upgrade pip
COPY . .
RUN pip3 install -r requirements.txt
# Make sure the startup script is executable (the exec bit can be lost when building from Windows)
RUN chmod +x /app/startup.sh
ENTRYPOINT ["/app/startup.sh"]
startup.sh
#!/bin/bash
echo "======starting Docker container======"
cd /app
python -u /app/app.py
input.yaml
<AZURE_STORAGE_ACCOUNT>, <STORAGE_ACCOUNT_CONTAINER>, <STORAGE_ACCOUNT_QUEUE> values can be found in your CreateAzureResource.ps1 script
<AZURE_STORAGE_KEY> value can be retrieved by:
$AZURE_STORAGE_KEY=(az storage account keys list --resource-group $RESOURCE_GROUP --account-name $AZURE_STORAGE_ACCOUNT --query '[0].value' --out tsv)
# dapr component name: queueinput
componentType: bindings.azure.storagequeues
version: v1
metadata:
- name: storageAccount
  value: "<AZURE_STORAGE_ACCOUNT>"
- name: storageAccessKey
  value: "<AZURE_STORAGE_KEY>"
- name: queue
  value: "<STORAGE_ACCOUNT_QUEUE>"
- name: ttlInSeconds
  value: "60"
- name: decodeBase64
  value: "true"
scopes:
- bindingtest
output.yaml
# dapr component name: bloboutput
componentType: bindings.azure.blobstorage
version: v1
metadata:
- name: storageAccount
  value: "<AZURE_STORAGE_ACCOUNT>"
- name: storageAccessKey
  value: "<AZURE_STORAGE_KEY>"
- name: container
  value: "<STORAGE_ACCOUNT_CONTAINER>"
- name: decodeBase64
  value: "true"
scopes:
- bindingtest
requirements.txt
dapr-dev
dapr-ext-grpc-dev
dapr-ext-fastapi-dev
gTTS
requests
Flask
app.py
It uses the following code to take input from the Dapr queueinput binding.
@app.route("/queueinput", methods=['POST'])
def incoming():
    incomingtext = request.get_data().decode()
It uses the following code to write output to the Dapr bloboutput binding.
url = 'http://localhost:'+daprPort+'/v1.0/bindings/bloboutput'
uploadcontents = '{ "operation": "create", "data": "'+ base64_message+ '", "metadata": { "blobName": "'+ outputfile+'" } }'
requests.post(url, data = uploadcontents)
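Side note: app.py builds the JSON payload by string concatenation. An equivalent approach is to let requests serialize a Python dict, which avoids manual escaping; a minimal sketch is below (upload_via_binding is a hypothetical helper, not part of the sample). The complete app.py follows.
# Alternative sketch: let requests serialize the JSON payload for the bloboutput binding.
# upload_via_binding is a hypothetical helper; app.py itself uses string concatenation.
import requests

def upload_via_binding(dapr_port, base64_message, outputfile):
    url = "http://localhost:" + dapr_port + "/v1.0/bindings/bloboutput"
    payload = {
        "operation": "create",
        "data": base64_message,
        "metadata": {"blobName": outputfile},
    }
    return requests.post(url, json=payload)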
import os
import datetime
import base64
import requests
from gtts import gTTS
from io import BytesIO
from flask import Flask, request

app = Flask(__name__)

# Ports exposed by the Dapr sidecar, injected as environment variables at runtime
daprPort = os.getenv('DAPR_HTTP_PORT')
daprGRPCPort = os.environ.get('DAPR_GRPC_PORT')
print('>>>>>>>>DAPR_HTTP_PORT : ' + daprPort)
print('>>>>>>>>DAPR_GRPC_PORT : ' + daprGRPCPort)

# The Dapr input binding posts each queue message to the route named after the component
@app.route("/queueinput", methods=['POST'])
def incoming():
    incomingtext = request.get_data().decode()
    print(">>>>>>>Message Received: " + incomingtext, flush=True)
    outputfile = "Msg_" + datetime.datetime.now().strftime("%Y%m%d-%H%M%S-%f") + ".mp3"
    base64_message = process_message(incomingtext, outputfile)
    # Invoke the bloboutput binding through the Dapr sidecar's HTTP API
    url = 'http://localhost:' + daprPort + '/v1.0/bindings/bloboutput'
    uploadcontents = '{ "operation": "create", "data": "' + base64_message + '", "metadata": { "blobName": "' + outputfile + '" } }'
    requests.post(url, data=uploadcontents)
    print('>>>>>>Audio uploaded to storage.', flush=True)
    return "Incoming message successfully processed!"

def process_message(incomingtext, outputfile):
    # Convert the text to speech and return the MP3 file as a base64 string
    tts = gTTS(text=incomingtext, lang='en', slow=False)
    tts.save(outputfile)
    print('>>>>>>>Audio saved to ' + outputfile, flush=True)
    fin = open(outputfile, "rb")
    binary_data = fin.read()
    fin.close()
    base64_encoded_data = base64.b64encode(binary_data)
    base64_message = base64_encoded_data.decode('utf-8')
    return base64_message

if __name__ == '__main__':
    app.run(host="localhost", port=6000, debug=False)
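If you want to sanity-check the text-to-speech and base64 logic before deploying, here is a minimal local sketch that calls process_message directly (no Dapr sidecar needed; it assumes app.py is in the current folder and that gTTS can reach its online service):
# Local smoke test for process_message from app.py (no Dapr sidecar required).
import base64
import os

# app.py prints the Dapr ports at import time, so provide dummy values for local use.
os.environ.setdefault("DAPR_HTTP_PORT", "3500")
os.environ.setdefault("DAPR_GRPC_PORT", "50001")

from app import process_message

encoded = process_message("Hello from a local test", "local_test.mp3")
mp3_bytes = base64.b64decode(encoded)
print("Generated " + str(len(mp3_bytes)) + " bytes of MP3 audio")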
Use the following Docker commands to create the Docker image and push it to Docker Hub.
cd "<the folder where you save your Dockerfile>"
$Dockerhubaccount="<your docker hub account name>"
docker build -t $Dockerhubaccount/daprbindingtest:v1 .
docker push $Dockerhubaccount/daprbindingtest:v1
Deploy Azure Container App with the Docker image and enable Dapr sidecar
Use "az containerapp env dapr-component set" command to set the input and output Dapr sidecar.
az containerapp env dapr-component set `
--name $CONTAINERAPPS_ENVIRONMENT --resource-group $RESOURCE_GROUP `
--dapr-component-name queueinput `
--yaml input.yaml
az containerapp env dapr-component set `
--name $CONTAINERAPPS_ENVIRONMENT --resource-group $RESOURCE_GROUP `
--dapr-component-name bloboutput `
--yaml output.yaml
Use "az containerapp create" command to create an App in your container app environment.
az containerapp create `
--name bindingtest `
--resource-group $RESOURCE_GROUP `
--environment $CONTAINERAPPS_ENVIRONMENT `
--image $Dockerhubaccount/daprbindingtest:v1 `
--target-port 6000 `
--ingress external `
--min-replicas 1 `
--max-replicas 1 `
--enable-dapr `
--dapr-app-port 6000 `
--dapr-app-id bindingtest
After you successfully run the containerapp create command, you should be able to see a new App Revision being provisioned.
In the Azure portal, go to Container App --> Logs and run the following query:
ContainerAppConsoleLogs_CL
| where RevisionName_s == "<your app revision name>"
| project TimeGenerated,RevisionName_s,Log_s
In the log, we should be able to see:
- Successful init for output binding bloboutput (the component name we set with --dapr-component-name)
- Successful init for input binding queueinput (the component name we set with --dapr-component-name)
- Application discovered on port 6000 (as defined in the Python application code)
- Dapr sending an OPTIONS request to /queueinput and getting an HTTP 200 response, which means the application is ready to take messages from the input Storage Queue
Run a test to confirm your Container App can read messages from the Storage Queue and push the processed result to the Storage Blob
Now, let's add a new message to the Storage Queue.
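One way to do this is a minimal sketch with the azure-storage-queue Python SDK (an extra package, not part of the app's requirements.txt; replace the placeholders with your own values). The Base64 encode policy matches the decodeBase64: "true" setting in input.yaml, so the message body arrives at the app as plain text.
# Minimal sketch: push a base64-encoded test message to the input queue.
# Requires: pip install azure-storage-queue
# <CONNECTION_STRING> and <STORAGE_ACCOUNT_QUEUE> are placeholders to replace.
from azure.storage.queue import QueueClient, TextBase64EncodePolicy

queue_client = QueueClient.from_connection_string(
    conn_str="<CONNECTION_STRING>",
    queue_name="<STORAGE_ACCOUNT_QUEUE>",
    message_encode_policy=TextBase64EncodePolicy(),  # matches decodeBase64: "true" in input.yaml
)
queue_client.send_message("Hello Dapr binding on Azure Container Apps")
print("Test message sent.")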
After the application has processed the message, we can find the output audio file in the output Storage Blob container.
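If you prefer to verify the generated audio files from code rather than the portal, here is a minimal sketch using the azure-storage-blob Python SDK (again an extra package; replace the placeholders with your own values):
# Minimal sketch: list the generated MP3 blobs in the output container.
# Requires: pip install azure-storage-blob
# <CONNECTION_STRING> and <STORAGE_ACCOUNT_CONTAINER> are placeholders to replace.
from azure.storage.blob import ContainerClient

container = ContainerClient.from_connection_string(
    conn_str="<CONNECTION_STRING>",
    container_name="<STORAGE_ACCOUNT_CONTAINER>",
)
for blob in container.list_blobs():
    print(blob.name, blob.size)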
To check the application log, go to Container App --> Logs in the portal and run the following query:
ContainerAppConsoleLogs_CL
| where RevisionName_s == "<your app revision name>"
| project TimeGenerated,RevisionName_s,Log_s