In a previous blog I wrote about Cognitive Services in containers, and we walked through an example of how to deploy a Speech container.
In this blog we are going to dive into the different ways you can run Cognitive Services in a container on Azure. First we look at the different options for running a container on Azure, and then we investigate a few scenarios for how you might want to run your application.
What are containers?
Containerization is an approach to software distribution in which an application or service, including its dependencies & configuration, is packaged together as a container image. With little or no modification, a container image can be deployed on a container host. Containers are isolated from each other and the underlying operating system, with a smaller footprint than a virtual machine. Containers can be instantiated from container images for short-term tasks and removed when no longer needed.
Read more on Microsoft Docs
First, let us find out which options you can choose from when hosting a container solution on Azure. We will look at four different options. For every option there is a code sample using the Azure CLI.
To follow along with the code samples you need to create a Text Analytics endpoint in Azure first.
```
# Create a resource group called demo_rg in the region westeurope
az group create --name demo_rg --location westeurope

# Create a Text Analytics Cognitive Services endpoint
az cognitiveservices account create \
  --name textanalytics-resource \
  --resource-group demo_rg \
  --kind TextAnalytics \
  --sku F0 \
  --location westeurope

# Get the endpoint URL
az cognitiveservices account show --name textanalytics-resource --resource-group demo_rg --query properties.endpoint -o json

# Get the API keys
az cognitiveservices account keys list --name textanalytics-resource --resource-group demo_rg
```
You can run the containers directly from the Microsoft Container Registry, or create a reusable container and store it in your own container registry.
This sample shows how to run the Text Analytics container locally from the Microsoft Container Registry.
```
# Run the container
docker run --rm -it -p 5000:5000 --memory 8g --cpus 1 \
mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment \
Eula=accept \
Billing={ENDPOINT_URI} \
ApiKey={API_KEY}
# Test the container
curl -X POST "http://localhost:5000/text/analytics/v3.0/sentiment" \
  -H "Content-Type: application/json" \
  -d "{\"documents\":[{\"language\":\"en\",\"id\":\"1-en\",\"text\":\"Hello beautiful world.\"}]}"
```
You can use container recipes to create Cognitive Services containers that can be reused. Containers can be built with some or all of the configuration settings baked in, so that they are not needed when the container is started.
Once you have this new container layer (with settings) and have tested it locally, you can store the container in a container registry. When the container starts, it will only need the settings that are not already stored in the image; you pass those remaining settings in at startup.
```
# Create a Dockerfile
FROM mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment:latest
ENV billing={BILLING_ENDPOINT}
ENV apikey={ENDPOINT_KEY}
ENV EULA=accept
```

```
# Build the Docker image
docker build -t <your-image-name> .

# Run your container
docker run --rm -it -p 5000:5000 --memory 8g --cpus 1 <your-image-name>

# Test the container
curl -X POST "http://localhost:5000/text/analytics/v3.0/sentiment" \
  -H "Content-Type: application/json" \
  -d "{\"documents\":[{\"language\":\"en\",\"id\":\"1-en\",\"text\":\"Hello beautiful world.\"}]}"
```
Now that you have your own container image, you can push it to your own container registry.
```
# Create the container registry
az acr create --resource-group demo_rg --name henksregistry --sku Basic --admin-enabled true

# Get the username
az acr credential show -n henksregistry --query username

# Get the password
az acr credential show -n henksregistry --query "passwords[0].value"

# Get the registry login server
az acr show -n henksregistry --query loginServer

# Sign in to your private registry with the details from the steps above
docker login henksregistry.azurecr.io

# Tag your image
docker tag <your-image-name> henksregistry.azurecr.io/<your-image-name>

# Push your image to the registry
docker push henksregistry.azurecr.io/<your-image-name>
```
Now you have a Text Analytics Container in your own private Azure Container Registry.
Read more on Docs
- Install and run Text Analytics containers
- Create containers for reuse
- Create a private container registry using the Azure CLI
Let us go to the next step and see how we can deploy these containers across the different offerings, and what the benefits of each are.
The easiest option to run a container is Azure Container Instances, also referred to as the serverless container option. It can run Linux and Windows containers. If you use Linux-based containers, you can run multiple containers in a container group, mount volumes from Azure Files, monitor them with Azure Monitor, and use a GPU.
Azure Container Instances is a great solution for any scenario that can operate in isolated containers, including simple applications, task automation, and build jobs. You can run them securely in a virtual network and use Azure Traffic Manager to distribute traffic across multiple instances.
Read more about Azure Container Instances
- Azure Container Instances documentation
- Azure Container Instances (ACI) across 3 regions in under 30 seconds with Azure Traffic Manager by A...
- What is Traffic Manager?
This sample shows how you can deploy the sentiment analysis container to an Azure Container Instance.
```
az container create \
  --resource-group <insert resource group name> \
  --name <insert container name> \
  --dns-name-label <insert unique name> \
  --memory 2 --cpu 1 \
  --ports 5000 \
  --image mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment:3.0-en \
  --environment-variables \
    Eula=accept \
    Billing=<insert endpoint> \
    ApiKey=<insert apikey>
```
Azure App Service is a great way to host a simple container solution; it offers more out-of-the-box functionality than Azure Container Instances. All the features of an App Service, like autoscale, are available, and it integrates easily with services like Azure Front Door and Traffic Manager to perform traffic routing and load balancing.
App Service not only adds the power of Microsoft Azure to your application, such as security, load balancing, autoscaling, and automated management; you can also take advantage of its DevOps capabilities, such as continuous deployment from Azure DevOps, GitHub, Docker Hub, and other sources, package management, staging environments, custom domains, and TLS/SSL certificates.
Azure App Service is great for your containerized solutions if you are looking for an easy-to-use service that automatically scales when traffic increases and integrates with other Azure services for traffic management, like Traffic Manager and Azure Front Door.
Read more about Azure App Service on Microsoft Docs and Azure.com.
- App Service documentation
- Web App for Containers
This sample shows how you can deploy the sentiment analysis container to an Azure App Service.
```
# Create an App Service plan
az appservice plan create \
  --name <app-name> \
  --resource-group demo_rg \
  --sku S1 --is-linux

# Create the webapp with the private container (deployment can take a while)
az webapp create \
  --name <webapp-name> \
  --plan <app-name> \
  --resource-group demo_rg \
  -i <registryname>.azurecr.io/<your-image-name>
```
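One thing to watch out for: the sentiment container listens on port 5000, while App Service routes traffic to port 80 by default. If the webapp does not respond, you may need to point App Service at the container's port via the `WEBSITES_PORT` app setting. A sketch, reusing the placeholder names from the commands above:

```
# Tell App Service which port the container listens on
az webapp config appsettings set \
  --name <webapp-name> \
  --resource-group demo_rg \
  --settings WEBSITES_PORT=5000
```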
curl -X POST "http://<webapp-name>.azurewebsites.net/text/analytics/v3.0/sentiment"
-H "Content-Type: application/json"
-d "{\"documents\":[{\"language\":\"en\",\"id\":\"1-en\",\"text\":\"Hello beautiful world.\"}]}"
The options above are the quickest way to get started with containers on Azure. But some solutions require advanced orchestration, or need to run on edge devices in disconnected scenarios. For these scenarios Azure offers Azure Kubernetes Service and Azure IoT Edge.
Deploy and manage containerized applications more easily with a fully managed Kubernetes service. Azure Kubernetes Service (AKS) offers serverless Kubernetes, an integrated continuous integration and continuous delivery (CI/CD) experience, and enterprise-grade security and governance. Unite your development and operations teams on a single platform to rapidly build, deliver, and scale applications with confidence.
Kubernetes is a rapidly evolving platform that manages container-based applications and their associated networking and storage components. Kubernetes focuses on the application workloads, not the underlying infrastructure components. Kubernetes provides a declarative approach to deployments, backed by a robust set of APIs for management operations.
You can build and run modern, portable, microservices-based applications, using Kubernetes to orchestrate and manage the availability of the application components. Kubernetes supports both stateless and stateful applications as teams progress through the adoption of microservices-based applications.
As an open platform, Kubernetes allows you to build your applications with your preferred programming language, OS, libraries, or messaging bus. Existing continuous integration and continuous delivery (CI/CD) tools can integrate with Kubernetes to schedule and deploy releases.
AKS provides a managed Kubernetes service that reduces the complexity of deployment and core management tasks, like upgrade coordination. The Azure platform manages the AKS control plane, and you only pay for the AKS nodes that run your applications.
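Following the pattern of the earlier samples, this sketch shows how the sentiment container could run on AKS. The cluster and deployment names are placeholders, and it assumes the demo_rg resource group created at the start of this blog:

```
# Create a basic AKS cluster
az aks create --resource-group demo_rg --name demo-aks --node-count 1 --generate-ssh-keys

# Get credentials so kubectl talks to the new cluster
az aks get-credentials --resource-group demo_rg --name demo-aks

# Run the sentiment container as a deployment
kubectl create deployment sentiment \
  --image=mcr.microsoft.com/azure-cognitive-services/textanalytics/sentiment:3.0-en

# Pass in the required container settings and expose port 5000
kubectl set env deployment/sentiment Eula=accept Billing=<insert endpoint> ApiKey=<insert apikey>
kubectl expose deployment sentiment --type=LoadBalancer --port=5000
```

In a real deployment you would describe the deployment and service in a Kubernetes manifest instead of imperative kubectl commands, but the flow is the same: image, settings, exposed port.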
Azure IoT Edge is a fully managed service built on Azure IoT Hub. Deploy your cloud workloads—artificial intelligence, Azure and third-party services, or your own business logic—to run on Internet of Things (IoT) edge devices via standard containers. By moving certain workloads to the edge of the network, your devices spend less time communicating with the cloud, react more quickly to local changes, and operate reliably even in extended offline periods.
Azure IoT Edge moves cloud analytics and custom business logic to devices so that your organization can focus on business insights instead of data management. Scale out your IoT solution by packaging your business logic into standard containers, then you can deploy those containers to any of your devices and monitor it all from the cloud.
Analytics drives business value in IoT solutions, but not all analytics needs to be in the cloud. If you want to respond to emergencies as quickly as possible, you can run anomaly detection workloads at the edge. If you want to reduce bandwidth costs and avoid transferring terabytes of raw data, you can clean and aggregate the data locally then only send the insights to the cloud for analysis.
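On IoT Edge the same container runs as a module, declared in a deployment manifest. A sketch of pushing such a deployment to a device with the Azure CLI (requires the azure-iot CLI extension; the hub and device names are placeholders, and deployment.json is assumed to be a standard IoT Edge deployment manifest listing the sentiment image and its Eula/Billing/ApiKey environment variables):

```
# Deploy the modules described in deployment.json to an edge device
az iot edge set-modules \
  --hub-name <iot-hub-name> \
  --device-id <device-id> \
  --content ./deployment.json
```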
If you want to learn more about IoT Edge you can watch the video "IoT ELP Module 3 Adding Intelligence, Unlocking New Insights with AI & ML"
There are also other services on Azure that can run containers, but might need some work to get the Cognitive Services containers running on them:
- Azure Batch
- Azure Function App
- Azure Service Fabric
Continue your learning journey and get skilled up on all things Azure AI!