
Apps on Azure Blog

Building Agents on Azure Container Apps with Goose AI Agent, Ollama and gpt-oss

simonjj
Oct 13, 2025

Azure Container Apps (ACA) is redefining how developers build and deploy intelligent agents. With serverless scale, GPU-on-demand, and enterprise-grade isolation, ACA provides the ideal foundation for hosting AI agents securely and cost-effectively.

Last month we highlighted how you can deploy n8n on Azure Container Apps to go from click-to-build to a running AI-based automation platform in minutes, with no complex setup or infrastructure management overhead. In this post, we’re extending that same simplicity to AI agents and showing why Azure Container Apps is the best platform for running open-source agentic frameworks like Goose. Whether you’re experimenting with open-source models or building enterprise-grade automation, ACA gives you the flexibility and security you need.

Challenges when building and hosting AI agents

Building and running AI agents in production presents its own set of challenges. These systems often need access to proprietary data and internal APIs, making security and data governance critical, especially when agents interact dynamically with multiple tools and models. At the same time, developers need flexibility to experiment with different frameworks without introducing operational overhead or losing isolation.

Simplicity and performance are also key. Managing scale, networking, and infrastructure can slow down iteration, while separating the agent’s reasoning layer from its inference backend can introduce latency and added complexity from managing multiple services. In short, AI agent development requires security, simplicity, and flexibility to ensure reliability and speed at scale.

Why ACA and serverless GPUs for hosting AI agents

Azure Container Apps provides a secure, flexible, and developer-friendly platform for hosting AI agents and inference workloads side by side within the same ACA environment. This unified setup gives you centralized control over network policies, RBAC, observability, and more, while ensuring that both your agentic logic and model inference run securely within one managed boundary.

ACA also provides the following key benefits:

  • Security and data governance: Your agent runs in your private, fully isolated environment, with complete control over identity, networking, and compliance. Your data never leaves the boundaries of your container.
  • Serverless economics: Scale automatically to zero when idle, pay only for what you use — no overprovisioning, no wasted resources.
  • Developer simplicity: One-command deployment, integrated with Azure identity and networking. No extra keys, infrastructure management, or manual setup are required.
  • Inferencing flexibility with serverless GPUs: Bring any open-source, community, or custom model. Run your inferencing apps on serverless GPUs alongside your agentic applications within the same environment. For example, running gpt-oss models via Ollama inside ACA containers avoids costly hosted inference APIs and keeps sensitive data private (see the sketch below).
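
To make that last point concrete, here is a minimal sketch of how an agent process might call a gpt-oss model served by Ollama in the same Container Apps environment, whether as a sidecar or as a separate container app with internal ingress. The OLLAMA_BASE_URL variable and the gpt-oss:20b model tag are illustrative assumptions, not values from any template; the request itself uses Ollama's standard /api/chat endpoint.

```python
# Minimal sketch: an agent calling a gpt-oss model served by Ollama
# in the same Azure Container Apps environment. The endpoint and model
# tag below are illustrative assumptions; substitute your own values.
import os
import requests

# Base URL of the Ollama endpoint, e.g. a sidecar on localhost or the
# internal address of a separate Ollama container app.
OLLAMA_BASE_URL = os.environ.get("OLLAMA_BASE_URL", "http://localhost:11434")
MODEL = os.environ.get("OLLAMA_MODEL", "gpt-oss:20b")  # assumed gpt-oss variant

def ask(prompt: str) -> str:
    """Send one chat turn to the model and return the assistant's reply."""
    resp = requests.post(
        f"{OLLAMA_BASE_URL}/api/chat",
        json={
            "model": MODEL,
            "messages": [{"role": "user", "content": prompt}],
            "stream": False,  # return a single JSON response instead of a stream
        },
        timeout=120,
    )
    resp.raise_for_status()
    return resp.json()["message"]["content"]

if __name__ == "__main__":
    print(ask("In one sentence, why keep inference next to the agent?"))
```

Because both containers live inside the same environment, the prompt and the model's response never leave your network boundary, which is exactly the data-governance property described above.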

These capabilities let teams focus on innovation, not infrastructure, making ACA a natural choice for building intelligent agents.

Deploy the Goose AI Agent to ACA

The Goose AI Agent, developed by Block, is an open-source, general-purpose agent framework designed for quick deployment and easy customization. Out of the box, it supports features like email integration, GitHub interactions, and local CLI and system tool access.

It’s great for building ready-to-run AI assistants that can connect to other systems, with a modular design that makes customization simple on top of great defaults out of the box. By deploying Goose on ACA, you gain all the benefits of serverless scale, secure isolation, and GPU-on-demand, while maintaining the ability to customize and iterate quickly.

Get started: Deploy Goose on Azure Container Apps using this open-source starter template.

In just a few minutes, you’ll have a private, self-contained AI agent running securely on Azure Container Apps, ready to handle real-world workloads without compromise.
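
As a quick sanity check after deployment, a short script like the sketch below can confirm that the pieces are reachable. It is not part of the starter template: the GOOSE_UI_URL and OLLAMA_BASE_URL values are assumed to come from your own deployment outputs, and the model listing uses Ollama's standard /api/tags endpoint. If the model server only has internal ingress, run the check from inside the environment.

```python
# Illustrative post-deployment smoke test (not part of the starter template).
# Set GOOSE_UI_URL and OLLAMA_BASE_URL to the URLs from your deployment.
import os
import requests

GOOSE_UI_URL = os.environ["GOOSE_UI_URL"]        # e.g. the agent's web UI endpoint
OLLAMA_BASE_URL = os.environ["OLLAMA_BASE_URL"]  # e.g. the Ollama model server endpoint

# 1. The Goose web UI should answer HTTP requests.
ui = requests.get(GOOSE_UI_URL, timeout=30)
print("Web UI status:", ui.status_code)

# 2. The model server should list the model(s) pulled during deployment.
tags = requests.get(f"{OLLAMA_BASE_URL}/api/tags", timeout=30)
tags.raise_for_status()
print("Models available:", [m["name"] for m in tags.json().get("models", [])])
```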

Goose running on Azure Container Apps adding some content to a README, submitting a PR and sending a summary email to the team.

Additional benefits of running Goose on ACA

Running the Goose AI Agent on Azure Container Apps (ACA) showcases how simple and powerful hosting AI agents can be.

  • Always available: Goose can run continuously—handling long-lived or asynchronous workloads for hours or days—without tying up your local machine.
  • Cost efficiency: ACA’s pay-per-use, serverless GPU model eliminates high per-call inference costs, making it ideal for sustained or compute-intensive workloads.
  • Seamless developer experience: The Goose-on-ACA starter template sets up everything for you—model server, web UI, and CLI endpoints—with no manual configuration required.

With ACA, you can go from concept to a fully running agent in minutes, without compromising on security, scalability, or cost efficiency.

Part of a growing ecosystem of agentic frameworks on ACA

ACA is quickly becoming the go-to platform for containerized AI and agentic workloads. From n8n and Goose to other emerging open-source and commercial agent frameworks, developers can use ACA to experiment, scale, and secure their agents, all while taking advantage of serverless scale, GPU-on-demand, and complete network isolation.

It’s the same developer-first workflow that powers modern applications, now extended to intelligent agents.  Whether you’re building a single agent or an entire automation ecosystem, ACA provides the flexibility and reliability you need to innovate faster.

Updated Oct 14, 2025
Version 3.0