serverless
Introducing Azure AI Travel Agents: A Flagship MCP-Powered Sample for AI Travel Solutions
We are excited to introduce AI Travel Agents, a sample application with enterprise functionality that demonstrates how developers can coordinate multiple AI agents (written in multiple languages) to explore travel planning scenarios. It's built with LlamaIndex.TS for agent orchestration, Model Context Protocol (MCP) for structured tool interactions, and Azure Container Apps for scalable deployment.

TL;DR: Experience the power of MCP and Azure Container Apps with The AI Travel Agents! Try the live demo locally on your computer for free to see real-time agent collaboration in action. Share your feedback on our community forum. We're already planning enhancements, such as new MCP-integrated agents, secure communication between the AI agents and MCP servers, and more.

NOTE: This example uses mock data and is intended for demonstration purposes rather than production use.

The Challenge: Scaling Personalized Travel Planning

Travel agencies grapple with complex tasks: analyzing diverse customer needs, recommending destinations, and crafting itineraries, all while integrating real-time data like trending spots or logistics. Traditional systems falter on latency, scalability, and coordination, leading to delays and frustrated clients. The AI Travel Agents tackles these issues with a technical trifecta:

- LlamaIndex.TS orchestrates six AI agents for efficient task handling.
- MCP equips agents with travel-specific data and tools.
- Azure Container Apps ensures scalable, serverless deployment.

This architecture delivers operational efficiency and personalized service at scale, transforming chaos into opportunity.

LlamaIndex.TS: Orchestrating AI Agents

The heart of The AI Travel Agents is LlamaIndex.TS, a powerful agentic framework that orchestrates multiple AI agents to handle travel planning tasks. Built on a Node.js backend, LlamaIndex.TS manages agent interactions in a seamless and intelligent manner:

- Task Delegation: The Triage Agent analyzes queries and routes them to specialized agents, like the Itinerary Planning Agent, ensuring efficient workflows.
- Agent Coordination: LlamaIndex.TS maintains context across interactions, enabling coherent responses for complex queries, such as multi-city trip plans.
- LLM Integration: Connects to Azure OpenAI, GitHub Models, or any local LLM via Foundry Local for advanced AI capabilities.

LlamaIndex.TS's modular design supports extensibility, allowing new agents to be added with ease. LlamaIndex.TS is the conductor, ensuring agents work in sync to deliver accurate, timely results. Its lightweight orchestration minimizes latency, making it ideal for real-time applications.

MCP: Fueling Agents with Data and Tools

The Model Context Protocol (MCP) empowers AI agents by providing travel-specific data and tools, enhancing their functionality. MCP acts as a data and tool hub:

- Real-Time Data: Supplies up-to-date travel information, such as trending destinations or seasonal events, via the Web Search Agent using Bing Search.
- Tool Access: Connects agents to external tools, like the .NET-based customer query analyzer for sentiment analysis, the Python-based itinerary planner for trip schedules, or the destination recommendation tools written in Java.

For example, when the Destination Recommendation Agent needs current travel trends, MCP delivers them via the Web Search Agent. This modularity allows new tools to be integrated seamlessly, future-proofing the platform. MCP's role is to enrich agent capabilities, leaving orchestration to LlamaIndex.TS.
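To make the orchestration flow more concrete, here is a minimal TypeScript sketch of the triage-and-delegate pattern described above. It is a conceptual illustration only: the agent names, the Agent interface, and the keyword-based triage function are hypothetical simplifications, not the actual LlamaIndex.TS API used in the sample (where the Triage Agent itself uses an LLM to decide the route).

```typescript
// Conceptual sketch of triage-based delegation (hypothetical types, not the sample's real API).
type AgentName =
  | "customer-query"
  | "destination-recommendation"
  | "itinerary-planning"
  | "web-search";

interface Agent {
  name: AgentName;
  // Each specialized agent wraps its MCP-provided tools behind a single entry point.
  handle(query: string): Promise<string>;
}

// The triage step inspects the query and picks a specialized agent to delegate to.
// In the real sample this decision is made by an LLM, not by keyword matching.
function triage(query: string, agents: Map<AgentName, Agent>): Agent {
  const q = query.toLowerCase();
  if (q.includes("itinerary") || q.includes("schedule")) return agents.get("itinerary-planning")!;
  if (q.includes("recommend") || q.includes("where should")) return agents.get("destination-recommendation")!;
  if (q.includes("trending") || q.includes("events")) return agents.get("web-search")!;
  return agents.get("customer-query")!;
}

async function answer(query: string, agents: Map<AgentName, Agent>): Promise<string> {
  const delegate = triage(query, agents);
  // The orchestrator (LlamaIndex.TS in the sample) would also carry conversation context here.
  return delegate.handle(query);
}
```

The point of the sketch is the shape of the flow: a single entry point, a routing decision, and specialized agents whose capabilities come from MCP tools.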
Azure Container Apps: Scalability and Resilience

Azure Container Apps powers The AI Travel Agents sample application with a serverless, scalable platform for deploying microservices. It ensures the application handles varying workloads with ease:

- Dynamic Scaling: Automatically adjusts container instances based on demand, managing booking surges without downtime.
- Polyglot Microservices: Supports .NET (Customer Query), Python (Itinerary Planning), Java (Destination Recommendation), and Node.js services in isolated containers.
- Observability: Integrates tracing, metrics, and logging, enabling real-time monitoring.
- Serverless Efficiency: Abstracts infrastructure, reducing costs and accelerating deployment.

Azure Container Apps' global infrastructure delivers low-latency performance, critical for travel agencies serving clients worldwide.

The AI Agents: A Quick Look

While MCP and Azure Container Apps are the stars, they support a team of AI agents that drive the application's functionality. Built and orchestrated with LlamaIndex.TS and connected to their tools via MCP, these agents collaborate to handle travel planning tasks:

- Triage Agent: Directs queries to the right agent, leveraging MCP for task delegation.
- Customer Query Agent: Analyzes customer needs (emotions, intents), using .NET tools.
- Destination Recommendation Agent: Suggests tailored destinations, using Java.
- Itinerary Planning Agent: Crafts efficient itineraries, powered by Python.
- Web Search Agent: Fetches real-time data via Bing Search.

These agents rely on MCP's real-time communication and Azure Container Apps' scalability to deliver responsive, accurate results. Note, though, that this sample application uses mock data for demonstration purposes. In a real-world scenario, the application would communicate with MCP servers plugged into real production travel APIs.

Key Features and Benefits

The AI Travel Agents offers features that showcase the power of MCP and Azure Container Apps:

- Real-Time Chat: A responsive Angular UI streams agent responses via MCP's SSE, ensuring fluid interactions (a sketch of consuming such a stream follows below).
- Modular Tools: MCP enables tools like analyze_customer_query to integrate seamlessly, supporting future additions.
- Scalable Performance: Azure Container Apps ensures the UI, the backend, and the MCP servers handle high traffic effortlessly.
- Transparent Debugging: An accordion UI displays agent reasoning, providing backend insights.

Benefits:

- Efficiency: LlamaIndex.TS streamlines operations.
- Personalization: MCP's data drives tailored recommendations.
- Scalability: Azure ensures reliability at scale.
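As a rough illustration of the real-time chat feature listed above, the following TypeScript sketch shows how a browser client could consume a server-sent events (SSE) stream of agent responses. The /api/chat/stream endpoint and the payload shape are assumptions for illustration, not the sample's actual API.

```typescript
// Minimal browser-side sketch of consuming an SSE stream (assumed endpoint and payload shape).
const source = new EventSource("/api/chat/stream?sessionId=demo");

source.onmessage = (event: MessageEvent<string>) => {
  // Assumed payload: { agent: string; delta: string } emitted as each agent streams its answer.
  const chunk = JSON.parse(event.data) as { agent: string; delta: string };
  console.log(`[${chunk.agent}] ${chunk.delta}`);
};

source.onerror = () => {
  // EventSource reconnects automatically; close it explicitly once the session is done.
  source.close();
};
```

Streaming over SSE is what keeps the chat UI responsive while several agents work on a request in the background.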
Thank You to Our Contributors!

The AI Travel Agents wouldn't exist without the incredible work of our contributors. Their expertise in MCP development, Azure deployment, and AI orchestration brought this project to life. A special shoutout to:

- Pamela Fox – Leading the development of the Python MCP server.
- Aaron Powell and Justin Yoo – Leading the development of the .NET MCP server.
- Rory Preddy – Leading the development of the Java MCP server.
- Lee Stott and Kinfey Lo – Leading the development of the Foundry Local (local AI) integration.
- Anthony Chu and Vyom Nagrani – Leading the Azure Container Apps roadmap.
- Matt Soucoup and Julien Dubois – Leading the ACA DevRel strategy.
- Wassim Chegham – Architected the MCP integration and backend orchestration.

And many more! See the GitHub repository for all contributors. Thank you for your dedication to pushing the boundaries of AI and cloud technology!

Try It Out

Experience the power of MCP and Azure Container Apps with The AI Travel Agents! Try the live demo locally on your computer for free to see real-time agent collaboration in action.

Conclusion

Developers can explore the open-source project on GitHub today, complete with setup and deployment instructions. Share your feedback on our community forum. We're already planning enhancements, such as new MCP-integrated agents, secure communication between the AI agents and MCP servers, and more. This is still a work in progress, and we welcome all kinds of contributions. Please fork and star the repo to stay tuned for updates! We would love your feedback; continue the discussion in the Azure AI Foundry Discord: aka.ms/foundry/discord. On behalf of the Microsoft DevRel Team.

Create a Database Schema and REST APIs with a Single Prompt Using GitHub Copilot in VS Code
The Age of Prompt-Driven Development

A significant shift is underway in the way we develop software. AI agents and prompt-based tools are shaping modern development. As a developer, you don't want to miss this shift: knowing how to use these tools puts you ahead. Instead of writing endless boilerplate, you can now describe what you want, and AI will generate code, create your database, connect APIs, and even deploy your app. New tools like Cursor, Windsurf, Lovable, and Bolt are rising fast, and you can create stunning apps and websites by chatting with AI.

Even with all these fancy tools, full-stack apps still need a solid backend, and that means data. Every application needs to work with data. Whether you're building a blog, a booking platform, or an AI agent, you'll need to store and retrieve information. That usually means using a real database like PostgreSQL, MySQL, or MongoDB (unless you're treating Excel or Google Sheets like a backend, which... we've all done once). So schema design, database setup, and API generation can't be skipped.

I decided to experiment with automating the process of designing a database schema, running a database, and managing data using just prompts with GitHub Copilot in VS Code. Working with databases is often repetitive work that slows developers down. These are the issues we typically face when setting up a database from scratch.

Every App Needs a Database: It's Time to Simplify It

You start with a manual schema setup
You have to create tables and think through relationships, indexes, data types, and naming. You map tables to objects using ORM libraries and build APIs to access that data. It's easy to miss things or overcomplicate at an early stage.

Schema changes are painful
Your app evolves. You rename a column, split a table, or add a new relation. Now you need to write migrations, update your ORM, avoid downtime, and hope nothing breaks in staging or production.

Every change triggers more boilerplate
Once the schema changes, you usually:

- Update model files
- Fix serializers or DTOs
- Rewrite REST API endpoints or GraphQL resolvers
- Modify test data and fixtures

That's a lot of work for just one change.

Team coordination becomes tricky
In team projects, syncing schema changes between developers often leads to merge conflicts, broken migrations, or inconsistent environments.

But now? With the rise of AI code generation tools like GitHub Copilot, you can extend Copilot Chat with the Model Context Protocol (MCP) from external providers and create a fully working database schema with a single prompt, right inside VS Code. And it'll save you hours every week. Let me show you how you can achieve this.

Let's Build: A Travel Agency App Schema

What You Need

- VS Code (with GitHub Copilot enabled)
- UV installed
- GibsonAI – sign up for a free account. This tool turns your prompt into a complete schema, deploys a serverless database, and gives you a live REST API for managing data.

Step 1: Set Up GibsonAI CLI and Log In

Before using the GibsonAI MCP server, install GibsonAI's CLI and log in:

    uvx --from gibson-cli@latest gibson auth login

This logs you into your GibsonAI account so you can start using all CLI features.

Step 2: Enable MCP Server in VS Code

To use the GibsonAI MCP server inside your VS Code project, you'll need to add a configuration file. Create a file called mcp.json in the .vscode/ folder of your project (or of an empty folder). This file defines which GibsonAI MCP server to use for this project.
Copy and paste the following content into the .vscode/mcp.json file:

    {
      "inputs": [],
      "servers": {
        "gibson": {
          "type": "stdio",
          "command": "uvx",
          "args": ["--from", "gibson-cli@latest", "gibson", "mcp", "run"]
        }
      }
    }

Once this file is added, GibsonAI tools inside VS Code will connect to the MCP server.

Step 3: Describe Your Travel App Schema in a Prompt

Open GitHub Copilot Chat in VS Code, switch to Agent mode, and select an LLM model, such as GPT-4.1 or GPT-4o. You should see the available tools from GibsonAI. Then enter a prompt like this:

"Create a database for a travel agency. It should include tables for destinations, bookings, users, and reviews. Each user can make bookings and write reviews. Each destination has a name, description, price, and rating."

GibsonAI reads your prompt, creates a new database project, and magically generates:

- A complete relational schema
- A visual Entity-Relationship Diagram (ERD)
- Proper foreign key constraints
- UUIDs, timestamps, and standard fields
- A clean MySQL or Postgres structure

Step 4: Deploy Your Schema and Enable CRUD APIs

Go to the GibsonAI app, log in, and open your newly created project. There, you can see and review the schema. Now you can click "Deploy" to launch your schema. Alternatively, you can use Copilot Chat to deploy the database. GibsonAI hosts the serverless MySQL database. You can now get the database connection string and connect it to your existing app, or access the live CRUD APIs and use them in your app. You now have a working backend without writing a single SQL query. You can plug these APIs directly into your frontend or backend, with no need to write REST controllers for typical CRUD operations.

GibsonAI also lets you share your database project schema with others. Feel free to clone the travel agency database I created for the demo: https://app.gibsonai.com/clone/rRZ4wD9HDCdHO
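To give an idea of what consuming such generated CRUD endpoints could look like, here is a small TypeScript sketch that lists destinations and creates a booking over REST. The base URL, paths, payload fields, and API-key header are placeholders rather than GibsonAI's actual API surface; use the endpoints and authentication details shown in your own project.

```typescript
// Hypothetical client for the generated travel-agency CRUD API (placeholder URL, paths, and auth).
const BASE_URL = "https://api.example.com/v1"; // replace with your project's API base URL
const API_KEY = process.env.TRAVEL_API_KEY ?? ""; // replace with your project's API key

async function listDestinations(): Promise<unknown[]> {
  const res = await fetch(`${BASE_URL}/destinations`, {
    headers: { "X-API-Key": API_KEY },
  });
  if (!res.ok) throw new Error(`GET /destinations failed: ${res.status}`);
  return (await res.json()) as unknown[];
}

async function createBooking(userId: string, destinationId: string): Promise<unknown> {
  const res = await fetch(`${BASE_URL}/bookings`, {
    method: "POST",
    headers: { "Content-Type": "application/json", "X-API-Key": API_KEY },
    body: JSON.stringify({ user_id: userId, destination_id: destinationId }),
  });
  if (!res.ok) throw new Error(`POST /bookings failed: ${res.status}`);
  return res.json();
}
```

With endpoints like these, a frontend can read and write data directly, which is why no hand-written REST controllers are needed for the typical CRUD paths.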
Step 5: Let Copilot Help You Build Around the API

Now that your schema and API are live, use GitHub Copilot to build UI components with React or any other frontend framework. GitHub Copilot + GibsonAI MCP = the fastest way to go from prompt to full-featured app.

Final Thoughts

The future of development is not about producing more AI-generated code. It's about writing fewer, smarter prompts and letting AI handle the slow, repetitive, or painful tasks so you can focus fully on innovation. You can already boost your development workflow with GitHub Copilot Agent Mode. It provides a powerful set of tools that enable agents to run SQL queries, create tables, design schemas, import CSV files, and more. Give it a try: the next time you start a project, open VS Code, write a prompt, and let the database build itself. Want to learn more about MCP? See the MCP for Beginners resources from Microsoft.

Microsoft Build 2024: Essential Guide for AI Developers at Startups and Cloud-First Companies

Generative AI is advancing fast, with OpenAI's GPT-4o leading the way. GPT-4o boasts improved multilingual understanding, faster responses, lower costs, and real-time processing of text, audio, and images, unlocking new Generative AI (GenAI) use cases. Explore cutting-edge solutions like models, frameworks, vector databases, and LLM observability platforms. Born-in-the-cloud companies are at the forefront of this AI revolution. Be part of the future at Microsoft Build 2024!

Essential Microsoft Resources for MVPs & the Tech Community from the AI Tour
Unlock the power of Microsoft AI with redeliverable technical presentations, hands-on workshops, and open-source curriculum from the Microsoft AI Tour! Whether you're a Microsoft MVP, Developer, or IT Professional, these expertly crafted resources empower you to teach, train, and lead AI adoption in your community. Explore top breakout sessions covering GitHub Copilot, Azure AI, Generative AI, and security best practices, designed to simplify AI integration and accelerate digital transformation. Dive into interactive workshops that provide real-world applications of AI technologies. Take it a step further with Microsoft's Open-Source AI Curriculum, offering beginner-friendly courses on AI, Machine Learning, Data Science, Cybersecurity, and GitHub Copilot, perfect for upskilling teams and fostering innovation. Don't just learn: lead. Access these resources, host impactful training sessions, and drive AI adoption in your organization. Start sharing today! Explore now: Microsoft AI Tour Resources.

Unlocking the Power of Azure Container Apps in 1 Minute Video
Azure Container Apps provides a seamless way to build, deploy, and scale cloud-native applications without the complexity of managing infrastructure. Whether you're developing microservices, APIs, or AI-powered applications, this fully managed service enables you to focus on writing code while Azure handles scalability, networking, and deployments. In this blog post, we explore five essential aspects of Azure Container Apps, each highlighted in a one-minute video. From intelligent applications and secure networking to effortless deployments and rollbacks, these insights will help you maximize the capabilities of serverless containers on Azure.

Azure Container Apps - in 1 Minute

Azure Container Apps is a fully managed platform designed for cloud-native applications, providing effortless deployment and scaling. It eliminates infrastructure complexity, letting developers focus on writing code while Azure automatically handles scaling based on demand. Whether running APIs, event-driven applications, or microservices, Azure Container Apps ensures high performance and flexibility with minimal operational overhead. Watch the video on YouTube.

Intelligent Apps with Azure Container Apps – in 1 Minute

Azure Container Apps, Azure OpenAI, and Azure AI Search make it possible to build intelligent applications with Retrieval-Augmented Generation (RAG). Your app can call Azure OpenAI in real time to generate and interpret data, while Azure AI Search retrieves relevant information, enhancing responses with up-to-date context. For advanced scenarios, AI models can execute live code via Azure Container Apps, and GPU-powered instances support fine-tuning and inferencing at scale. This seamless integration enables AI-driven applications to deliver dynamic, context-aware functionality with ease. Watch the video on YouTube.

Networking for Azure Container Apps: VNETs, Security Simplified – in 1 Minute

Azure Container Apps provides built-in networking features, including support for Virtual Networks (VNETs) to control service-to-service communication. Secure internal traffic while exposing public endpoints with custom domain names and free certificates. Fine-tuned ingress and egress controls ensure that only the right traffic gets through, maintaining a balance between security and accessibility. Service discovery is automatic, making inter-app communication seamless within your Azure Container Apps environment. Watch the video on YouTube.

Azure Continuous Deployment and Observability with Azure Container Apps - in 1 Minute

Azure Container Apps simplifies continuous deployment with built-in integrations for GitHub Actions and Azure DevOps pipelines. Every code change triggers a revision, ensuring smooth rollouts with zero downtime. Observability is fully integrated via Azure Monitor, Log Streaming, and the Container Console, allowing you to track performance, debug live issues, and maintain real-time visibility into your app's health, all without interrupting operations. Watch the video on YouTube.

Effortless Rollbacks and Deployments with Azure Container Apps – in 1 Minute

With Azure Container Apps, every deployment creates a new revision, allowing multiple versions to run simultaneously. This enables safe, real-time testing of updates without disrupting production. Rolling back is instant: just select a previous revision and restore your app effortlessly. This powerful revision control system ensures that deployments remain flexible, reliable, and low-risk.
Watch the video on YouTube.

Watch the Full Playlist

For a complete overview of Azure Container Apps capabilities, watch the full JavaScript on Azure Container Apps YouTube playlist.

Create Your Own AI-Powered Video Content

Inspired by these short-form technical videos? You can create your own AI-generated videos using Azure AI to automate scriptwriting and voiceovers. Whether you're a content creator or a business looking to showcase technical concepts, Azure AI makes it easy to generate professional-looking explainer content. Learn how to create engaging short videos with Azure AI by following our open-source AI Video Playbook.

Conclusion

Azure Container Apps is designed to simplify modern application development by providing a fully managed, serverless container environment. Whether you need to scale microservices, integrate AI capabilities, enhance security with VNETs, or streamline CI/CD workflows, Azure Container Apps offers a comprehensive solution. By leveraging built-in features such as automatic scaling, revision-based rollbacks, and deep observability, developers can deploy and manage applications with confidence. These one-minute videos provide a quick technical overview of how Azure Container Apps empowers you to build scalable, resilient applications with ease.

FREE Content

Check out our other FREE content to learn more about Azure services and Generative AI:

- Generative AI for Beginners - A JavaScript Adventure!
- Learn more about Azure AI Agent Service
- LlamaIndex on Azure
- JavaScript on Azure Container Apps
- JavaScript at Microsoft

How To: Retrieve from CosmosDB using Azure API Management
In this How To, I will show a simple mechanism for reading items from CosmosDB using Azure API Management (APIM). There are many scenarios where you might want to do this in order to leverage the capabilities of APIM while having a highly scalable, flexible data store.

How To: Send requests to Azure Storage from Azure API Management
In this How To, I will show a simple mechanism for writing a payload to Azure Blob Storage from Azure API Management. Some examples where this is useful are implementing a Claim-Check pattern for large messages or supporting message logging when Application Insights is not suitable.

How To: APIM Asynch to Synch Pattern
Just because you can do something does not mean you should... In this How To post, I will use APIM to turn asynchronous messaging into synchronous messaging by publishing a message to Azure Service Bus and retrieving the response using Azure Blob Storage.