In this post, we'll walk through the architecture for an "agentic" helpdesk solution built entirely on cloud-native services. We'll combine a blazing-fast FastAPI backend, resilient Azure Storage and Service Bus, the intelligence of Azure OpenAI, and the deep integration of Microsoft 365 (Teams, Planner, and Power Automate) to create a truly automated workflow.
The High-Level Architecture
The core idea is to decouple the system. Instead of one large application doing everything, we split the process into distinct, scalable components.
- Ingestion: A lightweight API endpoint that simply captures the request.
- Decoupling: A message queue to hold the request for background processing.
- Processing: An asynchronous worker that handles all the heavy lifting: AI enrichment, notifications, and decision-making.
- Action: A set of automated actions that connect directly to our M365 tools.
Here’s the entire flow visualized as a flowchart:
Step-by-Step Workflow Breakdown
Let's dive into the details of each step.
Ingestion: The FastAPI Endpoint
The user's journey begins at a simple web form (built with FastAPI and Jinja2). The form captures the essential details: Title, Description, Category, Priority, and the user's email.
When the user clicks "Submit," the form posts to our POST /submit endpoint. This endpoint does two things immediately:
- Full Storage: It saves the complete entity (using Category as the PartitionKey and a GUID as the RowKey) into Azure Table Storage for a permanent record.
- Compact Message: It sends a compact JSON message (containing just the key info like tableRow, category, priority, etc.) to an Azure Service Bus queue named 'm365'.
This "split" is crucial. The API responds instantly to the user ("Submitted!") without waiting for any complex processing. The entire "heavy" part of the job is now in the queue.
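The split might look something like the sketch below. It assumes the azure-data-tables and azure-servicebus SDKs; the helper names (`build_entity`, `build_queue_message`) are illustrative, not the project's actual ones.

```python
# Sketch of the ingestion split: a full entity for Table Storage
# plus a compact message for the 'm365' Service Bus queue.
import json
import uuid

def build_entity(title, description, category, priority, email):
    """Full record for Azure Table Storage: Category is the
    PartitionKey, a fresh GUID is the RowKey."""
    return {
        "PartitionKey": category,
        "RowKey": str(uuid.uuid4()),
        "Title": title,
        "Description": description,
        "Priority": priority,
        "Email": email,
    }

def build_queue_message(entity):
    """Compact JSON for the queue: just enough for the worker
    to re-fetch the full entity later."""
    return json.dumps({
        "tableRow": entity["RowKey"],
        "category": entity["PartitionKey"],
        "priority": entity["Priority"],
    })

# Inside the POST /submit handler the two calls would be roughly:
#   table_client.create_entity(entity=build_entity(...))  # permanent record
#   sender.send_messages(ServiceBusMessage(build_queue_message(entity)))
#   return {"status": "Submitted!"}                       # instant response
```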
The Asynchronous Worker & AI Enrichment
A separate Python process is constantly listening to the 'm365' Service Bus queue. When our new message arrives, the worker wakes up and:
- Parses the compact message.
- Uses the partition and row keys to fetch the full entity from Azure Table Storage.
- Calls our enrich_helpdesk_entity function, which is a wrapper for Azure OpenAI.
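The listen-and-fetch steps can be sketched as below. The azure-servicebus import is deferred into `run_worker` so the parsing helper stands alone; the structure is an assumption, not the project's exact code.

```python
# Hypothetical worker skeleton for the 'm365' queue.
import json

def handle_message(body: str, table_client):
    """Parse the compact queue message and re-fetch the full
    entity that /submit wrote to Azure Table Storage."""
    msg = json.loads(body)
    return table_client.get_entity(
        partition_key=msg["category"], row_key=msg["tableRow"])

def run_worker(conn_str, table_client):
    from azure.servicebus import ServiceBusClient  # assumed installed
    with ServiceBusClient.from_connection_string(conn_str) as client:
        with client.get_queue_receiver(queue_name="m365") as receiver:
            for msg in receiver:  # blocks, waking on each new message
                entity = handle_message(str(msg), table_client)
                # ...enrich, notify, decide, act (next sections)...
                receiver.complete_message(msg)
```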
This AI step is where the magic begins. We send a prompt with the user's raw data and ask the AI to return a clean JSON object with an improved title, a concise summary, and a calculated urgency. If the AI fails, it gracefully falls back to using the original user input.
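The enrich-with-fallback shape might look like this, where `chat_fn` stands in for the Azure OpenAI chat-completion call the post wraps; the prompt and key names are illustrative.

```python
# Sketch of enrich_helpdesk_entity: ask the model for a clean JSON
# object, fall back to the raw user input on any failure.
import json

def enrich_helpdesk_entity(entity, chat_fn):
    prompt = (
        "Return JSON with keys title, summary, urgency for this "
        f"helpdesk request:\n{entity['Title']}\n{entity['Description']}"
    )
    try:
        data = json.loads(chat_fn(prompt))
        return {"Title": data["title"],
                "Summary": data["summary"],
                "Urgency": data["urgency"]}
    except Exception:
        # Graceful fallback: keep the original user input.
        return {"Title": entity["Title"],
                "Summary": entity["Description"],
                "Urgency": entity["Priority"]}
```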
Human-in-the-Loop: Teams Notification
Now that we have a clean, enriched summary, we need to let the support team know. The worker calls send_to_teams, which formats the enriched data into a nice MessageCard and posts it to a designated Teams channel via a webhook.
The support team now sees a clean, AI-summarized notification, giving them instant visibility.
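A minimal version of that webhook post, using only the standard library, could look like this; the specific card fields shown are illustrative.

```python
# Build a legacy Office 365 connector MessageCard and POST it
# to a Teams incoming-webhook URL.
import json
import urllib.request

def build_message_card(enriched):
    return {
        "@type": "MessageCard",
        "@context": "https://schema.org/extensions",
        "themeColor": "0078D7",
        "summary": enriched["Title"],
        "sections": [{
            "activityTitle": f"New request: {enriched['Title']}",
            "facts": [
                {"name": "Category", "value": enriched["Category"]},
                {"name": "Urgency", "value": enriched["Urgency"]},
            ],
            "text": enriched["Summary"],
        }],
    }

def send_to_teams(webhook_url, enriched):
    req = urllib.request.Request(
        webhook_url,
        data=json.dumps(build_message_card(enriched)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)
```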
The 'Agent' Decides: AI-Driven Action
This is the "agentic" part of the workflow. Just notifying a channel is good, but true automation means taking the next step.
The worker calls decide_action, which uses the Microsoft Agent Framework (powered by an AzureOpenAIChatClient). We prompt the agent with the key data (category, priority, and the user's original ActionHint).
The agent's job is to intelligently decide the best action. It returns a simple JSON response like { "action": "create-task" }. This is far more powerful than a simple if/else block, because the agent can be prompted to handle nuanced requests. The system defaults to the user's hint if the agent fails.
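A sketch of that decide-with-fallback logic is below; `agent_fn` stands in for the Agent Framework call, and the validation against an allow-list is my assumption about how a production version would guard the agent's output.

```python
# Sketch of decide_action: ask the agent for a JSON action,
# validate it, and fall back to the user's ActionHint.
import json

ALLOWED = {"notify-team", "create-task", "create-ticket", "store-only"}

def decide_action(category, priority, action_hint, agent_fn):
    prompt = (
        f"Category: {category}\nPriority: {priority}\n"
        f"User hint: {action_hint}\n"
        'Reply with JSON like {"action": "create-task"}.'
    )
    try:
        action = json.loads(agent_fn(prompt))["action"]
        if action in ALLOWED:
            return action
    except Exception:
        pass
    # Agent failed or answered nonsense: trust the user's hint.
    return action_hint if action_hint in ALLOWED else "store-only"
```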
Execution: Closing the Loop in M365
Based on the agent's decision, the worker executes one of four actions:
- notify-team: Uses Azure Communication Services (ACS) to send a formatted email to a distribution list.
- create-task: Uses MSAL to get a Microsoft Graph token and directly creates a new task in a specific Planner plan/bucket.
- create-ticket: Makes an HTTP POST to a Power Automate flow, which can then connect to any system (like ServiceNow, JIRA, etc.) to create a formal ticket.
- store-only: Does nothing further. The request is stored and visible in Teams, but no other action is taken.
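Of the four, create-task is the most involved. A sketch of the MSAL token acquisition plus the Graph call, assuming the msal package and an app registration with Planner permissions (whether application or delegated permissions apply depends on your tenant setup):

```python
# Acquire a Graph token via MSAL client credentials, then create
# a Planner task with POST /v1.0/planner/tasks.
import json
import urllib.request

GRAPH = "https://graph.microsoft.com/v1.0"

def get_graph_token(tenant_id, client_id, client_secret):
    import msal  # assumed installed
    app = msal.ConfidentialClientApplication(
        client_id,
        authority=f"https://login.microsoftonline.com/{tenant_id}",
        client_credential=client_secret,
    )
    result = app.acquire_token_for_client(
        scopes=["https://graph.microsoft.com/.default"])
    return result["access_token"]

def build_planner_task(plan_id, bucket_id, title):
    """Request body for the plannerTask create endpoint."""
    return {"planId": plan_id, "bucketId": bucket_id, "title": title}

def create_task(token, plan_id, bucket_id, title):
    req = urllib.request.Request(
        f"{GRAPH}/planner/tasks",
        data=json.dumps(build_planner_task(plan_id, bucket_id, title)).encode("utf-8"),
        headers={"Authorization": f"Bearer {token}",
                 "Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())
```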
Visualizing the Interactions (Sequence Diagram)
Conclusion
This architecture provides a powerful, scalable, and intelligent solution for a common business problem. By leveraging a decoupled, event-driven design with serverless components, the system is both cost-effective and resilient.
The real power, however, comes from the two-stage AI: first, for enrichment (making data human-readable) and second, for decision-making (making the system autonomous). This "agentic" pattern, deeply integrated with the Microsoft 365 ecosystem, is a clear look at the future of business process automation.
Bonus Round: An Analytics Agent for Process Insights
We can easily extend this project by adding a Chat Interface Agent. Imagine a simple chat UI (in Teams, or its own web page) where a support manager can ask, in plain English:
- "How many total tickets did we receive today?"
- "Show me all 'High' priority requests for the 'IT' category."
- "Which team had the most 'create-task' actions assigned?"
Technically, this is another "agent" (powered by Azure OpenAI) that translates the user's natural language question into a valid OData query for our HelpdeskRequests table. It then fetches the data, summarizes it, and presents the answer in the chat. This creates a powerful, conversational "Copilot" for our new helpdesk process, giving us instant, natural language access to our operational data.
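The deterministic half of that agent, turning extracted intent into an OData filter, might look like the sketch below; the idea is that the model only has to emit structured arguments, while the query string itself is built mechanically.

```python
# Build an Azure Table Storage OData filter from the slots the
# analytics agent extracts from a natural-language question.
def to_odata_filter(category=None, priority=None):
    clauses = []
    if category:
        clauses.append(f"PartitionKey eq '{category}'")
    if priority:
        clauses.append(f"Priority eq '{priority}'")
    return " and ".join(clauses)

# e.g. "Show me all 'High' priority requests for the 'IT' category"
# would become:
#   table_client.query_entities(to_odata_filter("IT", "High"))
```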
Git Repo
References
For more in-depth information on the services and frameworks used in this post, check out the official Microsoft Learn documentation: