AzureAI
SubgenAI makes AI practical, scalable, and sustainable with Azure Database for PostgreSQL
Authors: Abe Omorogbe, Senior Program Manager at Microsoft, and Julia Schröder Langhaeuser, VP of Product Serenity Star at SubgenAI

AI agents are thriving in pilots and prototypes. However, scaling them across organizations is more difficult. A recent MIT report shows that 95 percent of projects fail to reach production. Long development cycles, lack of observability, and compliance hurdles leave enterprises struggling to deliver production-ready agents.

SubgenAI, a European generative AI company that focuses on democratizing AI for businesses and governments, saw an opportunity to change this. Its flagship platform, Serenity Star, transforms AI agent development from a code-heavy, fragmented process into a streamlined, no-code experience. Built on Microsoft Azure Database for PostgreSQL, Semantic Kernel, and Microsoft Foundry, Serenity Star empowers organizations to deploy production-grade AI agents in minutes, not months.

SubgenAI's mission is to make generative AI accessible, scalable, and secure for every organization. Whether you're a startup or a multinational, Serenity Star offers the tools to build intelligent agents tailored to your business logic, with full control over data and deployment.

"Many things must happen around it in the coming years. Serenity Star is designed to solve problems like data control, compliance, and decision ethics—so companies can unleash the full potential of generative AI without compromising trust or profitability." - Lorenzo Serratosa

Simplifying complex AI agent development

Technical and operational challenges are inherent in enterprise-wide AI agent deployments. Examples include time-consuming iteration cycles, lack of observability and cost control, security concerns, and data sovereignty requirements. Serenity Star addresses these pain points by handling the entire AI agent lifecycle while providing enterprise-grade security and compliance features. Users can focus on defining their agent's purpose and behavior rather than wrestling with technical implementation details.

Its framework focuses on four essentials for AI agents: the brain (underlying model), knowledge (accessible information), behavior (programmed responses), and tools (external system integrations). This framework directly influenced the technology stack choices for Serenity Star, with Azure Database for PostgreSQL powering knowledge retrieval and Semantic Kernel enabling flexible model orchestration.

Real-world architecture in action

When a user query comes in, Serenity Star uses the vector capabilities of Azure Database for PostgreSQL to retrieve the most relevant knowledge. That context, combined with the user's input, forms a complete prompt. Semantic Kernel then routes the request to the right large language model, ensuring the agent delivers accurate and context-aware responses. Serenity Star's native connectors to platforms such as Microsoft Teams, WhatsApp, and Google Tag Manager are also part of this architecture, delivering answers directly in the collaboration and communication tools enterprises already use every day.

Figure 1: Serenity Star Architecture

This routing and orchestration architecture applies to both the multi-tenant SaaS deployments and the dedicated customer instances offered by Serenity Star. Azure Database for PostgreSQL provides native Row-Level Security (RLS) capabilities, a key advantage for securely managing multi-tenant environments.
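To make the retrieval step described above concrete, here is a minimal, illustrative sketch of a tenant-scoped similarity search against Azure Database for PostgreSQL with pgvector. The table, column, and connection details are hypothetical placeholders, not SubgenAI's actual schema, and the sketch assumes the psycopg driver:

```python
# Illustrative only: assumes a "documents" table with a pgvector "embedding"
# column and a Row-Level Security policy that scopes rows to the current tenant.
import psycopg

def retrieve_context(conninfo: str, query_embedding: list[float], top_k: int = 5) -> list[str]:
    # pgvector accepts a bracketed string literal cast to ::vector.
    vector_literal = "[" + ",".join(str(x) for x in query_embedding) + "]"
    with psycopg.connect(conninfo) as conn:
        rows = conn.execute(
            """
            SELECT content
            FROM documents
            ORDER BY embedding <=> %s::vector   -- cosine distance operator
            LIMIT %s
            """,
            (vector_literal, top_k),
        ).fetchall()
    return [row[0] for row in rows]
```

In a setup like the one described here, the rows returned by such a query would be combined with the user's input to form the prompt that Semantic Kernel routes to the selected model, while an RLS policy on the table ensures each tenant only ever sees its own rows.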
Multi-tenant deployments allow organizations to get started quickly with lower overhead, while dedicated instances meet the needs of enterprises with strict compliance and data sovereignty requirements.

Optimizing for scale

The same architecture that powers retrieval, routing, and multi-channel delivery also provides a foundation for performance at scale. As adoption grows, the team continuously monitors query volume, response times, and resource efficiency across both multi-tenant and dedicated environments. To stay ahead of demand, SubgenAI actively experiments with new Azure Database for PostgreSQL features such as DiskANN for faster vector search. These optimizations keep latency low even as more users and connectors are added. The result is a platform that maintains sub-60-second response times for 99 percent of chart generations, regardless of deployment model or integration point.

With this systematic approach to scaling, organizations can deploy fully functional AI agents that are connected to their preferred communication platforms in just 15 minutes instead of hours. For enterprises that have struggled with failed AI projects, Serenity Star offers not only a secure and compliant solution but also one proven to grow with their needs.

Why Azure Database for PostgreSQL is a cornerstone

The knowledge component of AI agents relies heavily on retrieval-augmented generation (RAG) systems that perform similarity searches against embedded content. This requires a database capable of handling efficient vector search while maintaining enterprise-grade reliability and security. SubgenAI evaluated multiple vector database options, and Azure Database for PostgreSQL with pgvector emerged as the clear winner for several compelling reasons. One is its mature technology, which provides immediate credibility with enterprise customers. Two is the ability to scale GenAI use cases with features like DiskANN for accurate and scalable vector search. Three is the flexibility and appeal of using an open-source database with a vibrant and fast-moving community. As CPO Leandro Harillo explains: "When we tell them their data runs on Azure Database for PostgreSQL, it's a relief. It's a well-known technology versus other options that were born with this new AI revolution."

Built on an open-source relational database management system, Azure Database for PostgreSQL offers extensibility and seamless integration with Microsoft's enterprise ecosystem. It has a trusted reputation that appeals to organizations with strict data sovereignty and compliance requirements, such as those in healthcare and insurance, where reliability and governance are non-negotiable.

The integration with Azure's broader ecosystem also simplified implementation. With Serenity Star built entirely on Azure infrastructure, Azure Database for PostgreSQL provided seamless connectivity and consistent performance characteristics. The result is the fast response times necessary for real-time agent interactions, along with the reliability demanded by enterprise customers.

Semantic Kernel: Enabling model flexibility at scale

Enterprise AI success requires the ability to experiment with different models and adapt quickly as technology evolves. Semantic Kernel makes this possible, supporting over 300 LLMs and embedding models through a unified interface. With Serenity Star, organizations can make genuine choices about their AI implementations without vendor lock-in.
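As a rough illustration of that flexibility, the sketch below registers an Azure OpenAI chat deployment with Semantic Kernel's Python SDK. The service ID, deployment name, endpoint, and key are placeholders rather than Serenity Star's actual configuration, and the exact connector classes can differ between Semantic Kernel versions:

```python
# Minimal sketch: the chat model behind an agent is a registration, not code.
from semantic_kernel import Kernel
from semantic_kernel.connectors.ai.open_ai import AzureChatCompletion

kernel = Kernel()

# Register an Azure OpenAI deployment as the default chat service.
kernel.add_service(
    AzureChatCompletion(
        service_id="default-chat",
        deployment_name="gpt-4o",                             # placeholder deployment
        endpoint="https://<your-resource>.openai.azure.com",  # placeholder endpoint
        api_key="<api-key>",                                  # placeholder key
    )
)

# Swapping to a different deployment, or a different connector entirely,
# changes only this registration; prompts and agent logic stay the same.
```

Because the agent logic refers only to the registered service, pointing it at a newer model is largely a matter of changing this registration, which is the kind of configuration-level switch described next.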
Companies can use embedding models from OpenAI through Azure deployments, ensuring their information remains in their own infrastructure while accessing cutting-edge capabilities. If business requirements change or new models emerge, switching becomes a configuration change rather than a development project.

Semantic Kernel's comprehensive connector ecosystem also accelerated SubgenAI's own development process. Interfaces for different vector databases enabled rapid prototyping and comparison during the evaluation phase. "Semantic Kernel helped us to be able to try the different ones and choose the one that fit better for us," notes Julia Schröder, VP of Product.

The SubgenAI team has also extended Semantic Kernel to support more features in Azure Database for PostgreSQL, made easier by how well known and popular PostgreSQL is, and has contributed improvements back to the community. This collaborative approach ensures the platform benefits from the latest developments while helping advance the broader ecosystem.

Proven impact of Azure Database for PostgreSQL across industries

Organizations struggle to deliver production-ready agents because of long development cycles, lack of observability, and compliance hurdles. The effectiveness of Azure Database for PostgreSQL and other Azure services in overcoming those barriers is reflected in deployment metrics and customer feedback. Production-ready agents typically require around 30 iterations for basic implementations; complex use cases demand significantly more refinement. One GenAI customer in medical education required over 200 iterations to perfect an agent that evaluates medical students through complex case analysis. Azure Database for PostgreSQL and other Azure services support hour-long iteration cycles rather than week-long sprints, which made this level of refinement economically feasible.

Cost efficiency is another significant advantage. SubgenAI provisions and configures models in Microsoft Foundry, which eliminates idle GPU resources while providing detailed cost breakdowns. Users can see exactly how tokens are consumed across prompt text, RAG context, and tool usage, enabling data-driven optimization decisions.

Consulting partnerships validate the platform's market position. One consulting firm with 50,000 employees is delighted with the easier implementation, faster deployment, and reliable production performance.

Conclusion

The combination of Azure Database for PostgreSQL and Semantic Kernel has enabled SubgenAI to address the fundamental challenges that cause 95 percent of enterprise AI projects to fail. Organizations using Serenity Star bypass the traditional barriers of lengthy development cycles, limited observability, and compliance hurdles that typically derail AI initiatives. The platform's architecture delivers measurable results, including a 50 percent reduction in coding time, support for complex agents requiring 200+ iterations, and deployment capabilities that compress months-long projects into 15-minute implementations. Azure Database for PostgreSQL provides the enterprise-grade foundation that customers in regulated industries require, while Semantic Kernel ensures organizations retain flexibility as AI technology evolves. This technological partnership creates a reliable pathway for companies to deploy production-ready AI agents without sacrificing data sovereignty or operational control.
Through the reliability of Azure Database for PostgreSQL and the flexibility of Semantic Kernel, Serenity Star delivers an enterprise-ready foundation that makes AI practical, scalable, and sustainable.

Import error: Cannot import name "PromptAgentDefinition" from "azure.ai.projects.models"
Hello, I am trying to build agentic retrieval using Azure AI Search. During the creation of the agent I am getting "ImportError: cannot import name 'PromptAgentDefinition' from 'azure.ai.projects.models'". I looked into possible ways of building without it, but I need the MCP connection. This is the documentation I am following: https://learn.microsoft.com/en-us/azure/search/agentic-retrieval-how-to-create-pipeline?tabs=search-perms%2Csearch-development%2Cfoundry-setup

Note: There is no PromptAgentDefinition in the directory listing of azure.ai.projects.models:

['ApiKeyCredentials', 'AzureAISearchIndex', 'BaseCredentials', 'BlobReference', 'BlobReferenceSasCredential', 'Connection', 'ConnectionType', 'CosmosDBIndex', 'CredentialType', 'CustomCredential', 'DatasetCredential', 'DatasetType', 'DatasetVersion', 'Deployment', 'DeploymentType', 'EmbeddingConfiguration', 'EntraIDCredentials', 'EvaluatorIds', 'FieldMapping', 'FileDatasetVersion', 'FolderDatasetVersion', 'Index', 'IndexType', 'ManagedAzureAISearchIndex', 'ModelDeployment', 'ModelDeploymentSku', 'NoAuthenticationCredentials', 'PendingUploadRequest', 'PendingUploadResponse', 'PendingUploadType', 'SASCredentials', 'TYPE_CHECKING', '__all__', '__builtins__', '__cached__', '__doc__', '__file__', '__loader__', '__name__', '__package__', '__path__', '__spec__', '_enums', '_models', '_patch', '_patch_all', '_patch_evaluations', '_patch_sdk']

Traceback (most recent call last):

Please let me know what I should do and if there is any other alternative. Thanks in advance.
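One generic first check, offered here as a diagnostic sketch rather than a confirmed fix, is to verify which azure-ai-projects build is actually installed, since preview-only classes are typically exposed only in newer releases of an SDK:

```python
# Diagnostic sketch: confirm the installed azure-ai-projects version and
# whether the class is exposed at all in this build.
import importlib.metadata as md
import azure.ai.projects.models as models

print("azure-ai-projects version:", md.version("azure-ai-projects"))
print("PromptAgentDefinition available:", hasattr(models, "PromptAgentDefinition"))
```

If the class is missing, comparing the installed version against the one referenced in the linked documentation, and upgrading to the matching preview release, is the usual next step.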
Get to know the core Foundry solutions

Foundry includes specialized services for vision, language, documents, and search, plus Microsoft Foundry for orchestration and governance. Here's what each does and why it matters:

Azure Vision
With Azure Vision, you can detect common objects in images, generate captions, descriptions, and tags based on image contents, and read text in images. Example: Automate visual inspections or extract text from scanned documents.

Azure Language
Azure Language helps organizations understand and work with text at scale. It can identify key information, gauge sentiment, and create summaries from large volumes of content. It also supports building conversational experiences and question-answering tools, making it easier to deliver fast, accurate responses to customers and employees. Example: Understand customer feedback or translate text into multiple languages.

Azure Document Intelligence
With Azure Document Intelligence, you can use pre-built or custom models to extract fields from complex documents such as invoices, receipts, and forms. Example: Automate invoice processing or contract review.

Azure Search
Azure Search helps you find the right information quickly by turning your content into a searchable index. It uses AI to understand and organize data, making it easier to retrieve relevant insights. This capability is often used to connect enterprise data with generative AI, ensuring responses are accurate and grounded in trusted information. Example: Help employees retrieve policies or product details without digging through files.

Microsoft Foundry
Microsoft Foundry acts as the orchestration and governance layer for generative AI and AI agents. It provides tools for model selection, safety, observability, and lifecycle management. Example: Coordinate workflows that combine multiple AI capabilities with compliance and monitoring.

Business leaders often ask: Which Foundry tool should I use? The answer depends on your workflow. For example: Are you trying to automate document-heavy processes like invoice handling or contract review? Do you need to improve customer engagement with multilingual support or sentiment analysis? Or are you looking to orchestrate generative AI across multiple processes for marketing or operations? Connecting these needs to the right Foundry solution ensures you invest in technology that delivers measurable results.
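As a small illustration of the Azure Language capability described above, the snippet below gauges sentiment with the azure-ai-textanalytics client; the endpoint, key, and feedback strings are placeholders, and this is a sketch rather than a full solution:

```python
# Sketch: gauge sentiment of customer feedback with Azure AI Language.
from azure.core.credentials import AzureKeyCredential
from azure.ai.textanalytics import TextAnalyticsClient

client = TextAnalyticsClient(
    endpoint="https://<your-language-resource>.cognitiveservices.azure.com/",  # placeholder
    credential=AzureKeyCredential("<api-key>"),                                # placeholder
)

feedback = [
    "The new self-service portal saved me hours.",
    "Checkout kept failing on my phone all weekend.",
]

for doc in client.analyze_sentiment(documents=feedback):
    if not doc.is_error:
        # Overall sentiment plus per-class confidence scores for each document.
        print(doc.sentiment, doc.confidence_scores.positive, doc.confidence_scores.negative)
```

The other Foundry services expose similar Python clients over REST endpoints, so the same call-and-inspect pattern carries over to vision, document, and search workloads.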
Building an Agentic, AI-Powered Helpdesk with Agents Framework, Azure, and Microsoft 365

The article describes how to build an agentic, AI-powered helpdesk using Azure, Microsoft 365, and the Microsoft Agent Framework. The goal is to automate ticket handling, enrich requests with AI, and integrate seamlessly with M365 tools like Teams, Planner, and Power Automate.

Issue with: Connected Agent Tool Forcing from an Orchestrator Agent
Hi Team,

I am trying to force tool selection for my Connected Agents from an Orchestrator Agent in my multi-agent model. Not sure if that is possible. Apologies in advance for too much detail, but I really need this to work! Please let me know if there is a flaw in my approach!

The main intention behind going towards tool forcing is that, with the current set of instructions provided to my Orchestrator Agent, it was providing hallucinated responses from my Child Agents for each query. I have an Orchestrator Agent which is connected to the following Child Agents (each with detailed instructions):

Child Agent 1 - Connects to a SQL DB in Fabric to fetch information from log tables.
Child Agent 2 - Invokes an OpenAPI Action tool for Azure Functions to run pipelines in Fabric.

I have provided details on 3 approaches.

Approach 1:

I have checked the MS docs; "CONNECTED_AGENT" is a valid property for ToolChoiceType: https://learn.microsoft.com/en-us/python/api/azure-ai-agents/azure.ai.agents.models.agentsnamedtoolchoicetype?view=azure-python
Installed the latest Python AI Agents SDK beta version as it also supports Connected Agents: https://pypi.org/project/azure-ai-agents/1.2.0b6/#create-an-agent-using-another-agents
The following code is integrated into a Streamlit UI.

Python code:

agents_client = AgentsClient(
    endpoint=PROJECT_ENDPOINT,
    credential=DefaultAzureCredential(
        exclude_environment_credential=True,
        exclude_managed_identity_credential=True
    )
)

# -------------------------------------------------------------------
# UPDATE ORCHESTRATOR TOOLS (executed once)
# -------------------------------------------------------------------
fabric_tool = ConnectedAgentTool(
    id=FABRIC_AGENT_ID,
    name="Fabric_Agent",
    description="Handles Fabric pipeline questions"
)

openapi_tool = ConnectedAgentTool(
    id=OPENAPI_AGENT_ID,
    name="Fabric_Pipeline_Trigger",
    description="Handles OpenAPI pipeline triggers"
)

# Update orchestrator agent to include child agent tools
agents_client.update_agent(
    agent_id=ORCH_AGENT_ID,
    tools=[
        fabric_tool.definitions[0],
        openapi_tool.definitions[0]
    ],
    instructions="""
    You are the Master Orchestrator Agent.
    Use:
    - "Fabric_Agent" when the user's question includes: "Ingestion", "Trigger", "source", "Connection"
    - "Fabric_Pipeline_Trigger" when the question mentions: "OpenAPI", "Trigger", "API call", "Pipeline start"
    Only call tools when needed. Respond clearly and concisely.
    """
)

# ------------------------- TOOL ROUTING LOGIC -------------------------
def choose_tool(user_input: str):
    text = user_input.lower()
    if any(k in text for k in ["log", "trigger", "pipeline", "connection"]):
        return fabric_tool
    if any(k in text for k in ["openapi", "api call", "pipeline start"]):
        return openapi_tool
    # No forced routing → let orchestrator decide
    return None

forced_tool = choose_tool(user_query)

run = agents_client.runs.create_and_process(
    thread_id=st.session_state.thread.id,
    agent_id=ORCH_AGENT_ID,
    tool_choice={
        "type": "connected_agent",
        "function": forced_tool.definitions[0]
    }
)

Error:

azure.core.exceptions.HttpResponseError: (invalid_value) Invalid value: 'connected_agent'. Supported values are: 'code_interpreter', 'function', 'file_search', 'openapi', 'azure_function', 'azure_ai_search', 'bing_grounding', 'bing_custom_search', 'deep_research', 'sharepoint_grounding', 'fabric_dataagent', 'computer_use_preview', and 'image_generation'.
Code: invalid_value
Message: Invalid value: 'connected_agent'.
Supported values are: 'code_interpreter', 'function', 'file_search', 'openapi', 'azure_function', 'azure_ai_search', 'bing_grounding', 'bing_custom_search', 'deep_research', 'sharepoint_grounding', 'fabric_dataagent', 'computer_use_preview', and 'image_generation'.

Approach 2:

Create ConnectedAgentTool as above and pass its definitions to update_agent(...). Force a tool by name using tool_choice={"type": "function", "function": {"name": "<tool-name>"}}. Do not set type: "connected_agent" anywhere, since there is no such tool_choice.type.

Code:

from azure.identity import DefaultAzureCredential
from azure.ai.agents import AgentsClient
# Adjust imports to your SDK layout:
# e.g., from azure.ai.agents.tool import ConnectedAgentTool

agents_client = AgentsClient(
    endpoint=PROJECT_ENDPOINT,
    credential=DefaultAzureCredential(
        exclude_environment_credential=True,
        exclude_managed_identity_credential=True  # keep your current credential choices
    )
)

# -------------------------------------------------------------------
# CREATE CONNECTED AGENT TOOLS (child agents exposed as function tools)
# -------------------------------------------------------------------
fabric_tool = ConnectedAgentTool(
    id=FABRIC_AGENT_ID,          # the **child agent ID** you created elsewhere
    name="Fabric_Agent",         # **tool name** visible to the orchestrator
    description="Handles Fabric pipeline questions"
)

openapi_tool = ConnectedAgentTool(
    id=OPENAPI_AGENT_ID,             # another child agent ID
    name="Fabric_Pipeline_Trigger",  # tool name visible to the orchestrator
    description="Handles OpenAPI pipeline triggers"
)

# -------------------------------------------------------------------
# UPDATE ORCHESTRATOR: attach child tools
# -------------------------------------------------------------------
# NOTE: definitions is usually a list of ToolDefinition objects produced by the helper
agents_client.update_agent(
    agent_id=ORCH_AGENT_ID,
    tools=[
        fabric_tool.definitions[0],
        openapi_tool.definitions[0]
    ],
    instructions="""
    You are the Master Orchestrator Agent.
    Use:
    - "Fabric_Agent" when the user's question includes: "Ingestion", "Trigger", "source", "Connection"
    - "Fabric_Pipeline_Trigger" when the question mentions: "OpenAPI", "Trigger", "API call", "Pipeline start"
    Only call tools when needed. Respond clearly and concisely.
    """
)

# ------------------------- TOOL ROUTING LOGIC -------------------------
def choose_tool(user_input: str):
    text = user_input.lower()
    if any(k in text for k in ["log", "trigger", "pipeline", "connection"]):
        return "Fabric_Agent"              # return the **tool name**
    if any(k in text for k in ["openapi", "api call", "pipeline start"]):
        return "Fabric_Pipeline_Trigger"   # return the **tool name**
    return None

forced_tool_name = choose_tool(user_query)

# ------------------------- RUN INVOCATION -------------------------
if forced_tool_name:
    # FORCE a specific connected agent by **function name**
    run = agents_client.runs.create_and_process(
        thread_id=st.session_state.thread.id,
        agent_id=ORCH_AGENT_ID,
        tool_choice={
            "type": "function",            # <-- REQUIRED
            "function": {
                "name": forced_tool_name   # <-- must match the tool's name as registered
            }
        }
    )
else:
    # Let the orchestrator auto-select (no tool_choice → "auto")
    run = agents_client.runs.create_and_process(
        thread_id=st.session_state.thread.id,
        agent_id=ORCH_AGENT_ID
    )

Error:

azure.core.exceptions.HttpResponseError: (None) Invalid tool_choice: Fabric_Agent. You must also pass this tool in the 'tools' list on the Run.
Code: None
Message: Invalid tool_choice: Fabric_Agent.
You must also pass this tool in the 'tools' list on the Run.

Approach 3:

Modified version of the 2nd approach, passing the tool definition on the run as well:

# ------------------------- TOOL ROUTING LOGIC -------------------------
def choose_tool(user_input: str):
    text = user_input.lower()
    if any(k in text for k in ["log", "trigger", "pipeline", "connection"]):
        # return "Fabric_Agent"
        return ("Fabric_Agent", fabric_tool.definitions[0])
    if any(k in text for k in ["openapi", "api call", "pipeline start"]):
        # return "Fabric_Pipeline_Trigger"
        return ("Fabric_Pipeline_Trigger", openapi_tool.definitions[0])
    # No forced routing → let orchestrator decide
    # return None
    return (None, None)

# forced_tool = choose_tool(user_query)
forced_tool_name, forced_tool_def = choose_tool(user_query)

# ------------------------- ORCHESTRATOR CALL -------------------------
if forced_tool_name:
    tool_choice = {
        "type": "function",
        "function": {
            "name": forced_tool_name
        }
    }
    run = agents_client.runs.create_and_process(
        thread_id=st.session_state.thread.id,
        agent_id=ORCH_AGENT_ID,
        tool_choice=tool_choice,
        tools=[forced_tool_def]  # << only the specific tool
    )
else:
    # no forced tool, orchestrator decides
    run = agents_client.runs.create_and_process(
        thread_id=st.session_state.thread.id,
        agent_id=ORCH_AGENT_ID
    )

Error:

TypeError: azure.ai.agents.operations._patch.RunsOperations.create() got multiple values for keyword argument 'tools'

Build Your Ignite Schedule: Top Sessions for Developers
Get Ready for Microsoft Ignite

Welcome back to our Azure Tech Community Microsoft Ignite 2025 series! If you joined us for Your Guide to Azure Community Activations at Microsoft Ignite 2025, you already know that Microsoft Ignite isn't just about product updates, it's where ideas, innovation, and community come together. With hundreds of sessions across every area of Azure, it can be hard to know where to focus. That's why we've curated a list of sessions tailored specifically for Developers to help you make the most of your time, strengthen your technical strategy, and get inspired for the year ahead. Use the Microsoft Ignite session scheduler feature in the session catalog to personalize your agenda, save your favorites, and organize your time at Microsoft Ignite.

Make Every Minute Count: Top Recommended Sessions for Developers

Microsoft Ignite moves fast, so it pays to plan your path before you arrive. By organizing your schedule around your interests and goals, you'll be able to maximize learning opportunities, connect with peers, and leave with actionable insights. The recommendations below highlight our top picks for Developers who want to build, modernize, and innovate using Azure and AI.

Reimagining software development with GitHub Copilot and AI agents (BRK105): Discover how AI is transforming the development workflow, from writing code to managing pull requests, and boosting productivity.

Build AI Apps fast with GitHub and Azure AI Foundry in action (BRK110): Explore how to build, train, and deploy intelligent apps using GitHub and Azure AI Foundry in record time.

Building and deploying data agents in Microsoft Fabric (THR738): Learn how to build, deploy, and manage intelligent data agents in Microsoft Fabric using curated data, Git-powered CI/CD, and best practices for enterprise-scale AI integration.

CI/CD for Fabric: Accelerating Lakehouse to production in 25 minutes (THR739): Bring software-engineering rigor to your data with Microsoft Fabric. Learn how to use Git-integrated deployment pipelines, parameterized deployments, and automated checks to confidently operationalize your Lakehouse from validation to production.

Build A2A and MCP Systems using SWE Agents and agent-framework (LAB513): Explore how to build, orchestrate, and deploy multi-agent systems with the new agent-framework, SWE Agents, and MCP, hands-on and production-minded.

Before diving into your targeted sessions, make time for these essential moments that set the stage for everything happening at Microsoft Ignite.

Opening Keynote (KEY01): Hear from Microsoft leaders as they unveil the latest innovations shaping the future of AI, cloud, and the developer ecosystem. This is the session that sets the tone for the entire event.

Innovation Session: Your AI Apps and Agent Factory (BRK1706): Join Microsoft engineering leaders as they explore the foundational AI capabilities powering the Microsoft Cloud. Learn how Azure AI services, Copilot experiences, and responsible AI frameworks come together to help developers and organizations innovate faster.

Innovation Session: Agents at work: Shaping the future of business (BRK1708): Learn how AI-powered agents and digital workers are reshaping collaboration and productivity. This session showcases real-world use cases across Microsoft 365, Dynamics, and Azure ecosystems.

Plan Smarter with the Session Scheduler

With the "add to schedule" feature in the session catalog you'll be able to:
Browse and filter all Microsoft Ignite sessions by topic, product, or persona.
Save your favorite sessions to build a personalized schedule.
Set reminders and block time for networking, community booths, and demos.

Stay Connected with the Azure Tech Community

Your Microsoft Ignite journey doesn't stop when the sessions end; the conversation continues across the Azure Tech Community. Share the sessions you're most excited about using #MSIgnite #AzureTechCommunity, tag Azure, and connect with other developers exploring the future of cloud management and security. Follow the Azure Tech Community for real-time updates, announcements, and product news throughout Ignite.

See You at Microsoft Ignite 2025

With the right plan in place, every session becomes an opportunity to learn, grow, and connect. Explore these recommendations, save your favorites, and get ready for an unforgettable Microsoft Ignite 2025 experience.

Your Guide to Azure Community Activations at Microsoft Ignite 2025
Microsoft Ignite 2025 is right around the corner! From November 18–21 in San Francisco, we're excited to bring the Azure community together for four days of learning, connection, and fun! Whether you're joining us onsite at the Moscone Center or tuning in online, the Community Space will be buzzing with MVP meetups, interactive theater sessions, and plenty of opportunities to network. This is the first in a series of posts highlighting what you can expect at Microsoft Ignite. Today, we're spotlighting Azure Community activations across Infrastructure, AI, Data, and MVP programs. In upcoming posts, we'll dive deeper into sessions tailored for IT professionals, developers, and even first-time attendees.

Azure Infrastructure

Microsoft Ignite is packed with practical insights to help you migrate, modernize, and secure workloads. Learn how to Troubleshoot AKS networking with Agentic AI and strengthen your AI workload resiliency with Azure's networking stack. Dive into migration best practices with sessions on moving data for analytics, lessons from Azure MVPs, and community insights from real-world projects. And don't miss the fan favorite: Learn Infrastructure-as-Code through Minecraft, where cloud automation meets creativity.

AI & Agents

If you're passionate about AI, the community sessions will put you at the center of what's next. Connect with peers at the Global AI Community meetup. Get a sneak peek at what's coming with Azure AI Insiders. Hear directly from MVPs and Microsoft leaders on shaping the future of Azure AI Foundry and how AI is transforming customer innovation.

Azure Data

For those focused on data, Microsoft Ignite is your chance to learn, influence, and connect. Share your feedback on SQL Server Management Studio and Copilot in SSMS. Bring your toughest questions to a Q&A with Azure Data Leadership. And join the community to explore real-world data intelligence solutions and career-building opportunities across the data ecosystem.

MVP Program

Interested in becoming a Microsoft MVP or expanding your community impact? Learn how to get nominated and grow your influence in "So you want to become an MVP?" Join program leads and MVPs to hear their stories in "Becoming an MVP in Azure, AI, or the Data Platform." These sessions are the perfect place to start if you're looking to give back and level up your community journey.

Stay Connected Year-Round

The conversation doesn't stop after Microsoft Ignite. Join the communities that keep the learning going:
AKS Community (Infrastructure)
Azure AI Foundry (AI)
Global AI Community (AI)
Fabric Community (Data)
Azure Data Community (Data)
Fellow Developers

And this is just the beginning. Microsoft Ignite is packed with opportunities to learn from experts, connect with peers, and explore what's next with Azure. Stay tuned for our upcoming posts, where we'll share curated session highlights designed for different audiences to help you make the most of your Microsoft Ignite experience. 👉 Be sure to mark your calendar, start building your schedule, and get ready to be inspired at Microsoft Ignite 2025.

Intelligent Conversations: Building Memory-Driven Bots with Azure AI and Semantic Kernel
Discover how memory-driven AI reshapes the way we interact, learn, and collaborate. 💬✨ On October 20th at 8pm CET, we'll explore how Semantic Kernel, Azure AI Search, and Azure OpenAI models enable bots that remember context, adapt to users, and deliver truly intelligent conversations. 🤖💭 Join Marko Atanasov and Bojan Ivanovski as they dive into the architecture behind context-aware assistants and the future of personalized learning powered by Azure AI. 🌐💡 ✅ Save your seat now: https://lnkd.in/dnZSj6Pb

Intelligent Conversations: Building Memory-Driven Bots with Azure AI and Semantic Kernel
Discover how memory-driven AI reshapes the way we interact, learn, and collaborate. 💬✨ On October 20th at 8pm CET, we'll explore how Semantic Kernel, Azure AI Search, and Azure OpenAI models enable bots that remember context, adapt to users, and deliver truly intelligent conversations. 🤖💭 Join Marko Atanasov and Bojan Ivanovski as they dive into the architecture behind context-aware assistants and the future of personalized learning powered by Azure AI. 🌐💡 ✅ Save your seat on the following link: https://streamyard.com/watch/DN4thzYripaz

Unlocking Document Insights with Azure AI 🚀
Every organisation is drowning in documents (contracts, invoices, reports), yet the real challenge lies in extracting meaningful insights from this unstructured information. Imagine turning those files into structured, actionable data with just a few clicks. That's exactly what we'll explore in this Microsoft Zero To Hero session:

✨ How Azure Document Intelligence can automate document processing
✨ Ways to enhance data extraction with AI
✨ Seamless integration into Azure's data platform for end-to-end insights

Join us to see how AI-powered automation can save time, reduce manual effort, and unlock the value hidden in your documents. 📌 Don't miss this opportunity to learn and apply Azure AI in action!

🗓️ Date: 7 October 2025
⏰ Time: 19:00 (AEDT)
🎙️ Speaker: Akanksha Malik
📌 Topic: Unlocking Document Insights with Azure AI