MVPBuzz

KCDC: Where Global Voices Drive Local Innovation
At the heart of the Kansas City Developer Conference (KCDC) is a vibrant, inclusive community—one that empowers developers from around the world to connect, collaborate, and innovate. This year, Microsoft MVPs brought their expertise and passion to the event, uplifting others and sparking curiosity across every session. We caught up with a few of them to hear their reflections on the experience and what makes KCDC such a standout event in the developer world.

MVP Dennie Declercq: Hospitality That Fuels Innovation

How far would you go for a great developer conference? For MVP Dennie Declercq, the answer is over 4,500 miles (7,000+ km)—crossing continents and time zones from Belgium all the way to Kansas City, Missouri! Dennie's journey to KCDC wasn't just about the miles; it was about the magnetic pull of a truly special event. He shared, "KCDC has an incredibly welcoming vibe for speakers. They celebrate their Kansas City roots with legendary BBQ and unique speaker shirts, but what really sets them apart is how they treat their speakers. KCDC challenges the status quo—offering professional photo shoots that speakers can use for personal branding and their professional lives. It's a level of hospitality and appreciation you don't find everywhere."

Photo: Louella Creemers at KCDC

MVP Louella Creemers: Belief That Crosses Borders

Sometimes, the most important journeys aren't measured in miles—they're measured in encouragement and support. For MVP Louella Creemers, her path from the Netherlands to the Kansas City Developer Conference began long before she ever booked a flight.

Lou's KCDC story started in 2021, when she began sharing her aspirations online to speak and run workshops. The KCDC organizers saw her potential and reached out with a simple message: "You should totally come to KCDC and give a talk." That early support made Louella feel part of the community, even when she was still studying and unable to travel. "I flew 7,000+ km to KCDC because it's the conference that believed in me before I'd ever set foot on a stage," Louella recalls. In 2025, with a job that supports her passion for sharing knowledge, she finally fulfilled that promise—bringing her unique perspective to sessions on topics like inclusive design and cross-platform development.

Stepping into the Kansas City developer community was an eye-opening experience for Louella. Coming from the Netherlands, where the tech scene is close-knit and meetups are limited by geography, she was struck by the sheer scale and variety on display at KCDC. Lou shared, "At KCDC I noticed a totally different scale. In the sponsor hall I passed the community booths and saw a full page with names of developer meetups in Kansas City. The number and variety left my mind blown. Talking to these developer group organizers made me realize how much density matters: when you have that many groups in one region, you get more cross-pollination, more chances to learn, and more choices for new people."

Photo: MVP Samuel Gomez at KCDC

MVP Samuel Gomez: The Power of New Connections

For MVP Samuel Gomez, KCDC stands out for its commitment to welcoming newcomers and fostering meaningful connections. During the opening ceremony, organizers encouraged attendees to reach out and talk to someone they didn't know—a simple piece of advice that Samuel finds truly transformative.
"I love that advice because you never know how meeting someone at an event can change your life—I know mine has!" Samuel's experience is a testament to the power of community-driven events like KCDC, where a single conversation can spark new opportunities, collaborations, and lifelong friendships. By creating an environment where everyone feels encouraged to connect, KCDC continues to empower developers at every stage of their journey.

MVP Ben Dechrai: Curiosity and Collaboration in Action

For MVP Ben Dechrai, speaking at KCDC isn't just another stop on the conference circuit—it's a return to a community where curiosity is celebrated and meaningful connections thrive. "I keep coming back to KCDC because of the audience - they're what make this conference truly special. The rooms are packed with people who come with incredible questions and a real hunger for learning. It's not just polite Q&A; these attendees are genuinely there to absorb as much as they can, and they'll approach you afterward to continue the conversation, ask follow-up questions, or just thank you for sharing your knowledge."

Ben continued, "What really sets KCDC apart from other conferences I've spoken at is this combination of deep engagement and Midwestern hospitality. Many of the 1,800 attendees are locals, and that welcoming spirit is palpable throughout the event. While audiences around the world are always welcoming, there's something special about KCDC's community that reminds me of my own hunger for knowledge when I first started attending conferences."

That same spirit extends to speakers, too. From the first outreach to on-site logistics, the organizers go above and beyond to make presenters feel appreciated and supported—one of many reasons KCDC stands out in the global developer community.

The Heart of KCDC: Community, Curiosity, and Collaboration

From insightful talks to behind-the-scenes planning, MVPs helped shape KCDC into a vibrant and impactful experience for attendees. Their dedication to the developer community shines through in every session, hallway conversation, and late-night planning meeting. Whether you're a seasoned speaker or a first-time attendee, their stories offer a glimpse into the heart of what makes KCDC—and the MVP community—so special.

How has a community or conference empowered you in your technical journey? Share your story and join the conversation—because innovation starts with people who believe in each other. Learn more about KCDC and how you can get involved. #MVPBuzz

Seoul AI Hub & Microsoft MVPs Empower Citizens with AI Skills — No Coding Required
Seoul AI Hub, an AI-specialized support organization under the Seoul Metropolitan Government, is dedicated to fostering the city's AI industry ecosystem through talent development, startup incubation, and public education. In partnership with Microsoft MVPs, the hub is making AI accessible to all through the AI Frontiers Series — blending expert talks with hands-on workshops. "AI is no longer just for experts; it's a tool for everyone," says Chan-jin Park, Director of Seoul AI Hub.

The collaboration between Seoul AI Hub and Microsoft MVPs demonstrates the transformative power of community-led expertise. MVPs such as Jaeseok Lee, Heo seok, Haesun Park, and Minseok Song brought their technical leadership to the forefront — integrating advanced AI concepts with practical skills that citizens could immediately use. From explaining multi-agent architectures to building custom Copilot solutions, their sessions showed how complex AI tools can be democratized for non-developers.

Beyond teaching, these MVPs are active contributors to the global AI ecosystem. Minseok Song maintains the Co-op Translator open-source project, integrating AI-based translation workflows into real-world scenarios. Jaeseok Lee leads Korea's Power Platform User Group, connecting business users and developers to collaborate on Copilot Studio innovations. This kind of community-driven leadership extends the impact of Microsoft technologies far beyond corporate settings.

These events also reflect how the MVP community is growing more diverse in expertise and audience reach. Participants came from varied backgrounds — students, entrepreneurs, office workers, and hobbyists — all united by a desire to understand and use AI meaningfully. For many attendees, this was their first encounter with building AI agents, and the supportive environment encouraged experimentation and collaboration. MVPs not only shared technical knowledge but also their own journeys: how they discovered Microsoft AI, grew into community leaders, and applied their skills to solve local and global challenges. Such stories inspire the next generation of community builders and potential MVPs.

AI Frontiers Series Summer Sessions

Recent events at the Seoul AI Hub where MVPs participated included:

July 22 featured a deep-dive seminar on "Open AI Technologies for Survival in the AI Frontier Era," covering multi-agent strategies, LLM and multimodal trends, and real-world open-source AI applications.

Aug 12 brought the AI Agent Bootcamp for Non-Developers, where 80 registered citizens learned to create Copilot agents without code. Participants explored integrating AI agents into Microsoft Teams and M365, building document-driven assistants, and deploying multi-channel solutions.

"Copilot Studio allows anyone to build their own ChatGPT-like agent. The key is not just creating one agent, but learning how to design multiple agents that work together to solve real problems," said Jaeseok Lee, Microsoft Copilot Studio MVP.

These back-to-back sessions show what's possible when technical expertise, open-source spirit, and a commitment to public education come together. The impact extends beyond the events themselves — sparking curiosity, building confidence, and equipping citizens to harness AI in ways that are relevant to their lives and work. The AI Frontiers Series proves that when experts and communities connect, technology becomes more inclusive and impactful.
By lowering the barrier to AI adoption, Seoul AI Hub and Microsoft MVPs are equipping citizens with skills for the future. To explore upcoming sessions or get involved, visit the Seoul AI Hub website and join the movement to make AI a tool for everyone.

August Calendar is here!
🌟 Community Spirit? CHECKED!
🌍 Amazing Members & Audiences? DOUBLE CHECK!
🎤 Phenomenal Speakers Locked In? CHECKED!
🚀 Global Live Sessions? YOU BET!

The stage is set. The excitement is real. It's that time again, time to ignite the community with another monthly calendar! 🔥✨

We've lined up a powerhouse of sessions packed with world-class content, covering the best of Microsoft, from Coding, Cloud, Migration, Data, Security, AI, and so much more! 💻☁️🔐🤖

But wait, that's not all! For the first time ever, we've smashed through time zones! No matter where you are in the world, you can tune in LIVE and learn from extraordinary speakers sharing their insights, experiences, and passion. 🌏⏰

What do you need to do? It's easy:
👉 Register for the sessions
👉 Mark your calendar
👉 Grab your coffee, tea, or ice-cold soda
👉 Join us and soak up the knowledge!

We believe in what makes this community truly special, and that's YOU. Let's set August on fire together! 🔥 Are you ready to be inspired, to grow, and to connect with the Microsoft Learn family? Don't miss out, August is YOUR month! 💥🙌

📢 Shehan Perera
📖 https://streamyard.com/watch/dh62MQJHEv9B?wt.mc_id=MVP_350258
📅 5 Aug 2025 (19:00 AEST) (11:00 CEST)

📢 Shahab Ganji
📖 https://streamyard.com/watch/qCXk9kkb34W8?wt.mc_id=MVP_350258
📅 8 Aug 2025 (18:00 CEST)

📢 Ronak Vachhani
📖 https://streamyard.com/watch/hNjJAZeUcxTF?wt.mc_id=MVP_350258
📅 16 Aug 2025 (16:00 AEST) (08:00 CEST)

📢 Laïla Bougriâ
📖 https://streamyard.com/watch/KWwF7Wd5mYAG?wt.mc_id=MVP_350258
📅 22 Aug 2025 (18:00 CEST)

📢 AJ Bajada
📖 https://streamyard.com/watch/vaNSN3hVuXbr?wt.mc_id=MVP_350258
📅 28 Aug 2025 (19:30 AEST) (11:30 CEST)

📢 James Eastham
📖 https://streamyard.com/watch/FNGJZNbAKjFi?wt.mc_id=MVP_350258
📅 29 Aug 2025 (17:00 CEST)

MVP Collective Launches In-Depth Guide on SharePoint Content AI
MVP and Regional Director Gokan Ozcifci, together with eight fellow Microsoft MVPs, has co-authored SharePoint Content AI, Solutions and Advanced Administration - a new book that delves into the intersection of artificial intelligence and SharePoint. We spoke with the authors to learn more about their collaboration, what inspired the project, and the key themes they explore.

What inspired you to collaborate on this eBook, and how did the idea for focusing on AI and SharePoint come about?

Gokan Ozcifci, Belgium: The inspiration behind this book stems from observing how rapidly AI is transforming the way organizations manage content, particularly within Microsoft 365 and SharePoint. As this transformation accelerated, I noticed a growing gap: many industry experts, digital transformation directors, and business leaders were eager to embrace these AI-driven tools but found it difficult to understand how capabilities like Autofill, AI-powered metadata, OCR, or governance solutions translate into real-world value.

That's when a group of SharePoint enthusiasts, MVPs, consultants, and practice owners joined forces with a shared goal—to bridge that gap. We set out to create more than just a technical guide. We aimed to build a resource grounded in practical experience, offering clear explanations and actionable insights for those navigating the evolving world of Content AI. SharePoint naturally became our focal point due to its central role in enterprise content management and its rapid evolution in tandem with Microsoft's AI strategy. Our goal was to demystify the technology, understand the requirements, translate the needs, and show how AI can empower organizations to manage content more intelligently, securely, and efficiently.

Can you share a real-world example where AI significantly enhanced SharePoint content management or administration?

Frane Borozan – MVP, Croatia: A global enterprise had a use case to classify thousands of legacy contracts in SharePoint. SharePoint Content AI extracts metadata such as expiration dates, the names of signatories, and the validity period of the contract, as well as similar information available within the contract. This cut by almost 90% the manual effort that would otherwise have required hiring a workforce to review all these legacy contracts.

Noorez Khamis – MVP, Canada: One client previously had 5–10 interns manually logging into 20 banking websites each month to download client statements, upload them to SharePoint, tag metadata, and update Salesforce—an inefficient and error-prone process. Now, an automated solution using Power Automate Desktop Flows handles document retrieval, Syntex extracts key metadata, and a validation Power App ensures data accuracy before integrating with Salesforce for approvals and updates. This end-to-end system eliminates manual effort, increases accuracy, and streamlines document processing across platforms.

Drew Madelung – MVP, USA: As an M365 consultant, I work with multiple customers and write unique and complex statements of work. I actively utilize SharePoint Content AI features, such as autofill columns, to help summarize and provide key metadata from statements of work, enabling me to discover details from prior and existing projects where overlap is likely to occur.

Mike Maadarani – MVP, Canada: AI was deployed at a university to manage their application admissions process. The Content AI significantly improved the classification and extraction of information from various application formats.
This process reduced manual work by over 98%, resulting in substantial savings and a high return on investment for the client.

Antonio Maio – MVP, Canada: I had a client who greatly benefited from Content AI models in SharePoint Online. They're a corporate real estate firm that utilizes Content AI models to process lease agreements and rental contracts, automatically extracting key metadata values from these content types. This metadata automatically populates columns in SharePoint libraries, which then drives business process automation and retention policies for those documents. This all happens by users simply uploading new documents into SharePoint libraries.

How do you see the role of AI evolving in the SharePoint ecosystem over the next few years?

Frane Borozan – MVP, Croatia: SharePoint, as the content management platform, doesn't have a future without the help of AI. Use cases are varied, ranging from extracting metadata from content to helping create new content. I believe that with the help of AI, the possibilities of SharePoint are unlimited.

Noorez Khamis – MVP, Canada: Copilot is making SharePoint the go-to content management system by transforming how you discover, create, and interact with content. You can now ask for what you need, generate pages from your existing documents, and get personalized answers through AI-powered SharePoint Agents. With more intelligent automation, beautiful intranet design, and fewer clicks, SharePoint feels more like an intelligent assistant than a static site.

Vlad Catrinescu – MVP, Canada: AI will continue transforming how we work, and SharePoint is no exception. Today, we're already seeing AI help fill in metadata for document libraries. But imagine if AI could go further: automatically suggest and create the right columns, build content types based on the documents you upload, or even configure web parts through natural language prompts. SharePoint has always been a powerful platform, but it hasn't always been the easiest to use. AI has the potential to make that power more accessible to every user, not just the experts.

Drew Madelung – MVP, USA: I would like to see SharePoint Content AI evolve to identify opportunities for AI in existing content and configure them automatically, without user intervention, to improve discoverability. The ability to configure and work with these AI features should have a minimal learning curve and be integrated seamlessly, without requiring specialized technical skills.

Mike Maadarani – MVP, Canada: As artificial intelligence continues to advance, it is anticipated that content management will become fully automated, reducing the need for administrators to establish and enforce rules. AI algorithms will be significantly more sophisticated, enhancing their ability to comprehend an organization's policies, the nature of the content being added, and the necessary actions required by the rules.

Antonio Maio – MVP, Canada: I think auto-fill columns will have a significant impact on SharePoint Online. Metadata is a core element of good information management, but we know that users don't want to fill in metadata. They're busy and often move too quickly from task to task, leaving little time to provide a wealth of metadata elements. SharePoint's auto-fill columns offer an easy way for us to automatically extract metadata, based on a prompt that's supplied to the column.
Joanne Klein – MVP, Canada: As a security and compliance professional, I observe the evolution of AI data governance, which aims to control the proliferation, access, and lifecycle of AI solutions across SharePoint within the enterprise. We need to elevate AI to be a core pillar within an organization's holistic data governance strategy.

What advice would you give to SharePoint professionals who are just beginning to explore AI-powered capabilities?

Frane Borozan – MVP, Croatia: If you're starting with AI in SharePoint, taxonomy tagging is a perfect first step. It shows how AI can reduce manual effort and bring structure to your content. Set up a managed metadata column linked to your term store and let AI handle tagging based on document content. It's a simple way to improve search, consistency, and governance—without heavy customization. Start here, learn the basics, and expand as you go.

Noorez Khamis – MVP, Canada: Start with the basics and learn how to prompt effectively while using the full Microsoft 365 Copilot capabilities across all the workplace tools you use every day, such as Teams, SharePoint, Word, Excel, PowerPoint, and Outlook. Take the time to revise your prompts and use templates that have been proven to work, saving you time, streamlining tasks, and boosting productivity. In SharePoint specifically, explore how to create pages, rewrite content, and use AI-powered SharePoint agents.

Vlad Catrinescu – MVP, Canada: Get hands-on as early as possible. Theory is great, but real understanding comes from testing in your environment. Set up a lab, start small, and explore practical use cases where AI can help automate or enhance existing processes. With the Pay-As-You-Go model, there's no upfront cost—you only pay for what you use. That said, I highly recommend setting a budget cap in Azure to avoid surprises.

Drew Madelung – MVP, USA: SharePoint remains essentially unchanged in many ways, and it is still essential to understand the concepts of content types, columns, and permission hierarchies to implement advanced AI solutions against your organization's content effectively.

Mike Maadarani – MVP, Canada: IT professionals must stay current with evolving technologies, particularly in the rapidly changing field of AI. With the emergence of AI agents, I recommend acquiring skills in creating, managing, and deploying agents within Microsoft 365 to enhance the integration and utilization of AI in their organizations.

Antonio Maio – MVP, Canada: Be curious about AI - play with the AI technology that's built into SharePoint; experiment with it to see what best benefits your organization to improve how you specifically manage information. Your experience will be different than everyone else's, so try different things to figure out what works for you and your users.

Joanne Klein – MVP, Canada: If your SharePoint setup is a mess, your AI will be too. Solid, well-defined structure and smart governance (site owner stewardship, explicit permissions, retention/deletion to clean up ROT, and data protection controls) are like laying concrete before building—skip it, and your AI's standing on quicksand.

Access the full 194-page e-book at the following link: SharePoint Content AI, Solutions and Advanced Administration

YellowHat 2025: A Global Stage for Deep Microsoft Security Insights
YellowHat 2025, held on March 6th, was a landmark event focused on Microsoft Security, drawing together a global audience of professionals and enthusiasts. Hosted at Microsoft's Amsterdam headquarters, the event featured over 150 in-person attendees and 1,500+ online participants, all eager to delve into advanced security topics. MVP Myron Helgering and the organizing team shared their insights into YellowHat's ideas, motivations, and future prospects for the event.

What inspired you to organize YellowHat 2025?

We felt there was a need for something new: an event organized by and for the community, focused solely on Microsoft Security content. One thing was also clear: we wanted it to be a deeply technical event, so level 400+. Our goal was to be visible worldwide, so we chose a hybrid event and focused on delivering a high-quality online and in-person experience. As it was our first edition, we aimed to create an exciting and easily recognizable event.

How did you ensure that the content was relevant and immediately applicable to current security challenges?

The most important thing was getting the right speakers on board for our event; they had to be top-notch. We selected our speakers based on their expertise, experience, and their ability to deliver engaging and relevant content. Luckily, we could attract visionary leaders and security experts like Raviv Tamir, Roberto Rodriguez, Dirk-Jan Mollema, Mattias Borg, Stefan Schörling, Thomas Naunheim, Ran Marom, and Eyal Haik.

In addition to selecting the right speakers, we aimed to tell a cohesive story throughout the day. By interconnecting our deep-dive sessions and zooming out when necessary, we could highlight different security challenges and make the content applicable to a broad audience.

How did you manage to attract such a large global audience, both in-person and online?

Most of the YellowHat organizers (not all of them) are also organizers for the Dutch Microsoft Security Meetup, which has 2,000+ members. We used the power of our community to our advantage, attracting our local in-person attendees and promoting our event globally. To reach the large global audience, we had the help of our international speakers and Microsoft Security MVPs who could promote the event, as well as Microsoft's very own Raviv Tamir and Dan Michelson (YellowHat's founder). Lastly, our very own Ninja Cat with a yellow hard hat mascot was all over the socials for weeks to do our marketing for us.

How did the hybrid format (in-person and online) impact the overall experience for attendees?

When organizing a hybrid event, organizing suddenly becomes a lot more complex because you have to provide an excellent experience to both online and in-person attendees simultaneously. We engaged our online attendees during breaks by providing them with live interviews and sponsor commercials, while our in-person attendees had time for food, drinks, and networking opportunities. Ultimately, I hope we made people feel like they were part of that YellowHat experience we were going for by providing them with the same deep technical content, but not prioritizing one experience over the other. We received overwhelmingly positive feedback from our in-person and online attendees, which reassures us that we are on the right track and motivates us to continue improving the YellowHat experience.

What are your plans for future iterations of YellowHat, and how do you envision the event evolving?
Even though YellowHat 2025 was already a global event, the in-person attendees mostly visited from the Netherlands. We would love to grow and evolve YellowHat into something that can attract an international audience, which will be a focus of our plans. We haven't officially decided on anything yet, but YellowHat 2026 will definitely happen, and it will be bigger, bolder, and more exciting.

How can interested community members get involved in organizing or participating in future YellowHat conferences?

If you have any questions or suggestions or would like to get involved, please feel free to contact us using our contact form. If you want to be the first to receive sneak peeks, early announcements, and exclusive insider information, then please go ahead and subscribe to our mailing list so you won't miss anything about YellowHat!

Why YellowHat?

Yellow (hard) hats are used by construction workers for "protection and security," which is a reference to our work as Microsoft Security Defenders / Protectors. The content at the conference was aligned with that; we're focused on the defensive / preventive side of (Microsoft) security. One of our unofficial sayings at the conference was: wear your yellow hat to prevent cyber threats.

Smart Auditing: Leveraging Azure AI Agents to Transform Financial Oversight
In today's data-driven business environment, audit teams often spend weeks poring over logs and databases to verify spending and billing information. This time-consuming process is ripe for automation. But is there a way to implement AI solutions without getting lost in complex technical frameworks? While tools like LangChain, Semantic Kernel, and AutoGen offer powerful AI agent capabilities, sometimes you need a straightforward solution that just works.

So, what's the answer for teams seeking simplicity without sacrificing effectiveness? This tutorial will show you how to use Azure AI Agent Service to build an AI agent that can directly access your Postgres database to streamline audit workflows. No complex chains or graphs required, just a practical solution to get your audit process automated quickly.

The Auditing Challenge:

It's month end, and your audit team is drowning in spreadsheets. As auditors reviewing financial data across multiple SaaS tenants, you're tasked with verifying billing accuracy by tracking usage metrics like API calls, storage consumption, and user sessions in Postgres databases. Each tenant generates thousands of transactions daily, and traditionally, this verification process consumes weeks of your team's valuable time.

Typically, teams spend weeks:

- Manually extracting data from multiple database tables.
- Cross-referencing usage with invoices.
- Investigating anomalies through tedious log analysis.
- Compiling findings into comprehensive reports.

With an AI-powered audit agent, you can automate these tasks and transform the process. Your AI assistant can:

- Pull relevant usage data directly from your database
- Identify billing anomalies like unexpected usage spikes
- Generate natural language explanations of findings
- Create audit reports that highlight key concerns

For example, when reviewing a tenant's invoice, your audit agent can query the database for relevant usage patterns, summarize anomalies, and offer explanations: "Tenant_456 experienced a 145% increase in API usage on March 30th, which explains the billing increase. This spike falls outside normal usage patterns and warrants further investigation."

Let's build an AI agent that connects to your Postgres database and transforms your audit process from manual effort to automated intelligence.

Prerequisites:

Before we start building our audit agent, you'll need:

- An Azure subscription (Create one for free).
- The Azure AI Developer RBAC role assigned to your account.
- Python 3.11.x installed on your development machine.

OR you can also use GitHub Codespaces, which will automatically install all dependencies for you. You'll need to create a GitHub account first if you don't already have one.

Setting Up Your Database:

For this tutorial, we'll use Neon Serverless Postgres as our database. It's a fully managed, cloud-native Postgres solution that's free to start, scales automatically, and works excellently for AI agents that need to query data on demand.

Creating a Neon Database on Azure:

1. Open the Neon Resource page on the Azure portal.
2. Fill out the form with the required fields and deploy your database.
3. After creation, navigate to the Neon Serverless Postgres Organization service.
4. Click on the Portal URL to access the Neon Console.
5. Click "New Project".
6. Choose an Azure region.
7. Name your project (e.g., "Audit Agent Database").
8. Click "Create Project".

Once your project is successfully created, copy the Neon connection string from the Connection Details widget on the Neon Dashboard. It will look like this:

postgresql://[user]:[password]@[neon_hostname]/[dbname]?sslmode=require

Note: Keep this connection string saved; we'll need it shortly.
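Before moving on, it can be worth a quick sanity check that the database is reachable. The sketch below is optional and not part of the original tutorial; it assumes you export the connection string as NEON_DB_CONNECTION_STRING (the same variable name the project's .env file uses later) and relies only on sqlalchemy and psycopg2-binary, which the tutorial installs anyway:

# check_connection.py - optional sanity check (hypothetical helper, not in the tutorial repo)
import os

from sqlalchemy import create_engine, text

# Assumes the Neon connection string is exported in your shell, e.g.:
#   export NEON_DB_CONNECTION_STRING="postgresql://...?sslmode=require"
engine = create_engine(os.environ["NEON_DB_CONNECTION_STRING"])

with engine.connect() as conn:
    # SELECT 1 round-trips to the server without touching any tables
    print("Connection OK:", conn.execute(text("SELECT 1")).scalar())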
Creating an AI Foundry Project on Azure:

Next, we'll set up the AI infrastructure to power our audit agent:

1. Create a new hub and project in the Azure AI Foundry portal by following the guide.
2. Deploy a model like GPT-4o to use with your agent.
3. Make note of your Project connection string and Model Deployment name.

You can find your connection string in the overview section of your project in the Azure AI Foundry portal, under Project details > Project connection string.

Once you have all three values on hand: Neon connection string, Project connection string, and Model Deployment Name, you are ready to set up the Python project to create an Agent. All the code and sample data are available in this GitHub repository. You can clone or download the project.

Project Environment Setup:

Create a .env file with your credentials:

PROJECT_CONNECTION_STRING="<Your AI Foundry connection string>"
AZURE_OPENAI_DEPLOYMENT_NAME="gpt4o"
NEON_DB_CONNECTION_STRING="<Your Neon connection string>"

Create and activate a virtual environment:

python -m venv .venv
source .venv/bin/activate   # on macOS/Linux
.venv\Scripts\activate      # on Windows

Install required Python libraries:

pip install -r requirements.txt

Example requirements.txt:

pandas
python-dotenv
sqlalchemy
psycopg2-binary
azure-ai-projects==1.0.0b7
azure-identity

Load Sample Billing Usage Data:

We will use a mock dataset of tenant usage (API calls and storage usage in GB); the percent change in API calls is computed later by the agent tool:

tenant_id    date        api_calls  storage_gb
tenant_456   2025-04-01  1000       25.0
tenant_456   2025-03-31  950        24.8
tenant_456   2025-03-30  2200       26.0

Run the load_usage_data.py Python script to create and populate the usage_data table in your Neon Serverless Postgres instance:

# load_usage_data.py file
import os

from dotenv import load_dotenv
from sqlalchemy import (
    create_engine,
    MetaData,
    Table,
    Column,
    String,
    Date,
    Integer,
    Numeric,
)

# Load environment variables from .env
load_dotenv()

# Load connection string from environment variable
NEON_DB_URL = os.getenv("NEON_DB_CONNECTION_STRING")
engine = create_engine(NEON_DB_URL)

# Define metadata and table schema
metadata = MetaData()
usage_data = Table(
    "usage_data",
    metadata,
    Column("tenant_id", String, primary_key=True),
    Column("date", Date, primary_key=True),
    Column("api_calls", Integer),
    Column("storage_gb", Numeric),
)

# Create table and insert mock data
with engine.begin() as conn:
    metadata.create_all(conn)
    conn.execute(
        usage_data.insert(),
        [
            {"tenant_id": "tenant_456", "date": "2025-03-27", "api_calls": 870, "storage_gb": 23.9},
            {"tenant_id": "tenant_456", "date": "2025-03-28", "api_calls": 880, "storage_gb": 24.0},
            {"tenant_id": "tenant_456", "date": "2025-03-29", "api_calls": 900, "storage_gb": 24.5},
            {"tenant_id": "tenant_456", "date": "2025-03-30", "api_calls": 2200, "storage_gb": 26.0},
            {"tenant_id": "tenant_456", "date": "2025-03-31", "api_calls": 950, "storage_gb": 24.8},
            {"tenant_id": "tenant_456", "date": "2025-04-01", "api_calls": 1000, "storage_gb": 25.0},
        ],
    )

print("✅ usage_data table created and mock data inserted.")

Create a Postgres Tool for the Agent:

Next, we configure an AI agent tool to retrieve data from Postgres. The Python script billing_agent_tools.py contains the function billing_anomaly_summary(), which:

- Pulls usage data from Neon.
- Computes % change in api_calls.
- Flags anomalies with a threshold of > 1.5x change.

It also exports the user_functions list for the Azure AI Agent to use. You do not need to run it separately.

# billing_agent_tools.py file
import os
import json

import pandas as pd
from sqlalchemy import create_engine
from dotenv import load_dotenv

# Load environment variables
load_dotenv()

# Set up the database engine
NEON_DB_URL = os.getenv("NEON_DB_CONNECTION_STRING")
db_engine = create_engine(NEON_DB_URL)


# Define the billing anomaly detection function
def billing_anomaly_summary(
    tenant_id: str,
    start_date: str = "2025-03-27",
    end_date: str = "2025-04-01",
    limit: int = 10,
) -> str:
    """
    Fetches recent usage data for a SaaS tenant and detects potential billing anomalies.

    :param tenant_id: The tenant ID to analyze.
    :type tenant_id: str
    :param start_date: Start date for the usage window.
    :type start_date: str
    :param end_date: End date for the usage window.
    :type end_date: str
    :param limit: Maximum number of records to return.
    :type limit: int
    :return: A JSON string with usage records and anomaly flags.
    :rtype: str
    """
    query = """
        SELECT date, api_calls, storage_gb
        FROM usage_data
        WHERE tenant_id = %s AND date BETWEEN %s AND %s
        ORDER BY date DESC
        LIMIT %s;
    """
    df = pd.read_sql(query, db_engine, params=(tenant_id, start_date, end_date, limit))

    if df.empty:
        return json.dumps(
            {"message": "No usage data found for this tenant in the specified range."}
        )

    df.sort_values("date", inplace=True)
    df["pct_change_api"] = df["api_calls"].pct_change()
    df["anomaly"] = df["pct_change_api"].abs() > 1.5

    return df.to_json(orient="records")


# Register this in a list to be used by FunctionTool
user_functions = [billing_anomaly_summary]

Create and Configure the AI Agent:

Now we'll set up the AI agent and integrate it with our Neon Postgres tool using the Azure AI Agent Service SDK. The Python script does the following:

- Creates the agent: instantiates an AI agent using the selected model (gpt-4o, for example), adds tool access, and sets instructions that tell the agent how to behave (e.g., "You are a helpful SaaS assistant…").
- Creates a conversation thread: a thread is started to hold a conversation between the user and the agent.
- Posts a user message: sends a question like "Why did my billing spike for tenant_456 this week?" to the agent.
- Processes the request: the agent reads the message, determines that it should use the custom tool to retrieve usage data, and processes the query.
- Displays the response: prints the response from the agent with a natural language explanation based on the tool's output.
# billing_anomaly_agent.py
import os
from datetime import datetime
from pprint import pprint

from azure.ai.projects import AIProjectClient
from azure.ai.projects.models import FunctionTool, ToolSet
from azure.identity import DefaultAzureCredential
from dotenv import load_dotenv

from billing_agent_tools import user_functions  # Custom tool function module

# Load environment variables from .env file
load_dotenv()

# Create an Azure AI Project Client
project_client = AIProjectClient.from_connection_string(
    credential=DefaultAzureCredential(),
    conn_str=os.environ["PROJECT_CONNECTION_STRING"],
)

# Initialize toolset with our user-defined functions
functions = FunctionTool(user_functions)
toolset = ToolSet()
toolset.add(functions)

# Create the agent
agent = project_client.agents.create_agent(
    model=os.environ["AZURE_OPENAI_DEPLOYMENT_NAME"],
    name=f"billing-anomaly-agent-{datetime.now().strftime('%Y%m%d%H%M')}",
    description="Billing Anomaly Detection Agent",
    instructions=f"""
    You are a helpful SaaS financial assistant that retrieves and explains billing anomalies using usage data.
    The current date is {datetime.now().strftime("%Y-%m-%d")}.
    """,
    toolset=toolset,
)
print(f"Created agent, ID: {agent.id}")

# Create a communication thread
thread = project_client.agents.create_thread()
print(f"Created thread, ID: {thread.id}")

# Post a message to the agent thread
message = project_client.agents.create_message(
    thread_id=thread.id,
    role="user",
    content="Why did my billing spike for tenant_456 this week?",
)
print(f"Created message, ID: {message.id}")

# Run the agent and process the query
run = project_client.agents.create_and_process_run(
    thread_id=thread.id, agent_id=agent.id
)
print(f"Run finished with status: {run.status}")

if run.status == "failed":
    print(f"Run failed: {run.last_error}")

# Fetch and display the messages
messages = project_client.agents.list_messages(thread_id=thread.id)
print("Messages:")
pprint(messages["data"][0]["content"][0]["text"]["value"])

# Optional cleanup:
# project_client.agents.delete_agent(agent.id)
# print("Deleted agent")

Run the agent:

To run the agent, run the following command:

python billing_anomaly_agent.py

[Screenshot: snippet of output from the agent]

Using the Azure AI Foundry Agent Playground:

After running your agent using the Azure AI Agent SDK, it is saved within your Azure AI Foundry project. You can now experiment with it using the Agent Playground. To try it out:

1. Go to the Agents section in your Azure AI Foundry workspace.
2. Find your billing anomaly agent in the list and click to open it.
3. Use the playground interface to test different financial or billing-related questions, such as:
   - "Did tenant_456 exceed their API usage quota this month?"
   - "Explain recent storage usage changes for tenant_456."

This is a great way to validate your agent's behavior without writing more code.

Summary:

You've now created a working AI agent that talks to your Postgres database, all using:

- A simple Python function
- Azure AI Agent Service
- A Neon Serverless Postgres backend

This approach is beginner-friendly, lightweight, and practical for real-world use.

Want to go further? You can:

- Add more tools to the agent (a minimal sketch follows below)
- Integrate with vector search (e.g., detect anomaly reasons from logs using embeddings)
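To illustrate the first idea, adding another tool only requires defining a new function and appending it to user_functions; FunctionTool infers the schema from the type hints and docstring, just as it does for billing_anomaly_summary. The storage_trend_summary helper below is a hypothetical sketch, not part of the tutorial's repository, and follows the same pattern as the existing tool:

# extra_tools.py - hypothetical second tool, mirroring billing_agent_tools.py
import json

import pandas as pd

from billing_agent_tools import db_engine  # reuse the existing database engine


def storage_trend_summary(tenant_id: str, limit: int = 10) -> str:
    """
    Fetches recent storage usage for a SaaS tenant so the agent can explain trends.

    :param tenant_id: The tenant ID to analyze.
    :type tenant_id: str
    :param limit: Maximum number of records to return.
    :type limit: int
    :return: A JSON string of recent storage readings.
    :rtype: str
    """
    query = """
        SELECT date, storage_gb
        FROM usage_data
        WHERE tenant_id = %s
        ORDER BY date DESC
        LIMIT %s;
    """
    df = pd.read_sql(query, db_engine, params=(tenant_id, limit))

    if df.empty:
        return json.dumps({"message": "No storage data found for this tenant."})

    return df.to_json(orient="records")

# The agent picks the new tool up by extending the list passed to FunctionTool:
# user_functions = [billing_anomaly_summary, storage_trend_summary]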
Resources:

- Introduction to Azure AI Agent Service
- Develop an AI agent with Azure AI Agent Service
- Getting Started with Azure AI Agent Service
- Neon on Azure
- Build AI Agents with Azure AI Agent Service and Neon
- Multi-Agent AI Solution with Neon, Langchain, AutoGen and Azure OpenAI
- Azure AI Foundry GitHub Discussions

That's it, folks! But the best part? You can become part of a thriving community of learners and builders by joining the Microsoft Learn Student Ambassadors Community. Connect with like-minded individuals, explore hands-on projects, and stay updated with the latest in cloud and AI. 💬 Join the community on Discord here and explore more benefits on the Microsoft Learn Student Hub.

Breaking Barriers: Addressing Unconscious Bias in the Tech Industry
Unconscious bias affects everyone, including tech professionals who believe they are objective and data-driven. In this blog post, we delve into the experiences of South African Business Applications MVP Carike Botha, who has worked diligently to recognize and mitigate her own biases. Carike will share practical strategies for the tech community to identify unconscious bias, fostering better team dynamics and enhancing creativity.

While everyone experiences unconscious bias, how can you become more aware of your own unconscious biases and work to counteract them?

In my experience, becoming aware of unconscious bias starts with actively seeking out programs and resources that can help raise awareness. At my workplace, there is a dedicated program for addressing unconscious bias. Without this program, I might never have fully recognized my own biases.

The program includes a Women's Forum, which began about six years ago. Women from this group participated in a course created by the Chapter Network, and upon completion, we facilitated an Unconscious Bias session. The response was overwhelmingly positive, and from there, the training was rolled out across the whole business.

The Unconscious Bias training is led by women in the forum and is a 90-minute session. The training creates a space where employees can openly voice biases they've experienced or those they've recognized in themselves or others. During these sessions, we emphasize that there is no right or wrong, but the key takeaway is that awareness is crucial to making a difference.

Remaining open-minded and encouraging others to approach you with any concerns about bias is essential. This practice helps cultivate a learning environment where everyone can grow together. By inviting feedback, we ensure biases are addressed, fostering mutual respect, accountability, and a culture where everyone's voice is heard and valued.

In your experience, how does unconscious bias manifest in everyday interactions among tech professionals?

Unconscious bias often appears in everyday interactions within the tech industry, especially through stereotypes about what a woman in tech "should" look like. These biases show up in assumptions about our wardrobe, whether we're gamers, or the constant need to prove and validate our skills, despite our progress.

Unconscious biases towards women in tech manifest in various ways, such as assumptions that women are less technical or always need validation. For example, some may think women lack expertise in certain roles because they are perceived as soft, social, or friendly, or assume they are in the industry just to "fill a quota." These biases can result in women being overlooked for leadership positions, denied career advancement, or having their ideas dismissed in meetings. Over time, these challenges can erode confidence and hinder career growth. They also limit innovation and reduce the diversity of thought crucial to the success of teams and organizations in the tech sector.

Many tech professionals believe they are objective and data driven. How can unconscious bias still affect their work despite this belief?

Many tech professionals believe they are objective and data-driven, making unconscious bias harder to detect. However, even in data-driven fields, personal perspectives influence decisions, whether consciously or not. Therefore, it's crucial to continuously challenge assumptions and remain open to feedback.
Unconscious biases can subtly affect decisions in data-driven environments. For instance, when interpreting data, individuals may unintentionally prioritize information that aligns with their pre-existing beliefs rather than examining it objectively. In hiring, unconscious bias might lead a manager to favor candidates who resemble themselves or fit a certain mold, even if data suggests other candidates are better suited for the role. Algorithmic biases are another example—tech professionals may not realize that the models they design or the datasets they use reflect their own biases, resulting in skewed outcomes.

Moreover, relying on data to justify decisions can be problematic. Tech professionals might overlook the broader context or social implications, thereby ignoring how biases in data collection or model assumptions could perpetuate inequality or exclusion.

How can tech leaders and managers actively mitigate unconscious bias within their teams?

Tech leaders and managers can mitigate unconscious bias by fostering an inclusive environment, providing ongoing training, and encouraging open conversations. Creating an inclusive environment starts with intentional actions that demonstrate a commitment to fairness, respect, and diversity. Practical ways to foster inclusion include:

- Promoting Diverse Representation: Actively recruit from diverse talent pools and ensure diverse voices are heard in meetings. Mentorship programs pairing underrepresented groups with senior leaders can help break down barriers and build trust.
- Implementing Bias-Reducing Strategies in Hiring: Use blind hiring processes where personal information such as gender, race, or age is removed from resumes or applications to focus on skills and qualifications.
- Encouraging Open Dialogue: Facilitate regular discussions around unconscious bias and its impact, allowing team members to safely share experiences and learn from one another. This can be done through lunch-and-learns or town hall meetings that address diversity and inclusion.
- Training and Resources: Offer continuous training on recognizing and addressing bias, and create accessible resources (e.g., reading materials, workshops, or bias assessment tools) for employees to explore at their own pace.
- Celebrating Diversity: Actively recognize the contributions of diverse team members and celebrate various cultural holidays and events that reflect the team's diversity. This raises awareness and fosters an environment where people feel seen and valued for who they are.

By implementing these strategies, leaders can create an environment where merit and contributions are recognized, and biases are less likely to influence decisions.

In conclusion, by actively practicing these skills, we can begin to create a safe space for open dialogue and awareness. Encouraging feedback and promoting diverse representation are crucial steps tech leaders can take to mitigate unconscious bias and foster a culture of respect and inclusion. Acknowledging and confronting our biases allows us to enhance team collaboration, boost innovation, and foster a deeper sense of community within the tech industry.