ai
621 TopicsBringing AI to Meetings with the Sample Builder
We’re excited to share a significant update to the Azure Communication Services Sample Builder. This release integrates Azure’s latest AI and video calling capabilities, implementing meeting transcription and AI-generated call summaries to help organizations deliver insightful and effective meeting experiences. In just a few minutes, without the need for any coding, you can use the Sample Builder to start prototyping video calling with Azure AI integration. Click the link below to begin or continue reading for further information. 👉 Try the Sample Builder Note, this pattern of combining Azure Communication Services and Azure AI for meeting transcription and summarization is not limited to Sample Builder. You can take the code and overall design pattern and rebuild this experience using the underlying APIs and SDKs. What Is the Sample Builder? The Sample Builder is a no-code Azure Portal experience that you use to brand, customize, build, and deploy a GitHub based sample for prototyping. The sample integrates and deploys multiple Azure services for secure and engaging meetings: Application hosting of the meeting front-end is provided by Azure App Services High-definition video calling for mobile and desktop browsers is provided by Azure Communication Services Calling Role-based access for attendees and providers are implemented using Azure Communication Services Rooms Accessible, customizable, fluid user experience is built on the open-source Azure Communication Services UI library Designed for developers, IT teams, and solution architects, the Sample Builder gets you started quickly but doesn’t produce a production-ready application. After prototyping, you can take the code from GitHub, customize user experience, integrate your own systems, and fine-tune the AI interactions for production. Smarter Meetings with Transcription and Summarization Today’s update integrates Azure AI Speech and Azure AI Language services directly into your meetings, transforming how companies capture, understand, and act on conversations. You can fine-tune this integration and can take advantage of the latest innovation from Azure AI, ensuring your end-users benefit from advancements in LLM models, natural language understanding, and conversation summarization. Transcription and meeting summarization are valuable across industries. For example: Healthcare: Automatically document patient-provider interactions, reduce administrative burden, and support clinical accuracy. Financial Services: Capture detailed records of client meetings to meet regulatory requirements and improve transparency. Education: Provide students and instructors with accessible records of virtual sessions, supporting learning and retention. Real-Time Transcription With transcription enabled, Azure Communication Services uses Azure AI Speech to Text to convert spoken language into a live, speaker-attributed transcript. This allows participants to stay fully engaged in the conversation without worrying about taking notes. Key benefits include: Accurate, multilingual transcription for a wide variety of languages. You can see the full list of supported languages here Speaker attribution for clarity and accountability Searchable meeting records for easy reference and knowledge sharing Support for multilingual teams, with transcripts that can be translated or reviewed post-meeting Training and quality assurance, enabling review of real conversations to improve service delivery Transcripts can be stored securely and managed according to your organization’s compliance, privacy, and retention policies, making them suitable for regulated industries. AI-Generated Call Summaries After the meeting, the Azure AI Language Summarization API automatically analyzes the transcript and generates a concise, structured summary. This summary distills the conversation into key takeaways, including: Main discussion points Decisions made Action items and next steps This helps teams: Align quickly on outcomes and responsibilities Brief stakeholders who couldn’t attend Maintain consistent documentation for compliance, audits, or internal reporting Reduce meeting fatigue by eliminating the need to rewatch or reread entire transcripts How to Get Started You can try these new features today by following a few simple steps: Go through the Sample Builder using the official tutorial. Select the Rooms option in the booking and calling steps. Enable auto-transcription or allow users to turn on transcription in the booking and calling steps. Enable meeting summary in the post-call steps. Choose how you want to deploy and go through follow up steps. Once fully deployed, start a call to test. If not using auto-transcription, open the meeting controls and select “Start Transcription”. Choose the spoken language (Click here for the list of supported languages). After the meeting ends, participants can view the AI-generated summary and download the transcript. In the sample experience, transcripts and summaries are available temporarily. In production environments, you can store them securely and use them to support training, compliance, or analytics. Microsoft 365 Integration Today’s update focuses on integrating Azure AI with Azure hosted meetings. However Azure Communication Services are interoperable with Microsoft Teams, and you can use the Sample Builder to deploy a branded Azure application that joins Teams meetings. Interoperability can be incredibly helpful for organizations that are already using Microsoft Teams but want to build a custom meeting experience for business-2-consumer (B2C) interactions. Using Microsoft Teams as the meeting host allows you to leverage: Teams Premium AI features for generating meeting notes and recommending follow-ups. Teams Premium virtual appointment features for scheduling of B2C meetings and sending reminders across SMS, email, and other channels. Teams Phone capabilities so end-users can dial into the meeting using traditional telephony. Get Started Today Explore the new AI-powered features in the Sample Builder and start building smarter virtual appointment experiences: 👉 Try the Sample Builder With transcription and meeting summaries, your meetings can do more than connect people—they can capture insights, drive action, and deliver lasting value.Join us at Microsoft's campus for the Ultimate Partner LIVE event!
The countdown is on - Ultimate Partner LIVE in Redmond, WA on May 1st – 2nd is fast approaching, and you won’t want to miss it! This is the event for connecting with Microsoft executives, partners, and industry experts driving ecosystem growth and shaping the future. Join us as we take over the iconic Microsoft Conference Center for two action-packed days. It’s your opportunity to engage directly with Microsoft leaders, learn from expert panel discussions, immerse yourself in hands-on workshops, and experience a targeted partnering experience. Microsoft’s ongoing support of Ultimate Partner, along with its sponsorship of the Ultimate Partner LIVE event, highlights the importance of ecosystem-led growth. A special thank you to all the Microsoft leaders below who plan to take the stage and those who coordinate behind the scenes to make this event a success. Over 30 industry-leading speakers and award-winning partners will grace the stage and share insights that will shape the future of cloud go-to-market strategies. The two-day agenda will cover topics such as: Prepare Your Microsoft Business for FY26 The Marketplace Ecosystem Opportunity The State of the Marketplace Ecosystem Microsoft’s AI & Software Vision Forging the New World of Data & AI Defining the Marketplace of the Future for SMBs The Power of Partnerships: Building AI Together Perspective of a Microsoft Marketplace POTY Award Winner Embracing Change and Pivoting for Success Acre of Diamonds: How to Leverage the Opportunity with Microsoft Unlocking the Opportunity Through Ecosystem Thinking The Partner Perspective for Ecosystem Thinking Future of Distribution Co-Selling Journey Celebrating Microsoft: 50 Years of Tectonic Shifts And more! Why This Event is Critical: We are standing at a pivotal moment: AI is advancing faster than most organizations can keep up with. Accelerate Microsoft FY25 Q4 priorities and understand FY26 opportunities. Microsoft Marketplace is poised for explosive growth. Go-to-market success now demands tighter alignment and precision from partners than ever before. Why You Can’t Afford to Miss This Event: Exclusive Access to Microsoft Decision-Makers The executives setting the agenda for FY26 will be in the room. You’ll gain clarity on FY26. Nicole Dezen, Chief Partner Officer & CVP Global Partner Solutions will kick us off, and you will hear from leaders across the Software, Services, Reseller, Marketplace, and Sales Organizations. Know exactly how to align your business, resources, and messaging to what Microsoft actually cares about this fiscal year. Interactive Workshops to Sharpen Your Skills How to co-sell smarter with Microsoft How to design GTM plays that convert How to win with commercial marketplace motions With Industry Experts like Reis Barrie, CEO of Carve Partners, John Jahnke, CEO of Tackle.io, Sam Gong, SVP Marketing at WorkSpan, Rebecca Jones, Chief Growth Officer of Bridge Partners, Erin Figer, Founder of Core Consulting, and leaders from The Partner Masters, Suger and more hosting immersive workshops, be prepared to learn and implement. Proven Frameworks for Delivering Results Vince Menzione will share his 7 Principles of Successful Partnerships—developed from working with top-performing partners across the ecosystem. Other experts will share frameworks for marketplace, co-selling, GTM, and more Networking Opportunities to Accelerate Your Business This will be a curated executive room where you'll connect with partner leaders, advisors, and Microsoft stakeholders in high-trust conversations that spark real opportunities. You’ll leave with a tighter, more strategic network—and future deals in motion. An Intentionally Designed Experience with Real-World Impact Every detail of this event—from the location in Redmond to the experience design—is built to support meaningful conversations, clarity, and action. What People Say: “What an incredible experience at the Ultimate Partner Live Executive Summit. Two days packed with relationship building, business growth, and learning – it felt like months of progress compressed into 2 days.” Steven Karachinsky, CEO at Ziro “Really loved the vibe and amazing conversations with the partners at the Ultimate event! I think you absolutely have the right formula to create impact for the entire ecosystem with such a gathering.” Sandy Gupta, VP, Ecosystems of Global Software Companies at Microsoft “The event was informative, insightful, and inspiring. Your ability to put into words the tectonic shift we are all experiencing is refreshing! Thanks for being the trailblazer by providing thoughtful content and curated partner experiences. We have all been craving this for so long.” Regina Manfredi, EVP, General Manager at Crayon Group US “Attending this event was like striking gold 🙂 As a startup founder focused on co-sell and partnership, the validation and insights I gained at this event were invaluable; the future of partnerships and alliances is clearly bright. Vince Menzione, you are truly a powerhouse, and I wish you continued success! Most importantly, thank you for streaming the entire event, was truly incredible.” Archana Vadya, Founder & CEO at PartneRite “A big shout out to Vince Menzione for bringing this scale event (Ultimate Partner LIVE in Dallas) to life! It was a mega effort, and the results were amazing. Just look at the abundance of LinkedIn posts! It was an absolute pleasure sharing the stage with so many incredible speakers and colleagues from Microsoft and several of our partners like WorkSpan, Carve Partners, BDO, EY, Archive360, Sage, PartnerTap.” Kevin Peesker, (former) President, SMC - Small, Medium, Corporate Business at Microsoft This will prove to be the most valuable two days for your business in the first half of 2025. Ultimate Partner LIVE is a premium, focused, two-day immersive experience that will equip you with the tools and insights to lead through change and drive measurable results in FY26. Register now and use code ULTIMATEVIP50 at checkout for an exclusive discount.387Views0likes0CommentsAI is changing the game—and so are the threats.
As generative AI and large language models (LLMs) become central to modern applications, they introduce new, unique security challenges that traditional software wasn’t built to handle. From prompt injection to model poisoning and jailbreaks, the attack surface is evolving fast. In this edition of Microsoft’s Software Development Company Security Series, they dive into the top AI security risks, how they map to the OWASP Top 10 for LLMs, and the practical mitigations dev teams can apply today. Whether you're building with OpenAI, Azure AI, or custom models, this is a must-read for anyone shipping secure, responsible AI. 👉 Read the full breakdown: Navigating AI security: Identifying risks and implementing mitigations13Views0likes0CommentsM365 Developers Update | May 2025
Spotlight Learn how you can build advanced agents for Microsoft 365 Copilot by leveraging the Microsoft 365 Agents SDK. Join the session Get an in-depth look at building agents in Copilot Studio, with a special focus on the latest innovations and what's ahead. Add to schedule Discover how you can add more knowledge to Microsoft 365 Copilot with Copilot connectors and actions. See more details From Copilot Studio to Visual Studio and Azure AI Foundry, join this session to discover the various ways you can build agents for Microsoft 365. Register now Learn Explore how to build Microsoft Teams collaborative agents as virtual colleagues with Visual Studio Code. Sign up Join us for a session on building advanced copilot Studio agents by integrating Azure AI Search, the Azure model catalog, and Model-Context Protocol (MCP) into your agents. Learn more Build task -specific declarative agents for Microsoft 365 Copilot using advanced recipes in Copilot Studio. View session See how you can build agent in Copilot Studio with deep integration into Azure AI Foundry services. Explore more Tune into this breakout session to learn how to build declarative agents for Microsoft 365 Copilot. Save to favorites Keep up to date Microsoft build labs- Learn how to use all of the latest dev tools in our hands-on labs. Sign up Updated Teams AI library- Create even more powerful agents for Microsoft Teams. Read how LinkedIn- Get the lates news, product announcements, demos and more Follow us Community calls- Learn from our experts on a variety of Microsoft 365 platform topics. Join a call26Views0likes0CommentsNeon Serverless Postgres is Now Generally Available!
We are excited to announce that Neon Serverless Postgres is now generally available as an Azure Native Integration. This milestone marks a significant step forward in providing developers with a powerful, serverless Postgres solution, seamlessly integrated within the Azure ecosystem. We’re excited to bring Neon to all Azure developers, especially AI platforms. Neon Serverless Postgres scales automatically to match your workload and can branch instantly for an incredible developer experience. And for AI developers concerned about scale, cost efficiency, and data privacy, Neon enables them to easily adopt database-per-customer architectures, ensuring real-time provisioning and data isolation for their customers. - Nikita Shamgunov, CEO, Neon. Neon as a natively integrated offering on Azure Azure Native Integrations enable developers to seamlessly provision, manage, and tightly integrate independent software vendor (ISV) software and services within the Azure ecosystem. Neon's native Integration enables developers to use Neon Serverless Postgres with the same experience as any other Azure service, fully integrated within Azure's workflows, identity systems, and billing framework. Developers can provision Neon Serverless Postgres directly from the Azure Portal, where it is listed alongside Microsoft’s database offerings in the Databases section, complete with a partner badge. Once provisioned, Neon projects, branches, and connection strings can be managed using the Azure portal, CLI, and SDK. The deep integration also simplifies security and procurement, allowing authentication using Microsoft Entra ID (formerly Azure Active Directory) and consolidating Neon usage into your organization’s Azure bill. New Features Available with GA Release With General Availability release, several new features are now accessible. The intent is to make all innovative Neon features available to developers in the Azure eco-system. With GA, the following features are now added: Neon Project and Branch Creation from Azure: Developers now can create Neon projects and branches directly through the Azure portal. This enhancement simplifies team operations, facilitates tighter CI/CD integration, allows for more detailed environment management, and accelerates the transition from provisioning to production. Connect to database: The connect experience as part of Azure resource allows developers to connect Neon Serverless Postgres to their application stack directly from the Azure portal. This simplifies the process of connecting Neon databases to a few clicks from Azure portal. Azure CLI and SDKs (Java, Python, .NET, Go, JS): Developers can use the Azure CLI and SDK to manage Neon organization, projects, branches, and connection strings. This flexibility allows developers to choose the client of their preference and integrate Neon seamlessly into their current development environment. Semantic Kernel: Neon integrates directly with Semantic Kernel, enabling developers to orchestrate full Retrieval-Augmented Generation (RAG) pipelines. Generate embeddings with Azure OpenAI, store and query them in Neon using pgvector, and retrieve relevant results in milliseconds. You can read more about this release in the Neon launch announcement as well. 🎉 Neon for Enterprises and AI startups Neon offers instant provisioning, branching, seamless integration, and cost-effective development workflows. This makes it the perfect solution which can power organizations of all sizes and growth stage, whether you are an Enterprise or a fast-scaling AI startup. For Enterprises, Neon offers efficient scalability, strong recovery guarantees, and developer workflows that align with modern software development practices. Neon’s compute layer scales automatically based on workload, ensuring smoother performance during peak traffic, lower infrastructure costs, and reduced operational overhead. For AI startups, Neon provides the speed, flexibility, and automation needed for modern AI applications. If you are part of Microsoft for Startups program, you can leverage the native integration of the serverless Postgres to experience features like scale to zero and branching for faster and a cost-effective AI solution. Try Out Neon Serverless Postgres on Azure We invite you to try out Neon Serverless Postgres on Azure. Follow the Docs and start leveraging the power of serverless Postgres in your applications. The collaboration between Microsoft and Neon has resulted in a developer-focused roadmap, ensuring continuous improvements and new features. Link to our Feedback communities If you have new scenarios you would like to propose, please share your feedback with us on Azure feedback community. You can reach out to Neon via Neon discord community or contact feedback@neon.tech. We are committed to listening to you and enhancing the Neon experience to solve your problems. We look forward to seeing how developers will leverage Neon Serverless Postgres to build intelligent and scalable Azure applications. Next steps For all the links and resources, please refer the following: Subscribe to Neon Serverless Postgres on Azure portal or Azure Marketplace Learn more about Neon Serverless Postgres at Microsoft docs Read the launch blogpost by Neon Discover more about Neon Learn about Microsoft’s investment in Neon609Views3likes1CommentLearn How to Build Smarter AI Agents with Microsoft’s MCP Resources Hub
If you've been curious about how to build your own AI agents that can talk to APIs, connect with tools like databases, or even follow documentation you're in the right place. Microsoft has created something called MCP, which stands for Model‑Context‑Protocol. And to help you learn it step by step, they’ve made an amazing MCP Resources Hub on GitHub. In this blog, I’ll Walk you through what MCP is, why it matters, and how to use this hub to get started, even if you're new to AI development. What is MCP (Model‑Context‑Protocol)? Think of MCP like a communication bridge between your AI model and the outside world. Normally, when we chat with AI (like ChatGPT), it only knows what’s in its training data. But with MCP, you can give your AI real-time context from: APIs Documents Databases Websites This makes your AI agent smarter and more useful just like a real developer who looks up things online, checks documentation, and queries databases. What’s Inside the MCP Resources Hub? The MCP Resources Hub is a collection of everything you need to learn MCP: Videos Blogs Code examples Here are some beginner-friendly videos that explain MCP: Title What You'll Learn VS Code Agent Mode Just Changed Everything See how VS Code and MCP build an app with AI connecting to a database and following docs. The Future of AI in VS Code Learn how MCP makes GitHub Copilot smarter with real-time tools. Build MCP Servers using Azure Functions Host your own MCP servers using Azure in C#, .NET, or TypeScript. Use APIs as Tools with MCP See how to use APIs as tools inside your AI agent. Blazor Chat App with MCP + Aspire Create a chat app powered by MCP in .NET Aspire Tip: Start with the VS Code videos if you’re just beginning. Blogs Deep Dives and How-To Guides Microsoft has also written blogs that explain MCP concepts in detail. Some of the best ones include: Build AI agent tools using remote MCP with Azure Functions: Learn how to deploy MCP servers remotely using Azure. Create an MCP Server with Azure AI Agent Service : Enables Developers to create an agent with Azure AI Agent Service and uses the model context protocol (MCP) for consumption of the agents in compatible clients (VS Code, Cursor, Claude Desktop). Vibe coding with GitHub Copilot: Agent mode and MCP support: MCP allows you to equip agent mode with the context and capabilities it needs to help you, like a USB port for intelligence. When you enter a chat prompt in agent mode within VS Code, the model can use different tools to handle tasks like understanding database schema or querying the web. Enhancing AI Integrations with MCP and Azure API Management Enhance AI integrations using MCP and Azure API Management Understanding and Mitigating Security Risks in MCP Implementations Overview of security risks and mitigation strategies for MCP implementations Protecting Against Indirect Injection Attacks in MCP Strategies to prevent indirect injection attacks in MCP implementations Microsoft Copilot Studio MCP Announcement of the Microsoft Copilot Studio MCP lab Getting started with MCP for Beginners 9 part course on MCP Client and Servers Code Repositories Try it Yourself Want to build something with MCP? Microsoft has shared open-source sample code in Python, .NET, and TypeScript: Repo Name Language Description Azure-Samples/remote-mcp-apim-functions-python Python Recommended for Secure remote hosting Sample Python Azure Functions demonstrating remote MCP integration with Azure API Management Azure-Samples/remote-mcp-functions-python Python Sample Python Azure Functions demonstrating remote MCP integration Azure-Samples/remote-mcp-functions-dotnet C# Sample .NET Azure Functions demonstrating remote MCP integration Azure-Samples/remote-mcp-functions-typescript TypeScript Sample TypeScript Azure Functions demonstrating remote MCP integration Microsoft Copilot Studio MCP TypeScript Microsoft Copilot Studio MCP lab You can clone the repo, open it in VS Code, and follow the instructions to run your own MCP server. Using MCP with the AI Toolkit in Visual Studio Code To make your MCP journey even easier, Microsoft provides the AI Toolkit for Visual Studio Code. This toolkit includes: A built-in model catalog Tools to help you deploy and run models locally Seamless integration with MCP agent tools You can install the AI Toolkit extension from the Visual Studio Code Marketplace. Once installed, it helps you: Discover and select models quickly Connect those models to MCP agents Develop and test AI workflows locally before deploying to the cloud You can explore the full documentation here: Overview of the AI Toolkit for Visual Studio Code – Microsoft Learn This is perfect for developers who want to test things on their own system without needing a cloud setup right away. Why Should You Care About MCP? Because MCP: Makes your AI tools more powerful by giving them real-time knowledge Works with GitHub Copilot, Azure, and VS Code tools you may already use Is open-source and beginner-friendly with lots of tutorials and sample code It’s the future of AI development connecting models to the real world. Final Thoughts If you're learning AI or building software agents, don’t miss this valuable MCP Resources Hub. It’s like a starter kit for building smart, connected agents with Microsoft tools. Try one video or repo today. Experiment. Learn by doing and start your journey with the MCP for Beginners curricula.287Views1like1Comment🤖 Zero-Code Business Intelligence with OpenAI and Microsoft Teams
📘 Introduction Manually compiling and distributing daily performance reports can be time-consuming. This post demonstrates how to automate this process using Azure Logic Apps (Standard), Azure OpenAI, and Prompt Templates to generate insightful daily summaries and deliver them directly to Microsoft Teams. Here's a visual representation of the automated workflow: 🏢 Scenario Imagine needing to provide a daily summary of branch performance for a multi-location restaurant business. Instead of manually gathering data, calculating rankings, and writing a summary, a Logic App can: Run automatically each day. Read the latest sales and performance data from Azure Blob Storage. Use Azure OpenAI with a carefully crafted prompt template to analyze the data. Generate a formatted, engaging leaderboard summary. Post the summary directly to a specified user or channel in Microsoft Teams. 🧠 Demystifying Prompt Templates Think of Prompt Templates as smart blueprints for your AI conversations. They are pre-defined text structures using Jinja2 syntax, allowing you to insert dynamic data (like daily sales figures) exactly where needed when the Logic App runs. For our Logic App, this translates to: Crafting a base prompt with placeholders, such as {{ branch.TotalSalesEuro}} for sales figures. Letting the Logic App automatically fill these placeholders using data retrieved earlier (e.g., from the blob storage file). Ensuring the AI receives well-formed, context-rich instructions every time, without manual intervention. ✨ Why Use Prompt Templates in Logic Apps? Integrating Prompt Templates into your Logic Apps offers several advantages: Standardized Prompts: Keep your AI instructions consistent by defining the core logic in one template, rather than scattering variations throughout your workflow actions. Build Once, Use Often: Design a template for one report (like our leaderboard) and easily adapt or reuse it for other analytical tasks or across different Logic Apps. Simplified Updates: Need to refine the AI's instructions? Modify the central template, without editing the entire flow. Seamless Data Integration: Connect the dots effortlessly. Data flowing through your Logic App (from triggers, storage, APIs, etc.) can be directly wired into your prompt template variables. This approach streamlines the development of data-driven AI automations, making your solutions more robust and easier to manage. 🧰 Prerequisites To build this, you'll need: An Azure Logic App (Standard) resource. An Azure OpenAI resource with a deployed model (e.g., gpt-4). An Azure Blob Storage account to store the daily sales data (e.g., sales.json). A connection configured for Microsoft Teams in your Logic App. A connection configured for Azure Blob Storage. A connection configured for Azure OpenAI. 🔧 Build the Logic App workflow Here’s a breakdown of the key steps in the Logic App: 0.Build the Logic App workflow create workflow based on the steps showing below 1. Trigger: Recurrence Set to run daily to automate the report generation. You can specify interval 2. Action: Read blob content based on URI Add an action Search for Blob Connects to Azure Blob Storage. Note: This connection uses managed identity. You must grant the "Storage Blob Data Contributor" role to this Logic App via your storage account's Access Control (IAM) settings. Ensure the sales.json file is uploaded to your Azure Blob Storage. In the "Read blob content based on URI" configuration, set the Blob URI to the path of this file (e.g., mycontainer/myfolder/sales.json). Note: Enter the URI path without quotation marks. 3. Action: data (Compose) Takes the file content output from the previous step. We will create a 'Compose' component, then we rename it to 'data' Our goal is to store the json file data in a compose component, it is helping us to monitor the data easier. This makes the data easily accessible for the prompt template. As shown in the figure, click on the insert dynamic content (step 3 in the figure), and then choose Response from read blob action Content 4. Action: question (Compose) Defines the core task for the AI. Similar to previous step, we need to create a compose component, we then call it question, then input is as below Input: [ { "role": "user", "content": "Summaris sales data for me" } ] In this example, it's a static input, but it could be made dynamic (e.g., taken from the trigger or another source). 5. Action: Get chat completions using Prompt Template Next action is to create an Azure OpenAI service, select Get chat completions using Prompt Template Connects to your Azure OpenAI service. You need to fill in the required parameters based on your Azure OpenAI Service portal. Configuration: Deployment Identifier: Your AOAI model deployment name (e.g., gpt-4). Prompt Template: Contains the instructions, formatting rules, example output, and placeholders for dynamic data (data) and the user query (question). (See details below for the template). Prompt Template Variables: Maps the outputs from the data and question Compose actions to the variables used in the template. You need to choose Prompt Template Variable in order to be able to add your dynamic inputs for the prompt template. You need to assign output of compose components as an input for your prompt template as shown in the figure below 6. Action: Post message in a chat or channel Add a new action, search for Microsoft Teams and select Post message in a chat or channel Connects to Microsoft Teams by signing in into your account. Note: You need to use your School or Corporate Microsoft Teams for this feature Sends the AI-generated summary. Configuration: Post As: Flow bot Post in: chat with Flow bot Recipient: Your email address Message: Insert output of your OpenAI response 7.RUN Workflow is done and it is ready to be run. Here is the output you can get in Microsoft Teams 📝 Prompt Template Details The core of the AI generation lies in the prompt template provided to the Get chat completions using Prompt Template action. Let's break it down: system: You are a helpful assistant for a multi-branch restaurant business... Your job is to: <Your instructions> Generate a clear and engaging leaderboard-style summary for Microsoft Teams. Start with a title: "📊 Daily Branch Leaderboard - @{formatDateTime(convertTimeZone(utcNow(), 'UTC', 'Europe/Dublin'), 'MMMM,d,yyyy')}\r\n}" Briefly summarize how the branches performed in ranked order... Use emojis... Highlight top performers... Be sure that you are using \n between each two branch names... <Your instructions> <Example> #Example for generating response: 📊 Daily Branch Leaderboard – May 5, 2025 Here’s how your branches performed today: 1️⃣ Galway Bay – 🥇 Top performer... 2️⃣ Dublin Central – Strong sales... <Example> <Sales Data> # Data # Branch Sales Data Here is the branch sales data... {% for branch in data %} Branch ID: {{branch.BranchID}} Branch Name: {{branch.BranchName}} Total Sales (€): {{branch.TotalSalesEuro}} ... {% endfor %} <Sales Data> <User Query> # Question The employee has asked the following: {% for item in question %} {{item.role}}: {{item.content}} {% endfor %} <User Query> system:: Sets the persona and overall goal for the AI. <Your instructions>: Provides specific rules for formatting, tone, and content. @{formatDateTime(...)}}: Logic Apps expression to dynamically insert the current date. <Example>: Shows the desired output format. <Sales Data>: Defines how the input data should be structured for the AI. {% for branch in data %}: Jinja loop iterates through the branch data read from the blob. {{ branch.BranchName }} etc.: Jinja placeholders inject specific data fields for each branch. <User Query>: Injects the question defined in the question Compose action. This structured template ensures the AI receives all necessary context and instructions to generate the desired leaderboard summary consistently. ✅ Final Output When the Logic App runs, it will post a message similar to the example provided in the prompt template directly into the specified Teams chat or channel, providing a clear, concise, and engaging daily performance summary. 💬 Feedback Let me know if you found this helpful or have ideas for other Logic Apps + AI automation scenarios!Microsoft's Verification and Support System
Microsoft's verification and support system is absolutely appalling. Despite providing every piece of required information for my company, their system continues to reject my verification attempts. I've gone above and beyond by paying extra to create additional email accounts in the format email address removed for privacy reasons. I've submitted QR-coded invoices officially verified by government authorities. Our WHOIS data is current and accurate. My name matches perfectly across all documentation and our DUNS number is correct. Yet, I keep receiving the same frustrating response: "This is because the primary contact details did not match." When I reach out through their official support channels, it takes them five days just to respond. It's been two months now, and I'm still waiting for approval just to register. It's absurd that such a large company maintains such a disastrous system. All they need to do is verify real users directly without relying on AI algorithms. They're so focused on automating everything with artificial intelligence that they've created a dysfunctional process. Despite all their technology, humans still remain superior at handling these verification tasks, yet Microsoft seems determined to automate a process they haven't properly designed.118Views0likes9CommentsWindows Recall keeps crashing
Windows recall keeps crashing for me with this error in the event viewer I feel like this could be related to the crashing, but I'm not sure Faulting application name: AIXHost.exe, version: 2125.8200.0.0, time stamp: 0x67e02391 Faulting module name: AIXView.dll, version: 2125.8000.0.0, time stamp: 0x67ddbaa7 Exception code: 0xc0000409 Fault offset: 0x00000000000874a1 Faulting process id: 0x4AC0 Faulting application start time: 0x1DBBFAAEE149DAA Faulting application path: C:\WINDOWS\SystemApps\MicrosoftWindows.Client.AIX_cw5n1h2txyewy\AIXHost.exe Faulting module path: C:\WINDOWS\SystemApps\MicrosoftWindows.Client.CoreAI_cw5n1h2txyewy\AIXView.dll Report Id: b7747b32-c2ed-4a0d-880d-248aebabb9ae Faulting package full name: MicrosoftWindows.Client.AIX_1000.26100.3915.0_x64__cw5n1h2txyewy Faulting package-relative application ID: AIXApp63Views1like2CommentsNavigating AI security: Identifying risks and implementing mitigations
As artificial intelligence becomes central to software innovation, it also introduces unique security challenges—especially in applications powered by large language models (LLMs). In this edition of the Software Development Company Security Series, we explore the evolving risks facing AI-powered products and share actionable strategies to secure AI solutions throughout the development lifecycle. *Data based on 2024–2025 global reports from Cyberhaven, Swimlane, FS-ISAC, Capgemini, Palo Alto Networks, and Pillar Security analyzing AI security incidents across sectors. Understanding the Evolving AI Threat Landscape AI systems, particularly LLMs, differ from traditional software in one fundamental way: they’re generative, probabilistic, and nondeterministic. This unpredictability opens the door to novel security risks, including: Sensitive Data Exposure: Leaked personal or proprietary data via model outputs. Prompt Injection: Manipulated inputs crafted to subvert AI behavior. Supply Chain Attacks: Risks from compromised training data, open-source models, or third-party libraries. Model Poisoning: Insertion of malicious content during training to bias outcomes. Jailbreaks & Misuse: Circumventing safeguards to produce unsafe or unethical content. Compliance & Trust Risks: Legal, regulatory, and reputational consequences from unvalidated AI outputs. These risks underscore the need for a security-first approach to designing, deploying, and operating AI systems. Key Risks: The OWASP Top 10 for LLMs The OWASP Top 10 LLM Risks offer a framework for understanding threats specific to generative AI. Key entries include: Prompt Injection Sensitive Data Disclosure Model and Data Poisoning Excessive Model Permissions Hallucination & Misinformation System Prompt Leakage Vector Embedding Exploits Uncontrolled Resource Consumption Each of these risks presents opportunities for attackers across the AI lifecycle—from model training and prompt design to output handling and API access. Inherent Risks of LLM-Based Applications Three core attributes contribute to LLM vulnerabilities: Probabilistic Outputs: Same prompt, different results. Non-Determinism: Inconsistent behavior, compounded over time. Linguistic Flexibility: Prone to manipulation and hallucination. Common attack scenarios include: Hallucination: Fabricated content presented as fact—dangerous in domains like healthcare or legal. Indirect Prompt Injection: Malicious prompts hidden in user content (emails, docs). Jailbreaks: Bypassing guardrails using clever or multi-step prompting. Mitigations include retrieval-augmented generation (RAG), output validation, prompt filtering, and user activity monitoring. Microsoft’s Approach to Securing AI Applications Securing AI requires embedding Zero Trust principles and responsible AI at every stage. Microsoft supports this through: Zero Trust Architecture Verify explicitly based on identity and context Use least privilege access controls Assume breach with proactive monitoring and segmentation Shared Responsibility Model Customer-managed models: You manage model security and data. Microsoft-managed platforms: Microsoft handles infrastructure; you configure securely. End-to-End Security Controls Protect infrastructure, APIs, orchestration flows, and user prompts. Enforce responsible AI principles: fairness, privacy, accountability, and transparency. Tools & Ecosystem Microsoft Defender for Cloud: Monitors AI posture and detects threats like credential misuse or jailbreak attempts. Azure AI Foundry: Scans models for embedded risks and unsafe code. Prompt Shield: Filters harmful inputs in real-time. Red Team Tools (e.g., Pirate): Simulate attacks to harden defenses pre-deployment. Action Steps for Software Companies Securing AI Products Here’s a focused checklist for AI builders and software development companies: Embed Security Early Apply Zero Trust by default Use identity and access management Encrypt data in transit and at rest Leverage Microsoft Security Ecosystem Enable Defender for Cloud for AI workload protection Scan models via Azure AI Foundry Deploy Prompt Shield to defend against jailbreaks and injection attacks Secure the Supply Chain Maintain a Software Bill of Materials (SBOM) Regularly audit and patch dependencies Sanitize external data inputs Mitigate LLM-Specific Risks Validate outputs and restrict unsafe actions Use RAG to reduce hallucination Monitor prompt usage and filter malicious patterns Build for Multi-Tenancy and Compliance Use Well-Architected Framework for OpenAI Isolate tenant data Ensure alignment with data residency and privacy laws Continuously Improve Conduct regular red teaming Monitor AI systems in production Establish incident response playbooks Foster a Security-First Culture Share responsibility across engineering, product, and security teams Train users on AI risks and responsible usage Update policies to adapt to evolving threats Conclusion: Secure AI Is Responsible AI AI’s potential can only be realized when it is both innovative and secure. By embedding security and responsibility across the AI lifecycle, software companies can deliver solutions that are not only powerful—but trusted, compliant, and resilient. Explore More OWASP Top 10 for Large Language Model Applications | OWASP Foundation Overview - AI threat protection - Microsoft Defender for Cloud | Microsoft Learn Prompt Shields in Azure AI Content Safety - Azure AI services | Microsoft Learn AI Red Teaming Agent - Azure AI Foundry | Microsoft Learn AI Trust and AI Risk: Tackling Trust, Risk and Security in AI Models What is Azure AI Content Safety? - Azure AI services | Microsoft Learn Overview of Responsible AI practices for Azure OpenAI models - Azure AI services | Microsoft Learn Architecture Best Practices for Azure OpenAI Service - Microsoft Azure Well-Architected Framework | Microsoft Learn Azure OpenAI Landing Zone reference architecture AI Workload Documentation - Microsoft Azure Well-Architected Framework | Microsoft Learn Announcing new tools in Azure AI to help you build more secure and trustworthy generative AI applications | Microsoft Azure Blog HiddenLayer Model Scanner helps developers assess the security of open models in the model catalog | Microsoft Community Hub Inside AI Security with Mark Russinovich | BRK227 The Price of Intelligence - ACM Queue169Views0likes0Comments