Microsoft Security Community Blog

Secure AI Workloads in Azure: Join Our Next Azure Decoded Session on April 8th

ShirleyseHaley
Mar 26, 2026

As organizations move from experimenting with AI to deploying it in production, one thing is clear: securing AI workloads takes a different approach than securing traditional cloud applications. Join us on April 8th to learn how Microsoft Defender for Cloud helps you discover, assess, and protect AI workloads in Azure.

AI introduces new risks—like prompt injection, data leakage, and model misuse—which means security teams need visibility and guardrails that extend beyond traditional cloud controls. In our next Azure Decoded session, we’ll focus on securing AI workloads in Azure with Microsoft Defender for Cloud.

Register now for the Azure Decoded session on April 8th at 12 PM Pacific Time.

Bringing AI security architecture to life with Azure Decoded

In the Lock Down AI Workloads with Microsoft Defender for Cloud session, we move from concepts to implementation and show how these protections appear in the platform.

We’ll walk through where Microsoft Defender for Cloud fits into an end-to-end AI security strategy—and how discovery, posture management, and runtime protection work together to secure AI workloads built on Azure.

You’ll also see how to connect the dots across the workflow—so signals from AI resources, identity, and data controls roll up into actionable recommendations and alerts. During the live demo, we’ll show you how to:

  • Enable and scope AI workload protections in Defender for Cloud
  • Use the Data & AI security dashboard to understand coverage and priority risks
  • Review posture findings (CSPM) and translate them into remediation steps
  • Investigate runtime detections (CWP) and see how they map into Microsoft Defender XDR
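As a taste of the first step above, enabling Defender for Cloud plans can be scripted with the Azure CLI. The sketch below uses the real `az security pricing` command group, but the plan name `AI` and the subscription ID are assumptions for illustration—verify the exact plan name in your tenant before running anything.

```shell
#!/usr/bin/env bash
# Sketch: enable Defender for Cloud's AI workloads plan on a subscription.
# Assumptions: the plan name "AI" and the subscription ID below are
# placeholders -- confirm the plan name with `az security pricing list`.
set -euo pipefail

SUBSCRIPTION_ID="00000000-0000-0000-0000-000000000000"  # placeholder

# Target the subscription that hosts your AI resources
az account set --subscription "$SUBSCRIPTION_ID"

# List current Defender plans and tiers to check existing coverage
az security pricing list \
  --query "value[].{plan:name, tier:pricingTier}" -o table

# Turn on the AI workloads plan (Standard tier = protections enabled)
az security pricing create --name AI --tier Standard
```

Scoping beyond the subscription level, and the dashboard and alert views, are done in the Azure portal; the session walks through those interactively.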

Our goal isn’t theory for theory’s sake. It’s to help you see how AI security shows up in real architecture and real workflows—so you can apply it confidently in your own environment.

Who is this session for?

We built this session for practitioners who are actively working with AI in Azure, including:

  • Developers building AI applications and agents
  • Security engineers responsible for protecting AI workloads
  • Cloud architects designing enterprise-ready AI solutions

If you’re balancing innovation with security and governance, this session is designed to help you translate AI security concepts into concrete steps in Azure.

Before you join: Familiarity with core Azure concepts (subscriptions, resource groups, identity, networking) is helpful. You don’t need to be a machine learning expert—the focus is on securing the cloud resources and workflows that power AI solutions.

From AI security concepts to platform protections

If you’d like to get the most out of the session, start with the Microsoft Learn module Protect AI workloads with Microsoft Defender for Cloud. It introduces the building blocks of AI workloads in Azure and the security considerations that come with them.

In the module, you’ll learn how to:

  • Identify the layers that make up AI workloads in Azure
  • Understand AI-specific risks, including prompt injection, data leakage, and model misuse
  • Use Microsoft Foundry guardrails and observability to monitor and constrain model behavior
  • See how Defender for Cloud, Microsoft Purview, and Microsoft Entra ID work together for defense in depth and governance

Think of this as your foundation: it connects AI workload architecture to the controls you’ll configure in Azure, so you can protect inputs and outputs, maintain visibility, and apply governance without slowing delivery.

Catch up on the previous Azure Decoded session

If you missed the previous Azure Decoded session—or want a refresher—you can watch it on demand on YouTube:

▶️ Watch the previous Azure Decoded session on YouTube

It’s a helpful refresher and sets the stage for the April 8 discussion.

Turn learning into hands-on skills

If you want to go beyond the session and practice in your own environment, the Microsoft Applied Skills credential, Secure AI Solutions in the Cloud, is a great next step after the Azure Decoded session. While earning it, you will:

  • Scope and enable protections for AI-related resources and workloads in Azure
  • Validate coverage and prioritize risks using the Data & AI security dashboard
  • Find and remediate posture gaps (CSPM) that increase exposure for AI workloads
  • Investigate runtime detections (CWP) and understand what they mean in the context of AI workload behavior
  • Triage AI-related alerts and incidents in Microsoft Defender XDR and decide on next steps

Get started

1️⃣ Register for Azure Decoded: Lock Down AI Workloads with Microsoft Defender for Cloud

2️⃣ Watch the previous Azure Decoded session before April 8th (optional refresher)

3️⃣ Earn the Microsoft Applied Skills: Secure AI Solutions in the Cloud credential to showcase your skills.

The goal is to leave with something reusable: a practical sequence you can apply to new projects to confirm coverage, reduce posture gaps, and respond quickly when Defender signals suspicious activity tied to AI workloads.

Updated Mar 26, 2026
Version 1.0