Security Copilot

Cyber Dial Agent: Protecting Your Custom Copilot Agents
Introducing the Cyber Dial Agent: a browser add-on and agent that streamlines security investigations by giving analysts a unified, menu-driven interface for quickly reaching the relevant pages in Microsoft Defender, Purview, and Defender for Cloud. The tool eliminates manual searches across multiple portals, reducing investigation time and minimizing context switching for technical and non-technical users alike. Visit the full article “Safeguard & Protect Your Custom Copilot Agents (Cyber Dial Agent)” in the Microsoft Purview Community Blog for detailed, visual, step-by-step guides covering: importing the agent (built with Microsoft Copilot Studio) into another tenant as a solution and publishing it; adding the browser add-on in Microsoft Edge (or any modern browser); and using Purview DSPM for AI to secure Copilot Studio agents such as the Cyber Dial custom agent. Read the full article by Hesham_Saad.

Security Copilot: Demystifying SCUs Deep Dive and AMA
Security Compute Units (SCUs) are the required resource units that power Microsoft Security Copilot, ensuring dependable and consistent performance across both standalone and embedded product experiences within Microsoft Security. In this session, we’ll demystify SCUs by unpacking: what SCUs are and how they function; the billing models that govern their usage; optimization strategies to maximize value; and best practices for SCU planning and deployment. You’ll also have the opportunity to engage directly with Security Copilot experts to ask your SCU-related questions and gain practical insights.

What is an AMA? An “Ask Microsoft Anything” (AMA) session is an opportunity for you to engage directly with Microsoft employees! This AMA will consist of a short presentation followed by taking questions on camera from the comment section below. Ask your questions, give your feedback, and our Microsoft subject matter experts will engage and respond directly in the video feed. We know this timeslot might not work for everyone, so feel free to ask your questions at any time leading up to the event, and the experts will do their best to answer during the live hour. This page will stay up, so come back and use it as a resource anytime. We hope you enjoy!

No More Guesswork—Copilot Makes Azure Security Crystal Clear
Elevating Azure Security and Compliance
In today’s rapidly evolving digital landscape, security and compliance are more critical than ever. As organizations migrate workloads to Azure, the need for robust security frameworks and proactive compliance strategies grows. Security Copilot, integrated with Azure, is transforming how technical teams approach these challenges, empowering users to build secure, compliant environments with greater efficiency and confidence. As a security expert, I’d like to provide clear guidance on how to effectively utilize Security Copilot in the ever-evolving landscape of security and compliance. Security Copilot is a premium offering; it includes advanced capabilities that go beyond standard Azure security tools. These features may require specific licensing or subscription tiers. It provides deeper insights, enhanced automation, and tailored guidance for complex security scenarios. Below, I’ll highlight a range of security topics with sample Copilot prompts that you can use to help create a more secure and compliant environment.

Getting Started with Microsoft Security Copilot
Before leveraging the advanced capabilities of Security Copilot, it's important to understand the foundational requirements and setup steps:

Azure Subscription Requirement
Security Copilot is not automatically available in all Azure subscriptions. To use it, your organization must have an active Azure subscription. This is necessary to provision Security Compute Units (SCUs), which are the core resources that power Copilot workloads.

Provisioning Security Compute Units (SCUs)
SCUs are billed hourly and can be scaled based on workload needs. At least one SCU must be provisioned to activate Security Copilot. You can manage SCUs via the Azure portal or the Security Copilot portal, adjusting capacity as needed for performance and cost optimization.
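Since SCUs are billed hourly and can be scaled up or down, it helps to sketch the cost math before provisioning. Here is a minimal back-of-the-envelope estimator; the per-SCU hourly rate below is a placeholder assumption, not an official price, so check current Azure pricing for your region:

```python
# Back-of-the-envelope SCU cost estimator.
# NOTE: the hourly rate is a placeholder assumption, not an official
# price -- check current Azure pricing for the actual per-SCU rate.

PLACEHOLDER_RATE_PER_SCU_HOUR = 4.00  # assumed USD rate, illustrative only

def monthly_scu_cost(provisioned_scus: int,
                     hours_per_month: int = 730,
                     rate: float = PLACEHOLDER_RATE_PER_SCU_HOUR) -> float:
    """Estimate the monthly cost of keeping a fixed number of SCUs provisioned."""
    if provisioned_scus < 1:
        # At least one SCU must be provisioned to activate Security Copilot.
        raise ValueError("provision at least one SCU")
    return provisioned_scus * hours_per_month * rate

# Example: 3 SCUs provisioned around the clock for a month.
print(f"${monthly_scu_cost(3):,.2f}")  # → $8,760.00 at the assumed rate
```

Because billing is hourly, scaling SCUs down during off-peak hours directly reduces the `hours_per_month` each unit accrues, which is the main lever for cost optimization.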
Role-Based Access Control
To set up and manage Security Copilot, you need to be an Azure Owner or Contributor to provision SCUs, and users must be assigned appropriate Microsoft Entra roles (e.g., Security Administrator) to access and interact with Copilot features.

Embedded Experience
Security Copilot can be used as a standalone tool or embedded within other Microsoft services like Defender for Endpoint, Intune, and Purview, offering a unified security management experience.

Data Privacy and Security: Foundational Best Practices
Why settle for generic security advice when Security Copilot delivers prioritized, actionable guidance backed by Microsoft’s best practices? Copilot doesn’t just recommend security measures; it actively helps you implement them, leveraging advanced features like encryption and granular access controls to safeguard every layer of your Azure environment. While Security Copilot doesn’t directly block threats like a firewall or Web Application Firewall (WAF), it enhances data integrity and confidentiality by analyzing security signals across Azure, identifying vulnerabilities, and guiding teams with prioritized, actionable recommendations. It helps implement encryption, access controls, and compliance-aligned configurations, while integrating with existing security tools to interpret logs and suggest containment strategies. By automating investigations and supporting secure-by-design practices, Copilot empowers organizations to proactively reduce breach risks and maintain a strong security posture.

Secure Coding and Developer Productivity
While Security Copilot supports secure coding by identifying vulnerabilities like SQL injection, Cross-Site Scripting (XSS), and buffer overflows, it is not a direct replacement for traditional code scanning tools; instead, it complements those tools by leveraging telemetry from integrated Microsoft services and applying AI-driven insights to prioritize risks and guide remediation.
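To make one of the vulnerability classes above concrete, here is an illustrative (hypothetical, not from the article) example of the SQL injection pattern such tooling flags, alongside the parameterized fix, using Python's standard sqlite3 module:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'admin')")

def find_user_unsafe(name: str):
    # VULNERABLE: user input is concatenated into the SQL string,
    # so an input like "' OR '1'='1" returns every row in the table.
    return conn.execute(
        "SELECT * FROM users WHERE name = '" + name + "'").fetchall()

def find_user_safe(name: str):
    # FIXED: a parameterized query treats the input as data, not SQL.
    return conn.execute(
        "SELECT * FROM users WHERE name = ?", (name,)).fetchall()

print(find_user_unsafe("' OR '1'='1"))  # leaks all rows
print(find_user_safe("' OR '1'='1"))    # returns []
```

The fix is the same in any language or driver: never splice user input into query text; bind it as a parameter.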
Copilot enhances developer productivity by interpreting signals, offering tailored recommendations, and embedding security practices throughout the software lifecycle.

Understanding Security Protocols and Mechanisms
Azure’s security stands on robust protocols and mechanisms, but understanding them shouldn’t require a cryptography degree. Security Copilot demystifies encryption, authentication, and secure communications—making complex concepts accessible and actionable. With Security Copilot as your guide, teams can confidently configure Azure resources and respond to threats with informed, best-practice decisions.

Compliance and Regulatory Alignment
Regulatory requirements such as GDPR, HIPAA, and PCI-DSS don’t have to slow you down. Security Copilot streamlines Azure compliance with ready-to-use templates, clear guidelines, and robust documentation support. From maintaining audit logs to generating compliance reports, Security Copilot keeps every action tracked and organized—reducing non-compliance risk and making audits a breeze.

Incident Response Planning
No security strategy is complete without a solid incident response plan. Security Copilot equips Azure teams with detailed protocols for identifying, containing, and mitigating threats. It enhances Security Information and Event Management (SIEM) and Security Orchestration, Automation, and Response (SOAR) solutions through ready-made playbooks tailored to diverse scenarios. With built-in incident simulations, Copilot enables teams to rehearse and refine their responses—minimizing breach impact and accelerating recovery.

Security Best Practices for Azure
Staying ahead of threats means never standing still. Security Copilot builds on Azure’s proven security features—like multi-factor authentication, regular updates, and least privilege access—by automating their implementation, monitoring usage patterns, and surfacing actionable insights.
It connects with tools like Microsoft Defender and Entra ID to interpret signals, recommend improvements, and guide teams in real time. With Copilot, your defenses don’t just follow best practices, they evolve dynamically to meet emerging threats, keeping your team sharp and your environment secure.

Integrating Copilot into Your Azure Security Strategy
Security Copilot isn’t just a technical tool—it’s your strategic partner for Azure security. By weaving Copilot into your workflows, you unlock advanced security enhancements, optimized code, and robust privacy protection. Its holistic approach ensures security and compliance are seamlessly integrated into every corner of your Azure environment.

Conclusion
Security Copilot is changing the game for Azure security and compliance. By blending secure coding, advanced security expertise, regulatory support, incident response playbooks, and best practices, Copilot empowers technical teams to build resilient, compliant cloud environments. As threats evolve, Copilot keeps your data protected and your organization ahead of the curve. Ready to take your Azure security and compliance to the next level? Start leveraging Security Copilot today to empower your team, streamline operations, and stay ahead of evolving threats. Dive deeper into best practices, hands-on tutorials, and expert guidance to maximize your security posture and unlock the full potential of Copilot in your organization. Explore, learn, and secure your cloud—your journey starts now!

Further Reading & Resources
Microsoft Security Copilot documentation
Get started with Microsoft Security Copilot
Microsoft Copilot in Azure Overview
Security best practices and patterns - Microsoft Azure
Azure compliance documentation
Copilot Learning Hub
Microsoft Security Copilot Blog

Author: Microsoft Principal Technical Trainer, https://www.linkedin.com/in/eliasestevao/ #MicrosoftLearn #SkilledByMTT

Graph RAG for Security: Insights from a Microsoft Intern
As a software engineering intern at Microsoft Security, I had the exciting opportunity to explore how Graph Retrieval-Augmented Generation (Graph RAG) can enhance data security investigations. This blog post shares my learning journey and insights from working with this evolving technology.

Secure and govern AI apps and agents with Microsoft Purview
The Microsoft Purview family is here to help you secure and govern data across third-party IaaS and SaaS and multi-platform data environments, while helping you meet the compliance requirements you may be subject to. Purview brings simplicity with a comprehensive set of solutions built on a platform of shared capabilities that helps keep your most important asset, your data, safe. With the introduction of AI technology, Purview has also expanded its data coverage to include discovering, protecting, and governing the interactions of AI apps and agents, such as Microsoft Copilots (like Microsoft 365 Copilot and Security Copilot), enterprise-built AI apps (like ChatGPT Enterprise), and other consumer AI apps (like DeepSeek) accessed through the browser. To help you view and investigate interactions with all of those AI apps, and to create and manage policies to secure and govern them in one centralized place, we have launched Purview Data Security Posture Management (DSPM) for AI. You can learn more about DSPM for AI here, with short video walkthroughs: Learn how Microsoft Purview Data Security Posture Management (DSPM) for AI provides data security and compliance protections for Copilots and other generative AI apps | Microsoft Learn

Purview capabilities for AI apps and agents
To understand the current set of capabilities within Purview to discover, protect, and govern various AI apps and agents, please refer to our Learn doc: Microsoft Purview data security and compliance protections for Microsoft 365 Copilot and other generative AI apps | Microsoft Learn. Here is a quick reference guide for the capabilities available today. Note that DLP for Copilot and sensitivity label adherence are currently designed to protect content in Microsoft 365. Thus, Security Copilot and Copilot in Fabric, along with Copilot Studio custom agents that do not use Microsoft 365 as a content source, do not have these features available.
Please see the list of AI sites supported by Microsoft Purview DSPM for AI here.

Conclusion
Microsoft Purview can help you discover, protect, and govern the prompts and responses from AI applications in Microsoft Copilot experiences, enterprise AI apps, and other AI apps through its data security and data compliance solutions, while allowing you to view, investigate, and manage interactions in one centralized place in DSPM for AI.

Follow-up reading
Check out the deployment guides for DSPM for AI:
How to deploy DSPM for AI - https://aka.ms/DSPMforAI/deploy
How to use DSPM for AI data risk assessment to address oversharing - https://aka.ms/dspmforai/oversharing
Address oversharing concerns with Microsoft 365 blueprint - aka.ms/Copilot/Oversharing
Explore the Purview SDK:
Microsoft Purview SDK Public Preview | Microsoft Community Hub (blog)
Microsoft Purview documentation - purview-sdk | Microsoft Learn
Build secure and compliant AI applications with Microsoft Purview (video)
References for DSPM for AI:
Microsoft Purview data security and compliance protections for Microsoft 365 Copilot and other generative AI apps | Microsoft Learn
Considerations for deploying Microsoft Purview AI Hub and data security and compliance protections for Microsoft 365 Copilot and Microsoft Copilot | Microsoft Learn
Block Users From Sharing Sensitive Information to Unmanaged AI Apps Via Edge on Managed Devices (preview) | Microsoft Learn, as part of Scenario 7 of Create and deploy a data loss prevention policy | Microsoft Learn
Commonly used properties in Copilot audit logs - Audit logs for Copilot and AI activities | Microsoft Learn
Supported AI sites by Microsoft Purview for data security and compliance protections | Microsoft Learn
Where Copilot usage data is stored and how you can audit it - Microsoft 365 Copilot data protection and auditing architecture | Microsoft Learn
Downloadable whitepaper: Data Security for AI Adoption | Microsoft
Explore the roadmap for DSPM for AI: Public roadmap for
DSPM for AI - Microsoft 365 Roadmap | Microsoft 365

Microsoft Purview Powering Data Security and Compliance for Security Copilot
Microsoft Purview provides security and compliance teams with extensive visibility into admin actions within Security Copilot. It offers tools for enriched user and data insights to identify, review, and manage Security Copilot interaction data in DSPM for AI. Data security and compliance administrators can also utilize Purview’s capabilities for data lifecycle management and information protection, advanced retention, eDiscovery, and more. These features support detailed investigations into logs to demonstrate compliance within the Copilot tenant.

Prerequisites
Please refer to the prerequisites for Security Copilot and DSPM for AI in the Microsoft Learn docs.

Key Capabilities and Features

Heightened Context and Clarity
As organizations adopt AI, implementing data controls and a Zero Trust approach is essential to mitigate risks like data oversharing, leakage, and non-compliant usage. Microsoft Purview, combined with Data Security Posture Management (DSPM) for AI, empowers security and compliance teams to manage these risks across Security Copilot interactions. With this integration, organizations can:
Discover data risks by identifying sensitive information in user prompts and responses. Microsoft Purview surfaces these insights in the DSPM for AI dashboard and recommends actions to reduce exposure.
Identify risky AI usage using Microsoft Purview Insider Risk Management to investigate behaviors such as inadvertent sharing of sensitive data, or to detect suspicious activity within Security Copilot usage.
These capabilities provide heightened visibility into how AI is used across the organization, helping teams proactively address potential risks before they escalate.

Compliance and Governance
Building on this visibility, organizations can take action using Microsoft Purview’s integrated compliance and governance solutions.
Here are some examples of how teams are leveraging these capabilities to govern Security Copilot interactions:
Audit provides a detailed log of user and admin activity within Security Copilot, enabling organizations to track access, monitor usage patterns, and support forensic investigations.
eDiscovery enables legal and investigative teams to identify, collect, and review Security Copilot interactions as part of case workflows, supporting defensible investigations.
Communication Compliance helps detect potential policy violations or risky behavior in administrator interactions, enabling proactive monitoring and remediation.
Data Lifecycle Management allows teams to automate the retention, deletion, and classification of Security Copilot data, reducing storage costs and minimizing risk from outdated or unnecessary information.
Together, these tools provide a comprehensive governance framework that supports secure, compliant, and responsible AI adoption across the enterprise.

Getting Started

Enable Purview Audit for Security Copilot
Sign into your Copilot tenant at https://securitycopilot.microsoft.com/ and, with Security Administrator permissions, navigate to the Security Copilot owner settings and ensure Audit logging is enabled.

Microsoft Purview
To start using DSPM for AI and the Microsoft Purview capabilities, complete the following steps to get set up, and then feel free to experiment yourself.
1. Navigate to Purview (purview.microsoft.com) and ensure you have adequate permissions to access the different Purview solutions as described here.

DSPM for AI
2. Select the DSPM for AI solution in the left-most navigation.
3. Go to the Policies or Recommendations tab and turn on the following:
a. “DSPM for AI – Capture interactions for Copilot experiences”: captures prompts and responses for data security posture and regulatory compliance from Security Copilot and other Copilot experiences.
b. “Detect risky AI usage”: helps calculate user risk by detecting risky prompts and responses in Copilot experiences.
c. “Detect unethical behavior in AI apps”: detects sensitive info and inappropriate use of AI in prompts and responses in Copilot experiences.
4. To begin reviewing Security Copilot usage within your organization and identifying interactions that contain sensitive information, select Reports from the left navigation panel.
a. The “Sensitive interactions per AI app” report shows the most common sensitive information types used in Security Copilot interactions and their frequency. For instance, this tenant has a significant amount of IT and IP address information within these interactions, so it is important to ensure that all sensitive information used in Security Copilot interactions serves legitimate workplace purposes and does not involve malicious or non-compliant use of Security Copilot.
b. “Top unethical AI interactions” shows an overview of any potentially unsafe or inappropriate interactions with AI apps. In this case, Security Copilot has only seven potentially unsafe interactions, which included unauthorized disclosure and regulatory collusion.
c. “Insider risk severity per AI app” shows the number of high-risk, medium-risk, low-risk, and no-risk users interacting with Security Copilot. In this tenant, there are about 1.9K Security Copilot users, but very few of them raise an insider risk concern.
d. To check the interaction details of this potentially risky activity, head over to Activity Explorer for more information.
5. In Activity Explorer, filter the App to Security Copilot. You can also filter on user risk level and sensitive information type. To identify the highest-risk behaviors, filter for users with a medium to high risk level or those associated with the most sensitive information types.
a. Once you have filtered, you can look through the activity details for more information, such as user details, sensitive information types, prompt and response data, and more.
b. Based on the details shown, you may decide to investigate the activity and the user further. To do so, use the data security investigation and governance tools.

Data Security Investigations and Governance
If you find Security Copilot actions in DSPM for AI Activity Explorer to be potentially inappropriate or malicious, you can look for further information in Insider Risk Management (IRM), through an eDiscovery case, Communication Compliance (CC), or Data Lifecycle Management (DLM).

Insider Risk Management
By enabling the quick policy in DSPM for AI to monitor risky Copilot usage, alerts will start appearing in IRM. Customize this policy based on your organization's risk tolerance by adjusting triggering events, thresholds, and indicators for detected activity. Examine the alerts associated with the “DSPM for AI – Detect risky AI usage” policy, sorting them by severity from high to low if helpful. For these alerts, a User Activity scatter plot provides insight into the activities preceding and following the user's engagement with a risky prompt in Security Copilot, helping the data security administrator understand the necessary triage actions for this user/alert. After thoroughly investigating these details and determining whether the activity was malicious or an inadvertent insider risk, appropriate actions can be taken, including issuing a user warning, resolving the case, sharing the case with an email recipient, or escalating the case to eDiscovery for further investigation.

eDiscovery
To identify, review, and manage your Security Copilot logs in support of investigations, use the eDiscovery tool:
a. Create an eDiscovery case.
b. Create a new search.
c. In Search, go to the condition builder and select Add conditions -> KeyQL.
d. Enter the query: KQL Equal (ItemClass=IPM.SkypeTeams.Message.Copilot.Security.SecurityCopilot)
e. Run the query.
f. Once it completes, add the search to a review set (button at the top).
g. In the review set, view details of the Security Copilot conversation.

Communication Compliance
In Communication Compliance, as in IRM, you can investigate details around Security Copilot interactions. Specifically, in CC you can determine whether these interactions contained non-compliant usage of Security Copilot or inappropriate text. After identifying the sentiment of the Security Copilot communication, you can take action by resolving the alert, sending a warning notice to the user, escalating the alert to a reviewer, or escalating the alert for investigation, which creates a new eDiscovery case.

Data Lifecycle Management
For regulatory compliance or investigation purposes, navigate to Data Lifecycle Management to create a new retention policy for Security Copilot activities:
a. Provide a friendly name for the retention policy and select Next.
b. Skip the Policy Scope section for this validation.
c. Select the “Static” type of retention policy and select Next.
d. Choose “Microsoft Copilot Experiences” to apply the retention policy to Security Copilot interactions.

Billing Model
Microsoft Purview audit logging of Security Copilot activity remains included at no additional cost as part of Microsoft 365 E5 licensing. However, Microsoft Purview now offers a combination of entitlement-based (per-user-per-month) and pay-as-you-go (PAYG) pricing models. The PAYG model applies to a broader set of Purview capabilities—including Insider Risk Management, Communication Compliance, eDiscovery, and other data security and governance solutions—based on usage volume or complexity. This flexible pricing structure ensures that organizations only pay for what they use as data flows through AI models, networks, and applications.
For further details, please refer to this Microsoft Security Community Blog: New Purview pricing options for protecting AI apps and agents | Microsoft Community Hub

Looking Ahead
By following these steps, organizations can leverage the full potential of Microsoft Purview to enhance the security and compliance of their Security Copilot interactions. This integration not only provides peace of mind but also empowers organizations to manage their data more effectively. Please reach out to us if you have any questions or additional requirements.

Additional Resources
Use Microsoft Purview to manage data security & compliance for Microsoft Security Copilot | Microsoft Learn
How to deploy Microsoft Purview DSPM for AI to secure your AI apps
Learn how Microsoft Purview Data Security Posture Management (DSPM) for AI provides data security and compliance protections for Copilots and other generative AI apps | Microsoft Learn
Considerations for deploying Microsoft Purview Data Security Posture Management (DSPM) for AI | Microsoft Learn
Learn about Microsoft Purview billing models | Microsoft Learn

NEW Conditional Access Optimization Agent in Microsoft Entra + Security Copilot in Entra updates
Instead of switching between logs, PowerShell, and spreadsheets, Security Copilot centralizes insights for faster, more focused action. Resolve compromised accounts, uncover ownerless or high-risk apps, and tighten policy coverage with clear insights, actionable recommendations, and auto-generated policies. Strengthen security posture and reclaim time with a smarter, more efficient approach powered by Security Copilot. Diana Vicezar, Microsoft Entra Product Manager, shares how to streamline investigations and policy management using AI-driven insights and automation.

Skip the scripting. Ask questions in plain language and get back policy and risk insights in seconds. Microsoft Entra now has built-in AI with Security Copilot.
Stay ahead of threats. Use AI to track auth changes, elevated roles, and risky signals with Security Copilot in Entra. Start here.
Improve your security posture. Receive personalized recommendations for policies and configurations to make, using Microsoft Security Copilot in Microsoft Entra. Take a look.

QUICK LINKS:
00:00 — Microsoft Entra with Security Copilot
01:26 — Conditional Access Optimization Agent
03:35 — Investigate risky users
05:49 — Investigate risky apps
07:34 — Personalized security posture recommendations
08:20 — Wrap up

Link References
Check out https://aka.ms/SecurityCopilotAgentsinMicrosoftEntra

Unfamiliar with Microsoft Mechanics? As Microsoft’s official video series for IT, you can watch and share valuable content and demos of current and upcoming tech from the people who build it at Microsoft.
Subscribe to our YouTube: https://www.youtube.com/c/MicrosoftMechanicsSeries
Talk with other IT pros, join us on the Microsoft Tech Community: https://techcommunity.microsoft.com/t5/microsoft-mechanics-blog/bg-p/MicrosoftMechanicsBlog
Watch or listen from anywhere, subscribe to our podcast: https://microsoftmechanics.libsyn.com/podcast
Keep getting this insider knowledge, join us on social:
Follow us on Twitter: https://twitter.com/MSFTMechanics
Share knowledge on LinkedIn: https://www.linkedin.com/company/microsoft-mechanics/
Enjoy us on Instagram: https://www.instagram.com/msftmechanics/
Loosen up with us on TikTok: https://www.tiktok.com/@msftmechanics

Video Transcript:
-Microsoft Entra has built-in AI with Security Copilot. In fact, if you are new to the experience or haven’t looked at it in a while, you’ll find that it is continuously being fine-tuned with skills to accelerate your daily troubleshooting and risk assessments, which means whether you’re a seasoned admin or just getting started, you don’t need deep expertise in filtering, PowerShell, or Graph API. You can just use natural language and have Security Copilot surface the information for you. Additionally, new specialized agents like the one for Conditional Access Optimization work with you to continuously look for misaligned policies along with gaps in coverage that could be putting your organization at risk.

-Today, I’ll walk through examples of just how powerful Security Copilot in Microsoft Entra can be, starting with a pretty common challenge, policy coverage and conflicts, where right now, you might try to work through these issues by using filters to identify new users in the Entra audit logs or by using PowerShell with the Microsoft Graph module, then perhaps, you might export log outputs into a spreadsheet for manual analysis, and repeat the same process to identify new Enterprise apps, all with the goal of identifying coverage or gaps in policies.
It’s a manual effort that can take hours from your day. And that’s where the Conditional Access Optimization Agent comes in. It can be accessed and enabled from the agents page in the Microsoft Entra admin center. From there, the Conditional Access agent works alongside you, proactively surfacing issues and suggestions like gaps in protection, users or apps that should be added to an existing policy, and policy overlaps. And you can track the status of agent suggestions as you work through them.

-Clicking into a suggestion gives you the details. For this one about adding users, the agent has listed userIDs for the new users. And I can review the user impact of the suggested policy before I apply the changes. You can also dive into the agent’s activity to explore its path of analysis and the reasoning behind each suggestion to validate its logic, making sure it’s behaving in the way you want it to. Then moving back to the policy details, before you apply any changes, you can review the summary of changes and even the detailed JSON view if you want a deeper look, down to the individual configuration options for the policy. And at the tenant level, if you need to fine-tune the agent’s behavior, you can do so in the agent Settings tab using Custom Instructions.

-For example, you can instruct the agent to make exceptions like excluding break-glass admin accounts, which the agent will take into account on its next run. And beyond just giving you suggestions and recommendations, the agent can go a step further and create a fully configured policy if no existing equivalent policy is found. By default, these are report-only policies. And from here, you can even turn it on to enable the policy directly. And from Edit, you can review the policy details. The Conditional Access Optimization Agent is great for consistently tracking your policy coverage as users, apps, and access policies evolve over time.
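As a reference for the JSON view mentioned in the demo, a Conditional Access policy in report-only mode looks roughly like this. This is a hypothetical sketch following the Microsoft Graph conditionalAccessPolicy schema; the display name and the break-glass exclusion ID are illustrative placeholders, not values from the video:

```json
{
  "displayName": "Require MFA for all users - agent suggested (illustrative)",
  "state": "enabledForReportingButNotEnforced",
  "conditions": {
    "users": {
      "includeUsers": ["All"],
      "excludeUsers": ["<break-glass-account-object-id>"]
    },
    "applications": {
      "includeApplications": ["All"]
    }
  },
  "grantControls": {
    "operator": "OR",
    "builtInControls": ["mfa"]
  }
}
```

The `"enabledForReportingButNotEnforced"` state corresponds to report-only mode, which is why the agent's auto-generated policies can be reviewed safely before being switched on.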
Additionally, the specialized Microsoft Entra skills in Security Copilot will also help save you time and even help you add to your existing expertise.

-For example, let me show you how Security Copilot helps automate the manual steps when investigating and fixing a known compromised user account. Typically, you would need to use sign-in logs to isolate what they are trying to access, or audit the actions that they have taken with visibility into their sign-in events as well as any group memberships giving them access to resources, or examine any current or recently elevated role assignments, which could increase the severity of the compromise. Already I’m jumping between tabs, and it’s time-consuming to collect all of that information to see why they’re showing up as risky. Security Copilot, on the other hand, can pull everything together in a fraction of the time. In this case, I know that a user, Michael, has had an account compromise.

-So, I’ll ask Copilot if his account was recently flagged as risky, which, even if he is low risk now, could be a sign of a persistence attack, where his account is compromised and the attacker is waiting for the right timing. The response from Copilot shows me that he is high risk with an at-risk state that started on May 19th. So, I’ll ask for the risk details for his account. Copilot spots an attempted Primary Refresh Token or PRT access. Threat Intelligence has flagged his account. There are sign-in attempts from a known malicious IP address and an anonymized IP address. So, the account was definitely compromised. I’ll ask Copilot if Michael’s authentication methods have changed. And it looks like he added a new phone on May 15th, then updated details again on the 19th. Finally, I’ll ask about Michael’s account type and whether he has privileged roles assigned. And it looks like he has Cloud Device and Device Join admin permissions.
This would let him easily register and modify other managed devices, for example, to have them send file contents or sign-in tokens to other cloud storage locations. So very quickly, I was able to get the visibility I needed to decide what to do next.

-Now let’s move from risky user accounts to risky apps, which can present a vulnerability. Normally, you’d spend a long time digging through app lists just to isolate which apps are even worth worrying about, trying to understand the overall risk: what apps are created by my organization, or maybe by a 3rd party, that might require more scrutiny? Who owns the app, or does it no longer have an owner? What protocols are the apps using? And are they risky? And which applications are stale or unused that you may want to purge from the list? Investigations like this can take hours. Let’s use Copilot for this instead. I’ll start by asking it to list some external apps that are not owned by my tenant with verified publisher details for each app. And it pulls together a list of seven apps with additional details like the app name, App ID, and Verified Publisher, so I’m not wasting time on low-risk noise. That said, sometimes it’s the apps owned by at-risk users that can be the real problem.

-So, I want to ask Copilot, do the risky users in my tenant own any applications? And it finds an app that is owned by a high-risk user. Another potential problem that presents a hidden risk is apps and service principals in your environment that are currently ownerless. I’ll ask Copilot, what proportion of apps and service principals are ownerless? And Copilot tells me that more than half, or 55%, of my apps are ownerless, and 92% of our service principals are also ownerless. And beyond finding and pointing out problems with my policies and settings, Copilot can even give me detailed recommendations to improve identity posture.
-In this case, I’ll ask, give me recommendations to improve the security posture of at-risk apps in my tenant. Show this as a bulleted list with impacted resources as applications. And Copilot gives me seven actionable recommendations of policies and configurations to make, including the removal of the unused service principals that I presented earlier, as well as outdated authentication protocols and more. So, with just a few simple prompts, I have achieved something that otherwise might have taken hours in just a few minutes.

-As you’ve seen, Security Copilot in Microsoft Entra simplifies troubleshooting and risk assessments with specialized skills and agents. And while I showed you the Conditional Access Optimization agent today, there are more on the way. To learn more, check out aka.ms/SecurityCopilotAgentsinMicrosoftEntra. Keep checking back to Microsoft Mechanics for the latest updates, and thanks for watching.