data lake
Using Microsoft Sentinel MCP Server with GitHub Copilot for AI-Powered Threat Hunting
Introduction

This post walks through how to get started with the Microsoft Sentinel MCP Server and showcases a hands-on demo integrating it with Visual Studio Code and GitHub Copilot. Using the MCP server, you can run natural language queries against Microsoft Sentinel's security data lake, enabling faster investigations and simpler threat hunting with tools you already know. This blog includes a real-world prompt you can use in your own environment and highlights the power of AI-assisted security workflows.

What is the Microsoft Sentinel MCP Server?

The Model Context Protocol (MCP) allows AI models to access structured security data in a standard, context-aware way. The Sentinel MCP server connects to your Microsoft Sentinel data lake and enables tools like GitHub Copilot or Security Copilot to:
- Search security data using natural language
- Summarize findings and explain risks
- Build intelligent agents for security operations

Prerequisites

Make sure you have the following in place:
- Onboarded to the Microsoft Sentinel data lake
- Assigned the Security Reader role
- Installed: Visual Studio Code and the GitHub Copilot extension
- (Optional) The Security Copilot plugin, if you are building agents

Setting Up the MCP Server in VS Code

Step 1: Add the MCP Server
1. In VS Code, press Ctrl + Shift + P.
2. Search for: MCP: Add Server.
3. Choose HTTP or Server-Sent Events.
4. Enter one of the following MCP endpoints:
   - Data Exploration: https://sentinel.microsoft.com/mcp/data-exploration
   - Agent Creation: https://sentinel.microsoft.com/mcp/security-copilot-agent-creation
5. Give the server a friendly name (e.g., Sentinel MCP Server).
6. Choose whether to apply it to all workspaces or just the current one.
7. When prompted, allow authentication using an account with Security Reader access.

Verify the Connection
1. Open Chat: View > Chat, or Ctrl + Alt + I.
2. Switch to Agent Mode.
3. Click the Configure Tools icon to ensure the MCP tools are active.

Using GitHub Copilot + Sentinel MCP

Once connected, you can use natural language prompts to pull insights from your Sentinel data lake without writing any KQL.

Demo prompt:

🔍 "Find the top three users that are at risk and explain why they are at risk."

This prompt is designed to:
- Identify the highest-risk users in your environment
- Explain the reasoning behind each user's risk status
- Help prioritize investigation and response efforts

You can enter this prompt in either:
- The VS Code Chat window (Agent Mode)
- The Copilot inline prompt area

Expected Behavior

The MCP server will:
- Query multiple Microsoft Sentinel sources (Identity Protection, Defender for Identity, sign-in logs)
- Correlate risk events (e.g., risky sign-ins, alerts, anomalies)
- Return a structured response with the top users and a risk explanation for each

Sample Output from My Tenant

Results found:
- User 1: risk score of 233, with 53 failed attempts from suspicious IPs
- User 2: 100% failure rate, indicating service account compromise
- User 3: admin account under a targeted brute-force attack

This demo shows how the integration of the Microsoft Sentinel MCP Server with GitHub Copilot and VS Code transforms complex security investigations into simple, conversational workflows. By leveraging natural language and AI-driven context, we can surface high-risk users, understand the underlying threats, and take action, all within a familiar development environment and without writing a single line of KQL.
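The walkthrough above stays entirely inside VS Code, but the data-exploration endpoint is a plain MCP server, so it can also be exercised from code. Below is a minimal sketch using the open-source `mcp` Python SDK to connect and list the tools the endpoint exposes. This is not from the original post: the `headers`-based bearer-token auth and the token acquisition are assumptions on my part (the endpoint requires a Microsoft Entra ID sign-in with at least Security Reader), so treat it as a starting point rather than a working recipe.

```python
# Minimal sketch (assumption, not documented in this post): list the tools
# exposed by the Sentinel MCP data-exploration endpoint using the `mcp` SDK.
# The Authorization header below assumes you already obtained an Entra ID
# access token out of band (e.g. via the azure-identity library).
import asyncio

from mcp import ClientSession
from mcp.client.streamable_http import streamablehttp_client

SENTINEL_MCP_URL = "https://sentinel.microsoft.com/mcp/data-exploration"


async def list_sentinel_tools(bearer_token: str) -> None:
    headers = {"Authorization": f"Bearer {bearer_token}"}
    # streamablehttp_client yields read/write streams plus a session-id callback.
    async with streamablehttp_client(SENTINEL_MCP_URL, headers=headers) as (read, write, _):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()
            for tool in tools.tools:
                print(tool.name, "-", tool.description)


if __name__ == "__main__":
    asyncio.run(list_sentinel_tools(bearer_token="<entra-id-access-token>"))
```

The same pattern applies to the agent-creation endpoint; only the URL changes.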
More details here:
- What is Microsoft Sentinel’s support for MCP? (preview) - Microsoft Security | Microsoft Learn
- Get started with Microsoft Sentinel MCP server - Microsoft Security | Microsoft Learn
- Data exploration tool collection in Microsoft Sentinel MCP server - Microsoft Security | Microsoft Learn

Designing a system to enable ad hoc queries
Hi, we are designing a data processing system in which the data goes through the three stages shown below. What Azure platforms or technologies do you recommend for a dynamic scenario like this one, where the input file format can change all the time, the transformations applied are not standard, and the reports generated vary every time?

Extract: Data size can be around 1 GB. The data can be in various formats and come from various sources, such as FTP, APIs, etc.

Transform: Transformations are applied to the data.

Results: After the transformations, results are exported to a final report table from which reports are generated.
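To make the "input format can change all the time" requirement concrete, here is an illustrative sketch (not part of the original question) of a metadata-driven extract step in Python with pandas, where the reader is picked per file at run time and the transform stage is a configurable list of steps. The file names and helper names are hypothetical.

```python
# Illustrative sketch only: a format-agnostic extract step that chooses a pandas
# reader from the file extension, so new formats can be added without touching
# the downstream transform/report stages. All names here are hypothetical.
from pathlib import Path

import pandas as pd

READERS = {
    ".csv": pd.read_csv,
    ".json": pd.read_json,
    ".parquet": pd.read_parquet,
    ".xlsx": pd.read_excel,
}


def extract(path: str) -> pd.DataFrame:
    """Load a source file of any supported format into a DataFrame."""
    suffix = Path(path).suffix.lower()
    if suffix not in READERS:
        raise ValueError(f"Unsupported input format: {suffix}")
    return READERS[suffix](path)


def transform(frame: pd.DataFrame, steps) -> pd.DataFrame:
    """Apply an arbitrary, configurable list of transformation callables."""
    for step in steps:
        frame = step(frame)
    return frame


if __name__ == "__main__":
    df = extract("input/sales_2024.csv")                  # hypothetical file
    df = transform(df, [lambda f: f.drop_duplicates()])   # hypothetical step
    df.to_csv("output/final_report.csv", index=False)     # the "Results" stage
```

Keeping the reader lookup and the transformation list in configuration rather than code is what lets a pipeline like this absorb new formats and new reports without changes to the later stages.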
Big Data on Azure with No Limits Data, Analytics and Managed Clusters

First published on MSDN on Feb 24, 2017.

HDInsight:
- Reliable, with an industry-leading SLA
- Enterprise-grade security and monitoring
- Productive platform for developers and scientists
- Cost-effective cloud scale
- Integration with leading ISV applications
- Easy for administrators to manage

Resources & hands-on labs for teaching: https://github.
Gateway Timeout on Azure Data Factory Copy Task

I'm trying to set up a copy job that connects to a text file in Data Lake Storage (v1) and copies the data to somewhere...

- I've set up the Active Directory application.
- I've created a Data Factory (tried v1 and v2).
- I've created the copy task and connected to the Data Lake. I've successfully picked a file on the lake. The file is a CSV file.

On the file format settings screen I get a Gateway Timeout (Activity ID: 2f860074-7a71-470d-87b9-b5523a13d8a6) when setting up the file. I've tried everything from a simple file with 2 lines and 3 columns to a zipped file with lots of columns. I get a similar error on the v1 factory. Any ideas on what I've done wrong?
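One way to narrow a problem like this down (a suggestion, not something from the thread) is to check whether the Active Directory application can read the file from Data Lake Storage Gen1 at all, outside of Data Factory. A rough sketch with the azure-datalake-store Python package follows; the tenant, client, store, and path values are placeholders.

```python
# Rough sketch (placeholder values): verify that the AAD service principal used
# by Data Factory can authenticate and read the CSV directly from Data Lake
# Storage Gen1, to separate permission issues from the copy-activity timeout.
from azure.datalake.store import core, lib

TENANT_ID = "<tenant-id>"
CLIENT_ID = "<app-registration-client-id>"
CLIENT_SECRET = "<client-secret>"
STORE_NAME = "<adls-gen1-account-name>"
FILE_PATH = "/folder/sample.csv"  # hypothetical path

token = lib.auth(tenant_id=TENANT_ID, client_id=CLIENT_ID, client_secret=CLIENT_SECRET)
adls = core.AzureDLFileSystem(token, store_name=STORE_NAME)

# List the folder, then read the first few hundred bytes of the file.
print(adls.ls("/folder"))
with adls.open(FILE_PATH, "rb") as handle:
    print(handle.read(200))
```

If this read fails, the issue is likely the service principal's permissions on the folder/file rather than Data Factory itself; if it succeeds, the timeout points back at the copy activity or the gateway.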
Azure Data Lake Tools for VSCode supports Azure blob storage integration

We are pleased to announce the integration of the VSCode explorer with Azure blob storage. If you are a data scientist and want to explore the data in your Azure blob storage, please try the Data Lake Explorer blob storage integration. If you are a developer and want to access and manage your Azure blob storage files, please try the Data Lake Explorer blob storage integration. The Data Lake Explorer lets you easily navigate your blob storage and access and manage your blob containers, folders, and files. Read about it in the Azure blog.
Get started with U-SQL: It’s easy!

Azure Data Lake Analytics combines declarative and imperative concepts in the form of a new language called U-SQL. The idea of learning a new language is daunting. Don’t worry! U-SQL is easy to learn. You can learn the vast majority of the language in a single day. If you are familiar with SQL or languages like C# or Java, you will find that learning U-SQL is natural and that you will be productive incredibly fast. A common question we get is “How can I get started with U-SQL?” This blog will show you all the core steps you need to get ramped up on U-SQL. Read about it in the Azure blog.
Control Azure Data Lake costs using Log Analytics to create service alerts

Azure Data Lake customers use the Data Lake Store and Data Lake Analytics to store and run complex analytics on massive amounts of data. However, it is challenging to manage costs, keep up to date with activity in the accounts, and proactively know when usage thresholds are nearing certain limits. Using Log Analytics and Azure Data Lake, we can address these challenges and know when costs are increasing or when certain activities take place. In this post, you will learn how to use Log Analytics with your Data Lake accounts to create alerts that can notify you of Data Lake activity events and when certain usage thresholds are reached. It is easy to get started! Read more about it in the Azure blog.