The City Leader's Dilemma: How AI Is Turning Urban Strain into Strategic Advantage
Ready to transform how your city plans and operates? Download the Trend Report 2025: Planning and operating thriving cities – innovation for smarter urban living to access the complete playbook on AI-powered urban innovation, complete with case studies from Bangkok, Singapore, Barcelona, and Manchester.

Urban challenges aren't slowing down. Populations are growing, climate pressures are intensifying, and residents expect seamless services, while budgets remain flat and workforces stretch thin. Traditional approaches can't keep pace. The good news? Cities worldwide are showing that AI and digital innovation can drive meaningful improvements. Recent studies indicate that more than half of surveyed cities are already using AI to upgrade operations, and most plan to expand adoption in the next three years. For many leaders, the question is less about whether to act and more about how to act responsibly and effectively. After studying the latest research and real-world deployments, three strategic shifts stand out, each offering a different lens on how forward-thinking city leaders are turning pressure into progress.

Shift One: From fragmented services to unified citizen experiences

Residents expect seamless problem-solving, not organizational complexity. Yet many cities operate in silos: transit systems, permitting offices, 311 reporting, and community engagement often run on separate platforms. The result? Multiple apps for residents, duplicated effort for staff, and missed insights locked in departmental databases. Leading cities are breaking this pattern through unified digital platforms powered by AI.

Bangkok's Traffy Fondue: Citizens report issues like broken streetlights or flooding via a mobile interface. AI categorizes each report and routes it to the right department. By mid-2025, the platform had handled nearly one million citizen reports, improving engagement and reducing administrative overhead. The outcome?
Reduced administrative overhead, and something harder to measure but equally important: residents who believe their government actually listens.

Buenos Aires took a similar path with "Boti," a WhatsApp chatbot that evolved from a COVID-era tool into a citywide digital assistant. Citizens report issues, ask questions, and access services through the messaging app they already use daily. Technology that meets residents where they are improves efficiency and strengthens trust, when guided by principles of transparency and fairness.

Shift Two: From reactive planning to predictive foresight

Traditional urban planning relies on static models: masterplans, zoning maps, historical growth trends. These tools served their purpose, but they cannot capture the complexity of future risks: extreme weather, evolving mobility patterns, or the cascading effects of a single development decision. Digital twins complement human expertise by integrating geospatial data, climate models, and policy scenarios, helping cities make smarter decisions with limited budgets.

Singapore's Digital Urban Climate Twin integrates geospatial data with climate models to simulate how different policies would affect temperature and thermal comfort across neighborhoods. These tools support informed decision-making while maintaining human oversight and accountability. The result? Strategic adaptation rather than reactive firefighting.

Sydney built an urban digital twin that correlates environmental conditions with traffic accidents, using machine learning to predict crash risk on specific road segments. City planners can now test interventions virtually (what happens if we lower speed limits here? add a bike lane there?) before committing resources.

Even smaller cities are finding value. Imola, Italy uses a microclimate digital twin to model heat distribution street by street, guiding decisions about where to plant trees or specify cool pavement materials.
The paradigm shift is profound: instead of planning based on what happened, cities can now plan based on what's likely to happen. This is how you make smart bets with limited budgets.

Shift Three: From tech adoption to governance architecture

Here's where many cities stumble. They invest in flashy pilots without building the institutional structures to sustain them. The cities getting this right treat governance as a strategic asset, not a compliance burden.

Singapore's Model AI Governance Framework provides practical guidelines for transparency, fairness, and human-centric design. Its AI Verify toolkit lets organizations test their systems for resilience, accountability, and bias before deployment. Barcelona takes a different but equally rigorous approach, treating municipal data as a public asset under its Data Commons program. The city's procurement strategy favors open-source solutions, preventing vendor lock-in while supporting local innovation ecosystems. Both models share a common insight: rapid innovation doesn't automatically produce equitable outcomes. Governance creates the guardrails that allow experimentation without derailment.

For city leaders, this means building cross-sector governance councils, adopting clear data strategies, creating ethical AI frameworks, and investing in workforce capability. These aren't obstacles to innovation; they're the foundation that makes sustained innovation possible.

The Path Forward

Cities that thrive combine strategic vision with disciplined, responsible technology use. They embed digital capabilities into decision-making, supported by robust policies and cross-department collaboration. Learn how Microsoft helps governments build tech-empowered cities and resilient infrastructure at Microsoft for government. The Smart Cities World 2025 Trend Report provides the detailed case studies, governance frameworks, and implementation roadmaps to make this real.
Download your copy now and start building the city your residents deserve.

Consolidate by category
I am trying the following practice challenge in an online course. I cannot for the life of me figure out the solution for step 7. Any help would be GREATLY appreciated.

"The next few instructions are a bit tricky. You should not need to add any columns to achieve these tasks. These tasks are similar to what was in the Consolidate by Category (Reference) video. In that video, we consolidated sheets that had different categories and in a different order by selecting Use labels in: Left column. Consolidate by Category can also condense multiple rows with the same category down to a single row per category. When you select the references, note that you will need to select at least 2 columns. The first column will be used for the labels and the other column(s) will be consolidated using the Function that you select in the Consolidate dialog.

STEP 7: Use the Consolidate tool to generate a summary of the number of tickets raised per priority for May Week 4, June Week 1 and June Week 2 (combined). Sort the consolidated data by Priority.

STEP 8: Use the Consolidate tool to generate a summary of the average number of days a ticket was open per priority for May Week 4, June Week 1 and June Week 2. Sort the consolidated data by Priority and change number of decimal places to 2 (change the format, do not use a rounding function). IMPORTANT: Before the next step, make sure you delete the other references in the Consolidate tool!

STEP 9: Use the Consolidate tool to generate a summary of the number of tickets given each satisfaction rating for May Week 4, June Week 1 and June Week 2. Use a COUNT function. Sort the consolidated data by Satisfaction Rating."
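If it helps to sanity-check what the Consolidate tool should produce, here is a rough pandas equivalent of Steps 7 and 8. Note this is not the Excel tool itself, and the sheet contents below are invented sample rows, not the course workbook:

```python
# NOT the Excel Consolidate tool itself -- just a pandas sanity check of what
# Steps 7-8 should produce. Sheet contents below are made-up sample rows.
import pandas as pd

may_w4 = pd.DataFrame({"Priority": ["High", "Low", "High"], "Days Open": [4, 2, 6]})
jun_w1 = pd.DataFrame({"Priority": ["Low", "High"], "Days Open": [1, 3]})
jun_w2 = pd.DataFrame({"Priority": ["Low", "Low"], "Days Open": [5, 2]})

# "Consolidating" the three weekly sheets means stacking them, then
# collapsing rows that share a Priority label.
combined = pd.concat([may_w4, jun_w1, jun_w2])

# Step 7 equivalent: COUNT of tickets per Priority, sorted by Priority.
tickets = combined.groupby("Priority")["Days Open"].count().sort_index()

# Step 8 equivalent: AVERAGE days open per Priority. Mirroring the "change
# the format, do not round" instruction, format to 2 decimals for display
# only, leaving the underlying values untouched.
avg_days = combined.groupby("Priority")["Days Open"].mean().sort_index()
avg_display = avg_days.map("{:.2f}".format)

print(tickets.to_dict())      # ticket counts per priority
print(avg_display.to_dict())  # averages formatted to 2 decimal places
```

The key idea carries back to Excel: the first selected column supplies the category labels (Priority), and the Consolidate function (COUNT or AVERAGE) collapses the remaining column(s) to one row per label.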
Links to Excel files:
https://d3c33hcgiwev3.cloudfront.net/QsR8jRaoEeiV3A7PVRrSig_43591af016a811e8a3c03fd758b931b1_W1_AdvPracticeChallenge.xlsx?Expires=1764100925&Signature=ayov~GEp79FgN-SAkfGConpKqHsqzwlbG8M0jrn02RwCODRz9HsQXaHNI8IfYvEGy6I68FiyPqLJglNKFW3MBfk4AjEdeSv4dB2NiPrQTqwPFXQOvLGs3THeY7vSJiKj31--Y93AJkiyIfjQBZZyO640jN0cXLZpKl5DXLoBaDM_&Key-Pair-Id=APKAJLTNE6QMUY6HBC5A
https://d3c33hcgiwev3.cloudfront.net/_f48c8a2f4c6065a2a708dc316cbcf999_Help-Desk-April-Summary.xlsx?Expires=1764100925&Signature=EnuRHABT4F-ZT925yZKt2cZzr-HYIKEod2gx1-kObdLIzueISMkjx79svRv8ZVWwVO~Y5uteVJNvS5KvdjFh2daaNvIX9OXsw5wXcKaN6zkNqVrjLe21drPYC8JIqW36Z2G3Rxpq-eHwRyKCHlBfxu85zSuTzybcjcP9hFnQ-XA_&Key-Pair-Id=APKAJLTNE6QMUY6HBC5A
https://d3c33hcgiwev3.cloudfront.net/_3ffee287bc89bcdd309bfe7d8e5b00da_Help-Desk-May.xlsx?Expires=1764100925&Signature=axvArXcDo65he42Pk13z99VoJ4HS3kpzeTEMmHOEAQ9B0ou444oPGWb4Pp65MEKasrQoMQZ9A40lr2sqXqEYexJk~vENmc9Y2dGDwiEabfxQ4liQbf58sME1cUvrom26neAH40zScJbWh7rFwNZIfI7060YRYVDw4D736pH64zY_&Key-Pair-Id=APKAJLTNE6QMUY6HBC5A
https://d3c33hcgiwev3.cloudfront.net/_f0e969b80e0ab5054d797a0644bd82eb_Help-Desk-June.xlsx?Expires=1764100925&Signature=LjcstfMamG2vqJqd2vcD9hiQciNCa48ngWqswE~Al7KgjCvu6M~RHzQ65G5W~Q0dAjfCmikPvIx7j4dZuCXnKWhfE3ULqy3JuJy7EExzsVCUVubmsftcP5tqTRo6No9rtPtBid1e4qkMpyE9Zwj0ny3rJn2LDSulWNYuJNXTIBw_&Key-Pair-Id=APKAJLTNE6QMUY6HBC5A

I need the data from all the sheets in the workbook to link to one data sheet
A colleague of mine made a data spreadsheet (let's call it SHEET 1). He no longer works for the company, and over time the worksheet has been amended, so it doesn't work like it should. There are several sheets in the workbook, and all data that is copied into these needs to go into SHEET 1 (automatically, not manually). My question is: when I download my data and paste it into SHEET 2, how do I get it to automatically go onto SHEET 1, taking only certain parts of the data, in this case columns B and F? The attached is just a small example of the data I need from SHEET 2. On SHEET 1 there is a list of numbers on the left and more. How can I get this data to filter itself onto SHEET 1 in the right column, then adding the numbers on the left together? So, on SHEET 1, 160 -T will show 7 because there are 7 1's below for that Org. Sorry if I have made this sound long-winded; I am awful at explaining. I have basic knowledge of Excel but I cannot get my head around formulas. It should look something like this. Any advice would be great. Thanks

Multi-vehicle maintenance log record
I need to make a vehicle maintenance log that will cover 7 different vehicles, track all the oil changes, inspections, and registrations, notify us when something is expiring, and allow us to attach receipts and documents. What is the best way to do this?

Ignite your future with new security skills during Microsoft Ignite 2025
AI and cloud technologies are reshaping every industry. Organizations need professionals who can secure AI solutions, modernize infrastructure, and drive innovation responsibly. Ignite brings together experts, learning, and credentials to help you get skilled for the future.

Take on the Secure and Govern AI with Confidence Challenge

Start your journey with the Azure Skilling Microsoft Challenge. These curated challenges help you practice real-world scenarios and earn recognition for your skills. One of the challenges featured is the Secure and Govern AI with Confidence challenge. This challenge helps you: Implement AI governance frameworks. Configure responsible AI guardrails in Azure AI Foundry. Apply security best practices for AI workloads.

Special Offer: Be among the first 5,000 participants to complete this challenge and receive a discounted certification exam voucher—a perfect way to validate your skills and accelerate your career. Completing this challenge earns you a badge and prepares you for advanced credentials—ideal for anyone looking to lead in AI security. Join the challenge today!

Validate Your Expertise with a New Microsoft Applied Skill

Applied Skills assessments are scenario-based, so you demonstrate practical expertise—not just theory. Earn the Secure AI Solutions in the Cloud credential—a job-ready validation of your ability to: Configure security for AI services using Microsoft Defender for Cloud. Implement governance and guardrails in Azure AI Foundry. Protect sensitive data and ensure compliance across AI workloads.

This applied skill is designed for professionals who want to lead in AI security, accelerate career growth, and stand out in a competitive market. To learn how to prepare and take the applied skill, visit here.
Your Next Steps: Security Plans

Ignite isn't just about live sessions—it's about giving you on-demand digital content and curated learning paths so you can keep building skills long after the event ends. With 15 curated security plans covering topics such as controlling access with Microsoft Entra and securing your organization's data, find what is relevant to you on the Microsoft Ignite: Keep the momentum going page.

Step-by-Step Guide: Integrating Microsoft Purview with Azure Databricks and Microsoft Fabric
Co-Authored By: aryananmol, laurenkirkwood and mmanley

This article provides practical guidance on setup, cost considerations, and integration steps for Azure Databricks and Microsoft Fabric to help organizations plan for building a strong data governance framework. It outlines how Microsoft Purview can unify governance efforts across cloud platforms, enabling consistent policy enforcement, metadata management, and lineage tracking. The content is tailored for architects and data leaders seeking to execute governance in scalable, hybrid environments. Note: this article focuses mainly on the data governance features of Microsoft Purview.

Why Microsoft Purview

Microsoft Purview enables organizations to discover, catalog, and manage data across environments with clarity and control. Automated scanning and classification build a unified view of your data estate enriched with metadata, lineage, and sensitivity labels, and the Unified Catalog gives business-friendly search and governance constructs like domains, data products, glossary terms, and data quality. Note: Microsoft Purview Unified Catalog is being rolled out globally, with availability across multiple Microsoft Entra tenant regions; this page lists supported regions, availability dates, and deployment plans for the Unified Catalog service: Unified Catalog Supported Regions.

Understanding Data Governance Feature Costs in Purview

Under the classic model (Data Map Classic), users pay for an "always-on" Data Map capacity and scanning compute. In the new model, those infrastructure costs are subsumed into the consumption meters – meaning there are no direct charges for metadata storage or scanning jobs when using the Unified Catalog (Enterprise tier). Essentially, Microsoft stopped billing separately for the underlying data map and scan vCore-hours once you opt into the new model or start fresh with it. You only incur charges when you govern assets or run data processing tasks.
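The two-meter model above can be sketched as a back-of-the-envelope estimator. The per-unit rates below are placeholders, not real Purview prices; consult the official pricing page for current figures:

```python
# Back-of-the-envelope model of the two consumption meters described above.
# The rates are HYPOTHETICAL placeholders, not actual Purview pricing.
HYPOTHETICAL_RATE_PER_GOVERNED_ASSET = 0.01  # per governed asset per month
HYPOTHETICAL_RATE_PER_DGPU = 1.00            # per Data Governance Processing Unit

def estimate_monthly_cost(governed_assets: int, dgpus_consumed: float) -> float:
    """Scanning and metadata storage add nothing under the new model;
    only governed assets and data processing (DGPUs) generate charges."""
    return (governed_assets * HYPOTHETICAL_RATE_PER_GOVERNED_ASSET
            + dgpus_consumed * HYPOTHETICAL_RATE_PER_DGPU)

# e.g. 10,000 governed assets plus 50 DGPUs of data-quality processing:
print(estimate_monthly_cost(10_000, 50))
```

The point of the sketch: scan volume drops out of the equation entirely, so the variables you budget for are how much of the catalog you actively govern and how much data-quality processing you run.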
This makes costs more predictable and tied to governance value: you can scan as much as needed to populate the catalog without worrying about scan fees, and then pay only for the assets you actively manage ("govern") and any data quality processes you execute. In summary, Purview Enterprise's pricing is usage-based and divided into two primary areas: (1) Governed Assets and (2) Data Processing (DGPUs).

Plan for Governance

Microsoft Purview's data governance framework is built on two core components: Data Map and Unified Catalog. The Data Map acts as the technical foundation, storing metadata about assets discovered through scans across your data estate. It inventories sources and organizes them into collections and domains for technical administration. The Unified Catalog sits on top as the business-facing layer, leveraging the Data Map's metadata to create a curated marketplace of data products, glossary terms, and governance domains for data consumers and stewards. Before onboarding sources, align the Unified Catalog (business-facing) and Data Map (technical inventory) and define roles, domains, and collections so ownership and access boundaries are clear. This documentation covers roles and permissions in Purview: Permissions in the Microsoft Purview portal | Microsoft Learn. The image above helps explain the relationship between the primary data governance solutions, Unified Catalog and Data Map, and the permissions granted by the roles for each solution.

Considerations and Steps for Setting up Purview

Step 1: Create a Purview account. In the Azure Portal, use the search bar at the top to navigate to Microsoft Purview Accounts. Once there, click "Create". This will take you to the following screen:

Step 2: Click Next: Configuration and follow the wizard, completing the necessary fields, including information on Networking, Configurations, and Tags. Then click Review + Create to create your Purview account.
Consideration – private networking: Use Private Endpoints to secure Unified Catalog/Data Map access and scan traffic; follow the new platform private endpoints guidance in the Microsoft Purview portal or migrate classic endpoints.

Once your Purview account is created, you'll want to set up and manage your organization's governance strategy to ensure that your data is classified and managed according to the specific lifecycle guidelines you set. Note: Follow the steps in this guide to set up Microsoft Purview Data Lifecycle Management: Data retention policy, labeling, and records management.

Data Map Best Practices

Design your collections hierarchy to align with organizational strategy—such as by geography, business function, or data domain. Register each data source only once per Purview account to avoid conflicting access controls. If multiple teams consume the same source, register it at a parent collection and create scans under subcollections for visibility. The image above illustrates a recommended approach for structuring your Purview Data Map.

Why Collection Structure Matters

A well-structured Data Map strategy, including a clearly defined hierarchy of collections and domains, is critical because the Data Map serves as the metadata backbone for Microsoft Purview. It underpins the Unified Catalog, enabling consistent governance, role-based access control, and discoverability across the enterprise. Designing this hierarchy thoughtfully ensures scalability, simplifies permissions management, and provides a solid foundation for implementing enterprise-wide data governance.

Purview Integration with Azure Databricks

Databricks Workspace Structure

In Azure Databricks, each region supports a single Unity Catalog metastore, which is shared across all workspaces within that region. This centralized architecture enables consistent data governance, simplifies access control, and facilitates seamless data sharing across teams.
As an administrator, you can scan one workspace in the region using Microsoft Purview to discover and classify data managed by Unity Catalog, since the metastore governs all associated workspaces in a region. If your organization operates across multiple regions and utilizes cross-region data sharing, please review the consideration and workaround outlined below to ensure proper configuration and governance. Follow the prerequisite requirements here before you register your workspace: Prerequisites to Connect and manage Azure Databricks Unity Catalog in Microsoft Purview.

Steps to Register a Databricks Workspace

Step 1: In the Microsoft Purview portal, navigate to the Data Map section from the left-hand menu. Select Data Sources. Click on Register to begin the process of adding your Databricks workspace.

Step 2: Note: There are two Databricks data sources; please review the documentation here for the differences in capability: Connect to and manage Azure Databricks Unity Catalog in Microsoft Purview | Microsoft Learn. You can choose either source based on your organization's needs. Recommended is "Azure Databricks Unity Catalog".

Step 3: Register your workspace. Here are the steps to register your data source: Steps to Register an Azure Databricks workspace in Microsoft Purview.

Step 4: Initiate a scan for your workspace, following the steps here: Steps to scan Azure Databricks to automatically identify assets. Once you have entered the required information, test your connection and click Continue to set up a scheduled scan trigger.

Step 5: For the scan trigger, choose whether to set up a schedule or run the scan once, according to your business needs.

Step 6: From the left pane, select Data Map and select the data source for your workspace. You can view a list of existing scans on that data source under Recent scans, or you can view all scans on the Scans tab. Review further options here: Manage and Review your Scans.
You can review your scanned data sources, history, and details here: Navigate to scan run history for a given scan.

Limitation: The "Azure Databricks Unity Catalog" data source in Microsoft Purview does not currently support connection via Managed VNet. As a workaround, the product team recommends using the "Azure Databricks Unity Catalog" source in combination with a Self-hosted Integration Runtime (SHIR) to enable scanning and metadata ingestion. You can find setup guidance here: Create and manage SHIR in Microsoft Purview; Choose the right integration runtime configuration. Scoped scan support for Unity Catalog is expected to enter private preview soon. You can sign up here: https://aka.ms/dbxpreview.

Considerations: If you have Delta Sharing between Databricks workspaces, you may see duplicated data assets if you scan both workspaces. The workaround: as you add tables/data assets to a data product for governance in Microsoft Purview, identify the duplicated assets using their Fully Qualified Name (FQN). To make identification easier, look for the keyword "sharing" in the FQN, which indicates a Delta-Shared table; you can also apply tags to these tables for quicker filtering and selection. The screenshot highlights how the FQN appears in the interface, helping you confidently identify and manage your data assets.

Purview Integration with Microsoft Fabric

Understanding Fabric Integration

Connect Cross-Tenant: This refers to integrating Microsoft Fabric resources across different Microsoft Entra tenants. It enables organizations to share data, reports, and workloads securely between separate tenants, often used in multi-organization collaborations or partner ecosystems. Key considerations include authentication, data governance, and compliance with cross-tenant policies.

Connect In-Same-Tenant: This involves connecting Fabric resources within the same Microsoft Entra tenant.
It simplifies integration by leveraging shared identity and governance models, allowing seamless access to data, reports, and pipelines across different workspaces or departments under the same organizational umbrella.

Requirements: An Azure account with an active subscription (create an account for free) and an active Microsoft Purview account. Authentication is supported via Managed Identity, or via Delegated Authentication and Service Principal.

Steps to Register a Fabric Tenant

Step 1: In the Microsoft Purview portal, navigate to the Data Map section from the left-hand menu. Select Data Sources. Click on Register to begin the process of adding your Fabric tenant (which also includes Power BI).

Step 2: Add a data source name and keep the Tenant ID as the default (auto-populated). Microsoft Fabric and Microsoft Purview should be in the same tenant.

Step 3: Enter a scan name and enable/disable scanning for personal workspaces. Under Credentials you will notice an automatically created identity for authenticating the Purview account. Note: If your Purview account is behind a private network, follow the guidelines here: Connect to your Microsoft Fabric tenant in same tenant as Microsoft Purview.

Step 4: From Microsoft Fabric, open Settings, click on Tenant Settings, and enable "Service principals can access read-only admin APIs", "Enhanced admin API responses with detailed metadata", and "Enhanced admin API responses with DAX and Mashup expressions" within the Admin API Settings section.

Step 5: Create a group, add the Purview account's managed identity to the group, and add the group under the "Service principals can access read-only admin APIs" section of your tenant settings inside Microsoft Fabric.

Step 6: Test your connection and set the scope for your scan. Select the required workspaces, click Continue, and automate a scan trigger.

Step 7: From the left pane, select Data Map and select the data source for your workspace.
You can view a list of existing scans on that data source under Recent scans, or you can view all scans on the Scans tab. Review further options here: Manage and Review your Scans. You can review your scanned data sources, history, and details here: Navigate to scan run history for a given scan.

Why Customers Love Purview

Kern County unified its approach to securing and governing data with Microsoft Purview, ensuring consistent compliance and streamlined data management across departments. EY accelerated secure AI development by leveraging the Microsoft Purview SDK, enabling robust data governance and privacy controls for advanced analytics and AI initiatives. Prince William County Public Schools created a more cyber-safe classroom environment with Microsoft Purview, protecting sensitive student information while supporting digital learning. FSA (Food Standards Agency) helps keep the UK food supply safe using Microsoft Purview Records Management, ensuring regulatory compliance and safeguarding critical data assets.

Conclusion

Purview's Unified Catalog centralizes governance across Discovery, Catalog Management, and Health Management. The governance features in Purview allow organizations to confidently answer critical questions: What data do we have? Where did it come from? Who is responsible for it? Is it secure and compliant? Can we trust its quality? Microsoft Purview, when integrated with Azure Databricks and Microsoft Fabric, provides a unified approach to cataloging, classifying, and governing data across diverse environments. By leveraging Purview's Unified Catalog, Data Map, and advanced governance features, organizations can achieve end-to-end visibility, enforce consistent policies, and improve data quality. You might ask, why does data quality matter? Well, in today's world, data is the new gold.
References

Microsoft Purview | Microsoft Learn
Pricing - Microsoft Purview | Microsoft Azure
Use Microsoft Purview to Govern Microsoft Fabric
Connect to and manage Azure Databricks Unity Catalog in Microsoft Purview

OpenAI's open‑source model: gpt‑oss on Azure AI Foundry and Windows AI Foundry
Open-weight models give customer decision makers control and flexibility - no black boxes, fewer trade-offs, and more options across deployment, compliance, and cost. With OpenAI's gpt-oss open-weight models on Azure AI Foundry, you can: Fine-tune and distill the models using your own data and deploy with confidence. Mix open and proprietary models to match task-specific needs. Spin up inference endpoints using gpt-oss in the cloud with just a few CLI commands. And Foundry Local makes gpt‑oss-20b usable on a high-performance Windows PC – enabling use cases in offline settings, on secure networks, or at the edge. Check out more here!

Summing row values while a specific column meets a specific condition
It's much smaller than the sheet I work with, but for this example I want to scan column 1 (Representative), and if the value in a cell from A2:A6 = "D", I want to sum its rows; in this case I expect the answer 3+1+8+1=13. I tried SUMIF, but it works for one column at a time rather than scanning multiple arrays. I tried SUMPRODUCT and still couldn't get the cell to show the total amount. Since my actual worksheet has a 37x57 array, going through manually and putting a formula in each specific row is less than ideal, so I am looking for a single formula that will calculate along the data as it changes on the sheet.

Unlocking Developer Innovation with Microsoft Sentinel data lake
Introduction

Microsoft Sentinel is evolving rapidly, transforming to be both an industry-leading SIEM and an AI-ready platform that empowers agentic defense across the security ecosystem. In our recent webinar, Introduction to Sentinel data lake for Developers, we explored how developers can leverage Sentinel's unified data lake, extensible architecture, and integrated tools to build innovative security solutions. This post summarizes the key takeaways and actionable insights for developers looking to harness the full power of Sentinel.

The Sentinel Platform: A Foundation for Agentic Security

Unified Data and Context

Sentinel centralizes security data cost-effectively, supporting massive volumes and diverse data types. This unified approach enables advanced analytics, graph-enabled context, and AI-ready data access—all essential for modern security operations. Developers can visualize relationships across assets, activities, and threats, mapping incidents and hunting scenarios with unprecedented clarity.

Extensible and Open Platform

Sentinel's open architecture simplifies onboarding and data integration. Out-of-the-box connectors and codeless connector creation make it easy to bring in third-party data. Developers can quickly package and publish agents that leverage the centralized data lake and MCP server, distributing solutions through Microsoft Security Store for maximum reach. The Microsoft Security Store is a storefront for security professionals to discover, buy, and deploy vetted security SaaS solutions and AI agents from our ecosystem partners. These offerings integrate natively with Microsoft Security products—including the Sentinel platform, Defender, and Entra, to deliver end‑to‑end protection. By combining curated, deploy‑ready solutions with intelligent, AI‑assisted workflows, the Store reduces integration friction and speeds time‑to‑value for critical tasks like triage, threat hunting, and access management.
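To make the hunting-and-analytics idea concrete, here is a minimal, self-contained sketch of the kind of behavioral-baselining pass a developer might prototype before scaling it onto the data lake. Everything here is hypothetical: the sample events, column layout, and the 3-sigma threshold are illustrative choices, not a Sentinel API.

```python
# Illustrative only: a tiny behavioral-baselining pass over sign-in counts,
# of the sort a developer might prototype in a notebook. Data and the
# 3-sigma threshold are hypothetical, not a Sentinel API or real telemetry.
from collections import defaultdict
from statistics import mean, pstdev

# (user, sign-ins on a given day) -- stand-in for months of sign-in logs
history = [("alice", 10), ("alice", 12), ("alice", 11), ("alice", 9),
           ("bob", 3), ("bob", 4), ("bob", 2), ("bob", 3)]

today = {"alice": 11, "bob": 40}  # bob's count is far outside his baseline

# Build a per-user baseline from the historical counts.
per_user = defaultdict(list)
for user, count in history:
    per_user[user].append(count)

def is_anomalous(user: str, count: int, sigmas: float = 3.0) -> bool:
    """Flag counts more than `sigmas` standard deviations above the mean."""
    baseline = per_user[user]
    mu, sd = mean(baseline), pstdev(baseline)
    return count > mu + sigmas * max(sd, 1e-9)  # guard zero-variance users

flagged = [u for u, c in today.items() if is_anomalous(u, c)]
print(flagged)  # bob's spike stands out; alice stays within her baseline
```

On the actual platform this per-user aggregation would run as a KQL query or Spark notebook job over months of retained logs, but the shape of the logic (baseline, deviation, threshold) is the same.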
Advanced Analytics and AI Integration

With support for KQL, Spark, and ML tools, Sentinel separates storage and compute, enabling scalable analytics and semantic search. Jupyter Notebooks hosted in on-demand Spark environments allow for rich data engineering and machine learning directly on the data lake. Security Copilot agents, seamlessly integrated with Sentinel, deliver autonomous and adaptive automation, enhancing both security and IT operations.

Developer Scenarios: Unlocking New Possibilities

The webinar showcased several developer scenarios enabled by Sentinel's platform components:

Threat Investigations Over Extended Timelines: Query historical data to uncover slow-moving attacks and persistent threats.
Behavioral Baselining: Model normal behavior using months of sign-in logs to detect anomalies.
Alert Enrichment: Correlate alerts with firewall and NetFlow data to improve accuracy and reduce false positives.
Retrospective Threat Hunting: React to new indicators of compromise by running historical queries across the data lake.
ML-Powered Insights: Build machine learning models for anomaly detection, alert enrichment, and predictive analytics.

These scenarios demonstrate how developers can leverage Sentinel's data lake, graph capabilities, and integrated analytics to deliver powerful security solutions.

End-to-End Developer Journey

The following steps outline a potential workflow for developers to ingest and analyze their data within the Sentinel platform.

Data Sources: Identify high-value data sources from your environment to integrate with Microsoft Security data. The journey begins with your unique view of the customer's digital estate. This is data you have in your platform today. Bringing this data into Sentinel helps customers make sense of their entire security landscape at once.

Data Ingestion: Import third-party data into the Sentinel data lake for secure, scalable analytics.
As customer data flows from various platforms into Sentinel, it is centralized and normalized, providing a unified foundation for advanced analysis and threat detection across the customer's digital environment.

Sentinel data lake and Graph: Run Jupyter Notebook jobs for deep insights, combining contributed and first-party data. Once data resides in the Sentinel data lake, developers can leverage its graph capabilities to model relationships and uncover patterns, empowering customers with comprehensive insights into security events and trends.

Agent Creation: Build Security Copilot agents that interact with Sentinel data using natural language prompts. These agents make the customer's ingested data actionable, allowing users to ask questions or automate tasks, and helping teams quickly respond to threats or investigate incidents using their own enterprise data.

Solution Packaging: Package and distribute solutions via the Microsoft Security Store, reaching customers at scale. By packaging these solutions, developers enable customers to seamlessly deploy advanced analytics and automation tools that harness their data journey, from ingestion to actionable insights, across their entire security estate.

Conclusion

Microsoft Sentinel's data lake and platform capabilities open new horizons for developers. By centralizing data, enabling advanced analytics, and providing extensible tools, Sentinel empowers you to build solutions that address today's security challenges and anticipate tomorrow's threats. Explore the resources below, join the community, and start innovating with Sentinel today!

App Assure: For assistance with developing a Sentinel Codeless Connector Framework (CCF) connector, you can contact AzureSentinelPartner@microsoft.com.
Microsoft Security Community: aka.ms/communitychoice

Next Steps: Resources and Links

Ready to dive deeper? Explore these resources to get started:

Get Educated!
Sentinel data lake general availability announcement
Sentinel data lake official documentation
Connect Sentinel to Defender Portal
Onboarding to Sentinel data lake
Integration scenarios (e.g. hunt | jupyter)
KQL queries
Jupyter notebooks (link) as jobs (link)
VS Code Extension
Sentinel graph
Sentinel MCP server
Security Copilot agents
Microsoft Security Store

Take Action!

Bring your data into Sentinel
Build a composite solution
Explore Security Copilot agents
Publish to Microsoft Security Store
List existing SaaS apps in Security Store

Sora 2 now available in Azure AI Foundry
Sora 2 in Azure AI Foundry stands out by combining OpenAI's most advanced video generation capabilities with the trusted infrastructure and security controls of Microsoft Azure. And through Azure AI Foundry and our responsible AI principles, we empower customers with embedded security, safety, and privacy controls. Azure AI Foundry offers a curated catalog of generative media models, including OpenAI's Sora, GPT-image-1 and GPT-image-1-mini, Black Forest Lab's Flux 1.1 and Kontext Pro, and more, empowering developers in your organization and your customers' organizations to serve creatives with new and unique capabilities, all without sacrificing the safety, reliability, and integration businesses expect. Learn more here!