Per certification designed badges
Hi,

First, Microsoft opted out of Credly (which was awesome, as learners collected all their personal certifications in one place regardless of vendor, and could easily share their Credly profile link for various reasons). And now you have stopped creating per-certification branded badges, providing only the standard "Associate" and "Expert" badges plus a "Learn diploma" that shows the name of the certification in text only (the new Fabric exam, for example).

For those of us globally in roles like Alliance Manager or Partner Manager, driving and summarizing partner excellence around Microsoft and promoting both our companies and Microsoft through marketing, this is bad! (Example of how we previously used the per-certification badges.)

Is it by mistake that you have taken this path? Or have my learners and I simply missed where per-exam branded badges for newer certifications can be downloaded now?

Regards, Gabriel

Enterprise Identity Meets Secure File Transfer: Entra ID Public Preview on Azure Blob Storage SFTP
We are excited to announce the public preview of Entra ID-based access for Azure Blob Storage SFTP. This new capability enables you to use Microsoft Entra ID (formerly Azure Active Directory) identities, including guest users via Entra External Identities, to securely connect to Azure Blob Storage via SFTP without needing local users.

This feature eliminates the operational overhead of managing local SFTP users and passwords by introducing enterprise-grade identity management powered by Microsoft Entra ID. For IT administrators and security teams, this means no more creating, tracking, rotating, or decommissioning local SFTP credentials. For developers and architects, it means seamless integration with your existing identity infrastructure. For business users, it means faster, more secure access to the data they need, all while maintaining compliance with enterprise security policies.

Azure Blob Storage SFTP

Azure Blob Storage SFTP natively enables secure file access and management without third-party solutions. This simplifies operations for customers and removes the need for complex, custom SFTP solutions. Until this public preview, Azure Blob Storage SFTP offered a single authorization mechanism: a form of identity management called local users. Local users must use either a password or a Secure Shell (SSH) private key credential for authentication. Learn more about local users here.

The Challenge: SFTP Local User Management

Organizations currently face challenges when managing SFTP access at scale with Azure Storage SFTP local users.
Local user-based SFTP access requires IT teams to:

- Manually create and provision local user accounts for each SFTP user
- Generate, distribute, and securely store SSH keys or passwords
- Implement custom workflows for lifecycle management
- Manage offboarding processes to ensure departed users lose access immediately
- Audit and track access across disconnected identity silos
- Handle external partner and vendor access through ad-hoc, often insecure methods

The Solution: Enterprise Identity Meets Secure File Transfer

With Entra ID-based access for Azure Blob Storage SFTP, you can now leverage your organization's centralized identity platform to authenticate and authorize SFTP users. This integration brings the full power of Microsoft Entra ID to your file transfer workflows, delivering the following benefits:

1. Eliminate Local User Management

Simplify SFTP management by assigning access with Entra ID, with no separate SFTP accounts needed.

- No local credential generation or distribution: users authenticate with their existing corporate credentials
- No orphaned accounts when users change roles or leave the organization
- Reduced attack surface by eliminating static, long-lived local credentials
- Centralized user lifecycle management through your existing identity platform

2. Enterprise-Grade Identity and Security

Leverage the full security capabilities of Microsoft Entra ID for your SFTP infrastructure:

- Multi-Factor Authentication (MFA): Require additional verification factors beyond passwords, significantly reducing the risk of account compromise
- Conditional Access: Define policies that grant or block access based on user location, device compliance, sign-in risk, and other conditions
- Identity Protection: Benefit from Microsoft threat intelligence and risk detection to identify and respond to compromised accounts
- Privileged Identity Management (PIM): Provide just-in-time elevated access for administrative operations

3. Native Azure RBAC, ABAC, and ACL Integration

Your SFTP access control integrates seamlessly with Azure's comprehensive authorization framework:

- Role-Based Access Control (RBAC): Assign built-in or custom roles at the storage account, container, or even blob level
- Attribute-Based Access Control (ABAC): Create sophisticated access policies based on resource tags, user attributes, and environmental conditions
- Access Control Lists (ACLs): Apply fine-grained permissions at the directory and file level for hierarchical namespace-enabled accounts
- Unified Permission Model: SFTP access respects the same permissions as the REST API, Azure CLI, and other access methods; there is no separate permission system to manage

4. Faster SFTP Onboarding and Time-to-Value

Onboard new SFTP users or partners in minutes instead of hours or days, saving significant time and boosting business agility.

5. Secure External Collaboration with Entra External Identities

Seamlessly enable secure external SFTP access by allowing partners to authenticate with their own credentials using Entra External Identities (Azure AD B2B):

- External users authenticate with credentials they already manage
- Full audit trail of external user activity
- Ability to apply Conditional Access policies to external users
- Automatic access revocation when B2B relationships end

Real World Scenarios

Financial Services: A bank receives daily transaction files from merchants via SFTP. Merchants authenticate with their own Entra ID credentials (B2B collaboration), MFA is enforced, and access is restricted to assigned directories. Access is instantly revoked when a merchant is removed from the B2B directory.

Healthcare: A hospital exchanges patient data with insurers and labs. Entra ID authentication ensures only authorized staff access sensitive PII, with full audit logs for HIPAA compliance. Conditional Access restricts connections to approved locations and devices.
Media & Entertainment: A production company enables freelance editors and agencies to transfer large media files. Entra External Identities provide time-limited access and automatic revocation when projects end, with no need for local SFTP accounts.

Manufacturing: A manufacturer receives CAD files and orders from suppliers using SFTP. With Entra ID, suppliers use unified credentials and access policies across all systems, streamlining supply chain management.

How It Works

Entra ID simplifies SFTP access to Azure Blob Storage by authenticating users with their corporate credentials. After authentication, users receive a short-lived OpenSSH certificate to connect. The service verifies certificate validity and user permissions, enabling secure file operations and automatic access revocation in line with current identity policies. Learn more here.

Getting Started with the Public Preview

We encourage you to try Entra ID-based access for Azure Blob Storage SFTP in your non-production environments today. Learn more about how to register for the preview and get started with the detailed Microsoft Learn guide here. This preview gives you an opportunity to shape the feature's development by providing feedback on what works well and what could be improved.

Note: Local user accounts for SFTP access are still supported, but we strongly recommend switching to Entra ID-based access for greater security, simpler management, and automatic access control.

Questions or feedback? We would love to hear from you! Reach out to our team at blobsftp@microsoft.com.

We are excited to bring enterprise-grade identity management to Azure Blob Storage SFTP, and we cannot wait to see how you use this capability to simplify operations, enhance security, and enable new collaboration scenarios. Happy transferring!

Entra ID Object Drift – Are We Measuring Tenant Health Correctly?
In many enterprise environments, Secure Score is green and compliance dashboards look healthy, yet directory object inconsistency silently accumulates: stale devices, hybrid join remnants, Intune orphan records. Over time, this becomes governance debt. In large tenants it often leads to inaccurate compliance reporting and Conditional Access targeting issues.

I recently wrote a breakdown of:

- Entra ID drift patterns
- Hybrid join inconsistencies
- Intune orphan objects
- Lifecycle-based cleanup architecture

Curious how others approach object hygiene at scale.

Full article: https://www.modernendpoint.tech/entra-id-cleanup-patterns/?utm_source=techcommunity&utm_medium=social&utm_campaign=entra_cleanup_launch&utm_content=discussion

One pattern I keep seeing is duplicate device identities after re-enrollment or Autopilot reset. Curious how others handle lifecycle cleanup in large Entra ID environments.

Update To API Management Workspaces Breaking Changes: Built-in Gateway & Tiers Support
What's changing?

If your API Management service uses preview workspaces on the built-in gateway and meets the tier-based limits below, those workspaces will continue to function as-is and will automatically transition to general availability once built-in gateway support is fully announced.

API Management tier        Limit of workspaces on built-in gateway
Premium and Premium v2     Up to 30 workspaces
Standard and Standard v2   Up to 5 workspaces
Basic and Basic v2         Up to 1 workspace
Developer                  Up to 1 workspace

Why this change?

We introduced the requirement for workspace gateways to improve reliability and scalability in large, federated API environments. While we continue to recommend workspace gateways, especially for scenarios that require greater scalability, isolation, and long-term flexibility, we understand that many customers have established workflows using the preview workspaces model or need workspaces support in non-Premium tiers.

What's not changing?

Other aspects of the workspace-related breaking changes remain in effect. For example, service-level managed identities are not available within workspaces. In addition to the workspaces support on the built-in gateway described above, Premium and Premium v2 services will continue to support deploying workspaces with workspace gateways.

Resources

- Workspaces in Azure API Management
- Original breaking changes announcements: Reduced tier availability; Requirement for workspace gateways

🚨 Partner‑Exclusive Event: AMA with Fabric Leadership
We're excited to invite Fabric Partner Community members to a live Ask Me Anything (AMA) with Fabric leadership: a rare opportunity to get direct answers and insights from the team shaping Azure Data and Microsoft Fabric.

Featured guest: Shireesh Thota, CVP, Azure Databases
Tuesday, March 24, 8:00–9:00 AM PT

With FabCon + SQLCon wrapping just days before, this session is designed for partners who want to go deeper: ask follow-up questions, pressure-test ideas, and understand what's next as they plan with customers.

Topics may include:

- What's next for Azure SQL, Cosmos DB, and PostgreSQL
- Guidance on SQL Server roadmap direction
- Deep-dive questions on SQL DB in Fabric
- Questions about the new DP-800 Analytics Engineer exam going into beta this month

Partners can submit any questions: technical, roadmap-focused, certification-related, or customer-scenario driven.

This event is exclusively available to members of the Fabric Partner Community. Not a member yet? Join the Fabric Partner Community to attend this AMA and unlock access to partner-only events like this: https://aka.ms/JoinFabricPartnerCommunity

Microsoft partners with DataBahn to accelerate enterprise deployments for Microsoft Sentinel
Enterprise security teams are collecting more telemetry than ever across cloud platforms, endpoints, SaaS applications, and on-premises infrastructure. Security teams want broader data coverage and longer retention without losing control of cost and data quality. This post explains the new DataBahn integration with Microsoft Sentinel, why it matters for SIEM operations, and how to think about using a security data pipeline alongside Sentinel for onboarding, normalization, routing, and governance.

DataBahn joins the Microsoft Sentinel partner ecosystem

This integration reflects Microsoft Sentinel's open partner ecosystem, giving customers choice in the partners they use alongside Microsoft Sentinel to manage their security data pipelines. DataBahn joins a broader set of complementary partners, enabling customers to tailor solutions for their unique security data needs. DataBahn is available through Microsoft Marketplace, and customers can apply existing Azure Consumption Commitments toward the purchase of DataBahn.

Why this matters for security operations teams

Security teams are under relentless pressure to ingest more data, move faster through SIEM migrations, and preserve data fidelity for detections and investigations, all while managing costs effectively. The challenge isn't just ingesting data, but ensuring the right telemetry arrives in a consistent, governed format that analysts and detections can trust. This is where a security data pipeline, alongside Microsoft Sentinel's native connectors and data collection rules (DCRs), can add value. It helps streamline onboarding of third-party and custom sources, improve normalization consistency, and provide operational visibility across diverse environments as deployments scale.

What the DataBahn integration is positioned to do with Microsoft Sentinel

Security teams want broader coverage and need to ensure third-party data is consistently shaped, routed, and governed at scale.
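To make the "shaping" problem concrete, here is a minimal sketch of what normalization upstream of ingestion looks like. It is not DataBahn's or Sentinel's actual API; the vendor formats and field names are invented for illustration, and the target fields merely resemble ASIM-style column names.

```python
# Toy illustration (hypothetical vendor formats, not a real pipeline API):
# map two different vendor log shapes onto one shared, ASIM-style schema
# before the records are ingested.

def normalize(record: dict, vendor: str) -> dict:
    """Map a vendor-specific record onto a shared schema."""
    if vendor == "vendor_a":
        return {
            "TimeGenerated": record["ts"],
            "SrcIpAddr": record["source_ip"],
            "EventResult": "Success" if record["ok"] else "Failure",
        }
    if vendor == "vendor_b":
        return {
            "TimeGenerated": record["eventTime"],
            "SrcIpAddr": record["client"]["ip"],
            "EventResult": record["outcome"].capitalize(),
        }
    raise ValueError(f"unknown vendor: {vendor}")

# Two structurally different inputs produce records in one schema.
a = normalize({"ts": "2026-02-01T08:00:00Z", "source_ip": "10.0.0.5", "ok": True}, "vendor_a")
b = normalize({"eventTime": "2026-02-01T08:00:01Z",
               "client": {"ip": "10.0.0.6"}, "outcome": "failure"}, "vendor_b")
print(sorted(a) == sorted(b))  # same field set regardless of vendor
```

The value of doing this upstream is that detections and analytics downstream only ever see one schema, no matter how many sources feed it.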
This is where a security data pipeline like DataBahn complements Microsoft Sentinel. Sitting upstream of ingestion, the pipeline layer standardizes onboarding and shaping across sources while providing operational visibility into data flow and pipeline health. Together, the collaboration focuses on reducing onboarding friction, improving normalization consistency, enabling intentional routing, and strengthening governance signals so teams can quickly detect source changes, parser breaks, or data gaps, all while staying aligned with Sentinel analytics and detection workflows. This model gives Sentinel customers more choice to move faster, onboard data at scale, and retain control over data routing.

Key capabilities

Bidirectional data integration

The integration enables seamless delivery of telemetry into Sentinel while aligning with Sentinel detection logic and schema expectations. This helps ensure telemetry pipelines remain consistent with:

- Sentinel detection formats
- Custom analytics rules
- Sentinel data models and schemas
- Automated table and DCR management

As detections evolve, pipeline configurations can adapt to maintain detection fidelity and data consistency.

Advanced management API

DataBahn provides an advanced management API that allows organizations to programmatically configure and manage pipeline integrations with Sentinel. This enables teams to:

- Automate pipeline configuration
- Manage operational workflows
- Integrate pipeline management into broader security or DevOps automation processes

Automatic identification of configuration conflicts

In complex environments with multiple telemetry sources and routing rules, configuration conflicts can arise across filtering logic, enrichment pipelines, and detection dependencies.
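A simplified sketch shows why such conflicts matter: a filter that drops fields a detection depends on can silently break that detection. This is a hypothetical illustration, not DataBahn's actual conflict-detection logic; the rule and detection structures are invented for the example.

```python
# Hypothetical sketch: flag "drop" filter rules that remove fields
# a downstream detection depends on (structures invented for illustration).

RULES = [
    {"name": "drop-noisy-dns", "action": "drop", "table": "DnsEvents", "fields": {"QueryName"}},
    {"name": "route-dns", "action": "route", "table": "DnsEvents", "fields": {"QueryName", "SrcIpAddr"}},
]

DETECTION_DEPENDENCIES = {
    "dns-tunneling-rule": {"table": "DnsEvents", "fields": {"QueryName"}},
}

def find_conflicts(rules, detections):
    """Return (rule_name, detection_name) pairs where a drop rule
    overlaps the table and fields a detection needs."""
    conflicts = []
    for rule in rules:
        if rule["action"] != "drop":
            continue
        for det_name, dep in detections.items():
            if dep["table"] == rule["table"] and rule["fields"] & dep["fields"]:
                conflicts.append((rule["name"], det_name))
    return conflicts

print(find_conflicts(RULES, DETECTION_DEPENDENCIES))
# A non-empty result means a filter could silently break a detection.
```

Real pipelines add enrichment and routing dependencies to the same check, but the core idea is the same: validate filter rules against detection requirements before they take effect.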
The integration helps automatically:

- Detect conflicts in filtering rules and pipeline logic
- Identify clashes with detection dependencies
- Highlight missing configurations or coverage gaps

Automated detection of configuration conflicts and pipeline rule dependencies

This visibility allows SOC teams to quickly identify issues that could impact detection reliability.

Centralized pipeline management

The integration enables centralized management of data collection and transformation workflows associated with Sentinel telemetry pipelines. This provides unified visibility and control across telemetry sources while maintaining compatibility with Sentinel analytics and detections. Centralized management simplifies operations across large environments where multiple telemetry pipelines must be maintained.

Centralized pipeline management for telemetry sources across the environment

Flexible data transformation and customization

Security telemetry often arrives in inconsistent formats across vendors and platforms. The platform supports flexible transformation capabilities that allow organizations to:

- Normalize logs into standard or custom Sentinel table formats
- Add or derive fields required by Sentinel detections
- Apply filtering or enrichment rules before ingestion

Configuration can be performed through a single-screen workflow, enabling teams to modify schemas and define filtering logic without disrupting downstream analytics.

Flexible data transformation to align telemetry with Microsoft Sentinel ASIM schemas

The platform also provides schema drift detection and source health monitoring, helping teams maintain reliable telemetry pipelines as environments evolve.

Closing

Effective security operations depend on how quickly a SOC can onboard new data, scale effectively, and maintain high-quality investigations.
Sentinel provides a cloud-native, AI-ready foundation to ingest security data from first- and third-party data sources, while enabling economical, large-scale retention and deep analytics using open data formats and multiple analytics engines. DataBahn's partnership with Sentinel is positioned as a pipeline layer that can help teams onboard third-party sources, shape and normalize data, and apply routing and governance patterns before data lands in Sentinel.

Learn more

- DataBahn for Microsoft Sentinel
- DataBahn press release: DataBahn Deepens Partnership with Microsoft Sentinel
- Microsoft Sentinel data lake overview - Microsoft Security | Microsoft Learn
- Microsoft Sentinel—AI-Ready Platform | Microsoft Security
- Connect Microsoft Sentinel to the Microsoft Defender portal - Unified security operations | Microsoft Learn
- Microsoft Sentinel data lake is now generally available | Microsoft Community Hub

February 2026 Recap: Azure Database for MySQL
We're excited to share a summary of the Azure Database for MySQL updates from the last couple of months.

Extended Support Timeline Update

Based on customer feedback requesting additional time to complete major version upgrades, we have extended the grace period before extended support billing begins for Azure Database for MySQL:

- MySQL 5.7: Extended support billing start date moved from April 1, 2026 to August 1, 2026.
- MySQL 8.0: Extended support billing start date moved from June 1, 2026 to January 1, 2027.

This update gives customers additional time to plan, validate, and complete upgrades while maintaining service continuity and security. We continue to recommend upgrading to a supported MySQL version as early as possible to avoid extended support charges and benefit from the latest improvements. Learn more about performing a major version upgrade in Azure Database for MySQL.

When upgrading using a read replica, you can optionally use the Rename Server feature to promote the replica and avoid application connection-string updates after the upgrade completes. Rename Server is currently in private preview and is expected to enter public preview around the April 2026 timeframe.

Private Preview: Fabric Mirroring for Azure Database for MySQL

This capability enables real-time replication of MySQL data into Microsoft Fabric with a zero-ETL experience, allowing data to land directly in OneLake in analytics-ready formats. Customers can seamlessly analyse mirrored data using Microsoft Fabric experiences, while isolating analytical workloads from their operational MySQL databases.

Stay Connected

We welcome your feedback and invite you to share your experiences or suggestions at AskAzureDBforMySQL@service.microsoft.com.

Stay up to date by visiting What's new in Azure Database for MySQL, and follow us on YouTube | LinkedIn | X for ongoing updates. Thank you for choosing Azure Database for MySQL!