Recent Discussions
IoT Hub Distributed Tracing
Hi, I have been following this guide: https://learn.microsoft.com/en-us/azure/iot-hub/iot-hub-distributed-tracing. I have done everything, and messages are being sent with trace states, but I am not receiving any logs in my container or Log Analytics workspace. I do get logs for other things, such as connections, but no distributed tracing logs. What could the issue be? Thanks.

Two node Azure Local cluster updated to different versions
I'm not really sure how it happened, but after trying to run an update against my Azure Local cluster, one of the two nodes has ended up at a higher version, and now the update process is failing because it detects that the nodes are at two different versions. Node 1 is at 26100.32690 and Node 2 is at 26100.32522. Retrying the update fails for the same reason. Is there a way to bring the node that has fallen behind up to the same version as the other?

Looking for an Azure solution for time series validation
Hi folks, I am planning to replace one of my old on-premises time-series process validation tools with an Azure-based solution. The solution should be able to validate time series coming from various sources, such as databases or CSV files (the data ultimately comes from different plants), and it should be able to perform certain calculations (based on defined mathematical functions) on those time series. After that, it should send the results to a centralized data warehouse where reporting can be performed on the gathered data with tools like Power BI or Grafana. I found one similar product, Azure Time Series Insights, but it is going out of support in 2025.

Creating Dashboards using KQL in Grafana
I want to create a dashboard in Grafana using KQL to see the number of incoming and outgoing messages on each Event Hubs topic. I tried the KubePodInventory and ContainerLog tables in Log Analytics, but I could not find anything related to messages.

Microsoft Entra / Azure Connect Reinstallation and Source Anchor Change
Hello everyone, I would like to talk about the possibility of changing the SourceAnchor in Azure Connect. Officially this is not supported by Microsoft, but there is still a way to do it via a few detours:

1. First of all, stop the running AD sync.
2. To make the changeover, all users must first be soft-deleted. The most practical way to do this is to synchronize an OU that contains no users. The Entra objects are now stored under "Deleted users".
3. Before restoring users who do not have an Exchange Online mailbox, remove Entra P1 or P2 licenses so that a second mailbox is not created.
4. Restore all users. After this has been done, remove the ImmutableID of the users via PowerShell: Get-MsolUser -All | Set-MsolUser -ImmutableID "$Null" (to run this for individual users, replace -All with -UserPrincipalName "example@email.de").
5. Once the ImmutableID has been removed for all users, their status in Entra must show as Cloud Only. If this is the case, you can start with the next steps. Note that the actions carried out above can lead to short-term outages and should therefore ideally be carried out before the weekend.
6. Next, carry out a clean uninstallation of Azure Connect. Here I would recommend the AD Sync uninstallation article from MSXFAQ, where this is well explained. When uninstalling, only carry out the steps that do not hinder a new installation; this is also well explained in the article.
7. After successfully uninstalling the AD sync, there may be delays, which is why I would recommend waiting 24 hours before reinstalling. The waiting time can be skipped; it still worked for me.
8. As soon as you have installed the AD sync with the new desired attribute, you can start the sync. The users should now be matched with the existing cloud objects via soft match.

If this does not work, you can remove the ImmutableID again or correct the errors via the AD Connect error display in Entra ID. Under "other errors" several errors may be displayed; in our case this was fixed by resolving all duplicate-attribute errors. I hope this has helped you a little. I am always open to feedback!

From AWS to Azure: Practical Lessons and Best Practices from Real-World Migrations
Cloud-to-cloud migrations—especially from AWS to Azure—are often seen as straightforward "lift-and-shift" exercises. In reality, they involve careful planning across architecture, networking, identity, and deployment practices to ensure stability, scalability, and long-term maintainability. Based on my experience working on large-scale migration programs, here are some key best practices that can significantly improve the success of AWS-to-Azure transitions.

1. Start with Architecture, Not Migration
One of the most common pitfalls is jumping directly into migration without defining the target architecture. Before moving workloads:
- Define landing zones and environment structure (Dev/UAT/Prod)
- Align networking, identity, and security models
- Map AWS services to Azure equivalents (e.g., EC2 → VM/VMSS, ALB → Application Gateway)

2. Prioritize Infrastructure as Code (IaC)
Manual changes during migration create long-term drift and instability. Best practices:
- Use IaC (Terraform/Bicep) for all infrastructure provisioning
- Capture any portal-level fixes back into code
- Maintain version-controlled deployments

3. Plan Capacity and Quotas Early
Capacity-related issues are often discovered too late during migration. From experience:
- Validate VM sizes and availability in target regions
- Plan capacity reservations if needed
- Align quotas with expected workload scale

4. Design Networking and Private Access Upfront
Networking is one of the most critical components in a migration. Key considerations:
- Use private endpoints for PaaS services
- Design subnet segmentation and NSGs carefully
- Ensure DNS resolution works across environments

5. Standardize Monitoring and Observability
Migration is not complete until the system is observable.
- Enable diagnostics and logs across all resources
- Integrate with Log Analytics / monitoring tools
- Define alerts for critical failures

6. Manage Security and Access with RBAC
- Use Azure AD-based authentication
- Assign least-privilege roles
- Store secrets in Key Vault

7. Expect Iterations — Not One-Time Deployment
Real-world migrations are iterative: initial deployment, fixes and adjustments, re-deployments, stabilization.

8. Strengthen Cross-Team Alignment
Large migrations involve multiple teams: infrastructure, application, database, platform. From experience:
- Early alignment reduces rework
- Clear ownership improves execution
- Structured communication avoids last-minute confusion

9. Capture Learnings and Standardize
Every migration teaches something: capacity gaps, deployment challenges, configuration improvements. Document lessons learned, reusable templates, and standard deployment patterns.

10. Leverage Automation and AI for Efficiency
As migrations scale, automation becomes critical.
- Use scripts and pipelines to reduce manual effort
- Automate repetitive validation steps
- Explore AI-driven approaches for log analysis and troubleshooting

AWS-to-Azure migration is not just a technical shift—it's an opportunity to modernize, standardize, and optimize your cloud platform. The key is to design before deploying, automate everything possible, plan for scale and security, and continuously improve based on real-world learnings.

Integrate Jenkins with Azure Databricks & GitHub into VSCode
Hello Team, greetings of the day! Hope you have a great day ahead. We have installed the Azure Databricks, GitHub, and Jenkins extensions in VSCode. Now the configuration part comes into the picture: we have configured Azure Databricks and logged into GitHub in VSCode. Now it is Jenkins' turn. We want to know how we can configure Jenkins with GitHub. All notebooks from Azure Databricks will be version-controlled in GitHub, and for that we want to use Jenkins. There is no documentation on how to do so. Can you guide us? Reference link: https://learn.microsoft.com/en-us/azure/databricks/dev-tools/ci-cd/ci-cd-jenkins Thank you in advance for any support or suggestions. Looking forward to your valuable input. Regards, Niral Dave.

Cloud-Native vs. Hybrid for the 2026 Workplace
When to choose cloud-native vs. hybrid for the 2026 workplace? Hi everyone, I am starting a discussion on the foundational phase of a project. As a computer engineer, I believe the most critical decision we face in 2026 is determining exactly when to step to a full cloud model versus maintaining a hybrid infrastructure. In my view, the decision is not about cost; it is about resiliency and high availability. I would like to exchange views with other engineers on these areas: latency, edge requirements, integration, and agility. In your experience, what are the tips that make you choose one over the other for a 2026 environment? I'm looking for technical architectural insights, not sales approaches.

Send custom data to Log Analytics in Azure Functions
When I send custom business data to Azure Log Analytics from an Azure Function (C# code), I want to use OperationalInsightsDataCollector, but I can't find the class in the Microsoft.Azure.OperationalInsights package from NuGet. Does anyone know whether it is obsolete?

AVD Environment – FSLogix Profile Login Failure – Write Protected Error
Hi, we are currently facing an issue with FSLogix user profiles in our environment and would appreciate your assistance in identifying and resolving the problem.

Issue Description: Users are unable to log in successfully, and we are encountering the following error message: "No Create access → The media is write protected."

Environment Details:
- Session Hosts: Microsoft Entra joined
- Users: Hybrid identities
- Profile Storage: Azure File Share
- Authentication Method: Identity-based access using Microsoft Entra Kerberos

Configuration Details: We have assigned the FSLogix user group the "Storage File Data SMB Share Contributor" role on the Azure file share. The registry entry for the Kerberos ticket has also been created. NTFS permissions have been configured via the Azure portal (Manage Access), granting Modify permissions to the FSLogix profile users on the file share folder. We can see that user profiles and corresponding VHDX files are being created successfully during login attempts.

Problem Statement: Despite the successful creation of profiles and VHDX files, users are still unable to log in, and the error mentioned above persists. We would like your guidance on:
- Possible causes of the "write protected" error despite correct role and NTFS permissions
- Any additional configurations or validations required for FSLogix with Entra Kerberos authentication
- Recommended troubleshooting steps or logs we should review to isolate the issue

Please let us know if you need any additional logs, screenshots, or configuration details from our end. Looking forward to your support. Best regards, Ravi Yadav

Azure Automation Hybrid Runbook Worker Supported OS
Hi everyone, we are currently in the process of updating our environment to Server 2025. Since mainstream support for Server 2022 ends in October this year, we would also like to update our on-premises Azure Automation Hybrid Runbook Worker from 2022 to 2025. As far as I can see from the supported operating systems list at https://learn.microsoft.com/en-us/azure/automation/extension-based-hybrid-runbook-worker-install?tabs=windows%2Cps#supported-operating-systems, the OS is only supported up to Server 2022, not Server 2025. Since the end of mainstream support is closing in, is there any information on official support for Server 2025 for Azure Automation HRWs? Do you already have one successfully running on Server 2025? Thanks!

Patterns for low-code Azure config state snapshot + recovery solution for resource groups
I'm looking for patterns that capture resource configuration changes over time and support best-effort recovery (redeployment) of resource configuration state. I understand that authoritative IaC (Bicep) would be the most mature option; however, I am wondering if anyone has ever implemented a solution similar to what I have described above. Ideally this would be a low-code, Azure-native solution.

MFA required for Global Admin without Conditional Access or PIM enforcement
Hi, I'm analyzing a break-glass account scenario in Microsoft Entra ID and would like to validate a behavior I'm observing. The account:
- Has the Global Administrator role (permanent assignment)
- Is excluded from all Conditional Access policies (fully validated)
- Is excluded from Authentication Methods policies and the MFA Registration Campaign (fully validated)
- Has no per-user MFA enabled (disabled)
- Is not subject to PIM-enforced MFA (the role is permanently active; no activation required)
- Security Defaults are disabled
- SSPR is not enforcing MFA

All configurable sources that could require MFA have been reviewed and fully ruled out. However, when signing into the Microsoft admin portals (Entra/Azure), MFA is still required and cannot be skipped. In the sign-in logs: Conditional Access → Not Applied, and Authentication Details show "MFA required in Azure AD" and "App requires multifactor authentication". Additionally, there is a Microsoft-managed policy, "Multifactor authentication for admins accessing Microsoft Admin Portals", but it is in report-only mode.

Question: Is Microsoft Entra ID enforcing MFA automatically for privileged roles (like Global Administrator) in admin portals, even when no Conditional Access or PIM policy requires it? And if so, is there any supported way to fully exclude a break-glass account from this behavior? Thanks in advance.

Using GitHub Copilot from an Azure Subscription
Hello, I have a question about how GitHub Copilot can be accessed and managed through an Azure subscription. If I get a GitHub Copilot license, how does my Azure subscription get linked to the billing and licensing?

Sign in to Azure DevOps
The https://dev.azure.com URL redirects to the landing page for the Azure DevOps product. I used to promote this as a URL to use to log in to the product. Since this year the page is missing the "Already have an account? Sign in to Azure DevOps" link. As far as I can see, there is no way to log in to Azure DevOps through this interface now. There is the usual "Sign in" in the top right, which redirects you to the Azure portal (at least it does for me). How are we supposed to log in to Azure DevOps? Old login:

Azure Artifact Signing: SignTool "Access is denied" with active Public Trust profile
I'm blocked on Azure Artifact Signing for Windows EXE signing. What is already confirmed:
- Account endpoint: https://wus2.codesigning.azure.net/
- Code signing account: notarios
- Certificate profile: notarios-public-trust (Public Trust, Active)
- Identity validation: Completed
- User object id: 9aa27294-c04d-4aab-a7b2-3a8b10be96f9
- RBAC includes: Artifact Signing Identity Verifier, and Artifact Signing Certificate Profile Signer (also assigned at certificate profile scope)

Signing command (signtool 10.0.26100.0 x64 + dlib): ... sign /v /debug /fd SHA256 /tr http://timestamp.acs.microsoft.com /td SHA256 /dlib "<...>\\Azure.CodeSigning.Dlib.dll" /dmdf "C:\temp\metadata-corr.json" "C:\temp\notarial-app-test.exe"

Error every time: "SignTool Error: Access is denied." and "Number of files successfully Signed: 0". I also tested Azure CLI auth and an explicit AccessToken in the metadata; same result. CorrelationId for troubleshooting: notarios-20260425-1859. If anyone from Microsoft can check backend logs for that CorrelationId, I'd appreciate the exact reason and remediation.

Legacy SSRS reports after upgrading Azure DevOps Server 2020 to 2022 or 25H2
We are currently planning an upgrade from Azure DevOps Server 2020 to Azure DevOps Server 2022 or 25H2, and one of our biggest concerns is reporting. We understand that Microsoft's recommended direction is to move to Power BI based on Analytics/OData. However, for on-prem environments with a large number of existing SSRS reports, rebuilding everything from scratch would require significant time and effort. Since the Warehouse and Analysis Services are no longer available in newer versions, we would like to understand how other on-prem teams are handling legacy SSRS reporting during and after the upgrade. Have you rebuilt your reports in Power BI, moved to another reporting approach, or found a practical way to keep existing SSRS reports available during the transition? Any real-world experience, lessons learned, or recommended approaches would be greatly appreciated.

Azure RBAC Custom Role Best Practices or Common Build Patterns
As a platform admin, I want to grant application admins Contributor access while removing their ability to write or delete most Microsoft.Network resource types, with a few exceptions such as Private Endpoints, Network Interfaces, and Application Gateways. Based on the effective control-plane permissions logic, we designed two custom roles. The first role is a duplicate of the Contributor role, but with Microsoft.Network/*/write and Microsoft.Network/*/delete added to notActions. The second role adds back specific Microsoft.Network operations using wildcarded resource types, such as Microsoft.Network/networkInterfaces/*.

Application Admin Effective Permissions = Role 1 (Contributor - Microsoft.Network) + Role 2 (for example, Microsoft.Network/networkInterfaces/*, Microsoft.Network/networkSecurityGroups/*, Microsoft.Network/applicationGateways/write, etc.)

I understand that Microsoft RBAC best practices recommend avoiding wildcard (*) operations. However, my team has found that building roles with individual operations is extremely tedious and time-consuming, especially when trying to understand the impact of each operation. Does anyone have suggestions for a simpler or more maintainable pattern for implementing this type of custom RBAC design?
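As an illustration only, the two-role pattern described in this post might be serialized like the following role definitions (the sort of JSON you would pass to `az role definition create`). The role names, subscription ID, and the exact list of excepted resource types are placeholders, and the Contributor notActions list is abbreviated; this is a sketch of the pattern, not a vetted role set.

```python
import json

# Placeholder scope; substitute your real subscription or management group.
SCOPE = "/subscriptions/00000000-0000-0000-0000-000000000000"

# Role 1: Contributor-like role minus write/delete on Microsoft.Network.
# notActions only carves operations out of THIS role's Actions; it is not a deny.
role_contributor_no_network = {
    "Name": "App Admin Contributor (No Network Writes)",  # hypothetical name
    "IsCustom": True,
    "Description": "Contributor, except write/delete on Microsoft.Network.",
    "Actions": ["*"],
    "NotActions": [
        "Microsoft.Authorization/*/Write",   # Contributor's own exclusions
        "Microsoft.Authorization/*/Delete",  # (abbreviated here)
        "Microsoft.Network/*/write",         # the added networking carve-outs
        "Microsoft.Network/*/delete",
    ],
    "AssignableScopes": [SCOPE],
}

# Role 2: adds back the excepted networking resource types. Because effective
# permissions are the UNION of all assigned roles, these Actions restore what
# Role 1's notActions carved out.
role_network_exceptions = {
    "Name": "App Admin Network Exceptions",  # hypothetical name
    "IsCustom": True,
    "Description": "Write access to a small set of networking resource types.",
    "Actions": [
        "Microsoft.Network/networkInterfaces/*",
        "Microsoft.Network/privateEndpoints/*",
        "Microsoft.Network/applicationGateways/*",
    ],
    "NotActions": [],
    "AssignableScopes": [SCOPE],
}

if __name__ == "__main__":
    for role in (role_contributor_no_network, role_network_exceptions):
        print(json.dumps(role, indent=2))
```

The key property this sketch relies on is that notActions is a filter within a single role definition rather than a deny assignment, so assigning both roles to the same principal yields Contributor minus networking writes plus the enumerated exceptions.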
Events
Join our upcoming live webcast for a transparent discussion about this recent Azure service incident — led by our engineering teams.
Control plane issues in East US
Tracking ID: 5GP8-W0G | Impact...
Thursday, May 14, 2026, 09:30 AM PDT (Online)
Recent Blogs
- Memory safety vulnerabilities—largely arising from widely used programming languages such as C and C++—remain a leading cause of exploitable software defects across systems, from embedded devices to ... (May 08, 2026)
- Inspektor Gadget, the CNCF eBPF tool for Kubernetes and Linux observability, has completed its first independent security audit, conducted by Shielder and coordinated by OSTIF and CNCF. The audit fou... (May 08, 2026)