Recent Discussions
Data Collection Rule : XPath queries to filter 7036 without WMI etc
Hi, in PowerShell on a server I'm trying to filter events with Event ID 7036 (Service Control Manager service start/stop). I want to exclude the "WMI Performance Adapter" service so those events are not ingested into the Log Analytics workspace by the data collection rule. Can you tell me what I'm doing wrong?

$XPath = 'System!*[System[(EventID="7036")]] and [EventData[Data[@Name="param1"]!="WMI Performance Adapter"]]'
Get-WinEvent -FilterXPath $XPath

Get-WinEvent : Could not retrieve information about the Security log. Error: Attempted to perform an unauthorized operation..
At line:3 char:1
+ Get-WinEvent -FilterXPath $XPath
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Get-WinEvent], Exception
    + FullyQualifiedErrorId : LogInfoUnavailable,Microsoft.PowerShell.Commands.GetWinEventCommand
Get-WinEvent : No events were found that match the specified selection criteria.
At line:3 char:1
+ Get-WinEvent -FilterXPath $XPath
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : ObjectNotFound: (:) [Get-WinEvent], Exception
    + FullyQualifiedErrorId : NoMatchingEventsFound,Microsoft.PowerShell.Commands.GetWinEventCommand

$XPath = 'System!*[System[(EventID="7036")]] and [EventData[Data[@Name="param1"]!="WMI Performance Adapter"]]'
Get-WinEvent -LogName 'System' -FilterXPath $XPath

Get-WinEvent : The specified query is invalid
At line:2 char:1
+ Get-WinEvent -LogName 'System' -FilterXPath $XPath
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo          : NotSpecified: (:) [Get-WinEvent], EventLogException
    + FullyQualifiedErrorId : System.Diagnostics.Eventing.Reader.EventLogException,Microsoft.PowerShell.Commands.GetWinEventCommand
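The query fails for two reasons: Get-WinEvent's -FilterXPath expects a bare XPath expression (the "System!" channel prefix belongs only in the DCR's xPathQueries entry, and without -LogName the cmdlet scans every log, hence the Security log access error), and the second predicate is missing its leading "*". A minimal sketch of the corrected query, assuming the service display name really lands in param1 of event 7036:

# Test locally first; note the leading * on both predicates.
$XPath = '*[System[(EventID=7036)]] and *[EventData[Data[@Name="param1"]!="WMI Performance Adapter"]]'
Get-WinEvent -LogName 'System' -FilterXPath $XPath -MaxEvents 10

# In the data collection rule itself, the same query carries the channel prefix:
# System!*[System[(EventID=7036)]] and *[EventData[Data[@Name='param1']!='WMI Performance Adapter']]

If the local test returns the expected filtered events, the prefixed form should behave the same way in the DCR.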
Azure Static Web App CI/CD

Hi everyone! I know this is a silly question, but I want to ask why, after I connected my Azure Static Web App to my GitHub repository and it created the GitHub workflow, the workflow run for my commit failed. I haven't finished setting up some other resources yet: I have only connected my Static Web App URL to my Azure Maps, other resources still need to be deployed, and I still need to properly wire the backend to my Azure AI Services. Thanks in advance!
[Design Pattern] Handling race conditions and state in serverless data pipelines

Hello community, I recently faced a tricky data engineering challenge involving a large set of Parquet files (about 2 million records) that needed to be ingested, transformed, and split into different entities. The hard part wasn't the volume but the logic: we needed to generate globally unique, sequential IDs for specific columns while keeping the execution time under two hours, and we were restricted to using only Azure Functions, ADF, and Storage. This created a conflict: we needed parallel processing to meet the time limit, but parallel processing usually breaks sequential ID generation due to race conditions on the counters. I documented the three architecture patterns we tested to solve this:

1. Sequential processing with ADF (safe, but failed the 2-hour time limit).
2. Parallel processing with external locking/ETags on Table Storage (too complex, and we still hit issues with inserts).
3. A "fan-out/fan-in" pattern using Azure Durable Functions and Durable Entities.

We ended up going with Durable Entities. Since they act as stateful actors, they allowed us to handle the ID counter state sequentially in memory while the heavy lifting (transformation) ran in parallel. It solved the race condition issue without killing performance. I wrote a detailed breakdown of the logic and trade-offs here if anyone is interested in the implementation details: https://medium.com/@yahiachames/data-ingestion-pipeline-a-data-engineers-dilemma-and-azure-solutions-7c4b36f11351

I am curious whether others have used Durable Entities for this kind of ETL work, or whether you usually rely on an external database sequence to handle ID generation in serverless setups? Thanks, Chameseddine
Azure password protection

We have a hybrid Azure infrastructure with Microsoft Entra Connect installed on-prem and configured for PTA. We installed the password protection proxy server and registered it with the Azure tenant, then deployed the DC agent on all domain controllers. Both the proxy and the agents are operational. We published a few banned words to block in case anyone uses them. For testing, I changed my password to include one of the banned words. To my surprise, the password change was accepted. I checked the corresponding logon server, and the DC's Event Viewer showed that the password was validated, even though the banned word is on the banned password list that is set to be enforced. Why is it not blocking the change?
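Two things commonly explain this. First, confirm the policy mode is Enforced rather than Audit. Second, a banned word inside a password does not automatically fail it: the DC agent normalizes the password, then scores it (each banned-word match counts one point, each remaining character one point), and only rejects when the total score is below five, so a short banned word padded with enough other characters can still pass. A minimal sketch to see what the DC agent actually decided, assuming the default DC agent event channel:

# Run on the DC that processed the change (the user's logon server).
Get-WinEvent -LogName 'Microsoft-AzureADPasswordProtection-DCAgent/Admin' -MaxEvents 20 |
    Format-List TimeCreated, Id, Message

# Audit-mode "would have been rejected" results and enforced rejections are
# logged under different event IDs, so the Message text shows which path ran.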
How to troubleshoot if a cookie is being sent to application gateway with each and every request

I have a custom rule (set as the topmost rule) on the WAF policy associated with my application gateway that allows traffic if a particular cookie is sent with the request. But we are seeing some requests that do not hit that rule, instead hit a different rule, and thus get blocked. My theory is that the cookie is not being sent by the application in those requests, although the developer says it should be sent with every request. How can I log enough detail on the application gateway to see whether a cookie was actually sent with a request that was blocked?
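The firewall log records which rule matched each blocked request, which at least confirms whether your allow rule was evaluated; note that Application Gateway's access and firewall logs do not capture full request cookies, so to prove the cookie was on the wire you still need a client-side capture (browser dev tools or a HAR file) correlated by timestamp or transaction ID. A hedged sketch for enabling the logs, assuming the Az.Monitor 3.x diagnostic-setting cmdlets; resource names are illustrative:

$gw  = Get-AzApplicationGateway -Name "my-appgw" -ResourceGroupName "my-rg"
$law = "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.OperationalInsights/workspaces/my-law"

# Send access and firewall logs to a Log Analytics workspace for correlation.
$logs = @(
    New-AzDiagnosticSettingLogSettingsObject -Enabled $true -Category "ApplicationGatewayAccessLog"
    New-AzDiagnosticSettingLogSettingsObject -Enabled $true -Category "ApplicationGatewayFirewallLog"
)
New-AzDiagnosticSetting -Name "appgw-waf-debug" -ResourceId $gw.Id -WorkspaceId $law -Log $logs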
Consumer REST API for Azure Event Hub

Hello, we have an existing setup where we use Kafka, with a Kafka client written in Java that talks to the Kafka server: a Java producer that sends data for a topic, and a Java consumer that subscribes to that topic and gets the data from Kafka. We are now trying to offer similar support for Azure Event Hubs. From the documentation and examples, I can see that we can add similar code in Java, basically a Java producer and consumer that talk to the Event Hubs service. We are now trying to do the same with a REST API. I see that there is an API to send data to Event Hubs, but I don't see any field in it to include the topic. In addition, is there an API to consume data for a particular topic, or to subscribe to a particular topic? (Since the documentation mentions nothing about a consumer API, I am assuming there is no consumer REST API support in Azure Event Hubs, but I wanted to confirm. Logically, too, a REST API is not a good fit for this kind of system, where producer and consumer behave in an async fashion.) Please let me know. Thanks, Om
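In Event Hubs the event hub itself is the equivalent of a Kafka topic, so the "topic" appears in the request path rather than in a body field, and there is indeed no REST endpoint for consuming (consumption goes through AMQP or the Kafka-compatible endpoint). A minimal send sketch, assuming a namespace, hub, and SAS key of your own:

$namespace = "mynamespace"           # illustrative
$eventHub  = "myhub"                 # plays the role of the Kafka topic
$keyName   = "RootManageSharedAccessKey"
$key       = "<shared-access-key>"

# Build a SAS token for the send request.
$uri        = "https://$namespace.servicebus.windows.net/$eventHub"
$encodedUri = [Uri]::EscapeDataString($uri)
$expiry     = [DateTimeOffset]::UtcNow.AddMinutes(10).ToUnixTimeSeconds()
$hmac       = [System.Security.Cryptography.HMACSHA256]::new([Text.Encoding]::UTF8.GetBytes($key))
$signature  = [Convert]::ToBase64String($hmac.ComputeHash([Text.Encoding]::UTF8.GetBytes("$encodedUri`n$expiry")))
$sas        = "SharedAccessSignature sr=$encodedUri&sig=$([Uri]::EscapeDataString($signature))&se=$expiry&skn=$keyName"

# POST one event; note the hub name lives in the URL path, not the payload.
Invoke-RestMethod -Method Post -Uri "$uri/messages" `
    -Headers @{ Authorization = $sas } `
    -ContentType "application/atom+xml;type=entry;charset=utf-8" `
    -Body '{"device":"sensor-1","reading":42}'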
Hi everyone, I'm excited to share that I cleared the GH‑900 (GitHub Foundations) exam with a good score! This certification validates my understanding of Git, repository collaboration, pull requests, and GitHub's core features.

Preparation approach: I studied using Microsoft Learn resources and the GH‑900 study guide. For extra practice and exam-style questions, I used dumps-4-azure, which gave me an extra edge in exam readiness. I also practiced hands-on with real GitHub workflows (branches, pull requests, projects) to reinforce my understanding.

Key takeaways: the exam tests foundational Git and GitHub collaboration skills, not just theory; practical experience combined with mock questions made a big difference; consistency in daily preparation is the key.

Next steps: after GH‑900, I'm planning to go for GH‑100 (GitHub Administration) to deepen my GitHub skills at the organizational level.
MetricsQueryClient returning different results based on timespan

I'm using the Python MetricsQueryClient to list how many tokens were used on certain days via the APIM policy "azure-openai-emit-token-metric". The problem is that when I call the query_resource() function with "timespan" set to the entire month of October, I get different token counts for today's date than when I set "timespan" to just the last 48 hours. For example, with the timespan set to 10/20/2024 through 10/22/2024, I see 34 prompt tokens for today's date; but with the timespan set to 10/1/2024 through 11/1/2024, I see 0 prompt tokens for today's date. Is this a known issue? Is it documented somewhere?
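The usual culprit is granularity: when it isn't pinned, the service picks a default interval that grows with the timespan, so a month-long query can bucket the data differently than a 48-hour query. Try passing granularity=timedelta(days=1) to query_resource and compare again. A hedged cross-check from outside the SDK, assuming Az.Monitor; the resource ID and metric name are illustrative and should match whatever your APIM policy actually emits:

$apim = "/subscriptions/<sub-id>/resourceGroups/my-rg/providers/Microsoft.ApiManagement/service/my-apim"

# Query with an explicit one-day grain over both windows; if today's bucket
# now agrees across windows, the default interval was the issue.
Get-AzMetric -ResourceId $apim -MetricName "PromptTokens" `
    -StartTime (Get-Date).Date.AddDays(-2) -EndTime (Get-Date) `
    -TimeGrain ([TimeSpan]::FromDays(1)) -AggregationType Total |
    Select-Object -ExpandProperty Data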
PAAS resource metrics using Azure Data Collection Rule to Log Analytics Workspace (Solved)

Hi team, I want to build a use case that pulls Azure PaaS resource metrics using an Azure DCR and pushes those metrics to a Log Analytics workspace, which will in turn stream the data to Azure Event Hubs, with Azure Postgres as the final destination: a centralized table holding all the resource metrics, used to build KPIs and dashboards that help clients utilize their resources better. I have not used the diagnostic-settings option, since it has its cons: settings must be enabled manually for each resource, and the information extracted through diagnostic settings is limited. But while implementing this, I saw multiple articles stating that DCRs are not used for pulling PaaS metrics and are only compatible with VM metrics. I want to understand: is it possible to use a DCR for PaaS metrics? Thanks in advance for any input.
Issue with Hyper-V VM on Tagged VLAN – Traffic Reaches Local Hosts but Not External Networks

Hi everyone, I'm having an issue getting a Hyper-V VM to work correctly when using a tagged VLAN interface. I have a test VM configured with a trunk port and a tagged VLAN. Here is the configuration I'm using:

Set-VMNetworkAdapterVlan -VMName "testvlan" -Trunk -NativeVlanId 2 -AllowedVlanIdList "4"

The strange part is this: when the VM is on VLAN 4 (tagged), it can reach other resources on the same VLAN as long as those resources are running on the same Hyper-V host. But if the target resource is outside the Hyper-V host, the VM cannot reach it at all. The hardware vendor has already ruled out any issue with the top-of-rack switches interconnecting the hosts. If I reconfigure the VM's network adapter in access mode on the same VLAN, all traffic works normally and the VM can reach resources outside the host without any problem. So it seems that traffic leaves the host correctly only when the adapter is in access mode, not when using a trunk with VLAN tagging. Has anyone seen this behavior before, or does anyone have suggestions on what to check next?
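This pattern (same-host traffic works, off-host traffic doesn't) usually means the frames leaving the physical NIC aren't tagged the way you expect: same-host traffic never touches the physical switch, so a tagging mismatch only shows up off-host. In trunk mode the guest OS must apply the VLAN 4 tag itself on its in-guest NIC; anything it sends untagged falls into native VLAN 2. Also confirm the physical switchport facing the host is a trunk carrying both VLAN 2 and VLAN 4. A minimal sketch:

# Confirm what the vNIC is actually configured for.
Get-VMNetworkAdapterVlan -VMName "testvlan"

# If the guest is not tagging its own traffic and should simply live on
# VLAN 4, access mode (the mode that worked) is the correct setting:
Set-VMNetworkAdapterVlan -VMName "testvlan" -Access -VlanId 4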
MS SQL backup immutability

Hello. What is your experience with enabling immutability for MS SQL backups while running Always On AGs on VMs? Backups must be locked and not be modifiable after they are written. I have looked at about seven different solutions, but none of them seems ideal. Thanks for your time!
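One combination worth evaluating is BACKUP TO URL into a blob container carrying a time-based retention (WORM) policy: the backup becomes non-modifiable once written, and the AG's backup preferences control which replica takes it. A hedged sketch of the storage side, assuming the Az.Storage management cmdlets; names and the retention period are illustrative, and you should validate that backups complete successfully against a policy-enabled container before locking anything:

# Apply time-based retention to the container that BACKUP TO URL targets.
Set-AzRmStorageContainerImmutabilityPolicy -ResourceGroupName "sql-rg" `
    -StorageAccountName "sqlbackups01" -ContainerName "backups" `
    -ImmutabilityPeriod 35

# Once validated, lock the policy (a locked policy can be extended but not
# shortened or removed) with Lock-AzRmStorageContainerImmutabilityPolicy,
# which requires the policy's ETag.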
Azure SQL DTU or vCore

Hello everyone, I have a Windows server with SQL Server 2016 Standard Edition that contains 11 databases of various sizes (some a few gigabytes, others reaching 150 GB). The server has 4 vCores and 16 GB of RAM, and since it is a test environment we have no big problems using it with those resources. Take into account that on that server: 1) few users are connected, and only on some days of the week; 2) we use the SQL Agent service, Database Mail, linked servers, and integrated authentication against AD (synchronized with AAD). I have looked at the Azure cost calculator but I have doubts (and above all a little confusion!) about which type of PaaS service would be better to use. I would certainly choose serverless, but there are two service types: SQL Database and SQL Managed Instance. For Azure SQL Database there is the "single database" or "elastic pool" option, and for both the purchase model is DTU- or vCore-based. I would therefore like your opinion on the best solution to adopt while keeping costs as low as possible, since this is a test environment. Thank you!
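Two facts narrow this down. Serverless with auto-pause (which is what saves money on a sometimes-idle test box) exists only in the vCore model for SQL Database; there is no DTU serverless, and Managed Instance is provisioned-only. On the other hand, SQL Agent, Database Mail, and linked servers are not available in SQL Database but are in Managed Instance, so the real trade-off is those features versus the cheapest idle cost. A hedged sketch of one serverless database, assuming Az.Sql; names and sizes are illustrative:

# One serverless GP database that auto-pauses after an hour of inactivity.
New-AzSqlDatabase -ResourceGroupName "test-rg" -ServerName "test-sqlsrv" `
    -DatabaseName "AppDb1" -Edition GeneralPurpose -ComputeModel Serverless `
    -ComputeGeneration Gen5 -VCore 2 -MinimumCapacity 0.5 `
    -AutoPauseDelayInMinutes 60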
Custom Script Extensions and Session Host Configuration

Currently the Custom Script Extension functionality definable in the Session Host Configuration only allows you to define a script URL. What is the intended mechanism of authentication for this solution? At the moment it seems only possible to use a blob with anonymous access. Defining a token within the script URL is not great, because the URL is viewable in plain text via the Azure portal. Neither of those options satisfies our requirements for CSE configuration via the Session Host Configuration during deployment. Key Vault references can be used when defining credentials for domain join and local admin accounts for the session hosts. Would it be possible to have Key Vault references for the CSE storage account name/key or SAS token, or the option to define a managed identity instead? These can be defined when deploying CSEs manually. Please advise on the best solution for this.
Azure File copy task v4 and later causes 403 error

I've configured a release pipeline in ADO which copies some files to a storage account. Using Azure File Copy task version 6 consistently fails with a 403 error:

RESPONSE Status: 403 This request is not authorized to perform this operation using this permission.

After much wasted time checking IP restrictions, checking access, and recreating service connections, I tried using an earlier version of the task that some other pipelines doing the same thing were using. I found that using version 4 or later of the File Copy task causes the issue; setting the task version to 3 works. Are there any known issues around this?
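This matches a well-known behavior change: from v4 onward the task uses AzCopy v10 with Microsoft Entra authentication, so the service connection's principal needs data-plane RBAC on the storage account (hence the "not authorized ... using this permission" 403), whereas v3 relied on key-based access. A minimal sketch of the fix, assuming names of your own:

# Grant the service connection's principal blob data rights on the account.
$sa = Get-AzStorageAccount -ResourceGroupName "my-rg" -Name "mystorageacct"
New-AzRoleAssignment -ObjectId "<service-principal-object-id>" `
    -RoleDefinitionName "Storage Blob Data Contributor" -Scope $sa.Id

Role assignments can take a few minutes to propagate before the pipeline starts succeeding.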
Issue with AVD User Profile – FSLogix Not Recreating

Hi all, we have a user who has repeatedly reported that their settings and favorites are not loading in AVD. To troubleshoot, we deleted the user's FSLogix profile from our storage account to allow it to be recreated automatically. However, the profile is not being recreated. We are operating in a hybrid environment, and the user is part of a group assigned the Storage File Data SMB Share Elevated Contributor role. From the profile logs, we found the following error:

FindFile failed for path: \\<redacted>.file.core.windows.net\userprofiles\<redacted>\Profile*.VHD (Account restrictions are preventing this user from signing in. For example: blank passwords aren't allowed, sign-in times are limited, or a policy restriction has been enforced.)

What are some likely causes, and what additional troubleshooting steps should we take?
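That message wraps the Windows ERROR_ACCOUNT_RESTRICTION error, which in Azure Files hybrid setups usually points at Kerberos authentication to the share rather than share-level RBAC; one frequent cause is the password on the storage account's AD computer/service object having drifted or expired (the AzFilesHybrid module ships Update-AzStorageAccountADObjectPassword for exactly this). A couple of quick checks from an affected session host; the hostname is illustrative:

# Is SMB even reachable from the session host?
Test-NetConnection -ComputerName "<account>.file.core.windows.net" -Port 445

# After a failed logon attempt, try to obtain a Kerberos ticket for the
# share's service principal and inspect the result.
klist get cifs/<account>.file.core.windows.net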
The November Innovation Challenge Winning Teams!

We run the Innovation Challenge program because we believe the only way we can have the best AI platform for every person and every organization is by having a truly diverse and highly skilled community of developers building AI solutions on Azure. We run the Innovation Challenge program because we are geeks who love a good hackathon. We run the Innovation Challenge program because we get blown away by what our community can do. From our first Innovation Challenge hackathon in June of 2024 to our sixth that just finished in November of 2025, the growth curve is steep! Our judges work with the best development teams in the world, delivering cutting-edge AI solutions. But even with our front-row view of things, we are amazed by what can be done today when ad hoc teams come together, despite limited resources and tight deadlines.

Participants were asked to choose one of these real-world use cases:

- Auto-resolve Service Desk: Create a multi-agent service desk experience that reduces wait times and backlog while earning trust through safe automation, transparency, and graceful escalation.
- Civic Chat: Build an intelligent civic engagement platform that enables communities to access local government information, participate in discussions, and receive personalized updates using Azure AI services.
- Customer Personalization Orchestrator: Build a team of agents that segments customers, retrieves product content, creates message variants, and executes A/B/n experiments, with safety checks for content and proof of uplift.

This time around there were 76 projects from over 300 participants representing more than a dozen organizations in the program. The winners chosen by the judges came from Código Facilito, DIO, GenSpark, Project Blue Mountain, and Women in Cloud.

First place ($10,000):
- AgroHelpdesk: an intelligent service desk for agribusiness that uses a coordinated set of AI agents

Second place ($5,000):
- CivicUtopia: an intelligent and inclusive civic engagement platform designed to streamline how citizens interact with their local governments and political landscape.
- Multi-Agent Service Desk for Education: large educational institutions struggle with repetitive service desk requests (password resets, course enrollment inquiries, transcript requests, and more). This solution intelligently resolves routine cases while escalating only the complex ones to human staff.

Third place ($2,500):
- ResolveIQ: an intelligent helpdesk solution that uses autonomous AI agents, advanced orchestration, and Azure cognitive services to revolutionize customer support and internal assistance.
- ChainReach AI: a multi-agent system that automatically personalizes marketing campaigns at scale
- CivicChat (D.C.): a multilingual, AI-powered civic engagement assistant designed to make government information accessible, trustworthy, and easy to understand

Tune into Microsoft DevRadio over the next couple weeks to meet these teams!
Container on App Service keeps getting stopped and terminated (Solved)

I've got a .NET app running in a Docker container that I'm trying to run on a Linux App Service, but as per the (sanitised) log output below from the Platform log stream, it's getting terminated only 4 seconds after it started. Where can I get information on why this is happening?

Starting container: a0e3af0a_myapp-dev-as.
Starting watchers and probes.
Starting metrics collection.
Container is running.
Container start method finished after 1990 ms.
Container is terminating. Grace period: 0 seconds.
Stop and delete container. Retry count = 0

Timestamps removed as the forum doesn't seem to like log output.
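The usual first suspects are the application itself exiting (check the container's stdout/stderr in the Docker logs under /home/LogFiles, or the "Diagnose and solve problems" blade) and the platform failing its port probe: App Service pings the port it believes the container listens on (80 by default unless the image indicates otherwise) and terminates the container when the probe fails, so if the .NET app listens elsewhere, set WEBSITES_PORT. A hedged sketch with Az PowerShell; names are illustrative, and the copy loop is there because Set-AzWebApp -AppSettings replaces the full set:

$rg  = "myapp-dev-rg"
$app = Get-AzWebApp -ResourceGroupName $rg -Name "myapp-dev-as"

# Preserve existing settings, then add/override WEBSITES_PORT.
$settings = @{}
foreach ($s in $app.SiteConfig.AppSettings) { $settings[$s.Name] = $s.Value }
$settings["WEBSITES_PORT"] = "8080"   # the port your container actually listens on

Set-AzWebApp -ResourceGroupName $rg -Name "myapp-dev-as" -AppSettings $settings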
CMK and Customer Certificate support for TDE - Azure SQL PAAS

Hi experts, I need a bit of clarity: CMK is supported for Azure SQL TDE (server and DB), and a certificate can also protect the DEK. How are these two concepts different when it comes to protecting the DEK in Azure SQL PaaS?

CMK - https://learn.microsoft.com/en-us/azure/azure-sql/database/transparent-data-encryption-byok-overview?view=azuresql-mi
Certificate - https://learn.microsoft.com/en-us/sql/relational-databases/security/encryption/transparent-data-encryption?view=sql-server-ver16

Does this mean I can protect the DEK with both my own customer certificate and a CMK? Thank you
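The second link describes SQL Server (on-premises or in a VM), where the DEK is protected by a certificate stored in the master database. In Azure SQL PaaS you cannot bring your own certificate: the DEK's protector is either the default service-managed key or a customer-managed key (CMK) in Azure Key Vault, one or the other, never both. A minimal sketch of switching a logical server to a CMK protector, assuming Az.Sql and a Key Vault key of your own:

# The key must first be registered with Add-AzSqlServerKeyVaultKey.
# Then point the server's TDE protector at it (BYOK):
Set-AzSqlServerTransparentDataEncryptionProtector -ResourceGroupName "my-rg" `
    -ServerName "my-sqlserver" -Type AzureKeyVault `
    -KeyId "https://my-kv.vault.azure.net/keys/tde-key/<key-version>"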
Azure SQL Database : Can I use same primary key column and foreign key column for multiple tables?

CREATE TABLE Table1 (
    Table1ID int NOT NULL PRIMARY KEY,
    Column2 int
);

CREATE TABLE Table2 (
    Table1ID int NOT NULL PRIMARY KEY,
    Column2 int,
    FOREIGN KEY (Table1ID) REFERENCES Table1(Table1ID)
);

CREATE TABLE Table3 (
    Table1ID int NOT NULL PRIMARY KEY,
    Column2 int,
    FOREIGN KEY (Table1ID) REFERENCES Table1(Table1ID)
);
Can anyone attest to the accuracy of an Azure Migrate Business Case?

Hello! I've only created a business case in a simple lab environment using 5 on-prem Hyper-V servers (the SmartHotelHost lab from GitHub). The business case export explains that I'll be saving over $100K annually once fully migrated into Azure after multiple years. (It's only 5 servers!) That said, I've been reluctant to suggest the Business Case tool and have steered clients toward the Azure Migrate assessment and the Azure Pricing Calculator, which have proven to be reliable tools. Does anyone have any experience with the business case? Was it accurate? Thanks a bunch! Rich
Events
Recent Blogs
- At Microsoft, we believe that meaningful open source participation is driven by people, not corporations. But companies can - and should - create the conditions that empower individuals to contribute... (Dec 16, 2025)
- (8 min read) Introduction: Extracting structured data from large, semi-structured documents (the detailed solution implementation overview and architecture is provided in this tech community blog: From Large Sem... (Dec 15, 2025)