Recent Discussions
Azure Pipeline
Greetings, I have created a simple website using Visual Studio and I am looking to deploy it to Azure services via an Azure Pipeline. Unfortunately, I am experiencing errors during the deployment phase of my YAML pipeline and I require some assistance. If you have a moment, could you kindly respond to coordinate a Zoom meeting where we can review the issue together? Thank you kindly.
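Since the failing YAML is not included in the post, here is only a minimal illustrative sketch of what a deployment step to Azure App Service can look like; the service connection, app name, and package path below are placeholders, not values from the original post:

steps:
  - task: AzureWebApp@1
    inputs:
      azureSubscription: 'my-service-connection'   # placeholder service connection name
      appType: 'webApp'
      appName: 'my-sample-website'                 # placeholder App Service name
      package: '$(Pipeline.Workspace)/drop/*.zip'  # placeholder path to the published package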
Azure Load Test Pricing
Hi, this is regarding Azure Load Testing pricing. Please advise.
Virtual User Hour (VUH) usage:
0 - 10,000 Virtual User Hours - $0.15/VUH
10,000+ Virtual User Hours - $0.06/VUH
I am trying to understand the above pricing. Let's say I want to run a test with 10k users that just log in to my website. This will take at most 10 seconds to complete. How will the pricing be calculated? Regards, Sharukh
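A rough worked example, under the assumption that VUH is billed simply as virtual users multiplied by test duration in hours (the official pricing page should be checked for minimum durations, base fees, or rounding rules, which are not covered here):

10 seconds = 10 / 3600 ≈ 0.00278 hours
10,000 users × 0.00278 hours ≈ 27.8 VUH
27.8 VUH × $0.15/VUH ≈ $4.17 (all usage falls within the first tier)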
Release management on Azure DevOps dashboard
Hello, everyone! I've been struggling to communicate feature releases. We have three product teams, and I am responsible for releasing features to approximately 16 countries. As a result, the key account managers and relationship managers of the various regions constantly ping me to ask about the release dates for their region and which regions will be released on a given date. Since we use Azure DevOps (ADO), how can I set up boards for each region that show upcoming features and their release dates? For example, a board should show them the progress and tell them that a feature will launch on a specific date. Can you tell me how I can increase visibility using ADO boards to support release management? Thanks,
Changing the Backlog Iteration
Hello, all. We're thinking of creating a new backlog iteration for our team and setting it as the default backlog. We would keep the "old" backlog with closed stories, etc. intact. Would doing something like this pose any risks to historical sprint data, or are there unintended consequences that should be considered?
Service Discovery in Azure: Dynamically Finding Service Instances
Modern cloud-native applications are built from microservices: independently deployable units that must communicate with each other to form a cohesive system. In dynamic environments like Azure Kubernetes Service (AKS), Azure App Service, or Azure Container Apps, service instances can scale up, scale down, or move across nodes at any time. This creates a challenge: how do services reliably find and talk to each other without hardcoding IP addresses or endpoints? The answer lies in the Service Discovery architecture pattern.
https://dellenny.com/service-discovery-in-azure-dynamically-finding-service-instances/
"Hello everyone, I'm starting a discussion to gather insights on a critical topic: security and governance for Azure Integration Services (AIS). As environments grow with dozens of Logic Apps, Functions, APIM instances, etc., it becomes harder to maintain a strong security posture. I’d like to hear from your experience: What are the most common security and governance blind spots people miss when building out their integration platforms on Azure? To get us started, here are a few areas I'm thinking about: Secret Management: Beyond just "use Key Vault," what are the subtle mistakes or challenges teams face? Network Security: How critical is VNet integration and the use of Private Endpoints for services like Service Bus and Storage Accounts in your opinion? When is it overkill? Monitoring & Observability: What are the best ways to get a single, unified view of a business transaction that flows through multiple Azure services for security auditing? Looking forward to a great discussion and learning from the community's collective experience!"31Views0likes0CommentsProblem with output variables in Self Hosted Agent
Problem with output variables in Self Hosted Agent
I have the following sample code that I run with Azure DevOps pipelines and it works without problems; I can see the value of the SAUCE variable in all the tasks in my job. Then I run it using a self-hosted agent that I have on an Azure virtual machine and it no longer works; for some reason the value of the variable is lost.

- job: TestOutputVars
  displayName: Test Output Variables
  steps:
    - bash: |
        echo "##vso[task.setvariable variable=sauce;isOutput=true]crushed tomatoes"
        echo "my environment variable is $SAUCE"
    - bash: |
        echo "my environment variable is $SAUCE"
    - task: PowerShell@2
      inputs:
        targetType: "inline"
        script: |
          Write-Host "my environment variable is $env:SAUCE"
        pwsh: true

What could be the problem, or should I do some specific configuration on the virtual machine I am using as the agent?
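A side note offered only as an illustrative sketch, not a confirmed diagnosis of the self-hosted behavior: when a variable is set with isOutput=true, later steps in the same job reference it through the name of the step that set it, so the setting step needs a name. The step name below (setSauceStep) is hypothetical:

- bash: |
    echo "##vso[task.setvariable variable=sauce;isOutput=true]crushed tomatoes"
  name: setSauceStep
- bash: |
    echo "my environment variable is $(setSauceStep.sauce)"

Without isOutput=true, the plain $SAUCE / $(sauce) form is the usual way to read the variable in subsequent steps.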
Delete Old IRM Labels
Hi, we just started using Purview and I want to set up Sensitivity Labels to protect information. Currently there are no sensitivity labels set up or visible in Purview. However, in Office apps I can still see some Rights Management protection labels which were set up 10 years ago or more. I think these may have been set using AD RMS in our online Microsoft domain, but I am not sure. They were never used and there are no documents protected using these labels. (To explain where I can see them: in Excel, for example, they are listed under File > Protect Workbook > Restrict Access.) I would like to get rid of these old labels so we can start clean using new sensitivity labels in Purview, but I can't find them listed anywhere and I can't find any articles that seem to cover this. I would be very grateful if anyone could explain how to list and hopefully delete these old labels so we can start fresh. Many thanks.
Create tilelayer using a GDAL Tiler tilematrix set
I'm porting an application from Bing Maps which used many tilesets generated using the GDAL tiler. In the Bing Maps API these layers needed a custom URL which modified the y component. So where Azure Maps uses the option tileUrl: url, I need something like the following, which works in Bing:

uriConstructor: function (coord) {
    // coord is a PyramidTileId with: x, y, zoom, pixwidth, pixheight.
    var ymax = 1 << coord.zoom;
    var y = ymax - coord.y - 1;
    return url.replace('{x}', coord.x).replace('{y}', y).replace('{z}', coord.zoom);
}

I do so hope there is someone out there with some cool code to get me out of a very tight spot. Thanks, Steve
Custom Windows Server Standard VM on Azure: It Works, But Is It Licensing Compliant?
Hi everyone, I wanted to share a recent technical experience where I successfully created and deployed a Windows Server Standard VM on Azure using a fully custom image. I started by downloading the official Windows Server Standard Evaluation ISO. I created a Generation 2 VM in Hyper-V and completed the OS setup using the Desktop Experience edition. Once the configuration was done, I ran sysprep to generalize the image. After that, I converted the disk from VHDX to VHD in fixed format, which turned out to be a critical step because Azure does not accept dynamic disks. The resulting file was around 127 GB, so I uploaded it to a premium storage account container to ensure performance. From there, I created a Generation 2 image in Azure and deployed a new VM from it. I then activated the Standard edition with a valid product key. Everything worked smoothly, but I'm still unsure whether this method is fully compliant with Microsoft's licensing policies. Specifically, I'm trying to understand whether going from an Evaluation ISO to sysprep, upload, deployment, and activation in Azure is a valid and compliant scenario when not using BYOL with Software Assurance or a CSP license. Has anyone gone through this process, or does anyone have insights on the compliance aspect? Thanks in advance for any guidance or clarification.
Pipeline is running test cases twice after moving to new folder
Hi, I'm running test cases (written in SpecFlow/C#) on an Azure Pipeline connected to an Azure agent on an on-premises PC. I've recently moved some feature files (test cases) out to a subfolder, and it works great when I run locally. But when I run the same code in the pipeline, the moved test cases are run twice. I've disabled some test cases (by commenting out the content in the feature file) and run the pipeline, which strangely still runs the disabled test cases (but only once this time). It seems that Azure uses some cached data when running and doesn't really pick up all committed changes. Has anyone seen this problem before?
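Purely as an illustrative sketch and not a confirmed fix for this case: when a self-hosted agent appears to build or test against stale content, one thing that is sometimes worth ruling out is leftover state in the agent's work folder, for example by forcing a clean checkout:

steps:
  - checkout: self
    clean: true   # cleans the local repo on the agent before fetching; assumes the default checkout step is in use

Manually clearing the agent's work directory (or the build output folders of the test project) serves a similar purpose.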
Enhance YAML Pipelines 'Changes' API Endpoint to allow user to specify the 'artifact'
There exists an API endpoint that allows a user to request the changes for a Classic Release run and pass the Artifact Alias that they would like the changes from: https://vsrm.dev.azure.com/{collection}/{project}/_apis/Release/releases/{releaseId}/changes?artifactAlias={artifactAlias}
If a team replicates this type of pattern in YAML pipelines where there is a build pipeline (or multiple) and then a multi-stage YAML 'release' pipeline, there is no way to get the changes from the build artifacts. You can request the changes from the API for the multi-stage 'release' pipeline, but you cannot get the changes for the build pipelines IN THE CONTEXT OF the currently releasing stage. The build artifacts are specified in the release pipeline in the resources section:

resources:
  pipelines:
    - pipeline: myBuildPipeline
      source: myBuildPipeline
      trigger:
        branches:
          include:
            - main
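As a hypothetical partial workaround rather than a substitute for the requested feature (the variable and endpoint should be verified against current documentation): the run ID of a consumed pipeline resource is exposed at runtime, so a step in the releasing stage can look up the changes of that specific build run via the Build Changes endpoint:

steps:
  - script: |
      echo "Consumed build run ID: $(resources.pipeline.myBuildPipeline.runID)"
      echo "Changes for that run: https://dev.azure.com/{organization}/{project}/_apis/build/builds/$(resources.pipeline.myBuildPipeline.runID)/changes?api-version=7.0"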
Error: spawn C:\hostedtoolcache\windows\terraform\1.5.5\x64\terraform.exe ENOENT
2023-08-21T19:27:27.5568244Z ##[error]Error: There was an error when attempting to execute the process 'C:\hostedtoolcache\windows\terraform\1.5.5\x64\terraform.exe'. This may indicate the process failed to start. Error: spawn C:\hostedtoolcache\windows\terraform\1.5.5\x64\terraform.exe ENOENT
The Azure DevOps pipeline fails with the above error. I renamed my artifact and then ran the pipeline; the path is correct. Even when a new run is started, or changes are saved in GitHub to trigger the pipeline, it fails. Please help.
How to connect Tableau to Azure DevOps test result tables?
Hi all, I need to connect to the test results (database/tables) in Azure DevOps 2022. I found this: https://analytics.dev.azure.com/<account>/<Project>/_odata/v2.0/ but I can connect only to project tables, not the ones holding the results of executed test cases. I also found that CDATA drivers might help, but spending money is not an option. Do you know if there is a way to connect to the test tables, maybe by connecting to the tfs_warehouse database or some other database? I can see only these tables: Areas, BoardLocations, Dates, Iterations, Processes, Projects, Tags, Teams, Users, WorkItemBoardSnapshot, WorkItemLinks, WorkItemRevisions, WorkItems, WorkItemSnapshot, WorkItemTypeFields. Thanks, Jorge
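One avenue that may be worth checking, hedged because the availability of test entities depends on the Analytics version of the Azure DevOps instance: newer OData versions expose more entity sets than v2.0, and the metadata document lists everything that is actually available, for example:

https://analytics.dev.azure.com/<account>/<Project>/_odata/v4.0-preview/$metadata

If entity sets such as TestRuns show up there, Tableau's built-in OData connector can be pointed at them directly, without paid third-party drivers.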
Update Test Configuration after Testrun
Hello, I am trying to update/change the Test Configuration of a test run. Via the REST API it is possible to PATCH the test result, where I can find the configuration ID. I could successfully update the comment afterwards with https://dev.azure.com/{organization}/{project}/_apis/test/runs/{testRunID}/Results?api-version=7.0

[
  {
    "id": 100000,
    "comment": "Website theme is looking good"
  }
]

But when updating the configuration, only the revision increments; the configuration doesn't change:

[
  {
    "id": 100000,
    "configuration": {
      "id": "193"
    }
  }
]

Is this field changeable, or is it read-only? According to the Microsoft documentation (https://learn.microsoft.com/en-us/rest/api/azure/devops/test/results/update?view=azure-devops-rest-7.0&tabs=HTTP#testcaseresult), the configuration is part of the test result and it should be possible to update it. Any ideas on this?
AZ-900 exam free voucher Azure
Hello, I recently finished Azure Fundamentals (https://learn.microsoft.com/en-us/credentials/certifications/azure-fundamentals/?practice-assessment-type=certification). But I see that to take the final exam and then claim the certificate, you need to pay around $50-$90 depending on region. So my question is: are there currently any completely (100%) free vouchers for that exam? I have been looking for one for around a month, still no clue. Please help, and best regards to you, Microsoft Learners.
Building a Fully Secure Architecture Integrating Azure OpenAI
As AI adoption accelerates, organizations must ensure that AI services are secure, scalable, and compliant with enterprise security policies. Azure OpenAI Service provides powerful AI capabilities, but securing access to it is crucial when integrating with applications. In this blog, we will explore how to build a fully secure architecture by integrating Azure OpenAI Service with Azure API Management (APIM), Private Endpoints, and Applications.
https://dellenny.com/building-a-fully-secure-architecture-integrating-azure-openai-with-apim-private-endpoints-and-applications/
Extracting Information from PDFs and Storing in a Database Using Azure AI Services
Handling documents efficiently is a critical requirement for many businesses. Extracting structured data from PDF files and storing it in a database can streamline operations in finance, legal, healthcare, and other industries. Azure AI Services provides robust tools for automating this process, including Azure AI Document Intelligence (formerly Form Recognizer) and Azure Cognitive Services. In this blog, we'll walk through how to: read a PDF document, extract relevant data, and store the extracted information in a database.
https://dellenny.com/extracting-information-from-pdfs-and-storing-in-a-database-using-azure-ai-services/
Recent Blogs
- What’s New? The updated experience introduces a dedicated App Service Quota blade in the Azure portal, offering a streamlined and intuitive interface to: View current usage and limits across th... (Sep 04, 2025)
- When we think about human intelligence, memory is one of the first things that comes to mind. It’s what enables us to learn from our experiences, adapt to new situations, and make more informed decis... (Sep 04, 2025)