devops
172 Topics

Has anyone here integrated JIRA with Azure DevOps?
We are currently using Azure Pipelines for our deployment process and Azure Boards to track issues and tickets. However, our company recently decided to move the ticketing system to JIRA, and I have been tasked with integrating JIRA with Azure DevOps. If you have done something similar, I would appreciate any guidance, best practices, or things to watch out for.

How to deploy n8n on Azure App Service and leverage the benefits provided by Azure
Lately, n8n has been gaining serious traction in the automation world, and it's easy to see why. With its open-source core, visual workflow builder, and endless integration capabilities, it has become a favorite for developers and tech teams looking to automate processes without being locked into a single vendor. Given all the buzz, I thought it would be the perfect time to share a practical way to run n8n on Microsoft Azure using App Service. Why? Because Azure offers a solid, scalable, and secure platform that makes deployment easy, while still giving you full control over your container and configurations. Whether you're building a quick demo or setting up a production-ready instance, Azure App Service brings a lot of advantages to the table, like simplified scaling, integrated monitoring, built-in security features, and seamless CI/CD support. In this post, I'll walk you through how to get your own n8n instance up and running on Azure, from creating the resource group to setting up environment variables and deploying the container. If you're into low-code automation and cloud-native solutions, this is a great way to combine both worlds.

The first step is to create our Resource Group (RG); in my case, I will name it "n8n-rg".

Now we proceed to create the App Service. At this point, it's important to select the appropriate configuration depending on your needs, for example, whether or not you want to include a database. If you choose to include one, Azure will handle the connections for you, and you can select from various types. In my case, I will proceed without a database.

Proceed to configure the instance details. First, select the instance name, the 'Publish' option, and the 'Operating System'. In this case, it is important to choose 'Publish: Container', set the operating system to Linux, and most importantly select the region closest to you or your clients.

Service Plan configuration. Here, you should select the plan based on your specific needs.
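If you prefer scripting over the portal, the same resources can be created with the Azure CLI. This is a minimal sketch, not part of the original walkthrough: it assumes the "n8n-rg" resource group from this post, a Basic B1 Linux plan, and the public n8nio/n8n image on Docker Hub; the app and plan names and the region are placeholders you should adjust.

```shell
# Create the resource group (name taken from the walkthrough; region is an example)
az group create --name n8n-rg --location westeurope

# Create a Linux App Service plan on the Basic B1 tier
az appservice plan create \
  --name n8n-plan \
  --resource-group n8n-rg \
  --is-linux \
  --sku B1

# Create the Web App from the public n8n container image on Docker Hub
az webapp create \
  --name my-n8n-app \
  --resource-group n8n-rg \
  --plan n8n-plan \
  --deployment-container-image-name docker.io/n8nio/n8n
```

Note that Web App names must be globally unique, so `my-n8n-app` will need to be changed to something of your own.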
Keep in mind that we are using a PaaS offering, which means that underlying compute resources like CPU and RAM are still being utilized. Depending on the expected workload, you can choose the most appropriate plan. Secondly, and very importantly, consider the features offered by each tier, such as redundancy, backup, autoscaling, custom domains, etc. In my case, I will use the Basic B1 plan.

In the Database section, we do not select any option. Remember that this will depend on your specific requirements.

In the Container section, under 'Image Source', select 'Other container registries'. For production environments, I recommend using Azure Container Registry (ACR) and pulling the n8n image from there.

Now we will configure the Docker Hub options. This step is related to the previous one, as the available options vary depending on the image source. In our case, we will use the public n8n image from Docker Hub, so we select 'Public' and proceed to fill in the required fields: the first being the server, and the second the image name. This step is very important: use the exact same values to avoid issues.

In the Networking section, we will select the values as shown in the image. This configuration will depend on your specific use case, particularly whether to enable Virtual Network (VNet) integration or not. VNet integration is typically used when the App Service needs to securely communicate with private resources (such as databases, APIs, or services) that reside within an Azure Virtual Network. Since this is a demo environment, we will leave the default settings without enabling VNet integration.

In the 'Monitoring and Security' section, it is essential to enable these features to ensure traceability, observability, and additional security layers. This is considered a minimum requirement in production environments. At the very least, make sure to enable Application Insights by selecting 'Yes'.
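The walkthrough goes on to set environment variables through the portal; for reference, the same settings can be applied from the Azure CLI. The variable names below are assumptions drawn from the n8n documentation and general App Service conventions, not from this post — verify them against your own requirements before use.

```shell
# App settings become environment variables inside the container.
# WEBSITES_ENABLE_APP_SERVICE_STORAGE and WEBSITES_PORT are App Service
# conventions; the N8N_* names come from the n8n docs. All values here
# are illustrative.
az webapp config appsettings set \
  --name my-n8n-app \
  --resource-group n8n-rg \
  --settings \
    WEBSITES_ENABLE_APP_SERVICE_STORAGE=true \
    WEBSITES_PORT=5678 \
    N8N_PORT=5678 \
    N8N_PROTOCOL=https \
    GENERIC_TIMEZONE=Europe/Madrid
```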
Finally, click on 'Create' and wait for the deployment process to complete.

Now we will 'stop' our Web App, as we need to make some preliminary modifications. To do this, go to the main overview page of the Web App and click on 'Stop'.

In the same Web App overview page, navigate through the left-hand panel to the 'Settings' section. Once there, click on it and select 'Environment Variables'. Environment variables are key-value pairs used to configure the behavior of your application without changing the source code. In the case of n8n, they are essential for defining authentication, webhook behavior, port configuration, timezone settings, and more. Environment variables within Azure, specifically in Web Apps, function the same way as they do outside of Azure. In this case, we will add the following variables required for n8n to operate properly. Note: the variable APP_SERVICE_STORAGE should only be modified by setting it to true.

Once the environment variables have been added, proceed to save them by clicking 'Apply' and confirming the changes. A confirmation dialog will appear to finalize the operation.

Restart the Web App. This second startup may take longer than usual, typically around 5 to 7 minutes, as the environment initializes with the new configuration.

Now, as we can see, the application has loaded successfully, and we can start using our own n8n server hosted on Azure. As you can observe, it references the host configured in the App Service.

I hope you found this guide helpful and that it serves as a useful resource for deploying n8n on Azure App Service. If you have any questions or need further clarification, feel free to reach out: I'd be happy to help.

Kickstart Conditional Access in Microsoft Entra: Free Starter Pack with Policies & Automation
Introduction

Conditional Access (CA) is the backbone of Zero Trust in Microsoft Entra ID. It helps you enforce security without compromising productivity. But rolling out CA can feel risky: what if you lock out admins or break apps? To make this easier, I've created a free starter pack with:

Ready-to-use policy templates (JSON)
PowerShell scripts for deployment via Microsoft Graph
GitHub Actions workflow for automation
Safe rollout strategy using report-only mode

Why This Matters

Block legacy authentication to reduce attack surface.
Require MFA for admins to protect privileged accounts.
Handle high-risk sign-ins with compliant device + MFA.
Validate impact before enforcing using report-only mode.

What's Inside the Starter Pack

✔ Policies
Block legacy authentication
Require MFA for admin roles
High-risk sign-ins → compliant device + MFA
Safety-net report-only baseline

✔ Scripts
Deploy policies (deploy-conditional-access.ps1)
Export existing policies
Toggle report-only mode

✔ Automation
GitHub Actions workflow for CI/CD deployment

✔ Docs
Usage guide
Safe rollout checklist

How to Use It

Download the repo. GitHub Repo: https://github.com/soaeb7007/entra-ca-starter-pack

Install the Microsoft Graph PowerShell SDK:

Install-Module Microsoft.Graph -Scope CurrentUser
Connect-MgGraph -Scopes 'Policy.ReadWrite.ConditionalAccess','Directory.Read.All'
Select-MgProfile -Name beta

Deploy policies in report-only mode:

./scripts/deploy-conditional-access.ps1 -PolicyPath ./policies -ReportOnly

Validate impact in the Sign-in logs before enforcing.

Safe Rollout Checklist

Exclude break-glass accounts
Start with report-only
Validate for 48–72 hours
Roll out to a pilot group before going org-wide

Next Steps

Enable report-only mode for new policies.
Explore Conditional Access templates in the Entra portal.
Watch for my next post: "Optimizing Conditional Access for Performance and Security."

What's your biggest challenge with Conditional Access?
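As a concrete illustration of what one of the JSON policy templates can look like, here is a hedged sketch of a block-legacy-authentication policy in the shape the Microsoft Graph conditional access API uses. The field values are illustrative (the excluded account ID is a placeholder), and the actual templates in the repo may differ.

```json
{
  "displayName": "Block legacy authentication (report-only)",
  "state": "enabledForReportingButNotEnforced",
  "conditions": {
    "users": {
      "includeUsers": ["All"],
      "excludeUsers": ["<break-glass-account-object-id>"]
    },
    "applications": { "includeApplications": ["All"] },
    "clientAppTypes": ["exchangeActiveSync", "other"]
  },
  "grantControls": {
    "operator": "OR",
    "builtInControls": ["block"]
  }
}
```

The `state` value shown keeps the policy in report-only mode, which is the safe starting point recommended in the checklist below.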
Drop it in the comments; I'll cover the top 3 in my next post.

[ADO] Work Items custom states
Hello. We're shifting to ADO as a project management tool (not for software development and delivery, at least not directly). I'm creating our own custom process (inheriting from Agile) and was wondering if there is any way for us (my team) to have custom workflow states created by default, instead of having to do it manually for each work item type we create. This is what we would like to have, and as we have several WI types to create, it would make our lives easier and possibly future-proof things in case we need to make changes (add or edit, via delete+add, more WIs). Thank you.

Azure DevOps sometimes doesn't trigger builds for GitHub PRs
Hello Community, I have an Azure Pipelines plugin installed on my application's GitHub repo. We should have a CI build triggered for each PR and when those PRs are merged into specific branches. The issue is that sometimes the build gets triggered for the PR commits, but when the PR is merged into the branch, the build is not triggered, and I have to run it manually.

Built a Real-Time Azure AI + AKS + DevOps Project – Looking for Feedback
Hi everyone, I recently completed a real-time project using Microsoft Azure services to build a cloud-native healthcare monitoring system. The key services used include:

Azure AI (Cognitive Services, OpenAI)
Azure Kubernetes Service (AKS)
Azure DevOps and GitHub Actions
Azure Monitor, Key Vault, API Management, and others

The project focuses on real-time health risk prediction using simulated sensor data. It's built with containerized microservices, infrastructure as code, and end-to-end automation. GitHub link (with source code and documentation): https://github.com/kavin3021/AI-Driven-Predictive-Healthcare-Ecosystem

I would really appreciate your feedback or suggestions to improve the solution. Thank you!

Scaling Smart with Azure: Architecture That Works
Hi Tech Community! I'm Zainab, currently based in Abu Dhabi and serving as Vice President of Finance & HR at Hoddz Trends LLC, a global tech solutions company headquartered in Arkansas, USA. While I lead on strategy, people, and financials, I also roll up my sleeves when it comes to tech innovation. In this discussion, I want to explore the real-world challenges of scaling systems with Microsoft Azure. From choosing the right architecture to optimizing performance and cost, I'll be sharing insights drawn from experience, and I'd love to hear yours too. Whether you're building from scratch, migrating legacy systems, or refining deployments, let's talk about what actually works.

In Azure DevOps, how to view all child work items for a dependent feature?
I have an Epic with a few features. In one of the features, I have a user story that has a related link to other features. Is it possible to see all the features, their child user stories, and the tasks and bugs that are open for all the associated features?

Epic -> Feature 1 -> User stories -> Tasks, Bugs
Epic -> Feature 2 -> User stories -> Tasks, Bugs
Epic -> Feature 3 -> User stories -> Tasks, Bugs
Epic -> Feature 4 -> Special User story (with relation to Features 1, 2) -> Tasks, Bugs

In the view, I want to see all the features (that are associated with the Special User story) and their children:

Epic -> Feature 1 -> User stories, Tasks, Bugs
Epic -> Feature 2 -> User stories, Tasks, Bugs

Azure Pipelines extension doesn't give any error, and can't show dynamic dropdown values
I have created an extension, pushed it to the Marketplace, and then used it in my org; all of this went smoothly. Then, when I started building a pipeline, at the TASK step, when I choose my extension, it populates the fields that have pre-defined options, but when it comes to the dynamic one it says "Not Found", i.e. empty.

Details: the custom step has 3 fields.
Category - Cars [pre-defined option list]
Color - Blue [pre-defined option list]
Car List - this uses the endpoint https://gist.githubusercontent.com/Satyabsai/b3970e2c3d229de2c70f1def3007ccfc/raw/02dc6f7979a83287adcb6eeecddb5575bef3516e/data.json

******************** TASK.JSON file ****************************

{
  "id": "d9bafed3-2b89-4a4e-89b8-21a3d8a4f1d3",
  "name": "TestExecutor",
  "friendlyName": "Execute ",
  "description": "Executes",
  "helpMarkDown": "",
  "category": "Test",
  "author": "s4legen",
  "version": { "Major": 5, "Minor": 4, "Patch": 0 },
  "instanceNameFormat": "Execute Test Suite $(carlist)",
  "inputs": [
    {
      "name": "category",
      "type": "pickList",
      "defaultValue": "Web",
      "label": "Category",
      "required": true,
      "helpMarkDown": "Select the ",
      "options": {
        "mobile": "car",
        "web": "truck",
        "api": "Plan"
      }
    },
    {
      "name": "Color",
      "type": "pickList",
      "defaultValue": "Blue",
      "label": "color",
      "required": true,
      "helpMarkDown": "Select the ",
      "options": {
        "nonProd": "Blue",
        "prod": "Red"
      }
    },
    {
      "name": "Carlist",
      "type": "pickList",
      "defaultValue": "BMWX7",
      "label": "Carlist",
      "required": true,
      "helpMarkDown": "Select the list to execute",
      "properties": {
        "EditableOptions": "true",
        "Selectable": "true",
        "Id": "CarInput"
      },
      "loadOptions": {
        "endpointUrl": "https://gist.githubusercontent.com/Satyabsai/b3970e2c3d229de2c70f1def3007ccfc/raw/02dc6f7979a83287adcb6eeecddb5575bef3516e/data.json",
        "resultSelector": "jsonpath:$[*]",
        "itemPattern": "{ \"value\": \"{value}\", \"displayValue\": \"{displayValue}\" }"
      }
    }
  ],
  "execution": {
    "Node16": {
      "target": "index.js"
    }
  },
  "messages": {
    "TestSuiteLoadFailed": "Failed to load test from endpoint. Using default options."
  }
}

******************** INDEX.JS *************************

const tl = require('azure-pipelines-task-lib/task');
const axios = require('axios');

const TEST_ENDPOINT = 'https://gist.githubusercontent.com/Satyabsai/b3970e2c3d229de2c70f1def3007ccfc/raw/02dc6f7979a83287adcb6eeecddb5575bef3516e/data.json';

async function getValue(field) {
  if (field === 'Carlist') {
    try {
      const response = await axios.get(TEST_ENDPOINT, { timeout: 5000 });
      return {
        options: response.data.map(item => ({
          value: item.value,
          displayValue: item.displayValue
        })),
        properties: { "id": "CarlistDropdown" }
      };
    } catch (error) {
      // Key must match the "messages" entry in task.json
      tl.warning(tl.loc('TestSuiteLoadFailed'));
    }
  }
  return null;
}

async function run() {
  try {
    const color = tl.getInput('Color', true);
    const category = tl.getInput('category', true);
    const carlist = tl.getInput('Carlist', true);
    // Post the selected inputs (the original posted undefined variables)
    const result = await axios.post(tl.getInput('clicQaEndpoint'), {
      carlist, category, color
    }, { timeout: 10000 });
    tl.setResult(tl.TaskResult.Succeeded, `Execution ID: ${result.data.executionId}`);
  } catch (err) {
    tl.setResult(tl.TaskResult.Failed, err.message);
  }
}

module.exports = { run, getValue };

********************

Can someone tell me what JSON response is acceptable to Azure for populating a dropdown dynamically when the source is an API?

SSL certificate problem while doing Git pull from Azure DevOps Repos
We are using a proxy server that does SSL inspection of traffic and thus replaces the cert with one that it issues in the process. That cert is issued by the certificate authority on the proxy itself. This is fairly common with modern proxies. But users are getting the following error while doing a Git pull:

"git pull fatal: unable to access 'https://ausgov.visualstudio.com/Project/_git/Repo': SSL certificate problem: self-signed certificate in certificate chain"

Do I need to import the proxy CA issuing cert somewhere in the DevOps portal to resolve this, or does the SSL inspection need to be removed? Has anybody got it to work with proxy inspection still turned on?
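One common approach for this situation, rather than importing anything into the DevOps portal, is to export the proxy's issuing CA certificate and point Git at it on the client side, so Git trusts the re-signed chain. A sketch, assuming the CA has been exported to a local PEM file (the path is a placeholder):

```shell
# Point Git at a CA bundle that includes the proxy's issuing certificate.
# The path below stands in for wherever you export the proxy CA.
git config --global http.sslCAInfo /path/to/proxy-ca-bundle.pem

# Alternatively, scope the trust to just the Azure DevOps host:
git config --global http.https://ausgov.visualstudio.com/.sslCAInfo /path/to/proxy-ca-bundle.pem
```

Whether this is preferable to excluding the DevOps host from SSL inspection at the proxy is a policy call; both are used in practice.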