pipeline
13 Topics

How to disable Azure DevOps Pipeline Parallelism
I have just created an organization in Azure DevOps on the Free Tier, and every time I try to create a pipeline (even the starter/empty one) I get this error:

##[error]No hosted parallelism has been purchased or granted. To request a free parallelism grant, please fill out the following form https://aka.ms/azpipelines-parallelism-request

I searched for a setting that would let me disable or deactivate parallelism, but I couldn't find anything useful. I tried demands: Limit -equals DisAbleParallel, but it didn't do anything; that probably only applies to self-hosted agents. I am not sure what to try next. I appreciate any help.
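Until the free parallelism grant is approved, one common workaround is to register a self-hosted agent and point the pipeline at its pool, because the error only applies to Microsoft-hosted agents. A minimal sketch, assuming a self-hosted agent is already registered in a pool named SelfHosted (the pool name is hypothetical):

```yaml
trigger:
  - main

pool:
  name: SelfHosted        # self-hosted pool instead of vmImage: ubuntu-latest

steps:
  - script: echo "Running without a hosted parallelism grant"
    displayName: Sanity check
```

The free tier includes one self-hosted parallel job, so this runs while the aka.ms request form is being processed.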
Validate changed files in last commit

Hi, I created the pipeline using https://marketplace.visualstudio.com/items?itemName=touchify.vsts-changed-files and also tested it manually. The problem is that when the pipeline executes, I need to validate only the new files, but Azure Pipelines performs a hard reset, so I don't have multiple commits available in the pipeline and can't work out which files differ in the new commit. How can I validate only the new files? My company has .NET repositories in Azure DevOps, and each repository has an SRC directory with multiple functions. Each function has its own pipeline, so I need to check which function was changed in order to run the right pipeline.
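One way to see what the triggering commit changed, without relying on the extension, is to keep enough git history in the checkout and diff the last commit directly. A minimal sketch (the fetchDepth value and the src/MyFunction path are assumptions for illustration):

```yaml
steps:
  - checkout: self
    fetchDepth: 2            # keep the previous commit so HEAD~1 exists

  - script: |
      # List files changed by the triggering commit
      CHANGED=$(git diff --name-only HEAD~1 HEAD)
      echo "Changed files:"
      echo "$CHANGED"
      # Only set a flag if something under src/MyFunction changed (path is hypothetical)
      if echo "$CHANGED" | grep -q '^src/MyFunction/'; then
        echo "##vso[task.setvariable variable=runFunctionBuild]true"
      fi
    displayName: Detect changed files
```

Later steps can then use a condition on the runFunctionBuild variable to decide whether that function's build should run.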
Pipeline Alias

Hello, I have a DevOps pipeline in which I need to download an artifact produced by another pipeline. I want to use something like this:

    - task: DownloadPipelineArtifact@2
      displayName: Download Pipeline Artifacts
      inputs:
        source: 'specific'
        project: '$(resources.pipeline.<alias>.projectID)'
        pipeline: '$(resources.pipeline.<alias>.pipelineID)'
        preferTriggeringPipeline: true
        ${{ if eq(parameters.buildPipelineId, 'latest') }}:
          buildVersionToDownload: 'latestFromBranch'
        ${{ if ne(parameters.buildPipelineId, 'latest') }}:
          buildVersionToDownload: 'specific'
          runId: '${{ parameters.buildPipelineId }}'
        runBranch: '$(Build.SourceBranch)'
        path: $(Pipeline.Workspace)/${{ parameters.artifactName }}

but I don't know which value I must insert instead of "alias". My source pipeline is named "1-TestPipeline" and is under the TEST folder in the DevOps Pipelines tab. Can you help me? Thanks, G
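For context, the alias is simply the name you give the pipeline resource in the resources block of your YAML; the folder the source pipeline sits in does not change it. A minimal sketch, where the alias buildPipeline is just an example name:

```yaml
resources:
  pipelines:
    - pipeline: buildPipeline      # <-- this is the alias
      source: '1-TestPipeline'     # display name of the source pipeline in Azure DevOps
      # project: 'OtherProject'    # only needed if the source pipeline lives in another project
```

With that declaration, the references in the snippet above would resolve as $(resources.pipeline.buildPipeline.projectID) and $(resources.pipeline.buildPipeline.pipelineID).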
Environment comparison

Hi, can anyone please help me understand how to compare two different environments in an Azure DevOps pipeline? Our scenario: a development team is responsible for all build and release pipeline setup and deployment in the Dev environment, and whatever changes are made in Dev need to be replicated to the other environments, such as QA and UAT, which are owned by another team. Currently we face challenges when running the release pipeline on QA and UAT based on the Dev environment: we sometimes miss variables or parameters that were used in Dev with Terraform. The deployment works well in Dev but fails in the other environments with the same source code and the same Terraform scripts, so we struggle with every code release from the Dev team. Is there any way to compare the Dev environment's variables, parameters, and pipeline configurations with QA and UAT and make sure everything matches without any discrepancy?
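There is no built-in "diff environments" feature, but one lightweight option is to export each environment's variable groups with the Azure DevOps CLI and diff the JSON before a release. A rough sketch, assuming the azure-devops CLI extension is available and the variable group names (App-Dev, App-QA) follow a naming convention — all names here are hypothetical:

```yaml
steps:
  - script: |
      # Requires: az extension add --name azure-devops, and a PAT exported as AZURE_DEVOPS_EXT_PAT
      ORG=https://dev.azure.com/MyOrg
      az pipelines variable-group list --org $ORG --project MyProject \
        --query "[?name=='App-Dev'].variables" -o json > dev-vars.json
      az pipelines variable-group list --org $ORG --project MyProject \
        --query "[?name=='App-QA'].variables" -o json > qa-vars.json
      diff dev-vars.json qa-vars.json || echo "Variable groups differ - review before release"
    displayName: Compare Dev and QA variable groups
```

The same idea extends to Terraform: keeping per-environment .tfvars files in the repository makes the differences between Dev, QA, and UAT reviewable in a pull request instead of being discovered at deployment time.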
Piping and the Pipeline

First published on TECHNET on Jun 06, 2010. Quick: what one feature truly separates Windows PowerShell from other scripting/shell languages? How many of you said "the pipeline"? Really? That many of you, huh? It's inevitable: no sooner do you get Windows PowerShell installed than you start hearing about "piping" and "the pipeline."
Enhancing Azure DevOps Traceability: Tracking Pipeline-Driven Resource Changes

Introduction

One of the key challenges in managing cloud infrastructure with Azure DevOps is maintaining clear traceability between pipeline executions and modifications to Azure resources. Unlike some integrated CI/CD platforms, Azure does not natively link resource changes to specific DevOps pipelines, making it difficult to track which deployment introduced a particular modification. To address this gap, two approaches have been presented that improve traceability and accountability:

1. Proactive tagging of resources with pipeline metadata during deployment.
2. Analyzing Azure Activity Logs to trace resource changes back to specific DevOps pipeline runs.

By implementing these solutions, organizations can enhance governance, streamline debugging, and ensure better auditability of Azure DevOps deployments.

Challenges in Traceability

Without a built-in integration between Azure DevOps pipelines and Azure resource modifications, teams face difficulties such as:

* Lack of direct ownership tracking for resource updates.
* Manual investigation of deployment logs, which is time-consuming.
* Difficulty in debugging incidents caused by misconfigured or unauthorized changes.

Approach 1: Proactive Tagging of Azure Resources in Pipelines

This approach embeds metadata directly into Azure resources by adding tags at deployment time. The tags include:

* Pipeline Name ('PipelineName')
* Pipeline ID ('PipelineId')
* Run ID ('RunId')

These tags serve as persistent markers, allowing teams to quickly identify which pipeline and build triggered a specific resource change. A link to the GitHub repository can be found here.

Implementation: a YAML pipeline file ('PipelineTaggingResources.yml') is used to automate tagging as part of the deployment process.

Key steps:
1. The pipeline triggers on updates to the main branch.
2. Tags are dynamically generated using environment variables.
3. Tags are applied to resources during deployment.
4. Pipeline logs and artifacts store metadata for verification.

Below is an example of applying tags in an Azure DevOps pipeline:

    - task: AzureCLI@2
      displayName: "Tag Resources with Pipeline Metadata"
      inputs:
        azureSubscription: "Your-Service-Connection"
        scriptType: bash
        scriptLocation: inlineScript
        inlineScript: |
          az tag create --resource-id "/subscriptions/xxxxx/resourceGroups/xxxx/providers/Microsoft.Web/sites/MyAppService" \
            --tags PipelineName=$(Build.DefinitionName) PipelineId=$(Build.DefinitionId) RunId=$(Build.BuildId)

Approach 2: Using Azure Activity Logs to Trace Resource Changes

This approach is useful after changes have already occurred. It analyzes Azure Activity Logs to track modifications and correlates them with Azure DevOps build logs.

Implementation: a PowerShell script ('ActivityLogsTrace.ps1') extracts relevant details from Azure Activity Logs and compares them with DevOps pipeline execution records.

Key steps:
1. Retrieve activity logs for a given resource group and timeframe.
2. Filter logs by operation type, e.g., 'Microsoft.Web/serverfarms/write'.
3. List Azure DevOps pipelines and their recent runs.
4. Compare pipeline logs with activity log metadata to find a match.
5. Output the responsible pipeline name and run ID.

Comparison of Approaches

Tagging Azure Resources in Pipelines
* When to use: before deployment (proactive).
* Advantages: immediate traceability; no need to query logs.
* Limitations: requires modifying deployment scripts.

Using Azure Activity Logs
* When to use: after changes have occurred (reactive).
* Advantages: works for any past modification; no need to modify pipelines.
* Limitations: requires Azure Monitor logs; manual correlation of logs.

Which Approach Should You Use?

* For new deployments: tagging resources is the best approach.
* For investigating past changes: use Azure Activity Logs.
* For a robust solution: combine both approaches for full traceability.

Conclusion

Tagging resources in Azure DevOps pipelines provides immediate traceability, while analyzing Azure Activity Logs enables retrospective identification of resource modifications. Tagging requires modifying deployment scripts, whereas log analysis works post-change but needs manual correlation. Combining both ensures robust traceability. For details, check the GitHub Repository.
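The article's Approach 2 script is PowerShell; as a rough illustration of the same correlation idea using the Azure CLI inside a pipeline-style task instead, where the resource group, operation name, organization, and project are placeholders rather than values from the repository:

```yaml
steps:
  - task: AzureCLI@2
    displayName: "Correlate activity log writes with recent pipeline runs"
    inputs:
      azureSubscription: "Your-Service-Connection"
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        # Steps 1-2: retrieve activity logs for the resource group and filter by operation type
        az monitor activity-log list \
          --resource-group MyResourceGroup --offset 7d \
          --query "[?operationName.value=='Microsoft.Web/serverfarms/write'].{time:eventTimestamp,caller:caller,op:operationName.value}" \
          -o table
        # Step 3: list recent pipeline runs (requires the azure-devops CLI extension)
        # so their timestamps can be compared against the activity log entries above
        az pipelines runs list \
          --org https://dev.azure.com/MyOrg --project MyProject --top 20 \
          --query "[].{id:id,pipeline:definition.name,finished:finishTime}" -o table
```

Matching an activity log timestamp and caller to a pipeline run's finish time is the manual correlation the comparison table refers to.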
Need help to set up a pipeline using tasks and YAML to deploy a BICEP template and use a param file

I am trying to set up a pipeline that deploys a Bicep template written by our cloud team and then uses a param file to pass in the application specifics, creating the Azure resource group and App Services for an application. Ideally I do not want to do this in PowerShell but in YAML, or even better using the tasks in the pipeline builder. Everything I have tried so far fails with errors. Has anyone done this before and have a proven method or any tips?
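One approach that avoids PowerShell entirely is an AzureCLI@2 step running a subscription-scoped deployment, which lets the Bicep file itself declare the resource group before the app services. A minimal sketch, where the service connection name, file paths, and location are assumptions:

```yaml
steps:
  - task: AzureCLI@2
    displayName: Deploy Bicep template with parameter file
    inputs:
      azureSubscription: 'MyServiceConnection'   # hypothetical service connection name
      scriptType: bash
      scriptLocation: inlineScript
      inlineScript: |
        # Subscription scope so the Bicep file can create the resource group itself
        az deployment sub create \
          --location westeurope \
          --template-file infra/main.bicep \
          --parameters infra/main.parameters.json
```

If the resource group already exists, az deployment group create --resource-group <name> with the same --template-file and --parameters switches works at resource-group scope instead.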
Azure DevOps Pipeline for Power Platform solution: Tenant to Tenant

I have a query related to Azure DevOps pipelines. Is it possible to move a solution from one tenant to another tenant, and is it possible to move a canvas app solution using Azure DevOps pipelines? If yes: I am using SharePoint lists. Can these be moved, or, since they won't be part of the solution, should that be done manually? I am also using environment variables; how will these be mapped in the receiving tenant using pipelines? I have a few solutions that I need to move from one tenant to the other every sprint, so what would be a suitable plan/license for this? Can you please share any relevant documents, steps, or related videos regarding this?
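For the solution file itself, the Power Platform Build Tools extension provides export and import tasks that can target service connections registered against different tenants. A rough sketch only — the task versions, input names, and connection/solution names below should be verified against the extension's documentation, and data such as SharePoint list contents is not carried inside the solution zip:

```yaml
steps:
  - task: PowerPlatformToolInstaller@2

  - task: PowerPlatformExportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'SourceTenantConnection'   # service connection in the source tenant
      SolutionName: 'MyCanvasSolution'
      SolutionOutputFile: '$(Build.ArtifactStagingDirectory)/MyCanvasSolution.zip'

  - task: PowerPlatformImportSolution@2
    inputs:
      authenticationType: 'PowerPlatformSPN'
      PowerPlatformSPN: 'TargetTenantConnection'   # service connection in the target tenant
      SolutionInputFile: '$(Build.ArtifactStagingDirectory)/MyCanvasSolution.zip'
      # Environment variable values and connection references are typically remapped via a
      # deployment settings file supplied to the import task; input names vary by task version.
```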
ADF Data Flow Fails with "Path does not resolve to any file" — Dynamic Parameters via Trigger

Hi guys, I'm running into an issue with my Azure Data Factory pipeline triggered by a blob event. The trigger passes dynamic folderPath and fileName values into a parameterized dataset and mapping data flow. Everything works perfectly when I debug the pipeline manually, or when I trigger it manually and pass in the values for folderPath and fileName directly. However, when the pipeline is triggered automatically by the blob event, the data flow fails with the following error:

Job failed due to reason: at Source 'CSVsource': Path /financials/V02/Forecast/ForecastSampleV02.csv does not resolve to any file(s). Please make sure the file/folder exists and is not hidden. At the same time, please ensure special character is not included in file/folder name, for example, name starting with _

I've verified that the blob file exists, the trigger fires correctly and passes parameters, the path looks valid, and the dataset is parameterized correctly with @dataset().folderPath and @dataset().fileName. I've attached screenshots of:

🔵 00-Pipeline Trigger Configuration On Blob creation
🔵 01-Trigger Parameters
🔵 02-Pipeline Parameters
🔵 03-Data flow Parameters
🔵 04-Data flow Parameters without default value
🔵 05-Data flow CSVsource parameters
🔵 06-Data flow Source Dataset
🔵 07-Data flow Source dataset Parameters
🔵 08-Data flow Source Parameters
🔵 09-Parameters passed to the pipeline from the trigger
🔵 10-Data flow error message

https://primeinnovativetechnologies-my.sharepoint.com/:b:/g/personal/john_primeinntech_com/EYoH5Sm_GaFGgvGAOEpbdXQB7QJFeXvbFmCbZiW85PwrNA?e=0yjeJR

What could be causing the data flow to fail on file path resolution only when triggered, even though the exact same parameters succeed during manual debug runs? Could this be related to:

* Extra slashes or encoding in the trigger output?
* Misuse of @dataset().folderPath and fileName in the dataset?
* Limitations in how blob trigger outputs are parsed?

Any insights would be appreciated! Thank you
dynamic pipeline and deployments with approvals

I am looking to create a multistage YAML pipeline in Azure DevOps that:

* Dynamically generates jobs based on the value(s) of a variable from a previous step (e.g., host names).
* Ensures that each dynamically created job includes its own approval step, either through ManualApproval@0 or via an environment.

The challenge I am facing is that while "strategy: matrix" allows for the dynamic creation of build jobs, this strategy is not permitted for deployment jobs, which are necessary for implementing approval steps. Do you have any suggestions on how to resolve this issue?
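For comparison, deployment jobs can be fanned out at template-compile time with a ${{ each }} loop over an object parameter, with each generated job targeting an environment that carries the approval. The caveat is that compile-time expansion cannot consume a variable set by an earlier step at runtime, so the host list has to be supplied as a parameter (or passed to a second pipeline that is triggered with it). A minimal sketch with hypothetical host and environment names:

```yaml
parameters:
  - name: hosts
    type: object
    default: [web01, web02]     # hypothetical host names; must be known at compile time

stages:
  - stage: Deploy
    jobs:
      - ${{ each host in parameters.hosts }}:
          - deployment: deploy_${{ host }}
            displayName: Deploy to ${{ host }}
            environment: prod             # approvals/checks are configured on this environment
            strategy:
              runOnce:
                deploy:
                  steps:
                    - script: echo "Deploying to ${{ host }}"
```

Each generated deployment job then waits on the environment's approval independently, which matrix-based build jobs cannot do.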