orchestration
10 Topics

Data Factory Increases Maximum Activities Per Pipeline to 80
This week we doubled the limit on the number of activities you can define in a pipeline, from 40 to 80. With more freedom to develop, we want to empower you to create more powerful, versatile, and resilient data pipelines for all your business needs. We are excited to see what you come up with, harnessing the power of 40 more activities per pipeline!

Orchestrating ADX queries using managed Airflow
Apache Airflow is a widely used task orchestration framework that gained its popularity thanks to its Python-based programmatic interface - the first-choice language of data engineers and data ops teams. The framework allows you to define complex pipelines that move data between different components, potentially implemented using different technologies. This article shows how to set up a managed instance of Apache Airflow and define a very simple DAG (directed acyclic graph) of tasks that does the following:

- Uses an Azure registered application to authenticate with the ADX cluster.
- Schedules daily execution of a simple KQL query that calculates HTTP error statistics from web log records for the last day.

Prerequisites for completing the steps of this tutorial:

- A Microsoft Azure account.
- A running Azure Data Explorer (ADX) cluster.
What is Zero Trust and is it the Future of Cybersecurity?
Zero Trust is a security architecture that assumes the network is already infiltrated and applies multi-factor authentication, least-privilege access, and real-time monitoring and analytics to secure digital assets. The Zero Trust model goes beyond the traditional perimeter-based security approach and focuses on protecting identities, endpoints, applications, data, infrastructure, and networks. By adopting a Zero Trust model, organizations can better protect themselves from cyber-attacks and data breaches while remaining compliant and productive. Zero Trust is the future of cybersecurity because it emphasizes visibility, automation, and orchestration. It is also the key to securing your digital estate.

Pipeline Logic 3: Error Handling and Try Catch
In this series on Orchestration, we dive deep to understand conditional execution in ADF and build complex logic, such as how to execute a shared error-handling step for any failure in the pipeline, how to add informative logging on a best-effort basis, and how to ensure all dependencies succeed before proceeding to the next steps.
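The try-catch pattern this post describes can be sketched outside ADF as plain Python control flow (the function and activity names here are illustrative stand-ins for pipeline activities, not ADF APIs):

```python
# Illustrative analogue of the ADF try-catch pattern: route to a shared
# error-handling step when the main activity fails, then re-raise so the
# pipeline run is still reported as Failed.
def run_with_error_handler(activity, error_handler):
    try:
        return activity()
    except Exception as exc:
        error_handler(exc)  # shared error-handling step (e.g. alerting)
        raise               # preserve the pipeline's Failed status

def copy_data():
    raise RuntimeError("source unavailable")

captured = []
try:
    run_with_error_handler(copy_data, lambda exc: captured.append(str(exc)))
except RuntimeError:
    pass  # the failure still surfaces after the handler ran

# captured == ["source unavailable"]
```

The key design point mirrored here is that the error handler observes the failure without masking it, so monitoring still sees a failed run.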

Pipeline Logic 1: Error Handling and Best Effort Step
In this series on Orchestration, we dive deep to better understand conditional execution in ADF and build complex logic: how to execute a shared error-handling step for any failure in the pipeline, how to add informative logging on a best-effort basis, and how to ensure all dependencies succeed before proceeding to the next steps.
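The best-effort idea - a logging step whose failure must not fail the pipeline - can be sketched in plain Python (names are illustrative):

```python
# A non-critical step runs on a best-effort basis: its failure is
# recorded but swallowed, so downstream activities still execute.
def best_effort(step):
    try:
        step()
        return True
    except Exception:
        return False

def flaky_logger():
    raise IOError("log sink unavailable")

logged = best_effort(flaky_logger)  # False: the logging step failed
downstream_ran = True               # ...but the pipeline carried on
```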

Pipeline Logic 2: OR (at least 1 activity succeeded or failed)
In this series on Orchestration, we dive deep to understand conditional execution in ADF and build complex logic, such as how to execute a shared error-handling step for any failure in the pipeline, how to add informative logging on a best-effort basis, and how to ensure all dependencies succeed before proceeding to the next steps.
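The OR condition from this post's title - run a downstream step when at least one upstream activity succeeded - maps to a simple any() check in plain Python (the activities here are illustrative):

```python
# Each upstream activity resolves to Succeeded or Failed; the downstream
# step fires as soon as at least one of them succeeded.
def run(activity):
    try:
        activity()
        return "Succeeded"
    except Exception:
        return "Failed"

results = [run(lambda: None), run(lambda: 1 / 0)]  # one success, one failure
downstream_ran = any(r == "Succeeded" for r in results)
# results == ["Succeeded", "Failed"]; downstream_ran is True
```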