AzureFunBytes Reminder - @Azure Data Factory Security with @narainabhishek - 5/20/2021

AzureFunBytes is a weekly opportunity to learn more about the fundamentals and foundations that make up Azure. It's a chance for me to understand more about what people across the Azure organization do and how they do it. Every week we get together at 11 AM Pacific on Microsoft LearnTV and learn more about Azure.

This is part two of our series on Azure Data Factory. Last time, Mark Kromer helped get us on the road to understanding how best to get our data into the cloud using the linked services and tools in Azure Data Factory. Data Factory contains a series of interconnected systems that provide a complete end-to-end platform for data engineers. Utilizing our data requires some thoughtfulness when it comes to security.

This week on AzureFunBytes, Senior Program Manager Abhishek Narain will help us learn more about security best practices for data engineers.

When securing your data pipeline, there are some configurations and settings suggested by Azure that you should follow. You'll want to follow this security baseline, which applies guidance from the Azure Security Benchmark version 1.0 to Azure Data Factory. You'll also want to protect the credentials for your data endpoints so that there's no unauthorized access from credentials floating around in the wild. There are also network security considerations to adhere to for the various data stores accessed by Azure Data Factory, whether they live in the cloud or on-premises.
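One pattern we'll likely touch on for that credential piece is keeping secrets in Azure Key Vault and having your linked services reference them rather than embedding connection strings. Below is a minimal sketch of that idea, assuming the azure-identity and azure-mgmt-datafactory Python packages; the subscription ID, resource group, factory, vault URL, and secret name are all placeholders, and exact model signatures can vary a bit between SDK versions.

```python
# Minimal sketch: reference an Azure Key Vault secret from a Data Factory linked
# service so no connection string is stored in the pipeline definition itself.
# All resource names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    AzureKeyVaultLinkedService,
    AzureKeyVaultSecretReference,
    AzureSqlDatabaseLinkedService,
    LinkedServiceReference,
    LinkedServiceResource,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")
rg_name, df_name = "rg-demo", "adf-demo"

# 1) A linked service that points Data Factory at the Key Vault itself.
kv_ls = LinkedServiceResource(
    properties=AzureKeyVaultLinkedService(base_url="https://kv-demo.vault.azure.net/")
)
adf_client.linked_services.create_or_update(rg_name, df_name, "AzureKeyVaultLS", kv_ls)

# 2) A SQL linked service whose connection string is resolved from a Key Vault
#    secret at runtime instead of being saved with the factory.
sql_ls = LinkedServiceResource(
    properties=AzureSqlDatabaseLinkedService(
        connection_string=AzureKeyVaultSecretReference(
            store=LinkedServiceReference(
                type="LinkedServiceReference", reference_name="AzureKeyVaultLS"
            ),
            secret_name="sql-connection-string",
        )
    )
)
adf_client.linked_services.create_or_update(rg_name, df_name, "AzureSqlLS", sql_ls)
```

Where the data store supports it, managed identity authentication is another way to avoid stored credentials entirely.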

When: May 20, 2021 11 AM Pacific / 2 PM Eastern
Where: Microsoft LearnTV

Our Agenda:

  • Authentication
  • Metadata encryption (metadata at rest)
  • Credential management
  • Data in transit
  • Data at rest
  • Network Security
  • Azure Policy integration

From the Azure Documentation on security considerations:

Azure Data Factory including Azure Integration Runtime and Self-hosted Integration Runtime does not store any temporary data, cache data or logs except for linked service credentials for cloud data stores, which are encrypted by using certificates. With Data Factory, you create data-driven workflows to orchestrate movement of data between supported data stores, and processing of data by using compute services in other regions or in an on-premises environment. You can also monitor and manage workflows by using SDKs and Azure Monitor.
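As a companion to that last sentence, here's a small, hypothetical sketch of the "monitor and manage workflows by using SDKs" part: querying the last day of pipeline runs for a factory with the azure-mgmt-datafactory Python package (resource names are placeholders).

```python
# Hypothetical sketch: list the last 24 hours of pipeline runs for a factory.
from datetime import datetime, timedelta

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

filters = RunFilterParameters(
    last_updated_after=datetime.utcnow() - timedelta(days=1),
    last_updated_before=datetime.utcnow(),
)
runs = adf_client.pipeline_runs.query_by_factory("rg-demo", "adf-demo", filters)
for run in runs.value:
    print(run.pipeline_name, run.status, run.run_start)
```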

So let's connect and collect together and talk about big data security! We'll take questions from the audience and try to understand how all the pieces fit together to gain important insights that drive our businesses.


Learn about Azure fundamentals with me!

The live stream is available on Twitch, YouTube, and LearnTV at 11 AM PT / 2 PM ET on Thursdays. You can also find the recordings here:

AzureFunBytes on Twitch
AzureFunBytes on YouTube
Azure DevOps YouTube Channel
Follow AzureFunBytes on Twitter

Useful Docs:

AzureFunBytes Episode 43 - Intro to @Azure Data Factory with @KromerBigData
Get $200 in free Azure Credit
Microsoft Learn: Introduction to Azure fundamentals
Azure security baseline for Azure Data Factory
Security considerations for data movement in Azure Data Factory
What is Data Factory?
Data access strategies
Azure Key Vault secrets in pipeline activities
Azure Policy documentation
Compute environments supported by Azure Data Factory
