We are happy to announce that the public preview of capturing Event Hubs data in Delta Lake format with the Stream Analytics no-code editor is now available.
Delta Lake is an open-source storage layer that brings reliability to data lakes. Delta Lake extends Parquet data files with a file-based transaction log to provide ACID transactions, scalable metadata handling, and unified streaming and batch data processing. The Delta Lake transaction log has a well-defined open protocol that any system can use to read the log.
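In practice, the transaction log is a sequence of JSON files under the table's `_delta_log/` directory, where each line is one "action". As an illustrative sketch (the commit below is hand-written for demonstration, not real captured output), any system can parse a commit to discover which Parquet data files currently make up the table:

```python
import json

# A hand-written example of one Delta transaction log commit file
# (e.g. _delta_log/00000000000000000000.json). Each line is one action.
commit = """\
{"commitInfo": {"timestamp": 1700000000000, "operation": "WRITE"}}
{"add": {"path": "part-00000-abc.snappy.parquet", "size": 1024, "modificationTime": 1700000000000, "dataChange": true}}
{"add": {"path": "part-00001-def.snappy.parquet", "size": 2048, "modificationTime": 1700000000000, "dataChange": true}}
"""

# Collect the Parquet data files referenced by "add" actions.
data_files = []
for line in commit.splitlines():
    action = json.loads(line)
    if "add" in action:
        data_files.append(action["add"]["path"])

print(data_files)
```

Because every commit is an ordinary file append, readers and writers from different engines can agree on the table state without a central metastore.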
The Stream Analytics no-code editor is a drag-and-drop design tool that helps customers develop Stream Analytics jobs without writing a single line of code. The experience provides a canvas that lets you connect input sources to quickly see your streaming data. You can then transform and preview the data before writing it to your destination of choice in Azure. To learn more, see No-code stream processing through Azure Stream Analytics | Microsoft Learn.
To capture your event hub data into ADLS Gen2 storage in Delta Lake format, you need an Azure Event Hubs resource and an Azure Data Lake Storage Gen2 account.
To access this capability, go to your event hub in the Azure portal and select Features -> Process data or Capture:
You will be asked to provide a Stream Analytics job name to create the job. Once the job is created, you will enter the no-code editor canvas to configure data capture as follows.
Select the Azure Data Lake Storage Gen2 tile. On the Azure Data Lake Storage Gen2 configuration page, you can configure the necessary parameters, including the "Delta table path", the ADLS Gen2 account, and so on.
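For orientation, the captured Delta table lands at the "Delta table path" inside the container you select. A minimal sketch of how the resulting ADLS Gen2 URI is composed (the account, container, and table path names here are hypothetical placeholders):

```python
# Hypothetical values; substitute your own ADLS Gen2 account and container.
account = "mystorageaccount"
container = "mycontainer"
delta_table_path = "delta/capture-events"  # the "Delta table path" setting

# ADLS Gen2 tables are addressed with the abfss:// URI scheme:
# abfss://<container>@<account>.dfs.core.windows.net/<path>
table_uri = f"abfss://{container}@{account}.dfs.core.windows.net/{delta_table_path}"
print(table_uri)
```

Any Delta-compatible engine can then point at that URI to read the captured data.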
Once everything is configured the way you want, start the job by selecting "Start" on the ribbon. Data will begin to be captured into ADLS Gen2 storage in Delta Lake format shortly.
To learn more about the Stream Analytics no-code editor and the Delta Lake output of Stream Analytics, see the documents below: