Azure Event Hubs update – Q4 2022
Published Oct 12 2022 09:30 AM
Microsoft

Azure Event Hubs enables you to stream millions of events per second from any source using Kafka, AMQP or HTTPS protocols and build end-to-end event streaming pipelines.
We are excited to announce the availability of several new features and major enhancements in Azure Event Hubs, which will be available in Q4 of calendar year 2022.

 

Log Compaction – Public Preview

With the log compaction feature, you can create compacted event hubs or compacted Kafka topics that retain events using key-based retention. By default, each event hub/Kafka topic is created with time-based retention, where events are purged upon the expiration of the retention time. Rather than using coarser-grained time-based retention, you can create a compacted event hub so that the last known value for each event key of an event hub or a Kafka topic is retained.

 

[Figure: how log compaction works]

Log compaction is enabled at the event hub level, and the triggering of the compaction job is determined by the Event Hubs service based on the size of the dirty (uncompacted) portion of the event log. Events are compacted based on the partition keys that the producer application sets on each event via either the Event Hubs SDK (AMQP) or Apache Kafka client APIs.
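As a minimal local sketch of the retention semantics only (not the service's implementation), key-based compaction keeps just the last observed value per event key:

```python
from collections import OrderedDict

def compact(log):
    """Simulate key-based compaction: keep only the last event per key,
    preserving the order in which keys were last written."""
    compacted = OrderedDict()
    for key, value in log:
        # A later event for the same key supersedes the earlier one.
        compacted.pop(key, None)
        compacted[key] = value
    return list(compacted.items())

# Time-based retention would keep all five events until expiry;
# key-based compaction retains only the latest value per key.
log = [("sensor-1", 20), ("sensor-2", 31), ("sensor-1", 22),
       ("sensor-2", 29), ("sensor-1", 25)]
print(compact(log))  # [('sensor-2', 29), ('sensor-1', 25)]
```

The same idea applies per partition in a real compacted event hub: consumers that read the compacted log from the beginning still see at least the latest value for every key.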

 

The log compaction feature is available in the Premium and Dedicated SKUs only.

 

Resource governance with Application Groups - GA

When you stream events to and from Azure Event Hubs using different event streaming applications, you often need to govern those workloads individually. For instance, you may need to prioritize certain event streaming workloads while throttling others beyond a certain throughput limit. Fine-grained governance of client application workloads that stream data to and from Event Hubs is therefore a common business requirement.


 

Using application groups, you can create logical groupings of client applications that connect to (publish or consume events from) Event Hubs, and apply throttling and data access policies per group. You can associate an application group with a uniquely identifiable condition, such as the security context of the client application (its shared access signature (SAS) policy or Azure Active Directory (Azure AD) application ID), either at the Event Hubs namespace level or at the event hub/Kafka topic level.

 

[Figure: adding an application group]

By defining an application group and using application group policies, you can throttle low-priority producers or consumers while allowing high-priority producers or consumers to stream data without interruption. You can also completely cut off the traffic flowing through an application group: simply disable the group so that no client application in that group can connect to the Event Hubs namespace, and all existing connections are terminated.
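The behavior described above can be illustrated with a simple local model (an illustration only; the actual policies are enforced server-side by the Event Hubs service, and the class and quota names here are hypothetical):

```python
class ApplicationGroup:
    """Illustrative model of an application group with a per-window
    message quota (throttling) and an enabled/disabled switch."""

    def __init__(self, name, max_messages, enabled=True):
        self.name = name
        self.max_messages = max_messages  # quota for the current window
        self.enabled = enabled
        self.count = 0  # messages accepted in the current window

    def try_publish(self):
        # A disabled group rejects every request and accepts no connections.
        if not self.enabled:
            return False
        # Beyond the quota, this group's traffic is throttled.
        if self.count >= self.max_messages:
            return False
        self.count += 1
        return True

high = ApplicationGroup("dashboards", max_messages=1000)   # high priority
low = ApplicationGroup("batch-loaders", max_messages=2)    # low priority

print([low.try_publish() for _ in range(3)])  # [True, True, False]
low.enabled = False
print(low.try_publish())  # False - a disabled group accepts no traffic
```

In the real service, throttling policies are expressed against ingress/egress metrics rather than a simple counter, but the per-group isolation works the same way: one group hitting its limit does not affect the others.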

 

For more information on Application Groups, see Resource governance with application groups.

 

Apache Kafka protocol version 3.2 compatibility 

With Azure Event Hubs you can migrate any existing Apache Kafka workloads that run on-premises or on other managed Kafka services to Event Hubs with zero code changes. Owing to the multi-protocol broker engine architecture of Event Hubs, we can run Kafka workloads in Event Hubs without internally hosting Apache Kafka brokers. Therefore, with Event Hubs you can run Kafka workloads with better performance and more cost-efficiently than with existing managed Kafka offerings or operating Kafka yourself.
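Migrating a Kafka client is typically a configuration-only change along these lines, pointing the client at the namespace's Kafka endpoint (the namespace name and key values below are placeholders):

```properties
# Point an existing Kafka client at an Event Hubs namespace (Kafka endpoint, port 9093)
bootstrap.servers=mynamespace.servicebus.windows.net:9093
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
# Username is the literal string "$ConnectionString"; the password is the
# namespace connection string (SharedAccessKeyName/SharedAccessKey elided here)
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
    username="$ConnectionString" \
    password="Endpoint=sb://mynamespace.servicebus.windows.net/;SharedAccessKeyName=...;SharedAccessKey=...";
```

The producer and consumer code itself stays unchanged; only the connection configuration differs from a self-hosted Kafka cluster.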

[Figure: Apache Kafka 3.2 compatibility]

We are excited to announce Apache Kafka version 3.2 compatibility in Azure Event Hubs. Along with this, we are adding support for Kafka compacted topics (preview), static Kafka group coordinator support, and support for the offset delete API.

There are certain Kafka features that we don’t support at the moment, such as transactions and compression. You can read more about them and the recommended alternatives here.

 

Event processing with built-in no-code editor - GA

In most event streaming scenarios, you need to build an event streaming pipeline. With Event Hubs, you can process real-time data streams using our integration with Azure Stream Analytics. The built-in no-code editor lets you create streaming pipelines with an intuitive graphical drag-and-drop tool, without writing a single line of code.

[Figure: no-code editor - review and connect]

 

With the GA of the no-code editor, we are introducing several new built-in scenarios for processing event streams in Event Hubs, including:

  • Enrich data and ingest to event hub
  • Transform and store data to Azure SQL database
  • Filter and ingest to Azure Data Explorer

The figure below shows the full list of pre-built scenarios that the Azure Stream Analytics no-code editor offers.

[Figure: pre-built processing scenario templates in Event Hubs]

You can find more details on the no-code editor for stream processing at: No-code stream processing through Azure Stream Analytics

Enhanced front-end load balancing

When you produce or consume events from Event Hubs, the traffic goes through the front-end layer of Event Hubs (known as the Gateway), which is responsible for routing traffic to the corresponding topic partition, protocol canonicalization, security enforcement, and other networking-related features.
Considering the importance of the gateway to overall system performance, we have made significant investments in improving its performance.
As an Event Hubs user, you don’t have to do anything to take advantage of this capability; the enhancements are automatically available on all existing namespaces and clusters across all SKUs.

 

Schema Registry support in Event Hubs Spark Connector

The Azure Event Hubs connector for Apache Spark offers the ability to process huge amounts of streaming data using Spark Streaming and Structured Streaming. We now support using the Event Hubs Spark connector with Azure Schema Registry, so you can use schema-driven event formats in your Spark streaming use cases.

Samples of using the Event Hubs Spark connector with Azure Schema Registry are available.

 

Enforce a minimum TLS version in an Event Hubs namespace - GA

Azure Event Hubs supports choosing a specific TLS version for namespaces. Currently, Azure Event Hubs uses TLS 1.2 on public endpoints by default, but TLS 1.0 and TLS 1.1 are still supported for backward compatibility.
To enforce stricter security measures, you can now configure your Event Hubs namespace to require that clients send and receive data with a newer version of TLS. If an Event Hubs namespace requires a minimum version of TLS, then any requests made with an older version will fail.
You can specify the minimum TLS version when creating an Event Hubs namespace in the Azure portal.
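Beyond the portal, the setting can also be applied at deployment time. As a sketch, an ARM template fragment for the namespace might look like the following (the namespace name and location are placeholders; verify the property name and API version against the current namespace resource schema):

```json
{
  "type": "Microsoft.EventHub/namespaces",
  "apiVersion": "2022-01-01-preview",
  "name": "mynamespace",
  "location": "westus",
  "sku": { "name": "Standard", "tier": "Standard" },
  "properties": {
    "minimumTlsVersion": "1.2"
  }
}
```

With this in place, clients attempting to connect with TLS 1.0 or 1.1 have their requests rejected, per the enforcement behavior described above.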

For more information on configuring the minimum TLS version in Event Hubs, see Enforce a minimum TLS version in Event Hubs.

 

 

Version history
Last update: Nov 16 2022 10:20 AM