This Azure Messaging and Stream Processing update covers recent releases from Azure’s messaging and streaming services: Azure Service Bus, Azure Event Grid, Azure Event Hubs, Azure Stream Analytics, and Azure Relay. It covers releases from Q2 of this calendar year.
Listen to a fantastic introductory talk about queues by our Messaging Architect, Clemens Vasters, who presented at the WeAreDevelopers World Congress.
Communication between a client application and an Azure Service Bus namespace is encrypted using Transport Layer Security (TLS). Transport Layer Security is a standard cryptographic protocol that ensures privacy and data integrity between clients and services over the Internet.
Azure Service Bus now supports choosing a specific TLS version for namespaces. Currently, Azure Service Bus uses TLS 1.2 on public endpoints by default, but TLS 1.0 and TLS 1.1 are still supported for backward compatibility.
Azure Service Bus namespaces permit clients to send and receive data with TLS 1.0 and above. To enforce stricter security measures, you can configure your Service Bus namespace to require that clients send and receive data with a newer version of TLS. If a Service Bus namespace requires a minimum version of TLS, then any requests made with an older version will fail.
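For example, enforcing a minimum TLS version can be done declaratively. The sketch below is a minimal ARM template resource fragment, assuming a hypothetical namespace name and region; the `minimumTlsVersion` property is what configures the enforcement described above.

```json
{
  "type": "Microsoft.ServiceBus/namespaces",
  "apiVersion": "2022-01-01-preview",
  "name": "contoso-sb-namespace",
  "location": "eastus",
  "sku": { "name": "Standard" },
  "properties": {
    "minimumTlsVersion": "1.2"
  }
}
```

With this in place, clients attempting to connect with TLS 1.0 or 1.1 are rejected.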
During Build, we announced the preview of the completely revamped Service Bus Explorer tool in the Azure portal. Azure Service Bus supports two types of operations: management (control plane) operations, such as creating or configuring a queue, and data operations, such as sending and receiving messages.
While we have offered a portal-based Service Bus Explorer for data operations for a while now, our customers have given us feedback that the experience was still lacking compared to the community-managed Service Bus Explorer OSS tool.
To empower our customers even further, we have released a new version of Service Bus Explorer that brings many new capabilities for working with messages right from the portal. For example, it is now possible to send, receive, and peek messages on queues, topics, and subscriptions, including their dead-letter sub-queues. The tool also lets you complete, defer, and re-send messages, and these operations can be performed on a single message or on multiple messages at once.
Event Grid now features new event sources through its integration with Microsoft Graph API (MGA). Through this partnership, Microsoft Graph API publishes events from the following event sources: Azure Active Directory, Microsoft Outlook, Microsoft Conversations, Microsoft Teams, Microsoft SharePoint and OneDrive, and Security Alerts. Learn more.
Partner Events is now generally available. This feature allows non-Azure services, such as SaaS services, generic platforms, or ERP providers, to integrate with customer applications using events. Those services, which we refer to as partners, publish their events so that customers’ solutions can react to them. Customers configure Event Grid so that those events are delivered to an event handler, which processes the event. Learn more.
Partner Events includes the following noteworthy functionality that is new or whose behavior has changed since the last API release.
Upon a customer’s request, partners create a partner topic to which events are routed. To strengthen customers’ security posture, this GA version requires that a customer authorize a partner before the partner can create a partner topic on the customer’s Azure subscription. Without that authorization, any attempt by the partner to create a partner topic (channel) fails. Learn more.
Partners that wish to be publicly available on Event Grid’s partner gallery need to go through a verification process that validates, among other things, their identity. This feature assures customers that they are dealing with a vetted partner organization. Learn more.
Partners route events to customers using channels. Channels now support source-based routing, a kind of event routing based on the event’s source context attribute. Learn more.
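The idea behind source-based routing can be sketched in a few lines. This is a conceptual illustration only (not the Event Grid service or SDK): the channel inspects the CloudEvents `source` attribute and picks the partner topic to deliver to. The source values and topic names are hypothetical.

```python
# Conceptual sketch of source-based routing: map the CloudEvents "source"
# attribute to a destination partner topic, the way a channel would.
ROUTES = {
    "/partners/crm/orders": "customer-a-partner-topic",
    "/partners/crm/invoices": "customer-b-partner-topic",
}

def route_event(event: dict) -> str:
    """Pick a destination partner topic from the event's source attribute."""
    source = event.get("source")
    try:
        return ROUTES[source]
    except KeyError:
        raise ValueError(f"no channel configured for source {source!r}")

event = {"specversion": "1.0", "type": "Contoso.Orders.Created",
         "source": "/partners/crm/orders", "id": "123"}
print(route_event(event))  # → customer-a-partner-topic
```

Events whose source has no configured route are rejected rather than silently dropped, which mirrors the fact that a channel must exist for events to flow.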
Starting with control plane API version 2022-06-15, Event Channels are no longer available as a resource type. Wherever you had Event Channels, you now see Channels, the equivalent resource type that functionally replaces them. Channels offer additional event routing capabilities on top of those that Event Channels offered. You do not need to recreate your Event Channels as Channels, but you do need to update your deployment scripts so that any ARM IDs currently referring to Event Channels refer to Channels instead.
Partners now require a partner registration when creating a partner namespace. Partner registration is required as Event Grid now tracks partner authorizations using partner registration immutable IDs.
Event Grid now offers a simpler partner registration user experience with a reduced number of properties required to create a partner registration. Functionality is not affected as the properties no longer present in a partner registration are now tracked by partner verifications.
In-line event definitions allow partners to declare the event types they are making available on a partner topic. With the availability of this feature, a user subscribing to events that originate in a partner’s system can more easily configure Event Grid so that only certain event types are delivered to a destination. If provided by the publishing partner, a user can also obtain more information about the events. For example, a user may get the event schemas and links to the events’ documentation. In-line events are now available for channels/partner topics. To learn more, consult this article section and look for steps where in-line event definitions are added to a channel.
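Filtering on those declared event types happens in the event subscription. The fragment below is a minimal sketch of an event subscription body, assuming hypothetical event type names and a placeholder webhook endpoint; `includedEventTypes` is the filter property that restricts delivery to the selected types.

```json
{
  "properties": {
    "destination": {
      "endpointType": "WebHook",
      "properties": { "endpointUrl": "https://contoso.example/events" }
    },
    "filter": {
      "includedEventTypes": [
        "Contoso.Orders.Created",
        "Contoso.Orders.Shipped"
      ]
    }
  }
}
```

Only events of the listed types are delivered to the destination; all other event types published to the partner topic are filtered out.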
Event subscription configuration is now included when exporting ARM templates for Topics and Domains.
A new GA control plane API version 2022-06-15 has been released. New functionality released with this API version includes all the functionality described above.
Azure Event Grid now has a new landing page, which aims to provide a simpler user experience for new and seasoned Azure Event Grid customers alike.
SAP applications events can now be routed to Event Grid. Requirements for receiving events from SAP include having Business Technology Platform (BTP) on your SAP installation. BTP includes SAP Event Mesh, which is the broker used to route events to Azure Event Grid. If you are interested in receiving SAP applications events on Azure, please email us at email@example.com to participate in our private preview.
Retirement of Event Grid on IoT Edge is in progress now. It will be fully retired on March 31st, 2023. Learn more.
Azure Event Grid now supports delivering events to a specific event hub partition which is configured using custom delivery properties. Learn more.
Azure Functions now support CloudEvents schema when using Event Grid triggers. Learn more.
Azure Event Grid now supports events originating from Storage Accounts that use DNS zone endpoints. There are no new functional differences or new event types. You configure system topics and event subscriptions the same way you do when using any Azure service. Learn more.
Your applications now have a way to determine the input event schema expected by Topics and Domains using HTTP Options method. Learn more.
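A minimal sketch of such a discovery call, using only the Python standard library: it builds (but does not send) an HTTP OPTIONS request against a topic endpoint. The endpoint URL is a placeholder; actually sending the request requires a real topic and valid credentials.

```python
# Sketch: build an HTTP OPTIONS request against a topic endpoint to discover
# the input event schema it expects. The URL below is a placeholder.
import urllib.request

topic_endpoint = "https://contoso-topic.eastus-1.eventgrid.azure.net/api/events"
req = urllib.request.Request(topic_endpoint, method="OPTIONS")
print(req.get_method())  # OPTIONS
# resp = urllib.request.urlopen(req)  # inspect the response for the schema
```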
Azure Health Data Services events are now available through Azure Event Grid. Learn more.
You can now independently throttle your event streaming workloads based on client application information (such as a SAS policy name or an Azure AD application ID) using application groups in Azure Event Hubs.
Using application groups, you can create logical groupings of the client applications that connect to Event Hubs (to publish or consume events) and apply throttling and data access policies to them.
Each application group must have a uniquely identifying condition associated with it, such as the security context of the client application: a shared access signature (SAS) policy name or an Azure Active Directory (Azure AD) application ID. Each group can have multiple policies, such as throttling policies, attached to it, and the workloads that correspond to that application group must adhere to the policies defined under it.
To learn more and try out application groups, read the Azure Event Hubs documentation on application groups.
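The behavior can be pictured with a small sketch. This is a conceptual illustration only (not how the Event Hubs service is implemented): a client’s security context selects an application group, and a requests-per-second throttling policy attached to that group is enforced. The group name, policy values, and SAS policy name are all hypothetical.

```python
# Conceptual sketch: map a client's security context (SAS policy name or
# AAD application ID) to an application group and enforce a simple
# requests-per-second throttling policy attached to that group.
from collections import defaultdict, deque

GROUPS = {
    "sas-policy-dashboards": {"name": "dashboard-apps", "max_requests_per_sec": 2},
}

_recent = defaultdict(deque)  # group name -> timestamps of recent requests

def allow_request(client_security_id: str, now: float) -> bool:
    group = GROUPS.get(client_security_id)
    if group is None:
        return True  # client belongs to no group; no group policy applies
    window = _recent[group["name"]]
    while window and now - window[0] >= 1.0:
        window.popleft()  # forget requests older than the 1-second window
    if len(window) >= group["max_requests_per_sec"]:
        return False  # over the group's quota: throttle this request
    window.append(now)
    return True

print([allow_request("sas-policy-dashboards", t) for t in (0.0, 0.1, 0.2)])
# → [True, True, False]
```

Note how throttling applies per group, not per connection: all clients sharing the SAS policy share the quota.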
Azure Event Hubs Premium is designed for event streaming workloads that require consistently low latency and highly performant event ingestion.
We conducted a performance benchmarking test for Azure Event Hubs Premium and analyzed how it performs with Apache Kafka and AMQP workloads, focusing on two performance metrics: end-to-end event streaming latency and event publishing latency.
We published the latest performance benchmarking results for Azure Event Hubs Premium where we were able to achieve predictable 10ms end-to-end latency with triple replication across Availability Zones for both Kafka and native AMQP workloads.
You can find the details in the performance benchmarking report: Benchmarking Azure Event Hubs Premium for Kafka and AMQP workloads - Microsoft Tech Community.
One of the most popular use cases for Azure Event Hubs is using the Event Hubs Capture feature to load real-time streaming data into data lakes, warehouses, and other storage services, so that the stored data can be processed or analyzed by analytics services.
We have now expanded the Capture capability by adding support for the Apache Parquet format in addition to the existing Avro format.
Using Azure Event Hubs’ no code editor for event processing, you can automatically capture streaming data in an Azure Data Lake Storage Gen2 account in Parquet format.
The no code editor allows you to easily develop an Azure Stream Analytics job without writing a single line of code.
Now you can monitor and audit data plane interactions of your client applications with Event Hubs using runtime audit logs and application metrics logs.
Using runtime audit logs, you can capture aggregated diagnostic information for all data plane access operations, such as publishing or consuming events.
Application metrics logs capture aggregated data on certain runtime metrics (such as consumer lag and active connections) related to the client applications connected to Event Hubs. You can therefore use application metrics logs to monitor runtime metrics such as consumer lag or active connections for a given client application.
Runtime audit logs are enabled via a diagnostic setting and are created by periodically aggregating the log of data plane operations. Learn more about runtime logs of Azure Event Hubs at Monitoring Azure Event Hubs - Azure Event Hubs | Microsoft Docs.
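To make the consumer-lag metric concrete, here is a small illustrative sketch (not the service’s implementation): lag for a partition is the gap between the last sequence number enqueued by producers and the last sequence number the consumer has checkpointed. The numbers are hypothetical.

```python
# Illustrative only: consumer lag per partition is the distance between the
# newest enqueued event and the consumer's checkpoint. The service computes
# and reports this for you in application metrics logs.
def consumer_lag(last_enqueued_seq: int, last_checkpointed_seq: int) -> int:
    return max(0, last_enqueued_seq - last_checkpointed_seq)

print(consumer_lag(last_enqueued_seq=1500, last_checkpointed_seq=1420))  # 80
print(consumer_lag(last_enqueued_seq=1500, last_checkpointed_seq=1500))  # 0
```

A steadily growing lag signals that a client application is not keeping up with the incoming event rate.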
Azure Schema Registry is a feature of Event Hubs, which provides a central repository for schema documents for event-driven and messaging-centric applications. Azure Schema Registry allows the producer and consumer applications to exchange data without having to manage and share the schema.
You can find more information about Schema Registry for Azure Event Hubs at Azure Schema Registry in Azure Event Hubs - Azure Event Hubs | Microsoft Docs.
A dedicated Event Hubs cluster is the preferred choice of many customers who want to run single-tenant event streaming use cases with massive throughput, sub-second latency, and complete tenant isolation.
Previously, scaling one of these dedicated clusters required you to create a support ticket and wait until we completed the scaling. Dedicated clusters now support self-serve scaling, where you can manually scale the capacity units allocated to the cluster up or down.
Self-serve scalable clusters are currently available in a limited number of regions, and we plan to expand their availability to other regions soon.
The Stream Analytics no code editor provides a drag-and-drop experience for developing your Stream Analytics jobs. This greatly reduces the barrier to entry for common stream processing scenarios such as streaming ETL, ingestion, and materializing data to Azure Cosmos DB, and you can get started directly in the Azure Event Hubs instance that has your real-time data streams.
Stream Analytics jobs are configured to run with a certain number of Streaming Units (SUs), an abstraction for the amount of compute and memory available to the job. In the real world, streaming data is produced by various sources and has unpredictable patterns. With the release of native autoscale capability in Azure Stream Analytics (Preview), you can define the minimum and maximum SUs along with the conditions for autoscaling, and Stream Analytics will dynamically optimize the number of SUs needed for your workload, allowing you to save on costs without compromising performance. You can learn more in this blog.
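The min/max clamp at the heart of autoscale can be sketched as follows. This is a conceptual illustration only; the utilization thresholds and step size are hypothetical and are not the service’s actual algorithm, but the clamping to the configured SU range is the key behavior.

```python
# Conceptual sketch of autoscale: pick the next SU count from a utilization
# signal, always clamped to the user-defined [min_su, max_su] range.
def next_su_count(current: int, utilization_pct: float,
                  min_su: int, max_su: int, step: int = 6) -> int:
    if utilization_pct > 80:          # overloaded -> scale up
        target = current + step
    elif utilization_pct < 20:        # underused -> scale down
        target = current - step
    else:                             # within band -> hold steady
        target = current
    return max(min_su, min(max_su, target))

print(next_su_count(current=12, utilization_pct=90, min_su=6, max_su=36))  # 18
print(next_su_count(current=12, utilization_pct=10, min_su=6, max_su=36))  # 6
print(next_su_count(current=36, utilization_pct=95, min_su=6, max_su=36))  # 36
```

Because the result is always clamped, the job never exceeds the cost ceiling (max SUs) or drops below the performance floor (min SUs) you configured.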
The Stream Analytics portal experience has also received several new updates.
Thousands of customers continue to ramp up their usage of Stream Analytics and now require even more scale to keep up with the large volumes of streaming data that need to be analyzed. Stream Analytics is increasing the maximum size of jobs and clusters from 192 SUs to 396 SUs to help you scale to meet your needs.
You can now use user-assigned managed identities to connect to your input sources and output sinks. We have also added support for connecting to Azure Service Bus and Azure Cosmos DB output sinks using managed identities.