Today we are releasing a new update of Azure Stream Analytics that introduces numerous new features, adding new options for security and reliability, as well as new outputs and improvements to our SQL language. Here is the list of significant changes. As a Platform-as-a-Service, the product is continuously updated, so you always get the latest version and can access all these new features automatically.
Azure Data Explorer is a fast and highly scalable data exploration service for log and telemetry data. It is very complementary to Azure Stream Analytics: Azure Stream Analytics reacts to the data in real time, triggering alerts or transforming events as they arrive, and can then send the data to Azure Data Explorer, where users can analyze extremely large volumes of data.
Today we are announcing that Azure Stream Analytics can directly output data to Azure Data Explorer, simplifying the architecture for customers who need both hot and warm path analytics on streaming data. This feature will be rolled out to all Stream Analytics regions progressively. More information is available in this blog post: "Azure Data Explorer is now supported as output for Azure Stream Analytics job".
We are also introducing output to Azure PostgreSQL, giving customers a new option to output streaming data to a relational database. With this new output, Azure Stream Analytics jobs can write to Azure PostgreSQL Single Server, Flexible Server (Preview), and Hyperscale (Citus), supporting high-throughput inserts of streaming data into existing tables.
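To illustrate the pattern, here is a minimal sketch of a job query that aggregates streaming telemetry and inserts it into a PostgreSQL table. The input and output aliases and the column names are hypothetical placeholders, and the destination table is assumed to already exist in the database.

```sql
-- Minimal sketch: aggregate device telemetry and insert it into an existing PostgreSQL table.
-- [iothub-input] and [postgresql-output] are placeholder aliases defined on the job;
-- deviceId and temperature are hypothetical fields.
SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    System.Timestamp() AS windowEnd
INTO
    [postgresql-output]
FROM
    [iothub-input] TIMESTAMP BY EventEnqueuedUtcTime
GROUP BY
    deviceId,
    TumblingWindow(second, 30)
```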
As we continue to bring the Stream Analytics Query Language closer to T-SQL, we are extending existing functions and adding new ones that will simplify queries for the most common tasks and bring stream and batch processing closer together.
In particular, we added 12 string manipulation functions, Unicode/NCHAR conversion support, extended the MIN/MAX aggregates to VARCHAR(MAX), and added new bitwise operators.
More details and examples are available in this blog post: Ignite 2021 - Stream Analytics Query Language Improvements.
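As a rough illustration of how some of these additions can be combined, here is a sketch of a query using MIN/MAX over a string column, a bitwise operator, and the UNICODE/NCHAR conversion functions. The input and output aliases and the field names are hypothetical, and the full list of new string functions is covered in the linked post.

```sql
-- Illustrative sketch only; [eventhub-input], [my-output], and the fields
-- deviceId, statusFlags, and message are hypothetical.
SELECT
    deviceId,
    MAX(message) AS lastMessage,                  -- MIN/MAX aggregates extended to long strings
    MAX(statusFlags & 4) AS maintenanceFlagSeen,  -- new bitwise AND operator
    NCHAR(9731) AS snowman,                       -- NCHAR converts a Unicode code point to a character
    UNICODE('A') AS codePointOfA                  -- UNICODE returns the code point of a character
INTO
    [my-output]
FROM
    [eventhub-input] TIMESTAMP BY EventEnqueuedUtcTime
GROUP BY
    deviceId,
    TumblingWindow(minute, 5)
```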
Managed Identity is a key security feature that eliminates the need for developers to manage credentials, helps make your Stream Analytics jobs more secure, and removes the need to rotate your keys or connection strings. Today we are extending support for Managed Identity with the following announcements:
With these new announcements, Stream Analytics supports Managed Identity for the following inputs and outputs. More will be added in the future.
| Type | Adapter | Managed Identity Support |
|------|---------|---------------------------|
| Storage Account | Blob/ADLS Gen 2 | Yes |
| Inputs | Event Hubs | Yes |
| | IoT Hubs | No (available with workaround: users can route events to Event Hubs) |
| | Blob/ADLS Gen 2 | Yes |
| Reference Data | Blob/ADLS Gen 2 | Yes |
| | SQL | Yes (preview) |
| Outputs | Event Hubs | Yes |
| | SQL Database | Yes |
| | Blob/ADLS Gen 2 | Yes |
| | Table Storage | No |
| | Service Bus Topic | No |
| | Service Bus Queue | No |
| | Cosmos DB | No |
| | Power BI | Yes |
| | Data Lake Storage Gen1 | Yes |
| | Azure Function | No |
| | Azure Synapse Analytics | Yes |
In addition to the previous features, we are introducing a new policy for data exfiltration prevention, enabling users to ensure that the highest level of security is applied to their clusters and jobs.
In the last 6 months, we continued to extend the global footprint of Azure Stream Analytics by adding 10 new regions: Australia Central, China East 2, China North 2, Germany West Central, Norway East, South India, South Africa North, Switzerland North, United Arab Emirates North, and West US 3. With this announcement, Stream Analytics is now available in 38 Azure regions worldwide.
Additionally, Stream Analytics now offers support for Availability Zones with Dedicated Clusters. Any new Dedicated Cluster will automatically benefit from Availability Zones and, in case of a disaster in one zone, will continue to run seamlessly by failing over to the other zones without any user action. Availability Zones give customers the ability to withstand datacenter failures through redundancy and logical isolation of services, significantly reducing the risk of outage for your streaming pipelines.
The new job diagram in the Azure portal and in VS Code helps you visualize your job pipeline. It shows inputs, outputs, and query steps. You can use the job diagram to examine the metrics for the whole job or for each step, which helps you understand how the job is progressing and isolate the source of a problem when you troubleshoot issues, for example, how much data is coming in versus how much data is backlogged.
Customers now have an improved user experience for the job diagram, with job topology, metrics, and job summary in a single view. We also completely reworked the backend of this feature to improve the job diagram's availability and responsiveness.
Job diagram in Azure Portal
Customers can now use the job diagram to view their pipelines for cloud jobs. You can also view logs to understand errors.
This update includes hundreds of performance and functional improvements. While we are not releasing full release notes, we wanted to call out the following ones:
To get started with Azure Stream Analytics, you can use one of the quick starts on our documentation page.