As enterprises move to the cloud, data is exploding from sources such as business workflows, mobile apps, telemetry, logs, and IoT sensors. It has never been more important to process these data streams in real time to gain insights that give you a competitive advantage and to respond rapidly to changing business needs.
Azure Stream Analytics has always strived to provide the easiest experience for building stream processing pipelines, with a SQL-first approach that abstracts away the complexity handled behind the scenes. Today, we are excited to announce a slew of improvements that further simplify building and operating stream processing pipelines.
No-code stream processing
We are thrilled to announce the Stream Analytics no-code editor, which provides a drag-and-drop experience for developing your Stream Analytics jobs. This greatly lowers the barrier to entry for common stream processing scenarios such as streaming ETL, ingestion, and materializing data to Azure Cosmos DB. You can get started directly in the Azure Event Hubs instance that holds your real-time data streams. This new capability allows you to:
Be more productive: You don’t have to stitch services together manually or iteratively develop your stream processing logic in SQL. Ready-to-use templates let you build a streaming pipeline within minutes.
Instantly validate logic: As you drag and drop components onto the canvas, you instantly see the results of your changes, so you can confirm they are as expected.
Operationalize jobs without worrying: Once you have developed and started your job, you can use the rich set of built-in metrics to set up alerts and monitor the job’s health. These jobs run on the same highly dependable Azure Stream Analytics infrastructure that thousands of customers trust for their mission-critical stream processing workloads.
Improved resource utilization with native autoscaling
Stream Analytics jobs are configured to run with a certain number of Streaming Units (SUs), an abstraction for the amount of compute and memory available to the job. In the real world, streaming data comes from many sources and arrives in unpredictable patterns, so a fixed SU allocation can leave capacity idle during quiet periods or fall behind during spikes.
Today, we are excited to announce native autoscale capability in Azure Stream Analytics (preview). Once you define the minimum and maximum SUs along with the conditions for autoscaling, Stream Analytics dynamically optimizes the number of SUs your workload needs, letting you save on costs without compromising performance. Learn more about autoscaling Stream Analytics jobs today.
Improved development experience in the Stream Analytics portal
New Get Started experience
Apart from the new no-code editor, we are continuing to invest in the Stream Analytics portal experience. When you create a job in the Azure portal, you are greeted with a brand-new Get Started experience that familiarizes you with the four main steps needed to set up your job:
Ingest data from a source
Define stream processing logic in SQL
Configure where output is sent
Enable resource logs for easy troubleshooting
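For the second step, a job query can be as small as a single SQL statement routing data from an input to an output. The sketch below is purely illustrative and not from this announcement; the input and output aliases (eventhub-input, blob-output) and field names are hypothetical examples:

```sql
-- Hypothetical streaming ETL query: average temperature per device
-- over one-minute tumbling windows. Aliases and fields are examples.
SELECT
    deviceId,
    AVG(temperature) AS avgTemperature,
    System.Timestamp() AS windowEnd
INTO
    [blob-output]
FROM
    [eventhub-input] TIMESTAMP BY eventTime
GROUP BY
    deviceId,
    TumblingWindow(minute, 1)
```

TIMESTAMP BY tells the job to use the event's own timestamp field rather than its arrival time, which is what makes the windowed aggregation meaningful for out-of-order streams.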
Other tabs in the Overview section include:
Monitoring: Pre-created charts with key metrics that help you understand your job’s performance
Properties: A summary of the job’s configuration, which you can easily edit
Tutorials: Links to learning modules and tutorials on how to get started
Easily modify inputs and outputs
While developing your query, you may want to see the details of the inputs or outputs you are working with, or open those resources to change some settings. You can now do this without ever leaving the query development experience: click the three dots next to an input or output and choose whether to edit it or navigate to its resource page.
In the inputs and outputs section of the job, you will now see a link that takes you to the target resource, so you no longer have to search for the resource you are connected to.
When you want to add an Azure SQL database as your job’s output, you can now see the list of tables available in your database and pick the one you want to write to.
To make this even better, when you test your query, you can now see the data types of your input and output fields. And if your query is configured with a SQL output, you can instantly see whether the data types written by your Stream Analytics job match what the table expects. This instant feedback lets you fix queries and make them more resilient before running the job.
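When the types don't line up, an explicit CAST in the query resolves the mismatch before the job ever runs. A hypothetical sketch (the aliases, field names, and target column types below are illustrative assumptions, not from this announcement):

```sql
-- Hypothetical fix for a type mismatch flagged during query testing:
-- cast fields explicitly to the types the SQL table column expects.
SELECT
    CAST(deviceId AS nvarchar(max)) AS DeviceId,  -- table column: nvarchar(max)
    CAST(reading AS bigint) AS Reading            -- table column: bigint
INTO
    [sql-output]
FROM
    [eventhub-input]
```

With the casts in place, the query test surfaces output types that match the table schema, so the mismatch is caught and fixed at development time rather than as a runtime conversion error.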
Increased scale of Stream Analytics
Many customers continue to ramp up their usage of Stream Analytics and now need even more scale to keep up with the large volumes of streaming data they analyze. We are more than doubling the maximum SUs for a single job and a cluster, from 192 SUs to 396 SUs. Coupled with autoscaling, a job can now start with 1 SU and scale up to 396 SUs dynamically!
The Azure Stream Analytics team is committed to listening to your feedback. We welcome you to join the conversation and make your voice heard by submitting ideas and feedback. You can stay up to date on the latest announcements by following us on Twitter @AzureStreaming. If you have questions or run into issues with any of these new improvements, you can also reach us at firstname.lastname@example.org.