Azure Stream Analytics Kafka Connectors is Now Generally Available!
We are excited to announce that Kafka input and output with Azure Stream Analytics are now generally available! This marks a major milestone in our commitment to empowering our users with robust and innovative solutions. With the Stream Analytics Kafka connectors, users can natively read and write data to and from Kafka topics. This enables users to fully leverage Stream Analytics' rich capabilities and features even when the data resides outside of Azure. Azure Stream Analytics is a job service, so you do not have to spend time managing clusters, and downtime concerns are alleviated with a 99.99% Service Level Agreement (SLA) at the job level.

Key Benefits:
A Stream Analytics job can ingest Kafka events from anywhere, process them, and output them to any number of Azure services as well as to other Kafka clusters. There is no need for workarounds such as MirrorMaker or Kafka extensions with Azure Functions to process Kafka data with Azure Stream Analytics. The solution is low code and entirely managed by the Azure Stream Analytics team at Microsoft.

Getting Started:
To get started with the Stream Analytics Kafka input and output connectors, please refer to the links provided below:
Stream data from Kafka into Azure Stream Analytics
Kafka output from Azure Stream Analytics
You can add a Kafka input or output to a new or an existing Stream Analytics job in a few simple clicks. To add a Kafka input, go to Inputs under Job topology, click Add input, and select Kafka. For a Kafka output, go to Outputs under Job topology, click Add output, and select Kafka. Next, you will be presented with the Kafka connection configuration. Once it is filled in, you will be able to test the connection to the Kafka cluster.

VNET Integration:
You can connect to a Kafka cluster from Azure Stream Analytics whether it is in the cloud or on premises with a public endpoint. You can also securely connect to a Kafka cluster inside a virtual network with Azure Stream Analytics.
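Once a Kafka input and a Kafka output have been added, the job's query wires them together. As a minimal sketch, a pass-through Stream Analytics query reading every event from the Kafka input and writing it to the Kafka output could look like the following (the alias names [kafka-input] and [kafka-output] are illustrative, matching whatever names you gave the input and output in the portal):

```sql
-- Minimal pass-through query: forward every event from the Kafka
-- input to the Kafka output. [kafka-input] and [kafka-output] are
-- the illustrative names of the input and output defined on the job.
SELECT
    *
INTO
    [kafka-output]
FROM
    [kafka-input]
```

In practice, the SELECT list would apply whatever filtering, windowing, or aggregation the scenario requires; the point is simply that Kafka inputs and outputs participate in the job query like any other Stream Analytics source or sink.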
Visit the Run your Azure Stream Analytics job in an Azure Virtual Network documentation for more information.

Automated deployment with ARM templates:
ARM templates allow for quick and automated deployment of Stream Analytics jobs. To deploy a Stream Analytics job with a Kafka input or output quickly and automatically, users can include the following sample snippet in their Stream Analytics job ARM template:

{
  "type": "Kafka",
  "properties": {
    "consumerGroupId": "string",
    "bootstrapServers": "string",
    "topicName": "string",
    "securityProtocol": "string",
    "securityProtocolKeyVaultName": "string",
    "sasl": {
      "mechanism": "string",
      "username": "string",
      "password": "string"
    },
    "tls": {
      "keystoreKey": "string",
      "keystoreCertificateChain": "string",
      "keyPassword": "string",
      "truststoreCertificates": "string"
    }
  }
}

We can't wait to see what you'll build with the Azure Stream Analytics Kafka input and output connectors. Try it out today and let us know your feedback. Stay tuned for more updates as we continue to innovate and enhance this feature.

Call to Action:
For direct help with using the Azure Stream Analytics Kafka input, please reach out to askasa@microsoft.com. To learn more about Azure Stream Analytics, click here.

Realize a Lakehouse using best-of-breed open source with HDInsight
Author: Reems Thomas Kottackal, Product Manager

HDInsight on AKS is a modern, reliable, secure, and fully managed Platform as a Service (PaaS) that runs on Azure Kubernetes Service (AKS). HDInsight on AKS allows an enterprise to deploy popular open-source analytics workloads like Apache Spark, Apache Flink, and Trino without the overhead of managing and monitoring containers. You can build end-to-end, petabyte-scale big data applications spanning event storage using HDInsight Kafka, streaming through Apache Flink, data engineering and machine learning using Apache Spark, and Trino's powerful query engine, in combination with Azure analytics services like Azure Data Factory, Azure Event Hubs, Power BI, and Azure Data Lake Storage.

HDInsight on AKS can connect seamlessly with HDInsight, so you can reap the benefits of using the cluster types you need in a hybrid model and interoperate with HDInsight cluster types using the same storage and metastore across both offerings. The following diagram depicts an example of an end-to-end analytics landscape realized through HDInsight workloads.

We are super excited to get you started:
Sign up today - https://aka.ms/starthdionaks
Read our documentation - https://aka.ms/hdionaks-docs
Join our community, share an idea, or share your success story - https://aka.ms/hdionakscommunity
Have a question on how to migrate or want to discuss a use case - https://aka.ms/askhdinsight