Importing Data from RabbitMQ into Azure Data Explorer via Event Hubs
Published Mar 29 2023 by Microsoft

RabbitMQ to Azure Data Explorer

Imagine you have sensor data that’s being sent to RabbitMQ from various IoT devices. These devices capture temperature and humidity readings. You want to use the power of Azure Data Explorer to analyze this data and apply intelligent insights to it. You have the challenge of getting the data out of RabbitMQ and into Azure. Using Azure Event Hubs, you can bridge RabbitMQ with Azure Data Explorer. In this blog, you'll learn how to set up RabbitMQ and Event Hubs so your sensor data can be analyzed with Azure Data Explorer.

 

This blog assumes that you have RabbitMQ already configured and an Event Hubs namespace already set up. For a step-by-step guide on setting up an Event Hubs namespace, see Quickstart: Create an event hub using Azure CLI.

 

Introduction

 

You have a RabbitMQ queue called telemetry where your IoT sensors send data in the following format:

 

{
    "Timestamp": <Timestamp>,
    "Temperature": <Temperature>,
    "Humidity": <Humidity>
}

 

To send messages from the RabbitMQ telemetry queue to Event Hubs, you'll use the Shovel Plugin, which comes packaged with RabbitMQ.

 

The Shovel plugin automatically takes messages from your queue and forwards them to Event Hubs. It uses the AMQP 1.0 protocol, which both RabbitMQ and Event Hubs support.

 

Enabling the RabbitMQ Shovel Plugin

 

You can enable the plugin and its visual interface with this command:

 

rabbitmq-plugins enable rabbitmq_shovel_management

Note: You might need to run that command as root.

 

The Shovel plugin takes messages from an origin queue and sends them to a destination. The destination can be another RabbitMQ queue, or a queue in another messaging system such as Azure Event Hubs. To connect to the destination, Shovel needs the right credentials.

 

Create a Shared Access Signature for Event Hubs

 

You’ll need to create a shared access signature (SAS) policy for your event hub so RabbitMQ can publish messages to it. A SAS policy specifies what an external party is allowed to do with your resource. Here, the goal is for RabbitMQ to be able to send messages, but not listen on or manage the event hub.

 

In the Azure portal, go to your Event Hubs namespace and add an event hub to it named event-hubs-kusto-eh. Once the resource has been created, navigate to it and click Shared Access Policies in the left panel.

Fill in the form for your new policy: name it eh-rmq-bridge and tick the Send permission. RabbitMQ will use this policy to send messages to Event Hubs.

 

Event Hubs SAS Policy

 

Once the SAS policy has been created, select it and copy the value called Connection string–primary key to the clipboard.

 

Connecting RabbitMQ to Event Hubs

 

Your connection string looks something like this:

 

Endpoint=sb://<your-namespace>.servicebus.windows.net/;SharedAccessKeyName=eh-rmq-bridge;SharedAccessKey=<SharedAccessKey>;EntityPath=event-hubs-kusto-eh

 

But RabbitMQ Shovel expects a connection string in the AMQP protocol format:

 

amqps://eh-rmq-bridge:<SharedAccessKey>@<your-namespace>.servicebus.windows.net:5671/?sasl=plain

 

To help you convert the string, you can go to the connection string converter tool, paste your connection string in the form, and click Convert. You’ll get a connection string that’s RabbitMQ ready. (The website runs entirely in your browser, so your data isn’t sent over the wire.) You can access its source code on GitHub.

 

Connection String Conversion Tool
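If you’d rather script the conversion than use the website, the rule is mechanical: pull the namespace host, key name, and key out of the Azure connection string and reassemble them as an amqps:// URI. Here’s a small Python sketch of it (the function name to_amqp_uri is mine):

```python
# Hypothetical helper: convert an Event Hubs connection string to the
# AMQP 1.0 URI format that the RabbitMQ Shovel plugin expects.
from urllib.parse import quote, urlparse

def to_amqp_uri(connection_string: str) -> str:
    # Parse the "Key=Value" pairs separated by semicolons; split on the
    # first "=" only, since SharedAccessKey values often end with "=".
    parts = dict(
        segment.split("=", 1)
        for segment in connection_string.split(";")
        if segment
    )
    host = urlparse(parts["Endpoint"]).hostname  # <namespace>.servicebus.windows.net
    key_name = parts["SharedAccessKeyName"]
    # The key may contain '+', '/' or '=' and must be percent-encoded.
    key = quote(parts["SharedAccessKey"], safe="")
    return f"amqps://{key_name}:{key}@{host}:5671/?sasl=plain"
```

Running it on the example connection string above produces the same amqps:// URI the converter tool gives you.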

Now open the RabbitMQ management plugin in your browser at http://localhost:15672/#/dynamic-shovels and go to Admin -> Shovel Management, where you can add a new shovel that will take care of sending messages from a RabbitMQ queue to your event hub.

 

Name your shovel rmq-to-eh and choose AMQP 0.9.1 as the source protocol, since that's RabbitMQ’s default protocol. The screenshot shows amqp:// as the URI, which is the default that connects to a local RabbitMQ server; make sure to adapt it to your deployment.

Use telemetry as the name of your queue. If this queue doesn’t exist, RabbitMQ will declare it for you. You can leave the other source options as default.

 

On the destination side, choose AMQP 1.0 as the protocol. In the URI field, enter the connection string you obtained in the previous step, where you converted your Azure connection string to the RabbitMQ format. Finally, in the Address field, enter the name of your event hub (in this case event-hubs-kusto-eh).

 

It should look like this:

 

RabbitMQ Add Shovel

After you click on Add Shovel, click on Shovel Status to see your new shovel listed there.
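If you prefer automation over the UI, dynamic shovels can also be declared through the management plugin’s HTTP API (PUT /api/parameters/shovel/&lt;vhost&gt;/&lt;name&gt;). Here’s a Python sketch using only the standard library; the helper names are mine, and the host and credentials assume a default local RabbitMQ:

```python
# Sketch: declare the same rmq-to-eh shovel via the management HTTP API.
import base64
import json
import urllib.request

def shovel_definition(dest_uri: str) -> dict:
    # The same fields the "Add shovel" form submits.
    return {
        "value": {
            "src-protocol": "amqp091",
            "src-uri": "amqp://",          # local broker, default credentials
            "src-queue": "telemetry",
            "dest-protocol": "amqp10",
            "dest-uri": dest_uri,          # the amqps:// URI from the converter step
            "dest-address": "event-hubs-kusto-eh",
        }
    }

def declare_shovel(dest_uri: str) -> None:
    # PUT /api/parameters/shovel/<vhost>/<name>; %2F is the default vhost "/".
    req = urllib.request.Request(
        "http://localhost:15672/api/parameters/shovel/%2F/rmq-to-eh",
        data=json.dumps(shovel_definition(dest_uri)).encode(),
        method="PUT",
        headers={
            "Content-Type": "application/json",
            "Authorization": "Basic " + base64.b64encode(b"guest:guest").decode(),
        },
    )
    urllib.request.urlopen(req)
```

Declaring the shovel this way is equivalent to filling in the form: it shows up on the Shovel Status page just the same.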

 

Publish messages from RabbitMQ to Event Hubs

 

Now it’s time to test the connection to Event Hubs. You can do that by publishing messages to the telemetry queue and then watching them arrive in your event hub.

 

Here’s an example of a message that you can send:

 

{
    "Timestamp": "2020-01-01T00:00:00Z",
    "Temperature": 20,
    "Humidity": 50
}

 

You are going to use C# to generate random messages and send them to RabbitMQ. You can use the following code to do that:

 

using System;
using System.Text;
using RabbitMQ.Client;

namespace RabbitMQProducer
{
    class Program
    {
        // Shared Random instance, so repeated calls don't reuse the same seed.
        private static readonly Random Rng = new Random();

        static void Main(string[] args)
        {
            var factory = new ConnectionFactory() { HostName = "localhost" };
            using (var connection = factory.CreateConnection())
            using (var channel = connection.CreateModel())
            {
                channel.QueueDeclare(queue: "telemetry",
                                     durable: true,
                                     exclusive: false,
                                     autoDelete: false,
                                     arguments: null);

                int i = 0;
                while (true)
                {
                    var message = GenerateRandomMessage();
                    var body = Encoding.UTF8.GetBytes(message);

                    channel.BasicPublish(exchange: "",
                                         routingKey: "telemetry",
                                         basicProperties: null,
                                         body: body);

                    // Log and throttle on every 100th message.
                    if (i++ % 100 == 0)
                    {
                        Console.WriteLine(" [x] Sent {0}", message);
                        System.Threading.Thread.Sleep(100);
                    }
                }
            }
        }

        private static string GenerateRandomMessage()
        {
            var temperature = Rng.Next(0, 50);
            var humidity = Rng.Next(0, 100);
            var timestamp = DateTime.UtcNow.ToString("o");
            return $"{{\"Timestamp\": \"{timestamp}\", \"Temperature\": {temperature}, \"Humidity\": {humidity}}}";
        }
    }
}

 

Assuming you created a new .NET project, added the RabbitMQ.Client NuGet package, and placed that code in a file called Program.cs, you can run it with the following command:

 

dotnet run

 

You should see messages being sent to RabbitMQ:

 

Sending telemetry messages from the CLI

 

Then you can go to the RabbitMQ Management plugin and see that the messages are being sent to the telemetry queue:

 

RabbitMQ telemetry queue with messages

 

Finally, you can go to the Azure portal and see that the messages were forwarded to Event Hubs:

 

Event Hubs received messages from RabbitMQ

 

Creating the Azure Data Explorer cluster

 

Now it’s time to create the Azure Data Explorer cluster that will receive the messages from Event Hubs. In the Azure portal, create a new Azure Data Explorer cluster, give it a name, and choose the same region as your Event Hubs namespace. For a detailed walkthrough of creating an ADX cluster, see Quickstart: Create an Azure Data Explorer cluster and database.

 

NOTE: You can create a free Azure Data Explorer cluster that can ingest data from Azure Event Hubs. Take a look at the following article: Manage Event Hubs data connections in your free Azure Data Explorer cluster.

 

Then create a database called telemetry. Once it’s ready, select it so you see the following screen:

 

Azure Data Explorer database

 

Creating a Data Connection to Event Hubs

 

As you can see there, you have the option to create different types of Data connections. You are going to create a new Event Hubs connection.

 

Azure Data Explorer create data connection

 

In the form, give the connection the name event-hubs-kusto, select your subscription, and specify the same namespace where you created the event hub, in this case event-hubs-kusto-ns. Then select the event hub you created, event-hubs-kusto-eh, and choose the default consumer group $Default. For the target table, you have the option to create a new table. Click Create new; a new ADX window opens where you can go through the table-creation wizard.

 

Name your table telemetry-data and click “Next: source”. In the source step, select the Event Hubs connection you created in the previous step, along with the $Default consumer group, then click “Next: schema”. Here, specify JSON as the data format. ADX fetches some of the data from Event Hubs and infers the table structure from the JSON messages you sent from RabbitMQ. In this case you’ll end up with a table that has a Timestamp column of type datetime, a Temperature column of type int, and a Humidity column of type int.

 

ADX create table wizard

 

Then select “Next: Create table” to finish the wizard. You should see the following screen:

 

ADX table created
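For reference, the table the wizard creates corresponds roughly to this KQL management command (a sketch; the wizard may also create a JSON ingestion mapping behind the scenes):

```kusto
.create table ['telemetry-data'] (Timestamp: datetime, Temperature: int, Humidity: int)
```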

 

Now go back to the Azure portal, and in the “Create data connection” form click the refresh button on the “Table name” dropdown. You should see your new table listed there. Select it and click “Create”. You should see the following screen:

 

ADX data connection created

 

Querying the data in Azure Data Explorer

 

Go back to the tab where you have Azure Data Explorer open and click the “Query” button. There you can enter your first query and start exploring the data that traveled from RabbitMQ to Event Hubs, and then on to Azure Data Explorer.

 

Be sure to restart the RabbitMQ producer you created in the previous step, so you can send more messages to Azure Data Explorer.

 

['telemetry-data'] 
| where Humidity < 50 and Humidity > 20
| project Timestamp, Temperature, Humidity
| limit 10000
| sort by Temperature

 

You should see the following results:

 

ADX query results

 

And that’s it, the data from your IoT devices is available in Azure Data Explorer, ready to be queried and analyzed.
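From here you can go beyond filtering. For example, here’s a hypothetical follow-up query that averages the readings per minute (assuming the telemetry-data table created above):

```kusto
['telemetry-data']
| summarize AvgTemperature = avg(Temperature), AvgHumidity = avg(Humidity) by bin(Timestamp, 1m)
| order by Timestamp asc
```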

 

Conclusion

 

In this post you learned how to use RabbitMQ to send messages from your IoT devices to Event Hubs, and then on to Azure Data Explorer. You can also use this architecture to send data from Event Hubs to other Azure services, such as Azure Synapse Analytics, so you have the whole power of Azure’s big data services at your disposal.

 

If you want to learn more about Azure Data Explorer, you can check the following resources:
