Ingest data using Splunk Universal Forwarder into Azure Data Explorer
Published Jan 18 2024 03:27 AM
Microsoft

Introduction

In today's data-driven world, it's essential to collect, analyze, and gain insights from various logs and data sources. Azure Data Explorer (also known as Kusto) is a cloud-based data analytics platform designed for analyzing and visualizing large volumes of data in real time, and it is particularly well-suited for logs, time series, and telemetry data. Its real-time capabilities make it a valuable resource for organizations looking to gain insights from their data, such as forecasting, anomaly detection, and prediction. More on leveraging the power of Kusto can be found here.


Splunk Universal Forwarder, commonly referred to as "Splunk UF," is a lightweight version of the Splunk Enterprise software. It is specifically designed for the sole purpose of collecting and forwarding log and machine data from various sources to a central Splunk Enterprise server or a Splunk Cloud deployment. Splunk Universal Forwarder serves as an agent that simplifies the process of data collection and forwarding, making it an essential component in a Splunk deployment. More details on Splunk Universal Forwarder can be found here.

 

In this article, we will walk through how to forward logs to Azure Data Explorer (Kusto) using Splunk Universal Forwarder and the Kusto Splunk Universal Forwarder Connector.

 

System Architecture

To forward logs to Kusto, the Splunk Universal Forwarder must be installed on the machine where the logs are located. The Splunk Universal Forwarder is configured to send messages over TCP port 9997 (the default) to another system hosting the Kusto Splunk Universal Forwarder Connector. The connector listens for messages arriving on that port and forwards the logs to Kusto. The following diagram represents the setup.

 


Prerequisites

Before we begin, make sure you have the following prerequisites in place:

  1. Splunk Universal Forwarder: Install Splunk Universal Forwarder on the machine where your logs originate. For the installation guide, refer here.

  2. Azure Data Explorer (Kusto): Set up an Azure Data Explorer cluster and database to receive the logs. More on this can be found here.

  3. Azure Active Directory Application: Create an Azure AD application and service principal to enable communication between Splunk and Kusto.

  4. Docker: Docker must be installed on the system that will host and run the Kusto Splunk Universal Forwarder connector.
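As part of prerequisite 2, the destination table, its CSV ingestion mapping, and ingest permissions for the Azure AD application should exist before data arrives. A minimal sketch in KQL, assuming hypothetical names SplunkLogs and SplunkCsvMapping (these must match the table_name and table_mapping_name you later enter in the connector's config.yml):

```kusto
// Hypothetical table with a raw-event column plus a timestamp;
// adjust the schema to the fields your forwarded events actually carry.
.create table SplunkLogs (RawEvent: string, Timestamp: datetime)

// CSV mapping matching data_format : csv in config.yml;
// ordinals map CSV columns to table columns in order.
.create table SplunkLogs ingestion csv mapping 'SplunkCsvMapping'
    '[{"column":"RawEvent","Properties":{"Ordinal":"0"}},{"column":"Timestamp","Properties":{"Ordinal":"1"}}]'

// Grant the Azure AD application ingest rights on the database.
.add database <kusto-database-name> ingestors ('aadapp=<azure-client-id>;<authority>')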

Step 1: Configure Splunk Universal Forwarder

  1. While setting up Splunk Universal Forwarder, the receiving indexer needs to be configured to point to the system hosting the Kusto Splunk Universal Forwarder Connector; the destination indexer can be left blank. Further details on the receiving indexer configuration can be found here.
  2. Configure inputs.conf: Navigate to the folder where Splunk Universal Forwarder is installed, then to the /etc/system/local/ folder. Create or modify inputs.conf to tell the universal forwarder which logs to read. An example configuration can be found below; more on the configuration of inputs.conf can be found here.
    [default]
    index = default
    disabled = false
    
    [monitor://C:\Program Files\Splunk\var\log\splunk\modinput_eventgen.log*]
    sourcetype = modinput_eventgen
  3. Configure outputs.conf: Navigate to the folder where Splunk Universal Forwarder is installed, then to the /etc/system/local/ folder. Create or modify outputs.conf so the universal forwarder knows where to forward the logs: in our case, the hostname and port of the system hosting the Kusto Splunk Universal Forwarder Connector. An example configuration can be found below. More details on outputs.conf can be found here.
    [tcpout]
    defaultGroup = default-autolb-group
    sendCookedData=false
    
    [tcpout:default-autolb-group]
    server = 127.0.0.1:9997
    
    [tcpout-server://127.0.0.1:9997]
  4. Restart the Splunk Universal Forwarder after the changes.
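The restart can be done from the Splunk CLI. A sketch, assuming default installation paths (adjust to where your forwarder is actually installed):

```shell
# Windows (default install path)
"C:\Program Files\SplunkUniversalForwarder\bin\splunk.exe" restart

# Linux/macOS (SPLUNK_HOME points at the forwarder install directory)
$SPLUNK_HOME/bin/splunk restart
```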

Step 2: Configure Kusto Splunk Universal Forwarder Connector

  1. Download or clone the connector from the GitHub repo.
  2. Navigate to the base directory of the connector.
    cd .\SplunkADXForwarder\
  3. Edit config.yml by entering the required Kusto credentials and database details to enable the connector to send data to Kusto.
    ingest_url : <ingest URL>
    client_id : <azure-client-id>
    client_secret : <azure-client-secret>
    authority : <authority>
    database_name : <kusto-database-name>
    table_name : <kusto-table-name>
    table_mapping_name :
    data_format : csv
  4. Build the Docker image:
    docker build -t splunk-forwarder-listener .
  5. Run the Docker container:
    docker run -p 9997:9997 splunk-forwarder-listener
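Before pointing the forwarder at the connector, it can help to confirm the container is up and listening on port 9997. A quick check (the container ID shown is a placeholder):

```shell
# List running containers started from the connector image
docker ps --filter "ancestor=splunk-forwarder-listener"

# Tail the connector's logs to watch incoming events
docker logs -f <container-id>
```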

Step 3: Verify Data In Azure Data Explorer (Kusto)

  1. Once the container is running, data is sent to your Azure Data Explorer table. You can verify that the data is ingested by running a query in the web UI query editor.
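A minimal verification sketch in KQL, assuming the hypothetical table name SplunkLogs (substitute the table_name from your config.yml):

```kusto
// A non-zero row count confirms that ingestion is working.
SplunkLogs
| count

// Inspect a small sample of the ingested events.
SplunkLogs
| take 10
```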

Conclusion

In conclusion, we have demonstrated how to use Splunk Universal Forwarder to route data to Azure Data Explorer, a powerful cloud-based data analytics platform. By using the Kusto Splunk Universal Forwarder Connector, we can easily forward logs from various sources to Kusto and perform real-time analysis and visualization. This enables us to gain insights from our data and leverage the capabilities of Kusto for scenarios such as forecasting, anomaly detection, and prediction. We hope this article helps you get started with Splunk and Kusto integration and explore the benefits of both technologies.

 

Ingest data from Splunk Universal Forwarder to Azure Data Explorer - Azure Data Explorer | Microsoft...

 

Important

This connector can be used in Real-Time Analytics in Microsoft Fabric. Use the instructions in this article, noting the exceptions listed in the linked documentation.
