Forum Discussion

Karthikeyan_B_R
Copper Contributor
Apr 24, 2024

Need to know if Fluentd can be set up to filter logs from Azure Kubernetes Service before sending them

Hi,

I have an Azure Kubernetes Service cluster and have enabled Insights for it. I have created a Log Analytics workspace and a diagnostic setting that sends all logs to that workspace. To reduce the cost of data ingestion and implement log rotation, I'm thinking of using Fluentd.

My approach would be to install the Fluentd agent/Docker image in the Kubernetes cluster and filter the logs before they are sent to the Log Analytics workspace.

Is this approach even possible? If so, please point me to a few sources/reference links.

1 Reply

  • You can deploy Fluentd as a DaemonSet within your Azure Kubernetes Service (AKS) cluster, ensuring that each node runs a Fluentd pod for local log collection. This setup lets you configure Fluentd to filter, transform, and selectively route log data before it is ingested by the Log Analytics workspace:

    1. Deploy Fluentd as a DaemonSet in your AKS cluster.
    2. Configure Fluentd with filters (e.g., grep, record_transformer, parser) to exclude unnecessary logs or metadata.
    3. Use the appropriate output plugin (such as azure_loganalytics) to send the filtered logs to your Log Analytics workspace.
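    Steps 2 and 3 might look roughly like the following `fluent.conf` fragment. This is a hedged sketch, not a tested configuration: the tail path, the exclusion pattern, the `FilteredContainerLogs` log type, and the placeholder workspace credentials are all illustrative assumptions, and the `azure_loganalytics` output assumes the community `fluent-plugin-azure-loganalytics` plugin is installed in your Fluentd image.

    ```
    # Tail container logs on each node (path is an assumption; adjust for your runtime)
    <source>
      @type tail
      path /var/log/containers/*.log
      pos_file /var/log/fluentd-containers.log.pos
      tag kubernetes.*
      <parse>
        @type json
      </parse>
    </source>

    # Step 2a: drop noisy records, e.g. health-check chatter (pattern is illustrative)
    <filter kubernetes.**>
      @type grep
      <exclude>
        key log
        pattern /healthz|liveness/
      </exclude>
    </filter>

    # Step 2b: strip metadata fields you don't want to pay to ingest
    <filter kubernetes.**>
      @type record_transformer
      remove_keys stream
    </filter>

    # Step 3: ship what's left to Log Analytics (placeholder credentials)
    <match kubernetes.**>
      @type azure_loganalytics
      customer_id YOUR_WORKSPACE_ID
      shared_key YOUR_WORKSPACE_SHARED_KEY
      log_type FilteredContainerLogs
    </match>
    ```

    Everything excluded by the grep filter or removed by record_transformer never reaches the workspace, which is where the ingestion savings come from.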
