
Announcing seamless integration of Apache Kafka with Azure Cosmos DB in Azure Native Confluent

praveenrajap (Microsoft)
Jan 21, 2026

Build event-driven & real-time streaming applications using Azure Cosmos DB and Apache Kafka on Confluent Cloud

Integrate Azure Cosmos DB with Kafka applications

Confluent announced general availability (GA) of the fully managed v2 Kafka connector for Azure Cosmos DB, enabling users to seamlessly integrate their Azure Cosmos DB containers with Kafka-powered event streaming applications without worrying about provisioning, scaling, or managing the connector infrastructure.

The Confluent Cosmos DB v2 connector offers significant advantages over the v1 connector: higher throughput, enhanced security and observability, and improved reliability.

Seamless Integration with Azure Native Confluent

We are excited to announce a new capability in Azure Native Confluent service that enables users to create and configure Confluent-managed Cosmos DB Kafka connectors (v2) for Azure Cosmos DB containers through a direct, seamless experience in the Azure portal.

Users can also provision and manage environments, Kafka clusters, and Kafka topics from within the Azure Native Confluent service, creating a holistic, end-to-end experience for integrating with Azure Cosmos DB. This eliminates the need to switch between the Azure portal and the Confluent Cloud portal.

Key Highlights

  • Bi-directional Support: Allows users to create source connectors to stream data from Cosmos DB to Kafka topics, or sink connectors to move data from Kafka into Cosmos DB.
  • Secure Authentication: Users can authenticate to the Kafka cluster using service accounts, enabling least-privilege access controls when provisioning the connectors, in line with Confluent’s recommended security guidelines.
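To make the source/sink distinction concrete, here is a minimal sketch of the kind of sink connector definition the portal assembles on your behalf. The property names and the `CosmosDbSinkV2` class name are assumptions made for illustration, not the authoritative schema; consult the Confluent Cloud connector reference for the exact keys.

```python
# Illustrative (hypothetical) shape of a fully managed Cosmos DB V2 sink
# connector definition. Property names are assumptions for this example.
sink_connector_config = {
    "name": "cosmosdb-v2-sink-example",       # connector name (example)
    "connector.class": "CosmosDbSinkV2",      # assumed plugin class name
    "kafka.auth.mode": "SERVICE_ACCOUNT",     # least-privilege auth, as above
    "topics": "topic1,topic2",                # Kafka topics to drain
    "azure.cosmos.account.endpoint": "https://<account>.documents.azure.com:443/",
    "azure.cosmos.sink.database.name": "<database>",
    # topic-to-container routing in the 'topic#container' format
    "azure.cosmos.sink.containers.topicmap": "topic1#container1,topic2#container2",
}
```

A source connector would look similar, with source-side properties (the containers to read from) in place of the sink topic map.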

Create a Confluent Cosmos DB (v2) Kafka Connector from Azure portal

The following section summarizes the key steps required to provision the connector from the Azure Native Confluent service.

  1. Navigate to the native Confluent resource in the Azure portal.
  2. Navigate to Connectors -> Create new connector. You can also create an environment, cluster, and topic from within the Azure portal.

  3. Choose the desired connector type: Source, to stream data from Azure Cosmos DB, or Sink, to move data into Azure Cosmos DB.
  4. Select ‘Azure Cosmos DB V2’ as the connector plugin.
  5. Enter the connector name, then select the required Kafka topics, Azure Cosmos DB account, and database.
  6. Select Service Account authentication and provide a name for the service account. When the connector is created, a new service account is created on Confluent Cloud. Alternatively, you can select user-account-based authentication by provisioning an API key on Confluent Cloud.
  7. Complete the required connector configuration. Enter the topic-container mapping in the form ‘topic1#container1,topic2#container2…’
  8. Review the configuration summary and click Create. Your connector will appear in the list with real-time status indicators.
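The ‘topic#container’ mapping string entered during configuration can be assembled and sanity-checked programmatically before pasting it into the portal. A minimal sketch; the helper names below are written for this post and are not part of any Confluent or Azure SDK:

```python
def build_topic_container_map(pairs):
    """Build a 'topic1#container1,topic2#container2' mapping string
    from a {topic: container} dict (illustrative helper)."""
    return ",".join(f"{topic}#{container}" for topic, container in pairs.items())

def parse_topic_container_map(mapping):
    """Parse the mapping string back into a {topic: container} dict,
    raising ValueError on malformed entries."""
    result = {}
    for entry in mapping.split(","):
        topic, sep, container = entry.partition("#")
        if not sep or not topic or not container:
            raise ValueError(f"malformed mapping entry: {entry!r}")
        result[topic] = container
    return result

mapping = build_topic_container_map(
    {"orders": "orders-container", "users": "users-container"}
)
# mapping == "orders#orders-container,users#users-container"
```

Round-tripping the string through the parser before creating the connector catches typos such as a missing ‘#’ separator early.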
Other Resources

If you would like to give us feedback on this feature or the overall product, or have suggestions for us to work on, please drop them in the comments.

 

Updated Jan 22, 2026
Version 2.0