MGDC is designed to deliver data periodically (daily or weekly) to an Azure Storage account (or to Microsoft Fabric). If you need the data delivered to an Azure Event Hub instead, you will need to take a few additional steps.
Copilot suggested:
To transfer data from Azure Storage (like Blob Storage) to Azure Event Hub, you’ll need to set up a pipeline that reads the data and pushes it as events. Azure doesn’t offer a direct built-in connector for this, but here are four effective approaches:
Option 1: Use a time-triggered Logic App to pull data from Blob Storage and send it to Event Hub.
- Trigger: Recurrence (e.g., every 5 minutes)
- Action 1: Get blob content from Azure Storage
- Action 2: Send message to Azure Event Hub
This is ideal for simple workflows and low-volume data.
🔗 Learn more: Connect to Event Hub from Logic Apps
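For reference, here is roughly what those two Logic App actions do, expressed as a minimal Python sketch. The account, container, blob, namespace, and hub names are all placeholders, and it assumes the azure-identity, azure-storage-blob, and azure-eventhub packages; a real Logic App would also handle retries and track which blobs it has already sent.

```python
# Rough Python equivalent of the Logic App's two actions; all names are
# placeholders. Assumes azure-identity, azure-storage-blob and azure-eventhub.
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient
from azure.eventhub import EventHubProducerClient, EventData

cred = DefaultAzureCredential()

# Action 1: get blob content from Azure Storage.
blob = BlobClient(
    account_url="https://<mgdc-account>.blob.core.windows.net",
    container_name="mgdc-output",
    blob_name="<dataset-file>.json",
    credential=cred,
)
content = blob.download_blob().readall().decode("utf-8")

# Action 2: send the content to Azure Event Hub.
producer = EventHubProducerClient(
    fully_qualified_namespace="<namespace>.servicebus.windows.net",
    eventhub_name="<event-hub>",
    credential=cred,
)
with producer:
    batch = producer.create_batch()
    batch.add(EventData(content))  # events larger than ~1 MB must be split
    producer.send_batch(batch)
```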
Option 2: Write a custom Azure Function that reacts to blob creation or modification events.
- Trigger: Blob Storage event (via Event Grid)
- Function Logic: Read blob content, transform if needed, send to Event Hub
- Output Binding: Azure Event Hub
This gives you full control over the transformation and routing logic.
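A minimal sketch of such a function, assuming the Python v2 programming model with an Event Grid trigger and an Event Hub output binding. The hub name and connection setting are placeholders, and this version simply forwards the raw blob text without transforming it.

```python
# function_app.py: hedged sketch of an Event Grid-triggered Azure Function
# (Python v2 model). Hub name and connection app setting are placeholders.
import azure.functions as func
from azure.identity import DefaultAzureCredential
from azure.storage.blob import BlobClient

app = func.FunctionApp()

@app.event_grid_trigger(arg_name="event")
@app.event_hub_output(arg_name="out",
                      event_hub_name="<event-hub>",
                      connection="EventHubConnection")
def blob_to_eventhub(event: func.EventGridEvent, out: func.Out[str]) -> None:
    # The BlobCreated event carries the blob URL, not the blob's content.
    blob_url = event.get_json()["url"]

    # Read the new blob; transform the content here if needed, then forward.
    blob = BlobClient.from_blob_url(blob_url, credential=DefaultAzureCredential())
    out.set(blob.download_blob().readall().decode("utf-8"))
```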
Option 3: Use Azure Stream Analytics to stream data from Blob Storage to Event Hub.
- Input: Blob Storage (note: input formats are limited, e.g., CSV or JSON)
- Query: Transform or filter data
- Output: Azure Event Hub
⚠️ Parquet files aren’t natively supported, so you may need preprocessing.
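If your MGDC dataset lands as Parquet, a small conversion step can produce JSON Lines that Stream Analytics can read. A minimal preprocessing sketch, assuming pandas with pyarrow installed and hypothetical file names:

```python
# Convert an MGDC Parquet file to JSON Lines for Stream Analytics ingestion.
# File names are hypothetical; requires pandas and pyarrow.
import pandas as pd

df = pd.read_parquet("mgdc_dataset.parquet")
df.to_json("mgdc_dataset.json", orient="records", lines=True)
```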
Option 4: Set up Event Grid to monitor blob changes and route events to Event Hub.
- Create an Event Grid subscription on your storage account
- Set the endpoint type to Event Hub
- Configure filters and routing rules
This is useful for event-driven architectures where you want to react to blob changes. Note that Event Grid delivers only the event notification (blob metadata such as the URL), not the blob contents, so the Event Hub consumer must fetch the blob itself if it needs the data.
🔗 Example setup: Azure Event Hub listening to blob changes
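Here is a sketch of creating that subscription programmatically with the azure-mgmt-eventgrid package; the resource IDs, names, and container filter below are placeholders, and the same setup can be done in the portal or with the Azure CLI.

```python
# Hedged sketch using azure-mgmt-eventgrid; resource IDs and names below are
# placeholders. Subscribes an Event Hub to BlobCreated events on the account.
from azure.identity import DefaultAzureCredential
from azure.mgmt.eventgrid import EventGridManagementClient
from azure.mgmt.eventgrid.models import (
    EventSubscription,
    EventSubscriptionFilter,
    EventHubEventSubscriptionDestination,
)

SUBSCRIPTION_ID = "<azure-subscription-id>"
STORAGE_ID = ("/subscriptions/<azure-subscription-id>/resourceGroups/<rg>"
              "/providers/Microsoft.Storage/storageAccounts/<mgdc-account>")
EVENT_HUB_ID = ("/subscriptions/<azure-subscription-id>/resourceGroups/<rg>"
                "/providers/Microsoft.EventHub/namespaces/<ns>/eventhubs/<hub>")

client = EventGridManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
client.event_subscriptions.begin_create_or_update(
    scope=STORAGE_ID,
    event_subscription_name="mgdc-blob-to-eventhub",
    event_subscription_info=EventSubscription(
        destination=EventHubEventSubscriptionDestination(resource_id=EVENT_HUB_ID),
        filter=EventSubscriptionFilter(
            included_event_types=["Microsoft.Storage.BlobCreated"],
            # Only react to blobs in one (placeholder) container:
            subject_begins_with="/blobServices/default/containers/mgdc-output/",
        ),
    ),
).result()
```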