# Azure Event Grid
## JSON Structure: A JSON schema language you'll love
We talk to many customers moving structured data through queues, event streams, and topics, and we see a strong desire to create more efficient and less brittle communication paths governed by rich data definitions well understood by all parties. Those definitions are most often shared as schema documents. The need is real, but the available schema options and their related tool chains often fall short. JSON Schema is popular for its relative simplicity in trivial cases, but it quickly becomes unmanageable as users employ more complex constructs. The industry has largely settled on "Draft 7," with subsequent releases seeing weak adoption. There is substantial frustration among developers who try to use JSON Schema for code generation or database mapping—scenarios it was never designed for. JSON Schema is a powerful document validation tool, but it is not a data definition language. We believe it is effectively un-toolable for anything beyond pure validation; practically all available code-generation tools agree by failing at various degrees of complexity. Avro and Protobuf schemas are better for code generation, but they are tightly coupled to their respective serialization frameworks.

For our own work in Microsoft Fabric, we're initially leaning on an Avro-compatible schema with a small set of modifications, but we ultimately need a richer type definition language that ideally builds on people's familiarity with JSON Schema. This isn't just a Microsoft problem; it's an industry-wide gap. That's why we've submitted JSON Structure as a set of Internet Drafts to the IETF, aiming for formal standardization as an RFC. We want a vendor-neutral, standards-track schema language that the entire industry can adopt.

### What Is JSON Structure?

JSON Structure is a modern, strictly typed data definition language that describes JSON-encoded data such that mapping to and from programming languages and databases becomes straightforward. It looks familiar—if you've written `"type": "object", "properties": {...}` before, you'll feel right at home. But there's a key difference: JSON Structure is designed for code generation and data interchange first, with validation as an optional layer rather than the core concern. This means you get:

- **Precise numeric types**: `int32`, `int64`, `decimal` with precision and scale, `float`, `double`
- **Rich date/time support**: `date`, `time`, `datetime`, `duration`—all with clear semantics
- **Extended compound types**: Beyond objects and arrays, you get `set`, `map`, `tuple`, and `choice` (discriminated unions)
- **Namespaces and modular imports**: Organize your schemas like code
- **Currency and unit annotations**: Mark a `decimal` as USD or a `double` as kilograms

Here's a compact example that showcases these features. We start with the schema header and the object definition:

```json
{
  "$schema": "https://json-structure.org/meta/extended/v0/#",
  "$id": "https://example.com/schemas/OrderEvent.json",
  "name": "OrderEvent",
  "type": "object",
  "properties": {
```

Objects require a `name` for clean code generation. The `$schema` points to the JSON Structure meta-schema, and the `$id` provides a unique identifier for the schema itself. Now let's define the first few properties—identifiers and a timestamp:

```json
    "orderId": { "type": "uuid" },
    "customerId": { "type": "uuid" },
    "timestamp": { "type": "datetime" },
```

The native `uuid` type maps directly to `Guid` in .NET, `UUID` in Java, and `uuid` in Python. The `datetime` type uses RFC 3339 encoding and becomes `DateTimeOffset` in .NET, `datetime` in Python, or `Date` in JavaScript. No format strings, no guessing.
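To make those mappings concrete, here is a rough sketch of the kind of Python class a code generator might emit for the properties defined so far. The class layout, field names, and `from_json` helper are illustrative assumptions for this post, not the output of any particular JSON Structure SDK.

```python
# Illustrative sketch only: what generated Python code for the OrderEvent
# header fields *might* look like. Not produced by any specific tool.
import uuid
from dataclasses import dataclass
from datetime import datetime


@dataclass
class OrderEvent:
    # "type": "uuid" maps naturally to Python's uuid.UUID
    order_id: uuid.UUID
    customer_id: uuid.UUID
    # "type": "datetime" (RFC 3339) maps to a timezone-aware datetime
    timestamp: datetime

    @classmethod
    def from_json(cls, data: dict) -> "OrderEvent":
        return cls(
            order_id=uuid.UUID(data["orderId"]),
            customer_id=uuid.UUID(data["customerId"]),
            # fromisoformat accepts the trailing "Z" offset on Python 3.11+
            timestamp=datetime.fromisoformat(data["timestamp"]),
        )
```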
Next comes the order status, modeled as a discriminated union:

```json
    "status": {
      "type": "choice",
      "choices": {
        "pending": { "type": "null" },
        "shipped": {
          "type": "object",
          "name": "ShippedInfo",
          "properties": {
            "carrier": { "type": "string" },
            "trackingId": { "type": "string" }
          }
        },
        "delivered": {
          "type": "object",
          "name": "DeliveredInfo",
          "properties": {
            "signedBy": { "type": "string" }
          }
        }
      }
    },
```

The `choice` type is a discriminated union with typed payloads per case. Each variant can carry its own structured data—`shipped` includes carrier and tracking information, `delivered` captures who signed for the package, and `pending` carries no payload at all. This maps to enums with associated values in Swift, sealed classes in Kotlin, or tagged unions in Rust.

For monetary values, we use precise decimals:

```json
    "total": { "type": "decimal", "precision": 12, "scale": 2 },
    "currency": { "type": "string", "maxLength": 3 },
```

The `decimal` type with explicit precision and scale ensures exact monetary math—no floating-point surprises. A precision of 12 with scale 2 gives you up to 10 digits before the decimal point and exactly 2 after.

Line items use an array of tuples for compact, positional data:

```json
    "items": {
      "type": "array",
      "items": {
        "type": "tuple",
        "properties": {
          "sku": { "type": "string" },
          "quantity": { "type": "int32" },
          "unitPrice": { "type": "decimal", "precision": 10, "scale": 2 }
        },
        "tuple": ["sku", "quantity", "unitPrice"],
        "required": ["sku", "quantity", "unitPrice"]
      }
    },
```

Tuples are fixed-length typed sequences—ideal for time-series data or line items where position matters. The `tuple` array specifies the exact order: SKU at position 0, quantity at 1, unit price at 2. The `int32` type maps to `int` in all mainstream languages.

Finally, we add extensible metadata using `set` and `map` types:

```json
    "tags": { "type": "set", "items": { "type": "string" } },
    "metadata": { "type": "map", "values": { "type": "string" } }
  },
  "required": ["orderId", "customerId", "timestamp", "status", "total", "currency", "items"]
}
```

The `set` type represents unordered, unique elements—perfect for tags. The `map` type provides string keys with typed values, ideal for extensible key-value metadata without polluting the main schema.

Here's what a valid instance of this schema looks like:

```json
{
  "orderId": "f47ac10b-58cc-4372-a567-0e02b2c3d479",
  "customerId": "7c9e6679-7425-40de-944b-e07fc1f90ae7",
  "timestamp": "2025-01-15T14:30:00Z",
  "status": { "shipped": { "carrier": "Litware", "trackingId": "794644790323" } },
  "total": "129.97",
  "currency": "USD",
  "items": [
    ["SKU-1234", 2, "49.99"],
    ["SKU-5678", 1, "29.99"]
  ],
  "tags": ["priority", "gift-wrap"],
  "metadata": { "source": "web", "campaign": "summer-sale" }
}
```

Notice how the `choice` is encoded as an object with a single key indicating the active case—`{"shipped": {...}}`—making it easy to parse and route. Tuples serialize as JSON arrays in the declared order. Decimals are encoded as strings to preserve precision across all platforms.
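As a hedged illustration of how a consumer might handle this encoding, the following Python sketch routes on the single-key `choice` object and restores the string-encoded decimals. The function and variable names are made up for this example; they are not part of any generated or official API.

```python
# Minimal consumer-side sketch (illustrative, not a generated or official API):
# route on the choice encoding and restore exact decimal values.
import json
from decimal import Decimal


def handle_order_event(raw: str) -> None:
    event = json.loads(raw)

    # The choice is an object with exactly one key naming the active case.
    (case, payload), = event["status"].items()
    if case == "shipped":
        print(f"shipped via {payload['carrier']} ({payload['trackingId']})")
    elif case == "delivered":
        print(f"delivered, signed by {payload['signedBy']}")
    else:  # "pending" carries no payload
        print("order pending")

    # Decimals arrive as strings; Decimal preserves exact monetary values.
    total = Decimal(event["total"])

    # Tuples serialize as arrays in declared order: [sku, quantity, unitPrice].
    line_sum = sum(Decimal(price) * qty for _sku, qty, price in event["items"])
    assert line_sum == total, "line items should add up to the order total"
```

With the example instance above, the line-item sum (2 × 49.99 + 1 × 29.99) equals the order total of 129.97 exactly, which is precisely the kind of check that floating-point types make unreliable.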
### Why Does This Matter for Messaging?

When you're pushing events through Service Bus, Event Hubs, or Event Grid, schema clarity is everything. Your producers and consumers often live in different codebases, different languages, different teams. A schema that generates clean C# classes, clean Python dataclasses, and clean TypeScript interfaces—from the same source—is not a luxury. It's a requirement. JSON Structure's type system was designed with this polyglot reality in mind. The extended primitive types map directly to what languages actually have. A `datetime` is a `DateTimeOffset` in .NET, a `datetime` in Python, a `Date` in JavaScript. No more guessing whether that "string with format date-time" will parse correctly on the other side.

### SDKs Available Now

We've built SDKs for the languages you're using today: TypeScript, Python, .NET, Java, Go, Rust, Ruby, Perl, PHP, Swift, and C. All SDKs validate both schemas and instances against schemas. A VS Code extension provides IntelliSense and inline diagnostics.

### Code and Schema Generation with Structurize

Beyond validation, you often need to generate code or database schemas from your type definitions. The Structurize tool converts JSON Structure schemas into SQL DDL for various database dialects, as well as self-serializing classes for multiple programming languages. It can also convert between JSON Structure and other schema formats like Avro, Protobuf, and JSON Schema. Here's a simple example: a postal address schema, followed by the SQL Server table definition generated by running `structurize struct2sql postaladdress.json --dialect sqlserver`:

```json
{
  "$schema": "https://json-structure.org/meta/extended/v0/#",
  "$id": "https://example.com/schemas/PostalAddress.json",
  "name": "PostalAddress",
  "description": "A postal address for shipping or billing",
  "type": "object",
  "properties": {
    "id": { "type": "uuid", "description": "Unique identifier for the address" },
    "street": { "type": "string", "description": "Street address with house number" },
    "city": { "type": "string", "description": "City or municipality" },
    "state": { "type": "string", "description": "State, province, or region" },
    "postalCode": { "type": "string", "description": "ZIP or postal code" },
    "country": { "type": "string", "description": "ISO 3166-1 alpha-2 country code" },
    "createdAt": { "type": "datetime", "description": "When the address was created" }
  },
  "required": ["id", "street", "city", "postalCode", "country"]
}
```

```sql
CREATE TABLE [PostalAddress] (
    [id] UNIQUEIDENTIFIER,
    [street] NVARCHAR(200),
    [city] NVARCHAR(100),
    [state] NVARCHAR(50),
    [postalCode] NVARCHAR(20),
    [country] NVARCHAR(2),
    [createdAt] DATETIME2,
    PRIMARY KEY ([id], [street], [city], [postalCode], [country])
);
EXEC sp_addextendedproperty 'MS_Description', 'A postal address for shipping or billing',
    'SCHEMA', 'dbo', 'TABLE', 'PostalAddress';
EXEC sp_addextendedproperty 'MS_Description', 'Unique identifier for the address',
    'SCHEMA', 'dbo', 'TABLE', 'PostalAddress', 'COLUMN', 'id';
EXEC sp_addextendedproperty 'MS_Description', 'Street address with house number',
    'SCHEMA', 'dbo', 'TABLE', 'PostalAddress', 'COLUMN', 'street';
-- ... additional column descriptions
```

The `uuid` type maps to `UNIQUEIDENTIFIER`, `datetime` becomes `DATETIME2`, and the schema's `description` fields are preserved as SQL Server extended properties. The tool supports PostgreSQL, MySQL, SQLite, and other dialects as well. Note that all this code is provided "as-is" and is in a "draft" state, just like the specification set. Feel encouraged to provide feedback and ideas in the GitHub repos for the specifications and SDKs at https://github.com/json-structure/
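To show how data described by the schema flows into the generated table, here is a small, hedged sketch using pyodbc. The connection string, driver name, and server details are placeholders you would replace with your own; the point is simply that the schema's `uuid` and `datetime` values land cleanly in the generated `UNIQUEIDENTIFIER` and `DATETIME2` columns.

```python
# Hypothetical usage: insert a PostalAddress instance into the table generated
# by Structurize. All connection details below are placeholder assumptions.
import uuid
from datetime import datetime, timezone

import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=your-server.database.windows.net;DATABASE=your-db;"
    "UID=your-user;PWD=your-password"
)

address = {
    "id": str(uuid.uuid4()),                  # uuid -> UNIQUEIDENTIFIER
    "street": "1 Microsoft Way",
    "city": "Redmond",
    "state": "WA",
    "postalCode": "98052",
    "country": "US",                          # ISO 3166-1 alpha-2
    "createdAt": datetime.now(timezone.utc),  # datetime -> DATETIME2
}

cur = conn.cursor()
cur.execute(
    "INSERT INTO PostalAddress (id, street, city, state, postalCode, country, createdAt) "
    "VALUES (?, ?, ?, ?, ?, ?, ?)",
    address["id"], address["street"], address["city"], address["state"],
    address["postalCode"], address["country"], address["createdAt"],
)
conn.commit()
```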
### Learn More

We've submitted JSON Structure as a set of Internet Drafts to the IETF, aiming for formal standardization as an RFC. This is an industry-wide issue, and we believe the solution needs to be a vendor-neutral standard. You can track the drafts at the IETF Datatracker.

- Main site: json-structure.org
- Primer: JSON Structure Primer
- Core specification: JSON Structure Core
- Extensions: Import | Validation | Alternate Names | Units | Composition
- IETF Drafts: IETF Datatracker
- GitHub: github.com/json-structure
## What's New in Azure Event Grid

Azure Event Grid continues to evolve with new capabilities in General Availability and Public Preview, intended to help you enhance performance, security, and interoperability in modern event-driven systems. These enhancements strengthen Azure Event Grid's foundational messaging layer for distributed systems—supporting real-time telemetry, automation, and hybrid workloads. With support for advanced MQTT features and flexible authentication models, Event Grid now offers:

- Stronger security posture through OAuth 2.0 and Custom Webhook authentication
- Operational efficiency with static client identifiers and message retention
- Broader integration across devices, applications, and cloud services

### General Availability Features

#### MQTT OAuth 2.0 Authentication

Authenticate MQTT clients using JSON Web Tokens (JWTs) issued by any OpenID Connect (OIDC)-compliant identity provider. This enables seamless integration with Microsoft Entra ID (formerly Azure AD), custom identity platforms, or third-party IAM solutions.

#### MQTT Custom Webhook Authentication

Use external webhooks or Azure Functions to dynamically validate client connections. Authorize using shared access signatures (SAS), API keys, custom credentials, or X.509 certificate fingerprints. This is a powerful feature for scenarios requiring granular control across large, dynamic fleets of devices or multitenant environments.

#### MQTT Assigned Client Identifiers

Assign deterministic, pre-approved client identifiers to MQTT clients. This enables enhanced session continuity, better device tracking, simplified diagnostics, and improved auditability—critical for managing long-lived connections and operational visibility in regulated industries.

#### First-Class Integration with Fabric

Route MQTT messages and CloudEvents from an Event Grid namespace to Fabric Event Streams for real-time analytics, storage, and visualization of IoT data without having to hop through Event Hubs.

### Public Preview Features

#### HTTP Publish

Bridge traditional HTTP-based applications into event-driven ecosystems by allowing HTTP clients to publish messages directly to Event Grid topics. This enables RESTful services, legacy systems, and webhooks to participate in real-time event workflows, complementing MQTT and cloud-native integrations.

#### MQTT Retain Support

Support for retained MQTT messages allows clients to receive the latest value on a topic immediately upon subscription—without waiting for the next publish. This is particularly useful in IoT telemetry scenarios, stateful dashboards, and device shadow synchronization. Retained messages are stored per topic with configurable expiry and can be cleared on demand.
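As a rough sketch of the retain flow, here is what publishing a retained reading to the Event Grid MQTT broker might look like with the paho-mqtt client. The hostname, topic, and certificate paths are placeholder assumptions, and the client identity is assumed to already be registered in the namespace.

```python
# Illustrative sketch, not an official sample: publish a retained telemetry
# value so late subscribers immediately receive the last known reading.
# Hostname, certificate paths, and topic below are placeholder assumptions.
import json
import ssl

import paho.mqtt.client as mqtt

BROKER = "your-namespace.westus2-1.ts.eventgrid.azure.net"  # placeholder
TOPIC = "factory/line1/temperature"

client = mqtt.Client(
    mqtt.CallbackAPIVersion.VERSION2,
    client_id="edge-gateway-01",          # assumed to match a registered client
    protocol=mqtt.MQTTv5,
)
# The broker requires TLS; certificate-based authentication is assumed here.
client.tls_set(
    certfile="edge-gateway-01.pem",
    keyfile="edge-gateway-01.key",
    tls_version=ssl.PROTOCOL_TLS_CLIENT,
)
client.username_pw_set(username="edge-gateway-01")  # assumed authentication name

client.connect(BROKER, port=8883)
client.loop_start()

payload = json.dumps({"celsius": 71.3})
# retain=True stores the message on the topic; new subscribers receive it
# immediately on subscribe, without waiting for the next publish.
client.publish(TOPIC, payload, qos=1, retain=True).wait_for_publish()

client.loop_stop()
client.disconnect()
```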
### Unlocking Smart Factory Insights with Sparkplug B on Azure Event Grid MQTT Broker

In the age of Industry 4.0, factories are becoming smarter, more connected, and increasingly data driven. A key enabler of this transformation is Sparkplug B, an MQTT-based protocol purpose-built for industrial IoT (IIoT). And now, with Azure Event Grid MQTT Broker, Sparkplug B comes to life in the cloud—securely, reliably, and at scale.

#### What is Sparkplug B?

Think of Sparkplug B as the common language for industrial devices. It defines how sensors, gateways, and SCADA systems talk to each other—sharing not just telemetry data (like temperature or RPM) but also device lifecycle information such as when a machine comes online (BIRTH) or goes offline (DEATH).

#### Why it Matters for Manufacturers

- Real-time factory monitoring: view live machine vitals across distributed plants.
- Predictive maintenance: anticipate failures by analyzing trends.
- Seamless SCADA integration: auto-discover tags in systems like Ignition SCADA with Cirrus Link.
- Edge-to-cloud bridge: bring legacy factory systems into Azure for analytics, AI, and automation.

#### Azure Event Grid MQTT Broker + Sparkplug B

With Azure Event Grid MQTT Broker, manufacturers can run Sparkplug B workloads with enterprise-grade reliability: a connected factory floor where insights flow seamlessly from edge devices to the cloud, unlocking efficiency, uptime, and innovation through the following capabilities:

- QoS 1 for reliability (at-least-once delivery).
- Last Will & Testament (LWT) for real-time device state awareness.
- Retained messages to ensure new subscribers always see the last known good value.
- Native support for binary Sparkplug payloads over secure TLS.

#### From Factory Floor to Cloud Insights

Sensors measure machine temperature and RPM. Edge gateways publish Sparkplug B messages to Azure Event Grid MQTT Broker. Ignition SCADA with Chariot Cirrus Link auto-discovers and displays these tags. Azure Data Explorer or Fabric ingests the same data for real-time dashboards, predictive analytics, or automated alerts.

#### Ready to Get Started?

Learn more about Azure Event Grid MQTT Broker Sparkplug B Support.

## Announcing new features and updates in Azure Event Grid
Discover powerful new features in Azure Event Grid that enhance its functionality and user experience. This fully managed event broker now supports multi-protocol interoperability, including MQTT, for scalable messaging. It seamlessly connects Microsoft-native and third-party services, enabling robust event-driven applications, and streamlines event management with flexible push-pull communication patterns.

We are thrilled to announce General Availability of cross-tenant delivery to Event Hubs, Service Bus, Storage Queues, and dead-letter storage using managed identity with federated identity credentials (FIC) from Azure Event Grid topics, domains, system topics, and partner topics. New cross-tenant scenarios, currently in Public Preview, enable delivery to Event Hubs, webhooks, and dead-letter storage in Azure Event Grid namespaces. This includes system topics, partner topics, and domains, offering seamless integration. The update enhances flexibility for event-driven applications across tenants. Azure Event Grid now also offers managed identity support for webhook delivery for all of its resources. Public Preview features for new cross-tenant scenarios and managed identity support for webhook delivery are currently available in West Central, West Europe, UK South, and Central US, and more regions will be supported soon.

We are also introducing the Public Preview of Network Security Perimeter (NSP) support in Azure Event Grid topics and domains, for inbound and outbound communication. This perimeter defines a boundary with implicit trust access between the resources within it, where you can define sets of inbound and outbound access rules. By incorporating these advanced security measures, Azure Event Grid strengthens the defense against a wide range of cyber threats, helping organizations safeguard their event-driven workloads.

In addition, Azure Event Grid has introduced message ordering support within single MQTT client sessions, ensuring reliable sequential event delivery, and a connection rate limit of one attempt per second per session, which maintains system stability. Furthermore, the expansion to support up to 15 MQTT topic segments per topic or filter offers greater flexibility in topic hierarchies. High-throughput messaging, supporting up to 1,000 messages per second per session, is now in Public Preview, making it ideal for demanding scenarios such as IoT telemetry and real-time analytics.

Azure Event Grid now also offers OAuth 2.0 JWT authentication for MQTT clients in Public Preview. This feature enables secure client authentication via JSON Web Tokens (JWTs) issued by OpenID Connect (OIDC)-compliant providers, providing a lightweight, secure, and flexible authentication option for clients not provisioned in Azure. Additionally, Custom Webhook Authentication has been introduced, allowing dynamic client authentication through webhooks or Azure Functions, with Entra ID JWT validation for centralized and customizable strategies. Finally, Assigned Client Identifiers in Public Preview provide consistent client IDs, improving session management and operational control, further enhancing the scalability and flexibility of client authentication workflows.
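As a hedged sketch of the JWT flow for MQTT clients, here is roughly what an MQTT v5 connect with token-based enhanced authentication could look like using paho-mqtt. The authentication method name, hostname, and token acquisition shown here are assumptions to verify against the Event Grid documentation for your namespace.

```python
# Illustrative sketch: connect to an Event Grid MQTT broker with a JWT via
# MQTT v5 enhanced authentication. The "OAUTH2-JWT" method name, hostname,
# and topic below are assumptions, not verified values.
import ssl

import paho.mqtt.client as mqtt
from paho.mqtt.packettypes import PacketTypes
from paho.mqtt.properties import Properties

BROKER = "your-namespace.westus2-1.ts.eventgrid.azure.net"  # placeholder
TOKEN = "<JWT issued by your OIDC-compliant identity provider>"

client = mqtt.Client(
    mqtt.CallbackAPIVersion.VERSION2,
    client_id="telemetry-app-01",
    protocol=mqtt.MQTTv5,
)
client.tls_set(tls_version=ssl.PROTOCOL_TLS_CLIENT)


def on_connect(client, userdata, flags, reason_code, properties):
    # Subscribe only after the broker has accepted the token.
    client.subscribe("devices/+/telemetry", qos=1)


client.on_connect = on_connect

# MQTT v5 carries the token in the CONNECT packet's enhanced-auth fields.
connect_props = Properties(PacketTypes.CONNECT)
connect_props.AuthenticationMethod = "OAUTH2-JWT"        # assumed method name
connect_props.AuthenticationData = TOKEN.encode("utf-8")

client.connect(BROKER, port=8883, properties=connect_props)
client.loop_forever()
```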
We believe these updates will greatly enhance your Azure Event Grid experience. We welcome your feedback and appreciate your ongoing partnership as we work to deliver top features and services.

## Announcing new features and updates in Azure Event Grid

We are excited to share several new updates and features in Azure Event Grid that enhance our service's capabilities and improve your experience. In this article, you will find more information about the General Availability of webhook endpoints and custom domain names in Azure Event Grid, as well as the introduction of new Public Previews such as cross-tenant delivery and namespace topic to namespace topic forwarding support.

Azure Event Grid is a highly integrated event broker designed to simplify the development of event-driven applications. It features pub-sub scenarios with a rich variety of event sources and handlers, including first-party and third-party integrations. Its flexible design supports multi-protocol interoperability, push and pull delivery, as well as MQTT, allowing for diverse message consumption patterns. Below, we detail the latest additions and improvements now available.

We are pleased to announce General Availability of webhook endpoints in Azure Event Grid namespace topics. This feature allows for efficient and reliable push delivery to webhooks, expanding the possibilities for event-driven architectures and integrations.

We are also excited to announce that custom domain name support is now Generally Available in Azure Event Grid's MQTT broker. This new feature allows you to assign your own domain names to the MQTT and HTTP endpoints within your Azure Event Grid namespaces. By doing so, you can enhance security and simplify client configuration. Additionally, assigning custom domain names to namespaces can help improve availability, manage capacity, and facilitate cross-region client mobility.

We are also excited to announce the General Availability of Microsoft Graph API events, which provide notifications about state changes of resources in Microsoft Outlook, Teams, SharePoint, Microsoft Entra ID, Microsoft Conversations, and security alerts.

In Public Preview, we are introducing support for cross-tenant delivery to Event Hubs, Service Bus, and Storage Queues using managed identity with federated identity credentials (FIC) in Azure Event Grid topics, domains, system topics, and partner topics. This enhancement enables secure and efficient cross-tenant communication for basic resources.

Lastly, we are launching the Public Preview of namespace topic to namespace topic forwarding, enabling seamless event forwarding between topics hosted in the same or different namespaces and simplifying event routing and management.

We are confident that these updates will provide significant benefits and improvements to your Azure Event Grid experience. We look forward to your feedback and continued partnership as we strive to deliver the best possible features and services.

## Announcing public preview of MQTT protocol and pull message delivery in Azure Event Grid
Azure Event Grid now supports the MQTT protocol for bi-directional communication between IoT devices and cloud applications, as well as pull delivery of messages on custom topics for flexible messaging at high scale.
## Announcing MQTT Last Will and Testament Public Preview in Azure Event Grid
Announcing the MQTT Last Will and Testament (LWT) Public Preview in Azure Event Grid's MQTT broker capability. LWT enables your MQTT clients to be notified of abrupt disconnections of other MQTT clients.
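To show the shape of the feature, here is a hedged paho-mqtt sketch of registering a Last Will message: if this client disconnects abruptly, the broker publishes the will to subscribers of the status topic. The hostname, topics, and certificate-based authentication are placeholder assumptions.

```python
# Illustrative sketch: register a Last Will and Testament (LWT) message.
# If this client disconnects abruptly, the broker publishes the will so other
# clients learn about the loss of connectivity. Hostname, topics, and
# certificate-based auth below are placeholder assumptions.
import json
import ssl

import paho.mqtt.client as mqtt

BROKER = "your-namespace.westus2-1.ts.eventgrid.azure.net"  # placeholder

client = mqtt.Client(
    mqtt.CallbackAPIVersion.VERSION2,
    client_id="pump-station-07",
    protocol=mqtt.MQTTv5,
)

# The will must be set *before* connecting; it travels in the CONNECT packet.
client.will_set(
    topic="devices/pump-station-07/status",
    payload=json.dumps({"state": "offline", "reason": "abrupt-disconnect"}),
    qos=1,
)

client.tls_set(
    certfile="pump-station-07.pem",
    keyfile="pump-station-07.key",
    tls_version=ssl.PROTOCOL_TLS_CLIENT,
)
client.username_pw_set(username="pump-station-07")  # assumed authentication name
client.connect(BROKER, port=8883)

# Clients subscribed to the status topic receive the will message if this
# session ends without a clean DISCONNECT.
client.loop_forever()
```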