# Azure Managed Redis
## Using MCP Server with Azure Managed Redis in VS Code
### Overview

Have you ever wanted to ask VS Code to generate test data and upload it to Redis automatically, without writing any code? Wouldn't it be nice to code and manage resources all in one place in VS Code? This blog walks you through setting up VS Code to connect to the Redis MCP Server and an Azure Managed Redis instance to accomplish exactly that. By the end of this blog, you will be able to run VS Code in agent mode and tell it, in plain English, to generate customer test data; the agent will use the MCP server to write that data into Redis.

### What is MCP and Redis MCP?

Model Context Protocol (MCP) is an open standard that lets AI models use external tools and services through a unified interface, enabling seamless integration and standardized interactions. VS Code agent mode lets VS Code act as an agent, connecting directly to MCP servers such as Redis MCP. This enables seamless integration between your development environment and Redis-powered context management, making it easy to automate tasks, manage data, and interact with Redis from within VS Code.

This article provides a step-by-step guide to setting up a Redis MCP server connected to Azure Managed Redis using agent mode in Visual Studio Code.

### Quick Start: Step-by-Step Instructions

**Prerequisites**

- Create an Azure Managed Redis (AMR) instance
  - Create an Azure Managed Redis in the Azure Portal
- Download or clone redis/mcp-redis
- Start VS Code in agent mode with MCP servers enabled
  - Download and install VS Code
  - Use agent mode in VS Code
  - Enable MCP support in VS Code
- (Optional) Add AMR to Redis Insight to view data in Azure Managed Redis
  - Download Redis Insight

### Set up the Redis MCP server in VS Code

The following instructions are adapted from Use MCP Servers in VS Code to describe how to use the Redis MCP server. The example uses a Windows 11 development machine.

1. Set up a new workspace in VS Code for your project.

2. Add `.vscode/mcp.json`. Create a `.vscode/mcp.json` file in your workspace and copy in the required configuration for the MCP server connection. Below is an example. Notice that the `REDIS_SSL` setting needs to be added for Azure Managed Redis if you followed the recommended security options when creating it.

   ```json
   {
     "servers": {
       "redis": {
         "type": "stdio",
         "command": "C:\\Users\\user1\\.local\\bin\\uv.exe",
         "args": [
           "--directory",
           "C:\\Users\\user1\\source\\repos\\mcp-redis",
           "run",
           "src/main.py"
         ],
         "env": {
           "REDIS_HOST": "yourredisname.yourredisregion.redis.azure.net",
           "REDIS_PORT": "10000",
           "REDIS_USERNAME": "default",
           "REDIS_PWD": "<replace_with_your_access_key_token>",
           "REDIS_SSL": "true"
         }
       }
     }
   }
   ```

3. Add `settings.json`. Update your workspace's `settings.json` to allow the models you want to use. For example, if you want to use `copilot-swe` and `copilot/claude-sonnet-4` with the Redis MCP server, the file would look like:

   ```json
   {
     "chat.mcp.serverSampling": {
       "MCPWorkspace/.vscode/mcp.json: redis": {
         "allowedModels": [
           "copilot/copilot-swe",
           "copilot/claude-sonnet-4"
         ]
       }
     }
   }
   ```

4. Start the MCP server. In the Command Palette (Ctrl+Shift+P), type `>MCP: List Servers`, select Redis, then select Start Server.

5. Open the Copilot Chat window. In VS Code, open the agent chat window (Ctrl+Shift+I). On the bottom menu of the Chat window, ensure Agent mode is selected (among Agent, Ask, and Edit).

6. Test the MCP server by adding test data to Redis. Type:

   > generate 10 customer information with an integer ID as the key and hashset with properties like First Name, Last Name, Phone number, Address. Then put each of the 10 customer entries in the Redis

   See the output calling the MCP tool.
7. Note that MCP might prompt for permission to execute. Select 'Allow for this session' from the dropdown menu.

8. Optionally, verify that the key-value pairs have been added to the Azure Managed Redis instance through the Redis Insight tool (or from code, as sketched below).

This workflow makes it fast and intuitive to leverage Redis MCP with Azure Managed Redis, using VS Code's agent mode for powerful, context-driven development. You'll be able to automate Redis operations, manage context, and interact with agents—all from your familiar VS Code environment.
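If you prefer to check from code instead of Redis Insight, a short redis-py script can list the hashes the agent created. This is a minimal sketch rather than part of the original walkthrough: the host, credentials, and key layout below are assumptions that mirror the mcp.json example above, and the agent may have chosen different key names.

```python
import redis

# Connection values mirror the mcp.json example above (placeholders, not real credentials).
r = redis.Redis(
    host="yourredisname.yourredisregion.redis.azure.net",
    port=10000,
    username="default",
    password="<replace_with_your_access_key_token>",
    ssl=True,
    decode_responses=True,
)

# Assumes the agent stored each customer as a hash keyed by an integer ID.
for key in r.scan_iter(count=100):
    if r.type(key) == "hash":
        print(key, r.hgetall(key))
```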
## Orchestrate multi-LLM workflows with Azure Managed Redis

Authors: Roberto Perez, George von Bülow & Roy de Milde

### Key challenge for building effective LLMs

In the age of generative AI, large language models (LLMs) are reshaping how we build applications — from chatbots to intelligent agents and beyond. But as these systems become more dynamic and multi-modal, one key challenge stands out: how do we route requests efficiently to the right model, prompt, or action at the right time? Traditional architectures struggle with the speed and precision required to orchestrate LLM calls in real time, especially at scale. This is where Azure Managed Redis steps in — acting as a fast, in-memory data layer to power smart, context-aware routing for LLMs. In this blog, we explore how Redis and Azure are enabling developers to build AI systems that respond faster, think smarter, and scale effortlessly.

Across industries, customers are hitting real limitations. AI workloads often need to track context across multiple interactions, store intermediate decisions, and switch between different prompts or models based on user intent — all while staying responsive. But stitching this logic together using traditional databases or microservice queues introduces latency, complexity, and cost. Teams face challenges like keeping routing logic fast and adaptive, storing transient LLM state without bloating backend services, and coordinating agent-like behaviors across multiple components. These are exactly the pain points Azure Managed Redis was built to address — giving developers a low-latency, highly available foundation for real-time AI orchestration and more.

### How to use Azure Managed Redis as a Semantic Router

Semantic routing uses AI to route user queries to the right service, model, or endpoint based on their intent and context. Unlike rule-based systems, it leverages generative AI to understand the meaning behind requests, enabling more accurate and efficient decisions. Importantly, the semantic router itself does not forward the query — it only selects the appropriate route. Your application is responsible for taking that routing decision and sending the query to the correct agent, model, or human.

1. The user sends a query, which is passed to the system for processing.
2. The query is analyzed by an embedding model to understand its semantic intent and context.
3. The semantic router evaluates the user's intent and context to choose the optimal route:
   - A specific model for further processing
   - An agent to handle the query
   - A default response, if applicable
   - Escalation to a human for manual handling, if needed
4. Valid queries go through the RAG pipeline to generate a response.
5. The final response is sent back to the user.

### Code examples and architecture

**Example: Jupyter Notebook with Semantic Router**

Let's look at a Jupyter Notebook example that implements a simple Semantic Router with Azure Managed Redis and the Redis Vector Library. First, we install the required Python packages and define a connection to an AMR instance:

```shell
pip install -q "redisvl>=0.6.0" sentence-transformers dotenv
```

Next, define the Azure Managed Redis connection:
```python
import os
import warnings

warnings.filterwarnings("ignore")

from dotenv import load_dotenv

load_dotenv()

REDIS_HOST = os.getenv("REDIS_HOST")          # ex: "gvb-sm.uksouth.redis.azure.net"
REDIS_PORT = os.getenv("REDIS_PORT")          # for AMR this is always 10000
REDIS_PASSWORD = os.getenv("REDIS_PASSWORD")  # ex: "giMzOzIP4YmjNBGCfmqpgA7e749d6GyIHAzCaF5XXXXX"

# If SSL is enabled on the endpoint, use rediss:// as the URL prefix
REDIS_URL = f"redis://:{REDIS_PASSWORD}@{REDIS_HOST}:{REDIS_PORT}"
```

Next, we create our first Semantic Router with an allow/block list:

```python
from redisvl.extensions.router import Route, SemanticRouter
from redisvl.utils.vectorize import HFTextVectorizer

vectorizer = HFTextVectorizer()

# Semantic router
blocked_references = [
    "things about aliens",
    "corporate questions about agile",
    "anything about the S&P 500",
]

blocked_route = Route(name="block_list", references=blocked_references)

block_router = SemanticRouter(
    name="bouncer",
    vectorizer=vectorizer,
    routes=[blocked_route],
    redis_url=REDIS_URL,
    overwrite=False,
)
```

To prevent users from asking certain categories of questions, we can define example references in a list of blocked routes using the Redis Vector Library function SemanticRouter(). While it is also possible to implement blocking at the LLM level through prompt engineering (e.g., instructing the model to refuse to answer certain queries), that approach still requires an LLM call, adding unnecessary cost and latency. By handling blocking earlier with semantic routing in Azure Managed Redis, unwanted queries can be intercepted before ever reaching the model, saving LLM tokens, reducing expenses, and improving overall efficiency.

Let's try it out:

```python
user_query = "Why is agile so important?"

route_match = block_router(user_query)
route_match
```

The router first vectorizes the user query using the specified Hugging Face text vectorizer. It finds a semantic similarity with the route reference "corporate questions about agile" and returns the matching route `block_list`. Note the returned distance value – it indicates the degree of semantic similarity between the user query and the blocked reference. You can fine-tune the Semantic Router by specifying a minimum threshold value that must be reached to count as a match.
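As a hedged aside that is not part of the original notebook: redisvl lets you tune this behavior by giving a Route an explicit distance_threshold (the parameter name and its default may vary by library version). With a cosine-distance vectorizer, smaller distances mean closer matches, so lowering the threshold makes the bouncer stricter. A sketch, reusing the blocked_references list and REDIS_URL defined above:

```python
from redisvl.extensions.router import Route, SemanticRouter
from redisvl.utils.vectorize import HFTextVectorizer

# Stricter route: only queries very close to a blocked reference will match.
strict_route = Route(
    name="block_list",
    references=blocked_references,  # reuses the list defined above
    distance_threshold=0.3,         # assumed knob; tune on your own data
)

strict_router = SemanticRouter(
    name="strict-bouncer",
    vectorizer=HFTextVectorizer(),
    routes=[strict_route],
    redis_url=REDIS_URL,
    overwrite=True,
)

print(strict_router("Why is agile so important?"))
```

Queries that fall outside every route's threshold return no match, so your application can fall through to its default handling.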
For full details and more complex examples, you can explore the Jupyter Notebooks in this GitHub repository.

### How do customers benefit?

For customers, this technology delivers clear and immediate value. By using Azure Managed Redis as the high-performance backbone for semantic routing and agent coordination, organizations can significantly reduce latency, simplify infrastructure, and accelerate time-to-value for AI-driven experiences. Instead of building custom logic spread across multiple services, teams get a centralized, scalable, and fully managed in-memory layer that handles vector search, routing logic, and real-time state management — all with enterprise-grade SLAs, security, and Azure-native integration.

The result? Smarter and faster LLM interactions, reduced operational complexity, and the flexibility to scale AI use cases from prototypes to production without re-architecting. Whether you're building an intelligent chatbot, orchestrating multi-agent workflows, or powering internal copilots, this Redis-backed technology gives you the agility to adapt in real time. You can dynamically route based on user intent, past interactions, or even business rules — all while maintaining the low-latency responses that users expect from modern AI applications. And because it's fully managed on Azure, teams can focus on innovation rather than infrastructure, with built-in support for high availability, monitoring, and enterprise governance. It's a future-proof foundation for AI systems that need to be not just powerful, but precise.

### Try Azure Managed Redis today

If you want to explore how to route large language models efficiently, Azure Managed Redis provides a reliable and low-latency solution. You can learn more about the service on the Azure Managed Redis page and find detailed documentation in the Azure Redis overview. For hands-on experience, check out the routing optimization notebook and other examples in the Redis AI resources repository and GitHub - loriotpiroloriol/amr-semantic-router. Give it a try to see how it fits your LLM routing needs.
## Building faster AI agents with Azure Managed Redis and .NET Aspire

AI is evolving fast—and so are the tools to build intelligent, responsive applications. In our recent Microsoft Reactor session, Catherine Wang (Principal Product Manager at Microsoft) and Roberto Perez (Microsoft MVP and Senior Global Solutions Architect at Redis) shared how Azure Managed Redis helps you create Retrieval-Augmented Generation (RAG) AI agents with exceptional speed and consistency.

### Why RAG agents?

RAG applications combine the power of large language models (LLMs) with your own data to answer questions accurately. For example, a customer support chatbot can deliver precise, pre-approved answers instead of inventing them on the fly. This ensures consistency, reduces risk, and improves customer experience.

### Where Azure Managed Redis fits with agentic scenarios

In this project, Azure Managed Redis is used as a high-performance, in-memory vector database to support agentic Retrieval-Augmented Generation (RAG), enabling fast similarity searches over embeddings to retrieve and ground the LLM with the most relevant known answers (a minimal sketch of this pattern appears at the end of this post). Beyond this, Azure Managed Redis is a versatile platform that supports a range of AI-native use cases, including:

- Semantic Cache – Cache and reuse previous LLM responses based on semantic similarity to reduce latency and improve reliability.
- LLM Memory – Persist recent interactions and context to maintain coherent, multi-turn conversations.
- Agentic Memory – Store long-term agent knowledge, actions, and plans to enable more intelligent and autonomous behavior over time.
- Feature Store – Serve real-time features to machine learning models during inference for personalization and decision-making.

These capabilities make Azure Managed Redis a foundational building block for fast, stateful, and intelligent AI applications.

### Demo highlights

In the session, the team demonstrates how to:

- Deploy a RAG AI agent using .NET Aspire and Azure Container Apps.
- Secure your Redis instance with Microsoft Entra ID, removing the need for connection strings.
- Use Semantic Kernel to orchestrate agents and retrieve knowledge base content via vector search.
- Monitor and debug microservices with built-in observability tools.

Finally, we walk through code examples in C# and Python, demonstrating how you can integrate Redis search, vector similarity, and prompt orchestration into your own apps.

### Get Started

Ready to explore?

- ✅ Watch the full session replay: Building a RAG AI Agent Using Azure Redis
- ✅ Try the sample code: Azure Managed Redis RAG AI Sample
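The session's demo performs retrieval through Semantic Kernel in C#; as a language-neutral illustration of the same grounding pattern, here is a minimal, hedged sketch using the Redis Vector Library (redisvl) in Python. The index name, documents, and connection string are placeholders invented for this example, and it assumes redisvl 0.6 or later.

```python
from redisvl.index import SearchIndex
from redisvl.query import VectorQuery
from redisvl.utils.vectorize import HFTextVectorizer

REDIS_URL = "rediss://:<access-key>@<your-amr-name>.<region>.redis.azure.net:10000"

vectorizer = HFTextVectorizer()

# A tiny knowledge base of pre-approved answers (illustrative only).
docs = [
    "Password resets are handled through the self-service portal.",
    "Refunds are processed within 5 business days.",
]

schema = {
    "index": {"name": "kb", "prefix": "kb:"},
    "fields": [
        {"name": "answer", "type": "text"},
        {
            "name": "embedding",
            "type": "vector",
            "attrs": {
                "dims": vectorizer.dims,
                "algorithm": "hnsw",
                "distance_metric": "cosine",
                "datatype": "float32",
            },
        },
    ],
}

# Create the index and load the documents with their embeddings.
index = SearchIndex.from_dict(schema, redis_url=REDIS_URL)
index.create(overwrite=True)
index.load(
    [{"answer": d, "embedding": vectorizer.embed(d, as_buffer=True)} for d in docs]
)

# Retrieve the closest pre-approved answer to ground the LLM prompt.
query = VectorQuery(
    vector=vectorizer.embed("How long do refunds take?"),
    vector_field_name="embedding",
    return_fields=["answer"],
    num_results=1,
)
print(index.query(query))
```

In the sample itself, the retrieved answer is injected into the agent's prompt so the LLM responds with the approved wording rather than inventing one.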
## Get started with Azure Managed Redis today: a step-by-step guide to deployment

At Microsoft Build 2025, we announced the general availability of Azure Managed Redis, a fully managed, first-party service built in partnership with Redis. Ready for production workloads globally, Azure Managed Redis marks a major milestone for developers looking to build high-performance, real-time applications with the speed and reliability of Redis, fully managed on Azure.

Call to action: get started with Azure Managed Redis in the Azure Portal.

Key updates:

- Up to 15x performance improvements over Azure Cache for Redis
- 99.999% availability with multi-region Active-Active replication
- Support for Redis 7.4 (with Redis 8 coming soon)
- New modules including RedisJSON, vector search, bloom filters, and time series
- Flexible SKUs that let you scale memory and compute independently

### Navigate the new Azure Managed Redis in the Azure Portal

Azure Managed Redis also comes with an updated Azure Portal experience that simplifies how you create, configure, and manage your Redis instances. Whether experimenting or deploying to production, the portal gives you full control with a few clicks.

### Step-by-step guide to deploying in the Azure Portal

Want to see Azure Managed Redis in action? This quick walkthrough video shows how to set up Azure Managed Redis inside the Azure Portal: 👉 Watch on YouTube

In this tutorial, you'll learn:

- How to configure your Active-Active instance for high availability and low latency
- How to set up geo-replication across regions for the 99.999% availability SLA
- Key tips and best practices to get started quickly

No code required — just the Azure Portal and a few minutes of your time!
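If you do want to confirm the new instance from code afterwards, here is a minimal sketch using redis-py with access-key authentication. The hostname and key are placeholders; port 10000 with TLS matches the defaults used elsewhere in these posts.

```python
import redis

# Placeholder values: substitute your instance name, region, and access key.
r = redis.Redis(
    host="<your-amr-name>.<region>.redis.azure.net",
    port=10000,
    ssl=True,
    username="default",
    password="<access-key>",
)

print(r.ping())                  # True if the instance is reachable
r.set("hello", "world", ex=60)   # write a short-lived test key
print(r.get("hello"))            # b'world'
```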
Azure Managed Redis is perfect for cloud architects, developers, and IT pros looking to build resilient, globally available Redis-backed applications on Azure. Whether you're building AI-powered applications, speeding up your web services, or just getting started with Redis, now's the time to explore what Azure Managed Redis can do. To learn more, head to our product page or contact your Microsoft sales representative. To get started, provision Azure Managed Redis in the Azure Portal today.

### Resources

- Azure Managed Redis product page
- Azure Managed Redis pricing page
- Create an Azure Managed Redis instance
- Watch the Microsoft Build 2025 session on AMR
- Explore the documentation

## Azure Managed Redis is now generally available: enterprise-grade performance and flexibility

We are excited to announce the general availability of Azure Managed Redis, a fully managed, first-party service built in partnership with Redis. Designed to meet the needs of modern, cloud-native, and AI-powered applications, Azure Managed Redis is now ready for production workloads globally, offering improved performance, flexible deployment options, and cost-efficient scalability.

Azure Managed Redis serves as both an in-memory key-value datastore for caching and an integrated vector datastore for AI applications, making it a powerful foundation for real-time intelligence. Now available in over 50 Azure regions, Azure Managed Redis enables developers to build fast, intelligent apps backed by one of the world's most trusted real-time data stores — seamlessly integrated into the Azure ecosystem. Note that the 99.999% availability SLA only applies to caches that are using replication, and Azure Managed Redis can be up to 40% more cost-effective than the current Azure Cache for Redis offering.

### A high-performance, enterprise-grade Redis service

Azure Managed Redis delivers up to 15x higher performance compared to Azure Cache for Redis, at a significantly lower price point, making it the most cost-effective managed Redis option available on Azure today. With multi-region Active-Active geo-replication, customers can achieve sub-millisecond latency for users around the globe, all while maintaining a 99.999% SLA.

Designed as a production-grade platform, Azure Managed Redis supports the latest Redis 7.4 features and will soon support the upcoming Redis 8. The Redis 7.4 release also introduces a new non-cluster deployment mode and flexible memory and compute scaling (both in preview), as well as native vector search—enabling Redis to serve as both a high-speed cache and a core component of AI and analytics workloads.

### What's new in general availability?

Azure Managed Redis is built for scale, performance, and adaptability—giving customers full control over their architecture with added flexibility and pricing efficiency. Key enhancements now available at GA include:

- Improved performance at a lower cost: achieve up to 15x throughput compared to previous Redis offerings on Azure.
- Global availability: now supported in over 50 Azure regions across all continents.
- Flexible scaling (in preview): independently scale memory and performance to match dynamic workloads and reduce unnecessary cost.
- Enhanced Azure Portal experience: a revamped interface optimized for quick experimentation, faster provisioning, and streamlined create-and-manage workflows.
- Expanded cluster modes:
  - OSS Clustering – High performance with low latency and app-aware topology changes.
  - Enterprise Clustering – Simpler app integration with a non-clustered interface and support for advanced modules like RediSearch.
  - No Cluster (in preview) – Ideal for OSS Redis compatibility and straightforward migrations.

### Built for modern AI and real-time workloads

As organizations adopt intelligent applications built with generative AI, retrieval-augmented generation (RAG), and autonomous agents, Redis is playing a critical role as a vector-capable, real-time data platform. With Azure Managed Redis, developers can:

- Power advanced scenarios with built-in vector search, secondary indexing, semantic caching, and agent memory.
- Take advantage of multi-model data structures, including JSON, time series, geospatial, and probabilistic types (see the short sketch below).
- Integrate seamlessly with Azure OpenAI Service, Azure Kubernetes Service, and Azure Functions for end-to-end GenAI pipelines.
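As one concrete illustration of the multi-model side, the sketch below stores and queries a JSON document with redis-py. It assumes the RedisJSON module (included with Azure Managed Redis) and uses placeholder connection details and key names.

```python
import redis

# Placeholder connection details for an Azure Managed Redis instance.
r = redis.Redis(
    host="<your-amr-name>.<region>.redis.azure.net",
    port=10000,
    ssl=True,
    password="<access-key>",
)

# Store a JSON document at the root path.
r.json().set(
    "session:42",
    "$",
    {"user": "alice", "recent_queries": ["reset password"], "score": 0.87},
)

# Read back a single field, then append to a nested array in place.
print(r.json().get("session:42", "$.user"))  # ['alice']
r.json().arrappend("session:42", "$.recent_queries", "update billing")
```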
In addition, developers benefit from rich ecosystem tools like the Redis Insight GUI, the Redis Copilot AI assistant, and optimized client libraries for .NET, Python, Java, Go, and Node.js.

### Enterprise-ready reliability and compliance

Azure Managed Redis is designed for mission-critical workloads, offering enterprise-grade reliability, scale, and security from day one:

- Up to 99.999% availability with multi-region Active-Active geo-replicated deployments.
- Sub-millisecond latency through in-memory data storage.
- Support for FedRAMP, HIPAA, PCI DSS, ISO 27001, and other compliance standards.
- Fully integrated identity, access, and monitoring features within Azure.

Organizations can easily migrate from any tier of Azure Cache for Redis and immediately benefit from the latest Redis innovations, previously only available in the Enterprise and Enterprise Flash tiers.

### Flexible tiers for every workload

Azure Managed Redis is available in four service tiers tailored to fit your performance, memory, and cost needs:

| Tier* | Description |
| --- | --- |
| Memory Optimized | High memory-to-vCPU (1:8) ratio, ideal for dev/test and memory-intensive workloads. |
| Balanced | Standard memory-to-vCPU (1:4) ratio for general-purpose caching and app acceleration. |
| Compute Optimized | Low memory-to-vCPU (1:2) ratio for high-throughput, compute-bound scenarios. |
| Flash Optimized (in preview) | Tiered architecture combining RAM and NVMe for large, cost-effective caches. |

*General availability applies only to SKUs up to 120 GB; SKUs larger than 120 GB are in preview.

With independent scaling (in preview), you can fine-tune performance and memory allocation based on workload needs, improving efficiency for both caching and AI applications.

### Get started

Azure Managed Redis general availability will begin rolling out over the following days. Start building with the power of Redis and the flexibility of Azure—whether you're accelerating a web app, deploying a real-time leaderboard, or building memory for your GenAI agents.

If you are attending Microsoft Build (virtually or in person), check out our session: Get faster AI agent responses and low app latency with Azure Managed Redis. For those attending in person, stop by the Azure Managed Redis booth for demos and to speak with an expert. To get started with Azure Managed Redis today, head to our product page for more information or contact your Microsoft sales representative.

### Resources

- Azure Managed Redis product page
- Azure Managed Redis pricing page
- Create an Azure Managed Redis instance
- Explore the documentation

### Featured MS Build 2025 sessions & resources

- Get faster AI agent responses and low app latency with Azure Managed Redis
- Build observable, production-ready, distributed apps on App Service
## Join Us at Build 2025: Explore What's New with Azure Managed Redis

We're thrilled to be at Microsoft Build 2025—both in Seattle and online—to share the latest updates and innovations in Azure Managed Redis. Whether you're modernizing .NET apps, building intelligent AI services, or scaling global web apps, Azure Managed Redis has never been more essential. As a fully managed, first-party service built in partnership with Redis, Azure Managed Redis brings enterprise-grade caching, real-time data, and integrated vector capabilities to your app architecture—with seamless integration into the Azure ecosystem.

Check out what's happening with Azure Managed Redis at Build:

### 🔍 Breakout Sessions

Explore how Azure Managed Redis powers modern, scalable, and AI-enhanced applications in these featured breakout sessions:

- BRK203: Get faster AI agent responses and low app latency with Azure Managed Redis – Discover how Azure Managed Redis boosts AI application performance—including for apps using Azure SQL—by delivering low latency, fast response times, optimized configurations, and seamless .NET caching integration.
- BRK201: Innovate, deploy, & optimize your apps without infrastructure hassles – Explore the latest in autoscaling, high availability, and observability for cloud applications—without managing infrastructure. Learn how Azure Managed Redis and other services simplify deployment, scaling, and monitoring for web apps, microservices, AI/ML, and analytics workloads.

### 💬 Meet the Experts

Have questions? Looking to talk architecture, migration, or real-time performance tuning? Visit us in the Expert Meetup Zone to connect with the Microsoft and Redis product teams, engineers, and architects behind Azure Managed Redis.

🔎 How to find it: Log into the Microsoft Build 2025 website or use the official event mobile app to view the venue map and session schedule.

📍 To find the Expert Meetup zone, check out the official MS Build Event Guide for venue maps and other logistical information.

### 🍹 Redis x Microsoft Build Happy Hour

Let's keep the conversation going! You're invited to the Microsoft Build Happy Hour with Redis—open to all Build attendees.

🗓️ Tuesday, May 20th, 6:30–8:30 PM
📍 Register here

Whether you're a long-time Redis developer or just getting started, this is a great chance to network and have fun with the Redis and Azure community.

### 🧪 Hands-On Labs (featuring open-source Redis)

Dive into practical scenarios and learn how Redis supercharges AI and app experiences:

- LAB340: Accelerate AI App Development with AI Gateway Capabilities in Azure API Management – Leverage Redis as a fast datastore for low-latency inferencing and vector-based lookups.
- LAB306: Integrating and Enhancing Applications with .NET Aspire – Enhance application performance and user experience by integrating Redis for session caching and real-time responsiveness.

### Get Started with Azure Managed Redis

Want to try it out before or after Build?

- 🚀 Start building
- 📘 Explore the documentation

We can't wait to connect with you at Build 2025—in Seattle or online!
## Data Migration with RIOT-X for Azure Managed Redis

By: Roy de Milde, Global Black Belt and App Innovation Specialist, Microsoft, and George von Bülow, Senior Solution Architect, Redis

### Introduction

Customers are increasingly seeking efficient ways to migrate their current cache systems to Azure Managed Redis. This growing demand has led to the development of innovative tools and applications to facilitate the migration process. Azure Managed Redis offers numerous benefits, including enhanced performance, scalability, and security features. By migrating to Azure Managed Redis, customers can take advantage of its fully managed service, which reduces the operational burden and allows them to focus on their core business activities. Additionally, Azure Managed Redis provides seamless integration with other Azure services, enabling customers to build more robust and scalable applications. The migration process can be complex, but with the right tools and guidance, customers can achieve a smooth transition and unlock the full potential of Azure Managed Redis. In this blog, we give more insight into that process and help customers understand the benefits of migrating to Azure Managed Redis with RIOT-X.

### What is RIOT-X?

RIOT-X is a tool created by the field engineers at Redis. RIOT-X, which stands for Redis Input/Output Tools, is a command-line utility designed to help users seamlessly transfer data in and out of Redis. It supports various sources and targets, including files (CSV, JSON, XML), data generators, relational databases, and Redis itself through snapshot and live replication. RIOT-X addresses the challenges customers face during migration, offering a streamlined and reliable solution. By leveraging RIOT-X, customers can ensure a smooth transition to Azure Managed Redis, benefiting from the enhanced performance, scalability, and security features provided by Azure.

### Migration

RIOT-X can be used for various migration scenarios, depending on the connectivity between the source and target Redis databases and the requirements for application availability.

**Snapshot Migration**

In the simplest case, data is directly replicated from the source (which can be any flavor or version of Redis) to Azure Managed Redis during a scheduled downtime period. After replication, the applications that use Redis are reconfigured to connect to Azure Managed Redis. The corresponding RIOT-X command is:

```shell
riotx replicate redis://source redis://target
```

**Live Migration**

If the application is mission critical and cannot afford any downtime, then a live migration comes into play. This is like the snapshot scenario, but RIOT-X continuously updates the database in Azure Managed Redis with changes applied to the source database. The modified command uses the --mode live argument:

```shell
riotx replicate --mode live redis://source redis://target
```

Applications can now be updated with blue-green deployments or similar techniques, where read and write components are changed independently. The source Redis database continues to run until the application migration has been completed.

**Export/Import Migration**

For cases where there is no direct network connection between the source database and Azure Managed Redis, RIOT-X allows exporting data in a standard format, such as JSON or CSV. The data file is then transferred to a location accessible by Azure Managed Redis and imported with RIOT-X.
These are the corresponding commands:

```shell
riotx file-export --uri redis://source --type=json export.json
riotx file-import --uri redis://target --type=json export.json
```

### Technical Setup

RIOT-X can be run on Windows, macOS, and Linux.

**Windows**

```shell
scoop bucket add redis https://github.com/redis/scoop.git
scoop install riotx
```

**macOS**

```shell
brew install redis/tap/riotx
```

**Linux**

Download the pre-compiled binary from RIOT-X releases and unzip it:

```shell
unzip riotx-standalone-0.6.2-*.zip
```

You can also run RIOT-X as a Docker container:

```shell
docker run riotx/riotx [OPTIONS] [COMMAND]
```

RIOT-X supports the standard authentication methods for Redis. If you are using Entra ID authentication with Azure Managed Redis, you must enable access keys for the duration of the migration. Support for Entra ID has been added as a feature request for RIOT-X.

### Architecture

RIOT-X is essentially an ETL tool: data is extracted from the source system, transformed, and loaded into the target system. RIOT-X is a standalone system and does not need to be co-located with a Redis server. Connection to the Redis source and target databases is achieved through the Redis serialization protocol (RESP), with support for both RESP2 and RESP3.

Replication between a Redis source and target works as follows:

1. Identify the source keys to be replicated.
2. Read the data associated with each key.
3. Write each key to the target.

If the source and target databases use different Redis versions (which is the case for Azure Cache for Redis at version 6.2 and Azure Managed Redis at version 7.4), then data structure replication must be enabled with the --struct argument:

```shell
riotx replicate --struct redis://source redis://target
```

A simple way to sanity-check a finished replication run is sketched at the end of this post.

### Getting started with Azure Managed Redis

We are excited to offer the public preview of Azure Managed Redis, built to drive innovation and prepare your applications for AI. If you are attending Microsoft Build 2025, please join us at this session and stop by the Azure Managed Redis booth for demos and to speak with an expert. To get started with Azure Managed Redis today, please check out our product page for more information or contact our sales team.

### Resources

- Azure Managed Redis product page
- Azure Managed Redis pricing page
- Azure Cache for Redis product page
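One last practical note, added here as a hedged sketch rather than part of the original post: once a riotx replication run has finished and before you cut applications over, it can help to sanity-check the copy. The snippet below uses redis-py to compare key counts and spot-check a sample of keys; the connection URLs are placeholders matching the riotx examples above.

```python
from itertools import islice

import redis

# Placeholder URLs: use the same endpoints you passed to riotx.
source = redis.Redis.from_url("redis://source:6379")
target = redis.Redis.from_url(
    "rediss://:<access-key>@<your-amr-name>.<region>.redis.azure.net:10000"
)

# 1. Compare overall key counts.
print("source keys:", source.dbsize(), "target keys:", target.dbsize())

# 2. Spot-check a sample of keys: present on the target and of the same type.
for key in islice(source.scan_iter(), 100):
    if not target.exists(key):
        print("missing on target:", key)
    elif source.type(key) != target.type(key):
        print("type mismatch for:", key)
```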