Introducing the PublicNetworkAccess property to Azure Managed Redis
We want to let users of Azure Managed Redis disable public traffic without a private endpoint, or allow both public and private access simultaneously. In some enterprise environments, Redis and networking responsibilities are handled by distinct teams: a Data Operations team may securely provision Azure Managed Redis instances with PublicNetworkAccess set to Disabled, then connect them to private links overseen by a Networking Operations team. A dedicated PublicNetworkAccess control means it is no longer necessary to configure a Private Endpoint at the time an Azure Managed Redis instance is created. With the new PublicNetworkAccess property, you can now restrict public IP traffic independently of Private Links to Virtual Networks. The following network configurations are now supported:

• Public traffic without Private Links
• Public traffic with Private Links
• Private traffic without Private Links
• Private traffic with Private Links

API changes

The PublicNetworkAccess property is introduced in Microsoft.Cache redisEnterprise API version 2025-07-01. This is a security-related breaking change: we will deprecate API versions older than 2025-07-01 in October 2026. After October 2026:

• You can only set the PublicNetworkAccess property using API versions 2025-07-01 or later
• You can no longer send API calls with versions prior to 2025-07-01
• Caches provisioned with older API versions will continue to work, but further operations on them will require calls made with API versions 2025-07-01 or later

Changing an existing Azure Managed Redis cache to use the PublicNetworkAccess property

Use the Azure portal to add the PublicNetworkAccess setting to your existing Azure Managed Redis cache.

Steps to change the PublicNetworkAccess property

1. Open your cache in the Azure portal.
2. From the resource menu, select Networking.
3.
In the Networking pane, set the PublicNetworkAccess property. Note: this is an irreversible operation. Once set, you cannot revert to the unset state.
4. In the Public access pane, select Enable or Disable and save.
5. Note: your existing Private Endpoints will remain unaffected.

Figure 1: Set PublicNetworkAccess in an existing Azure Managed Redis cache by selecting 'Disable' or 'Enable' public access

Best practices

Controlling PublicNetworkAccess through a separate setting gives teams the flexibility to reuse existing Private Endpoints and improves the end-to-end management experience. To improve Azure Managed Redis security, disable PublicNetworkAccess and use a Virtual Network with a Private Endpoint and Private Links. Virtual Networks provide network controls and extra protection, while Private Links enable one-way communication for greater isolation. This ensures other resources in the Virtual Network stay secure even if Redis is compromised.

What's new in Azure Managed Redis: Ignite 2025 feature announcements
Azure Managed Redis continues to power the world's most demanding, low-latency workloads, from caching and session management to powering the next generation of AI agents. At Microsoft Ignite 2025, we're excited to announce several new capabilities designed to make Azure Managed Redis even more scalable, manageable, and AI-ready.

Bigger, faster, stronger: new enterprise-scale SKUs (generally available)

We are expanding our capacity portfolio with the general availability of the Memory Optimized 150 and 250, Balanced 150 and 250, and Compute Optimized 150 and 250 SKUs, bringing higher throughput, lower latency, and greater memory capacity for your most demanding workloads. Whether you're running global gaming platforms, AI-powered personalization engines, or enterprise-scale caching tiers, these new SKUs offer the performance headroom to scale with confidence.

Redis as a knowledge base for AI agents

Azure Managed Redis is now available as part of the Azure AI Foundry MCP tools catalog, allowing customers to use Redis as a knowledge store or memory store for AI agents. This integration makes it simple to connect Redis to Foundry-based agents, enabling semantic search, long-term and short-term memory, and faster reasoning for multi-agent applications, all running on trusted Azure infrastructure.

Scheduled Maintenance (public preview)

You can now configure maintenance windows for your Azure Managed Redis instances, giving you greater control and predictability for planned service updates. This capability helps align maintenance with your own operational schedules, minimizing disruption and providing flexibility for mission-critical applications.

Terraform Provider for Azure Managed Redis

We're making infrastructure automation even easier with a dedicated Terraform provider for Azure Managed Redis. This new provider enables you to declaratively create, manage, and configure AMR resources through code, improving consistency and streamlining CI/CD pipelines across environments.
Reserved Instances: now in 30+ regions

Azure Managed Redis now supports Reserved Instances in over 30 regions, with more coming soon. Reserved pricing provides predictable costs and savings for long-term workloads. Go to Azure Portal | Reservations | Add and search for 'Azure Cache for Redis'. Azure Managed Redis SKUs such as Balanced, Compute Optimized, Memory Optimized, and Flash Optimized show up as sub-SKUs in that category. Reserved Instances offer a 35% discount with a 1-year purchase and a 55% discount with a 3-year purchase.

Learn More & Get Started

Azure Managed Redis is redefining what's possible for caching, data persistence, and agentic AI workloads. Explore the latest demos, architecture examples, and tutorials from Ignite:

Learn more about Azure Managed Redis
Try Azure Managed Redis samples on GitHub
Watch the Azure Managed Redis session at Ignite on-demand (BRK129)
Explore Redis as a memory store for Microsoft Agent Framework
Introducing the PublicNetworkAccess property to Azure Managed Redis | Microsoft Community Hub

Ready to build Internet-scale AI apps with Azure Managed Redis? Start today at aka.ms/hol-amr.

Azure Managed Redis at Ignite 2025: pre-day, session, and booth
Microsoft Ignite 2025 is almost here! Many practitioners are surprised by the powerful new capabilities in Azure Managed Redis, and now is your chance to see them in action. Whether you are modernizing applications, accelerating AI workloads, or building next-generation agent architectures, Azure Managed Redis is your key to speed and scale. Don't miss the chance to connect with experts from Microsoft and Redis at our pre-day workshop and general session at Ignite and learn how to:

Unlock high-performance caching for demanding workloads
Build a powerful memory layer for agentic applications
Leverage vector storage for Retrieval-Augmented Generation (RAG)
Optimize LLM costs with semantic caching

All in one fully managed service: Azure Managed Redis.

Connect with the Azure Managed Redis team at Ignite 2025

1. Ignite pre-day workshop: Build Internet-Scale AI Apps with Azure Managed Redis — Caching to Agents (in-person only)

When: Ignite pre-day on Monday, November 17, 2025, 1pm-5pm PT
Where: Moscone Center, San Francisco
Registration: Add this optional in-person workshop in the Ignite registration

Building AI applications and agents at internet scale requires more than speed; it demands unified memory, context, and scalability. You'll see live demos and learn how to build and scale intelligent applications using Azure Managed Redis for caching and modern AI workloads, architect your applications for performance, reliability, and scale with geo-replication, and migrate to Azure Managed Redis. Seats are limited, please sign up today.

2. Breakout session: Smarter AI Agents with Azure Managed Redis - BRK129 (in-person and online)

View session details on the Ignite website and save to your event favorites. Azure Managed Redis with Azure AI Foundry and Microsoft Agent Framework lets developers build adaptive, context-aware AI systems.
Redis handles real-time collaboration, persistent learning, and semantic routing, while the Agent Framework supports advanced reasoning and planning. Integrated short- and long-term memory lets agents access relevant data directly, simplifying development and operations. Azure Managed Redis supports MCP for control-plane and data-plane tasks, enabling easy management and scaling of workloads and knowledge stores. Join us to discover how to build scalable, multi-agent systems backed by the performance, reliability, and unified memory of Redis on Azure.

3. Visit the Azure Managed Redis booth in the Expo Hall

Have questions? Looking to talk architecture, migration, and supercharging your AI apps? Visit us in the Expert Meetup Zone to connect with the Microsoft and Redis product teams, engineers, and architects behind Azure Managed Redis.

Prepare for Ignite

Learn more about Microsoft Ignite
Explore the Azure Managed Redis documentation
Try the hands-on workshop for Azure Managed Redis

Redis named Stack Overflow's top data storage tool for AI Agents
Redis was recognized by developers on Stack Overflow as the #1 data storage tool for AI agent workloads and the #5 database in 2025. Microsoft is the only cloud provider to offer Redis Enterprise as a fully native, managed service with Azure Managed Redis.

Supercharging AI Agents with Memory on Azure Managed Redis
Co-authored by Purna Mehta, AMR Product Manager, Redis, Inc. and Han Xing Choong, Software Engineer, Redis, Inc.

AI agents are quickly evolving from simple chatbots to sophisticated, context-aware systems that reason, orchestrate tools, and learn over time. Memory is at the heart of that transformation. Azure Managed Redis, Microsoft's first-party service based on Redis Enterprise, makes agent memory management both seamless and production-ready in the Azure ecosystem. Memory enables AI agents to retain the context of interactions with users and systems. By continuously capturing preferences, operational patterns, and unique requirements, agents develop a comprehensive understanding over time. With the addition of vector store capabilities, Azure Managed Redis allows AI agents to efficiently store, index, and search high-dimensional data like embeddings, supporting advanced semantic search for more accurate, real-time responses. This is where Azure Managed Redis can help accelerate building agentic AI apps with Microsoft Agent Framework.

What is Microsoft Agent Framework

Microsoft Agent Framework is an open-source SDK and runtime designed to let developers build, deploy, and manage sophisticated multi-agent systems with ease. It unifies the enterprise-ready foundations of Semantic Kernel with the innovative orchestration of AutoGen.

Understanding agent memory architecture

An agent's usefulness depends on how and what it remembers. A typical agentic memory architecture contains two key components:

Working short-term memory

A play-by-play of the current interaction, so the agent can track ongoing conversation history and state without asking you to repeat yourself. Microsoft Agent Framework can manage this short-term, or working, memory tied to a user's session. In the Agent Framework SDK, it is implemented as a Thread with an associated ChatMessageStore that can be offloaded to Redis.
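To make the working-memory idea concrete, here is a minimal plain-Python sketch of a session-scoped message store. It is an illustrative stand-in, not the Agent Framework's actual Thread/ChatMessageStore API, and in production the per-thread message lists would live in Azure Managed Redis rather than a local dictionary:

```python
from collections import defaultdict

class InMemoryChatMessageStore:
    """Illustrative stand-in for a Redis-backed chat message store.

    Working memory: the running conversation for one thread/session,
    so the agent can replay context without asking the user to repeat it.
    """

    def __init__(self):
        self._threads = defaultdict(list)

    def add_message(self, thread_id, role, content):
        self._threads[thread_id].append({"role": role, "content": content})

    def get_messages(self, thread_id):
        # A Redis-backed store would keep one list per thread
        # and read it back with a range query instead.
        return list(self._threads[thread_id])

store = InMemoryChatMessageStore()
store.add_message("thread-1", "user", "Plan a 3-day trip to Lisbon")
store.add_message("thread-1", "assistant", "Day 1: Alfama and the castle...")
print(len(store.get_messages("thread-1")))  # 2
```

Offloading this structure to Redis is what lets a session survive process restarts and be shared across replicas of the agent.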
Durable long-term memory

Preferences, behavioral patterns, durable facts, and interests pieced together from many interactions, so the agent can learn and adapt over time. Long-term memory is typically shared across sessions. Within the Microsoft Agent Framework SDK, long-term memory is provided through ContextProviders. Azure Managed Redis can be integrated directly as the context provider to maximize performance, control, and advanced features. One popular context provider is powered by Mem0. Mem0 handles intelligent memory extraction, deduplication, and contextual grounding while recording these memories in Azure Managed Redis.

Memory in action: a travel agent example

To show this in action, we have created a sample agent, an AI travel concierge, which provides time-aware information to generate trip itineraries personalized to user preferences. To achieve this goal, it relies on an LLM and external tools, and manages user-specific context with a dual-memory layer. This layered memory design means a conversation isn't a single exchange; it's a long-running interaction. Agents can draw on past facts and preferences to shape current responses, creating continuity and building toward complex outcomes over time. Below is the agent architecture from the Travel Agent Sample application on GitHub, built with Microsoft Agent Framework.

How Azure Managed Redis fits this architecture

When building an agent using Microsoft Agent Framework and Azure Managed Redis, these are some of the key attributes for production-grade, internet-scale applications on Azure:

Latest innovation: Azure Managed Redis is a first-party service building on Redis Enterprise capabilities. It brings multiple data types, fast vector similarity search and semantic caching (which helps you save tokens), full JSON support, high throughput, and low latency out of the box.

Reliability and SLAs: Azure Managed Redis offers high availability up to 99.999%.
It also offers active-active geo-replication and persistent storage, so memory doesn't disappear between sessions and agents can scale globally.

Enterprise ready: As a first-party product, Azure Managed Redis integrates with Microsoft Entra ID for authentication, Azure networking, and security features such as Private Endpoints, TLS, and encryption at rest, all available and manageable through the Azure portal.

Azure Managed Redis brings the performance, simplicity, and scalability of Redis Enterprise directly into the Azure ecosystem, making it easier to build intelligent, stateful agents. With seamless integration into your Azure environment, you can rely on fully managed Redis to power agent memory and context management. Azure Managed Redis also delivers sub-millisecond lookups that enable real-time agent interactions and intelligent systems that need near real-time responses, such as recommendation systems, text-to-speech, or speech-to-speech agents.

Next steps

- Explore the Travel Agent Sample application on GitHub
- Check the Azure Managed Redis Documentation
- Deep dive into Microsoft Agent Framework

Using MCP Server with Azure Managed Redis in VS Code
Overview

Have you thought about asking VS Code to generate test data and automatically upload it to Redis, without having to write any code? Wouldn't it be nice to code and manage resources all in one place in VS Code? This blog walks you through setting up VS Code to connect with the Redis MCP Server and an Azure Managed Redis instance to accomplish this. By the end of this blog, you will be able to run VS Code in agent mode and tell it to generate customer test data in English, and it will leverage the MCP server to write the data into Redis.

What is MCP and Redis MCP?

Model Context Protocol (MCP) is an open standard that lets AI models use external tools and services through a unified interface, enabling seamless integration and standardized interactions. VS Code agent mode lets you run VS Code as an agent, connecting directly to MCP servers like Redis MCP. This enables seamless integration between your development environment and Redis-powered context management, making it easy to automate tasks, manage data, and interact with Redis from within VS Code. This article provides a step-by-step guide to setting up a Redis MCP server connected to Azure Managed Redis using agent mode in Visual Studio Code.

Quick start: step-by-step instructions

Prerequisites

Create an Azure Managed Redis (AMR) instance
• Create an Azure Managed Redis instance in the Azure Portal
• Download or clone redis/mcp-redis

Start VS Code in agent mode with MCP servers enabled
• Download and install VS Code
• Use agent mode in VS Code
• Enable MCP support in VS Code

(Optional) Add AMR to Redis Insight to view data in Azure Managed Redis
• Download Redis Insight

Set up the Redis MCP server in VS Code

The following instructions are adapted from Use MCP Servers in VS Code to describe how to use the Redis MCP server. The example uses a Windows 11 development machine.

Set up a new workspace in VS Code for your project.

Add .vscode/mcp.json

Create a .vscode/mcp.json file in your workspace.
Copy over the required configuration content for the MCP server connection. Below is an example. Notice that the REDIS_SSL setting needs to be added for Azure Managed Redis if you followed the recommended security options when creating it.

{
  "servers": {
    "redis": {
      "type": "stdio",
      "command": "C:\\Users\\user1\\.local\\bin\\uv.exe",
      "args": [
        "--directory",
        "C:\\Users\\user1\\source\\repos\\mcp-redis",
        "run",
        "src/main.py"
      ],
      "env": {
        "REDIS_HOST": "yourredisname.yourredisregion.redis.azure.net",
        "REDIS_PORT": "10000",
        "REDIS_USERNAME": "default",
        "REDIS_PWD": "<replace_with_your_access_key_token>",
        "REDIS_SSL": "true"
      }
    }
  }
}

Add settings.json

Update your workspace's settings.json to allow the agents you want to use. For example, if you want to use copilot-swe and copilot/claude-sonnet-4 with the Redis MCP server, the file would look like:

{
  "chat.mcp.serverSampling": {
    "MCPWorkspace/.vscode/mcp.json: redis": {
      "allowedModels": [
        "copilot/copilot-swe",
        "copilot/claude-sonnet-4"
      ]
    }
  }
}

Start the MCP server

In the Run Commands bar (Ctrl+Shift+P), type: >MCP: List Servers
Select redis
Select Start Server

Open the Copilot Chat window

In VS Code, open the agent chat window (Ctrl+Shift+I)
On the bottom menu of the chat window, ensure Agent mode is selected among Agent, Ask, and Edit.

Test the MCP server by adding test data to Redis

Type: generate 10 customer information with an integer ID as the key and hashset with properties like First Name, Last Name, Phone number, Address. Then put each of the 10 customer entries in the Redis

See the output calling the MCP tool. Note that MCP might prompt for permission to execute; select 'Allow for this session' from the dropdown menu.

Optionally, verify that the data has been added to the Azure Managed Redis instance through the Redis Insight tool.

This workflow makes it fast and intuitive to leverage Redis MCP with Azure Managed Redis, using VS Code's agent mode for powerful, context-driven development.
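For a sense of what lands in Redis after that prompt, the sketch below builds ten customer hashes of the requested shape. The field names and sample values are made up for illustration; the commented redis-py call mirrors the kind of hash write the MCP server performs on your behalf, and is not executed here:

```python
def customer_records(n):
    """Build (key, mapping) pairs shaped like the agent's generated test data."""
    records = []
    for i in range(1, n + 1):
        key = str(i)  # the integer customer ID becomes the Redis key
        mapping = {
            "FirstName": f"First{i}",
            "LastName": f"Last{i}",
            "PhoneNumber": f"555-010{i:02d}",
            "Address": f"{i} Main Street",
        }
        records.append((key, mapping))
    return records

# With a live connection, each record would be written as a Redis hash,
# roughly equivalent to:
#   r = redis.Redis(host=REDIS_HOST, port=10000, password=..., ssl=True)
#   r.hset(key, mapping=mapping)
for key, mapping in customer_records(10):
    print(key, mapping["FirstName"])
```

Each key then shows up in Redis Insight as a hash with the four fields above.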
You'll be able to automate Redis operations, manage context, and interact with agents—all from your familiar VS Code environment.

Orchestrate multi-LLM workflows with Azure Managed Redis
Authors: Roberto Perez, George von Bülow & Roy de Milde

Key challenge for building effective LLMs

In the age of generative AI, large language models (LLMs) are reshaping how we build applications, from chatbots to intelligent agents and beyond. But as these systems become more dynamic and multi-modal, one key challenge stands out: how do we route requests efficiently to the right model, prompt, or action at the right time? Traditional architectures struggle with the speed and precision required to orchestrate LLM calls in real time, especially at scale. This is where Azure Managed Redis steps in, acting as a fast, in-memory data layer to power smart, context-aware routing for LLMs. In this blog, we explore how Redis and Azure enable developers to build AI systems that respond faster, think smarter, and scale effortlessly.

Across industries, customers are hitting real limitations. AI workloads often need to track context across multiple interactions, store intermediate decisions, and switch between different prompts or models based on user intent, all while staying responsive. But stitching this logic together with traditional databases or microservice queues introduces latency, complexity, and cost. Teams face challenges like keeping routing logic fast and adaptive, storing transient LLM state without bloating backend services, and coordinating agent-like behaviors across multiple components. These are exactly the pain points AMR was built to address, giving developers a low-latency, highly available foundation for real-time AI orchestration and more.

How to use Azure Managed Redis as a semantic router

Semantic routing uses AI to route user queries to the right service, model, or endpoint based on their intent and context. Unlike rule-based systems, it leverages generative AI to understand the meaning behind requests, enabling more accurate and efficient decisions.
Importantly, the semantic router itself does not forward the query; it only selects the appropriate route. Your application is responsible for taking that routing decision and sending the query to the correct agent, model, or human.

1. The user sends a query, which is passed to the system for processing
2. The query is analyzed by an embedding model to understand its semantic intent and context
3. The semantic router evaluates the user's intent and context to choose the optimal route:
   • A specific model for further processing
   • An agent to handle the query
   • A default response, if applicable
   • Escalation to a human for manual handling, if needed
4. Valid queries go through the RAG pipeline to generate a response
5. The final response is sent back to the user

Code examples + Architecture

Example: Jupyter Notebook with a Semantic Router

Let's look at a Jupyter Notebook example that implements a simple semantic router with Azure Managed Redis and the Redis Vector Library. First, we install the required Python packages:

pip install -q "redisvl>=0.6.0" sentence-transformers dotenv

Then we define the connection to an Azure Managed Redis instance.
import os
import warnings
warnings.filterwarnings("ignore")

from dotenv import load_dotenv
load_dotenv()

REDIS_HOST = os.getenv("REDIS_HOST")          # ex: "gvb-sm.uksouth.redis.azure.net"
REDIS_PORT = os.getenv("REDIS_PORT")          # for AMR this is always 10000
REDIS_PASSWORD = os.getenv("REDIS_PASSWORD")  # ex: "giMzOzIP4YmjNBGCfmqpgA7e749d6GyIHAzCaF5XXXXX"

# If SSL is enabled on the endpoint, use rediss:// as the URL prefix
REDIS_URL = f"redis://:{REDIS_PASSWORD}@{REDIS_HOST}:{REDIS_PORT}"

Next, we create our first semantic router with an allow/block list:

from redisvl.extensions.router import Route, SemanticRouter
from redisvl.utils.vectorize import HFTextVectorizer

vectorizer = HFTextVectorizer()

# Semantic router
blocked_references = [
    "things about aliens",
    "corporate questions about agile",
    "anything about the S&P 500",
]

blocked_route = Route(name="block_list", references=blocked_references)

block_router = SemanticRouter(
    name="bouncer",
    vectorizer=vectorizer,
    routes=[blocked_route],
    redis_url=REDIS_URL,
    overwrite=False,
)

To prevent users from asking certain categories of questions, we can define example references in a blocked route using the Redis Vector Library's SemanticRouter(). While it is also possible to implement blocking at the LLM level through prompt engineering (e.g., instructing the model to refuse to answer certain queries), that approach still requires an LLM call, adding unnecessary cost and latency. By handling blocking earlier with semantic routing in Azure Managed Redis, unwanted queries can be intercepted before ever reaching the model, saving LLM tokens, reducing expenses, and improving overall efficiency. Let's try it out:

user_query = "Why is agile so important?"
route_match = block_router(user_query)
route_match

The router first vectorizes the user query using the specified Hugging Face text vectorizer.
It finds a semantic similarity with the route reference "corporate questions about agile" and returns the matching route 'block_list'. Note the returned distance value; it indicates the degree of semantic similarity between the user query and the blocked reference. You can fine-tune the semantic router by specifying a distance threshold that must be satisfied to count as a match. For full details and more complex examples, you can explore the Jupyter Notebooks in this GitHub repository.

How do customers benefit?

For customers, this technology delivers clear and immediate value. By using Azure Managed Redis as the high-performance backbone for semantic routing and agent coordination, organizations can significantly reduce latency, simplify infrastructure, and accelerate time-to-value for AI-driven experiences. Instead of building custom logic spread across multiple services, teams get a centralized, scalable, and fully managed in-memory layer that handles vector search, routing logic, and real-time state management, all with enterprise-grade SLAs, security, and Azure-native integration. The result? Smarter and faster LLM interactions, reduced operational complexity, and the flexibility to scale AI use cases from prototypes to production without re-architecting.

Whether you're building an intelligent chatbot, orchestrating multi-agent workflows, or powering internal copilots, this Redis-backed technology gives you the agility to adapt in real time. You can dynamically route based on user intent, past interactions, or even business rules, all while maintaining the low-latency responses users expect from modern AI applications. And because it's fully managed on Azure, teams can focus on innovation rather than infrastructure, with built-in support for high availability, monitoring, and enterprise governance. It's a future-proof foundation for AI systems that need to be not just powerful, but precise.
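The threshold mechanics are easy to see in isolation. The toy sketch below (plain Python with two-dimensional vectors, not the redisvl API; in the real router the embedding comparison happens as a vector search inside Redis) matches a query to a route only when the cosine distance to the nearest reference stays within the configured maximum:

```python
import math

def cosine_distance(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return 1.0 - dot / norm

def best_route(query_vec, routes, max_distance=0.5):
    """Pick the closest route reference, or no route if everything is too far."""
    best = min(
        ((name, cosine_distance(query_vec, ref))
         for name, refs in routes.items() for ref in refs),
        key=lambda pair: pair[1],
    )
    return best if best[1] <= max_distance else (None, best[1])

# Toy 2-D "embeddings" standing in for real sentence vectors
routes = {"block_list": [[1.0, 0.0], [0.9, 0.1]]}
print(best_route([0.95, 0.05], routes)[0])  # block_list
print(best_route([0.0, 1.0], routes)[0])    # None
```

Lowering max_distance makes the router stricter (fewer queries blocked); raising it makes matching more permissive.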
Try Azure Managed Redis today

If you want to explore how to route large language model requests efficiently, Azure Managed Redis provides a reliable, low-latency solution. You can learn more about the service on the Azure Managed Redis page and find detailed documentation in the Azure Redis overview. For hands-on experience, check out the routing optimization notebook and other examples in the Redis AI resources repository and GitHub - loriotpiroloriol/amr-semantic-router. Give it a try to see how it fits your LLM routing needs.

Building faster AI agents with Azure Managed Redis and .NET Aspire
AI is evolving fast, and so are the tools to build intelligent, responsive applications. In our recent Microsoft Reactor session, Catherine Wang (Principal Product Manager at Microsoft) and Roberto Perez (Microsoft MVP and Senior Global Solutions Architect at Redis) shared how Azure Managed Redis helps you create Retrieval-Augmented Generation (RAG) AI agents with exceptional speed and consistency.

Why RAG agents?

RAG applications combine the power of large language models (LLMs) with your own data to answer questions accurately. For example, a customer support chatbot can deliver precise, pre-approved answers instead of inventing them on the fly. This ensures consistency, reduces risk, and improves customer experience.

Where Azure Managed Redis fits in agentic scenarios

In this project, Azure Managed Redis is used as a high-performance, in-memory vector database to support agentic Retrieval-Augmented Generation (RAG), enabling fast similarity searches over embeddings to retrieve the most relevant known answers and ground the LLM. Beyond this, Azure Managed Redis is a versatile platform that supports a range of AI-native use cases, including:

Semantic Cache – Cache and reuse previous LLM responses based on semantic similarity to reduce latency and improve reliability.
LLM Memory – Persist recent interactions and context to maintain coherent, multi-turn conversations.
Agentic Memory – Store long-term agent knowledge, actions, and plans to enable more intelligent and autonomous behavior over time.
Feature Store – Serve real-time features to machine learning models during inference for personalization and decision-making.

These capabilities make Azure Managed Redis a foundational building block for fast, stateful, and intelligent AI applications.

Demo highlights

In the session, the team demonstrates how to:

Deploy a RAG AI agent using .NET Aspire and Azure Container Apps.
Secure your Redis instance with Microsoft Entra ID, removing the need for connection strings.
Use Semantic Kernel to orchestrate agents and retrieve knowledge-base content via vector search.
Monitor and debug microservices with built-in observability tools.

Finally, we walk through code examples in C# and Python, demonstrating how you can integrate Redis search, vector similarity, and prompt orchestration into your own apps.

Get Started

Ready to explore?
✅ Watch the full session replay: Building a RAG AI Agent Using Azure Redis
✅ Try the sample code: Azure Managed Redis RAG AI Sample

Get started with Azure Managed Redis today: a step-by-step guide to deployment
At Microsoft Build 2025, we announced the general availability of Azure Managed Redis, a fully managed, first-party service built in partnership with Redis. Ready for production workloads globally, Azure Managed Redis marks a major milestone for developers looking to build high-performance, real-time applications with the speed and reliability of Redis, fully managed on Azure. Call to action: get started with Azure Managed Redis in the Azure Portal.

Key updates:

Up to 15x performance improvements over Azure Cache for Redis
99.999% availability with multi-region Active-Active replication
Support for Redis 7.4 (with Redis 8 coming soon)
New modules including RedisJSON, vector search, bloom filters, and time series
Flexible SKUs that let you scale memory and compute independently

Navigate the new Azure Managed Redis experience in the Azure Portal

Azure Managed Redis also comes with an updated Azure Portal experience that simplifies how you create, configure, and manage your Redis instances. Whether experimenting or deploying to production, the portal gives you full control in a few clicks.

Step-by-step guide to deploying in the Azure Portal

Want to see Azure Managed Redis in action? This quick walkthrough video shows how to set up Azure Managed Redis inside the Azure Portal: 👉 Watch on YouTube

In this tutorial, you'll learn:

How to configure your Active-Active instance for high availability and low latency
How to set up geo-replication across regions for the 99.999% availability SLA
Key tips and best practices to get started quickly

No code required, just the Azure Portal and a few minutes of your time! Azure Managed Redis is perfect for cloud architects, developers, and IT pros looking to build resilient, globally available Redis-backed applications on Azure. Whether you're building AI-powered applications, speeding up your web services, or just getting started with Redis, now's the time to explore what Azure Managed Redis can do.
To learn more, head to our product page or contact your Microsoft sales representative. To get started, provision Azure Managed Redis in the Azure Portal today.

Resources

Azure Managed Redis product page
Azure Managed Redis pricing page
Create an Azure Managed Redis instance
Watch the Microsoft Build 2025 session on AMR
Explore the documentation

Azure Managed Redis is now generally available: enterprise-grade performance and flexibility
We are excited to announce the general availability of Azure Managed Redis, a fully managed, first-party service built in partnership with Redis. Designed to meet the needs of modern, cloud-native, AI-powered applications, Azure Managed Redis is now ready for production workloads globally, offering improved performance, flexible deployment options, and cost-efficient scalability. Azure Managed Redis serves as both an in-memory key-value datastore for caching and an integrated vector datastore for AI applications, making it a powerful foundation for real-time intelligence. Now available in over 50 Azure regions, Azure Managed Redis enables developers to build fast, intelligent apps backed by one of the world's most trusted real-time data stores, seamlessly integrated into the Azure ecosystem.

Availability SLA only applies to caches that are using replication
Up to 40% more cost-effective than the current Azure Cache for Redis offering

A high-performance, enterprise-grade Redis service

Azure Managed Redis delivers up to 15x higher performance compared to Azure Cache for Redis, at a significantly lower price point, making it the most cost-effective managed Redis option available on Azure today. With multi-region Active-Active geo-replication, customers can achieve sub-millisecond latency for users around the globe, all while maintaining a 99.999% SLA. Designed as a production-grade platform, Azure Managed Redis supports the latest Redis 7.4 features and will soon support the upcoming Redis 8. The Redis 7.4 release also introduces a new non-clustered deployment mode and flexible memory and compute scaling (both in preview), as well as native vector search, enabling Redis to serve as both a high-speed cache and a core component of AI and analytics workloads.

What's new in general availability?

Azure Managed Redis is built for scale, performance, and adaptability, giving customers full control over their architecture with added flexibility and pricing efficiency.
Key enhancements now available at GA include:
• Improved performance at a lower cost: Achieve up to 15x throughput compared to previous Redis offerings on Azure.
• Global availability: Now supported in over 50 Azure regions across all continents.
• Flexible scaling (in preview): Independently scale memory and performance to match dynamic workloads and reduce unnecessary cost.
• Enhanced Azure Portal experience: A revamped interface optimized for quick experimentation, faster provisioning, and streamlined create-and-manage workflows.
• Expanded cluster modes:
  • OSS Clustering – High performance with low latency and app-aware topology changes.
  • Enterprise Clustering – Simpler app integration with a non-clustered interface and support for advanced modules like RediSearch.
  • No Cluster (in preview) – Ideal for OSS Redis compatibility and straightforward migrations.

Built for modern AI and real-time workloads

As organizations adopt intelligent applications built with generative AI, retrieval-augmented generation (RAG), and autonomous agents, Redis is playing a critical role as a vector-capable, real-time data platform. With Azure Managed Redis, developers can:
• Power advanced scenarios with built-in vector search, secondary indexing, semantic caching, and agent memory.
• Take advantage of multi-model data structures, including JSON, time series, geospatial, and probabilistic types.
• Integrate seamlessly with Azure OpenAI Service, Azure Kubernetes Service, and Azure Functions for end-to-end GenAI pipelines.

In addition, developers benefit from rich ecosystem tools like the Redis Insight GUI, the Redis Copilot AI assistant, and optimized client libraries for .NET, Python, Java, Go, and Node.js.

Enterprise-ready reliability and compliance

Azure Managed Redis is designed for mission-critical workloads, offering enterprise-grade reliability, scale, and security from day one:
• Up to 99.999% availability with multi-region Active-Active geo-replicated deployments.
• Sub-millisecond latency through in-memory data storage.
• Support for FedRAMP, HIPAA, PCI DSS, ISO 27001, and other compliance standards.
• Fully integrated identity, access, and monitoring features within Azure.

Organizations can easily migrate from any tier of Azure Cache for Redis and immediately benefit from the latest Redis innovations, previously available only in the Enterprise and Enterprise Flash tiers.

Flexible tiers for every workload

Azure Managed Redis is available in four service tiers* tailored to fit your performance, memory, and cost needs:
• Memory Optimized – High memory-to-vCPU (1:8) ratio, ideal for dev/test and memory-intensive workloads.
• Balanced – Standard memory-to-vCPU (1:4) ratio for general-purpose caching and app acceleration.
• Compute Optimized – Low memory-to-vCPU (1:2) ratio for high-throughput, compute-bound scenarios.
• Flash Optimized (in preview) – Tiered architecture combining RAM and NVMe for large, cost-effective caches.

*General availability applies only to SKUs up to 120 GB; SKUs larger than 120 GB are in preview.

With independent scaling (in preview), you can fine-tune performance and memory allocation based on workload needs, improving efficiency for both caching and AI applications.

Get started

Azure Managed Redis general availability will begin rolling out over the following days. Start building with the power of Redis and the flexibility of Azure, whether you're accelerating a web app, deploying a real-time leaderboard, or building memory for your GenAI agents.

If you are attending Microsoft Build (virtually or in person), check out our session: Get faster AI agent responses and low app latency with Azure Managed Redis. For those attending in person, stop by the Azure Managed Redis booth for demos and to speak with an expert.

To get started with Azure Managed Redis today, head to our product page for more information or contact your Microsoft sales representative.
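For application code, a typical first step is the classic cache-aside pattern. The sketch below uses the redis-py client; the hostname and key-naming convention are illustrative placeholders, and the port-10000/TLS settings reflect typical Azure Managed Redis endpoints, so verify them against your own instance.

```python
import json

def make_client(host: str, access_key: str):
    # redis-py imported lazily so the cache-aside logic stays testable without it.
    # Azure Managed Redis endpoints typically listen on port 10000 and require TLS.
    import redis
    return redis.Redis(host=host, port=10000, password=access_key,
                       ssl=True, decode_responses=True)

def get_product(client, product_id: str, load_from_db, ttl_seconds: int = 300):
    """Cache-aside: try Redis first; on a miss, read the source of truth and populate."""
    key = f"product:{product_id}"  # illustrative key-naming convention
    cached = client.get(key)
    if cached is not None:
        return json.loads(cached)          # cache hit: skip the database entirely
    value = load_from_db(product_id)       # cache miss: hit the source of truth
    client.set(key, json.dumps(value), ex=ttl_seconds)  # expire stale entries
    return value

# Usage against a real instance (placeholders, substitute your own values):
#   r = make_client("my-cache.region.redis.azure.net", "<access-key>")
#   product = get_product(r, "42", load_product_from_db)
```

The TTL keeps the cache self-healing: entries age out and are repopulated from the database on the next read, which bounds how stale a cached value can get.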
Resources
• Azure Managed Redis product page
• Azure Managed Redis pricing page
• Create an Azure Managed Redis instance
• Explore the documentation

Featured MS Build 2025 sessions & resources
• Get faster AI agent responses and low app latency with Azure Managed Redis
• Build observable, production-ready, distributed apps on App Service