Azure Cache for Redis
AI Resilience: Strategies to Keep Your Intelligent App Running at Peak Performance
Stay Online

Reliability is one of the five pillars of the Azure Well-Architected Framework. When you take a new product to market that integrates with Azure OpenAI Service, you can face usage spikes in your workload. Even if everything scales correctly on your side, an Azure OpenAI deployment that uses provisioned throughput units (PTU) can hit its PTU threshold, at which point you start receiving 429 response codes. The response headers also tell you when you can retry the request, and you can use that information to implement a retry in your business logic. In this article I will show how to use an API Management policy to handle these 429 responses, and how to use the native cache to save some tokens!

Architecture Reference

The Azure Function on the left of the diagram just represents an app making requests; it can be any kind of resource (even in an on-premises environment). Our goal in this article is to show one of many possible ways to handle 429 responses. We will use an API Management policy to automatically redirect the backend to another Azure OpenAI instance in a second region deployed in Standard mode, which means it is charged only for what you use.

First, create an API in API Management that forwards requests to your main Azure OpenAI instance (region 1 in the diagram). Then add this policy to the API request:

```xml
<policies>
    <inbound>
        <base />
        <set-backend-service base-url="<your_open_ai_region1_endpoint>" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
    </outbound>
    <on-error>
        <retry condition="@(context.Response.StatusCode == 429)" count="1" interval="5" />
        <set-backend-service base-url="<your_open_ai_region2_endpoint>" />
    </on-error>
</policies>
```

The first part of our job is done! We now have an automatic redirect to the Azure OpenAI instance deployed in region 2 whenever the PTU threshold is reached.

Cost consideration

You may ask: what about the cost increase from using API Management? Even if you don't want to use any other API Management feature, you can take advantage of its native cache and, once again using a policy, store some questions and answers in the built-in Redis* cache using semantic caching for Azure OpenAI. Let's change our policy to include this:

```xml
<policies>
    <inbound>
        <base />
        <azure-openai-semantic-cache-lookup score-threshold="0.05" embeddings-backend-id="azure-openai-backend" embeddings-backend-auth="system-assigned">
            <vary-by>@(context.Subscription.Id)</vary-by>
        </azure-openai-semantic-cache-lookup>
        <set-backend-service base-url="<your_open_ai_region1_endpoint>" />
    </inbound>
    <backend>
        <base />
    </backend>
    <outbound>
        <base />
        <azure-openai-semantic-cache-store duration="60" />
    </outbound>
    <on-error>
        <retry condition="@(context.Response.StatusCode == 429)" count="1" interval="5" />
        <set-backend-service base-url="<your_open_ai_region2_endpoint>" />
    </on-error>
</policies>
```

Now API Management evaluates the incoming prompt, uses semantic equivalence to decide whether it matches cached content, and only forwards the request to your OpenAI endpoint when it does not. Sometimes this can also help you avoid reaching the PTU threshold in the first place.

* Check the tier and cache capabilities to validate your business needs against the API Management cache feature: Compare API Management features across tiers and cache size across tiers.
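The policy above handles failover at the gateway, but the article also mentions honoring the retry hint that comes back in the response headers from your own business logic. Here is a minimal, illustrative Python sketch (not part of the original article) that waits out 429 responses using the Retry-After header; the endpoint URL, API version and key are placeholder assumptions you would replace with your own values:

```python
import time
import requests

# Placeholder values - replace with your API Management / Azure OpenAI endpoint and key.
ENDPOINT = "https://<your-apim-endpoint>/openai/deployments/<deployment>/chat/completions?api-version=<api-version>"
API_KEY = "<your-api-key>"


def call_with_retry(payload, max_attempts=3):
    """POST to the endpoint, honoring the Retry-After header on 429 responses."""
    for attempt in range(1, max_attempts + 1):
        response = requests.post(
            ENDPOINT,
            headers={"api-key": API_KEY, "Content-Type": "application/json"},
            json=payload,
            timeout=30,
        )
        if response.status_code != 429:
            response.raise_for_status()
            return response.json()
        # Throttled: wait for the interval the service suggests, with a small default fallback.
        retry_after = int(response.headers.get("Retry-After", "5"))
        print(f"429 received (attempt {attempt}), retrying in {retry_after}s")
        time.sleep(retry_after)
    raise RuntimeError("Request still throttled after all retry attempts")


# Example usage:
# result = call_with_retry({"messages": [{"role": "user", "content": "Hello"}]})
```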
Conclusion

API Management offers key capabilities for AI, some of which we explored in this article, and others that you can also leverage for your intelligent applications. Check out the AI Gateway HUB repository. Last but not least, dive into API Management features with experts in the field inside the API Management HUB. Thanks for reading, and happy coding!
Connect to Azure Cache for Redis using SSL Port 6380 from Linux VM
Scenario: You are using a Linux VM and want to connect to Azure Cache for Redis over the SSL port 6380.

Action: You can connect to Azure Cache for Redis over the SSL port with the help of stunnel and redis-cli. The steps are as follows:

Step 1: Install the redis-cli tool on your Linux machine:

```bash
sudo apt-get update
sudo apt-get install redis-tools
```

Note: the redis-tools package includes redis-cli among other tools.

Step 2: Since this redis-cli does not support the SSL port (6380), we can use stunnel to connect to Azure Cache for Redis over SSL. Version 4 of the utility, called stunnel4, can be installed with:

```bash
sudo apt-get install stunnel4
```

Note: If you want to use the non-SSL port 6379, you do not need stunnel and can connect directly with the command below, provided the non-SSL port is enabled on your Azure Cache for Redis:

```bash
redis-cli -p 6379 -a <Your Access Key for Azure Cache for Redis> -h <yourcachename.redis.cache.windows.net>
```

Step 3: To configure the service to start at boot, modify the /etc/default/stunnel4 file:

```bash
sudo nano /etc/default/stunnel4
```

This opens a file containing a variable ENABLED, which must be set to 1 to enable the service to start. Save the changes with CTRL+X and then press ENTER.

Step 4: Next, point stunnel at your Azure Cache for Redis by creating a stunnel configuration file for redis-cli:

```bash
sudo nano /etc/stunnel/redis.conf
```

In this new file, add the following entry, replacing yourcachename with the actual name of your Azure Cache for Redis:

```
[redis-cli]
client = yes
accept = 127.0.0.1:6380
connect = yourcachename.redis.cache.windows.net:6380
```

Save the file.

Step 5: Now that stunnel is configured, restart the service:

```bash
sudo systemctl restart stunnel4.service
```

Step 6: If you check the services listening for connections, you should see stunnel listening on port 6380:

```bash
sudo netstat -plunt
```

Step 7: You can now connect to Azure Cache for Redis over the SSL port through the local tunnel using redis-cli:

```bash
redis-cli -p 6380 -a <Your Access Key for Azure Cache for Redis>
```

Redis connects successfully and you can perform operations on Azure Cache for Redis. Hope this helps!
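If you are connecting from application code rather than redis-cli, many client libraries can negotiate TLS directly, so no tunnel is needed there. As a hedged example (assuming the redis-py package is installed), the sketch below connects straight to port 6380; the cache name and access key are placeholders:

```python
import redis

# Placeholder values - replace with your cache host name and access key.
CACHE_HOST = "yourcachename.redis.cache.windows.net"
ACCESS_KEY = "<Your Access Key for Azure Cache for Redis>"

# ssl=True makes redis-py negotiate TLS on port 6380, so stunnel is not required here.
client = redis.Redis(
    host=CACHE_HOST,
    port=6380,
    password=ACCESS_KEY,
    ssl=True,
)

client.set("greeting", "hello from redis-py over TLS")
print(client.get("greeting"))  # b'hello from redis-py over TLS'
print(client.ping())           # True if the TLS connection works
```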
Setting Up WSL for Secure Redis CLI Connections to Azure Redis Cache on Windows

This blog guides you through the steps to set up WSL on your Windows machine, letting you run a Linux environment with redis-cli and securely connect to an Azure Cache for Redis instance without the need for a separate virtual machine or dual booting.
Leveraging Redis Insights for Azure Cache for Redis

This blog talks about how to leverage the RedisInsight GUI tool while working with Azure Cache for Redis. We will look at options that help with high-level connectivity troubleshooting and provide insights into the data present inside the cache.

To start, we can use the tool to test connectivity to our Redis cache instance. After clicking the Add Redis Database button, fill in the following fields:

Host: the complete FQDN of the Redis cache endpoint (for the Basic, Standard and Premium tiers, <Cachename>.redis.cache.windows.net; for the Enterprise tier, <Cachename>.<regionname>.redisenterprise.cache.azure.net).
Port: 6380 or 6379, depending on whether you are testing the SSL or non-SSL port; 10000 for the Enterprise tier.
Database Alias: the cache name.
Password: the access key for your cache.
Use TLS: check this option when testing port 6380 and for Enterprise tier caches.

Then click the Test Connection button, which performs a high-level check of whether the cache endpoint is reachable, whether over the SSL port, the non-SSL port, or against an Enterprise cache.

Note: this demo was done without any firewall, private endpoint or VNET restrictions. If you have a VNET or private endpoints configured, you have to test from a VM that is part of the configured VNET.

Once the test connection succeeds, click Add Redis Database to start exploring the insights of your cache instance. Clicking the My Redis Database option lists all the databases you have connected to from RedisInsight, along with high-level details such as modules (if any) for the Enterprise tier, or "OSS Cluster" as the connection type if the OSS clustering policy was selected; otherwise an Enterprise cache shows as Standalone.

For this demo, we started with an empty cache and performed simple Set operations to add keys to the cache instance. You can add a new key by providing the key type (Set, String, List, Hash, etc.), key name, TTL and so on. We added three keys initially, and they started appearing in the left-hand window. We then added more keys and all of them were listed. Selecting any key shows insights for that particular key in the right-hand window, such as its value, TTL and key size. You can also use this view for pattern matching; for example, we listed all the keys whose names start with "testkey".

There is also a Bulk Actions button with two main options:
- Perform bulk deletion.
- Execute a set of Redis commands sequentially, uploaded as a plain text file.

Moving ahead, there is an Analysis Tool option that can be leveraged to gain insight into the data residing in the cache. Its New Report button generates a report with various kinds of insights:
- It provides a high-level summary of keys by type.
- It shows how much data has no expiry (no TTL set) and how much is expected to be freed, and when, based on the TTLs that are set. In the example, it reports around 450 bytes of memory to be freed in less than an hour, while roughly 1,200 bytes of data have no TTL set and will not expire.
- It provides high-level details of the top keys by TTL or key size, which can be used to identify larger keys.
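If you want to reproduce that kind of TTL breakdown on a test cache before running the analysis, you can seed it with a mix of expiring and non-expiring keys. This is only an illustrative sketch using the redis-py client (not part of the original walkthrough); the host, access key and key names are placeholders:

```python
import redis

# Placeholder connection details - replace with your cache host and access key.
client = redis.Redis(
    host="yourcachename.redis.cache.windows.net",
    port=6380,
    password="<your-access-key>",
    ssl=True,
)

# Keys with a one-hour TTL: the analysis report counts these as memory
# expected to be freed in less than an hour.
for i in range(3):
    client.set(f"testkey:expiring:{i}", f"value-{i}", ex=3600)

# Keys without any TTL: these show up under "No Expiry" in the report.
for i in range(3):
    client.set(f"testkey:persistent:{i}", f"value-{i}")

# Quick check of what was created: TTL is -1 when no expiry is set.
for key in client.scan_iter(match="testkey:*"):
    print(key.decode(), "TTL:", client.ttl(key))
```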
There is also a Workbench option that provides a command-line interface similar to redis-cli, which we can use to execute commands. In the example, we used it to run a PING/PONG test, set up keys and perform other operations.

Disclaimer: please note that the tool is supported by Redis, not by Azure Cache for Redis, so we do not control its behavior or features.

Hope that helps!
SSL/TLS connection issue troubleshooting guide

You may experience exceptions or errors when establishing TLS connections with Azure services. The exceptions vary dramatically depending on the client and server types; typical examples include "Could not create SSL/TLS secure channel." and "SSL Handshake Failed". In this article we discuss common causes of TLS-related issues and troubleshooting steps.
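One quick way to narrow down such errors is to attempt a raw TLS handshake against the endpoint and inspect what gets negotiated. The sketch below is an illustrative addition (not from the original guide) using only the Python standard library; the host name is a placeholder, and port 6380 is assumed as the usual SSL port for Azure Cache for Redis:

```python
import socket
import ssl

# Placeholder endpoint - replace with the host you are troubleshooting.
HOST = "yourcachename.redis.cache.windows.net"
PORT = 6380

context = ssl.create_default_context()  # uses the system trust store and verifies the certificate

try:
    with socket.create_connection((HOST, PORT), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=HOST) as tls:
            # If we get here, the handshake succeeded.
            print("Negotiated protocol:", tls.version())  # e.g. TLSv1.2
            print("Cipher suite:", tls.cipher())
            cert = tls.getpeercert()
            print("Certificate expires:", cert.get("notAfter"))
except ssl.SSLError as exc:
    print("TLS handshake failed:", exc)   # protocol/cipher mismatch, certificate problem, etc.
except OSError as exc:
    print("TCP connection failed:", exc)  # DNS, firewall or network issue
```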