Speed is in the loop. With uvloop powering Python 3.13+ apps on Azure Functions, asynchronous workloads run faster, scale better, and keep up with demanding serverless scenarios.
Python 3.13+ apps on Azure Functions are now faster by default. By replacing the standard event loop with uvloop, the Functions Python worker delivers higher throughput and lower latency for asynchronous workloads — no code changes required.
Introduction
Azure Functions powers millions of customer scenarios, from real-time APIs to event-driven automation. For Python developers, scalability often comes down to how efficiently the runtime handles I/O, concurrency, and asynchronous workloads.
That’s why, starting with Python 3.13, the Azure Functions Python worker now uses uvloop as its default event loop. Built on top of libuv (the same library behind Node.js), uvloop provides a drop-in replacement for Python’s standard asyncio loop with measurable performance improvements.
For customers, this means faster request handling and more responsive serverless applications — without having to update a single line of app code.
Why Event Loops Matter
The event loop is the backbone of any asynchronous Python application. It schedules coroutines, manages I/O events, and drives concurrency. In serverless workloads like Azure Functions, this loop runs continuously to:
- Handle incoming HTTP requests
- Dispatch and complete async tasks (like database queries or service calls)
- Manage parallel event processing (Event Hubs, Service Bus, etc.)
The default Python event loop on Linux (asyncio's SelectorEventLoop) is reliable, but it wasn't designed for high-throughput scenarios at massive scale. uvloop, by contrast, is a high-performance reimplementation in Cython that consistently outperforms the built-in loop in both throughput and latency.
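A quick way to see which implementation is active is to inspect the class of the running loop. This minimal, self-contained sketch reports the standard selector loop on a stock Python install; under the Functions Python 3.13+ worker it would report uvloop's loop class instead:

```python
import asyncio

async def report_loop() -> str:
    # The class of the running loop reveals which implementation is active:
    # asyncio's selector loop by default, uvloop's Loop when uvloop is in use.
    loop = asyncio.get_running_loop()
    return f"{type(loop).__module__}.{type(loop).__qualname__}"

print(asyncio.run(report_loop()))
```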
How It Works in Azure Functions
In Python 3.13+, the Azure Functions Python worker sets uvloop as the default event loop policy at startup:
```python
import asyncio
import uvloop

# Make uvloop the policy for all event loops the worker creates
asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
```
This means any async workload — whether you're using async def in your functions, calling external APIs, or parallelizing work with asyncio.gather — benefits immediately from uvloop's faster scheduling and I/O handling. uvloop ships with the Functions runtime environment, so there is nothing to install.
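A fan-out pattern like the following is exactly the kind of workload whose scheduling uvloop accelerates. This is a self-contained sketch that uses asyncio.sleep as a stand-in for real I/O such as database queries or HTTP calls:

```python
import asyncio
import time

async def backend_call(i: int) -> int:
    # Stand-in for an I/O-bound operation (database query, API call)
    await asyncio.sleep(0.1)
    return i * 2

async def handle_request() -> list[int]:
    # Fan out five calls concurrently; the event loop interleaves their waits
    return await asyncio.gather(*(backend_call(i) for i in range(5)))

start = time.perf_counter()
results = asyncio.run(handle_request())
elapsed = time.perf_counter() - start
print(results)            # [0, 2, 4, 6, 8]
print(f"{elapsed:.2f}s")  # close to 0.1s, not 0.5s, because the calls overlap
```

Because the five awaits overlap on the event loop, total latency is governed by the slowest call, not the sum of all five — and a faster loop shaves overhead from every scheduling step.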
No configuration changes, no requirements.txt edits, and no feature flags. If you’re running Functions on Python 3.13 or higher, uvloop is already in play.
Measuring the Performance Gains
We tested uvloop against the existing Unix event loop across several realistic workloads. For the Azure tests on Flex Consumption, the baseline app (standard event loop) ran Python 3.12 and the uvloop app ran Python 3.13; both used an instance size of 2048 MB. Results are the median of three runs for each test case.
Test 1: 10k Requests, 50 Virtual Users
| Environment | Event Loop | Average HTTP Request Time (ms) | Requests per Second | % Diff vs unix |
|---|---|---|---|---|
| Local | unix | 96.95 | 515 | - |
| Local | uvloop | 87.99 | 565 | +9.7% |
| Azure | unix | 54.34 | 882 | - |
| Azure | uvloop | 51.77 | 923 | +4.8% |
Test 2: Sustained Load, 100 Virtual Users (5 min)
| Environment | Event Loop | Number of Requests | Requests per Second | % Diff vs unix |
|---|---|---|---|---|
| Local | unix | 157,580 | 525 | - |
| Local | uvloop | 167,928 | 560 | +6.4% |
| Azure | unix | 571,797 | 1,898 | - |
| Azure | uvloop | 588,458 | 1,961 | +2.9% |
Test 3: Heavy Concurrency, 500 Virtual Users + 5 async tasks per request
| Environment | Event Loop | Number of Requests | Requests per Second | % Diff vs unix |
|---|---|---|---|---|
| Local | unix | 216,212 | 720 | - |
| Local | uvloop | 231,878 | 772 | +7% |
| Azure | unix | 1,791,600 | 5,696 | - |
| Azure | uvloop | 1,806,750 | 6,020 | +1% |
Under this heavy concurrency, the standard Unix event loop began failing on roughly 2% of requests in both environments.
Across the board, uvloop delivered measurable improvements in throughput and latency — especially under high concurrency.
Why Only Python 3.13+?
While uvloop works with older versions of Python, we rolled it out as the default starting in 3.13 because:
- It kept the change a strict net positive for performance and stability
- It made rollout easier across all available Azure Functions SKUs without risking breakage for existing customers
- The Python 3.13 worker introduces a Proxy Worker, and uvloop provides an additional performance boost that helps offset the extra overhead it adds
Older runtimes remain on the standard event loop to minimize compatibility risks.
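That said, apps on older runtimes can opt in manually from their own code. This is a hedged sketch, not an official Functions feature: it assumes you add uvloop to requirements.txt yourself, and it falls back to the standard loop wherever uvloop is unavailable:

```python
import asyncio

try:
    import uvloop
    # Install uvloop as the policy for any event loops created afterwards
    asyncio.set_event_loop_policy(uvloop.EventLoopPolicy())
except ImportError:
    # uvloop not installed (or unsupported platform): keep the standard loop
    pass
```

Running this at module import time, before any event loop is created, ensures every subsequent loop picks up the policy.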
Challenges and Lessons Learned
Integrating uvloop into the Functions Python worker surfaced a few interesting challenges:
- Compatibility: Ensuring uvloop worked seamlessly across Linux environments at scale
- Observability: Updating logs to confirm which event loop policy was active
- Benchmark design: Testing realistic workloads (HTTP requests, async fan-out) to validate improvements beyond microbenchmarks
Through this process, we confirmed uvloop consistently improved throughput and latency without regressions.
Future Directions
Switching to uvloop is just one step in making Azure Functions Python faster and more scalable. Looking ahead, we’re exploring:
- Deeper async optimizations: further tuning around asyncio and gRPC handling
- Serialization improvements: building on work like orjson for faster data processing
- Cold start performance: reducing startup overhead in Python workers
Conclusion
By adopting uvloop as the default event loop for Python 3.13+, Azure Functions makes async workloads faster, more reliable, and more scalable — all without requiring customers to change their code.
If you’re upgrading to Python 3.13 for your Functions apps, uvloop is already running under the hood to give you better performance out of the box.