Educator Developer Blog

.NET 10 and Memory: Less Heap, Smarter GC, Faster Apps

Dec 08, 2025

When .NET 10 Thinks Like a Performance Engineer

As Microsoft steps into the Ignite 2025 era of “AI-first everything” across Azure, Foundry, and cloud-native workloads, .NET quietly got its own big upgrade: .NET 10, a new long-term support (LTS) release and the official successor to .NET 9, supported for three years.

The release was celebrated at .NET Conf 2025 in November, where Microsoft shipped .NET 10 alongside Visual Studio 2026 and highlighted performance, memory efficiency and cloud-readiness as core pillars of the platform. A few days later at Microsoft Ignite 2025 in San Francisco, the story zoomed out: AI agents, Azure-hosted workloads, and App Service / Functions all moved forward with first-class .NET 10 support, positioning this runtime as the default foundation for modern cloud and AI solutions.

I’m Hazem Ali, a Microsoft MVP, Principal AI & ML Engineer / Architect, and Founder & CEO of Skytells. 

In this article, I’ll walk through what .NET 10 actually changes in memory, heap, and stack behavior—and why that matters if you’re building high-throughput APIs, AI agents, or cloud-native services in the post-Ignite world.

At a high level, .NET 10 does three big things for memory:

  1. Allocates more objects on the stack instead of the heap.
  2. Teaches the JIT to understand more “hidden” allocations (delegates, spans, small arrays).
  3. Leans on a smarter GC mode (DATAS) that adapts heap size to your app instead of your machine.

Let’s unpack that in plain language.


1. Heap vs Stack in 60 Seconds

Quick mental model:

  • Stack

    • Used for short-lived data.
    • Allocation is extremely fast — often just advancing a pointer.
    • Memory is released automatically when the function returns.
    • Very cache-friendly due to tight, contiguous layout.

  • Heap

    • Used for long-lived or shared objects.
    • Allocation is slower and requires runtime management.
    • Objects are tracked by the garbage collector (GC) in managed runtimes.
    • Creating too many short-lived objects on the heap increases GC pressure, which can lead to pauses and more cache misses.

So any time the runtime can say:

“This object definitely dies when this method returns”

…it can put it on the stack instead, and the GC never has to care.

.NET 10’s main memory trick is expanding how often it can safely do that.
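As a quick illustration of the baseline rule (hypothetical types, not from the release notes): a struct local lives on the stack, while a class instance with the same fields is heap-allocated and tracked by the GC.

```csharp
// Same shape, different homes: the struct local lives on the stack
// (or in registers); the class instance goes to the heap.
struct PointStruct { public int X, Y; }
class PointClass { public int X, Y; }

static int Sum()
{
    var s = new PointStruct { X = 1, Y = 2 }; // stack: freed when Sum returns
    var c = new PointClass { X = 1, Y = 2 };  // heap: the GC must reclaim it
    return s.X + s.Y + c.X + c.Y;
}
```

What .NET 10 changes is how often the runtime can give class-like allocations (delegates, small arrays) the same stack treatment automatically.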


2. From Heap to Stack: .NET 10 Gets Serious About Escape Analysis

The JIT in .NET 10 spends more effort answering one question:

“Does this object escape the current method?”

If the answer is “no”, it can be stack-allocated. This is called escape analysis, and .NET 10 pushes it much further than .NET 9.

2.1 Delegates and Lambdas That Don’t Leak

Consider this simple closure:

int SumTwice(int y)
{
    Func<int, int> addY = x => x + y;
    return DoubleResult(addY, y);

    static int DoubleResult(Func<int, int> f, int v)
        => f(v) * 2;
}

The C# compiler turns that into:

  • A hidden closure class with a field y.
  • A delegate object pointing at a method on that closure.

In .NET 9, both of these lived on the heap.

In .NET 10, if the JIT can inline DoubleResult and prove the delegate never escapes the method, the delegate object is stack-allocated instead. Benchmarks in the official performance blog show:

  • ~3× faster for this pattern
  • ~70% fewer bytes allocated (only the closure remains on the heap)

You don’t change the code; the JIT just stops paying the “lambda tax” as often.
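If you want to observe the effect yourself without a full BenchmarkDotNet run, one lightweight probe is `GC.GetAllocatedBytesForCurrentThread()`, which reports the bytes allocated on the current thread; comparing the delta around many calls to `SumTwice` on .NET 9 vs .NET 10 makes the difference visible. (A rough sketch; for publishable numbers, prefer BenchmarkDotNet's [MemoryDiagnoser].)

```csharp
// Rough allocation probe: measure bytes allocated across many calls.
long before = GC.GetAllocatedBytesForCurrentThread();

for (int i = 0; i < 1_000_000; i++)
    SumTwice(i);

long after = GC.GetAllocatedBytesForCurrentThread();
Console.WriteLine($"Allocated: {after - before} bytes");
```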


2.2 Small Arrays of Value Types on the Stack

.NET 10 adds the ability to stack-allocate small, fixed-size arrays of value types (that don’t hold GC references) when they clearly don’t outlive the method.

Example from the official docs:

static void Sum()
{
    int[] numbers = { 1, 2, 3 };
    int sum = 0;

    for (int i = 0; i < numbers.Length; i++)
        sum += numbers[i];

    Console.WriteLine(sum);
}

The runtime article explicitly states that numbers is now stack-allocated in this scenario, because:

  • The size is known at compile time (int[3]).
  • The array never escapes the method.

Result: no heap allocation for that small buffer, and one less thing for the GC to track.


2.3 Small Arrays of Reference Types

Historically, arrays of reference types (string[], object[], etc.) have always been heap allocations in .NET, and this remains true in .NET 10. The GC must track the references stored in these arrays, which makes stack allocation impossible.

However, .NET 10 significantly reduces the cost of using small ref-type arrays by improving escape analysis around the patterns that create and consume them. While the array itself still lives on the heap, many of the associated allocations that previously accompanied these patterns can now be eliminated entirely.

Example:

static void Print()
{
    string[] words = { "Hello", "World!" };
    foreach (var s in words)
        Console.WriteLine(s);
}

In .NET 9, using a small string[] like this typically incurred extra hidden allocations (iterator objects, closure artifacts, helper frames).

In .NET 10, if the JIT can prove the code is fully local and non-escaping:

  • Iterator-related allocations can be removed,
  • Delegate and closure helpers may be stack-allocated or optimized away,
  • The only remaining heap object is the array itself — with no additional GC noise.

A similar pattern appears in the performance blog’s benchmark:

[Benchmark]
public void Test()
{
    Process(new string[] { "a", "b", "c" });

    static void Process(string[] inputs)
    {
        foreach (string input in inputs)
            Use(input);

        static void Use(string s) { }
    }
}

On .NET 10, this benchmark shows zero additional heap allocations beyond the array itself, because the runtime eliminates the iterator and closure allocations that .NET 9 would create. The array still resides on the heap, but the overall memory footprint effectively drops to zero for the surrounding pattern.


2.4 Structs, Spans, and “Hidden” References

.NET 10’s improved escape analysis can recognize when neither the struct nor its referenced array escapes, enabling the runtime to eliminate unnecessary heap allocations around the pattern.

From the runtime docs:

struct GCStruct
{
    public int[] arr;
}

public static int Main()
{
    int[] x = new int[10];
    GCStruct y = new GCStruct() { arr = x };
    return y.arr[0];
}

  • In .NET 9, x is treated as escaping (through y) and lives on the heap.
  • In .NET 10, the JIT understands that neither y nor x escapes, so it can stack-allocate the array and associated data.

This also benefits types like Span&lt;T&gt; (which is just a struct wrapping a reference and a length) and unlocks more cases where spans and small arrays become stack-only, not heap noise.
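For comparison, C# has long let you opt into stack allocation explicitly with `stackalloc` and `Span<T>`; .NET 10's escape analysis simply makes more of the implicit cases behave this way without you asking:

```csharp
static int SumStack()
{
    // Explicitly stack-allocated buffer: never touches the heap,
    // and is valid only until this method returns.
    Span<int> numbers = stackalloc int[] { 1, 2, 3 };

    int sum = 0;
    foreach (int n in numbers)
        sum += n;
    return sum;
}
```

The design trade-off: `stackalloc` is a manual promise that the buffer doesn't escape, while escape analysis is the JIT proving the same thing for ordinary `new`-ed arrays.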


3. DATAS: The GC That Adapts to Your App Size

On the GC side, the key concept is DATAS: Dynamic Adaptation To Application Sizes.

  • Introduced as an opt-in mode in .NET 8.
  • Enabled by default in .NET 9.

By the time you land on .NET 10 LTS, DATAS is the default GC behavior for most apps, with more tuning and guidance from the GC team.

3.1 What DATAS Actually Does

Official docs describe DATAS as a GC mode that:

  • Adapts the heap size to the app’s memory requirements,
  • Keeps the heap size roughly proportional to the live (long-lived) data size,
  • Shrinks when the workload gets lighter and grows when it gets heavier.

That’s different from classic Server GC, which:

  • Assumes your process “owns” the machine,
  • Grows the heap aggressively if there’s memory available,
  • May end up with very different heap sizes depending on hardware.

DATAS is especially targeted at bursty workloads and containerized apps where memory actually costs money and you might have many processes on the same node.

3.2 How You Control It

From the GC configuration docs:

DATAS can be toggled via:

  • Environment variable:

    • DOTNET_GCDynamicAdaptationMode=1 → enable
    • DOTNET_GCDynamicAdaptationMode=0 → disable
  • runtimeconfig.json:

    • "System.GC.DynamicAdaptationMode": 1 or 0
  • MSBuild property:

    • 1

But again: starting with .NET 9, it’s on by default, so in .NET 10 you typically only touch this if you have a very specific perf profile where DATAS isn’t a good fit.


4. What This Means for Your Apps

Putting it together:

You get fewer “silly” allocations for free

.NET 10’s runtime now:

  • Stack-allocates more delegates, closures, spans, and small arrays when they don’t escape.
  • Reduces the abstraction penalty of idiomatic C# (LINQ, foreach, lambdas, etc.), so you don’t have to micro-optimize everything yourself.

The GC behaves more like “pay for what you really use”

With DATAS:

  • Your heap won’t balloon just because you moved your app to a bigger SKU.
  • Memory usage tracks live data instead of “whatever the machine has spare”.

You still keep control when needed

If you have:

  • A latency-critical, always-hot service on a big dedicated machine,
  • Or you’ve benchmarked and found DATAS not ideal for a specific scenario,

…you can still flip DOTNET_GCDynamicAdaptationMode off and go back to classic Server GC semantics.


5. TL;DR for Busy Teams

If you’re scanning this on a Friday:

  • Upgrading from .NET 8 → 10 LTS gives you:

    • Tuned DATAS GC as the default,
    • Better JIT escape analysis,
    • Stack-allocation of more small arrays and delegates.
  • You don’t need to rewrite your code to benefit; just recompile and deploy.

  • For critical services, benchmark with and without DATAS (toggle via DOTNET_GCDynamicAdaptationMode) and pick what fits your SLOs.

That’s the memory game-changer in .NET 10: the runtime quietly moves more stuff off the heap, while the GC learns to grow and shrink based on your real live data, not just the machine it’s sitting on.
