
Apps on Azure Blog

How to use Azure Table Storage with .NET Aspire and a Minimal API

FBoucher
Copper Contributor
Apr 30, 2025

Azure Storage is a versatile cloud storage solution that I've used in many projects. In this post, I'll share my experience integrating it into a .NET Aspire project through two perspectives: first, by building a simple demo project to learn the basics, and then by applying those learnings to migrate a real-world application, AzUrlShortener.

This post is part of a series about modernizing the AzUrlShortener project:

Part 1: Learning using a simple project

For this post, we will use a simpler project instead of the full AzUrlShortener solution to make it easier to follow. Make a copy of the AspireAzStorage repository (fork it) and explore it.

πŸ’‘All the code is available on GitHub: AspireAzStorage

The Context

This tutorial demonstrates how to create a .NET Aspire solution with a Minimal API that retrieves employee data from Azure Table Storage. We'll build a clean, structured solution that can run both locally and in Azure.

The structure of the solution was created with a simple dotnet new webapi -n EmployeeApi -o EmployeeDemo\EmployeeApi command. Then, from your favorite editor, right-click the project in the Solution Explorer and select "Add .NET Aspire Orchestration".

For the AppHost to be able to orchestrate an Azure Storage account, we need to add the Aspire.Hosting.Azure.Storage package. This can be done in many ways; using the CLI, it would look like dotnet add AppHost package Aspire.Hosting.Azure.Storage.
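For reference, the CLI steps so far can be grouped together (project and folder names are the ones used in this post):

```shell
# Create the Minimal API project
dotnet new webapi -n EmployeeApi -o EmployeeDemo\EmployeeApi

# Add the Azure Storage hosting integration to the AppHost project
dotnet add AppHost package Aspire.Hosting.Azure.Storage
```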

Defining the Orchestration to use Azure Storage

We want the API to read data from an Azure Table Storage table and return the result. Using dependency injection (DI), we can add an Azure Storage account to the AppHost project, specify that we need the table client, and pass it to the API project. The code of Program.cs in the AppHost project would look like this:

using Microsoft.Extensions.Hosting;

var builder = DistributedApplication.CreateBuilder(args);

var azStorage = builder.AddAzureStorage("azstorage");

if (builder.Environment.IsDevelopment())
{
    azStorage.RunAsEmulator();
}

var strTables = azStorage.AddTables("strTables");

builder.AddProject<Projects.Api>("api")
    .WithExternalHttpEndpoints()
    .WithReference(strTables)
    .WaitFor(strTables);

builder.Build().Run();

The azStorage variable is the reference to the Azure Storage account, and strTables is the reference to the table client. To be able to execute the solution locally, we check whether the environment is Development and, if so, run the Azure Storage emulator. This lets .NET Aspire create an Azurite container to emulate the Azure Storage account. In production the emulator is not needed, and the real Azure Storage account is used. Finally, we pass the strTables reference to the API project and make sure the client is ready before starting the API.

The Minimal API project

We already know that our project is expecting an Azure Table Storage client, so we can add the Aspire.Azure.Data.Tables package to the API project. Using the CLI, the command is dotnet add EmployeeApi package Aspire.Azure.Data.Tables. Then we can add builder.AddAzureTableClient("strTables"); just before the app creation in the Program.cs file.

The beauty of a Minimal API is that it is very flexible and can be as minimal or as structured as you want. When the project is created, everything is in the Program.cs file. That makes it easy to follow and understand, but as the project grows, it can become hard to maintain. To make it easier to maintain, we can move the endpoints, models, and services into distinct files and folders. That leaves our Program.cs with only the following code:

using Api.Endpoints;

var builder = WebApplication.CreateBuilder(args);

builder.AddServiceDefaults();

// Add services to the container.
// Learn more about configuring OpenAPI at https://aka.ms/aspnet/openapi
builder.Services.AddOpenApi();

builder.AddAzureTableClient("strTables");

var app = builder.Build();
app.MapDefaultEndpoints();

// Configure the HTTP request pipeline.
if (app.Environment.IsDevelopment())
{
    app.MapOpenApi();
}

app.UseHttpsRedirection();

// Add the Employee Endpoints
app.MapEmployeeEndpoints();

app.Run();

And the rest of the code is split into different files and folders. The structure of the project is as follows:

EmployeeApi/
β”œβ”€β”€ Endpoints/        
β”‚   └── EmployeeEndpoints.cs    # Endpoints for the Employee API
β”œβ”€β”€ Models/
β”‚   └── EmployeeEntity.cs       # Model for the Azure Table Storage
β”œβ”€β”€ Services/
β”‚   └── AzStorageTablesService.cs
β”œβ”€β”€ Api.http                    # HTTP file to test the API
└── Program.cs                  # Main file of the Minimal API
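The EmployeeEntity model is not reproduced in this post, but any entity stored with Azure.Data.Tables must implement the ITableEntity interface. A minimal sketch (the FirstName and LastName properties are assumptions for this demo) could look like this:

```csharp
using Azure;
using Azure.Data.Tables;

namespace Api.Models;

// Minimal entity for the Employee table. ITableEntity requires
// PartitionKey, RowKey, Timestamp, and ETag; every other public
// property is persisted as a table column.
public class EmployeeEntity : ITableEntity
{
    public string PartitionKey { get; set; } = string.Empty;
    public string RowKey { get; set; } = string.Empty;
    public DateTimeOffset? Timestamp { get; set; }
    public ETag ETag { get; set; }

    // Demo properties (assumed names)
    public string FirstName { get; set; } = string.Empty;
    public string LastName { get; set; } = string.Empty;
}
```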

Employee Endpoints

You may have noticed at the end of the Program.cs file, we are calling app.MapEmployeeEndpoints(). This is a custom extension method that will add the endpoints to the API.

    public static void MapEmployeeEndpoints(this IEndpointRouteBuilder app)
    {
        var endpoints = app.MapGroup("api")
                            .WithOpenApi();

        MapGetAllEmployees(endpoints);
        MapGetAllEmployeesAsync(endpoints);
        MapGetEmployeesByFirstLetter(endpoints);
        MapGetEmployeesGroupByFirstLetterFirstNameAsync(endpoints);
        MapGenerateEmployees(endpoints);
    }

This will group all the endpoints under the /api path and add the OpenAPI documentation. Then we can define each endpoint in a different method. For example, the MapGetAllEmployees method will look like this:

    private static void MapGetAllEmployees(IEndpointRouteBuilder endpoints)
    {
        endpoints.MapGet("/GetEmployeesAsync", (TableServiceClient client) => GetAllEmployeeAsync(new AzStrorageTablesService(client)))
            .WithName("GetAllEmployees")
            .WithDescription("Get all employees from the table storage");
    }

Note the TableServiceClient client parameter. This is the Azure Table Storage client that was created previously and is passed in using DI. We pass it to the AzStrorageTablesService service, which is responsible for interacting with Azure Table Storage. The WithName and WithDescription methods add metadata to the endpoint that will be used in the OpenAPI documentation.

The Azure Table Storage Service

To make sure the Employee table exists when the queries are executed, we can use the AzStrorageTablesService constructor to create the table if it does not exist, and instantiate the table client.

    private readonly TableClient _employeeTableClient;

    public AzStrorageTablesService(TableServiceClient client)
    {
        client.CreateTableIfNotExists("Employee");
        _employeeTableClient = client.GetTableClient("Employee");
    }

The only thing left is to implement the GetAllEmployeeAsync method that will query the table and return the result.

    public async Task<List<EmployeeEntity>> GetAllEmployeeAsync()
    {
        var lstEmployees = new List<EmployeeEntity>();
        var queryResult = _employeeTableClient.QueryAsync<EmployeeEntity>();

        await foreach (var emp in queryResult.AsPages().ConfigureAwait(false))
        {
            lstEmployees.AddRange(emp.Values);
        }

        return lstEmployees;
    }

To make sure all records are returned, we use the AsPages method. This fetches all employees from all pages, fills a list, and returns it.
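Because QueryAsync returns an AsyncPageable<T>, which implements IAsyncEnumerable<T>, an equivalent alternative is to enumerate the items directly and let the SDK fetch the pages behind the scenes. A minimal sketch:

```csharp
public async Task<List<EmployeeEntity>> GetAllEmployeeAsync()
{
    var lstEmployees = new List<EmployeeEntity>();

    // AsyncPageable<T> transparently requests the next page as we iterate.
    await foreach (var emp in _employeeTableClient.QueryAsync<EmployeeEntity>().ConfigureAwait(false))
    {
        lstEmployees.Add(emp);
    }

    return lstEmployees;
}
```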

Testing the API

To test the API manually, we can use the Api.http file. This is a simple text file that contains HTTP requests. For example, to get all employees, the content of the file would look like this:

@Api_HostAddress = https://localhost:7125

### Get all employees
GET {{Api_HostAddress}}/api/GetEmployeesAsync
Accept: application/json

Putting everything together

The demo solution contains more endpoints, but the structure is the same. There is a /generate/{quantity?} endpoint to populate the employee table. It uses Bogus, a simple open-source fake data generator for .NET.
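The generator itself is not reproduced here; with Bogus, a sketch of creating and inserting fake employees could look like this (the property names and partitioning choice are assumptions based on the demo):

```csharp
using Bogus;

// Build a rule set that fills EmployeeEntity with realistic fake data.
// Partitioning by last name is just one possible choice for a demo.
var faker = new Faker<EmployeeEntity>()
    .RuleFor(e => e.PartitionKey, f => f.Name.LastName())
    .RuleFor(e => e.RowKey, f => Guid.NewGuid().ToString())
    .RuleFor(e => e.FirstName, f => f.Name.FirstName())
    .RuleFor(e => e.LastName, (f, e) => e.PartitionKey);

// Generate the requested quantity and insert each entity into the table.
foreach (var employee in faker.Generate(quantity))
{
    await _employeeTableClient.AddEntityAsync(employee);
}
```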

To run the solution locally, a simple F5 should be enough. Aspire will start the Azurite container and the API. You can then use the Api.http file to generate some employees and get the list of employees.

To deploy the solution to Azure, you can use the Azure Developer CLI (azd). With azd init you can create a new project, and with azd up you can deploy the solution to Azure. In a few minutes the solution will be available in the cloud, but this time it will be using a real Azure Storage account. Nothing else to change, the code is the same.
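The azd flow, run from the repository root, looks like this:

```shell
# Initialize the azd project (one time); azd inspects the AppHost
# and generates the infrastructure files (Bicep) it needs.
azd init

# Provision the Azure resources and deploy the solution.
azd up
```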

Part 2: Lessons learned while migrating AzUrlShortener

The little experiment with AspireAzStorage convinced me. Using Azure Table Storage with .NET Aspire is simple, but as we all know, a real project is more complex. Therefore I was expecting some challenges. What a disappointment: there were none. Everything worked as expected.

The AzUrlShortener project was written a few years ago and was using the Microsoft.Azure.Cosmos.Table package. This package is still valid today, but there is now a dedicated one for Azure Tables: Azure.Data.Tables. The migration wasn't a simple find-and-replace: a few objects have different names, and the queries are written a bit differently. But the migration was done in a few hours.
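For reference, the renames map roughly as sketched below. The entity and table names here are hypothetical, not the actual AzUrlShortener code, and the filter is just an example:

```csharp
using Azure.Data.Tables;

// Old (Microsoft.Azure.Cosmos.Table)        -> New (Azure.Data.Tables)
// CloudTableClient / CloudTable             -> TableServiceClient / TableClient
// entities derive from TableEntity          -> entities implement ITableEntity
// TableQuery<T> + ExecuteQuerySegmentedAsync
//   with continuation tokens                -> QueryAsync<T> returning AsyncPageable<T>

// UrlEntity is a hypothetical ITableEntity implementation.
var tableClient = new TableServiceClient(connectionString).GetTableClient("Urls");

await foreach (var url in tableClient.QueryAsync<UrlEntity>(u => u.PartitionKey == "demo"))
{
    Console.WriteLine(url.RowKey);
}
```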

The deployment worked on the first try. I tested the data migration using Azure Storage Explorer. The GitHub Action will have to be updated, but with the Bicep files that azd generates, it should be simple.

Conclusion

I really enjoyed this journey of migrating the AzUrlShortener project as much as building AspireAzStorage. I invite you to fork that repository and play with it. Would you have done something differently? Do you have any questions? Feel free to ask in the comments below or reach out to me directly at @fboucheros.bsky.social.

Want to Learn more?

To learn more about Azure Container Apps, I strongly suggest this repository: Getting Started .NET on Azure Container Apps. It contains many step-by-step tutorials (with videos) to learn how to use Azure Container Apps with .NET.

In video please

 

Updated May 02, 2025
Version 4.0