
Azure SQL Blog

10K Request per Second: REST API with Azure SQL, Dapper and JSON

damauri
Microsoft
Feb 23, 2020

In the previous article I described how easy it is to create a REST API with Python and Azure SQL. But what about .NET? Would it be just as easy? Absolutely yes: the native JSON support offered by Azure SQL will help a lot here too. In addition, the micro-ORM Dapper will make interacting with the database a breeze, and the new System.Text.Json will take care of manipulating that developer-friendly format. Sounds too good and easy to be true? Well, let's prove it!


I started creating a WebAPI project using .NET CLI (I'm using .NET Core 3):

 

dotnet new webapi

 

and then I added the following packages

  • Dapper
  • Microsoft.Data.SqlClient
  • System.Text.Json

via

 

dotnet add package

 

Dapper is an amazing micro-ORM that removes all the plumbing code needed to connect to and query a database, so that you can focus on the code that really matters. It is battle-tested, as it is used by StackOverflow itself, and highly recommended. If you want a kickstart, there is a 10-post tutorial here: Dapper.NET @ Medium, along with full source code for all the described features.
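As a quick taste of what Dapper looks like, here is a minimal sketch (not code from the sample repo; the connection string and the Customer class are assumptions for illustration):

```csharp
using System.Threading.Tasks;
using Dapper;
using Microsoft.Data.SqlClient;

public class Customer
{
    public int CustomerID { get; set; }
    public string CustomerName { get; set; }
}

public static class DapperTaste
{
    public static async Task<Customer> GetCustomerAsync(string connectionString, int id)
    {
        using var conn = new SqlConnection(connectionString);
        // Dapper opens the connection on demand, sends @Id as a proper
        // parameter, and maps result columns to Customer properties by name.
        return await conn.QuerySingleOrDefaultAsync<Customer>(
            "SELECT CustomerID, CustomerName FROM Sales.Customers WHERE CustomerID = @Id",
            new { Id = id });
    }
}
```

No connection-open calls, no DataReader loops, no manual mapping: that is the plumbing Dapper removes.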
 
From a database perspective I'm using almost the same code I used to build the same sample with Python, leveraging as much as possible the native JSON support that Azure SQL provides:

 

CREATE OR ALTER PROCEDURE web.get_customer
    @Id INT
AS
SET NOCOUNT ON;
SELECT
    [CustomerID],
    [CustomerName],
    [PhoneNumber],
    [FaxNumber],
    [WebsiteURL],
    [DeliveryAddressLine1] AS 'Delivery.AddressLine1',
    [DeliveryAddressLine2] AS 'Delivery.AddressLine2',
    [DeliveryPostalCode] AS 'Delivery.PostalCode'
FROM
    [Sales].[Customers]
WHERE
    [CustomerID] = @Id
FOR JSON PATH
GO


Thanks to the FOR JSON PATH clause, the result will be like the following:

 

[{
    "CustomerID": 123,
    "CustomerName": "Tailspin Toys (Roe Park, NY)",
    "PhoneNumber": "(212) 555-0100",
    "FaxNumber": "(212) 555-0101",
    "WebsiteURL": "http://www.tailspintoys.com\/RoePark",
    "Delivery": {
        "AddressLine1": "Shop 219",
        "AddressLine2": "528 Persson Road",
        "PostalCode": "90775"
    }
}]


Since the result is a JSON string, all that's needed to turn it into an object is to deserialize it. In the sample on GitHub I decided to keep the JSON as JSON, since I want to take advantage of its flexibility as much as possible, but that's of course up to you.
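If you do prefer a typed object, System.Text.Json can deserialize the FOR JSON output directly. A sketch, with an illustrative Customer class (note that FOR JSON PATH always wraps the result in an array, even for a single row):

```csharp
using System.Text.Json;

public class Customer
{
    public int CustomerID { get; set; }
    public string CustomerName { get; set; }
}

public static class JsonDemo
{
    public static Customer FromJson(string json)
    {
        // FOR JSON PATH returns an array even for a single row, hence Customer[].
        var customers = JsonSerializer.Deserialize<Customer[]>(json);
        return (customers != null && customers.Length > 0) ? customers[0] : null;
    }
}
```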


Using Dapper you literally need just two lines of code:

 

var qr = await conn.ExecuteScalarAsync<string>(
    sql: procedure,
    param: parameters,
    commandType: CommandType.StoredProcedure
);

if (qr != null)
    result = JsonDocument.Parse(qr);

 

And you're done. The result variable holds the JSON, ready to be used. If you are creating a REST API, you can just return it to the client and…well, nothing else: done!
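For context, here is how those two lines could sit in a small repository class (a sketch; the parameter wiring and class names are mine, the connection string is assumed):

```csharp
using System.Data;
using System.Text.Json;
using System.Threading.Tasks;
using Dapper;
using Microsoft.Data.SqlClient;

public class CustomerRepository
{
    private readonly string _connectionString;

    public CustomerRepository(string connectionString)
    {
        _connectionString = connectionString;
    }

    public async Task<JsonDocument> GetCustomerAsync(int id)
    {
        var procedure = "web.get_customer";
        var parameters = new DynamicParameters();
        parameters.Add("Id", id);

        using var conn = new SqlConnection(_connectionString);
        // FOR JSON PATH returns the result set as a JSON string,
        // so ExecuteScalarAsync<string> is enough to fetch it.
        var qr = await conn.ExecuteScalarAsync<string>(
            sql: procedure,
            param: parameters,
            commandType: CommandType.StoredProcedure
        );

        return qr != null ? JsonDocument.Parse(qr) : null;
    }
}
```

One caveat, discussed in the comments at the bottom of this post: very large FOR JSON results are split across multiple rows, so ExecuteScalarAsync only works when the result fits in a single row.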


As usual the full source code is available on GitHub for you to play with:

 

Azure-Samples/azure-sql-db-dotnet-rest-api

 

To make the sample easily reusable, I wrapped the logic that executes the request to the database into the ControllerQuery base class, so it is very easy to derive from it to create more specialized controller classes.

 

On the database side, I created a convention for stored procedure naming so that code can automatically call the correct procedure for the correct verb and controller:

 

web.<verb>_<controller> [@id], [@payload]
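In code, resolving that convention could look something like this (an assumed helper for illustration, not necessarily the exact implementation in the repo):

```csharp
using System;

public static class ProcedureNameResolver
{
    // "get" + CustomersController -> "web.get_customers"
    public static string Resolve(string verb, Type controllerType)
    {
        var controller = controllerType.Name
            .Replace("Controller", string.Empty)
            .ToLowerInvariant();
        return $"web.{verb}_{controller}";
    }
}

public class CustomersController { }
```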


For example, the Get method of the CustomersController will call the stored procedure web.get_customers behind the scenes, and looks something like this:

 

[HttpGet]
public async Task<JsonElement> Get()
{
    return await this.Query("get", this.GetType());
}

 

id and payload are two optional parameters: id can be used to target a specific row, while payload is the JSON document that will be passed to the database for further manipulation.
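A derived controller could surface both parameters along these lines (a sketch; the Query overloads on the base class are assumptions, not the repo's exact signatures):

```csharp
using System.Text.Json;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Mvc;

public class CustomersController : ControllerQuery
{
    [HttpGet("{id}")]
    public async Task<JsonElement> Get(int id)
    {
        // Ends up calling: EXEC web.get_customers @id
        return await this.Query("get", this.GetType(), id);
    }

    [HttpPost]
    public async Task<JsonElement> Post([FromBody] JsonElement payload)
    {
        // Ends up calling: EXEC web.post_customers @payload
        return await this.Query("post", this.GetType(), payload: payload);
    }
}
```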


Honestly, that's amazing. Clean code: very lean, maintainable, flexible and fast. Fast? Yeah, what about performance? There will be a dedicated post on this subject, but let's just say that with this configuration:

  • Azure SQL Hyperscale HS_Gen5_2
  • Web App Linux DotNet Core 3.1, P1V1, 1 node

I was able to execute 1,100 requests per second with a median response time of 20 msec. If you can accept a slightly higher latency, you can also reach 1,500 RPS, but the median response time becomes 40 msec and the 95th percentile sits at 95 msec. Database usage never goes above 20% in these cases…in fact the bottleneck is the Web App (or better, the Web App Plan), and more specifically its CPU. Time to scale the Web App Plan up or out.

 

By scaling up and out a bit, I was able to reach almost 10,000 requests per second with just an HS_Gen5_4. Quite impressive.

 


If you're wondering what solution I used to create the load tests, you can find all the details, along with deployable code, here: Running Locust On Azure.


Now, the solution may seem complete already…but is it really? Not quite: when working with the cloud, we always need to think about resiliency. Luckily for us, the .NET SqlClient driver already does quite a good job, as it automatically handles Idle Connection Resiliency.


For now, it's more than enough. I'll discuss Connection Resiliency in detail in a future post, as it is a topic that deserves an article on its own.


In the meantime, have fun with Azure SQL, .NET, Python or any language you want to use.

Updated Nov 09, 2020
Version 4.0
  • Hi marekott ,

    you're absolutely right: if you have to return a big JSON document, you need to manually concatenate the results, as they will be split across more than one row. I'll make sure to call this out in the GitHub repo, so everyone will be aware.

     

    Here's the doc page where this behavior, along with the technique to deal with this, is explained: https://docs.microsoft.com/en-us/sql/relational-databases/json/use-for-json-output-in-sql-server-and-in-client-apps-sql-server?view=sql-server-ver15#use-for-json-output-in-a-c-client-app.

     

    Another option is to use a Dapper extension that my colleague Jovan created: https://github.com/JocaPC/Dapper.Stream#creating-json-rest-api.

     

    Last option (or maybe "trick") is to put the JSON into a variable right on the server side, so you will not have to deal with JSON concatenation:

     

    DECLARE @j NVARCHAR(MAX) = (SELECT * FROM <Table> FOR JSON AUTO)
    SELECT @j AS JSON
  • marekott
    Copper Contributor

    Hi Davide,

    great article, thanks. However, I am not sure ExecuteScalarAsync is the best method for retrieving data in this case. Dapper can only retrieve 2,033 characters with a single query. In our case, if the dataset is large enough, the JSON will be truncated, which will lead to an exception when trying to parse it. My proposition is to use QueryAsync<string>, or something else that returns a collection which we can later join into one string before calling JsonDocument.Parse(). Let me know what you think.
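The fix both comments converge on can be sketched like this: fetch all the rows that FOR JSON produces and join them before parsing (illustrative names, not the repo's exact code):

```csharp
using System.Data;
using System.Text.Json;
using System.Threading.Tasks;
using Dapper;
using Microsoft.Data.SqlClient;

public static class JsonQuery
{
    public static async Task<JsonDocument> ExecuteAsync(
        string connectionString, string procedure, object parameters = null)
    {
        using var conn = new SqlConnection(connectionString);
        // Large FOR JSON documents are streamed back split across multiple
        // rows, so QueryAsync<string> is used to fetch every fragment.
        var chunks = await conn.QueryAsync<string>(
            sql: procedure,
            param: parameters,
            commandType: CommandType.StoredProcedure
        );

        // Rejoin the fragments into the complete JSON document before parsing.
        var json = string.Concat(chunks);
        return json.Length > 0 ? JsonDocument.Parse(json) : null;
    }
}
```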