
ASP.NET Core Web API Performance - Throughput for Upload and Download

After working with the new ASP.NET Core server Kestrel and the HttpClient in a number of projects, I ran into some performance issues. More precisely, it was a throughput issue.
It took me some time to figure out whether the server or the client was responsible for the problems. The answer is: both.

Here are some hints to get more out of your web applications and Web APIs.

The code for my test server and client are on GitHub: https://github.com/PawelGerr/AspNetCorePerformance

In the following sections we will download and upload data using different schemes (HTTP/HTTPS), storage types (memory/file system) and buffer sizes, measuring the throughput.

Download data via HTTP

Nothing special, we download a 20 MB file from the server using the default FileStreamResult:

public IActionResult Download()
{
    return File(new MemoryStream(_bytes), "application/octet-stream");
}

The throughput on my machine is 140 MB/s.
For the next test we use a CustomFileResult with an increased buffer size of 64 KB and suddenly get a throughput of 200 MB/s.
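A minimal sketch of what such a result could look like (the class name CustomFileResult and the 64 KB buffer come from the measurements above; the implementation details are my assumption, not the article's original code):

```csharp
// Hypothetical sketch: streams the payload to the response with a 64 KB
// buffer instead of the 4 KB default used by FileStreamResult.
public class CustomFileResult : ActionResult
{
    private const int BufferSize = 64 * 1024;

    private readonly Stream _stream;
    private readonly string _contentType;

    public CustomFileResult(Stream stream, string contentType)
    {
        _stream = stream;
        _contentType = contentType;
    }

    public override async Task ExecuteResultAsync(ActionContext context)
    {
        var response = context.HttpContext.Response;
        response.ContentType = _contentType;

        using (_stream)
        {
            // CopyToAsync lets us control the copy buffer size directly.
            await _stream.CopyToAsync(response.Body, BufferSize);
        }
    }
}
```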

Upload multipart/form-data via HTTP

ASP.NET Core introduced a new type, IFormFile, that enables us to receive multipart/form-data without any manual work. For that, we create a new model with a property of type IFormFile and use this model as an argument of a Web API method.

public class UploadMultipartModel
{
    public IFormFile File { get; set; }
    public int SomeValue { get; set; }
}


public async Task<IActionResult> UploadMultipartUsingIFormFile(UploadMultipartModel model)
{
    var bufferSize = 32 * 1024;
    var totalBytes = await Helpers.ReadStream(model.File.OpenReadStream(), bufferSize);

    return Ok();
}


public static async Task<int> ReadStream(Stream stream, int bufferSize)
{
    var buffer = new byte[bufferSize];

    int bytesRead;
    int totalBytes = 0;

    do
    {
        bytesRead = await stream.ReadAsync(buffer, 0, bufferSize);
        totalBytes += bytesRead;
    } while (bytesRead > 0);

    return totalBytes;
}

Using IFormFile to transfer 20 MB we get a pretty bad throughput of 30 MB/s. Luckily, there is another means to read the content of a multipart/form-data request: the MultipartReader.
With the new reader we are able to improve the throughput to 350 MB/s.

public async Task<IActionResult> UploadMultipartUsingReader()
{
    var boundary = GetBoundary(Request.ContentType);
    var reader = new MultipartReader(boundary, Request.Body, 80 * 1024);

    var valuesByKey = new Dictionary<string, string>();
    MultipartSection section;

    while ((section = await reader.ReadNextSectionAsync()) != null)
    {
        var contentDispo = section.GetContentDispositionHeader();

        if (contentDispo.IsFileDisposition())
        {
            var fileSection = section.AsFileSection();
            var bufferSize = 32 * 1024;
            await Helpers.ReadStream(fileSection.FileStream, bufferSize);
        }
        else if (contentDispo.IsFormDisposition())
        {
            var formSection = section.AsFormDataSection();
            var value = await formSection.GetValueAsync();
            valuesByKey.Add(formSection.Name, value);
        }
    }

    return Ok();
}

private static string GetBoundary(string contentType)
{
    if (contentType == null)
        throw new ArgumentNullException(nameof(contentType));

    var elements = contentType.Split(' ');
    var element = elements.First(entry => entry.StartsWith("boundary="));
    var boundary = element.Substring("boundary=".Length);

    boundary = HeaderUtilities.RemoveQuotes(boundary);

    return boundary;
}

Uploading data via HTTPS

In this use case we will upload 20 MB using different storage types (memory vs. file system) and different schemes (HTTP vs. HTTPS).

The code for uploading data:

var stream = readFromFs
    ? (Stream) File.OpenRead(filePath)
    : new MemoryStream(bytes);

var bufferSize = 4 * 1024; // default

using (var content = new StreamContent(stream, bufferSize))
using (var response = await client.PostAsync("Upload", content))
{
    // ...
}

Here are the throughput numbers:

  • HTTP + Memory: 450 MB/s
  • HTTP + File System: 110 MB/s
  • HTTPS + Memory: 300 MB/s
  • HTTPS + File System: 23 MB/s

Sure, the file system is not as fast as memory, but my SSD is not so slow that it should reach just 23 MB/s. Let's increase the buffer size instead of using the default value of 4 KB.

  • HTTPS + Memory + 64 KB: 300 MB/s
  • HTTPS + File System + 64 KB: 200 MB/s
  • HTTPS + File System + 128 KB: 250 MB/s

With a bigger buffer size we get huge improvements when reading from slower storage like the file system.
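Applied to the upload snippet above, the only change is the buffer size passed to StreamContent (64 KB here):

```csharp
// Same upload as before, but with a 64 KB buffer instead of the 4 KB default.
var bufferSize = 64 * 1024; // larger buffer pays off for file-system streams

using (var stream = (Stream) File.OpenRead(filePath))
using (var content = new StreamContent(stream, bufferSize))
using (var response = await client.PostAsync("Upload", content))
{
    response.EnsureSuccessStatusCode();
}
```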

Another hint: setting the Content-Length header on the client yields better overall performance.
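Setting the header explicitly might look like this (assuming the total size is known up front; otherwise HttpClient falls back to chunked transfer encoding):

```csharp
var content = new StreamContent(stream, 64 * 1024);
// Announcing the length up front avoids chunked transfer encoding.
content.Headers.ContentLength = stream.Length;
```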


When I started to work on the performance issues, my first thought was that Kestrel was to blame because it had not had enough time to mature yet. I even tried to place IIS in front of Kestrel so that IIS was responsible for the HTTPS handling and Kestrel for the rest, but the improvements were not worth mentioning. After adding a bunch of trace logs, measuring time on the client and the server, and switching between schemes and storage types, I realized that the (mature) HttpClient was causing issues as well, and one of the major problems was the default values, like the buffer size.





Thank you a lot, it's very helpful :)


git repos seem to be missing.


@Jacob github has solved the problem


This is really nice! Another important aspect of using "MultipartReader" to upload files is that it saves memory on the server, because the content can be streamed directly to another destination, while IFormFile buffers the whole content in memory. Of course, for small files this is probably not an issue, but for a popular site it may become one. And if big files have to be uploaded, then "MultipartReader" is the way to go.


There is a good article about "Uploading large files with streaming" on MSDN: https://docs.microsoft.com/en-us/aspnet/core/mvc/models/file-uploads

Kirk Quinbar

I have created a similar setup to this, and also looking at that link you posted at docs.microsoft.com, I cannot upload any file over 16384 bytes in size. When it hits the first line with ReadNextSectionAsync, it comes back with the error "Multipart body length limit 16384 exceeded." Have you ever seen this, and do you have any suggestions to get past it? I have been digging around the internet on that exact error and tried everything others suggested (setting different options, etc.), but so far nothing has worked. I'm surprised you didn't run into this, as I assume you were able to upload rather large files to prove you could get 350 MB/s for upload.

I am using Microsoft.Aspnet.Core 2.0.1. Was that version ever tested with this code, or what version of the core nuget package was this tested with? Any thoughts?


Please note that these tests were made with .NET Core 1.x and you are using .NET Core 2.x, so some optimizations are no longer required because ASP.NET Core and Kestrel have improved a lot.
Regarding your error: 16384 bytes is the default "MultipartReader.HeadersLengthLimit". It may be that you are sending either a lot of multipart sections or the sections have a lot of headers (or they are just big).
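If that is the case, the limit can be raised on the reader itself (a sketch; HeadersLengthLimit is a settable property of Microsoft.AspNetCore.WebUtilities.MultipartReader, the value here is just an example):

```csharp
var reader = new MultipartReader(boundary, Request.Body)
{
    HeadersLengthLimit = 32 * 1024 // default is 16384 bytes
};
```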


Kirk Quinbar

Just an FYI: I found out there was a bug in the Core 2.0.1 release. Once I upgraded to 2.0.3, the error I was seeing with the 16384-byte size went away.


How to save to a file on disk?
