reverse-proxy: Poor performance and errors when proxying to https

Describe the bug

I’ve been running some benchmarks on my local machine for my API gateway, which currently uses Ocelot but which I’m in the process of migrating to YARP. To perform the benchmarks I’ve been using Crank from the .NET team, which is awesome by the way.

What I’ve seen so far is that if the downstream service I’m proxying to is using HTTP, YARP is much faster than Ocelot. For example, here are the results for my current gateway using Ocelot:

First Request (ms):   850
Requests:             18,068
Bad responses:        0
Mean latency (us):    214,006
Max latency (us):     499,452
Requests/sec:         1,181
Requests/sec (max):   7,272

While these are the results for a similar scenario, but using YARP:

First Request (ms):   1,205
Requests:             48,984
Bad responses:        0
Mean latency (us):    78,480
Max latency (us):     2,128,675
Requests/sec:         3,258
Requests/sec (max):   13,420

Those are not too shabby, although the max latency for YARP is interesting. Do note that my Ocelot-based gateway is running on .NET Core 3.1, while my YARP gateway is running on .NET 5 Preview 7.

However, when I run the same benchmark with the gateway proxying to the downstream service over HTTPS, I see very different results:

Ocelot:

First Request (ms):   1,088
Requests:             12,819
Bad responses:        0
Mean latency (us):    301,467
Max latency (us):     3,571,386
Requests/sec:         839
Requests/sec (max):   7,944

YARP:

First Request (ms):   1,631
Requests:             527
Bad responses:        515
Mean latency (us):    9,021,413
Max latency (us):     10,132,335
Requests/sec:         18
Requests/sec (max):   6,587

Obviously the number of bad responses caught my attention, so I looked at the logs to see if I could figure out what’s going wrong. I’m seeing a lot of these exceptions:

[04:31:02.965] System.OperationCanceledException: The operation was canceled.
[04:31:02.965]    at System.Threading.CancellationToken.ThrowOperationCanceledException()
[04:31:02.965]    at System.Net.Sockets.Socket.AwaitableSocketAsyncEventArgs.ThrowException(SocketError error, CancellationToken cancellationToken)
[04:31:02.965]    at System.Net.Sockets.Socket.AwaitableSocketAsyncEventArgs.GetResult(Int16 token)
[04:31:02.965]    at System.Net.Security.SslStream.<FillHandshakeBufferAsync>g__InternalFillHandshakeBufferAsync|180_0[TIOAdapter](TIOAdapter adap, ValueTask`1 task, Int32 minSize)
[04:31:02.965]    at System.Net.Security.SslStream.ReceiveBlobAsync[TIOAdapter](TIOAdapter adapter)
[04:31:02.965]    at System.Net.Security.SslStream.ForceAuthenticationAsync[TIOAdapter](TIOAdapter adapter, Boolean receiveFirst, Byte[] reAuthenticationData, Boolean isApm)
[04:31:02.965]    at System.Net.Http.ConnectHelper.EstablishSslConnectionAsyncCore(Stream stream, SslClientAuthenticationOptions sslOptions, CancellationToken cancellationToken)
[04:31:02.965]    at System.Net.Http.HttpConnectionPool.ConnectAsync(HttpRequestMessage request, Boolean allowHttp2, CancellationToken cancellationToken)
[04:31:02.965]    at System.Net.Http.HttpConnectionPool.CreateHttp11ConnectionAsync(HttpRequestMessage request, CancellationToken cancellationToken)
[04:31:02.965]    at System.Net.Http.HttpConnectionPool.GetHttpConnectionAsync(HttpRequestMessage request, CancellationToken cancellationToken)
[04:31:02.965]    at System.Net.Http.HttpConnectionPool.SendWithRetryAsync(HttpRequestMessage request, Boolean doRequestAuth, CancellationToken cancellationToken)
[04:31:02.965]    at Microsoft.ReverseProxy.Service.Proxy.HttpProxy.NormalProxyAsync(HttpContext context, HttpRequestMessage upstreamRequest, Transforms transforms, HttpMessageInvoker httpClient, ProxyTelemetryContext proxyTelemetryContext, CancellationToken shortCancellation, CancellationToken longCancellation)
[04:31:02.965]    at Microsoft.ReverseProxy.Telemetry.TextOperationLogger`1.ExecuteAsync(String operationName, Func`1 action)
[04:31:02.965]    at Microsoft.ReverseProxy.Middleware.ProxyInvokerMiddleware.Invoke(HttpContext context)
[04:31:02.965]    at Microsoft.AspNetCore.Routing.EndpointMiddleware.<Invoke>g__AwaitRequestTask|6_0(Endpoint endpoint, Task requestTask, ILogger logger)
[04:31:02.965]    at Microsoft.AspNetCore.Authorization.AuthorizationMiddleware.Invoke(HttpContext context)
[04:31:02.965]    at Microsoft.AspNetCore.Diagnostics.DeveloperExceptionPageMiddleware.Invoke(HttpContext context)

That looks like it is somehow unable to complete the SSL handshake with the downstream service, but my Ocelot-based gateway doesn’t have that problem, and both gateways are going to the exact same service. That service is running on ASP.NET Core 3.1, by the way.
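
For anyone who wants to isolate this, a single TLS handshake against the downstream endpoint can be timed outside of YARP with a small console app along these lines. This is purely a diagnostic sketch: the host and port come from the cluster configuration below, and everything else (the class name, the callback that accepts the local dev certificate) is an assumption. If a lone handshake completes quickly, the failures are more likely related to the number of concurrent handshakes than to SslStream or the certificate itself.

// Diagnostic sketch (not part of the gateway): time one TLS handshake against
// the downstream service to separate certificate/SslStream problems from
// load-related ones. Host and port are taken from the cluster config below;
// the rest is assumed.
using System;
using System.Diagnostics;
using System.Net.Security;
using System.Net.Sockets;
using System.Threading;
using System.Threading.Tasks;

class HandshakeProbe
{
    static async Task Main()
    {
        var sw = Stopwatch.StartNew();

        using var tcp = new TcpClient();
        await tcp.ConnectAsync("localhost", 5003);

        using var ssl = new SslStream(tcp.GetStream());
        await ssl.AuthenticateAsClientAsync(new SslClientAuthenticationOptions
        {
            TargetHost = "localhost",
            // Accept the local dev certificate so validation errors don't mask
            // handshake latency (local testing only).
            RemoteCertificateValidationCallback = (sender, cert, chain, errors) => true
        }, CancellationToken.None);

        Console.WriteLine($"Handshake completed in {sw.ElapsedMilliseconds} ms ({ssl.SslProtocol})");
    }
}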

To Reproduce

So far I’ve followed the basic guide for setting up a reverse proxy with YARP. I’m using the following configuration:

"ReverseProxy": {
  "Routes": [
    {
        "RouteId": "realization",
        "ClusterId": "EchoService",
        "Match": {
          "Methods": [ "GET", "POST" ],
          "Hosts": [ "localhost" ],
          "Path": "/realization/{*remainder}"
        },
        "Transforms": [
            { "PathSet": "/api/values" }
        ]
     }
  ],
  "Clusters": {
    {
      "EchoService": {
        "LoadBalancing": {
          "Mode": "Random"
        },
        "Destinations": {
          "backend_destination1": {
            "Address": "https://localhost:5003/"
          }
        }
   }
}
  }
}

Changing backend_destination1’s Address to http://localhost:5002/ fixes the issue.
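
For completeness, the downstream echo service is a stock ASP.NET Core 3.1 app. Its host setup isn’t part of this report, but it would look roughly like the sketch below; only the two port numbers come from the configuration above and the workaround just mentioned, everything else (including the Startup class) is assumed.

// Hypothetical host setup for the downstream echo service. Only the ports
// 5002/5003 are taken from this report; Startup and the rest are assumed.
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Hosting;

public class Program
{
    public static void Main(string[] args) =>
        Host.CreateDefaultBuilder(args)
            .ConfigureWebHostDefaults(webBuilder =>
                webBuilder
                    .UseStartup<Startup>()
                    // Plain HTTP endpoint (works through the proxy) and the
                    // HTTPS endpoint (the one that produces the handshake failures).
                    .UseUrls("http://localhost:5002", "https://localhost:5003"))
            .Build()
            .Run();
}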

Further technical details

  • Package version: 1.0.0-preview.3
  • Platform (Linux/macOS/Windows): macOS

About this issue

  • Original URL
  • State: closed
  • Created 4 years ago
  • Comments: 28 (27 by maintainers)

Most upvoted comments

I was using the default out-of-the-box Bombardier job, which seems to use 256 concurrent connections. I’ll try lowering that to see what happens. Is there some way with YARP to control which protocol is used for the downstream request?
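
For reference, the stack trace above shows that YARP’s outbound side sits on HttpMessageInvoker and SocketsHttpHandler, so at that level the relevant knobs would look something like the sketch below: MaxConnectionsPerServer caps how many connections (and therefore concurrent TLS handshakes) go to the downstream host, and HttpRequestMessage.Version pins the protocol version. Whether and how 1.0.0-preview.3 exposes equivalents through the cluster configuration is exactly the open question; the values below are assumptions for illustration only.

// Sketch of the underlying SocketsHttpHandler/HttpRequestMessage knobs, not a
// YARP-specific API. The values (32 connections, HTTP/1.1) are illustrative only.
using System;
using System.Net;
using System.Net.Http;
using System.Threading;
using System.Threading.Tasks;

class DownstreamClientSketch
{
    static async Task Main()
    {
        var handler = new SocketsHttpHandler
        {
            // Cap concurrent connections, and with them concurrent TLS handshakes,
            // to the downstream service.
            MaxConnectionsPerServer = 32,
            // Keep idle connections pooled so handshakes are reused.
            PooledConnectionIdleTimeout = TimeSpan.FromMinutes(2)
        };

        using var client = new HttpMessageInvoker(handler);

        var request = new HttpRequestMessage(HttpMethod.Get, "https://localhost:5003/api/values")
        {
            // Pin the downstream protocol version; use HttpVersion.Version20 for HTTP/2.
            Version = HttpVersion.Version11
        };

        HttpResponseMessage response = await client.SendAsync(request, CancellationToken.None);
        Console.WriteLine((int)response.StatusCode);
    }
}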