deno: Slow upload speed for http server

I tested upload speed for various file sizes (10M, 100M, 1000M) and compared it with Node.js. Upload speed in Deno is about 2x-3x slower, and on a 1G file Deno is about 10x slower and uses almost all CPU resources (my Ubuntu 20.04 VPS has 2 CPU cores and 2 GB RAM). What am I doing wrong, or how can I improve upload speed?

I send test data to the server (localhost) via curl:

time curl -H "Content-Type:application/octet-stream" --data-binary @/dir/dev/foo1000M.bin  http://127.0.0.1:8080/upload/foo.bin

The test files for uploading were created with:

truncate -s 10M foo10M.bin

Deno server code:
import { serve } from "https://deno.land/std@0.125.0/http/server.ts";
import { config } from "./modules/config.js";
import * as path from "https://deno.land/std@0.125.0/path/mod.ts";
import { writableStreamFromWriter } from "https://deno.land/std@0.125.0/streams/mod.ts";

const PORT = 8080;

async function handler(req) {
    // Request has no pathname/searchParams of its own; parse them from req.url.
    const url = new URL(req.url);
    console.log(`>>>new request: ${req.method}, ${req.url}, path: ${url.pathname}, params: ${url.searchParams}`);
    console.log("HEADERS:", req.headers);

    if (req.body) {
        console.log("==>hasBody!");
        const file = await Deno.open(path.join(config.upload_dir, "native_deno.bin"), { create: true, write: true });
        // writableStreamFromWriter closes the file when the stream closes.
        const writableStream = writableStreamFromWriter(file);
        await req.body.pipeTo(writableStream);
    }
    console.log(">>>upload complete!");
    return new Response("upload complete!");
}

serve(handler, { port: PORT });
console.log(">>>server listening on port", PORT);

Results:

for 10M file:    Deno: 60-90ms      Node: 35-40ms
for 100M file:   Deno: 500-600ms    Node: 260ms
for 1000M file:  Deno: 20s-5m       Node: 1000-1700ms

I repeated the upload test multiple times. With the 1G file the Deno server becomes slower and slower on every run (with the 10M and 100M test files I don't see any performance degradation), while the Node server's upload time stays the same every time.

About this issue

  • State: open
  • Created 2 years ago
  • Comments: 43 (12 by maintainers)

Most upvoted comments

We are actively working on improving the performance of the HTTP server.

On Windows, the gap between Deno and Node is getting closer.
Results for uploading a 1 GB file:

Node ~ 3.5 sec, Deno ~ 3.8 sec

deno 1.27.0 (release, x86_64-pc-windows-msvc)
v8 10.8.168.4
typescript 4.8.3

I’m working on this, I’ll try to have a PR by tomorrow.

ReadableStreams are quite slow compared to the old Reader/Writer interface because the buffer is not reused: allocating a new Uint8Array for every chunk is one of the reasons for the slow performance on big uploads.

https://github.com/denoland/deno/blob/d7b27ed63bf74c19b0bc961a52665960c199c53a/ext/http/01_http.js#L383-L390

By reusing the buffer (not using a ReadableStream), uploading a 2 GB file takes around 3 seconds; the same upload through a ReadableStream takes 4.8 seconds on my computer.
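
As a rough illustration of the two read patterns from user code (a sketch, not the internal code linked above; the file name is just an example):

// Reader/Writer style: one buffer, allocated once and reused for every read.
const src = await Deno.open("foo1000M.bin");
const buf = new Uint8Array(64 * 1024);
let n;
while ((n = await src.read(buf)) !== null) {
  // use buf.subarray(0, n) -- the same Uint8Array is reused on each iteration
}
src.close();

// ReadableStream style: every chunk is a freshly allocated Uint8Array,
// which adds allocation and GC pressure on multi-gigabyte transfers.
const src2 = await Deno.open("foo1000M.bin");
let total = 0;
for await (const chunk of src2.readable) {
  total += chunk.length; // `chunk` is a new Uint8Array on each iteration
}
// fully consuming the stream closes the file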

I love having spec-compliant Response and Request, but when dealing with a lot of data it is nice to have some lower-level APIs. I think we should expose lower-level APIs for users who really need that speed.

The same bottleneck happens in fetch when downloading large files.
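
For example (placeholder URL and output file name), streaming a large download to disk goes through the same path:

import { writableStreamFromWriter } from "https://deno.land/std@0.125.0/streams/mod.ts";

const res = await fetch("https://example.com/foo1000M.bin"); // placeholder URL
const file = await Deno.open("download.bin", { create: true, write: true });
// res.body is a ReadableStream, so each chunk is a freshly allocated Uint8Array here too.
if (res.body) {
  await res.body.pipeTo(writableStreamFromWriter(file)); // closes the file when the pipe finishes
}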

Same slow results.

deno 1.25.1 (release, x86_64-unknown-linux-gnu)
v8 10.6.194.5
typescript 4.7.4

I tested the same server but with the new Deno.serve(); the result is the same: about 2x slower than Node (see the sketch below).

deno 1.25.0 (release, x86_64-unknown-linux-gnu)
v8 10.6.194.5
typescript 4.7.4
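
For reference, a minimal sketch of that variant, written against the current Deno.serve signature (options first, then the handler; on 1.25 it was still behind the --unstable flag), with the output file name as a placeholder:

// Same upload handler, served via Deno.serve.
Deno.serve({ port: 8080 }, async (req) => {
  if (req.body) {
    const file = await Deno.open("native_deno.bin", { create: true, write: true });
    await req.body.pipeTo(file.writable); // file.writable closes the file when the stream closes
  }
  return new Response("upload complete!");
});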

Tested again on newer Deno and std versions:

deno 1.19.2
std@0.128.0

Deno is still about 2x slower than Node (2.3 sec for Node vs 5.5 sec for Deno).

@kitsonk Maybe - I have some stuff to do on Monday, but I’ll try to investigate this on Tuesday.