sharp: Cannot pipe output to S3 with @aws-sdk/lib-storage.

Possible bug

Error: Premature close
    at new NodeError (node:internal/errors:372:5)
    at Sharp.onclose (node:internal/streams/end-of-stream:136:30)
    at Sharp.emit (node:events:527:28)
    at Object.<anonymous> (/usr/src/app/node_modules/sharp/lib/output.js:1144:16) {
  code: 'ERR_STREAM_PREMATURE_CLOSE'
}

Is this a possible bug in a feature of sharp, unrelated to installation?

  • Running npm install sharp completes without error.
  • Running node -e "require('sharp')" completes without error.

Are you using the latest version of sharp?

  • I am using the latest version of sharp as reported by npm view sharp dist-tags.latest.

If you are using another package which depends on a version of sharp that is not the latest, please open an issue against that package instead.

What is the output of running npx envinfo --binaries --system --npmPackages=sharp --npmGlobalPackages=sharp?

  System:
    OS: Windows 10 10.0.22000
    CPU: (12) x64 AMD Ryzen 5 2600 Six-Core Processor
    Memory: 1.19 GB / 15.95 GB
  Binaries:
    Node: 17.9.0 - C:\ProgramData\chocolatey\bin\node.EXE
    Yarn: 1.22.19 - C:\ProgramData\chocolatey\bin\yarn.CMD
    npm: 8.5.5 - C:\ProgramData\chocolatey\bin\npm.CMD
  npmPackages:
    sharp: ^0.30.7 => 0.30.7 

What are the steps to reproduce?

Use the AWS Node.js SDK v3 to stream the image output to a bucket: pipe sharp's output into the Body of an Upload instance from @aws-sdk/lib-storage.

What is the expected behaviour?

It successfully uploads to S3 without error.

Please provide a minimal, standalone code sample, without other dependencies, that demonstrates this problem

This only happens with the AWS SDK v3 and sharp together, so it's hard to replicate without both dependencies.

import sharp from 'sharp';
import fs from 'fs';
import { S3 } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';

const resizer = sharp().resize({ fit: sharp.fit.cover, height: 800, width: 800 }).avif();

const upload = new Upload({
    client: new S3({
        region: 'us-west-000',
        endpoint: 'https://s3.us-west-000.backblazeb2.com',
        credentials: {
            accessKeyId: '[REDACTED]',
            secretAccessKey: '[REDACTED]',
        },
    }),
    params: { Bucket: 'bucket-name', Key: 'test.avif', Body: fs.createReadStream('./input.png').pipe(resizer) },
});

await upload.done();

Please provide sample image(s) that help explain this problem

N/A

About this issue

  • Original URL
  • State: closed
  • Created 2 years ago
  • Reactions: 1
  • Comments: 15 (8 by maintainers)

Most upvoted comments

v0.31.1 now available

@BlakeB415 we have a similar flow in our app, and we used a PassThrough stream to work around this same issue:

import sharp from "sharp";
import { PassThrough } from "stream";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

async function doUpload(data, fileName) {
    const uploadStream = new PassThrough(); // <-------- HERE

    const imageStream = sharp()
        .resize({
          width: 1400,
          height: 1400,
          fit: "inside",
          withoutEnlargement: true,
        })
        .jpeg({
          force: true,
          mozjpeg: true,
          optimizeCoding: true,
          progressive: true,
          quality: 80,
        });

    const s3Client = new S3Client({
        credentials: {
            accessKeyId: [REDACTED],
            secretAccessKey: [REDACTED],
        },
        endpoint: [REDACTED],
        region: "auto",
    });

    const upload = new Upload({
        client: s3Client,
        queueSize: 1,
        params: {
            Bucket: [REDACTED],
            ContentType: "image/jpeg",
            Key: [REDACTED],
            Body: uploadStream, // <-------- HERE
        },
    });

    data.pipe(imageStream).pipe(uploadStream); // <-------- HERE

    const results = await upload.done();

    return results;
}

we're using a newer version of the S3 client, and in the function above data is the file stream in question
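
for context, a hypothetical call might look like the following (the input path and fileName are placeholders; the redacted Key would presumably be derived from fileName):

import fs from "fs";

const results = await doUpload(fs.createReadStream("./input.png"), "output.jpg");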

also, anecdotally, the S3 client behaves strangely with streams in general, and we've found that our two workable options are to 1) pass it the file stream directly and nothing else (sketched below), or 2) give it a PassThrough and use the PassThrough as the tail end of another pipe (as shown in the example above)
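
for completeness, option 1 looks roughly like this; a minimal sketch assuming no transform stream is needed (the client config, bucket name, key, and input path are all placeholders):

import fs from "fs";
import { S3Client } from "@aws-sdk/client-s3";
import { Upload } from "@aws-sdk/lib-storage";

// option 1: hand the file stream to Upload directly, with no intermediate pipe
const upload = new Upload({
    client: new S3Client({ region: "auto" }), // placeholder config
    params: {
        Bucket: "bucket-name", // placeholder
        Key: "input.png", // placeholder
        Body: fs.createReadStream("./input.png"),
    },
});

await upload.done();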

hope this helps