aws-sdk-js-v3: Cannot pass a PassThrough stream as the Body when sending an object

Describe the bug

It is not possible to pass a stream of type PassThrough as the Body.

SDK version: 3.3.0

Is the issue in the browser/Node.js/ReactNative? Node.js

Details of the browser/Node.js/ReactNative version

node -v = v15.5.1

To Reproduce (observed behavior)

This is not working:

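        // createReadStream comes from 'fs', PassThrough from 'stream', and
        // PutObjectCommand from '@aws-sdk/client-s3'; this.s3 is an S3Client instance.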
        const stream1 = createReadStream(__dirname+'/test.json');
        const stream2 = new PassThrough();
        stream1.pipe(stream2);

        type PutObjectParams = ConstructorParameters<typeof PutObjectCommand>[0];
        const params: PutObjectParams = { Bucket: this.configAws.s3.mediaBucket, Key: filePath, Body: stream2, Expires: expiresAt };
        await this.s3.send(new PutObjectCommand(params));

But this is working:

        const stream1 = createReadStream(__dirname+'/test.json');

        type PutObjectParams = ConstructorParameters<typeof PutObjectCommand>[0];
        const params: PutObjectParams = { Bucket: this.configAws.s3.mediaBucket, Key: filePath, Body: stream1, Expires: expiresAt };
        await this.s3.send(new PutObjectCommand(params));

Expected behavior: I should be able to pass a stream object such as Duplex, Transform, or PassThrough.

Screenshots: None

Additional context

/home/cojack/Projects/foo/bar/node_modules/@aws-sdk/client-s3/protocols/Aws_restXml.ts:9275
  return Promise.reject(Object.assign(new Error(message), response));
                                      ^
NotImplemented: A header you provided implies functionality that is not implemented
    at deserializeAws_restXmlPutObjectCommandError (/home/cojack/Projects/foo/bar/node_modules/@aws-sdk/client-s3/protocols/Aws_restXml.ts:9275:39)
    at processTicksAndRejections (node:internal/process/task_queues:93:5)
    at /home/cojack/Projects/foo/bar/node_modules/@aws-sdk/middleware-serde/src/deserializerMiddleware.ts:20:18
    at /home/cojack/Projects/foo/bar/node_modules/@aws-sdk/middleware-signing/src/middleware.ts:26:22
    at StandardRetryStrategy.retry (/home/cojack/Projects/foo/bar/node_modules/@aws-sdk/middleware-retry/src/defaultStrategy.ts:125:38)
    at /home/cojack/Projects/foo/bar/node_modules/@aws-sdk/middleware-logger/src/loggerMiddleware.ts:21:20
    at GeneratePackHandler.upload2s3 (/home/cojack/Projects/foo/bar/src/aws-packer/cqrs/generate-pack/generate-pack.handler.ts:90:9)
    at GeneratePackHandler.execute (/home/cojack/Projects/foo/bar/src/aws-packer/cqrs/generate-pack/generate-pack.handler.ts:44:26)
    at ContentPacksService.generatePack (/home/cojack/Projects/foo/bar/src/aws-packer/aws-packer.service.ts:22:9)
npm ERR! code 1
npm ERR! path /home/cojack/Projects/foo/bar
npm ERR! command failed
npm ERR! command sh -c ts-node -P ./tsconfig.build.json src/main.ts

npm ERR! A complete log of this run can be found in:
npm ERR!     /home/cojack/.npm/_logs/2021-01-15T16_43_59_897Z-debug.log
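
The NotImplemented error above typically means the request reached S3 without a usable Content-Length: when the SDK cannot determine a stream's length, it falls back to a transfer encoding that the PutObject API does not accept. A minimal sketch of one workaround, assuming the byte length can be determined up front (the bucket, key, and client setup below are placeholders, not from this issue):

import { createReadStream, statSync } from 'fs';
import { PassThrough } from 'stream';
import { S3Client, PutObjectCommand } from '@aws-sdk/client-s3';

async function putViaPassThrough(): Promise<void> {
    const sourcePath = __dirname + '/test.json';
    const stream1 = createReadStream(sourcePath);
    const stream2 = new PassThrough();
    stream1.pipe(stream2);

    const s3 = new S3Client({});
    await s3.send(new PutObjectCommand({
        Bucket: 'my-bucket', // placeholder
        Key: 'test.json',    // placeholder
        Body: stream2,
        // Assumption: an explicit byte length lets the request carry a
        // Content-Length header, which S3 PutObject accepts.
        ContentLength: statSync(sourcePath).size,
    }));
}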

About this issue

  • Original URL
  • State: closed
  • Created 3 years ago
  • Reactions: 14
  • Comments: 19 (2 by maintainers)

Most upvoted comments

You can pass a stream using the @aws-sdk/lib-storage package.

import { S3Client } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';

const upload = new Upload({
    client: new S3Client({}),
    params: {
        Bucket: bucket,
        Key: 'test.txt',
        Body: stream,
        ContentType: 'text/plain',
    },
});

await upload.done();
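
For the original PassThrough scenario, a minimal sketch of the same approach (bucket, key, and local file path are placeholders; done() returns a promise, so it should be awaited):

import { createReadStream } from 'fs';
import { PassThrough } from 'stream';
import { S3Client } from '@aws-sdk/client-s3';
import { Upload } from '@aws-sdk/lib-storage';

async function uploadViaPassThrough(): Promise<void> {
    const pass = new PassThrough();
    createReadStream(__dirname + '/test.json').pipe(pass);

    const upload = new Upload({
        client: new S3Client({}),
        params: { Bucket: 'my-bucket', Key: 'test.json', Body: pass },
    });

    // lib-storage reads the stream in parts and uploads them, so the total
    // content length does not need to be known in advance.
    await upload.done();
}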

@ajredniwja I don’t think this issue should be closed, because this behavior is not described anywhere in the docs, and the SDK allows passing the Readable interface, which PassThrough implements.

Any new development updates on this?

In my case, I cannot use @monken’s approach since I don’t know the content type; it is dynamic at runtime. There is also no way to know the content size from a stream (that is the whole purpose of streams).

Please fix this; it works in SDK v2, but not in v3.

@ajredniwja

I think adding the bug label again would be appropriate. The solution from monken might work, but in my opinion it is just a workaround that does not solve the initial problem.

In v3, GetObjectCommand now returns a Readable stream for the Body instead of a Buffer (as in v2).

const obj = await s3.send(new GetObjectCommand({ Bucket: 'test', Key: 'test' }));

for await (const chunk of obj.Body) {
  console.log(chunk);
}

// or

obj.Body.pipe(process.stdout);
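
If the whole object is needed in memory, a small helper (a sketch, not an SDK API) can collect the Node stream into a Buffer:

import { Readable } from 'stream';

// Hypothetical helper: buffer a Node.js Readable fully in memory.
async function streamToBuffer(stream: Readable): Promise<Buffer> {
    const chunks: Buffer[] = [];
    for await (const chunk of stream) {
        chunks.push(Buffer.isBuffer(chunk) ? chunk : Buffer.from(chunk));
    }
    return Buffer.concat(chunks);
}

// Usage, assuming obj.Body is a Node Readable in a Node.js runtime:
// const text = (await streamToBuffer(obj.Body as Readable)).toString('utf-8');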

I think there may be confusion here between ReadableStream (the WHATWG Streams API) and NodeJS.ReadableStream. These two, despite having the same name, are not interface-compatible.

Oddly, the PutObjectRequest.Body typing says it’s Readable | ReadableStream | Blob. Readable is Node’s stream class, which implements NodeJS.ReadableStream, while the second type, ReadableStream, without any qualifier resolves (in TypeScript) to the WHATWG API. I don’t know if this is a mistake in client-s3/dist-types and it’s intended to receive a NodeJS.ReadableStream, but it is not marked that way.
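
One way to work with these Body unions on the consumer side, sketched under the assumption that the code runs in Node.js (where the returned body is a Node stream), is to narrow to Node's Readable before using stream APIs:

import { Readable } from 'stream';
import { S3Client, GetObjectCommand } from '@aws-sdk/client-s3';

async function printObject(bucket: string, key: string): Promise<void> {
    const s3 = new S3Client({});
    const obj = await s3.send(new GetObjectCommand({ Bucket: bucket, Key: key }));

    // Narrow the Readable | ReadableStream | Blob union to a Node Readable.
    if (!(obj.Body instanceof Readable)) {
        throw new Error('Expected a Node.js Readable body');
    }
    obj.Body.pipe(process.stdout);
}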

Thank you @monken, I appreciate the quick response.

When checking the type of Body in my code, TypeScript kept telling me it was of type any. Looking at the type declaration of GetObjectOutput, I can see that it is well defined. However, it looks like the types ReadableStream and Blob aren’t defined. Is this some issue with my local environment? I have @types/node installed and can access NodeJS.ReadableStream.

(Screenshot: “Screen Shot 2021-02-18 at 8.32.51 am”)

When I add "WebWorker" to lib in my tsconfig.json I get types, but it looks like the WebWorker ReadableStream is not equivalent to NodeJS.ReadableStream.

This is a known issue: #1891.