stream response from nodejs request to s3
How do you use request to download contents of a file and directly stream it up to s3 using the aws-sdk for node?

The code below gives me Object #<Request> has no method 'read' which makes it seem like request does not return a readable stream...

var AWS = require('aws-sdk');
var req = require('request');
var s3 = new AWS.S3({params: {Bucket: myBucket, Key: s3Key}});
var imageStream = req.get(url)
    .on('response', function (response) {
      if (response.statusCode === 200) {
        //imageStream should be read()able by now right?
        s3.upload({Body: imageStream, ACL: 'public-read', CacheControl: 'max-age=5184000'}, function (err, data) { // 2 months
          console.log(err, data);
        });
      }
    });

Per the aws-sdk docs, Body needs to be a ReadableStream object.

What am I doing wrong here?

This can be pulled off using the s3-upload-stream module; however, I'd prefer to limit my dependencies.

Kristykristyn answered 17/6, 2015 at 21:38 Comment(2)
Where/how is req defined?Jacobsohn
@Jacobsohn question updated to reflect reqKristykristyn

You want to use the response object if you're manually listening for the response stream:

var AWS = require('aws-sdk');
var req = require('request');
var s3 = new AWS.S3({params: {Bucket: myBucket, Key: s3Key}});
req.get(url)
    .on('response', function (response) {
      if (response.statusCode === 200) {
        s3.upload({Body: response, ACL: 'public-read', CacheControl: 'max-age=5184000'}, function (err, data) { // 2 months
          console.log(err, data);
        });
      }
    });
Jacobsohn answered 18/6, 2015 at 18:16 Comment(6)
thanks! For others' reference, request's documented way of getting a stream is a bit misleading: github.com/request/request/issues/931Kristykristyn
I'm having this same problem -- this answer is helpful but the file on s3 ends up being zero bytes. Piping the same request to disk results in a valid file.Mckie
@JoshSantangelo if you're still having that problem, maybe have a look at my alternate solution.Chirr
@JoshSantangelo Probably too late, but it might be that you are using the wrong encoding. Request assumes text data if you do not provide encoding, e.g. image data will be corrupted. Use req.get({ url: url, encoding: null }). Zero bytes seem weird though.Meng
Any ideas how to specify the name the file is stored as?Avoidance
@Chet, I think that's the s3Key you enter in the params objectLeandraleandre

Since I had the same problem as @JoshSantangelo (zero-byte files on S3) with [email protected] and [email protected], let me add an alternative solution using Node's own http module (caveat: simplified code from a real-life project, not tested separately):

var http = require('http');

// Assumes `s3` is an AWS.S3 instance configured elsewhere,
// e.g. var s3 = new AWS.S3({params: {Bucket: myBucket}});
function copyToS3(url, key, callback) {
    http.get(url, function onResponse(res) {
        if (res.statusCode >= 300) {
            return callback(new Error('error ' + res.statusCode + ' retrieving ' + url));
        }
        s3.upload({Key: key, Body: res}, callback);
    })
    .on('error', function onError(err) {
        return callback(err);
    });
}

As far as I can tell, the problem is that request does not fully support the current Node streams API, while aws-sdk depends on it.

Chirr answered 8/8, 2015 at 11:44 Comment(3)
This was the only solution that worked for me. Thanks.Trapeze
This worked -- but I had to use https instead of httpAvoidance
A simple switch for https: var http = require('http'); var https = require('https'); if (url.toString().indexOf("https") === 0){ http = https; }Ealasaid

As Request has been deprecated, here's a solution using Axios:

const AWS = require('aws-sdk');
const axios = require('axios');

const downloadAndUpload = async function (url, fileName) {
  const res = await axios({ url, method: 'GET', responseType: 'stream' });
  const s3 = new AWS.S3(); // assumes AWS credentials in env vars or AWS config file
  const params = {
    Bucket: IMAGE_BUCKET, // assumes IMAGE_BUCKET is defined elsewhere
    Key: fileName,
    Body: res.data,
    ContentType: res.headers['content-type'],
  };
  return s3.upload(params).promise();
};

Note that the current version of the AWS SDK doesn't throw an exception if the AWS credentials are wrong or missing; the promise simply never resolves.
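Since a promise that never settles is hard to debug, one defensive sketch (my addition; `withTimeout` is a hypothetical helper, not part of the SDK) is to race the upload against a timer so missing credentials surface as an error:

```javascript
// Race a promise against a timer so a silently-hanging operation
// (e.g. s3.upload with missing credentials) fails fast instead.
function withTimeout(promise, ms) {
  let timer;
  const timeout = new Promise((resolve, reject) => {
    timer = setTimeout(() => reject(new Error('timed out after ' + ms + 'ms')), ms);
  });
  return Promise.race([promise, timeout]).finally(() => clearTimeout(timer));
}

// Usage sketch with the answer's function:
// withTimeout(downloadAndUpload(url, fileName), 30000)
//   .catch((err) => console.error('upload failed or hung:', err));
```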

Chattel answered 21/7, 2020 at 9:54 Comment(3)
I am trying something very similar to this but this is running inside a test file (jest). So code block is enclosed within a await expect(new Promise((resolve, reject) => {. I am enclosing your code inside an async function like so const resp = async (url) => { and then calling resp. The issue is that I end up getting a time-out. Do you have any suggestions?Tonga
I have noticed that the AWS sdk doesn't throw an error if the credentials are missing or don't work - it just never responds, so maybe check that first.Chattel
+1 This was the issue for me as well. I wish I had known this before banging my head for days and trying different ways to fix this. Thanks @ChattelTonga
