How to upload a file from the browser to Amazon S3 with node.js, Express, and knox? [closed]

I'm trying to find some example code that utilizes node.js, Express, and knox.

The docs for Knox only give clear examples of how to upload a file already stored in the file system. https://github.com/learnboost/knox#readme

Additionally, there are a number of simple tutorials (even in Express itself) on how to upload files to Express and save them to the file system.

What I'm having trouble finding is an example that accepts an upload from the browser on a node server and streams the data directly to S3, rather than storing it in the local file system first.

Can someone point me to a gist or other example that contains this kind of information?

Vair answered 21/7, 2011 at 9:47 Comment(3)
Andrew Barber, how is it not clear what they are asking? Most of the people who answered the question seemed to have no trouble understanding it.Exfoliate
Agreed. Wtf. Why close a question that is 2 years old and has been updated with valuable information? Also, by closing this you ruin the SEO value it has earned.Vair
For me the question is crystal clear; if you don't get the point of the topic, it probably just means you shouldn't be answering it, as you're not deep enough in that topic.Wiesbaden

All of the previous answers involve having the upload pass through your node.js server, which is inefficient and unnecessary. Your node server does not have to handle the bandwidth or processing of uploaded files at all, because Amazon S3 allows uploads directly from the browser.

Have a look at this blog post: http://blog.tcs.de/post-file-to-s3-using-node/

I have not tried the code listed there, but having looked it over, it appears solid. I will be attempting an implementation of it shortly and will update this answer with my findings.

Polo answered 18/8, 2012 at 18:17 Comment(8)
"All of the previous answers involve having the upload pass through your node.js server which is inefficient and unnecessary." But at the time of my answer (2011) your solution wasn't available. I agree, though: now it's the best way.Homocyclic
I'm using an S3-"compatible" service (vBlob from Cloud Foundry) which doesn't accept method="post", only method="put". Any ideas?Puree
Isn't it unsafe having Amazon credentials on the client side?Protoxylem
Your Amazon/S3 credentials are not on the client side; they are kept server side (which is where the node endpoint comes in), and are encoded and served to the client before the upload request to S3 is made.Burhans
What if an attacker takes the Amazon access key from the client code and uses up your storage?Hallel
@Hallel The AWS access key (id) is essentially a "public" key; it is meant to be used in such cases. In order for someone to make use of the access key they would also need the secret key, which is never sent out to the public. The form in this example sends the AWS access key and a signature which is calculated from the secret key on the backend and delivered to the client.Polo
@Polo I create the s3Credentials on the server using the secret key and pass the access key to the browser code. What if I am an adversary and do the following: I intercept your client-to-S3 request; I change the S3 policy, since it is available in plain text; I create a secret key and access key of my own; I create the s3Credentials again with the new data. Since AWS doesn't store those two values (awsSecretKey and awsAccessKey), it has to rely on whatever the client (now an adversary) is sending. Now I can upload a file of any size. Am I correct?Teasley
@Teasley Nope. Any change to any parameter of the request will invalidate the signature. When all else fails, read the documentation: docs.aws.amazon.com/AmazonS3/latest/API/…Polo

Here is an example of streaming directly to S3 without ever touching your hard drive, using multiparty and knox:

var http = require('http')
  , util = require('util')
  , multiparty = require('multiparty')
  , knox = require('knox')
  , Batch = require('batch')
  , PORT = process.env.PORT || 27372

var s3Client = knox.createClient({
  secure: false,
  key: process.env.S3_KEY,
  secret: process.env.S3_SECRET,
  bucket: process.env.S3_BUCKET,
});

var Writable = require('readable-stream').Writable;
util.inherits(ByteCounter, Writable);
function ByteCounter(options) {
  Writable.call(this, options);
  this.bytes = 0;
}

ByteCounter.prototype._write = function(chunk, encoding, cb) {
  this.bytes += chunk.length;
  cb();
};

var server = http.createServer(function(req, res) {
  if (req.url === '/') {
    res.writeHead(200, {'content-type': 'text/html'});
    res.end(
      '<form action="/upload" enctype="multipart/form-data" method="post">'+
      '<input type="text" name="path"><br>'+
      '<input type="file" name="upload"><br>'+
      '<input type="submit" value="Upload">'+
      '</form>'
    );
  } else if (req.url === '/upload') {
    var headers = {
      'x-amz-acl': 'public-read',
    };
    var form = new multiparty.Form();
    var batch = new Batch();
    batch.push(function(cb) {
      form.on('field', function(name, value) {
        if (name === 'path') {
          var destPath = value;
          if (destPath[0] !== '/') destPath = '/' + destPath;
          cb(null, destPath);
        }
      });
    });
    batch.push(function(cb) {
      form.on('part', function(part) {
        if (! part.filename) return;
        cb(null, part);
      });
    });
    batch.end(function(err, results) {
      if (err) throw err;
      form.removeListener('close', onEnd);
      var destPath = results[0]
        , part = results[1];

      var counter = new ByteCounter();
      part.pipe(counter); // need this until knox upgrades to streams2
      headers['Content-Length'] = part.byteCount;
      s3Client.putStream(part, destPath, headers, function(err, s3Response) {
        if (err) throw err;
        res.statusCode = s3Response.statusCode;
        s3Response.pipe(res);
        console.log("https://s3.amazonaws.com/" + process.env.S3_BUCKET + destPath);
      });
      part.on('end', function() {
        console.log("part end");
        console.log("size", counter.bytes);
      });
    });
    form.on('close', onEnd);
    form.parse(req);

  } else {
    res.writeHead(404, {'content-type': 'text/plain'});
    res.end('404');
  }

  function onEnd() {
    throw new Error("no uploaded file");
  }
});
server.listen(PORT, function() {
  console.info('listening on http://0.0.0.0:'+PORT+'/');
});

example taken from https://github.com/superjoe30/node-multiparty/blob/master/examples/s3.js

Academia answered 5/4, 2013 at 9:54 Comment(2)
Thanks for this, would love your help on my question here: https://mcmap.net/q/661055/-streaming-file-to-s3-quot-error-stream-ended-unexpectedly-quot/971592Miscarry
@Academia In the above-mentioned answer, how should I stop the file upload before it completes? Example: the file is currently uploading and I realise it is too big, so I want to stop the upload by sending an appropriate message. My node server hangs, as I am not sending the correct response back when I want the upload to stop. #33453452Teasley

The node/express code doesn't work with node.js v0.4.7.

Here is the updated code for node.js v0.4.7:

app.post('/upload', function (req, res) {
  // connect-form additions
  req.form.complete(function (err, fields, files) {
    // here lies your uploaded file:
    var path = files['upload-file']['path'];
    // do knox stuff here
  });
});
Nairobi answered 29/7, 2011 at 10:32 Comment(0)

* update *

As of late 2012, Amazon S3 supports CORS, so routing the upload through your node.js server isn't needed anymore; you can upload the file directly to S3.
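
Note that for the browser to POST or PUT to the bucket directly, the bucket needs a CORS configuration permitting your origin. A minimal example (the origin below is a placeholder, and you should tighten the rules for production):

```xml
<CORSConfiguration>
  <CORSRule>
    <AllowedOrigin>https://example.com</AllowedOrigin>
    <AllowedMethod>POST</AllowedMethod>
    <AllowedMethod>PUT</AllowedMethod>
    <AllowedHeader>*</AllowedHeader>
    <MaxAgeSeconds>3000</MaxAgeSeconds>
  </CORSRule>
</CORSConfiguration>
```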


With the help of the "connect-form" module you can upload the file to your server (through a normal multipart form) and then handle the S3 part afterwards:

<form action="/upload" method="POST" id="addContentForm" enctype="multipart/form-data">
  <p><label for="media">File<br/><input type="file" name="media" /></label></p>
  <p><button type="submit">upload</button></p>
</form>

node/express code:

app.post('/upload', function (req, res) {
  // connect-form additions
  req.form.complete(function (err, fields, files) {
    // here lies your uploaded file:
    var path = files['media']['path'];
    // do knox stuff here
  });
});

You have to add the following line to the app configuration:

app.configure(function(){
  // rest of the config stuff ...
  app.use(form({ keepExtensions: true }));
  // ...
});
Homocyclic answered 21/7, 2011 at 12:8 Comment(2)
Interesting! What are the security precautions we have to put into place when considering client side uploading directly to S3? I imagine we wouldn't want to give anyone and everyone that ability. Thanks :)Octahedron
talentedmrjones has given a solution which is prone to a man-in-the-middle attack. There is one more solution, given here: terlici.com/2015/05/23/uploading-files-s3.html, which is correct. But it has limitations, like you cannot limit the file size, etc.Teasley

The connect-stream-s3 library can upload all of your form's files to S3 as part of middleware, so you don't have to do any logic yourself. It needs express.bodyParser() to work at the moment, but I'm working on a version that will stream files directly to Amazon S3 without first writing them to disk.

Please let me know how you get on. Hopefully it's a lot less hassle than doing it yourself once you're in your page handler. :)

Tong answered 5/5, 2012 at 4:43 Comment(0)

I made this to upload directly from the jQuery File Upload plugin to S3, with the file being public; it should point you in the right direction.

https://gist.github.com/3995819

Pasahow answered 1/11, 2012 at 19:22 Comment(0)
