Uploading a buffer to Google Cloud Storage
I'm trying to save a Buffer (from a file uploaded via a form) to Google Cloud Storage, but it seems like the Google Node SDK only allows files with a given path to be uploaded (read/write streams).

This is what I have used for AWS (S3); is there anything similar in the Google Node SDK?

var fileContents = new Buffer('buffer');

var params = {
  Bucket: bucketName,    // bucket name
  Key: fileName,         // file name
  ContentType: mimeType, // MIME type
  Body: fileContents
};

s3.putObject(params, function (err, data) {
  // Do something
});

The only way I have found to do it so far is to write the buffer to disk, upload the file using the SDK (specifying the path to the new file), and then delete the file once it has been uploaded successfully. The downside is that the whole process is significantly slower, to the point where using Google Cloud Storage seems unfeasible. Is there any workaround / way to upload a buffer directly?

Tarbes answered 10/4, 2016 at 20:25 Comment(0)
We have an issue about supporting this more easily: https://github.com/GoogleCloudPlatform/gcloud-node/issues/1179

But for now, you can try:

file.createWriteStream()
  .on('error', function (err) { /* handle upload error */ })
  .on('finish', function () { /* upload complete */ })
  .end(fileContents);
Hent answered 10/4, 2016 at 21:52 Comment(1)
As mentioned in other answers, the cloud storage package now supports .save, which does exactly what you're asking: googleapis.dev/nodejs/storage/latest/File.html#save – Nosey
.save to save the day! Below is some code where I save the "pdf" that I created.

https://googleapis.dev/nodejs/storage/latest/File.html#save

const { Storage } = require("@google-cloud/storage");
const path = require("path");

const gc = new Storage({
  keyFilename: path.join(__dirname, "./path to your service account .json"),
  projectId: "your project id",
});

const file = gc.bucket(bucketName).file("tester.pdf");
file.save(pdf, (err) => {
  if (!err) {
    console.log("cool");
  } else {
    console.log("error " + err);
  }
});
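Note that file.save also returns a promise when called without a callback, so async/await works directly. If you prefer the callback form shown above, it can be promisified with a small wrapper. A sketch (it relies only on the save(data, options, callback) convention, so any object with that shape works):

```javascript
// Sketch: promisify the callback form of file.save shown above.
// `file` is assumed to be a @google-cloud/storage File (or anything
// exposing save(data, options, callback)).
function saveBuffer(file, buffer, options = {}) {
  return new Promise((resolve, reject) => {
    file.save(buffer, options, (err) => (err ? reject(err) : resolve()));
  });
}
```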
Grandmother answered 25/5, 2020 at 17:19 Comment(0)
This is actually easy:

const stream = require('stream');

let remotePath = 'some/key/to/store.json';
let localReadStream = new stream.PassThrough();
localReadStream.end(JSON.stringify(someObject, null, '   '));

let remoteWriteStream = bucket.file(remotePath).createWriteStream({
  metadata: {
    contentType: 'application/json'
  }
});

localReadStream.pipe(remoteWriteStream)
  .on('error', err => {
    return callback(err);
  })
  .on('finish', () => {
    return callback();
  });
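The pipe-and-callback pattern above can also be wrapped in a promise so an error on either side of the pipe rejects it. A sketch (it depends only on standard stream events, so it is demonstrated here with plain streams rather than a real bucket):

```javascript
// Sketch: promise wrapper around the pipe-and-callback pattern above.
// Resolves on the destination's 'finish' event, rejects on an error
// from either the source or the destination stream.
function pipeToPromise(readable, writable) {
  return new Promise((resolve, reject) => {
    readable.on('error', reject);
    writable.on('error', reject).on('finish', resolve);
    readable.pipe(writable);
  });
}
```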
Cryptomeria answered 29/7, 2017 at 13:43 Comment(4)
Please elaborate the answer. – Mabellemable
Note that const stream = require('stream'); is needed – Tim
Where's the multer in-memory buffer object in this example? – Naze
Remote path slashes will get replaced by underscores and no "folder" will be created. – Cretonne
The following snippet is from a Google example. It assumes you have used multer (or something similar) and can access the uploaded file at req.file. You can stream that file to Cloud Storage using middleware resembling the following:

function sendUploadToGCS (req, res, next) {
  if (!req.file) {
    return next();
  }

  const gcsname = Date.now() + req.file.originalname;
  const file = bucket.file(gcsname);

  const stream = file.createWriteStream({
    metadata: {
      contentType: req.file.mimetype
    },
    resumable: false
  });

  stream.on('error', (err) => {
    req.file.cloudStorageError = err;
    next(err);
  });

  stream.on('finish', () => {
    req.file.cloudStorageObject = gcsname;
    file.makePublic().then(() => {
      req.file.cloudStoragePublicUrl = getPublicUrl(gcsname);
      next();
    });
  });

  stream.end(req.file.buffer);
}
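The snippet above calls a getPublicUrl helper that Google's sample defines elsewhere (a commenter asks about it below). It just builds the canonical public URL for an object; a minimal sketch, shown here taking the bucket name explicitly (the sample version presumably closes over it):

```javascript
// Sketch of a getPublicUrl helper (assumed, not from this answer):
// builds the public URL for an object in a Cloud Storage bucket.
function getPublicUrl(bucketName, filename) {
  return `https://storage.googleapis.com/${bucketName}/${encodeURIComponent(filename)}`;
}
```

Note the object must actually be public (e.g. via the makePublic() call above) for this URL to be readable.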
Spruce answered 13/6, 2018 at 20:40 Comment(1)
Where is the getPublicUrl() function? – Geometrid
This approach works for me:

const destFileName = `someFolder/${file.name}`;
const fileCloud = this.storage.bucket(bucketName).file(destFileName);

fileCloud.save(file.buffer, {
  contentType: file.mimetype
}, (err) => {
  if (err) {
    console.log("error");
  }
});
Pitiless answered 13/7, 2021 at 19:57 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.