I am using the fluent-ffmpeg library with Node.js to transcode videos that are originally in Flash video (.flv) format into MP4 at multiple resolutions (1080p, etc.). Once the transcoding is complete, I would like to move the transcoded video to an S3 bucket.
I pull the original .flv file from a source S3 bucket and pass a stream of it to the ffmpeg constructor. The issue is that once the transcoding completes, I don't know how to get a stream of the MP4 data to send to S3.
Here is the code I have so far:
var params = {
  Bucket: process.env.SOURCE_BUCKET,
  Key: fileName
};

// Stream the source object rather than buffering the whole response;
// fluent-ffmpeg accepts a readable stream as input
var inputStream = s3.getObject(params).createReadStream();
inputStream.on('error', function (err) {
  console.log(err, err.stack); // an error occurred
});

var format = ffmpeg(inputStream)
  .size('854x480')
  .videoCodec('libx264')
  .inputFormat('flv')
  .toFormat('mp4')
  .on('end', function () {
    // Ideally, I would like to do the uploading here
    var params = {
      Body: null, // {This is my confusion, how do I get the stream to add here?}
      Bucket: process.env.TRANSCODED_BUCKET,
      Key: fileName
    };
    s3.putObject(params, function (err, data) {
    });
  })
  .on('error', function (err) {
    console.log('an error happened: ' + err.message);
  });
In the code above, how can I get hold of the transcoded stream to assign to the "Body" property of the params object?
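
One idea I've been considering (untested, and reusing inputStream and fileName from above) is to take the upload out of the 'end' handler entirely: pipe the ffmpeg output into a PassThrough stream and hand that to s3.upload(), which, unlike putObject(), accepts a Body of unknown length:

var stream = require('stream');

var passThrough = new stream.PassThrough();

ffmpeg(inputStream)
  .size('854x480')
  .videoCodec('libx264')
  .inputFormat('flv')
  .toFormat('mp4')
  // fragmented mp4, since ffmpeg cannot write a standard mp4
  // to a non-seekable output such as a pipe
  .outputOptions('-movflags frag_keyframe+empty_moov')
  .on('error', function (err) {
    console.log('an error happened: ' + err.message);
  })
  .pipe(passThrough, { end: true });

s3.upload({
  Bucket: process.env.TRANSCODED_BUCKET,
  Key: fileName,
  Body: passThrough
}, function (err, data) {
  if (err) console.log(err, err.stack);
});

Is this a reasonable approach, or am I missing something?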
Update:
Here is a revision of what I am trying to do:
var MemoryStream = require('memorystream'); // assuming the npm "memorystream" package
var outputStream = new MemoryStream();

var proc = ffmpeg(currentStream)
  .size('1920x1080')
  .videoCodec('libx264')
  .inputFormat('avi')
  .toFormat('mp4')
  // fragmented mp4 so that ffmpeg can write to a stream
  // instead of a seekable file
  .outputOptions('-movflags frag_keyframe+empty_moov')
  .output(outputStream)
  // setup event handlers
  .on('end', function () {
    uploadFile(outputStream, "").then(function () {
      resolve();
    });
  })
  .on('error', function (err) {
    console.log('an error happened: ' + err.message);
  });

// with output(), nothing happens until run() is called
proc.run();
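
For context, the uploadFile helper I have in mind is roughly the following (a hypothetical sketch; it uses s3.upload() rather than s3.putObject(), since putObject() requires a known Content-Length that a stream cannot provide):

// Hypothetical helper: wraps s3.upload() in a promise and streams
// the transcoded data to the destination bucket
function uploadFile(body, key) {
  return new Promise(function (resolve, reject) {
    s3.upload({
      Bucket: process.env.TRANSCODED_BUCKET,
      Key: key,
      Body: body
    }, function (err, data) {
      if (err) reject(err);
      else resolve(data);
    });
  });
}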
I would like to avoid copying the file from S3 to the local filesystem; instead, I would prefer to process it entirely in memory and upload the result back to S3 when finished. Would fluent-ffmpeg allow this scenario?