How to wrap a buffer as a stream2 Readable stream?

3

72

How can I transform a Node.js buffer into a Readable stream using the streams2 interface?

I already found this answer and the stream-buffers module, but that module is based on the streams1 interface.

Angeliaangelic answered 16/4, 2013 at 13:41 Comment(0)
183

The easiest way is probably to create a new PassThrough stream instance, and simply push your data into it. When you pipe it to other streams, the data will be pulled out of the first stream.

var stream = require('stream');

// Initialize the source stream
var bufferStream = new stream.PassThrough();

// Write your buffer
bufferStream.end(Buffer.from('Test data.'));

// Pipe it to something else (e.g. stdout)
bufferStream.pipe(process.stdout);
Coadunate answered 16/4, 2013 at 18:25 Comment(8)
Unless node.js does so internally, this solution doesn't slice up the buffer into smaller chunks and so might not be ideal for some pipe destinations. But if you look, neither does the streamifier library from the accepted answer. So +1 for keeping it simple.Jaynejaynell
I do wonder if using var bufferStream = stream.PassThrough(); might make the intent clearer to later readers of the code, though?Jaynejaynell
Also, note that if your destination expects the stream to finish at some point you'll likely need to call bufferStream.end().Jaynejaynell
@Jaynejaynell There's no need to slice the buffer because the internal code of streams2 takes care of it (search "fromList", here). Actually, if you slice the buffer, the performance will be worse because if the stream needs to read more bytes than the buffer length, then if you slice it, streams2 will concat them again (here).Elizaelizabet
This requires two steps while streamifier only requires one.Nathanielnathanil
How would one test or set the chunk size?Contestation
I am not able to call events on the bufferStream object. For example, bufferStream.on('readable', function () { var chunk; while (null !== (chunk = bufferStream.read())) { //Do something } }).on('end', function () { //Do something });Remonstrant
Note that using the Buffer constructor has been deprecated. Use the Buffer.from('Test data.') method instead.Grishilda
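
Regarding the comment above about attaching event handlers: the same PassThrough stream can also be consumed with 'readable'/'end' listeners instead of pipe(). A minimal sketch using only the standard stream API (not part of the original answer):

var stream = require('stream');

var bufferStream = new stream.PassThrough();
bufferStream.end(Buffer.from('Test data.'));

bufferStream
    .on('readable', function () {
        var chunk;
        // read() returns null once the internal buffer is drained
        while (null !== (chunk = bufferStream.read())) {
            console.log('chunk:', chunk.toString());
        }
    })
    .on('end', function () {
        console.log('done');
    });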
36

As natevw suggested, it's even more idiomatic to use a stream.PassThrough, and end it with the buffer:

var stream = require( 'stream' );

var buffer = Buffer.from( 'foo' );
var bufferStream = new stream.PassThrough();
bufferStream.end( buffer );
bufferStream.pipe( process.stdout );

This is also how buffers are converted/piped in vinyl-fs.

Highclass answered 15/1, 2015 at 13:17 Comment(3)
Why would you end with the entire buffer? And why does end come after pipe here?Acromegaly
end( buffer ) is just write( buffer ) and then end(). I end the stream because it is not needed anymore. The order of end/pipe does not matter here, because PassThrough only starts emitting data when there's some handler for data events, like a pipe.Highclass
@Acromegaly Not slicing up the buffer means less overhead. If your consumer cannot handle large chunks, then guard it with something that splits chunks.Nathanielnathanil
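
To make the equivalence from the comment above concrete, here is a minimal sketch (not part of the original answer); both streams print the same output, and the order of end() and pipe() is interchangeable because PassThrough buffers its data until something starts reading:

var stream = require('stream');

// write(buffer) followed by end() ...
var a = new stream.PassThrough();
a.write(Buffer.from('foo'));
a.end();
a.pipe(process.stdout);

// ... is the same as end(buffer)
var b = new stream.PassThrough();
b.pipe(process.stdout); // pipe first, end afterwards - the order does not matter
b.end(Buffer.from('foo'));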
4

A modern, simple approach that is usable anywhere you would use fs.createReadStream(), but without having to write the file to a path first.

const { Duplex } = require('stream'); // native Node module

function bufferToStream(myBuffer) {
    const tmp = new Duplex();
    tmp.push(myBuffer); // queue the whole buffer as a single chunk
    tmp.push(null);     // signal end-of-stream
    return tmp;
}

const myReadableStream = bufferToStream(your_buffer);
  • The bufferToStream() helper is reusable; call it again whenever you need a fresh stream (a stream itself can only be consumed once).
  • The buffer and the stream exist only in memory, without writing anything to local storage.
  • I use this approach often when the actual file is stored at some cloud service and our API acts as a go-between; files are never written to local disk (see the sketch after this list).
  • I have found this to be very reliable regardless of the buffer size (up to 10 MB in my case) or the destination that accepts a Readable stream. Larger files should implement
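
A hypothetical usage sketch of the go-between scenario, using the bufferToStream() helper above. fetchBufferFromCloud() is an assumed placeholder for whatever produces the buffer in your application, not part of the original answer; the buffer is streamed straight to the HTTP response without touching the local filesystem:

const http = require('http');

http.createServer(async (req, res) => {
    const buffer = await fetchBufferFromCloud(req.url); // assumed helper, e.g. a cloud SDK call
    res.writeHead(200, { 'Content-Type': 'application/octet-stream' });
    bufferToStream(buffer).pipe(res); // stream the in-memory buffer directly to the client
}).listen(3000);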
Inventor answered 1/7, 2020 at 17:31 Comment(1)
This worked for me. Really helpful since I don't want to download the file on the server. I just manipulate the data and send straight to a DB.Function
