Converting a Buffer into a ReadableStream in Node.js

I have a library that takes a ReadableStream as input, but my input is just a base64-encoded image. I can convert the data into a Buffer like so:

var img = Buffer.from(img_string, 'base64');

But I have no idea how to convert that Buffer (or the base64 string itself) into a ReadableStream.

Is there a way to do this?

Grimonia asked 5/11, 2012 at 10:46 Comment(0)

You can create a ReadableStream using Node Stream Buffers like so:

// Requires the 'stream-buffers' package (npm install stream-buffers)
var streamBuffers = require('stream-buffers');

// Initialize stream
var myReadableStreamBuffer = new streamBuffers.ReadableStreamBuffer({
  frequency: 10,      // how often to push queued chunks, in milliseconds
  chunkSize: 2048     // in bytes
});

// With a buffer
myReadableStreamBuffer.put(aBuffer);

// Or with a string
myReadableStreamBuffer.put("A String", "utf8");

The frequency cannot be 0, so this approach always introduces a certain delay.
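A minimal sketch of wiring this up end to end, assuming img is the decoded Buffer from the question and a recent version of the package (stop() signals that no more data will be put):

var streamBuffers = require('stream-buffers');

// Sketch: 'img' is assumed to be the base64-decoded Buffer from the question.
var myReadableStreamBuffer = new streamBuffers.ReadableStreamBuffer({
  frequency: 10,
  chunkSize: 2048
});
myReadableStreamBuffer.put(img);
myReadableStreamBuffer.stop();               // no more data will be put()
myReadableStreamBuffer.pipe(process.stdout); // or any writable consumer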

Salomon answered 16/1, 2013 at 23:8 Comment(7)
Thanks, even though it's a bit late. I don't remember how I solved the problem, but this looks like a nice solution. If anybody can confirm this, that would be great. I remember finding ZERO about this conversion.Grimonia
Confirming that it works - found this when looking up how to turn filebuffers into streams.Kulseth
If you are dealing with files, you should rather open a file read stream straight away with this: nodejs.org/api/fs.html#fs_fs_createreadstream_path_optionsSalomon
Milliseconds is not a measurement of frequency - I suppose they mean period.Butanone
@Butanone I cannot change it as this is the property name and the unit IS milliseconds.Salomon
@Salomon - I wasn't suggesting it was your fault :) Just poor naming on the part of the node-stream-buffer devs.Butanone
"Node Stream Buffer is obviously designed for use in testing; the inability to avoid a delay makes it a poor choice for production use."Bryan LarsenBolinger

For Node.js 10.17.0 and up:

const { Readable } = require('stream');

const stream = Readable.from(myBuffer);
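A short usage sketch (img_string and the output file name are placeholders, not part of the original answer):

const { Readable } = require('stream');
const fs = require('fs');

// Sketch: decode the base64 image and stream it to a file.
const myBuffer = Buffer.from(img_string, 'base64');
Readable.from(myBuffer).pipe(fs.createWriteStream('out.png'));

Note that Readable.from emits a string or Buffer argument as a single chunk rather than iterating it element by element.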
Misjoinder answered 2/6, 2020 at 0:52 Comment(2)
does this have any impact on image files?Ticker
It will work for the case described by the OP, but if the buffer contains binary data, .toString() will corrupt it.Inocenciainoculable

something like this...

import { Readable } from 'stream'

const buffer = Buffer.from(img_string, 'base64') // Buffer.from replaces the deprecated new Buffer()
const readable = new Readable()
readable._read = () => {} // _read is required but you can noop it
readable.push(buffer)
readable.push(null)

readable.pipe(consumer) // consume the stream

In the general course, a readable stream's _read function should collect data from the underlying source and push it incrementally, ensuring you don't harvest a huge source into memory before it's needed.

In this case though you already have the source in memory, so _read is not required.

Pushing the whole buffer just wraps it in the readable stream api.
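For contrast, a sketch of that incremental style, pushing the buffer chunk by chunk from inside _read (unnecessary for an in-memory buffer, as noted above, but it shows where the harvesting logic would live):

import { Readable } from 'stream'

// Illustrative sketch: serve the buffer in chunks from _read instead of
// pushing it all up front.
const buffer = Buffer.from(img_string, 'base64')
let offset = 0

const readable = new Readable({
  read(size) {
    if (offset >= buffer.length) {
      this.push(null) // end of stream
      return
    }
    const chunk = buffer.subarray(offset, offset + size)
    offset += chunk.length
    this.push(chunk)
  }
})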

Strangulate answered 20/5, 2017 at 22:52 Comment(4)
Wouldn't it be more correct to push() the buffer inside the _read() method? I.e. readable._read = () => {readable.push(buffer); readable.push(null);} . Not sure it matters, but allowing the stream to manage the timing of when data is fed into seems less likely to run into unexpected behavior. Other than that this should be the accepted answer, as it doesn't rely on 3rd party modules.Through
Generally, you'd be right, but for this specific use case I wouldn't push inside the read method. Conceptually I think _read should be reserved for "harvesting" data from an underlying source. In this case we not only have the data in memory, but no conversion is required. So for wrapping data in a stream this is how I would do it, but for converting or accumulating data in a stream, that logic would happen in the _read method.Strangulate
Your underlying source is the buffer ;)Southeaster
@FranckFreiburger Yes, but you're not "harvesting" data from that source, it's already in memory and you're always going to consume it all in one go, you're not pulling it in on demand.Strangulate

Node Stream Buffer is obviously designed for use in testing; the inability to avoid a delay makes it a poor choice for production use.

Gabriel Llamas suggests streamifier in this answer: How to wrap a buffer as a stream2 Readable stream?

Larochelle answered 12/8, 2013 at 14:58 Comment(0)

You can use the standard Node.js stream API for this: stream.Readable.from

const { Readable } = require('stream');
const stream = Readable.from(buffer);

Note: don't convert a buffer to a string (buffer.toString()) if it contains binary data; that will corrupt binary files.
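A small sketch of why (the bytes here are arbitrary and just for illustration; 0xFF is not a valid UTF-8 sequence):

const { Readable } = require('stream');

// Arbitrary binary data, e.g. the first bytes of a JPEG.
const binary = Buffer.from([0xff, 0xd8, 0xff]);

Readable.from(binary);            // chunks arrive intact
// Readable.from(binary.toString()) would re-encode the bytes as UTF-8,
// replacing the invalid sequences and corrupting the data.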

Inocenciainoculable answered 4/1, 2022 at 17:29 Comment(2)
I noted that this only works with Node.js 12+. The "readable.push(buffer)" method works with Node.js 8.17.0 and Node.js 10.24.1 in my tests. As I stick to supported Node.js versions, it is a non-issue for me.Unprepared
It's clean and also it works for me.Noman

You don't need to add a whole npm lib for a single file. I refactored it to TypeScript:

import { Readable, ReadableOptions } from "stream";

export class MultiStream extends Readable {
  _object: any;

  constructor(object: any, options?: ReadableOptions) {
    // Buffers and strings can be pushed as-is; anything else needs objectMode.
    super(object instanceof Buffer || typeof object === "string" ? options : { objectMode: true });
    this._object = object;
  }

  _read = () => {
    // The first call pushes the whole object; _object is then cleared, so the
    // next call pushes null, which ends the stream.
    this.push(this._object);
    this._object = null;
  };
}

Based on node-streamifier (the best option, as said above).
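A hypothetical usage sketch (img_string is a placeholder for the base64 input):

// Wrap a decoded image Buffer and pipe it to any consumer.
const stream = new MultiStream(Buffer.from(img_string, "base64"));
stream.pipe(process.stdout);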

Cataldo answered 4/5, 2019 at 13:28 Comment(0)

Here is a simple solution using the streamifier module:

const streamifier = require('streamifier');
streamifier.createReadStream(Buffer.from([97, 98, 99])).pipe(process.stdout); // prints "abc"

You can pass a String, a Buffer, or an Object as its argument.

Labiche answered 9/1, 2018 at 3:22 Comment(2)
Another equivalent alternative is tostream: const toStream = require('tostream'); toStream(new Buffer ([97, 98, 99])).pipe(process.stdout);Dogmatism
@YushinWashio Definitely. Plenty of modules are available in Node.Labiche

This is my simple code for this:

import { Readable } from 'stream';

const newStream = new Readable({
  read() {
    this.push(someBuffer);
    this.push(null); // end the stream; without this, read() is called again and the buffer is pushed repeatedly
  },
});
Archiepiscopal answered 17/3, 2020 at 11:20 Comment(0)

Try this:

const Duplex = require('stream').Duplex;  // core NodeJS API
function bufferToStream(buffer) {
  let stream = new Duplex();
  stream.push(buffer);  // pushes are buffered, so this works before any read
  stream.push(null);    // signal end of the readable side
  return stream;
}

Source: Brian Mancini -> http://derpturkey.com/buffer-to-stream-in-node/
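A usage sketch, assuming img is the decoded Buffer from the question. Note that only the readable side is used here; the Duplex's _write is never implemented, so writing to the returned stream would throw:

const img = Buffer.from(img_string, 'base64');
bufferToStream(img).pipe(process.stdout);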

Anabal answered 17/5, 2020 at 12:3 Comment(0)
