The nodejs Duplex stream requires the implementer to specify both a write and a read method:
import stream from 'stream';

const duplex = new stream.Duplex({
  write(chunk, encoding, next) {
    // Do something with the chunk, then call next() to indicate that
    // the chunk has been processed. write() handles data piped into
    // this duplex stream. Note that the writable and readable sides
    // of a Duplex are independent: data written here is not handed
    // to read() automatically; you must push() it yourself.
    next();
  },
  read(size) {
    // Add new data to be read by streams piped from this duplex
    this.push('some data');
  }
});
The official nodejs documentation on streams is available here: API for Stream Implementers
The websocket scenario
The websocket example described above should probably use a Readable rather than a duplex stream. Duplex streams are useful in store-and-forward or process-and-forward scenarios. However, it sounds like the stream in the websocket example is used solely to move data from the websocket to a stream interface. This can be achieved using a Readable:
import stream from 'stream';

const onSocketConnection = (socket) => {
  const readable = new stream.Readable({
    // The read logic is omitted since the data is pushed to the socket
    // outside of the script's control. However, the read() function
    // must still be defined, even as a no-op.
    read() {}
  });

  socket.on('message', (data) => {
    // Push the data onto the readable queue
    readable.push(data);
  });

  readable.pipe(ffmpeg);
};
push inside read(). Also, when you continuously write (push) to a Readable stream, that feels like a Readable + Writable. How is that different from a Duplex then? I'm not sure why the example code I have uses Duplex streams; it just pipes the stream to sox and then to ffmpeg, but if I use a Readable that behaves like you describe, that should work fine I guess – Lammers