Node.js PassThrough stream not closing properly?
I'm curious why my PassThrough stream isn't closing after the resource I pipe it to closes. I'm using it as a mediator: one resource needs a ReadableStream, and I need to hand the user a WritableStream so they can write to the underlying resource. At first a Duplex stream seemed ideal, but it requires implementing the read/write methods myself; then I found the PassThrough stream.

EDIT: Best description of this problem here: https://gist.github.com/four43/46fd38fd0c929b14deb6f1744b63026a

Original example: Check this out:

const fs = require('fs');
const stream = require('stream');

const passThrough = new stream.PassThrough({allowHalfOpen: false});
const writeStream = fs.createWriteStream('/tmp/output.txt');

passThrough.pipe(writeStream)
    .on('end', () => console.log('full-end'))
    .on('close', () => console.log('full-close'))
    .on('unpipe', () => console.log('full-unpipe'))
    .on('finish', () => console.log('full-finish'));
passThrough
    .on('end', () => console.log('passThrough-end'))
    .on('close', () => console.log('passThrough-close'))
    .on('unpipe', () => console.log('passThrough-unpipe'))
    .on('finish', () => console.log('passThrough-finish'));

passThrough.end('hello world');

Actual output:

passThrough-finish
passThrough-end
full-unpipe
full-finish
full-close

Seems like the write side does its job, but the "read" side of the PassThrough stream doesn't propagate the close, even though the "allowHalfOpen" option was passed as false (and I can verify in the debugger that the option took).

Am I going about this all wrong? How would I propagate the close of the writeStream?

Thanks.

Edit: I'm finding the same is true of Transform streams; they just aren't ended once the pipe is closed. Is there a way to manually close them? transform.end() never causes the stream to emit a "close" event, just "finish" and "end" events, which fire before the underlying resource succeeds.

Edit2: I put together this Gist: https://gist.github.com/four43/46fd38fd0c929b14deb6f1744b63026a

That shows me that the readable in readable.pipe(writable) is closed down properly when the writable finishes. That would lead me to believe that when I do transform.pipe(writable) it would close the "readable" side of the transform stream, and since I already "closed" the writable side with .end(), it should close the whole stream. Side note of interest: read is tossing events even though we never use it in Test 2. Could be an isolation issue, but I think my timeout wait does a pretty good job.

Hallucinate answered 15/12, 2016 at 3:18 Comment(0)
If you want to know when writeStream is done writing, just listen for the 'finish' event on writeStream:

const fs = require('fs');
const stream = require('stream');

const passThrough = new stream.PassThrough({allowHalfOpen: false});
const writeStream = fs.createWriteStream('/tmp/output.txt');

passThrough
    .on('error', (err) => console.error(err))
    .on('end', () => console.log('passThrough-end'))
    .on('close', () => console.log('passThrough-close'))
    .on('unpipe', () => console.log('passThrough-unpipe'))
    .on('finish', () => console.log('passThrough-finish'));

writeStream
    .on('error', (err) => console.error(err))
    .on('close', () => console.log('full-close'))
    .on('unpipe', () => console.log('full-unpipe'))
    .on('finish', () => console.log('full-finish'));

// passThrough-finish is logged because all writes are complete
passThrough.end('hello world');

passThrough.pipe(writeStream);
Sodom answered 15/12, 2016 at 4:54 Comment(3)
I could, but that underlying stream is opaque and hidden from me. (I'm using my PassThrough stream and piping it to aws-sdk's s3.upload() method). I'll edit my question to show the inconsistent behavior. – Hallucinate
@cr125rider why do you care whether the Writable being piped to finished? If you don't have access to it, then it doesn't matter, no? In that case, knowing that your Readable is closed should be sufficient. Also worth noting that there is no 'close' event for Transform streams. – Sodom
In order to guarantee the contents of the underlying stream have completed writing, you need to wait until the write stream is done. You'll run into all sorts of strange issues if you aren't careful. This actually came up because a failing unit test was looking for something that had finished writing to the stream, read the file, and the content wasn't there. – Hallucinate
