I was testing streams in Node, so I set up a program that reads a large file and writes it back out using streams. The problem is that when I run it, Node's memory usage climbs to 1.3 GB, which is exactly the size of the file being read. It's as if nothing is being streamed at all: either the whole file is buffered and written in one go, or the garbage collector never frees the chunk variables. This is the program:
const { createReadStream, createWriteStream } = require('fs');

const readStream = createReadStream('../movie.mp4', {
  highWaterMark: 10000
});
const writeStream = createWriteStream('./copy.mp4', {
  highWaterMark: 10000
});

readStream.on('data', function (chunk) {
  writeStream.write(chunk);
});

readStream.on('end', function () {
  console.log("reading done");
  writeStream.end();
});

writeStream.on('close', function () {
  console.log("Writing done.");
});
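If this is a backpressure problem, my guess is that the manual version needs to check the boolean returned by write() and pause the read stream until the writable emits 'drain'. This is only a sketch of what I mean, and I haven't confirmed that it fixes the memory growth:

const { createReadStream, createWriteStream } = require('fs');

const readStream = createReadStream('../movie.mp4', { highWaterMark: 10000 });
const writeStream = createWriteStream('./copy.mp4', { highWaterMark: 10000 });

readStream.on('data', function (chunk) {
  // write() returns false once the writable's internal buffer is full;
  // stop reading until that buffer has drained
  if (!writeStream.write(chunk)) {
    readStream.pause();
    writeStream.once('drain', function () {
      readStream.resume();
    });
  }
});

readStream.on('end', function () {
  writeStream.end();
});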
The weird thing is that if I pipe the two streams instead, it works as expected and memory usage never goes above 20 MB. Like this:
const { createReadStream, createWriteStream } = require('fs');

const readStream = createReadStream('../movie.mp4', {
  highWaterMark: 10000
});
const writeStream = createWriteStream('./copy.mp4', {
  highWaterMark: 10000
});

readStream.pipe(writeStream);
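For what it's worth, I would expect stream.pipeline to behave the same as pipe here, since as far as I know it also propagates backpressure (and adds error handling), though I haven't verified that with this file:

const { createReadStream, createWriteStream } = require('fs');
const { pipeline } = require('stream');

pipeline(
  createReadStream('../movie.mp4', { highWaterMark: 10000 }),
  createWriteStream('./copy.mp4', { highWaterMark: 10000 }),
  function (err) {
    // called once the copy finishes or either stream errors
    if (err) {
      console.error('Copy failed:', err);
    } else {
      console.log('Writing done.');
    }
  }
);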
What could cause such behavior?
Node version: v14.15.4