You should use the pipe method because it manages the flow of data automatically, so the destination Writable stream is not overwhelmed by a faster Readable stream.
If your readable stream is faster than the writable stream and you call des.write(data) directly, the writable stream's internal buffer can grow without bound and eat up memory, so it is better to use src.pipe(des);
If the file size is big then you should use streams; that's the correct way of doing it. I tried an example similar to yours to copy a 3.5 GB file with streams and pipe, and it worked flawlessly in my case, so check your code again, you must be doing something wrong.
The example I tried:
'use strict'
const fs = require('fs')

// pipe() moves the data and handles backpressure for you
const readStream = fs.createReadStream('./Archive.zip')
const writeStream = fs.createWriteStream('./Archive3.zip')
readStream.pipe(writeStream)
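If you also want to know when the copy has finished (or failed), you can listen for the standard stream events; this is just a small extension of the same example:

readStream.on('error', (err) => console.error('read failed', err))
writeStream.on('error', (err) => console.error('write failed', err))
// 'finish' fires once all data has been flushed to the destination
writeStream.on('finish', () => console.log('copy complete'))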
However, if you still need to call des.write(data) yourself, you can handle backpressure to avoid buffering unbounded amounts of data when the readStream is faster. If des.write(data) returns false, the writeStream's internal buffer is full, so pause the readStream with src.pause().
To continue once the writeStream has drained, handle the drain event on the writeStream and resume in the callback:
des.on("drain", () => src.resume())
To allow the writeStream to buffer more data in memory, you can set highWaterMark for the writeStream to a very high value, for example:
const des = fs.createWriteStream('newTest.txt', {
  highWaterMark: 1628920128 // about 1.6 GB of write buffer
});
Be careful with a very large highWaterMark, because it consumes too much memory and defeats the primary advantage of streaming data.
I would still definitely recommend using pipe, as it handles all of this for you with less code.
Docs:
https://nodejs.org/api/stream.html#stream_writable_write_chunk_encoding_callback
https://nodejs.org/api/stream.html#stream_readable_pipe_destination_options