I want to download an encrypted file from my server, decrypt it, and save it locally. I want to decrypt the file and write it locally as it is being downloaded, rather than waiting for the download to finish, decrypting the whole file, and then putting the decrypted result in an anchor tag. The main reason I want to do this is so that with large files the browser does not have to hold hundreds of megabytes or several gigabytes in memory.
This is only going to be possible with a combination of service worker + fetch + streams. A few browsers have service workers and fetch, but even fewer support fetch with streaming (Blink):
new Response(new ReadableStream({...}))
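That line is the core trick: a Response can wrap a ReadableStream, so whatever the page enqueues is streamed to the consumer chunk by chunk, and inside a service worker that consumer is the browser's download manager. A minimal sketch of the idea, runnable outside a service worker (the filename and the two chunks are just illustrative; in a real service worker you would hand the Response to event.respondWith):

```javascript
// A Response built from a ReadableStream delivers its body
// incrementally; nothing has to be buffered up front.
const encoder = new TextEncoder()

const stream = new ReadableStream({
  start (controller) {
    // In StreamSaver these chunks would arrive from the page
    // via postMessage; here we enqueue two chunks directly.
    controller.enqueue(encoder.encode('hello '))
    controller.enqueue(encoder.encode('world'))
    controller.close()
  }
})

// In a service worker this Response would answer the intercepted
// request; here we just read it back to show the body arrives whole.
const response = new Response(stream, {
  headers: { 'Content-Disposition': 'attachment; filename="filename.txt"' }
})

const textPromise = response.text()
textPromise.then(text => console.log(text)) // 'hello world'
```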
I have built a streaming file saver library that communicates with a service worker in order to intercept network requests: StreamSaver.js
It's a little bit different from Node's streams; here is an example:
function unencrypt (value) {
  // should return a Uint8Array
  return new Uint8Array(value)
}

// We use fetch instead of xhr because fetch has streaming support
fetch(url).then(res => {
  // create a writable stream + intercept the network response
  const fileStream = streamSaver.createWriteStream('filename.txt')
  const writer = fileStream.getWriter()
  // stream the response
  const reader = res.body.getReader()
  const pump = () => reader.read().then(({ value, done }) => {
    if (done) {
      // close the stream so the download can finish
      return writer.close()
    }
    const chunk = unencrypt(value)
    // Write one chunk, then get the next one
    writer.write(chunk) // returns a promise
    // While the write stream can handle the watermark,
    // read more data
    return writer.ready.then(pump)
  })
  // Start the reader
  pump().then(() =>
    console.log('Closed the stream, done writing')
  )
})
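The same pump/backpressure pattern can be tried without a server, StreamSaver, or real crypto. In this sketch an in-memory ReadableStream stands in for res.body, an in-memory WritableStream stands in for StreamSaver's write stream, and a toy XOR "cipher" stands in for actual decryption (all three are assumptions for illustration only):

```javascript
const KEY = 0x42

function unencrypt (value) {
  // hypothetical decryption step: XOR every byte with the key
  return value.map(byte => byte ^ KEY)
}

// source: pretends to be res.body ("hi", XOR-encrypted)
const source = new ReadableStream({
  start (controller) {
    controller.enqueue(new Uint8Array([104 ^ KEY, 105 ^ KEY]))
    controller.close()
  }
})

// sink: pretends to be streamSaver's writable stream
const received = []
const sink = new WritableStream({
  write (chunk) { received.push(...chunk) }
})

const writer = sink.getWriter()
const reader = source.getReader()

const pump = () => reader.read().then(({ value, done }) => {
  if (done) return writer.close()
  writer.write(unencrypt(value)) // returns a promise
  // wait until the writer's queue drains below its high-water mark
  return writer.ready.then(pump)
})

const finished = pump().then(() => String.fromCharCode(...received))
finished.then(text => console.log(text)) // 'hi'
```

The writer.ready promise is what gives you backpressure: if the sink falls behind, the pump pauses reading instead of piling decrypted chunks up in memory.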
There are also two other ways you can get a streaming response with XHR, but they're not standard and it doesn't matter whether you use them (responseType = 'ms-stream' or 'moz-chunked-arraybuffer'), because StreamSaver depends on fetch + ReadableStream anyway and can't be used any other way.
Later you will be able to do something like this, when WritableStream and Transform streams get implemented as well:
fetch(url).then(res => {
const fileStream = streamSaver.createWriteStream('filename.txt')
res.body
.pipeThrough(unencrypt)
.pipeTo(fileStream)
.then(done)
})
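That "later" API has since landed: TransformStream now exists in all major browsers (and Node 18+), so the decryption step really can sit between the response body and the file stream. A runnable sketch, where an in-memory sink stands in for streamSaver's write stream and a toy uppercase transform stands in for decryption (both are assumptions for illustration):

```javascript
const decoder = new TextDecoder()
const encoder = new TextEncoder()

// a TransformStream in place of the hypothetical `unencrypt` above;
// real code would decrypt the chunk instead of uppercasing it
const unencrypt = new TransformStream({
  transform (chunk, controller) {
    controller.enqueue(encoder.encode(decoder.decode(chunk).toUpperCase()))
  }
})

// pretends to be res.body
const body = new ReadableStream({
  start (controller) {
    controller.enqueue(encoder.encode('secret'))
    controller.close()
  }
})

// pretends to be streamSaver.createWriteStream(...)
let output = ''
const fileStream = new WritableStream({
  write (chunk) { output += decoder.decode(chunk) }
})

const done = body
  .pipeThrough(unencrypt)
  .pipeTo(fileStream)
  .then(() => output)

done.then(text => console.log(text)) // 'SECRET'
```

pipeThrough/pipeTo also propagate backpressure automatically, so the manual pump loop from the earlier example is no longer needed.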
It's also worth mentioning that the default download manager is commonly associated with background downloads, so people sometimes close the tab when they see the download start. But this is all happening in the main thread, so you need to warn the user before they leave:
window.onbeforeunload = function (e) {
  if (download_is_done()) return
  var dialogText = 'Download is not finished yet; leaving the page will abort the download'
  e.returnValue = dialogText
  return dialogText
}
A new solution has arrived: showSaveFilePicker / FileSystemWritableFileStream. Supported in Chrome and all major derivatives (including Edge and Opera) since the end of 2020, and with a shim (written by the author of the other major answer!) for Firefox and Safari, it will allow you to do this directly:
async function streamDownloadDecryptToDisk(url, DECRYPT) {
// create readable stream for ciphertext
let rs_src = fetch(url).then(response => response.body);
// create writable stream for file
let ws_dest = window.showSaveFilePicker().then(handle => handle.createWritable());
// create transform stream for decryption
let ts_dec = new TransformStream({
async transform(chunk, controller) {
controller.enqueue(await DECRYPT(chunk));
}
});
// stream cleartext to file
let rs_clear = rs_src.then(s => s.pipeThrough(ts_dec));
return (await rs_clear).pipeTo(await ws_dest);
}
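The function above is browser-only because of window.showSaveFilePicker, but the pipeline itself can be exercised anywhere web streams and fetch exist. In this sketch the picker is injected as a parameter so an in-memory stand-in can replace it, a data: URL stands in for the server, and identity "decryption" stands in for DECRYPT (all three are assumptions for illustration; in the browser you would use the real picker and your real endpoint):

```javascript
const decoder = new TextDecoder()

// in-memory stand-in for the FileSystemFileHandle the picker returns
const chunks = []
const fakeHandle = {
  createWritable: async () => new WritableStream({
    write (chunk) { chunks.push(decoder.decode(chunk)) }
  })
}

// same shape as the answer's function, with the picker injected
async function streamDownloadDecryptToDisk (url, DECRYPT, picker) {
  const rs_src = fetch(url).then(response => response.body)
  const ws_dest = picker().then(handle => handle.createWritable())
  const ts_dec = new TransformStream({
    async transform (chunk, controller) {
      controller.enqueue(await DECRYPT(chunk))
    }
  })
  const rs_clear = rs_src.then(s => s.pipeThrough(ts_dec))
  return (await rs_clear).pipeTo(await ws_dest)
}

const done = streamDownloadDecryptToDisk(
  'data:text/plain,hello',       // stand-in for the server URL
  async chunk => chunk,          // identity "decryption"
  async () => fakeHandle         // stand-in for window.showSaveFilePicker
).then(() => chunks.join(''))

done.then(text => console.log(text)) // 'hello'
```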
Depending on performance requirements (if you're trying to compete with MEGA, for instance) you might also consider modifying DECRYPT(chunk) to allow you to use a ReadableStreamBYOBReader with it, which offers:

…zero-copy reading from an underlying byte source. It is used for efficient copying from underlying sources where the data is delivered as an "anonymous" sequence of bytes, such as files.
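As a sketch of what BYOB reading looks like (assuming Node 18+ or a modern browser; the 4-byte block size is an arbitrary stand-in for a cipher's block size):

```javascript
// a byte source; `type: 'bytes'` is required for BYOB readers
const rs = new ReadableStream({
  type: 'bytes',
  start (controller) {
    controller.enqueue(new Uint8Array([1, 2, 3, 4, 5, 6, 7, 8]))
    controller.close()
  }
})

const reader = rs.getReader({ mode: 'byob' })

async function readBlocks () {
  const blocks = []
  let buffer = new ArrayBuffer(4)
  while (true) {
    // the reader fills the view we hand it ("bring your own buffer"),
    // so decryption can work on fixed-size blocks without extra copies
    const { value, done } = await reader.read(new Uint8Array(buffer))
    if (done) break
    blocks.push(Array.from(value))
    buffer = value.buffer // reuse the (transferred) buffer
  }
  return blocks
}

const blocksPromise = readBlocks()
blocksPromise.then(blocks => console.log(blocks))
```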
fetch streams appropriately, just as it does when downloading files normally. – Timeserver

The reader in this case is nothing more than the ReadableStream produced by Fetch itself. It spits out data in whatever chunk size is efficient for downloading, probably to do with the network, at the browser's discretion. If you need to process the data in 32B or 5MB or whatever chunk sizes, you'll need to package such units up yourself, from the stream Fetch gives you. – Timeserver