file.slice fails second time
I'm trying to write front-end JavaScript that can copy very large files (i.e. read them from a file input element and 'download' them using StreamSaver.js).

This is the actual code:

<html>
<head>
    <title>File copying</title>
</head>
<body>
<script src="https://cdn.jsdelivr.net/npm/[email protected]/dist/ponyfill.min.js"></script>
<script src="https://cdn.jsdelivr.net/npm/[email protected]/StreamSaver.min.js"></script>

<script type="text/javascript">
    const streamSaver = window.streamSaver;

    async function copyFile() {
        const fileInput = document.getElementById("fileInput");
        const file = fileInput.files[0];
        if (!file) {
            alert('select a (large) file');
            return;
        }
        const newName = file.name + " - Copy";
        let remaining = file.size;
        let written = 0;
        const chunkSize = 1048576; // 1MB

        const writeStream = streamSaver.createWriteStream(newName);
        const writer = writeStream.getWriter();

        while (remaining > 0) {
            let readSize = chunkSize > remaining ? remaining : chunkSize;
            let blob = file.slice(written, readSize);
            let aBuff = await blob.arrayBuffer();
            await writer.write(new Uint8Array(aBuff));
            written += readSize;
            remaining -= readSize;
        }
        await writer.close();
    }
</script>
<input type="file" id="fileInput"/>
<button onclick="copyFile()">Copy file</button>
</body>
</html>

It seems that during the second iteration of the while loop, the aBuff variable (the result of blob.arrayBuffer()) is an empty ArrayBuffer.

Am I reading the file the wrong way? My intent is to read a (potentially huge) file chunk by chunk and do something with each chunk (in this case, just write it to the downloaded file via StreamSaver.js). What better approach is available in today's browsers?
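(Editorial note: the empty second chunk comes from the fact that Blob.slice(start, end) takes an absolute end offset, like String.prototype.slice, not a length — so slice(written, readSize) yields an empty Blob once written reaches readSize. A minimal corrected loop, with writeChunk and the chunkSize parameter as illustrative stand-ins:)

```javascript
// Sketch of a corrected read loop. Blob.slice(start, end) takes an
// absolute end offset (like String.prototype.slice), not a chunk length,
// so both bounds must advance together on each iteration.
async function copyChunks(file, writeChunk, chunkSize = 1048576) {
  let written = 0;
  while (written < file.size) {
    const end = Math.min(written + chunkSize, file.size); // absolute end offset
    const buffer = await file.slice(written, end).arrayBuffer();
    await writeChunk(new Uint8Array(buffer));
    written = end;
  }
}
```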

Glioma answered 12/6, 2020 at 14:50
I would go with something like blob.stream() or new Response(blob).body to read all chunks of the file, and potentially a TransformStream if needed. But if you need a custom slice size or better browser support, then you can create your own blob → ReadableStream utility:

// Adapted from https://www.npmjs.com/package/screw-filereader
// (the original is a Blob.prototype method, so `var blob = this` is
// dropped here — it would shadow the parameter in a standalone function)
function stream (blob) {
  var position = 0

  return new ReadableStream({
    pull (controller) {
      // slice with absolute offsets: [position, position + 1 MiB)
      var chunk = blob.slice(position, position + 1048576)

      return chunk.arrayBuffer().then(buffer => {
        position += buffer.byteLength
        controller.enqueue(new Uint8Array(buffer))

        if (position == blob.size)
          controller.close()
      })
    }
  })
}

stream(file).pipeTo(writeStream)

This way you can just pipe it to StreamSaver instead of writing each chunk manually.
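(Editorial note: the built-in route mentioned at the top of this answer can be sketched as follows, against any WritableStream target — e.g. the one returned by streamSaver.createWriteStream in the question:)

```javascript
// Sketch of the built-in alternative: Blob.prototype.stream() already
// yields a ReadableStream of Uint8Array chunks, so the whole copy is a
// single pipeTo — the browser picks the chunk size for you.
async function copyViaStream(file, writableStream) {
  await file.stream().pipeTo(writableStream);
}
```

Usage against the question's setup would then be `copyViaStream(file, streamSaver.createWriteStream(file.name + ' - Copy'))`.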

Manizales answered 16/10, 2020 at 21:4
