I am trying to capture the download progress of a Fetch request and use it to change the width of a progress bar. I looked at ProgressEvent.lengthComputable as a potential solution, but I'm unsure whether it can be used with the Fetch API.
Fetch API Download Progress Indicator?
Not true. The promise from a fetch() resolves after the first packet has been received, but doesn't wait until the whole body is there. –
Lawabiding
#36454450 –
Judicatory
then #35712224 would be better; besides, it's older –
Gaslight
Can't flag as duplicate because of the bounty, but it's all there. –
Lawabiding
Adriani6 Touffy thanks a lot for that information –
Gaslight
Let's reopen, because this question is download-specific, and the suggested duplicate's answer is upload-specific –
Armbruster
Without error handling (no try/catch etc.):
const elStatus = document.getElementById('status');
function status(text) {
  elStatus.innerHTML = text;
}

const elProgress = document.getElementById('progress');
function progress({loaded, total}) {
  elProgress.innerHTML = Math.round(loaded / total * 100) + '%';
}

async function main() {
  status('downloading with fetch()...');
  const response = await fetch('https://fetch-progress.anthum.com/30kbps/images/sunrise-baseline.jpg');

  // Total size comes from the Content-Length header (may be absent or,
  // for compressed responses, smaller than the decoded body).
  const contentLength = response.headers.get('content-length');
  const total = parseInt(contentLength, 10);
  let loaded = 0;

  // Wrap the original body in a new stream so every chunk can be counted
  // as it arrives, then re-emitted unchanged.
  const res = new Response(new ReadableStream({
    async start(controller) {
      const reader = response.body.getReader();
      for (;;) {
        const {done, value} = await reader.read();
        if (done) break;
        loaded += value.byteLength;
        progress({loaded, total});
        controller.enqueue(value);
      }
      controller.close();
    },
  }));

  const blob = await res.blob();
  status('download completed');
  document.getElementById('img').src = URL.createObjectURL(blob);
}
main();
<div id="status"> </div>
<h1 id="progress"> </h1>
<img id="img" />
adapted from here
This doesn't work if the server sends a compressed (content-encoded) response, e.g. gzip. Say the client sends an Accept-Encoding: gzip header, and the server responds with Content-Type: application/json, Content-Encoding: gzip, and Content-Length: xxx. Then the length xxx will be much smaller than the total length of the chunks read from the body reader; basically, loaded will exceed total at some point, because the Content-Length header contains the size of the compressed response, while loaded is calculated from the chunks after decompression. –
Loper
Apparently you're out of luck for compressed responses, period. There's no standard header that carries the size of the decompressed data. If you control the server, you could send a custom header with that info. –
Rafaellle
Progress incorrect when content is gzip encoded: github.com/AnthumChris/fetch-progress-indicators/issues/13 / Incorrect progress for gzip encoded response: github.com/samundrak/fetch-progress/issues/22 –
Postconsonantal
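Following up on the comments above: if you control the server, you can expose the decompressed size yourself. A minimal sketch, assuming a custom x-decoded-content-length header (not a standard header; that name and the downloadWithProgress/toPercent helpers are mine):

```javascript
// Progress math kept separate so it can be checked without a network call.
function toPercent(loaded, total) {
  // With no usable total (missing header, or 0) report 0 rather than NaN;
  // clamp at 100 in case `loaded` overshoots the advertised size.
  return total ? Math.min(100, Math.round((loaded / total) * 100)) : 0;
}

async function downloadWithProgress(url, onProgress) {
  const response = await fetch(url);
  // Prefer the hypothetical custom header carrying the *decompressed* size;
  // fall back to content-length, which only matches `loaded` when the
  // response is not content-encoded.
  const total =
    Number(response.headers.get('x-decoded-content-length')) ||
    Number(response.headers.get('content-length')) ||
    0;
  let loaded = 0;
  const chunks = [];
  const reader = response.body.getReader();
  for (;;) {
    const {done, value} = await reader.read();
    if (done) break;
    loaded += value.byteLength;
    chunks.push(value);
    onProgress(toPercent(loaded, total));
  }
  return new Blob(chunks);
}
```

The server side would set that header from the byte length of the payload before compressing it.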
Using this utility:
async function* streamToAsyncIterable(stream) {
const reader = stream.getReader()
try {
while (true) {
const {done, value} = await reader.read()
if (done) return
yield value
}
} finally {
reader.releaseLock()
}
}
Then you can use a for await...of loop:
const response = await fetch(url)
let responseSize = 0
for await (const chunk of streamToAsyncIterable(response.body)) responseSize += chunk.length
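The loop above can be exercised without any network call: below, a locally built ReadableStream stands in for response.body (the utility is repeated so the snippet runs standalone; the measureStream and streamOf names are mine):

```javascript
// Repeated from the answer above so this snippet runs on its own.
async function* streamToAsyncIterable(stream) {
  const reader = stream.getReader()
  try {
    while (true) {
      const {done, value} = await reader.read()
      if (done) return
      yield value
    }
  } finally {
    reader.releaseLock()
  }
}

// Sum chunk sizes with the same for await...of pattern.
async function measureStream(stream) {
  let size = 0
  for await (const chunk of streamToAsyncIterable(stream)) size += chunk.length
  return size
}

// A local stream standing in for `response.body` (needs the global
// ReadableStream: any modern browser, or Node 18+).
function streamOf(...chunks) {
  return new ReadableStream({
    start(controller) {
      for (const c of chunks) controller.enqueue(c)
      controller.close()
    },
  })
}
```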
IMPORTANT
But be aware that responseSize is the response size, not necessarily the download size (content-length)!
What is the difference?
There is no difference if there is no content-encoding (gzip, br, ...). But if compression was applied, the download size will be the size of the compressed data (the same as content-length), while the response size will be the size of the uncompressed data.
See @ecthiender's comment and this thread.
More verbose example with resolving the full response:
// https://mcmap.net/q/510963/-fetch-api-download-progress-indicator
let responseSize = 0 // `responseSize` is response-size! Not necessarily download-size ('content-length')! See the above link.
const chunks = []
for await (const chunk of streamToAsyncIterable(response.body)) {
responseSize += chunk.length
console.log(`${responseSize.toLocaleString('en-US')} decompressed bytes received.`)
chunks.push(chunk)
}
const bytes = new Uint8Array(responseSize)
let offset = 0
for (const chunk of chunks) {
bytes.set(chunk, offset) // `chunk` is a `Uint8Array`
offset += chunk.length
}
const resBody = new TextDecoder().decode(bytes)
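Since a body stream can only be read once, response.json() will fail after the loop above. A sketch (the chunksToJson helper name is mine) of parsing the collected chunks instead, decoding the whole buffer once at the end:

```javascript
// Concatenate the collected Uint8Array chunks, decode the whole buffer
// once, then parse. Decoding chunk-by-chunk risks splitting a multi-byte
// UTF-8 character across two chunks.
function chunksToJson(chunks) {
  const size = chunks.reduce((n, chunk) => n + chunk.length, 0)
  const bytes = new Uint8Array(size)
  let offset = 0
  for (const chunk of chunks) {
    bytes.set(chunk, offset)
    offset += chunk.length
  }
  return JSON.parse(new TextDecoder().decode(bytes))
}

// e.g. after the for await loop above:
//   const data = chunksToJson(chunks)
```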
This seems like a good answer. I have one issue: I tried to call foo = await response.json() afterwards, and it failed because the body stream had already been read. Is there a way to get the JSON from the response at the end? –
Ardin
@dooderson; AFAIK, in that case you need to do it manually. I mean you need to read (and concatenate) totalBytes in the same for await loop, then convert totalBytes to a string and JSON.parse() it. Don't convert every chunk to a string (and then concatenate the strings)! That causes issues with multi-byte characters that may fall on the boundary between two sequential chunks. –
Postconsonantal
You can use axios instead:
import axios from 'axios'

export async function uploadFile(file, cb) {
  const url = `//127.0.0.1:4000/profile`
  try {
    let formData = new FormData()
    formData.append("avatar", file)
    const data = await axios.post(url, formData, {
      onUploadProgress: (progressEvent) => {
        console.log(progressEvent)
        if (progressEvent.lengthComputable) {
          let percentComplete = progressEvent.loaded / progressEvent.total;
          if (cb) {
            cb(percentComplete)
          }
        }
      }
    })
    return data
  } catch (error) {
    console.error(error)
  }
}
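The snippet above tracks upload progress; for the download case the question asks about, axios also provides an onDownloadProgress option (a real axios request-config option). A sketch, with the progress math factored out into a helper of mine so it can be checked without a network request:

```javascript
// `e` mimics the progress object axios hands to onDownloadProgress
// (a ProgressEvent-like object with `loaded` and `total`).
function progressFraction(e) {
  return e.total ? e.loaded / e.total : null
}

// Usage sketch (assumes axios is installed and imported):
//
//   const res = await axios.get(url, {
//     responseType: 'blob',
//     onDownloadProgress: (e) => {
//       const fraction = progressFraction(e)
//       if (fraction !== null && cb) cb(fraction)
//     },
//   })
```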
Specifically asked for fetch and not XMLHttpRequest! –
Guria