Consuming chunked data asynchronously in JavaScript

I have a (GET) endpoint that sends data in chunks (Transfer-Encoding: chunked). The data is JSON encoded and sent line by line.

Is there a way to consume the data sent by this endpoint in an asynchronous manner in JavaScript (or using some JavaScript library)?

To be clear, I know how to perform an asynchronous GET, but I would like the GET request not to wait for the whole body to be transferred, and instead read the data line by line as it arrives. For instance, when doing:

curl  http://localhost:8081/numbers

The lines below are shown one by one as they become available (the example server I made waits a second between sending one line and the next).

{"age":1,"name":"John"}
{"age":2,"name":"John"}
{"age":3,"name":"John"}
{"age":4,"name":"John"}

I would like to reproduce the same behavior curl exhibits, but in the browser. I don't want to leave the user waiting until all the data becomes available before showing anything.

Radiograph answered 10/3, 2018 at 8:22 Comment(7)
The future answer would likely be developer.mozilla.org/en-US/docs/Web/API/Streams_APIUncouple
Thanks! That's why I cannot find an answer to this problem anywhere.Radiograph
Well, the Streams API doesn't look like it's coming to Firefox anytime soon, but ReadableStream is already available with the Fetch API. You might find this article on how to handle streams with the Fetch API interesting.Ottoman
Damn! I wish I had seen your answer before re-implementing the end-point to use server sent events :) I'm gonna give ReadableStream a try.Radiograph
Right, you may save some workload and a websockets library dependency at the server side.Ottoman
The good news I have a working version of this! The bad news: it only works on Chrome (Firefox 58 will give the TypeError: response.body is undefined).Radiograph
This isn't valid Transfer-Encoding: chunked stream format. It's missing the chunk length.Amnesia

Thanks to Dan and Redu, I was able to put together an example that consumes data incrementally, using the Fetch API. The caveat is that this will not work on Internet Explorer, and it has to be enabled by the user in Firefox:

/** This works on Edge, Chrome, and Firefox (from version 57). To use this
    example in Firefox, navigate to about:config and set these preferences
    to true:

    - dom.streams.enabled
    - javascript.options.streams

    See https://developer.mozilla.org/en-US/docs/Web/API/ReadableStream
*/

fetch('http://localhost:8081/numbers').then(function(response) {

  console.log(response);

  const decoder = new TextDecoder("utf-8");
  const reader = response.body.getReader();

  function go() {
    reader.read().then(function(result) {
      if (!result.done) {
        var num = JSON.parse(decoder.decode(result.value));
        console.log("Got number " + num.intVal);
        go();
      }
    });
  }

  go();
});

The full example (with the server) is available at my sandbox. I find it illustrative of the limitations of XMLHttpRequest to compare this version with this one, which does not use the Fetch API.

Radiograph answered 10/3, 2018 at 16:30 Comment(6)
Glad to see that you have worked it out with the ReadableStream. Just one quick reminder; unlike in Haskell, in JS using recursion in production code might turn out to be hazardous, since your call stack might eventually get blown up if you have 200K+ chunks to read. So if you know the chunk count in advance, you might as well fill an array of that size with reader.read() promises and then chain their .then() stages by reducing the array, sequencing the promises. Just an idea.Ottoman
Thanks for the reminder Redu. I spent too much time doing Haskell :)Radiograph
Note that there is no guarantee that the body stream will give you the full chunks in one read (#57412598). You might be better off using NDJSON (buffering the decoded text until you hit a newline, then parsing the JSON in the buffer up until then).African
@African your answer pretty much covers every use case, but i have a hard time applying ndjson parsing to my reader, do you have any usage sample?Delia
@Delia This might be of some assistance: github.com/deanhume/streamsAfrican
As already mentioned, this fails to handle multiple received chunks in one read or partially read chunks.Amnesia
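The NDJSON buffering approach described in the comments above can be sketched like this (a minimal, hypothetical helper, not part of any library): accumulate decoded text in a buffer and only parse complete lines, so it does not matter how the stream splits or merges chunks.

```javascript
// Line-buffering NDJSON parser sketch. Feed it decoded text chunks in
// any split; it calls onObject once per complete JSON line.
function makeNDJSONParser(onObject) {
  let buffer = "";
  return function feed(text) {
    buffer += text;
    let newlineIndex;
    while ((newlineIndex = buffer.indexOf("\n")) >= 0) {
      const line = buffer.slice(0, newlineIndex).trim();
      buffer = buffer.slice(newlineIndex + 1);
      if (line) onObject(JSON.parse(line));
    }
  };
}
```

Wired into the fetch example above, it might look like this (sketch, assuming the same endpoint):

```javascript
const decoder = new TextDecoder("utf-8");
const feed = makeNDJSONParser(obj => console.log(obj));

function pump(reader) {
  return reader.read().then(({ done, value }) => {
    if (done) return;
    // stream: true keeps multi-byte characters split across chunks intact.
    feed(decoder.decode(value, { stream: true }));
    return pump(reader);
  });
}

fetch('http://localhost:8081/numbers')
  .then(response => pump(response.body.getReader()));
```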