Fetch vs Request
I'm trying to use fetch to consume a JSON stream. The stream emits some data every few seconds, but fetch only gives me access to the data once the stream is closed server-side. For example:

var target; // the url.
var options = {
  method: "POST",
  body: bodyString,
};
var drain = function(response) {
  // hit only when the stream is killed server side.
  // response.body is always undefined. Can't use the reader it provides.
  return response.text(); // or response.json();
};
var listenStream = fetch(target, options).then(drain).then(console.log).catch(console.log);

/*
    logs the data to the console with a 200 code, but only once the server stream has been killed.
*/

However, several chunks of data have already been sent to the client by that point.

Using a Node-inspired method in the browser like this works every single time an event is sent:

var request = require('request');
var JSONStream = require('JSONStream');
var es = require('event-stream');

request(options)
.pipe(JSONStream.parse('*'))
.pipe(es.map(function(message) { // Pipe catches each fully formed message.
      console.log(message)
 }));

What am I missing? My instinct tells me that fetch should be able to mimic the pipe or stream functionality.

Rockling answered 3/8, 2016 at 4:25 Comment(11)
You're better off using the old XMLHttpRequest, I suspect – Swivet
Can you link to a streaming XHR example analogous to the request example? – Rockling
Maybe this first result of a search engine search for "xhr streaming example" – Swivet
Haha, fair enough :) I haven't had the right luck with XHR yet, but I'll take a look. – Rockling
No guarantees - I have used similar methods in the deep dark past, before server-sent events – Swivet
I think the issue I'm seeing with XHR is that onreadystatechange doesn't change till the server stream's done - roughly the same as fetch. In which case I'm curious to understand what the request family is doing differently to be able to run in the browser. – Rockling
Change to what value? You want 3 or 4, rather than just the traditional 4 – Swivet
Agreed. I'm not getting a change event until close. – Rockling
Possibly server-side dependent on that; I know it's damned hard to do with PHP - although you say the Node-inspired method works in a browser? – Swivet
Works perfectly. The data exists browser-side in all cases - it's a question of surfacing the events correctly. – Rockling
@JaromandaX you were right. Check my answer; it doesn't really answer the question, but it does offer a solution :) – Rockling
response.body gives you access to the response as a stream. To read a stream:

fetch(url).then(response => {
  const reader = response.body.getReader();

  reader.read().then(function process(result) {
    if (result.done) return;
    console.log(`Received a ${result.value.length} byte chunk of data`);
    return reader.read().then(process);
  }).then(() => {
    console.log('All done!');
  });
});

Here's a working example of the above.

Fetch streams are more memory-efficient than XHR, as the full response doesn't buffer in memory, and result.value is a Uint8Array, making it far more useful for binary data. If you want text, you can use TextDecoder:

fetch(url).then(response => {
  const reader = response.body.getReader();
  const decoder = new TextDecoder();

  reader.read().then(function process(result) {
    if (result.done) return;
    const text = decoder.decode(result.value, {stream: true});
    console.log(text);
    return reader.read().then(process);
  }).then(() => {
    console.log('All done!');
  });
});

Here's a working example of the above.

Soon there will be a transform-stream form of TextDecoder (this later shipped in browsers as TextDecoderStream), allowing you to do response.body.pipeThrough(new TextDecoderStream()), which is much simpler and allows the browser to optimise.
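In environments that support TextDecoderStream, that pattern can be sketched like this (readAsText is an illustrative helper name, not a built-in):

```javascript
// Sketch: pipe a byte stream (such as response.body) through a
// TextDecoderStream so each chunk arrives as a string, not a Uint8Array.
async function readAsText(byteStream, onChunk) {
  const reader = byteStream.pipeThrough(new TextDecoderStream()).getReader();
  for (;;) {
    const { done, value } = await reader.read();
    if (done) return;
    onChunk(value); // value is a decoded string chunk
  }
}

// Usage with fetch would look like:
//   fetch(url).then(r => readAsText(r.body, text => console.log(text)));
```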

As for your JSON case, streaming JSON parsers can be a little big and complicated. If you're in control of the data source, consider a format that's chunks of JSON separated by newlines. This is really easy to parse and leans on the browser's JSON parser for most of the work. Here's a working demo; the benefits can be seen at slower connection speeds.
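A minimal sketch of that newline-delimited approach (the names makeLineParser and onMessage are mine, not from the demo): feed decoded text chunks in as they arrive, buffer any partial trailing line, and JSON.parse each complete line.

```javascript
// Newline-delimited JSON splitter: call the returned function with each
// decoded text chunk; it invokes onMessage once per complete JSON line,
// keeping any partial trailing line buffered until the next chunk.
function makeLineParser(onMessage) {
  let buffer = '';
  return function feed(chunk) {
    buffer += chunk;
    const lines = buffer.split('\n');
    buffer = lines.pop(); // partial trailing line, if any
    for (const line of lines) {
      if (line.trim() !== '') onMessage(JSON.parse(line));
    }
  };
}
```

Wired into the reader loop above, you'd call feed(text) with each decoded chunk instead of logging it.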

I've also written an intro to web streams, which includes their use from within a service worker. You may also be interested in a fun hack that uses JavaScript template literals to create streaming templates.

Rabbinism answered 3/8, 2016 at 11:39 Comment(4)
Thanks for the answer! As mentioned in the code comments, I'm consistently seeing response.body as undefined against the same example where the XHR version works. Any idea why that might be? We do control the data source and are putting each new object on a new line for exactly the reasons you outlined. @jaffa-the-cake – Rockling
Any suggestions on the body issue? stackoverflow.com/users/123395/jaffa-the-cake – Rockling
response.body being undefined suggests the browser hasn't implemented streams yet. What browser & version are you using? – Rabbinism
Do XHR approaches (such as request.js) buffer the full response in memory? E.g. if I HTTP GET a 1 GB file, will it consume 1 GB of RAM before I stream it to disk? – Galenic
It turns out I could get XHR to work - which doesn't really answer the request vs. fetch question. It took a few tries and the right ordering of operations to get right. Here's the abstracted code. @jaromanda was right.

var _tryXhr = function(target, data) {
  console.log(target, data);
  var xhr = new XMLHttpRequest();

  xhr.onreadystatechange = function() {
    console.log("state change.. state: " + this.readyState);
    // Note: responseText is cumulative - it contains everything
    // received so far, not just the latest chunk.
    console.log(this.responseText);
    if (this.readyState === 4) {
      // Hit once, on completion.
    }
    if (this.readyState === 3) {
      // Hit repeatedly - once per chunk as data arrives.
    }
  };

  xhr.open("POST", target);
  xhr.setRequestHeader("cache-control", "no-cache");
  xhr.setRequestHeader("Content-Type", "application/json");
  xhr.send(data);
};
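One wrinkle with this approach: responseText is cumulative, so each readyState 3 event sees everything received so far. A small helper (illustrative, not part of the code above) can track an offset and return only the new data:

```javascript
// Illustrative helper: XHR's responseText grows with each chunk, so
// remember how much has already been consumed and slice off the new part.
function makeChunkExtractor() {
  var seen = 0;
  return function extract(responseText) {
    var chunk = responseText.slice(seen);
    seen = responseText.length;
    return chunk;
  };
}

// Inside the readyState === 3 branch you'd call:
//   var newData = extract(this.responseText);
```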
Rockling answered 3/8, 2016 at 6:29 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.