How to use streams to JSON stringify large nested objects in Node.js?

I have a large JavaScript object that I want to convert to JSON and write to a file. I thought I could do this using streams, like so:

  var fs = require('fs');
  var JSONStream = require('JSONStream');
  var st = JSONStream.stringifyObject()
             .pipe(fs.createWriteStream('./output_file.js'))

  st.write(large_object);

When I try this I get an error:

stream.js:94
  throw er; // Unhandled stream error in pipe.
        ^
TypeError: Invalid non-string/buffer chunk
    at validChunk (_stream_writable.js:153:14)
    at WriteStream.Writable.write (_stream_writable.js:182:12)

So apparently I can't just write an object to this stringifyObject. I'm not sure what the next step is. Do I need to convert the object to a buffer? Or run the object through some conversion stream and pipe it to stringifyObject?

Goner answered 6/9, 2015 at 20:19 Comment(0)

JSONStream doesn't work that way, but since your large object is already loaded into memory, there is no point in streaming it anyway.

var fs = require('fs-extra');
var file = '/tmp/this/path/does/not/exist/file.txt';

// outputJson stringifies the object and creates any missing
// directories along the way.
fs.outputJson(file, { name: 'JP' }, function (err) {
  console.log(err); // => null
});

That will write the JSON.
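
If you'd rather not pull in fs-extra, plain fs with JSON.stringify does the same job, assuming the target directory already exists (creating missing directories is what outputJson adds):

var fs = require('fs');

// Stringify in memory, then write in one shot. JSON.stringify
// builds the entire string in memory first, so this can still
// fail for a sufficiently large object.
fs.writeFile('/tmp/file.json', JSON.stringify({ name: 'JP' }), function (err) {
  if (err) console.error(err);
});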

If you want to use JSONStream, you could do something like this:

var fs = require('fs');
var jsonStream = require('JSONStream');

var fl = fs.createWriteStream('dat.json');

var out = jsonStream.stringifyObject();
out.pipe(fl);

// Each top-level key/value pair is written as a [key, value] tuple.
var obj = { test: 10, ok: true };
for (var key in obj) out.write([key, obj[key]]);
out.end();
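
Note that stringifyObject still runs each value you write() through JSON.stringify internally, so, as the comments below point out, a single value that is itself a huge nested object gets serialized in one shot anyway.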
Enervate answered 6/9, 2015 at 20:59 Comment(5)
Your first suggestion leads to FATAL ERROR: JS Allocation failed - process out of memory - Goner
I just changed the second one to be exact code for your situation, unless large_object is an array. Try that. - Enervate
I tried the second version, however one of the obj[key] values is itself a large nested object, and that's what's throwing the memory allocation error. I'd need something similar that recurses into child objects. - Goner
@Goner May I ask how you solved it with recursion? - Lietuva
I've created a gist that streams the JSON with a TransformStream: gist.github.com/adrai/713b298fd83da0063910aa9f1674a5ed - Lietuva

Well, the question is quite old but still relevant today. I faced the same issue and solved it using the JsonStreamStringify package.

const { JsonStreamStringify } = require("json-stream-stringify");

Now,

// JsonStreamStringify is a readable stream, so piping it to the
// response writes the JSON out in chunks; no extra 'data' handler
// is needed (writing the chunks to res again would duplicate them).
const jsonStream = new JsonStreamStringify(cursor);
jsonStream.pipe(res);

Here you can read your file using fs and then use the above code; 'cursor' points at the data you want to serialize.

This way, the output is streamed as valid JSON.
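
For the original question, writing a large in-memory object to a file, a minimal sketch looks like this (large_object stands in for the asker's object):

const fs = require('fs');
const { JsonStreamStringify } = require('json-stream-stringify');

// JsonStreamStringify exposes a readable stream that walks the
// object and emits JSON in chunks instead of building one giant string.
const jsonStream = new JsonStreamStringify(large_object);
jsonStream.pipe(fs.createWriteStream('./output_file.json'));
jsonStream.on('error', (err) => console.error(err));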

Docs: https://www.npmjs.com/package/json-stream-stringify

Fugacity answered 4/8, 2022 at 18:4 Comment(0)
