MessagePack slower than native Node.js JSON

I just installed node-msgpack and tested it against native JSON. MessagePack is much slower. Anyone know why?

Using the authors' own benchmark...

node ~/node_modules/msgpack/bench.js 
msgpack pack:   4165 ms
msgpack unpack: 1589 ms
json    pack:   1352 ms
json    unpack: 761 ms
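
For anyone who wants to reproduce this without the module's bench.js, here is a minimal round-trip sketch of the same comparison (assuming node-msgpack's pack/unpack API and an arbitrary sample object of my own; timings will vary by machine and Node version):

var msgpack = require('msgpack');

// Arbitrary sample object (an assumption; bench.js uses its own DATA_TEMPLATE)
var obj = {'abcdef': 1, 'qqq': 13, '19': [1, 2, 3, 4]};
var packed = msgpack.pack(obj);
var str = JSON.stringify(obj);

console.time('msgpack pack');
for (var i = 0; i < 1e6; i++) msgpack.pack(obj);
console.timeEnd('msgpack pack');

console.time('msgpack unpack');
for (var i = 0; i < 1e6; i++) msgpack.unpack(packed);
console.timeEnd('msgpack unpack');

console.time('json pack');
for (var i = 0; i < 1e6; i++) JSON.stringify(obj);
console.timeEnd('json pack');

console.time('json unpack');
for (var i = 0; i < 1e6; i++) JSON.parse(str);
console.timeEnd('json unpack');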
Ahmad answered 8/1, 2013 at 21:34 Comment(1)
One of the contributors to node-msgpack has addressed this issue here: github.com/pgriess/node-msgpack/issues/38#issuecomment-22719635 – Xenophon

I'll assume you're talking about https://github.com/pgriess/node-msgpack.

Just looking at the source, I'm not sure how it could be fast. For example, in src/msgpack.cc they have the following:

Buffer *bp = Buffer::New(sb._sbuf.size);
memcpy(Buffer::Data(bp), sb._sbuf.data, sb._sbuf.size);

In Node terms, they are allocating and filling a new SlowBuffer for every request. You can benchmark the allocation part by doing the following:

var msgpack = require('msgpack');
var SB = require('buffer').SlowBuffer;
var tmpl = {'abcdef' : 1, 'qqq' : 13, '19' : [1, 2, 3, 4]};

console.time('SlowBuffer');
for (var i = 0; i < 1e6; i++)
    // 20 is the resulting size of their "DATA_TEMPLATE"
    new SB(20);
console.timeEnd('SlowBuffer');

console.time('msgpack.pack');
for (var i = 0; i < 1e6; i++)
    msgpack.pack(tmpl);
console.timeEnd('msgpack.pack');

console.time('stringify');
for (var i = 0; i < 1e6; i++)
    JSON.stringify(tmpl);
console.timeEnd('stringify');

// result - SlowBuffer: 915ms
// result - msgpack.pack: 5144ms
// result - stringify: 1524ms

So just allocating memory for the message already costs about 60% of the JSON.stringify time, and that's just one reason why it's so much slower.

Also take into account that JSON.stringify has gotten a lot of love from Google. It's highly optimized and would be difficult to beat.
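
For anyone re-running this on a current Node release: the SlowBuffer constructor is deprecated, so a rough modern equivalent of the allocation measurement, with Buffer.alloc and Buffer.allocUnsafe standing in for new SlowBuffer (my substitution, not part of the original test), would be:

var tmpl = {'abcdef' : 1, 'qqq' : 13, '19' : [1, 2, 3, 4]};

console.time('Buffer.alloc');
for (var i = 0; i < 1e6; i++)
    Buffer.alloc(20);          // zero-filled, 20 bytes like the packed template
console.timeEnd('Buffer.alloc');

console.time('Buffer.allocUnsafe');
for (var i = 0; i < 1e6; i++)
    Buffer.allocUnsafe(20);    // skips the zero-fill, may come from the pooled slab
console.timeEnd('Buffer.allocUnsafe');

console.time('stringify');
for (var i = 0; i < 1e6; i++)
    JSON.stringify(tmpl);
console.timeEnd('stringify');

The point of the original measurement stands either way: the per-message native-buffer allocation plus memcpy is overhead on top of the actual serialization work, which JSON.stringify avoids by building a V8 string directly.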

Bronwen answered 9/1, 2013 at 9:42 Comment(4)
"It's highly optimized and would be difficult to beat." +1 - No one probably wants to be optimizing C++ string marshaling between msgpack zone and V8 when there's super fast JSON already anyway.Motte
I agree about JSON.stringify's performance, but what about a native JSON implementation like NSJSONSerialization on iOS? Would MessagePack for Objective-C beat it? – Evelynneven
@Evelynneven I feel like we might be comparing apples to oranges. Does your program use both Node and Objective-C? – Bronwen
Yep! No way to compare apples to oranges! Nope, my considerations were about using a binary serializer instead of a JSON serializer on mobile clients (like iOS). See here: gist.github.com/frsyuki/2908191 – Evelynneven

I decided to benchmark all the popular Node.js modules for the Msgpack binary encoding, along with the PSON (protocol JSON) encoding library, against JSON, and the results are as follows:

  • JSON - fastest for encoding unless the payload includes a binary array
  • msgpack - normally second fastest, and fastest when a binary array is included
  • msgpack-js - consistently second to msgpack
  • pson - consistently slower than msgpack-js
  • msgpack5 - dog slow, always

I have published the benchmarking repository and detailed results at https://github.com/mattheworiordan/nodejs-encoding-benchmarks
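
To get a concrete feel for the binary-array case above, here is a minimal sketch of the kind of comparison I mean (assuming node-msgpack's pack API and that it serializes Buffers as raw msgpack bytes; the 1 KB Buffer payload is my own stand-in, not code from the linked repository):

var msgpack = require('msgpack');

// Object with a binary field: JSON has no binary type, so the Buffer gets
// expanded element by element via Buffer#toJSON, while msgpack can keep raw bytes.
var obj = {id: 1, payload: Buffer.alloc(1024, 0xab)};

console.time('JSON.stringify (binary field)');
for (var i = 0; i < 1e5; i++) JSON.stringify(obj);
console.timeEnd('JSON.stringify (binary field)');

console.time('msgpack.pack (binary field)');
for (var i = 0; i < 1e5; i++) msgpack.pack(obj);
console.timeEnd('msgpack.pack (binary field)');

// Encoded sizes show why JSON struggles with binary payloads
console.log('JSON bytes:   ', Buffer.byteLength(JSON.stringify(obj)));
console.log('msgpack bytes:', msgpack.pack(obj).length);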

Bouse answered 9/11, 2015 at 16:25 Comment(1)
But were those modules installed natively? If not, it wouldn't be a fair benchmark, since JSON is native and will obviously perform faster. – Depreciate
