What's the most efficient node.js inter-process communication library/method?

We have a few Node.js processes that need to be able to pass messages to each other. What's the most efficient way of doing that? How about using node_redis pub/sub?

EDIT: the processes might run on different machines

Mangum answered 24/6, 2011 at 5:51 Comment(7)
None. I would like to get a sense of what I should try. What are the common possibilities?Mangum
Well, I am looking for a library. How about Redis (pub/sub)?Mangum
Inter-process communication across machines has to be done over sockets. You can do it through a database like Redis, but that also has to go over the network. UDP is going to be the most efficient.Riverside
UDP is unreliable (packets can be duplicated, packet ordering is not guaranteed) and is not fit for the scenario he describes. It's good for stuff like heartbeats, DNS, streaming or implementing your own protocol.Conflagration
There's a good discussion here: groups.google.com/forum/?fromgroups=#!topic/nodejs/Pxbb_kgOQEsBain
Are you looking to send point to point, broadcast, or both? Any concern about reliability of delivery?Moonfaced
nodejs.org/api/cluster.html#clustersetupprimarysettingsTeeterboard
54

If you want to send messages from one machine to another and do not care about callbacks, then Redis pub/sub is the best solution. It's really easy to implement, and Redis is really fast.

First you have to install Redis on one of your machines.

It's really easy to connect to Redis:

var client = require('redis').createClient(redis_port, redis_host);

But do not forget to open the Redis port in your firewall!

Then you have to subscribe each machine to some channel:

client.on('ready', function() {
  return client.subscribe('your_namespace:machine_name');
});

client.on('message', function(channel, json_message) {
  var message;
  message = JSON.parse(json_message);
  // do whatever you want with the message
});

You may skip your_namespace and use the global namespace, but you will regret it sooner or later.

It's really easy to send messages, too:

var send_message = function(machine_name, message) {
  return client.publish("your_namespace:" + machine_name, JSON.stringify(message));
};

If you want to send different kinds of messages, you can use pmessages (pattern subscriptions) instead of plain messages:

client.on('ready', function() {
  return client.psubscribe('your_namespace:machine_name:*');
});

client.on('pmessage', function(pattern, channel, json_message) {
  // pattern === 'your_namespace:machine_name:*'
  // channel === 'your_namespace:machine_name:'+message_type
  var message = JSON.parse(json_message);
  var message_type = channel.split(':')[2];
  // do whatever you want with the message and message_type
});

send_message = function(machine_name, message_type, message) {
  return client.publish([
    'your_namespace',
    machine_name,
    message_type
  ].join(':'), JSON.stringify(message));
};

The best practice is to name your processes (or machines) by their functionality (e.g. 'send_email'). In that case a process (or machine) may be subscribed to more than one channel if it implements more than one piece of functionality.

Actually, it's possible to build bi-directional communication using Redis, but it's trickier, since it requires adding a unique callback channel name to each message in order to receive the callback without losing context.
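
For illustration, here is a minimal sketch of that callback-channel idea (the reply_to field, the channel naming and the request helper are my own, not part of the answer's code or of any Redis API; redis_port and redis_host are assumed defined as above):

var redis = require('redis');

// Two connections: a node_redis client switched into subscriber mode
// cannot publish, so keep one client for each direction.
var sub = redis.createClient(redis_port, redis_host);
var pub = redis.createClient(redis_port, redis_host);

var pending = {}; // reply channel -> callback

sub.on('message', function(channel, json_reply) {
  if (pending[channel]) {
    pending[channel](JSON.parse(json_reply));
    delete pending[channel];
    sub.unsubscribe(channel);
  }
});

// Requester: attach a unique reply channel to every outgoing message.
var request = function(machine_name, payload, callback) {
  var reply_channel = 'your_namespace:replies:' + Math.random().toString(36).slice(2);
  pending[reply_channel] = callback;
  sub.subscribe(reply_channel);
  pub.publish('your_namespace:' + machine_name,
              JSON.stringify({ reply_to: reply_channel, payload: payload }));
};

// Responder: inside the 'message' handler shown above, publish the answer
// back on the channel named in reply_to:
//   var msg = JSON.parse(json_message);
//   pub.publish(msg.reply_to, JSON.stringify({ result: 'done' }));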

So, my conclusion is this: use Redis if you need "send and forget" communication; investigate other solutions if you need full-fledged bi-directional communication.

Cybernetics answered 16/9, 2012 at 15:51 Comment(1)
Great answer. I'm just a bit worried about the performance hit of JSON.parse and JSON.stringify. I'm using Node.js for my game server with 3, 4, and even more node instances all communicating with Redis (so I can do horizontal scaling) -- and it's an aRPG game I am developing, so for example attacking a mob, moving, and all that stuff is going to be extremely busy. Would it still be fine? Or is this borderline premature-optimization thinking right now? ThanksChip
43

More than 4 years after the question was asked, there is an inter-process communication module called node-ipc. It supports Unix/Windows sockets for communication on the same machine as well as TCP, TLS and UDP, claiming that at least the socket, TCP and UDP transports are stable.

Here is a small example taken from the documentation in the GitHub repository:

Server for Unix Sockets, Windows Sockets & TCP Sockets

var ipc=require('node-ipc');

ipc.config.id   = 'world';
ipc.config.retry= 1500;

ipc.serve(
    function(){
        ipc.server.on(
            'message',
            function(data,socket){
                ipc.log('got a message : '.debug, data);
                ipc.server.emit(
                    socket,
                    'message',
                    data+' world!'
                );
            }
        );
    }
);

ipc.server.start();

Client for Unix Sockets & TCP Sockets

var ipc=require('node-ipc');

ipc.config.id   = 'hello';
ipc.config.retry= 1500;

ipc.connectTo(
    'world',
    function(){
        ipc.of.world.on(
            'connect',
            function(){
                ipc.log('## connected to world ##'.rainbow, ipc.config.delay);
                ipc.of.world.emit(
                    'message',
                    'hello'
                )
            }
        );
        ipc.of.world.on(
            'disconnect',
            function(){
                ipc.log('disconnected from world'.notice);
            }
        );
        ipc.of.world.on(
            'message',
            function(data){
                ipc.log('got a message from world : '.debug, data);
            }
        );
    }
);

I'm currently evaluating this module as a replacement for an old local IPC solution via stdin/stdout (though it could become remote IPC in the future). Maybe I will expand my answer when I'm done to give some more information about how well this module works.

Gemology answered 26/11, 2015 at 21:45 Comment(12)
How was your experience with node-ipc?Yablon
@shashi, I started playing with node-ipc an hour ago and can say it is awesome. I found it easy to setup two node processes talking to each other over a unix socket.Diptych
@Diptych Yes, I started using it as well, and till now it's been impressive!Yablon
@Yablon Sorry for my late response, I had little spare time in the last few days. I tested it with socket and TCP as well; it works without any glitches. If I get a little bit more time at the end of the week, I will update my answer to reflect my experiences.Gemology
How is node-ipc compared to ZeroMQ? Which one would be faster?Chip
In my testing node-ipc can send and receive about 15k messages per second. The built-in child_process fork functionality can do about 25k per second. This test was run on Windows using a 3.5GHz Intel i5.Psychopath
I just ran another test using the ws library to communicate between processes over WebSocket. I was able to get 75K. The key is not to do... on message -> send message; this will only give 20K. Instead, just fire away using a loop inside of a setInterval function. (You will eventually reach a point where going higher results in less throughput.)Psychopath
Honestly, I don't like it. node-ipc messes with the String prototype (things like '## connected to world ##'.rainbow that you can see in the example above) and its interface is kind of outdated for 2019 standards.Unreserved
I was surprised that a library which gets a million weekly downloads is modifying the String prototype. I couldn't find the code responsible for it though, so I asked them here - github.com/RIAEvangelist/node-ipc/issues/227Clerc
Github question answered (by me): It's just using the colors NPM package in the example code only, to provide the 'string'.rainbow field. Those string modifications are not part of node-ipc.Clerc
2022 notice: There was malware inserted in node-ipc. Bad versions were removed from main npm repository. I see a lot of people are using v9.2.1 (npm download stats), which seems to be the version before the incident. Latest versions (v11) include the peacenotwar dependency.Quasimodo
Looks pretty easy to do this directly in Node now nodejs.org/api/cluster.html#clustersetupprimarysettingsTeeterboard
42

Why not use ZeroMQ/0MQ for IPC? Redis (a database) is overkill for doing something as simple as IPC.

Quoting the guide:

ØMQ (ZeroMQ, 0MQ, zmq) looks like an embeddable networking library but acts like a concurrency framework. It gives you sockets that carry atomic messages across various transports like in-process, inter-process, TCP, and multicast. You can connect sockets N-to-N with patterns like fanout, pub-sub, task distribution, and request-reply. It's fast enough to be the fabric for clustered products. Its asynchronous I/O model gives you scalable multicore applications, built as asynchronous message-processing tasks.

The advantage of using 0MQ (or even vanilla sockets via the net library in Node core, minus all the features provided by a 0MQ socket) is that there is no master process. Its broker-less setup is the best fit for the scenario you describe. If you are just pushing out messages to various nodes from one central process, you can use a PUB/SUB socket in 0MQ (which also supports IP multicast via PGM/EPGM). Apart from that, 0MQ also provides various other socket types (PUSH/PULL/XREP/XREQ/ROUTER/DEALER) with which you can create custom devices.
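
To give a taste of the API, here is a minimal PUB/SUB sketch using the classic zmq bindings linked below (the port, topic name and payload are arbitrary):

var zmq = require('zmq');

// publisher.js -- one central process pushing messages out
var pub = zmq.socket('pub');
pub.bindSync('tcp://*:5555');
setInterval(function() {
  // first frame is the topic, second frame is the payload
  pub.send(['your_namespace', JSON.stringify({ hello: 'world' })]);
}, 1000);

// subscriber.js -- another process, possibly on another machine
var sub = zmq.socket('sub');
sub.connect('tcp://127.0.0.1:5555');
sub.subscribe('your_namespace'); // filter messages by topic prefix
sub.on('message', function(topic, payload) {
  console.log(topic.toString(), JSON.parse(payload.toString()));
});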

Start with this excellent guide: http://zguide.zeromq.org/page:all

For 0MQ 2.x:

http://github.com/JustinTulloss/zeromq.node

For 0MQ 3.x (A fork of the above module. This supports PUBLISHER side filtering for PUBSUB):

http://github.com/shripadk/zeromq.node

Conflagration answered 17/9, 2012 at 17:16 Comment(5)
This takes years to set up, and it's difficult to maintain. On top of all that, it's not Node.js; it only has a Node wrapper, which makes it even more difficult to fix problems.Maharani
ZeroMQ docs are difficult to read.... the thing took me days to get functioning and I'm sure I'll have to relearn it to troubleshoot anythingFugue
ZeroMQ will be your best friend the moment you find yourself wanting to use the same server across different languages. It's my pick for a solution. For example, say a library only works in Python: create a ZeroMQ server in Python and make a client in Node, and that's a nice binding. ZeroMQ has a learning curve; generally a full day of effort will get you running, maybe a bit more. The best approach is to really absorb the fundamentals so there's no going back to them later. Once fundamentally grasped, you can be productive in any next project, especially if you keep some boilerplate too.Ollayos
ZeroMQ holds a lot of power. For instance, the same IPC node can become a networking node without changing anything except the node's connection information.Ollayos
Now in 2023, I have found ZeroMQ documentation very, very well written. It may be difficult but only because it is very exhaustive, exploring all the edge cases of every scenario. A great read.Selassie
14

I would start with the built-in functionality that Node provides.
You can use process signalling like:

process.on('SIGINT', function () {
  console.log('Got SIGINT.  Press Control-D to exit.');
});

The documentation describes these signal events as follows:

Emitted when the process receives a signal. See sigaction(2) for a list of standard POSIX signal names such as SIGINT, SIGUSR1, etc.

Once you know about process, you can spawn a child process and hook it up to the message event to receive and send messages. When using child_process.fork() you can write to the child using child.send(message, [sendHandle]), and messages are received by a 'message' event on the child.
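
For instance, a minimal sketch of that fork()/send() channel (file names here are arbitrary):

// parent.js
var fork = require('child_process').fork;
var child = fork(__dirname + '/child.js');

child.on('message', function(msg) {
  console.log('parent got:', msg); // { pong: true }
});
child.send({ ping: true });

// child.js
process.on('message', function(msg) {
  console.log('child got:', msg); // { ping: true }
  process.send({ pong: true });
});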

Also, you can use cluster. The cluster module allows you to easily create a network of processes that all share server ports.

var cluster = require('cluster');
var http = require('http');
var numCPUs = require('os').cpus().length;

if (cluster.isMaster) {
  // Fork workers.
  for (var i = 0; i < numCPUs; i++) {
    cluster.fork();
  }

  cluster.on('exit', function(worker, code, signal) {
    console.log('worker ' + worker.process.pid + ' died');
  });
} else {
  // Workers can share any TCP connection
  // In this case it's an HTTP server
  http.createServer(function(req, res) {
    res.writeHead(200);
    res.end("hello world\n");
  }).listen(8000);
}

For 3rd party services you can check: hook.io, signals and bean.

Urticaria answered 12/9, 2012 at 12:11 Comment(2)
-1: "the processes might run on different machines". Node have a built-in channel between a process and their childs, same machine. The OP needs to communicate 2 DIFFERENT processes from DIFFERENT machines.Gap
The 'different machines' requirement was added as an edit; the built-in process signaling is at least somewhat relevant to answering the question.Scaffold
3

Take a look at node-messenger:

https://github.com/weixiyen/messenger.js

It will fit most needs easily (pub/sub, fire-and-forget, send/request) with an automatically maintained connection pool.

Administrate answered 27/5, 2013 at 11:48 Comment(1)
messenger's claim that it supports pub/sub is a bit of an exaggeration: every subscriber has to subscribe to a different TCP port, and the publisher has to know all these ports. That defeats the purpose of pub/sub.Cioffi
2

We are working on a multi-process Node app which has to handle a large number of real-time cross-process messages.

We tried Redis pub/sub first, which failed to meet the requirements.

Then we tried TCP sockets, which were better, but still not the best.

So we switched to UDP datagrams, which are much faster.

Here is the code repo, just a few lines of code: https://github.com/SGF-Games/node-udpcomm
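
For reference, the same idea can be sketched with Node's built-in dgram module (this is a generic illustration, not node-udpcomm's actual API; the port and message format are arbitrary):

var dgram = require('dgram');

// receiver.js
var server = dgram.createSocket('udp4');
server.on('message', function(buf, rinfo) {
  console.log('got', JSON.parse(buf.toString()), 'from', rinfo.address + ':' + rinfo.port);
});
server.bind(41234);

// sender.js -- another process, or another machine
var client = dgram.createSocket('udp4');
var buf = Buffer.from(JSON.stringify({ hello: 'world' }));
client.send(buf, 0, buf.length, 41234, '127.0.0.1', function(err) {
  client.close();
});

Keep in mind the comment above: UDP gives no delivery or ordering guarantees, so it suits frequent, loss-tolerant messages.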

Smite answered 29/9, 2012 at 7:46 Comment(0)
0

I needed IPC between web server processes in another language (Perl ;) a couple of years ago. After investigating IPC via shared memory, via Unix signals (e.g. SIGINT and signal handlers), and other options, I finally settled on something quite simple which works quite well and is fast. It may not fit the bill if your processes do not all have access to the same file system, however.

The concept is to use the file system as the communication channel. In my world, I have an EVENTS dir, and under it sub dirs to direct the message to the appropriate process: e.g. /EVENTS/1234/player1 and /EVENTS/1234/player2 where 1234 is a particular game with two different players. If a process wants to be aware of all events happening in the game for a particular player, it can listen to /EVENTS/1234/player1 using (in Node.js):

fs.watch (or fsPromises.watch)

If a process wants to listen to all events for a particular game, simply watch /EVENTS/1234 with the 'recursive: true' option set for fs.watch. Or watch /EVENTS to see all msgs -- the event produced by fs.watch will tell you which file path was modified.

For a more concrete example, in my world I have the web browser client of player1 listening for Server-Sent Events (SSE), and there is a loop running in one particular web server process to send those events. Now, a web server process servicing player2 wants to send a message (IPC) to the server process running the SSEs for player1, but doesn't know which process that might be; it simply writes (or modifies) a file in /EVENTS/1234/player1. That directory is being watched -- via fs.watch -- in the web server process handling SSEs for player1. I find this system very flexible and fast, and it can also be designed to leave a record of all messages sent. I use it so that one random web server process of many can communicate with one other particular web server process, but it could also be used in an N-to-1 or 1-to-N manner.
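
A minimal sketch of that pattern (the directory layout and message format are illustrative, not the exact code from my game):

var fs = require('fs');
var path = require('path');

var dir = '/EVENTS/1234/player1';

// listening process: react whenever a message file is added or changed
fs.watch(dir, function(eventType, filename) {
  if (!filename) return;
  fs.readFile(path.join(dir, filename), 'utf8', function(err, data) {
    if (!err) console.log('event for player1:', JSON.parse(data));
  });
});

// sending process: any other process with access to the same file system
fs.writeFile(path.join(dir, Date.now() + '.json'),
             JSON.stringify({ msg: 'your turn' }),
             function(err) { /* the watcher in the other process fires */ });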

Hope this helps someone. You're basically letting the OS and the file system do the work for you. Here are a couple links on how this works in MacOS and Linux:

https://developer.apple.com/library/archive/documentation/Darwin/Conceptual/FSEvents_ProgGuide/Introduction/Introduction.html#//apple_ref/doc/uid/TP40005289

https://man7.org/linux/man-pages/man7/inotify.7.html

Any module you're using in whatever language is hooking into an API like one of these. It's been 30+ years since I've fiddled much with Windows, so I don't know how file system events work there, but I bet there's an equivalent.

EDIT (more info on different platforms from https://nodejs.org/dist/latest-v19.x/docs/api/fs.html#fswatchfilename-options-listener):

Caveats: The fs.watch API is not 100% consistent across platforms, and is unavailable in some situations.

On Windows, no events will be emitted if the watched directory is moved or renamed. An EPERM error is reported when the watched directory is deleted.

Availability: This feature depends on the underlying operating system providing a way to be notified of file system changes.

On Linux systems, this uses inotify(7). On BSD systems, this uses kqueue(2). On macOS, this uses kqueue(2) for files and FSEvents for directories. On SunOS systems (including Solaris and SmartOS), this uses event ports. On Windows systems, this feature depends on ReadDirectoryChangesW. On AIX systems, this feature depends on AHAFS, which must be enabled. On IBM i systems, this feature is not supported. If the underlying functionality is not available for some reason, then fs.watch() will not be able to function and may throw an exception. For example, watching files or directories can be unreliable, and in some cases impossible, on network file systems (NFS, SMB, etc) or host file systems when using virtualization software such as Vagrant or Docker.

It is still possible to use fs.watchFile(), which uses stat polling, but this method is slower and less reliable.

EDIT2: https://www.npmjs.com/package/node-watch is a wrapper that may help on some platforms

Maurey answered 9/2, 2023 at 18:41 Comment(0)
0

Not everybody knows that pm2 has an API through which you can communicate with its processes.

// pm2-call.js:
import pm2 from "pm2";

pm2.connect(() => {
    pm2.sendDataToProcessId(
        {
            type: "process:msg",
            data: {
                some: "data",
                hello: true,
            },
            id: 0,
            topic: "some topic",
        },
        (err, res) => {}
    );
});

pm2.launchBus((err, bus) => {
    bus.on("process:msg", (packet) => {
        // pm2's docs use test-harness assertions here; in plain code,
        // packet.data is the { success: true } object sent back by pm2-app.js
        console.log("reply from pm2 id", packet.process.pm_id, packet.data);
    });
});
// pm2-app.js:
process.on("message", (packet) => {
    process.send({
        type: "process:msg",
        data: {
            success: true,
        },
    });
});

Selassie answered 18/2, 2023 at 11:58 Comment(0)
