How to reuse redis connection in socket.io?
Asked Answered
P

5

40

Here is my code using socket.io as the WebSocket layer, with redis pub/sub on the backend.

var io = require("socket.io").listen(server),
    buffer = [];

var redis = require("redis");

var subscribe = redis.createClient();  // <--- new-connection overhead

io.on('connection', function(client) {

    console.log(client.request.headers.cookie);

    subscribe.get("..", function (err, replies) {

    });

    subscribe.on("message",function(channel,message) {

        var msg = { message: [client.sessionId, message] };
        buffer.push(msg);
        if (buffer.length > 15) buffer.shift();
        client.send(msg);
    });

    client.on('message', function(message){
    });

    client.on('disconnect', function(){
        subscribe.quit();
    });
});

Every new io connection will create a new redis connection. If someone opens a browser with 100 tabs, the redis client will open 100 connections. It doesn't look nice.

Is it possible to reuse the redis connection if the cookies are the same? That way, someone opening many browser tabs would still be treated as a single connection.

Prow answered 21/4, 2011 at 3:59 Comment(2)
I just wrote a scalable socket.io sample; you may want to take a look.Gallop
Here is one good linkChristlike
M
64

Actually, you are only creating a new redis client for every connection if you instantiate the client inside the "connection" event handler. What I prefer to do when creating a chat system is to create three redis clients: one for publishing, one for subscribing, and one for storing values in redis.

for example:

var socketio = require("socket.io")
var redis = require("redis")

// redis clients
var store = redis.createClient()
var pub = redis.createClient()
var sub = redis.createClient()

// ... application paths go here

var socket = socketio.listen(app)

sub.subscribe("chat")

socket.on("connection", function(client){
  client.send("welcome!")

  client.on("message", function(text){
    store.incr("messageNextId", function(e, id){
      store.hmset("messages:" + id, { uid: client.sessionId, text: text }, function(e, r){
        pub.publish("chat", "messages:" + id)
      })
    })
  })

  client.on("disconnect", function(){
    client.broadcast(client.sessionId + " disconnected")
  })

  sub.on("message", function(channel, key){
    store.hgetall(key, function(e, obj){
      client.send(obj.uid + ": " + obj.text)
    })
  })

})
Margetmargette answered 21/4, 2011 at 20:49 Comment(8)
just to be clear, this is only creating three redis clients in total regardless of how many users are connected. Adding another node process obviously results in more redis clients.Margetmargette
for sub.on('message'), why do you do client.send instead of client.broadcast? ThanksClapperclaw
@Noli good question. You will notice that because we are subscribing to a redis channel inside the socket "connection" closure, this is all that is needed to send everyone a message: the "message" event on the sub object gets triggered for every connected client. If we used client.broadcast(), every person would see the message multiplied by the number of people in the room.Margetmargette
@Noli just to clarify, we could use broadcast but we would have to bind the listener outside of the "connection" closure so that the event only gets fired once. We also would have to change it to socket.broadcast() because the client object is not available to us. This may be better depending on the situation. Good catch :)Margetmargette
Why put the sub.on inside the closure instead of: sub.subscribe("chat"); sub.on("message", function(channel, key) { store.hgetall(key, function(e, obj) { io.sockets.send(obj.uid + ": " + obj.text) }) }); io.sockets.on("connection", function(client) { ... }?Savonarola
I know there's been a significant gap of time here, but I believe that the answer to this question is somewhat dangerous. Binding an event listener to the message event of sub will happen every time a new client joins the chat. This event listener will remain around after the client disconnects. This will result in a buildup of stale event listeners handling messages for clients who have already gone.Nevile
@Nevile I know this response is even later, but I've figured out how to resolve the dangling listener issue. #11618311Precipitous
@jackquack a better idea would be to have redis subscribe to the messages outside of the socketio logic. You could then have a global array of current socket connections which the subscribe logic can work with if necessary.Odeen
C
2

Redis is optimized for a large number of concurrent connections. There is also discussion about multiple database connections and a connection pool implementation in the node_redis module.

Is it possible to reuse the redis connection if the cookies are the same? That way, someone opening many browser tabs would still be treated as a single connection.

You can use, for example, HTML5 storage on the client side to keep only one tab actively connected; the other tabs then handle communication/messages through storage events. It's related to this question.
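One way to sketch the "single active tab" idea is a heartbeat key in localStorage: a tab only opens the socket if no other tab has heartbeated recently. The key name, timeout, and function name below are illustrative, not from any library:

```javascript
// Minimal leader-election sketch for "only one tab holds the socket".
// `storage` is any localStorage-like object (getItem/setItem); the key
// name and timeout are arbitrary choices for illustration.
var HEARTBEAT_KEY = "socket-leader";
var TIMEOUT_MS = 5000;

// A tab may claim (or keep) leadership if no OTHER tab has
// written a heartbeat within the last TIMEOUT_MS milliseconds.
function tryClaimLeadership(storage, tabId, now) {
  var raw = storage.getItem(HEARTBEAT_KEY);
  var last = raw ? JSON.parse(raw) : null;
  if (last && last.tabId !== tabId && now - last.time < TIMEOUT_MS) {
    return false; // another tab is alive and leading
  }
  storage.setItem(HEARTBEAT_KEY, JSON.stringify({ tabId: tabId, time: now }));
  return true;
}
```

In the browser, the winning tab opens the socket.io connection and relays incoming messages to the other tabs by writing them into localStorage; the other tabs pick them up via `storage` events.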

Competent answered 21/4, 2011 at 10:16 Comment(1)
I thought the redis client could be stored against the session cookie, so the next call with the same cookie could reuse the redis connection and publish messages, e.g. var sessioncookie = redis.createClient();. So it is better to do it on the server side.Prow
L
1

I had this exact problem, with the extra requirement that clients must be able to subscribe to private channels, and publishes to those channels must not be sent to all listeners. I attempted to solve this problem by writing a miniature plugin. The plugin:

  • Uses only 2 redis connections, one for pub, one for sub
  • Only subscribes to "message" once total (not once every redis connection)
  • Allows clients to subscribe to their own private channels, without messages being sent to all other listening clients.

It's especially useful if you're prototyping somewhere with a redis connection limit (such as redis-to-go). SO link: https://mcmap.net/q/49253/-what-should-i-be-using-socket-io-rooms-or-redis-pub-sub

Lucilucia answered 7/6, 2013 at 1:49 Comment(2)
The main problem is that redis will eat memory for each client subscribed to a channel. If you have 50k concurrent users, it's not a good idea to implement this.Prow
Very good point. In my case it was for a game, and some client data needed to be sent to groups, and other data only to single players. But I guess it would not be a go-ahead with 50k users ^^Lucilucia
C
1

You need to remove the listener when the client disconnects.

var io = require("socket.io").listen(server),
    buffer = [];

var redis = require("redis");

var subscribe = redis.createClient();  

io.on('connection', function(client) {

    console.log(client.request.headers.cookie);

    subscribe.get("..", function (err, replies) {

    });

    var redis_handler = function(channel,message) {

        var msg = { message: [client.sessionId, message] };
        buffer.push(msg);
        if (buffer.length > 15) buffer.shift();
        client.send(msg);
    };

    subscribe.on("message", redis_handler);


    client.on('message', function(message){
    });

    client.on('disconnect', function(){
        subscribe.removeListener('message', redis_handler);
        //subscribe.quit();
    });
});

See Redis, Node.js, and Socket.io : Cross server authentication and node.js understanding

Claptrap answered 14/2, 2014 at 2:32 Comment(0)
R
0

Using redis as a store has become much simpler since this question was asked and answered; redis support is now built into socket.io.

Note that if you are using redis because you are also using node's clustering capabilities (to utilize multiple CPUs), you have to create the server and attach the listeners inside each of the cluster forks (this is never actually explained anywhere in the documentation). The only good code example online that I have found is written in CoffeeScript, and I see a lot of people saying this kind of thing "just doesn't work"; it definitely doesn't if you do it wrong. Here's an example of "doing it right" (but it is in CoffeeScript).
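In plain JavaScript, the shape is roughly the following. This is a sketch only: it assumes the socket.io 0.9-era RedisStore API and a local redis server, and the port number is arbitrary. The point is that the server, the store, and the listeners all live inside the worker branch, never in the master:

```javascript
var cluster = require("cluster");
var http = require("http");
var os = require("os");

if (cluster.isMaster) {
  // The master only forks; it must NOT create the server
  // or attach any socket.io listeners.
  for (var i = 0; i < os.cpus().length; i++) cluster.fork();
} else {
  // Everything socket.io-related happens inside each fork.
  var sio = require("socket.io");
  var redis = require("redis");

  var server = http.createServer();
  var io = sio.listen(server);

  // RedisStore (socket.io 0.9-era API) is what shares pub/sub
  // state across the forks.
  io.set("store", new sio.RedisStore({
    redisPub: redis.createClient(),
    redisSub: redis.createClient(),
    redisClient: redis.createClient()
  }));

  io.sockets.on("connection", function (client) {
    client.send("hello from worker " + cluster.worker.id);
  });

  server.listen(3000); // all forks share port 3000 via the cluster module
}
```

Each fork gets its own server and its own three redis connections; the store is what lets a message published in one fork reach sockets held by another.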

Reynolds answered 5/8, 2013 at 15:18 Comment(0)

© 2022 - 2024 — McMap. All rights reserved.