I want to get real-time updates about MongoDB database changes in Node.js.
A single MongoDB change stream sends update notifications almost instantly. But when I open multiple (10+) streams, there are massive delays (up to several minutes) between database writes and notification arrival.
Here is how I set up a change stream:
let cursor = collection.watch([
  {$match: {"fullDocument.room": roomId}},
]);
cursor.stream().on("data", doc => {...});
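For context, here is a minimal sketch of how the notification delay can be measured on the listening side; the connection URL, database and collection names, and the createdAt timestamp carried by each inserted document are assumptions made for the example, not necessarily my exact instrumentation:

const { MongoClient } = require("mongodb");

// Sketch: measure the time between a document's client-side insert timestamp
// (createdAt, an assumed field) and the arrival of its change notification.
async function watchRoom(roomId) {
  const client = await MongoClient.connect("mongodb://localhost:27017");
  const collection = client.db("test").collection("events");

  const cursor = collection.watch([
    {$match: {"fullDocument.room": roomId}},
  ]);

  cursor.stream().on("data", change => {
    const delayMs = Date.now() - change.fullDocument.createdAt;
    console.log(`room=${roomId} delay=${delayMs} ms`);
  });
}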
I tried an alternative way to set up a stream, but it's just as slow:
let cursor = collection.aggregate([
  {$changeStream: {}},
  {$match: {"fullDocument.room": roomId}},
]);
cursor.forEach(doc => {...});
An automated process inserts tiny documents into the collection while collecting performance data.
Some additional details:
- Number of open stream cursors: 50
- Write speed: 100 docs/second (batches of 10 using insertMany; see the sketch after this list)
- Runtime: 100 seconds
- Average delay: 7.1 seconds
- Largest delay: 205 seconds (not a typo, over three minutes)
- MongoDB version: 3.6.2
- Cluster setup #1: MongoDB Atlas M10 (3-member replica set)
- Cluster setup #2: DigitalOcean Ubuntu droplet + single-instance MongoDB deployment in Docker
- Node.js CPU usage: <1%
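For reference, a sketch of roughly what the write side looks like; the room distribution, the createdAt field, and the pacing loop are simplifying assumptions for illustration:

// Sketch of the write load described above: batches of 10 documents via
// insertMany, about 10 batches per second, for roughly 100 seconds.
// The room spread and createdAt timestamp are assumptions for the example.
async function writeLoad(collection) {
  for (let second = 0; second < 100; second++) {             // ~100 seconds of runtime
    for (let batch = 0; batch < 10; batch++) {               // ~10 batches per second
      const docs = Array.from({ length: 10 }, (_, i) => ({
        room: "room-" + ((batch * 10 + i) % 50),             // spread writes across the 50 watched rooms
        createdAt: Date.now(),                               // used to measure delay on the listening side
      }));
      await collection.insertMany(docs);                     // batches of 10, as described above
    }
    await new Promise(resolve => setTimeout(resolve, 1000)); // crude 1-second pacing
  }
}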
Both setups produce the same issue. What could be going on here?
Comments:
- […] room, nothing changed. – Shelton
- It's estimated that after 1,000 streams you will start to see very measurable performance drops. Why there is no global change stream option, to avoid having so many cursors floating around, is not clear; I think it's something that should be looked at for future versions of this feature. Already today, many MongoDB use cases, specifically in the multi-tenant world, might have more than 1,000 namespaces on a system, which would make the performance drop problematic.
- percona.com/blog/2017/11/22/… – Neural
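Following up on the comment about the missing global change stream option: one way to keep the server-side cursor count at one is to open a single collection-wide stream and dispatch events to per-room handlers in application code. The handler registry below is an assumption made for this sketch; nothing like it is built into the driver.

// Sketch: one collection-wide change stream, fanned out to per-room handlers
// in application code, so only a single server-side cursor is open.
const handlersByRoom = new Map();

// Register interest in a room without opening another change stream cursor.
function subscribe(roomId, handler) {
  if (!handlersByRoom.has(roomId)) handlersByRoom.set(roomId, []);
  handlersByRoom.get(roomId).push(handler);
}

function startGlobalStream(collection) {
  const cursor = collection.watch();   // single cursor, no per-room $match
  cursor.stream().on("data", change => {
    const room = change.fullDocument && change.fullDocument.room;
    (handlersByRoom.get(room) || []).forEach(handler => handler(change));
  });
}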