ECONNRESET in Express.js (Node.js) with multiple requests

Given a standard Express.js setup:

const express = require('express');
const app = express();
const router = express.Router();

router.get('/test/:id', (req, res) => {
  return res.status(200).json({ hello: 'world' });
});

app.use('/api', router);

app.listen(3000, () => console.info('Up on port 3000'));

I am making 1000 requests against the endpoint, one after the other:

const fetch = require('node-fetch');
for (let i = 0; i < 1000; i++) {
  let id = Math.floor(Math.random() * 12) + 1;
  fetch(`http://localhost:3000/api/test/${id}`)
    .then(res => res.json())
    .then(data => console.log(data))
    .catch(error => console.error(error));
}

I do see the data returned; however, every now and then I see an ECONNRESET error. The number of ECONNRESET errors also varies: sometimes I get a few, sometimes a lot more. I understand the message, but I can't get my head around solving the issue behind it.

Here's a sample error:

{ FetchError: request to http://localhost:3000/api/test/8 failed, reason: connect ECONNRESET 127.0.0.1:3000
    at ClientRequest.<anonymous> (node_modules/node-fetch/lib/index.js:1345:11)
    at ClientRequest.emit (events.js:182:13)
    at Socket.socketErrorListener (_http_client.js:399:9)
    at Socket.emit (events.js:182:13)
    at emitErrorNT (internal/streams/destroy.js:82:8)
    at emitErrorAndCloseNT (internal/streams/destroy.js:50:3)
    at process.internalTickCallback (internal/process/next_tick.js:72:19)
  message:
   'request to http://localhost:3000/api/test/8 failed, reason: connect ECONNRESET 127.0.0.1:3000',
  type: 'system',
  errno: 'ECONNRESET',
  code: 'ECONNRESET' }

Note that I have tried making the request with axios and with the built-in HTTP module, all to no avail. I'm sure the issue is with how my Express app handles the requests, but I'm not sure how to fix it exactly.

Update 1:

As per the suggestion in the comment, here's the async version:

async function f() {
  const array = Array.from(Array(1000).keys());
  for (const el of array) {
    try {
      const id = Math.floor(Math.random() * 12) + 1;
      const result = await fetch(`http://localhost:3000/api/test/${id}`).then(res => res.json());
      console.log(result);
    } catch (e) {
      console.error(e);
    }
  }
}

f();

Now I am receiving occasional ECONNREFUSED messages.

Update 2:

Based on Mazki516's answer, here's the solution that works:

// previous require statements
const cluster = require('cluster');
const os = require('os');

if (cluster.isMaster) {
  // Fork one worker per CPU core.
  const cpuCount = os.cpus().length;
  for (let i = 0; i < cpuCount; i++) {
    cluster.fork();
  }

  // Respawn a worker whenever one exits.
  cluster.on('exit', worker => {
    console.log(`${worker.id} removed`);
    cluster.fork();
  });
} else {
  const app = express();
  // rest of the route definitions
  // also app.listen() etc...
}
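With this setup, each worker runs the full Express app and all workers share port 3000, so incoming connections are spread across the CPU cores instead of queuing on a single process; the exit handler replaces any worker that dies, keeping the pool at full size.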
Bluster answered 16/11, 2018 at 15:30

Comments (2):
Cacoepy: You are trying to use an async action (fetch) in synchronous code (a for loop). See this, maybe it helps.
Bluster: Updated the code; now I get ECONNREFUSED from fetch in some cases.

One of the reasons you see this is that you are making the calls in "parallel". You do start the calls one after the other, but the loop will most likely finish before the first results come back from the server, leaving 1000 async requests to the server pending at once.

You're hitting hardware/software limits; nothing is wrong with the code. If you want to build a server that can handle 1k (and many more) concurrent requests, I would take a look at the "cluster" module of Node.

Please note that when doing network work between servers, it's acceptable to use a concurrency limit (for example: up to 4 requests concurrently), as sketched below.
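A minimal sketch of such a limit, reusing node-fetch against the question's endpoint (the helper name fetchWithLimit and the batch size of 4 are illustrative choices, not from the original post): the 1000 requests go out in batches, and each batch is awaited before the next one starts.

const fetch = require('node-fetch');

// Illustrative helper: send `total` requests with at most `limit` in flight at once.
async function fetchWithLimit(total, limit) {
  const results = [];
  for (let i = 0; i < total; i += limit) {
    // Build one batch of up to `limit` requests...
    const batch = [];
    for (let j = i; j < Math.min(i + limit, total); j++) {
      const id = Math.floor(Math.random() * 12) + 1;
      batch.push(fetch(`http://localhost:3000/api/test/${id}`).then(res => res.json()));
    }
    // ...and wait for the whole batch to finish before starting the next one.
    results.push(...await Promise.all(batch));
  }
  return results;
}

fetchWithLimit(1000, 4)
  .then(all => console.log(`${all.length} responses received`))
  .catch(console.error);

This is a simple batch limiter; a sliding-window pool (for example the p-limit package) keeps the connection busier, but batching is already enough to keep the client from opening 1000 sockets at once.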

But you can always scale your server beyond one machine and handle much more traffic.

Peria answered 16/11, 2018 at 15:57

Comments (5):
Bluster: Is there a way to increase the limit? Can you also elaborate a bit more on the concurrent requests?
Peria: You can always try tweaking file descriptor limits (OS level) and timeouts (in the Express server), but those are just small fixes and not the real solution. Please take a look at the cluster module so you can quickly learn how to scale your app beyond one process (multi-core server).
Peria: Regarding concurrency, Node.js is based on an event-driven architecture, where you put tasks into a queue and wait for them to finish. Your first code snippet just created 1k requests without waiting for any of them to finish before starting the others. That's bad practice, especially when working with IO (network, filesystem). Again, if you want more "power"/"juice" ;), scale your app, and make sure you don't put too many tasks into the queue.
Bluster: Would you know how to set the concurrency limit in an Express app?
Peria: The Express concurrency limit is bound to the Node.js process's concurrency limit, which is bound to the performance of your CPU/memory/network. It's not a number you set or tweak; it depends on hardware performance (better performance, more concurrency).

I faced the same issue as you, and I was able to solve it with @lependu's comment (I can't comment there due to my reputation).

Basically, I mapped my data array inside an await Promise.all and it worked perfectly.

In my case I switched from:

for (let i = 0; i < dataSet.length; i++) {
    const object = dataSet[i];
    const objectID = await post(url, object);
    ids.push(objectID);
}

To:

await Promise.all(dataSet.map(async (object) => {
    const objectID = await post(url, object);
    ids.push(objectID);
}));
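Note that Promise.all starts every request at once (there is no concurrency limit here), and the ids are pushed in completion order, which may differ from the order of dataSet.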

Credits to @lependu - the answer in Using async/await with a forEach loop and the post How to use async/await inside loops in JavaScript

James answered 13/4, 2023 at 12:34
