Unexpected latency in response time at Node server
Problem statement-

We are using a router between internet clients and downstream services. The router (service) is written in Node.js. Its responsibility is to pass the internet client's request to the corresponding downstream service and return the response to the internet client. We are facing some delay at the router level.

Library used for http-proxy-

https://github.com/nodejitsu/node-http-proxy

node-http-proxy Example-

Explaining with a sample example. We are facing the issue in the 99th-percentile case:

[Image: load-test response-time results]

We did load/performance testing with 100 concurrent connections.

As per the results, up to the 95th percentile, the response time looks great to the internet client.

But at the 99th percentile, the downstream service still responds in the expected time (~250 ms), while the router takes 10 times longer than expected (~2500 ms).

Service information-

Both the router and the downstream service are in the same region and the same subnet, so this delay is not because of the network.

Possibilities of this delay-

  1. Some threads are blocked at the Node service level; that's why it is not able to listen for downstream service responses.
  2. It is taking more time in DNS lookup.
  3. Node-level thread counts are low; that's why it is not able to listen for all incoming responses from the downstream service.

To analyse this-

We tweaked the configurations below:

keepAlive, maxSockets, maxFreeSockets, keepAliveMsecs, log level. Please check the Node configuration for the http/https agent - http agent configuration

Code snippet of node service -

var httpProxy = require('http-proxy');
var http = require('http');
var https = require('https');
var nconf = require('nconf');
var domain = require('domain'); // note: the domain module is deprecated

// Must be set before the libuv threadpool is first used
process.env.UV_THREADPOOL_SIZE = nconf.get('UV_THREADPOOL_SIZE') || 4;

// nconf.get() takes a config key, not a value; the numbers here are
// fallback defaults (key names are illustrative)
var agent = new https.Agent({
    maxSockets: nconf.get('MAX_SOCKETS') || 25,
    keepAlive: true,
    maxFreeSockets: nconf.get('MAX_FREE_SOCKETS') || 10,
    keepAliveMsecs: nconf.get('KEEP_ALIVE_MSECS') || 5000
});

var proxy = httpProxy.createServer({ agent: agent });
var requestTimeout = parseInt(nconf.get('REQUEST_TIMEOUT'), 10);

Questions-

  1. I am new to Node services. It would be great if you could help me tune the above configurations with the right values. If I missed any configuration, please let me know.
  2. Is there any way to intercept network traffic on the router machine that would help me analyse the above-mentioned 10x delay in router response time?
  3. If you know any (network-level) profiling tool that can help me dive deeper, please share it with me.
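For question 2, one common approach (an editorial suggestion, assuming a Linux router host with root access; the host address and port are illustrative placeholders) is to capture the router↔downstream traffic with tcpdump and inspect timestamps in Wireshark:

```shell
# Capture all traffic between the router and the downstream service.
# Replace 10.0.0.5 and 443 with the real downstream host and port.
# -w writes a pcap file; open it in Wireshark and compare the timestamp
# of the downstream's last response packet against the timestamp of the
# router's reply to the client to locate the extra ~2250 ms.
tcpdump -i any "host 10.0.0.5 and port 443" -w router-trace.pcap
```

Because capture happens at the kernel level, it times packets independently of the Node event loop, which is exactly what is suspected of stalling.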

[Update 1]

Found one interesting link - war-story.

If I missed any required information here, please ask me to add it.

Auguste answered 14/11, 2017 at 20:33 Comment(15)
can you replace your router with nginx and check againPiety
can you also increase your ulimit, open filesPiety
@Piety how this question is related to ulimit ?Auguste
a normal process only get 1024 open files limit, maybe you are exceeding itPiety
you are also using https, nodejs is not good at doing computationPiety
@Piety Means, every process is using some files, and if we hit with high traffic volume, then this limit may cause problem. Correct me, if I am wrong ?Auguste
Http proxy means you will be using 2 ports per connectionPiety
You are also only setting max sockets as 25. But you are testing 100 concurrent connections.Piety
there is too many variables in your test. That make providing an answer difficult. Make sure you are not throttling the proxy. Then test and verifyPiety
@Piety Means, per request, 2 files will open (as per open files limit concept) ? Why 2 ports required per connection ? [Asking as, I am new in node js world]Auguste
As it is acting as proxy. one socket for receive and one for sendPiety
everything @Piety said! definitely let your load balancer/proxy handle https!End
@End Currently, I am adding "current time" in header of request API for router service, and downstream service. Then, will see difference at 2 level : when requesting from router to downstream service, and when responding by downstream service to router service. Then, it will help us to dive deep more.Auguste
Try to remove agent.maxSockets which is default set to infinity and agent.maxFreeSockets which is default set to 256Huesman
@Dafuck That will cause socket hang issue. Tested with default setting also.Auguste

I suspect the http-proxy module is the issue. If it handles requests synchronously, this can cause JavaScript to wait for a back-end response before it continues with the next request in the queue. I suggest switching to https://www.npmjs.com/package/http-proxy-async to see if this fixes the issue.

Eventual answered 22/11, 2017 at 13:9 Comment(1)
The package you mentioned is also the "http-proxy" package: github.com/nodejitsu/node-http-proxy. I am already using the same one.Auguste
