I am building a Node.js application and want to understand how concurrent requests are handled.
I built a test server in which high CPU load is simulated by busy-waiting for 10 seconds. To test the behaviour, I open two browser tabs and refresh the page simultaneously.
const http = require('http');
const express = require('express');
const bodyParser = require('body-parser');

const app = express();
const server = http.createServer(app);

app.use(bodyParser.json());
app.use(bodyParser.urlencoded({ extended: false }));

app.get('*', function (req, res, next) {
    var requestTime = new Date().getTime(),
        executionTime;

    doHeavyWork(requestTime, function (error) {
        // record when the "heavy work" finished
        executionTime = new Date().getTime();
        res.send({
            requestTime: requestTime,
            executionTime: executionTime
        });
    });
});

// simulate 10 seconds of CPU load with a busy-wait loop
function doHeavyWork(requestTime, callback) {
    var sleepSeconds = 10;
    while (requestTime + sleepSeconds * 1000 >= new Date().getTime()) {}
    callback(null);
}

server.listen(1337, '127.0.0.1');
From what I have heard about Node.js, I was expecting both tabs to finish loading at roughly the same time. In reality, the tab that is refreshed first also finishes first; the other tab loads after an additional 10 seconds. So the server processes the requests one at a time instead of handling them concurrently. What am I missing here?
You are blocking the event loop with that busy while loop. Instead, why not use a setTimeout that responds asynchronously to the requesting client? Node really relies on the developer not blocking the event loop, since it is single-threaded. Anything that blocks the event loop will prevent any other operation from happening until the current operation finishes. Try to use asynchronous methods as much as possible! – Hirsute
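For illustration, here is a minimal sketch of what the comment suggests: replacing the busy-wait with a setTimeout, so the event loop stays free while the 10-second delay is pending. The route and port are kept from the question; the asynchronous doHeavyWorkAsync helper is an assumed restructuring, not code from the original post.

const express = require('express');

const app = express();

// Asynchronous stand-in for the heavy work: the timer is scheduled and the
// function returns immediately, so the event loop can serve other requests
// while the 10 seconds elapse.
function doHeavyWorkAsync(requestTime, callback) {
    const sleepSeconds = 10;
    setTimeout(function () {
        callback(null);
    }, sleepSeconds * 1000);
}

app.get('*', function (req, res) {
    const requestTime = new Date().getTime();

    doHeavyWorkAsync(requestTime, function (error) {
        res.send({
            requestTime: requestTime,
            executionTime: new Date().getTime()
        });
    });
});

app.listen(1337, '127.0.0.1');

With this version, two tabs refreshed at the same time both finish after about 10 seconds, because neither request occupies the event loop while waiting. Note that setTimeout only frees the event loop because the wait itself is idle; genuinely CPU-bound work would still block and would have to be moved off the main thread, for example with the worker_threads module or a child process.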