I recently started learning about Node.js, a JavaScript runtime built on top of V8 that is known for its non-blocking I/O and impressive speed.
To my understanding, Node does not wait for I/O to respond; instead it runs an event loop (similar to a game loop) that keeps track of unfinished operations and continues/completes them as soon as the I/O responds. In the benchmarks I read, Node was compared to Apache HTTPD and came out significantly faster while using less memory.
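To make my mental model concrete, here is roughly what I mean by non-blocking I/O in Node (the file path is just a placeholder for any slow I/O operation):

```js
const fs = require('fs');

// The call returns immediately; Node notes the pending operation
// and runs the callback once the data is available.
fs.readFile('/tmp/example.txt', 'utf8', (err, data) => {
  if (err) throw err;
  console.log('read finished:', data.length, 'characters');
});

// Meanwhile the single thread is already free for other work.
console.log('readFile was started, but we did not wait for it');
```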
Now if you read about Apache, you learn that it uses one thread per connection, which supposedly slows it down significantly, and this is where my question arises:
If you compare threads to what Node does internally in its event loop, you start seeing similarities. Both are abstractions over an unfinished operation that is waiting for a resource to respond, and both check regularly whether the operation has made progress, without occupying the CPU in between (at least I assume a good blocking API sleeps for a few milliseconds before rechecking). A sketch of what I picture follows below.
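To illustrate the similarity I see, here is a runnable toy sketch of my mental picture of an event loop; `makeOperation` and `isReady` are invented for the example and are not real Node internals:

```js
// Simulated I/O operation: becomes "ready" once a deadline passes.
function makeOperation(name, durationMs) {
  const deadline = Date.now() + durationMs;
  return { name, isReady: () => Date.now() >= deadline };
}

// Toy event loop: one thread keeps cycling over *all* pending
// operations and completes whichever ones have responded.
// (A real loop would sleep or block instead of spinning like this.)
function eventLoop(pending) {
  while (pending.length > 0) {
    pending = pending.filter((op) => {
      if (op.isReady()) {
        console.log(op.name, 'finished');
        return false; // drop the completed operation
      }
      return true; // keep waiting on this one
    });
  }
}

eventLoop([
  makeOperation('query database', 30),
  makeOperation('read file', 10),
  makeOperation('call remote API', 20),
]);
```

In the thread-per-connection model, as I understand it, each of those three operations would instead get its own thread running the same check-and-wait loop for just that one operation.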
Now where is that striking, critical difference that makes threads so much worse?