How Does a Node.js Server Serve the Next Request If the Current Request Involves a Huge Computation?
Solution 1:
How does a Node.js server serve the next request if the current request involves a huge computation?
It doesn't - if that computation happens on the main thread and is not divided into smaller parts.
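For illustration, here is a minimal sketch (the /heavy route, the loop bound, and port 3000 are made up for this example) of a handler that holds the single JavaScript thread, so no other request or I/O callback can run until it finishes:

```js
const http = require('http');

http.createServer((req, res) => {
  if (req.url === '/heavy') {
    let sum = 0;
    // Synchronous loop: the event loop is stalled until it completes.
    for (let i = 0; i < 5e9; i++) sum += i;
    res.end(`heavy done: ${sum}\n`);
  } else {
    // This cheap route cannot be answered while the loop above is running.
    res.end('fast\n');
  }
}).listen(3000);
```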
To have a chance of serving other requests during a CPU-intensive task, you need to either:
- break up your computation into parts and run them with setImmediate (process.nextTick also works, but its callbacks run before pending I/O, so it can starve the event loop if used repeatedly) - see the sketch after this list
- use an external process for that task and call it like any other external program or service, over HTTP, TCP, or IPC, by spawning a child process, or via a queue system, pub/sub, etc.
- write a native add-on in C++ and use threads for that
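As a rough sketch of the first option (the function name, chunk size, and numbers are illustrative, not from the original answer), the work is sliced into pieces and setImmediate schedules the next piece, so the event loop can service other callbacks in between:

```js
function sumTo(n, callback) {
  const CHUNK = 1e6; // iterations per slice of work
  let i = 0;
  let sum = 0;

  function doChunk() {
    const end = Math.min(i + CHUNK, n);
    for (; i < end; i++) sum += i;

    if (i < n) {
      // Yield to the event loop before continuing with the next slice.
      setImmediate(doChunk);
    } else {
      callback(sum);
    }
  }

  doChunk();
}

// Usage: other requests can be served between the ~1000 slices of this call.
sumTo(1e9, (result) => console.log('sum:', result));
```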
What's important is that the stack needs to unwind often in your V8 thread so that the event loop has a chance to handle events as frequently as possible. And keep in mind that if a long computation takes 10 seconds and you divide it into 1000 smaller parts, your server will still be blocked from serving new requests or handling any other I/O or event 1000 times, for about 10ms each time.
If you have a lot of CPU-heavy operations, I would strongly recommend moving them out of the process that serves the requests, not only because they block the event loop, but also because in that case you want to utilize all of your cores at the same time. It is optimal to have as many processes (or threads) doing the CPU-heavy work as there are cores in your CPU (or possibly more with hyper-threading), and to keep all of your I/O-bound operations in a separate process that doesn't do CPU-heavy work itself.
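One way to do that, assuming the file layout and message shape shown here (they are not from the original answer), is to fork a child process for the heavy work and keep the HTTP process free for I/O:

```js
// worker.js -- runs in its own process, with its own event loop and V8 instance
process.on('message', ({ n }) => {
  let sum = 0;
  for (let i = 0; i < n; i++) sum += i; // heavy work happens here, not in the server
  process.send({ sum });
});

// server.js
const http = require('http');
const { fork } = require('child_process');

http.createServer((req, res) => {
  const worker = fork('./worker.js'); // one child per request for simplicity; a pool scales better
  worker.on('message', ({ sum }) => {
    res.end(`sum: ${sum}\n`);
    worker.kill();
  });
  worker.send({ n: 1e9 });
}).listen(3000);
```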
Solution 2:
Single-threaded doesn't mean that requests are scheduled strictly first come, first served. The event loop interleaves requests as their callbacks become ready, so ordering alone isn't much of a problem. The overall system will still slow down, though, when individual requests take too long to process.
And for that, Node has a solution:
https://nodejs.org/api/cluster.html
What this does is let you spawn multiple instances of your app, all sharing the same port, so if a small fraction of your requests take too long, the other child processes in the cluster can still respond to subsequent requests.
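A minimal sketch along the lines of the linked documentation (the port and response body are placeholders): the primary process forks one worker per CPU core, and the workers share the listening port, so a request stuck in one worker doesn't stop the others from answering:

```js
const cluster = require('cluster');
const http = require('http');
const os = require('os');

if (cluster.isPrimary) { // cluster.isMaster on older Node versions
  // Fork one worker per CPU core; each worker is a full copy of this script.
  for (let i = 0; i < os.cpus().length; i++) {
    cluster.fork();
  }
} else {
  // All workers listen on the same port; connections are distributed among them.
  http.createServer((req, res) => {
    res.end(`handled by worker ${process.pid}\n`);
  }).listen(3000);
}
```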