java - Multiple threads vs single thread
I am currently working on a server-client application (just for learning), and I have run into a design decision regarding the threads in this application.
Currently, I have one thread in charge of all non-blocking IO with clients. When it receives any data, it passes it to a worker thread, which creates an "instruction set" from those bytes and then acts on it accordingly. However, depending on the instruction set, it may act on any of hundreds of objects (each object will have anywhere between 2 and 12 clients which can communicate with it). I am trying to figure out whether I should handle all the instruction sets on that one worker thread, or whether each object should get its own thread that handles the instruction sets passed to it.
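To give a rough idea of that hand-off, here is a simplified sketch (the `InstructionSet` type and the decode step are just placeholders standing in for my real decoding logic):

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Simplified sketch of the IO-thread -> worker-thread hand-off.
// "InstructionSet" is only a placeholder for whatever the decoded bytes become.
public class WorkerLoop implements Runnable {

    // Placeholder for the decoded work unit; the real decoding logic is assumed.
    interface InstructionSet {
        void applyToTarget();
    }

    private final BlockingQueue<byte[]> inbound = new LinkedBlockingQueue<>();

    // Called by the non-blocking IO thread whenever it reads data from a client.
    public void submit(byte[] data) {
        inbound.offer(data);
    }

    // Hypothetical decoder: turn raw bytes into an instruction set.
    private InstructionSet decode(byte[] data) {
        return () -> { /* act on whichever object the instructions address */ };
    }

    @Override
    public void run() {
        try {
            while (!Thread.currentThread().isInterrupted()) {
                byte[] data = inbound.take();   // block until the IO thread hands something off
                decode(data).applyToTarget();   // build the instruction set and execute it
            }
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt(); // exit cleanly on shutdown
        }
    }
}
```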
My question is, at what point (if any) is it better to have many mostly idle threads waiting for data than to have one worker thread processing all the data (handling each instruction set in turn)?

If I create a separate thread for each object, I am thinking it could increase responsiveness, since once the main worker thread creates an instruction set it can simply hand it off to that object's thread and start working on the next instruction set.

However, I know that creating and managing threads has an underlying cost, since the OS has to schedule them. So if I create a thread for an object that at most 2 clients can interact with, will the cost of managing that thread outweigh the concurrency benefit, given that only 2 clients would ever use it?
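To make the per-object option concrete, this is roughly what I have in mind (simplified; the instruction set is just a `Runnable` placeholder here):

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Rough sketch of the per-object option: each object owns a single-thread
// executor, so the main worker can hand an instruction set off and move on.
public class PerObjectWorker {

    private final ExecutorService ownThread = Executors.newSingleThreadExecutor();

    // Called by the main worker thread once it has built an instruction set
    // for this object; "instructionSet" is a placeholder for the real work.
    public void handOff(Runnable instructionSet) {
        ownThread.submit(instructionSet);
    }

    public void shutdown() {
        ownThread.shutdown();
    }
}
```

With hundreds of objects this means hundreds of mostly idle threads, which is exactly the cost I am unsure about.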
As always, any advice or articles are appreciated :)
I recommend following the example set by Java EE application servers.
Keep a queue for incoming requests and a pool of handler threads. When a request comes in, the controller takes a handler thread from the pool, pulls the request off the queue, and gives it to the handler thread to process. When the thread is done, it goes back into the pool.

If the number of requests is greater than the number of handler threads, the queue holds them until a thread becomes available.
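As a rough sketch of that arrangement (the pool and queue sizes below are placeholders you would tune to your server), a `ThreadPoolExecutor` with a bounded queue gives you the controller, the pool, and the waiting queue in one object:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Sketch of the controller + handler pool + request queue described above.
public class RequestController {

    private final ThreadPoolExecutor handlers = new ThreadPoolExecutor(
            8, 8,                                   // fixed pool of handler threads (placeholder size)
            0L, TimeUnit.MILLISECONDS,
            new ArrayBlockingQueue<>(1000),         // requests wait here when all handlers are busy
            new ThreadPoolExecutor.CallerRunsPolicy()); // simple back-pressure if even the queue fills up

    // Called for each incoming request; "request" stands in for whatever needs handling.
    public void handle(Runnable request) {
        handlers.execute(request);
    }

    public void shutdown() {
        handlers.shutdown();
    }
}
```

`Executors.newFixedThreadPool` gives you the same shape with an unbounded queue; spelling out the constructor just makes the queue bound and the overflow policy explicit.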
This design gives you two benefits:
- It lets you set the size of the handler thread pool to match your server's resources
- When incoming requests exceed the pool's capacity, they are queued rather than blocked or dropped
Concurrency is your friend here; it helps keep your server scalable.