What is the difference between ThreadPoolExecutor and ThreadPool?
A verbatim copy of my StackOverflow answer, since I'd rather point people here than to yet another source of user-contributed content for OpenAI.
`multiprocessing.dummy.ThreadPool` is a copy of the `multiprocessing.Pool` API that uses threads rather than processes. This leads to some weirdness, since threads and processes are very different things, including the fact that it returns an `AsyncResult` type which only it understands.
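For illustration, here is a minimal sketch of the older `ThreadPool` style; the worker function and pool size are illustrative choices, not part of the original answer:

```python
# Sketch of the multiprocessing.dummy.ThreadPool (Pool-style) API.
from multiprocessing.dummy import ThreadPool


def fetch(n):
    # Stand-in for some I/O-bound work (e.g. a network call).
    return n * 2


with ThreadPool(4) as pool:
    async_result = pool.apply_async(fetch, (21,))  # returns an AsyncResult
    print(async_result.get(timeout=5))             # blocks until ready -> 42
    print(pool.map(fetch, [1, 2, 3]))              # Pool-style map -> [2, 4, 6]
```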
`concurrent.futures.ThreadPoolExecutor` is a subclass of `concurrent.futures.Executor`, a newer and simpler API developed with both processes and threads in mind, and so it returns a common `concurrent.futures.Future`.
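The same work with `concurrent.futures`, again as a minimal sketch using the same illustrative worker:

```python
# Sketch of the concurrent.futures.ThreadPoolExecutor API.
from concurrent.futures import ThreadPoolExecutor, as_completed


def fetch(n):
    return n * 2


with ThreadPoolExecutor(max_workers=4) as executor:
    future = executor.submit(fetch, 21)     # returns a concurrent.futures.Future
    print(future.result(timeout=5))         # -> 42

    futures = [executor.submit(fetch, n) for n in [1, 2, 3]]
    for f in as_completed(futures):         # Futures compose with as_completed, wait, ...
        print(f.result())
```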
Broadly speaking, both do the same job, but `concurrent.futures.ThreadPoolExecutor` does it better.
References:
From the `multiprocessing.dummy` documentation:

> `multiprocessing.dummy` replicates the API of `multiprocessing` but is no more than a wrapper around the `threading` module.
>
> In particular, the `Pool` function provided by `multiprocessing.dummy` returns an instance of `ThreadPool`, which is a subclass of `Pool` that supports all the same method calls but uses a pool of worker threads rather than worker processes.
From the `multiprocessing.dummy.ThreadPool` documentation:

> A `ThreadPool` shares the same interface as `Pool`, which is designed around a pool of processes and predates the introduction of the `concurrent.futures` module. As such, it inherits some operations that don't make sense for a pool backed by threads, and it has its own type for representing the status of asynchronous jobs, `AsyncResult`, that is not understood by any other libraries.
>
> Users should generally prefer to use `concurrent.futures.ThreadPoolExecutor`, which has a simpler interface that was designed around threads from the start, and which returns `concurrent.futures.Future` instances that are compatible with many other libraries, including `asyncio`.
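As a sketch of that `asyncio` compatibility mentioned in the documentation (the worker and event-loop plumbing here are illustrative, not taken from the docs):

```python
# Running blocking work on a ThreadPoolExecutor from asyncio code.
import asyncio
from concurrent.futures import ThreadPoolExecutor


def blocking_io(n):
    return n * 2


async def main():
    loop = asyncio.get_running_loop()
    with ThreadPoolExecutor() as executor:
        # run_in_executor bridges the executor's Future into an awaitable asyncio future.
        result = await loop.run_in_executor(executor, blocking_io, 21)
        print(result)  # -> 42


asyncio.run(main())
```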