[Spring] 2. Analysis of Custom Thread Pools and Thread Reuse in Spring Async Interfaces
Preface
When handling high-concurrency scenarios in Spring applications, proper use of asynchronous programming and thread pool management is crucial. This article provides an in-depth analysis of Spring’s default thread pool, custom thread pools, and thread reuse mechanisms through practical code examples.
Why Use Custom Thread Pools?
When a Spring Boot application starts, it automatically configures a global task executor (`TaskExecutor`) with the default bean name `applicationTaskExecutor`. However, using Spring's default thread pool directly in production environments is not recommended, for the following reasons:
- Lack of Isolation: All asynchronous tasks share the same thread pool, causing tasks from different business modules to interfere with each other
- Difficult to Monitor: Unable to perform fine-grained thread pool monitoring and tuning for specific business scenarios
- Single Configuration: Default configuration may not meet the performance needs of all business scenarios
Best Practice: Customize thread pools based on business scenarios to achieve task isolation and fine-grained management.
Custom Thread Pool Configuration
Here’s a typical custom thread pool configuration example:
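The listing below is a minimal sketch of such a configuration, assuming a `ThreadPoolTaskExecutor` bean (the bean name `customerExecutor` and class name are illustrative) with the parameters described in the breakdown that follows:

```java
import java.util.concurrent.ThreadPoolExecutor;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;

@Configuration
public class ThreadPoolConfig {

    // Fixed-size pool of 10 threads with a bounded queue of 10 waiting tasks.
    @Bean("customerExecutor")
    public ThreadPoolTaskExecutor customerExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(10);                 // core pool size
        executor.setMaxPoolSize(10);                  // max = core -> fixed-size pool
        executor.setQueueCapacity(10);                // up to 10 queued tasks
        executor.setThreadNamePrefix("customer-t-");  // customer-t-1 ... customer-t-10 in logs
        // Explicitly keep the default AbortPolicy: tasks beyond 10 running + 10 queued are rejected
        executor.setRejectedExecutionHandler(new ThreadPoolExecutor.AbortPolicy());
        executor.initialize();
        return executor;
    }
}
```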
Configuration Breakdown:
- Core Pool Size = Maximum Pool Size = 10: Fixed-size thread pool, avoids frequent thread creation and destruction
- Queue Capacity = 10: When all 10 threads are working, up to 10 more tasks can be queued
- Custom Thread Naming: `customer-t-{number}`, convenient for log tracking and problem diagnosis
Async Interface vs Sync Interface Comparison
Async Interface Implementation (asyncQuery1)
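A minimal sketch of what such an async endpoint might look like (the controller name `GoodyController` is illustrative; the custom pool above is injected under the assumed bean name `customerExecutor`); returning a `CompletableFuture` lets Spring MVC complete the response off the Tomcat thread:

```java
import java.util.concurrent.CompletableFuture;

import org.springframework.beans.factory.annotation.Qualifier;
import org.springframework.scheduling.concurrent.ThreadPoolTaskExecutor;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class GoodyController {

    private final ThreadPoolTaskExecutor customerExecutor;

    public GoodyController(@Qualifier("customerExecutor") ThreadPoolTaskExecutor customerExecutor) {
        this.customerExecutor = customerExecutor;
    }

    // Async endpoint: the Tomcat thread returns the CompletableFuture immediately;
    // the 10-second task runs on the custom pool, and Spring MVC writes the
    // response once the future completes.
    @GetMapping("/goody/async/query1")
    public CompletableFuture<String> asyncQuery1() {
        return CompletableFuture.supplyAsync(() -> {
            try {
                Thread.sleep(10_000); // simulate a 10-second slow query
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
            return "done by " + Thread.currentThread().getName();
        }, customerExecutor);
    }
}
```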
Characteristics:
- Non-blocking: Tomcat thread is immediately released and can handle other requests
- High Throughput: Suitable for I/O-intensive tasks
- Thread Switching: Request switches between Tomcat thread and custom thread pool
Sync Interface Implementation (syncQuery1)
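Again a hedged sketch, assumed to live in the same illustrative controller with the same injected `customerExecutor`; the only difference is that the Tomcat thread parks on a `CountDownLatch` until the worker finishes:

```java
// Additional imports: java.util.concurrent.CountDownLatch,
//                     java.util.concurrent.atomic.AtomicReference

// "Fake async" endpoint: the task still runs on the custom pool, but the
// Tomcat thread blocks on a CountDownLatch until the worker finishes.
@GetMapping("/goody/sync/query1")
public String syncQuery1() throws InterruptedException {
    CountDownLatch latch = new CountDownLatch(1);
    AtomicReference<String> result = new AtomicReference<>();

    customerExecutor.execute(() -> {
        try {
            Thread.sleep(1_000); // simulate a 1-second slow query
            result.set("done by " + Thread.currentThread().getName());
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        } finally {
            latch.countDown();
        }
    });

    latch.await(); // the Tomcat thread waits here -- two threads doing one job
    return result.get();
}
```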
Characteristics:
- Blocking Wait: Tomcat thread is blocked by `CountDownLatch` and cannot handle other requests
- Resource Waste: Occupies both the Tomcat thread and a Worker thread, two threads doing the work of one
- Essentially Synchronous: Despite using a custom thread pool, the Tomcat thread waits continuously, completely failing to leverage async advantages
- Use Cases: Almost none! Better to execute directly in Tomcat thread, which also saves a Worker thread
Thread Reuse in Practice
We send 20 concurrent requests via a load-testing tool to observe how thread behavior differs between the async and sync interfaces; a sketch of such a client follows.
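The original article does not show the load-testing tool itself; as a hedged stand-in, a plain Java client like the one below is enough to fire the 20-request burst (the port 50012 is inferred from the Tomcat thread name in the observations; adjust the URL to your setup):

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Fires 20 concurrent GET requests at one endpoint and prints each response.
public class LoadTest {
    public static void main(String[] args) {
        String url = "http://localhost:50012/goody/async/query1"; // or /goody/sync/query1
        HttpClient client = HttpClient.newHttpClient();
        ExecutorService pool = Executors.newFixedThreadPool(20);

        for (int i = 0; i < 20; i++) {
            final int n = i;
            pool.execute(() -> {
                HttpRequest request = HttpRequest.newBuilder(URI.create(url)).GET().build();
                try {
                    HttpResponse<String> response =
                            client.send(request, HttpResponse.BodyHandlers.ofString());
                    System.out.printf("request %d -> %d %s%n", n, response.statusCode(), response.body());
                } catch (Exception e) {
                    System.out.printf("request %d failed: %s%n", n, e.getMessage());
                }
            });
        }
        pool.shutdown(); // queued requests still complete before the JVM exits
    }
}
```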
Async Interface Concurrency Test
Sending 20 concurrent requests to `/goody/async/query1` (each task takes 10 seconds) produces the log behavior summarized below.
Key Observations:
- Strong Concurrency: Tomcat thread (io-50012-exec-1) received 20 requests in 2 seconds, averaging 100ms per request
- Fixed Threads: Only Worker threads `customer-t-1` through `customer-t-10` appear throughout
- Thread Reuse: `customer-t-1` immediately executes the 11th task after completing the 1st task at 09:53:30 (only a 1ms interval)
- Rejection Policy: Once capacity is exceeded (10 running threads + 10 queued tasks), any further request (a 21st) would be rejected
Sync Interface Serial Execution
Sending 20 concurrent requests to `/goody/sync/query1` (each task takes 1 second) shows the tasks being processed strictly serially; the comparison with the async interface follows.
Comparative Analysis:
| Dimension | Async Interface | Sync Interface |
|---|---|---|
| Tomcat Thread | Quickly released, receives 20 requests in 2 seconds | Blocked, takes 20 seconds to process 20 requests |
| Concurrency | Handles 20 at once (10 threads + 10 queued) | Processes serially, one after another |
| Worker Thread Reuse | ✅ Exists (customer-t-1 handles the 1st and 11th tasks) | ✅ Exists (customer-t-1 handles the 1st and 11th tasks) |
| Total Time | ~20 seconds (10 seconds × 2 rounds) | ~20 seconds (1 second × 20) |
| Thread Utilization | High (Tomcat idle, Worker busy) | Low (Tomcat + Worker both occupied, doing one job) |
| System Throughput | High (Tomcat thread can handle other requests) | Low (Tomcat thread occupied) |
| Async Nature | ✅ Truly async, releases the main thread | ❌ Fake async, essentially synchronous waiting (two threads doing one job, even slower) |
Key Conclusion:
Although the sync interface also demonstrates Worker thread reuse, it essentially doesn’t leverage async advantages. Instead, it brings additional overhead:
- Tomcat thread blocked → Cannot handle other requests
- Worker thread executes → Occupies thread pool resources
- Two threads cooperating to complete one task is worse than executing it directly in the Tomcat thread, which would also save the thread-switching overhead
This approach is an anti-pattern in production environments, used only for comparison to demonstrate async advantages.
Core Mechanism of Thread Reuse
Producer-Consumer Model
Java thread pool’s thread reuse is based on the Producer-Consumer Model:
- Worker Thread Loop: Worker threads in the thread pool continuously fetch tasks from a `BlockingQueue` (see the sketch after this list)
- Task Queue: New tasks are submitted to the queue, and idle threads immediately retrieve and execute them
- Reuse Advantages: Avoids overhead of frequent thread creation and destruction (context switching, memory allocation)
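A stripped-down sketch of that loop (not the JDK's actual `ThreadPoolExecutor` implementation, which adds core/max sizing, keep-alive timeouts, and rejection handling) shows why the same worker threads keep reappearing in the logs:

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

// Minimal producer-consumer pool: N long-lived worker threads repeatedly take
// tasks from a shared BlockingQueue, so threads are reused instead of recreated.
public class MiniThreadPool {

    private final BlockingQueue<Runnable> queue = new LinkedBlockingQueue<>(10);

    public MiniThreadPool(int workers) {
        for (int i = 1; i <= workers; i++) {
            Thread worker = new Thread(() -> {
                while (true) {
                    try {
                        Runnable task = queue.take(); // blocks until a task arrives
                        task.run();                   // the same thread runs task after task
                    } catch (InterruptedException e) {
                        Thread.currentThread().interrupt();
                        return;
                    }
                }
            }, "customer-t-" + i);
            worker.start();
        }
    }

    // Producer side: submitting a task just enqueues it for an idle worker.
    public boolean submit(Runnable task) {
        return queue.offer(task); // returns false when the bounded queue is full
    }
}
```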
Similarities with IO Multiplexing
Core: Async thread pools essentially apply “multiplexing” thinking at the application layer. Although the implementation mechanisms differ, their problem-solving approach is highly similar to IO multiplexing.
Similarities
Core Idea: Using Limited Resources to Handle Massive Requests
- IO Multiplexing: 1 thread monitors N socket connections via epoll/select
- Async Thread Pool: A small number of Tomcat threads handle N concurrent requests (through quick release)
Non-blocking Mode
- IO Multiplexing: Main thread doesn’t block on a single IO operation, polls waiting for multiple IO events to be ready
- Async Thread Pool: Tomcat thread doesn’t block on time-consuming tasks, immediately returns to handle next request
Event Notification Mechanism
- IO Multiplexing: epoll notifies which socket is readable/writable
- Async Thread Pool: CompletableFuture notifies task completion
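On the thread-pool side, the “notification” is simply a completion callback on the `CompletableFuture`. A small self-contained illustration (class and variable names are placeholders):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class NotificationDemo {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(2);

        CompletableFuture
                .supplyAsync(() -> "result", pool)                           // runs on a worker thread
                .thenAccept(r -> System.out.println("task finished: " + r)); // callback on completion

        // The submitting thread continues immediately; it is "notified" via the
        // callback, much like epoll notifying that a socket has become readable.
        System.out.println("main thread keeps going");
        Thread.sleep(100); // give the demo callback time to run before shutting down
        pool.shutdown();
    }
}
```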
Essential Differences
| Dimension | IO Multiplexing | Async Thread Pool |
|---|---|---|
| Reuse Object | Reuses a thread (single thread handles multiple IO channels) | Reuses the Tomcat thread (quick release) |
| Use Case | Network IO-intensive | Mixed CPU/IO |
| Implementation Level | OS level (epoll/select) | Application level (thread pool scheduling) |
| Typical Applications | Netty, Redis, Nginx | Spring WebFlux, Traditional Web |
| Design Pattern | Reactor pattern | Producer-Consumer pattern |
Conclusion: Although underlying mechanisms differ, both are solving the core problem of “how to handle high concurrency with limited resources”. Async thread pools can be understood as multiplexing thinking implemented at the application layer.
Summary
This article reveals the importance of custom thread pools and thread reuse mechanisms by comparing async and sync interface implementations. Key points:
- ✅ Custom thread pools achieve business isolation and fine-grained management
- ✅ Async interfaces improve system throughput by releasing Tomcat threads
- ✅ Thread reuse avoids overhead of frequent thread creation and destruction
- ✅ Properly configure thread pool parameters to avoid resource waste or task rejection
- ✅ Thread reuse process can be clearly observed through thread names in logs
In actual production environments, thread pool configuration should also be tuned continuously based on monitoring metrics (thread-pool activity, queue length, rejection count, etc.).