@JonCole
Last active August 29, 2024 09:23
Intro to CLR ThreadPool Growth

ThreadPool Growth: Some Important Details

The CLR ThreadPool has two types of threads - "Worker" and "I/O Completion Port" (aka IOCP) threads.

  • Worker threads are used for things like processing Task.Run(…) or ThreadPool.QueueUserWorkItem(…) calls. These threads are also used by various components in the CLR when work needs to happen on a background thread.
  • IOCP threads are used when asynchronous IO happens (e.g. reading from the network).

The thread pool provides new worker threads or I/O completion threads on demand (without any throttling) until it reaches the "Minimum" setting for each type of thread. By default, the minimum number of threads is set to the number of processors on a system.

Once the number of existing (busy) threads hits the "Minimum" number of threads, the ThreadPool will throttle the rate at which it injects new threads to one thread per 500 milliseconds. This means that if your system gets a burst of work needing an IOCP thread, it will process that work very quickly. However, if the burst of work is more than the configured "Minimum" setting, there will be some delay in processing some of the work as the ThreadPool waits for one of two things to happen: 1. an existing thread becomes free to process the work, or 2. no existing thread becomes free for 500ms, so a new thread is created.

Basically, it means that when the number of Busy threads is greater than Min threads, you are likely paying a 500ms delay before network traffic is processed by the application. Also, it is important to note that when an existing thread stays idle for longer than 15 seconds (based on what I remember), it will be cleaned up and this cycle of growth and shrinkage can repeat.
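To see where a given process stands relative to these minimums, the busy counts can be approximated from the ThreadPool APIs themselves (busy = maximum minus currently available). The following is a minimal C# sketch of that calculation; the class name and console output are just illustrative.

// Minimal sketch: compare busy thread counts against the configured minimums.
using System;
using System.Threading;

class ThreadPoolCheck
{
    static void Main()
    {
        ThreadPool.GetMinThreads(out int minWorker, out int minIocp);
        ThreadPool.GetMaxThreads(out int maxWorker, out int maxIocp);
        ThreadPool.GetAvailableThreads(out int freeWorker, out int freeIocp);

        // Busy is approximated as (max - available) for each thread type.
        int busyWorker = maxWorker - freeWorker;
        int busyIocp = maxIocp - freeIocp;

        Console.WriteLine("Cores={0}", Environment.ProcessorCount);
        Console.WriteLine("WORKER: Busy={0}, Free={1}, Min={2}, Max={3}", busyWorker, freeWorker, minWorker, maxWorker);
        Console.WriteLine("IOCP:   Busy={0}, Free={1}, Min={2}, Max={3}", busyIocp, freeIocp, minIocp, maxIocp);

        // If busy regularly exceeds Min, new work may wait ~500ms for the
        // ThreadPool to inject another thread.
        if (busyWorker > minWorker || busyIocp > minIocp)
        {
            Console.WriteLine("Busy threads exceed the configured minimums - growth throttling may apply.");
        }
    }
}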

If we look at an example error message from StackExchange.Redis (build 1.0.450 or later), you will see that it now prints ThreadPool statistics (see IOCP and WORKER details below).

System.TimeoutException: Timeout performing GET MyKey, inst: 2, mgr: Inactive, 
queue: 6, qu: 0, qs: 6, qc: 0, wr: 0, wq: 0, in: 0, ar: 0, 
IOCP: (Busy=6,Free=994,Min=4,Max=1000), 
WORKER: (Busy=3,Free=997,Min=4,Max=1000)

In the above example, you can see that for the IOCP thread pool there are 6 busy threads and the system is configured to allow a minimum of 4 threads. In this case, the client would have likely seen two 500 ms delays because 6 > 4. Note that these ThreadPool usage stats are for the entire client app and are not only the threads used by StackExchange.Redis.

Note that StackExchange.Redis can hit timeouts if growth of either IOCP or WORKER threads gets throttled.

Recommendation:

Given the above information, we strongly recommend that customers set the minimum configuration value for IOCP and WORKER threads to something larger than the default value. We can't give one-size-fits-all guidance on what this value should be because the right value for one application will be too high/low for another application. This setting can also impact the performance of other parts of complicated applications, so each customer needs to fine-tune this setting to their specific needs. A good starting place is 100, then test and tweak as needed.

How to configure this setting:

For ASP.NET applications, set the minimum thread counts via the <processModel> configuration element in machine.config. For other applications (or where that element is not available, such as Azure WebSites), call ThreadPool.SetMinThreads(...) programmatically at startup.

Important Notes:

  • The value specified in this configuration element IS a per-core setting. For example, if you have a 4-core machine and want your minIOThreads setting to be 200 at runtime, you would use <processModel minIoThreads="50"/> (see the machine.config sketch after this list).
  • If you are running inside Azure WebSites, this setting is not exposed through the configuration options; you will need to use the programmatic API (ThreadPool.SetMinThreads) from within the Application_Start method in Global.asax.
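For reference, here is a sketch of what the per-core machine.config setting described above might look like on a 4-core machine targeting 200 IOCP threads at runtime. The numbers are illustrative only, and autoConfig typically must be set to "false" for the explicit values to take effect:

<!-- machine.config: per-core values; with 4 cores, minIoThreads="50" yields 200 at runtime -->
<system.web>
  <processModel autoConfig="false" minIoThreads="50" minWorkerThreads="50" />
</system.web>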

Important Note: The value specified when calling SetMinThreads IS NOT a per-core setting. If you want the setting to be 200 at runtime, pass 200 to this API regardless of the number of cores the machine has.
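For example, a minimal sketch of the programmatic call (the 200/200 values are illustrative, not a recommendation):

// SetMinThreads takes absolute values, not per-core values.
// It returns false if the request is rejected (e.g. above the current maximums).
bool changed = ThreadPool.SetMinThreads(workerThreads: 200, completionPortThreads: 200);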

@JeremyWeir

Hey Jon,

Wondering if you could help me think about this correctly.

I thought I understood how adjusting the min threads would help with timeouts on thread pool growth. After reviewing our error logs I found the highest Busy values for IOCP and WORKER pools and set the minimums to be slightly higher values. About an hour after deploying, we received this exception...

System.TimeoutException: Timeout performing EVAL, inst: 1, mgr: Inactive, err: never, queue: 3, qu: 0, qs: 3, qc: 0, wr: 0, wq: 0, in: 106, ar: 0, IOCP: (Busy=60,Free=940,Min=100,Max=1000), WORKER: (Busy=192,Free=32575,Min=500,Max=32767)

I would have thought it would be impossible for this error to show fewer busy threads than the minimum. Do you know if it doesn't actually create the thread pool with the minimum number of threads, and instead just treats the minimum as a floor it won't drop below once it has scaled up to that level?

Cheers,
Jeremy

@JonCole
Author

JonCole commented Feb 22, 2016

Busy can be less than Min. The system doesn't keep Min threads around at all times - it is just willing to quickly scale up to Min without any delays. After it hits Min, it will start throttling the growth.

In your case, you hit a timeout, but it is not likely to be because of thread pool growth throttling - something else is causing the timeout. For instance, client-side CPU usage spiking to 100% (which means threads have delays between access to the CPU). See https://gist.github.com/JonCole/db0e90bedeb3fc4823c2 for more details.

@SkyKicker

Awesome post! This solved our problem with massive numbers of timeouts with Azure Redis Cache using the StackExchange.Redis library.

@marsen

marsen commented Nov 17, 2016

System.TimeoutException: Timeout performing GET MyKey, inst: 2, mgr: Inactive,
queue: 6, qu: 0, qs: 6, qc: 0, wr: 0, wq: 0, in: 0, ar: 0,
IOCP: (Busy=6,Free=994,Min=4,Max=1000),
WORKER: (Busy=3,Free=997,Min=4,Max=1000)

Hi, I want to know what those parameters mean
(inst, mgr, queue, qu, qs, qc, wr, wq, in, ar).
Is there any article or document you can share with me?
Thanks

@RemiBou

RemiBou commented Jun 26, 2017

OK, now we know when we have to increase the thread pool size. How do we know that it's too big? In other words, what prevents me from setting it to 1000?

@runxc1

runxc1 commented Oct 3, 2017

I always love documentation with language like "You should..." What isn't clear to me is whether you should call ThreadPool.SetMinThreads in the Application_Start method or not. It says to see below, and then below it only shows how to set this outside of ASP.NET. Can you do the same within ASP.NET when running on Azure Websites?

@amrithyerramilli

@runxc1 - Yes, you can do this in the Application_Start method for an ASP.NET website running on Azure Websites.

I have done this and have seen timeouts drop drastically. This is on an ASP.NET MVC 4 application running on a Premium tier in Azure App Service.

Here's sample code that you can adapt for your application.

// In Global.asax.cs
// Requires: using System.Configuration; using System.Threading;
// 'Log' is the application's logger (e.g. log4net).

protected void Application_Start()
{
    int currentMinWorker, currentMinIOC;
    // Get the current settings.
    ThreadPool.GetMinThreads(out currentMinWorker, out currentMinIOC);
    Log.DebugFormat("Application_Start : Current configuration value for IOCP = {0} and WORKER = {1}",
        currentMinIOC, currentMinWorker);

    // Read the desired minimums from appSettings, falling back to the current values.
    int workerThreads = string.IsNullOrEmpty(ConfigurationManager.AppSettings["WORKER_THREADS"])
        ? currentMinWorker
        : Convert.ToInt32(ConfigurationManager.AppSettings["WORKER_THREADS"]);
    int iocpThreads = string.IsNullOrEmpty(ConfigurationManager.AppSettings["IOCP_THREADS"])
        ? currentMinIOC
        : Convert.ToInt32(ConfigurationManager.AppSettings["IOCP_THREADS"]);

    // Change the minimum number of worker threads and minimum asynchronous I/O completion threads.
    if (ThreadPool.SetMinThreads(workerThreads, iocpThreads))
    {
        // The minimum number of threads was set successfully.
        Log.DebugFormat(
            "Application_Start : Minimum configuration value set - IOCP = {0} and WORKER threads = {1}",
            iocpThreads, workerThreads);
    }
    else
    {
        // The minimum number of threads was not changed.
        Log.Debug("Application_Start : The minimum number of threads was not changed");
    }
}

The above is adapted from this MSDN doc

@zhangruiskyline

Looks like increasing minthreads can also create significant contention problems:
https://docs.microsoft.com/en-us/dotnet/api/system.threading.threadpool.setminthreads?view=netframework-4.7.2#remarks

So we need to increase minthreads to solve timeouts, but increasing the thread limit can cause performance degradation. Sounds like a dilemma - any suggestions?

@3rzx

3rzx commented Feb 18, 2019

Same question as RemiBou; in addition, I use ThreadPool.SetMaxThreads(...) to limit the number of threads. Is there any principle for setting MaxThreads and MinThreads?

@Leonardo-Ferreira

Look, here's the thing. Each application is built uniquely and behaves uniquely. THERE IS NO ONE-SIZE-FITS-ALL solution. You are a developer, and it is expected of you to figure out what the best number is FOR EACH APPLICATION.

IF you decide to set minThreads to 1 million, what might occur is that in the case of a surge of requests, the CPU will spend a lot more time SWITCHING between threads than actually working on responding to the requests! The point of diminishing returns changes for each particular implementation.

IF you set it too low, whenever you see a request surge you might time out, because there won't be available threads to handle the requests, despite the processor being at 50% load.

It is up to you to figure it out. If you're dealing with a monolith with a ton of different types of loads, sorry, you're screwed. If you have a nice microservices app, you win! I, myself, mio, jo, io, eu, am on the screwed side of the curve. I chose to set 200 threads on a 4-core machine... sometimes I get overrun, sometimes I put out timeouts... but, on average, it works OK...

@JonCole
Author

JonCole commented Nov 6, 2019

One suggestion to help would be to use code like this to monitor your actual thread pool usage over time. The goal is to track your busy count, keeping it less than minThreads by some reasonable margin. As the busy count approaches or surpasses the minThreads setting, then you should re-evaluate the value you use.
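A minimal sketch of such a monitor (assuming a simple System.Threading.Timer and console output; swap in your own logger and interval) could look like this:

// Minimal sketch of a periodic thread pool usage monitor.
using System;
using System.Threading;

static class ThreadPoolMonitor
{
    private static Timer _timer;

    public static void Start()
    {
        // Log usage every 10 seconds; the callback itself runs on a worker thread.
        _timer = new Timer(_ => LogUsage(), null, TimeSpan.Zero, TimeSpan.FromSeconds(10));
    }

    private static void LogUsage()
    {
        ThreadPool.GetMinThreads(out int minWorker, out int minIocp);
        ThreadPool.GetMaxThreads(out int maxWorker, out int maxIocp);
        ThreadPool.GetAvailableThreads(out int freeWorker, out int freeIocp);

        Console.WriteLine(
            "IOCP: (Busy={0},Free={1},Min={2},Max={3}), WORKER: (Busy={4},Free={5},Min={6},Max={7})",
            maxIocp - freeIocp, freeIocp, minIocp, maxIocp,
            maxWorker - freeWorker, freeWorker, minWorker, maxWorker);
    }
}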

@Leonardo-Ferreira

Leonardo-Ferreira commented Nov 6, 2019

Keep in mind that UNIX-based systems DO NOT have IOCP threads! If you're using .NET Core, developing on Windows and deploying to a Linux distro, results might vary. When you query for IOCP threads on macOS, for example, it will report 1000, but no matter what kind of async operation you perform (disk read, database query, network call), that number will not change.

@avparuch

Hi Jon, this article seems (I could be wrong) to be geared toward the CLR. What are your suggestions for CoreCLR or .NET Core?

@wbrianwhite

This article from Jan 2016 seems geared towards the CLR? 😄 .NET Core 1.0 was released in June 2016.

@dayokesola

Thanks!!

@dili91

dili91 commented Feb 8, 2024

Thanks for this precious resource 🙌
A couple of questions:

  1. Is this a valid resource for recent .NET Core applications?
  2. Does the throttling problem mentioned for too-low minimum values affect both worker and IOCP threads, or just the latter? I'm also interested because I'm running .NET Core on Linux 😅 and I'm pretty sure the IOCP-specific comments might not apply in my case.
