Multithreading Basics

Introduction

Threads are the basic unit to which an operating system allocates CPU time. Each thread maintains exception handlers, a scheduling priority and a set of structures that the OS uses to save the thread context until it is scheduled. The thread context includes all the information that the thread requires to seamlessly resume execution including the thread's set of CPU registers and stack in the address space of the hosting process.

While there are many advantages to using threads, developers must be aware that threading does have disadvantages as well. For example:

Multiple threads often access the same resource, and this creates problems in the form of conflicts, deadlocks, and race conditions. To avoid these situations, access to shared resources must be controlled or synchronized. Resources that require synchronization often include:

  1. System resources such as communication ports.
  2. Resources shared by multiple processes, such as file handles.
  3. Resources of a single application (like global fields) accessed by multiple threads.

With multithreaded applications you must also concern yourself with deadlocking. Deadlocking involves two other terms: locking and blocking. Locking means a thread T acquires a resource that no other thread can access, usually via a critical-section mechanism such as Monitor.Enter. Blocking means a thread T enters an efficient wait state and stops running while awaiting an event that will kick-start it again; threads often block with Monitor.Wait. Deadlocking occurs when thread T1 locks resource R1 and then blocks awaiting resource R2 to become available, while at the same time thread T2 locks resource R2 and then blocks awaiting resource R1. T1 is now waiting for T2 to release R2, and T2 is waiting for T1 to release R1. Neither thread can continue, and the application deadlocks.
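The T1/T2 scenario above can be sketched in code. This is an illustrative demo (the class and member names are invented, not from the original text): two threads acquire the same two locks in opposite order, which is exactly the pattern that can deadlock. The demo joins with a timeout and uses background threads so it terminates either way:

```csharp
using System;
using System.Threading;

public class DeadlockSketch
{
    static readonly object R1 = new object();
    static readonly object R2 = new object();

    // T1 locks R1 first, then tries to lock R2
    public static void Thread1()
    {
        lock (R1)
        {
            Thread.Sleep(50);   // widen the window for the deadlock
            lock (R2) { Console.WriteLine("T1 acquired R1 then R2"); }
        }
    }

    // T2 locks R2 first, then tries to lock R1 - the opposite order
    public static void Thread2()
    {
        lock (R2)
        {
            Thread.Sleep(50);
            lock (R1) { Console.WriteLine("T2 acquired R2 then R1"); }
        }
    }

    public static void Main()
    {
        Thread t1 = new Thread(new ThreadStart(Thread1));
        Thread t2 = new Thread(new ThreadStart(Thread2));

        // Background threads so a deadlock cannot hang the process
        t1.IsBackground = true;
        t2.IsBackground = true;
        t1.Start();
        t2.Start();

        // Join with a timeout so the demo reports instead of hanging
        bool done = t1.Join(2000) & t2.Join(2000);
        Console.WriteLine(done ? "Both threads finished" : "Deadlock: threads are stuck");
    }
}
```

Acquiring the two locks in the same order in both threads removes the deadlock.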

Multithreading defects are very difficult to isolate, reproduce, and eliminate. Traditionally, writing well-behaved multithreaded applications is difficult and requires a great deal of skill and discipline. .NET offers a rich set of synchronization objects that aim to simplify component concurrency management.

In general, using the ThreadPool class is the easiest way to handle multiple requests for relatively short tasks that will not block other threads and when you do not expect any particular ordering of scheduling. However, there are a number of reasons to create your own threads:

Multithreading in .NET

Multithreading is available to all .NET languages including Managed C++, C#, and VB.NET. Multithreading capabilities in the .NET Framework class library are encapsulated in the System.Threading namespace. .NET's automatic garbage collection is itself an example of multithreading: the runtime provides a garbage-collector thread that reclaims dynamically allocated memory that is no longer needed. (As a performance tip, set an object reference to null when it is no longer needed; this lets the garbage-collector thread determine at the earliest possible moment that the object can be collected.)

.NET provides two synchronization approaches: automatic synchronization and manual synchronization.

The CLR provides three strategies to synchronize access to static and instance methods and instance fields:

The following table summarizes these categories by showing what synchronization support is provided for methods and fields:

Category                  | Global Fields | Static Fields | Static Methods | Instance Fields | Instance Methods | Specific Code Blocks
No synchronization        | No            | No            | No             | No              | No               | No
Synchronized contexts     | No            | No            | No             | Yes             | Yes              | No
Synchronized code regions | No            | No            | Only if marked | No              | Only if marked   | Only if marked
Manual synchronization    | Yes, manual   | Yes, manual   | Yes, manual    | Yes, manual     | Yes, manual      | Yes, manual
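Of these, manual synchronization is the approach used most often in everyday code. As a minimal sketch (the Counter class and counts are illustrative, not from the original text), C#'s lock statement, which wraps Monitor.Enter/Monitor.Exit, guards a shared field:

```csharp
using System;
using System.Threading;

public class Counter
{
    readonly object m_lock = new object();
    int m_nCount = 0;

    // Manual synchronization: every access to m_nCount goes through
    // the same lock, so concurrent increments cannot be lost
    public void Increment() { lock (m_lock) { m_nCount++; } }
    public int  Count      { get { lock (m_lock) { return m_nCount; } } }
}

public class CounterDemo
{
    public static void Main()
    {
        Counter counter = new Counter();
        Thread[] threads = new Thread[4];

        for (int i = 0; i < threads.Length; i++)
        {
            threads[i] = new Thread(new ThreadStart(delegate()
            {
                for (int j = 0; j < 10000; j++)
                    counter.Increment();
            }));
            threads[i].Start();
        }

        foreach (Thread t in threads)
            t.Join();

        // Without the lock the total would often be less than 40000
        Console.WriteLine(counter.Count);
    }
}
```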

For example, the following class will have only its instance fields and methods synchronized, because it has been decorated with the [Synchronization] attribute:

[Synchronization]
public class MyClass
{
    ...
}

Working with Threads

.NET threads are the managed-code representation of the underlying threads of the operating system. In .NET on Windows, .NET threads map one-to-one to Win32 native threads (this mapping could change in the future, for example to use fibers). For each thread, the OS allocates the following:

The OS is currently responsible for managing threads: thread scheduling, thread context switches, and thread manipulation requests such as suspending, resuming, and sleeping. .NET exposes some of the native properties of Win32 threads, such as priority, to developers, but .NET also associates managed-code properties with each thread, such as state, security principal, name, unique ID, and many others.

.NET threads are represented by the Thread class in the System.Threading namespace. As expected, the Thread class provides various methods and properties to control the managed thread. Note that calling methods on the Thread class, be they static or instance methods, is always done on the stack of the calling thread and not on the stack of the thread represented by the Thread object. The following introduces some of the common properties of class Thread:

Creating Threads

To create a new thread, create a new instance of class Thread and associate it with its thread method. The new Thread object will execute the designated thread method on a new thread. This new thread terminates when the thread method returns (there are other ways to terminate a thread as well). The thread method can be either static or instance, public or private, on any object. The only requirement is that the method has this exact signature:

void <ThreadFunctionName>();        // Returns void and takes no parameters

You associate a Thread object with a thread function using a dedicated delegate called ThreadStart. This association must be established in Thread's constructor:

Thread thread = new Thread( new ThreadStart(MyThreadFunction) );

The thread is created in the Unstarted state (thread.ThreadState == ThreadState.Unstarted). To start the thread, simply call Start:

thread.Start();

Thread.Start is a non-blocking operation, meaning that control returns immediately to the client that started the thread, even though it may be some time before the new thread actually starts running.

Thread Methods

A thread method can do whatever you want it to do. Typically, however, it will contain some sort of a loop where the thread performs some finite amount of work in each iteration and then checks for some condition, letting it know whether to do another iteration or to terminate:

void MyThreadMethod()
{
    while( <some condition> )
    {
        // Do work ...
    }
}

The condition is usually some external event telling the thread whether its work is done or not. This condition is usually changed in another thread which means that the condition must be handled in a thread-safe manner.
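As a sketch of such a loop (the Worker class and its flag are illustrative, not from the original text), a volatile boolean flag set from another thread is one simple thread-safe way to express the condition:

```csharp
using System;
using System.Threading;

public class Worker
{
    // volatile guarantees the loop always sees the latest value
    // written by another thread
    volatile bool m_bStopRequested = false;

    public void RequestStop() { m_bStopRequested = true; }

    // Typical thread method: a finite amount of work per iteration,
    // then a thread-safe check of the loop condition
    public void MyThreadMethod()
    {
        while (!m_bStopRequested)
        {
            // Do work ...
            Thread.Sleep(10);
        }
    }
}

public class WorkerDemo
{
    public static void Main()
    {
        Worker worker = new Worker();
        Thread thread = new Thread(new ThreadStart(worker.MyThreadMethod));
        thread.Start();

        Thread.Sleep(100);       // let the worker run a few iterations
        worker.RequestStop();    // flip the condition from this thread
        thread.Join();           // returns once the worker notices the flag
        Console.WriteLine("Worker stopped");
    }
}
```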

Passing Thread Parameters

The thread method signature does not allow any parameters to be passed. The proper way to pass parameters to the thread method is to make it a member function of a class whose properties can be used to hold the parameter values. The thread method (i.e., the member function) can then read those properties to retrieve the parameter values:

public class MyThreadClass
{
    // Data members
    string  m_strName;
    int     m_nID;

    // Properties
    public string Name
    {
        set { m_strName = value;}
        get { return m_strName; }
        
    }
    public int ID
    {
        set { m_nID = value;}
        get { return m_nID; }
        
    }

    // Thread function
    private void ProcessData()
    {
        // Get relevant parameters
        string name = Name;
        int    id   = ID;

        // ...

    } 
}
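A client might use the class like this. The sketch below repeats a condensed version of the class with ProcessData made public so the client can bind a ThreadStart to it (the names and values are illustrative):

```csharp
using System;
using System.Threading;

public class MyThreadClass
{
    string m_strName;
    int    m_nID;

    public string Name
    {
        set { m_strName = value; }
        get { return m_strName;  }
    }
    public int ID
    {
        set { m_nID = value; }
        get { return m_nID;  }
    }

    // Thread function (public here so the client can bind a ThreadStart to it)
    public void ProcessData()
    {
        // Get the relevant parameters via the properties
        Console.WriteLine("Processing '" + Name + "' with id " + ID);
    }
}

public class ClientDemo
{
    public static void Main()
    {
        // Set the "parameters" through properties before starting the thread
        MyThreadClass ob = new MyThreadClass();
        ob.Name = "Orders";
        ob.ID   = 42;

        Thread thread = new Thread(new ThreadStart(ob.ProcessData));
        thread.Start();
        thread.Join();
    }
}
```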

Blocking Threads

Class Thread provides several methods to block the execution of the associated thread. These include suspending the thread, sleeping, and waiting for another thread to terminate. This section explores the various available blocking operations:

Suspending & Resuming

The Thread.Suspend and Thread.Resume methods are used to suspend and resume threads, respectively. Anybody can call Suspend on a Thread object, including the thread itself. Obviously, only clients on other threads can call Resume (a suspended thread cannot resume itself).

Suspend is non-blocking in the sense that control returns immediately to the caller (even if the caller is the thread being suspended), and the thread is actually suspended later, usually at the next safe point. A safe point is a point in the code where it is safe for garbage collection. Recall that when garbage collection takes place, .NET suspends all running threads so that it can compact the heap, move objects around, and patch client-side references. The JIT compiler identifies the points in code that are safe for suspending a thread (such as returning from a method call or starting the next iteration of a loop). When Suspend is called, the thread will therefore actually suspend once it reaches the next safe point.

The bottom line is that suspending a thread is not an instantaneous operation. Suspend and Resume are sometimes used to synchronize the execution of one thread with others. This usage is not recommended because you cannot determine when the suspension will actually take place. If you need to synchronize the execution of threads with each other, use the provided .NET synchronization objects such as Monitor and the WaitHandle-derived classes like Mutex, ManualResetEvent, and AutoResetEvent.

Sleeping

Thread.Sleep is used to put a thread to sleep for a specified amount of time. Because it is a static method that acts on the current thread, a thread can only put itself to sleep:

... 

public void MyThreadFunction()
{
    ...
    Thread.Sleep( 100 );            // milliseconds
}

Thread.Sleep is a blocking call, meaning that the thread enters the sleep state immediately and control returns to the thread only after the sleep period has elapsed. Any thread that calls Thread.Sleep willingly relinquishes the remainder of its allocated CPU time slice, even if the sleep period is less than the remainder of the slice. Consequently, calling Thread.Sleep with a timeout of zero is a way of forcing a thread context switch:

Thread.Sleep( 0 );     // Force a thread context switch

You could also put a thread to sleep indefinitely using the Infinite static constant of class Timeout:

Thread.Sleep( Timeout.Infinite );        // Sleep infinitely

Sleeping infinitely is useless; it would be better to terminate the thread altogether. If you need to block the thread until some event takes place, then again use the provided .NET synchronization objects such as Monitor and the WaitHandle-derived classes like Mutex, ManualResetEvent, and AutoResetEvent.

Traditionally, putting a thread to sleep was used to cope with race conditions. A race condition occurs when thread T1 needs another thread T2 to complete a task or reach a certain stage in processing, but T1 proceeds as if T2 had already done so. In a poor design, T1 has some processing to do and this keeps it busy until T2 is ready. Occasionally, T1 will finish before T2 is ready, and this causes the race condition. Using Thread.Sleep here is not appropriate because it does not solve the underlying problem, which is the lack of proper synchronization between T1 and T2. Therefore, avoid putting threads to sleep and use the .NET synchronization objects instead.

Spinning while waiting

Class Thread provides another sleep-like operation called Thread.SpinWait. The static Thread.SpinWait method causes the calling thread to wait the specified number of iterations; the thread is never added to the queue of waiting threads. As a result, the thread busy-waits without relinquishing the remainder of its CPU time slice. The .NET documentation does not define what an iteration is, so SpinWait may take different amounts of time on different machines:

const int THOUSAND = 1000;
Thread.SpinWait( THOUSAND );

Thread.SpinWait does not replace Thread.Sleep but is intended as an advanced optimization technique. For example, if you know your thread is waiting for a resource that will become available very soon, it may be more efficient to spin and wait rather than use Thread.Sleep or a .NET synchronization object, which actually performs a context switch (a very expensive OS operation).

Using Thread.SpinWait is questionable at best. What happens if the resource is not available at the end of the Thread.SpinWait call? Or if the OS pre-empts your thread because its allocated time slot has elapsed? Or if another thread with a higher priority is running? In general, it is best to use deterministic programming (.NET synchronization objects) and avoid optimization techniques.

Joining 

The Thread.Join method allows the calling thread to block until another thread terminates. In the following code, the caller blocks until the ProcessingThread thread terminates:

public void WaitForProcessingThreadToDie(Thread ProcessingThread)
{
    // Wait for the ProcessingThread thread to die. Code that called WaitForProcessingThreadToDie will now block
    ProcessingThread.Join();

    // ProcessingThread thread has died. Perform appropriate actions
    ...
}

Thread.Join will always return regardless of the cause of death - whether the thread terminated naturally by returning from the thread method, or unnaturally by being aborted or by throwing an exception.

Thread.Join is particularly useful when an application shuts down. When an application starts to shut down, it typically asks all its worker threads to terminate and then waits for those threads to die. The standard way is to call Join on each worker thread:

public void ShutDownApplication()
{
    workerThread1.Join();
    workerThread2.Join();
    workerThread3.Join();
    ...
}

Thread.Interrupt, on the other hand, can be used to rudely awaken a sleeping or waiting thread. Calling Thread.Interrupt unblocks a sleeping or waiting thread (such as a thread that called Join) and throws a ThreadInterruptedException in the interrupted (unblocked) thread. If the code of the unblocked thread does not catch the ThreadInterruptedException, the thread will terminate. Calling Thread.Interrupt on a thread that is neither sleeping nor waiting will cause the thread to throw a ThreadInterruptedException the next time it tries to sleep or wait.

Again, do not rely on using Thread.Interrupt to synchronize threads with each other. Use .NET synchronization objects instead.
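For completeness, here is a minimal sketch of Thread.Interrupt unblocking a sleeping thread (the Sleeper method is illustrative, not from the original text):

```csharp
using System;
using System.Threading;

public class InterruptDemo
{
    // Illustrative method: blocks forever until interrupted
    public static void Sleeper()
    {
        try
        {
            Thread.Sleep(Timeout.Infinite);   // thread enters WaitSleepJoin
        }
        catch (ThreadInterruptedException)
        {
            // Thread.Interrupt unblocked the sleep by throwing this exception.
            // Catching it keeps the thread from terminating.
            Console.WriteLine("Sleeper was interrupted");
        }
    }

    public static void Main()
    {
        Thread thread = new Thread(new ThreadStart(Sleeper));
        thread.Start();

        Thread.Sleep(100);     // give the thread time to block
        thread.Interrupt();    // rudely awaken it
        thread.Join();
    }
}
```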

Aborting Threads

The Thread.Abort method is used to forcibly terminate a thread. It can be called either by the thread itself or by other threads; either way, calling Thread.Abort throws a ThreadAbortException in the thread being aborted. ThreadAbortException is a special exception: even if the thread method catches and handles it, .NET will rethrow ThreadAbortException to terminate the thread:

public void MyThreadFunction()
{
    try
    {
        // Do work here
    }
    catch
    {
        // handle any exceptions. ThreadAbortException will be re-thrown automatically by .NET
        // even if it was handled in a catch block
    }
}

Thread.Abort has two overloaded versions:

public void Abort();
public void Abort(object stateinfo);

With the second overload you can provide a generic stateinfo parameter, which can carry any application-specific information to the aborted thread. The aborted thread can access the stateinfo object via the ThreadAbortException.ExceptionState property, provided the thread uses exception handling. The following program illustrates aborting a thread:

// Application-defined class carrying information to the aborted thread
public class ThreadStateInfo
{
    public string strMessage;
    public int    nAbortingThreadID;
}

public class MyClass
{
    // Thread function
    public void DoSomething()
    {
        try
        {
            int i = 0;
            while (true)
                i = i - i;
        }
        catch( System.Threading.ThreadAbortException ex )
        {
            // Print the time this thread was actually aborted
            Trace.WriteLine( "Time worker thread was aborted: " + DateTime.Now.ToString() );

            // Get the state info object
            ThreadStateInfo stateinfo = (ThreadStateInfo)ex.ExceptionState;
            Trace.WriteLine( "This thread was aborted by thread " + stateinfo.nAbortingThreadID + " cause: " + stateinfo.strMessage );
        }
    }
}

private void button1_Click(object sender, System.EventArgs e)
{
    // Create and start a thread
    MyClass ob = new MyClass();
    Thread thread = new Thread( new ThreadStart( ob.DoSomething ));
    thread.Start();

    // Now abort a thread and wait until it is aborted
    ThreadStateInfo stateinfo = new ThreadStateInfo();
    stateinfo.strMessage = "Thread aborted based on user request";
    stateinfo.nAbortingThreadID = Thread.CurrentThread.GetHashCode();
    Trace.WriteLine( "Time thread was requested to abort in main thread: " + DateTime.Now.ToString() );
    thread.Abort( stateinfo );
    thread.Join();
    Trace.WriteLine( "Time thread was aborted in the main thread: " + DateTime.Now.ToString() );
}

Output:

Time thread was requested to abort in main thread: 05/07/2004 08:52:19
Time worker thread was aborted: 05/07/2004 08:52:25
This thread was aborted by thread 14 cause: Thread aborted based on user request
Time thread was aborted in the main thread: 05/07/2004 08:52:25

If Thread.Abort is called while the thread is blocked (sleeping or waiting on a .NET synchronization object), .NET will unblock the thread and abort it. However, you cannot abort a suspended thread: you will get an exception on the side that attempted the abort, and the suspended thread will be terminated without being allowed to handle the exception. It is worth mentioning that Thread.Abort has a counter-abort method called Thread.ResetAbort that can be called in the thread's ThreadAbortException catch block. Thread.ResetAbort prevents .NET from rethrowing ThreadAbortException at the end of the catch block.

In general, terminating a thread with Thread.Abort is not recommended. First, it forces a thread to perform a non-graceful exit, and even though you can place all cleanup code in a finally block, you typically want your thread to exit gracefully rather than clean up resources inside exception handlers. Second, nothing prevents the thread from cancelling the abort with Thread.ResetAbort. If you want to terminate a thread, do so using the .NET synchronization objects, for example by setting an event (see WaitHandle-derived objects).
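A minimal sketch of that ResetAbort behavior (the Stubborn method is illustrative, not from the original text):

```csharp
using System;
using System.Threading;

public class ResetAbortDemo
{
    // Illustrative thread method that refuses to be aborted
    public static void Stubborn()
    {
        try
        {
            while (true)
                Thread.Sleep(10);
        }
        catch (ThreadAbortException)
        {
            // Cancel the pending abort; without this call .NET rethrows
            // ThreadAbortException at the end of the catch block
            Thread.ResetAbort();
        }

        // Only reachable because ResetAbort cancelled the abort
        Console.WriteLine("Thread survived the abort");
    }

    public static void Main()
    {
        Thread thread = new Thread(new ThreadStart(Stubborn));
        thread.Start();

        Thread.Sleep(100);
        thread.Abort();     // the thread cancels this with ResetAbort
        thread.Join();      // returns when Stubborn finishes normally
    }
}
```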

Killing Threads

As mentioned previously, you should avoid calling Thread.Abort to terminate a thread. Instead, in each iteration the thread should check a flag to determine whether it needs to do another iteration or return from the thread method. The code that follows provides a template for killing a thread. But before examining that code block, recall the .NET pattern for cleaning up resources:

To properly clean up resources the following pattern is used (throughout .NET Framework):

The MyWorkerThread class below uses the pattern above to clean up managed and unmanaged resources. The overall idea of this threading class is to encompass the thread function plus a KillThread method used to terminate the thread function on demand. The basic idea behind KillThread is simple: set a member variable that the thread function checks on every iteration to see whether it should terminate. If the variable m_bEndLoop is true, the thread terminates; otherwise, the thread continues with the next iteration. The only complexity arises from the fact that the MyWorkerThread class will be used by more than one client to run separate thread functions, so all access to m_bEndLoop must be synchronized across threads.

public class MyWorkerThread : System.IDisposable
{
    /* Data members */
    protected bool           m_bEndLoop     = false;
    protected string         m_strName      = "";
    protected Mutex          m_mutexEndLoop = new Mutex();
    protected AutoResetEvent m_autoEvent    = new AutoResetEvent(false);
    protected Thread         m_objThread    = null;

    /* Constructors/Destructors */

    public MyWorkerThread(string strName)
    {
        Trace.WriteLine("Enter MyWorkerThread " + strName);
        m_strName = strName;
        Trace.WriteLine("Exit MyWorkerThread " + strName);
    }

    /* The presence of this destructor generates a call to Finalize, which marks
    this object as finalizable and hence places it in the finalization queue. The
    finalization queue is examined by the .NET runtime before it frees memory for
    objects. The destructor is only called if the user did not call Dispose()
    (Dispose frees both managed and unmanaged resources and suppresses
    finalization, as there is no longer any need for it). */
    ~MyWorkerThread()
    {
        // Free unmanaged resources only
        Trace.WriteLine("Enter ~MyWorkerThread " + m_strName);
        Dispose(false);
        Trace.WriteLine("Exit ~MyWorkerThread " + m_strName);
    }

    /* IDisposable implementation */

    /* Users must explicitly call this to free managed and unmanaged resources.
    Once both are freed, there is no need to call the destructor (which frees
    unmanaged resources), hence the call to GC.SuppressFinalize. */

    public void Dispose()
    {
        Trace.WriteLine("Enter IDisposable.Dispose " + m_strName);
        // Free managed and unmanaged resources
        Dispose(true);

        // Suppress finalization because Dispose(true) frees both managed and unmanaged resources
        GC.SuppressFinalize(this);
        Trace.WriteLine("Exit IDisposable.Dispose " + m_strName);
    }

    /* Properties */

    public bool EndLoop
    {
        get
        {
            // Protect access to the m_bEndLoop variable, which is shared by all
            // clients using this class to manage threads
            bool bResult = false;
            m_mutexEndLoop.WaitOne();
            bResult = m_bEndLoop;
            m_mutexEndLoop.ReleaseMutex();
            return bResult;
        }
        set
        {
            // Protect access to the m_bEndLoop variable, which is shared by all
            // clients using this class to manage threads
            m_mutexEndLoop.WaitOne();
            m_bEndLoop = (bool)value;
            m_mutexEndLoop.ReleaseMutex();
        }
    }

    /* Public interface methods */

    // Thread function
    public void Run()
    {
        Trace.WriteLine("Enter thread function " + m_strName);

        // Cache the current thread. The current thread should be cached in the
        // thread function and not in the constructor, which runs on the client's thread.
        m_objThread = Thread.CurrentThread;
        m_objThread.Name = m_strName;
        m_autoEvent.Set();

        // Can the thread run, or was it told to stop?
        while( EndLoop == false )
        {
            // Do thread functionality here
            Thread.Sleep(100);
            Trace.Write(".");
        }
        Trace.WriteLine("Exit thread function " + m_strName);
    }

    // Killing a thread
    public void KillThread()
    {
        Trace.WriteLine("Enter KillThread " + m_strName);

        // Is the thread dead anyway?
        if (m_objThread.IsAlive == false)
            return;

        // Thread is alive. Set a property indicating that the thread should be killed
        EndLoop = true;

        // This function should only return when the thread has died
        m_objThread.Join();
        Trace.WriteLine("Exit KillThread " + m_strName);
    }

    public void WaitForThreadToStart()
    {
        Trace.WriteLine( "Enter WaitForThreadToStart " + m_strName);
        m_autoEvent.WaitOne();
        Trace.WriteLine( "Exit WaitForThreadToStart " + m_strName);
    }

    /* Helpers */

    // Helper method that implements the actual cleanup
    protected virtual void Dispose(bool bFreeAll)
    {
        Trace.WriteLine("Enter Helper Dispose " + m_strName);

        // First of all, kill the thread
        KillThread();

        // Then free resources
        if (bFreeAll)
        {
            // Called by Dispose(). Free managed and unmanaged resources
            m_mutexEndLoop.Close();
        }
        else
        {
            // Called by the destructor. Free unmanaged resources only
            m_mutexEndLoop.Close();
        }
        Trace.WriteLine("Exit Helper Dispose " + m_strName);
    }
}

private void button2_Click(object sender, System.EventArgs e)
{
    // Create and start 2 new threads
    MyWorkerThread ob = new MyWorkerThread("T1");
    MyWorkerThread ob2 = new MyWorkerThread("T2");

    Thread thread1 = new Thread( new ThreadStart( ob.Run ));
    Thread thread2 = new Thread( new ThreadStart( ob2.Run ));

    thread1.Start();
    thread2.Start();

    // Now wait on threads to start
    ob.WaitForThreadToStart();
    ob2.WaitForThreadToStart();

    // Now kill the two threads
    ob.KillThread();
    ob2.KillThread();
}

Output is shown below:

Enter MyWorkerThread T1
Exit  MyWorkerThread T1
Enter MyWorkerThread T2
Exit  MyWorkerThread T2
Enter WaitForThreadToStart T1
Enter thread function T1
Exit  WaitForThreadToStart T1
Enter thread function T2
Enter WaitForThreadToStart T2
Exit  WaitForThreadToStart T2
Enter KillThread T1
Exit  thread function T1
    The thread 'T1' (0x24c) has exited with code 0 (0x0).
Exit  KillThread T1
Enter KillThread T2
Exit  thread function T2
    The thread 'T2' (0x340) has exited with code 0 (0x0).
Exit  KillThread T2
Enter ~MyWorkerThread T1
Enter Helper Dispose T1
Enter KillThread T1
Exit  Helper Dispose T1
Exit ~MyWorkerThread T1
Enter ~MyWorkerThread T2
Enter Helper Dispose T2
Enter KillThread T2
Exit  Helper Dispose T2
Exit ~MyWorkerThread T2
The program '[328] SyncObjects.exe' has exited with code 0 (0x0).

Thread States - Life Cycle of a Thread

At any time, a thread is said to be in one or more of several thread states as indicated by the thread's Thread.ThreadState property. Thread states are illustrated in the figure below:

These states are explained in the table below:

Thread State Action Notes
Unstarted
  • CLR creates a new thread.
When a new thread is created by the common language runtime, it will begin its life in the Unstarted state. A thread remains in this state until the program calls Thread.Start method.
Running
  • A thread calls Start.
  • Another thread calls Interrupt, Pulse, or PulseAll.
  • Another thread calls Resume.
The newly created thread remains in the Unstarted state until the program calls Thread.Start, at which time the thread enters the Running state and immediately returns control to the calling program.

A thread in this state actually starts running when the OS assigns a processor to the thread. At this point, the thread will start executing its ThreadStart delegate (the thread function in Win32 terminology)

AbortRequested
  • Another thread calls Thread.Abort.
The Thread.Abort function has been called but the thread has not yet received the ThreadAbortException exception which will attempt to stop it.

Note that if a thread was already blocked from a previous call to Wait and another thread calls Thread.Abort on the blocked thread, the thread will be in both AbortRequested and WaitSleepJoin states.

Stopped
  • Thread function terminates naturally.
  • The thread responds to Thread.Abort.
A Running thread enters the Stopped state when its ThreadStart delegate terminates. This can happen either because the delegate finished what it was supposed to do or because the Thread.Abort method was called, in which case a ThreadAbortException is thrown.

If there are no references to the Stopped thread, the garbage collector can remove the thread object from memory.

Note from the above figure that a thread enters the Stopped state only from the Running state indirectly.

WaitSleepJoin
  • The thread calls Thread.Sleep.
  • The thread calls Monitor.Wait on another object.
  • The thread calls Thread.Join on another thread.
If a thread encounters code it cannot execute (because of some unsatisfied condition), the thread can call Monitor.Wait to enter the WaitSleepJoin state. Once in this state, the blocked thread can return to the Running state when another thread calls Monitor.Pulse or Monitor.PulseAll.

A thread can also call Thread.Sleep to enter WaitSleepJoin for a specified period of time. A sleeping thread returns to the Running state after the sleep time expires.

If a thread calls Thread.Sleep or Monitor.Wait to enter the WaitSleepJoin state, it also returns to the Running state if another thread calls the sleeping or waiting thread's Thread.Interrupt method.

If a thread cannot continue executing until another thread terminates, then the dependent thread can call the other thread's Thread.Join method to 'join' the two threads. When the two threads are 'joined', the dependent thread leaves the WaitSleepJoin state when the other thread terminates. 

Suspended
  • The thread calls Thread.Suspend.
If a Running thread calls Thread.Suspend, it enters the Suspended state. A suspended thread returns to the Running state when another thread in the program invokes the suspended thread's Thread.Resume method.

As stated earlier, a thread can be in more than one state. For example, if a thread is blocked from a Wait call and another thread calls Abort on this blocked thread, then the blocked thread will be in both WaitSleepJoin and AbortRequested states at the same time. In this case, as soon as the thread returns from the call to Wait or is interrupted, it will receive the ThreadAbortException exception.

In general, it is not recommended to design multithreaded applications that rely on the information provided by the ThreadState property. You should design so that your code does not depend on a thread being in a particular state. In addition, by the time you retrieve a thread's state and act on it, the state could have changed. If your thread transitions between logical states specific to your application, then use .NET synchronization objects to synchronize transitioning between these states.

Thread Levels

A managed thread is either a background thread or a foreground thread. Background threads are identical to foreground threads with one exception: a background thread will not keep the managed execution environment alive once all foreground threads have exited. Note the following points:
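The difference can be sketched as follows (the worker delegate is illustrative, not from the original text): a thread marked IsBackground = true does not keep the process alive once Main returns:

```csharp
using System;
using System.Threading;

public class BackgroundDemo
{
    public static void Main()
    {
        // Illustrative worker that never returns on its own
        Thread worker = new Thread(new ThreadStart(delegate()
        {
            while (true)
                Thread.Sleep(100);
        }));

        // A background thread does not keep the managed execution
        // environment alive: when Main (the last foreground thread)
        // exits, the CLR tears the background worker down
        worker.IsBackground = true;
        worker.Start();

        Console.WriteLine("Main exiting; the background worker is abandoned");
        // With IsBackground = false the process would hang here forever
    }
}
```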

Thread-Local Storage

Thread-local storage (TLS) can be used to store data that is unique to a thread. TLS should be used when your requirements for storing thread-relative data are discovered at run time, whereas thread-relative static fields should be used if you can anticipate your exact needs for storing thread-relative data at compile time.

TLS is per-thread storage, and therefore only the owning thread can access it. Obviously, objects stored in TLS require no synchronization, as only one thread can access them. The downside is that TLS imposes thread affinity, because components or objects using TLS must execute on the same thread to access their TLS-specific variables.

TLS provides dynamic data slots that are unique to a thread and application domain combination. A slot is an object of type LocalDataStoreSlot. There are two types of TLS data slots:

To use TLS, simply call Thread.AllocateNamedDataSlot or Thread.AllocateDataSlot and then use Thread.SetData and Thread.GetData to set and get the data.
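A minimal sketch of named TLS slots (the slot name "ThreadName" and the Worker method are illustrative): each thread addresses the same slot but reads back only the value it stored itself:

```csharp
using System;
using System.Threading;

public class TlsDemo
{
    public static void Worker()
    {
        // Every thread addresses the same named slot, but each thread
        // reads back only the value it stored itself
        LocalDataStoreSlot slot = Thread.GetNamedDataSlot("ThreadName");
        Thread.SetData(slot, "value of " + Thread.CurrentThread.Name);

        Thread.Sleep(50);   // let the other thread write its own value

        Console.WriteLine(Thread.CurrentThread.Name + " sees: " + Thread.GetData(slot));
    }

    public static void Main()
    {
        Thread.AllocateNamedDataSlot("ThreadName");

        Thread t1 = new Thread(new ThreadStart(Worker));
        Thread t2 = new Thread(new ThreadStart(Worker));
        t1.Name = "T1";
        t2.Name = "T2";
        t1.Start(); t2.Start();
        t1.Join();  t2.Join();

        Thread.FreeNamedDataSlot("ThreadName");
    }
}
```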

Thread-Relative Static Fields

By default, static variables are visible to all threads in an application domain. Obviously, having static variables accessible by multiple threads can cause corruption and, more importantly, deadlocks. If you need to share static variables across threads, then access to them must be synchronized. However, when you have no need to share static variables across threads, you can use thread-relative static fields, which store data that is unique to a thread. If you know that a field of your type should always be unique to a thread and application domain combination, mark the static field with the [ThreadStatic] attribute:

public class MyWorkerThreadClass
{
    [ThreadStatic]
    protected static string m_strThreadName;

    ...
}

[ThreadStatic] can be applied only to static fields, but you can still wrap the static field with a static property. Note the following two pitfalls when using thread-relative static fields:
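The property-wrapper approach can be sketched as follows (the class and member names are illustrative). Each thread that assigns the field sees only its own copy, while a thread that never assigned it sees the default value null:

```csharp
using System;
using System.Threading;

public static class WorkerInfo
{
    [ThreadStatic]
    private static string m_strThreadName;   // each thread gets its own copy

    // Static property wrapping the thread-relative static field.
    public static string ThreadName
    {
        get { return m_strThreadName; }
        set { m_strThreadName = value; }
    }

    // Assigns the field on two threads; neither sees the other's value,
    // and the calling thread, which never assigned it, sees null.
    public static string[] Demo()
    {
        string first = null, second = null;

        Thread t1 = new Thread(() => { ThreadName = "first";  first  = ThreadName; });
        Thread t2 = new Thread(() => { ThreadName = "second"; second = ThreadName; });
        t1.Start(); t2.Start();
        t1.Join();  t2.Join();

        return new[] { first, second, ThreadName };
    }
}
```

No locking is needed anywhere in this class, because no two threads can ever observe the same copy of the field.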

Thread Priorities and Thread Scheduling

Every thread has a priority in the range from ThreadPriority.Lowest to ThreadPriority.Highest, with a default of ThreadPriority.Normal. A thread's priority can be adjusted through the Priority property, which accepts values from the ThreadPriority enumeration.

The Windows OS supports a concept called timeslicing that enables threads of equal priority to share a CPU. Without timeslicing, each thread in a set of equal-priority threads would run to completion before any of the other threads got a chance to execute (unless the running thread enters the Suspended, Stopped, or WaitSleepJoin state). With timeslicing, each thread receives a brief burst of processor time, called a quantum, during which it can execute. At the end of the quantum, the processor is taken away from the thread (even if it has not finished executing) and given to another thread of equal priority, if one is available. Thread scheduling is therefore preemptive, meaning that the OS will preempt (i.e., stop) any thread to allow other threads to run.

The job of the OS's thread scheduler is to keep the highest-priority threads running at all times and, if there is more than one highest-priority thread, to ensure that all such threads execute for a quantum in round-robin fashion. The following figure illustrates the multi-level priority queue for threads.

[Figure: multi-level priority queue for thread scheduling]

Highest-priority threads A and B each execute for a quantum in round-robin fashion until both threads finish executing. Next, thread C gets processor attention and runs to completion. Then, BelowNormal-priority threads D and E each execute for a quantum in round-robin fashion until both threads finish executing. This process continues until all threads run to completion. Note that new higher-priority threads could postpone the execution of low-priority threads indefinitely (a situation known as starvation).

In general, avoid controlling application flow by setting thread priorities. Use .NET synchronization objects to control and coordinate the flow of your multithreaded application and to resolve race conditions. Set thread priorities to values other than Normal only when the semantics of the operation require it. For example, a screen saver's thread should run at ThreadPriority.Lowest so that other operations, such as compilation, browsing, and networking, can take place.
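Adjusting a priority is a one-line assignment via the Priority property. A minimal sketch (the worker body here is a placeholder); remember that priority is only a hint to the scheduler, not a correctness mechanism:

```csharp
using System;
using System.Threading;

static class PriorityDemo
{
    // Lowers a worker's priority before starting it. Priority changes how
    // often the scheduler picks the thread to run, nothing more.
    public static ThreadPriority Run()
    {
        Thread worker = new Thread(() => Thread.Sleep(50));
        worker.Priority = ThreadPriority.BelowNormal;   // default is Normal
        worker.Start();
        ThreadPriority observed = worker.Priority;
        worker.Join();
        return observed;
    }
}
```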

Project Basic Threading demonstrates the basic programming procedures for multithreading in C# and the .NET Framework.

ISynchronizeInvoke

Assume you have a class that wraps a thread function similar to MyWorkerThread and that somewhere in the program you instantiate this class and start its thread function:

MyWorkerThread obClient = new MyWorkerThread( "BufferProcessor" );
Thread thread = new Thread( new ThreadStart( obClient.Process ) );
thread.Start();

When the thread function obClient.Process calls a method on some other object, that method runs on the same thread as obClient.Process. But what should happen if the called method must always run on some other thread, say T2? Such situations are fairly common, especially in Windows Forms, where windows and controls must always process messages on the thread that created them. For example, a Windows Form is usually created on the primary thread T1. Now assume that the form spawns a new worker thread T2 in its Load handler to retrieve and process data. After T2 has finished processing the data, it must update the GUI on the form. However, the GUI was created on thread T1 and cannot be updated directly from thread T2. The usual and recommended way is for T2 to update controls running on T1 using each control's Invoke method. And this takes us directly to the way .NET addresses this situation: ISynchronizeInvoke.

ISynchronizeInvoke provides a standard and generic mechanism for invoking methods on objects (controls, in the above example) running on other threads. Using the same example, T2 calls the control's ISynchronizeInvoke.Invoke method. The implementation of ISynchronizeInvoke.Invoke blocks the calling thread T2, marshals the call to thread T1, processes it on thread T1, marshals the return values back to thread T2, and then returns control to thread T2. ISynchronizeInvoke.Invoke accepts a delegate targeting the method to be invoked on the object residing on the other thread.

ISynchronizeInvoke also provides a standard mechanism for invoking methods asynchronously, via ISynchronizeInvoke.BeginInvoke and ISynchronizeInvoke.EndInvoke. These methods follow the general asynchronous programming model used throughout .NET.
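To make the marshaling behavior concrete, here is a minimal, illustrative implementation of ISynchronizeInvoke that queues every call onto one dedicated owner thread, roughly what a Windows Forms control does with its message loop. This is a sketch, not production code: it has no shutdown logic, and EndInvoke surfaces failures as AggregateException.

```csharp
using System;
using System.Collections.Concurrent;
using System.ComponentModel;
using System.Threading;
using System.Threading.Tasks;

public sealed class SingleThreadInvoker : ISynchronizeInvoke
{
    private readonly BlockingCollection<(Delegate method, object[] args, TaskCompletionSource<object> tcs)>
        queue = new BlockingCollection<(Delegate method, object[] args, TaskCompletionSource<object> tcs)>();
    private readonly Thread ownerThread;

    public SingleThreadInvoker()
    {
        // The owner thread plays the role of the "message loop": it is the
        // only thread that ever executes the queued delegates.
        ownerThread = new Thread(() =>
        {
            foreach (var (method, args, tcs) in queue.GetConsumingEnumerable())
            {
                try { tcs.SetResult(method.DynamicInvoke(args)); }
                catch (Exception ex) { tcs.SetException(ex); }
            }
        });
        ownerThread.IsBackground = true;
        ownerThread.Start();
    }

    // True when the caller is not the owner thread and must marshal the call.
    public bool InvokeRequired => Thread.CurrentThread != ownerThread;

    // Queues the call; Task<object> implements IAsyncResult, so it can serve
    // as the handle returned to the caller.
    public IAsyncResult BeginInvoke(Delegate method, object[] args)
    {
        var tcs = new TaskCompletionSource<object>();
        queue.Add((method, args, tcs));
        return tcs.Task;
    }

    public object EndInvoke(IAsyncResult result) => ((Task<object>)result).Result;

    // Synchronous variant: block the caller until the owner thread is done.
    public object Invoke(Delegate method, object[] args) => EndInvoke(BeginInvoke(method, args));
}
```

A caller on any other thread can then write `invoker.Invoke(myDelegate, args)`: the target method runs on the owner thread while the caller blocks until it completes, exactly the contract described above.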

Windows Forms and ISynchronizeInvoke

As mentioned previously, Windows Forms controls make extensive use of ISynchronizeInvoke. The Control class and every class derived from it depend on the underlying Windows messages and the message loop that processes them. The message loop has thread affinity, because messages to a window are delivered only to the thread that created the window. In general, always use ISynchronizeInvoke to access controls on a Windows Forms window from another thread.

.NET And COM Apartments

.NET does not have an equivalent to COM apartments. Every .NET component resides in a multithreaded environment, and it is up to the developer to provide proper synchronization. Apartments come into play only when .NET needs to interact with COM components via COM interop. The Thread.ApartmentState property can be used to instruct .NET which apartment to present to COM. By default, Thread.ApartmentState is set to ApartmentState.Unknown, which results in an MTA.

When .NET interacts with COM, if the apartment state of the managed thread matches that of the COM object, then COM will run the object on that thread. If the threading model is incompatible, then COM will marshal the call to the COM object's apartment according to the COM rules. Obviously a match in the apartment model will result in better performance.

You can also apply either the [STAThread] or [MTAThread] attribute to the entry-point method to declaratively set the apartment state. The Windows Forms application wizard automatically applies [STAThread] to the main method of a Windows Forms project. This is done for two reasons:

  1. The application might host ActiveX controls which are STA objects by definition.
  2. The application might interact with the Clipboard which still uses COM interop.