
Programming .NET Components : Working with Threads (part 5) - Thread States, Thread Priority and Scheduling


4. Thread States

.NET manages a state machine for each thread and moves the threads between states. The ThreadState enum defines the set of states a .NET managed thread can be in:

    [Flags]
    public enum ThreadState
    {
       Aborted          = 0x00000100,
       AbortRequested   = 0x00000080,
       Background       = 0x00000004,
       Running          = 0x00000000,
       Stopped          = 0x00000010,
       StopRequested    = 0x00000001,
       Suspended        = 0x00000040,
       SuspendRequested = 0x00000002,
       Unstarted        = 0x00000008,
       WaitSleepJoin    = 0x00000020
    }

For example, if a thread is in the middle of a Sleep( ), Join( ), or wait call on one of the synchronization objects, the thread is in the ThreadState.WaitSleepJoin state. .NET throws an exception of type ThreadStateException when you try to move a thread to an inconsistent state, such as by calling Start( ) on a thread that is already in the ThreadState.Running state or by trying to abort a suspended thread (ThreadState.Suspended). The Thread class has a public read-only property called ThreadState that you can access to find the exact state of a thread:

    public ThreadState ThreadState { get; }
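For instance, a second call to Start( ) on a thread that is already running triggers this exception. Here is a minimal sketch of that scenario, in the same fragment style as the other snippets; the SomeMethod thread method is a hypothetical placeholder:

    Thread workerThread = new Thread(SomeMethod);
    workerThread.Start( );
    try
    {
       //The thread is already in the ThreadState.Running state, so this
       //second Start( ) is an inconsistent transition and .NET rejects it
       workerThread.Start( );
    }
    catch(ThreadStateException)
    {
       Trace.WriteLine("Thread was already started");
    }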

The ThreadState enum values can be bit-masked together, so testing for a given state is typically done as follows:

    Thread workerThread;
    //Some code to initialize workerThread, then:
    ThreadState state = workerThread.ThreadState;

    if((state & ThreadState.Unstarted) == ThreadState.Unstarted)
    {
       workerThread.Start( );
    }

However, by the time you retrieve the thread's state and decide to act upon it, the state may already have changed. I don't recommend ever designing your application to rely on the information provided by the ThreadState property; instead, design your code so that it doesn't depend on the thread being in a particular state. If your thread transitions between logical states specific to your application, such as beginning or finishing tasks, use .NET synchronization objects to synchronize the transitions between those states.
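For example, instead of polling ThreadState to learn whether a worker thread has finished its task, you can have the thread signal an event that clients wait on. The following is only a minimal sketch of that approach; the Worker class, DoTask( ) method, and m_TaskCompleted member are illustrative names, not part of the original text:

    public class Worker
    {
       //Signaled when the worker finishes its logical task; clients
       //wait on this event instead of polling ThreadState
       ManualResetEvent m_TaskCompleted = new ManualResetEvent(false);

       public void DoTask( )
       {
          //Perform the work, then signal completion
          m_TaskCompleted.Set( );
       }
       public void WaitForTaskCompletion( )
       {
          m_TaskCompleted.WaitOne( );
       }
    }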

The only time you might need to rely on state information is to check whether the thread is alive, which is sometimes required for diagnostics or control flow. Even then, you should use the Boolean read-only public property IsAlive instead of the ThreadState property:

    public bool IsAlive { get; }

For example, there is little point in calling Join( ) on a thread if the thread isn't alive:

    Thread workerThread;
    //Some code to start workerThread, then:
    if(workerThread.IsAlive)
    {
       workerThread.Join( );
    }
    Trace.WriteLine("Thread is dead");

5. Foreground and Background Threads

.NET defines two kinds of managed threads: background and foreground. The two thread types are exactly the same, except that .NET keeps the process alive as long as there is at least one foreground thread running, whereas a background thread doesn't keep the .NET process alive once all foreground threads have exited.

New threads are created as foreground threads by default. To mark a thread as a background thread, you need to set the Thread object's IsBackground property to true:

    public bool IsBackground { get; set; }

When the last foreground thread in a .NET application terminates, .NET shuts down the application. The .NET runtime then tries to terminate all the remaining background threads by throwing a ThreadAbortException in each. Background threads are a poor man's solution for application shutdown: instead of designing the application correctly to keep track of which threads it created (and which are still running and need to be terminated when the application shuts down), the quick-and-dirty solution is to let .NET try to terminate all the background threads for you. Normally, you shouldn't count on .NET to kill your background threads for you. You should have a deterministic, structured way of shutting down your application; in other words, do your own bookkeeping and explicitly control the lifecycle of each of your threads, taking steps to shut down all threads on exit.
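As a simple illustration, here is how you would mark a worker thread as a background thread before starting it; the SomeMethod thread method is again a hypothetical placeholder:

    Thread workerThread = new Thread(SomeMethod);
    //Once marked as a background thread, this thread no longer
    //keeps the process alive after all foreground threads exit
    workerThread.IsBackground = true;
    workerThread.Start( );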

6. Thread Priority and Scheduling

Each thread is allocated a fixed time slot to run on the CPU and assigned a priority. In addition, each thread is either ready to run or waiting for some event to occur, such as a synchronization object being signaled or a sleep timeout elapsing. The underlying operating system schedules for execution those threads that are ready to run based on the threads' priorities. Thread scheduling is preemptive, meaning that the thread with the highest priority always gets to run. If a thread T1 with priority P1 is running, and suddenly thread T2 with priority P2 is ready to run, and P2 is greater than P1, the operating system will preempt (pause) T1 and allow T2 to run. If multiple threads with the same (highest) priority are ready to run, the operating system will let each run for the duration of its CPU time slot and then preempt it in favor of another thread with the same priority, in a round-robin fashion.

The Thread class provides the Priority property of the enum type ThreadPriority, which allows you to retrieve or set the thread priority:

    public ThreadPriority Priority { get; set; }

The enum ThreadPriority provides five priority levels:

    public enum ThreadPriority
    {
       Lowest,
       BelowNormal,
       Normal,
       AboveNormal,
       Highest
    }
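For example, you can read a thread's priority and then lower it so that the thread yields to more time-critical work. Here is a minimal sketch, again assuming a hypothetical SomeMethod thread method:

    Thread workerThread = new Thread(SomeMethod);
    workerThread.Start( );

    //New threads start at ThreadPriority.Normal
    ThreadPriority priority = workerThread.Priority;
    Trace.WriteLine("Priority is " + priority);

    //Lower the priority so the thread yields to more important work
    workerThread.Priority = ThreadPriority.BelowNormal;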

New .NET threads are created by default with a priority of ThreadPriority.Normal. Developers often abuse thread-priority settings as a way to control the flow of a multithreaded application, to work around race conditions. Tinkering with thread priorities generally isn't an appropriate solution, though, and it can lead to some adverse side effects and other race conditions. For example, say you have two threads that are involved in a race condition. By increasing one thread's priority in the hope that it will preempt the other and thus win the race, you often just decrease the probability of the race condition occurring (rather than eliminating it altogether), because the thread with the higher priority can still be switched out or blocked. In addition, you must consider whether it makes sense to always run that thread at a higher priority. Granting it a high priority could paralyze other aspects of your application, because it won't only preempt the thread with which you're trying to avoid the race condition. You could, of course, increase the priority only temporarily, but then you would address just that particular occurrence of the race condition and remain exposed to future occurrences.

You may be tempted to always keep that thread at a high priority and also increase the priorities of other affected threads, but this is also problematic. Often, increasing one thread's priority causes an inflation of thread priorities all around, because the normal balance and time-sharing governed by the operating system is disturbed. The result can be a set of threads, all with the highest priority, still involved in race conditions. The major adverse effect now is that .NET itself suffers, because many of its internal threads (such as threads used to manage memory, execute remote calls, and so on) are suddenly competing with your high-priority threads.

A further complication when manipulating priority settings is that preemptive operating systems (such as Windows) may dynamically change threads' priorities to resolve priority-inversion situations. A priority inversion occurs when a thread with a lower priority runs instead of a thread with a higher priority. Because .NET threads are currently mapped to the underlying Windows threads, these dynamic changes propagate to the managed threads as well. Consider, for example, three managed threads, T1, T2, and T3, with respective priorities of ThreadPriority.Lowest, ThreadPriority.Normal, and ThreadPriority.Highest. T3 is waiting for a resource held by T1. T1 is ready to run to release the resource, except that T2 is now running, preventing T1 from executing. As a result, T2 prevents T3 from running, and priority inversion takes place because T3 has a priority greater than that of T2.

To cope with priority inversions, the operating system not only keeps track of thread priorities but also maintains a scoreboard showing who got to run and how often. If a thread is denied the CPU for a long time (a few seconds), the operating system dynamically boosts that thread's priority to a higher priority, lets it run for a couple of time slots with the new priority, and then resets the priority back to its original value. In the previous scenario, this allows T1 to run, release the resource T3 is waiting for, and then regain its original priority. Once the resource is available, T3 will be ready to run and will preempt T2.

The point of this example and the earlier arguments is that you should avoid trying to control the application flow by setting thread priorities. Instead, use .NET synchronization objects to control and coordinate the flow of your application and to resolve race conditions. Set threads' priorities to values other than ThreadPriority.Normal only when the semantics of the application require it. For example, if you develop a screen saver, its threads should run at priority ThreadPriority.Lowest so that other background operations (such as compilation, network access, or number crunching) can take place without being affected by the screen saver.
