Monday, March 2, 2009

A few years ago a friend of mine bought himself a Range Rover – not the pansy-Discovery from Land Rover, but the all-out, pick a fight with a rhinoceros and win, Range Rover. When he showed it off to me, he also showed me the Range Rover Driver’s Manual. Now, there’s nothing special about most Driver’s Manuals for cars, but this one had an interesting section – a section that should have been titled:

“Things You Probably Shouldn’t Do, But If You Must, Here’s How”

Instructions on how to drive your shiny, new Range Rover through a boulder-strewn gully, across 60-degree slopes and stuff like that. You know, the kind of things that wind up on YouTube with viewers shaking their heads and muttering, “stupid people doing stupid things”.

And the really bad part is that knowing those things are possible and having the instructions right there makes it very tempting to search out a boulder-strewn gully.

Anyway, TaskManager in the .Net 4 Parallel Extensions should have a documentation section like that: the things you probably shouldn’t do, but here’s how. After the perils of parallel programming in general, this is probably the functionality that’s going to get more people into more trouble than any other, simply from the temptation of knowing it’s there.

TaskManager is the class that creates and manages the threads used by the Parallel Extension classes. I believe I saw in a blog post somewhere that it’s going to be renamed to TaskScheduler before the .Net 4 release. Its interface is extremely simple:

public class TaskManager : IDisposable
{
    public TaskManager();
    public TaskManager(TaskManagerPolicy policy);

    public static TaskManager Current { get; }
    public static TaskManager Default { get; }
    public TaskManagerPolicy Policy { get; }

    public void Dispose();
    protected virtual void Dispose(bool disposing);
}

The only things of real interest are the three properties: Current, Default and Policy.

Current and Default get the TaskManager that created the current thread and the default TaskManager for the application, respectively. Policy gets the TaskManagerPolicy that the TaskManager was created with. So, basically, you can create a TaskManager and get some values from it. What can we actually do with one?
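To make that concrete, here’s a minimal sketch of inspecting the default manager’s policy. This assumes the June 2008 CTP’s API, where these types lived in System.Threading.Tasks; the property names are the ones used in the constructor call further down, but remember this is preview API that changed before release:

```csharp
using System;
using System.Threading.Tasks; // CTP namespace for TaskManager/TaskManagerPolicy

class PolicyDump
{
    static void Main()
    {
        // The default TaskManager is created for you; its Policy describes
        // how the library will map threads onto processors.
        TaskManagerPolicy policy = TaskManager.Default.Policy;

        Console.WriteLine("MinProcessors:            " + policy.MinProcessors);
        Console.WriteLine("IdealProcessors:          " + policy.IdealProcessors);
        Console.WriteLine("IdealThreadsPerProcessor: " + policy.IdealThreadsPerProcessor);
        Console.WriteLine("MaxStackSize:             " + policy.MaxStackSize);
        Console.WriteLine("ThreadPriority:           " + policy.ThreadPriority);
    }
}
```

On a stock four-core box you’d expect IdealProcessors to report 4 and IdealThreadsPerProcessor to report 1, which is why the default setup creates one worker thread per core.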

As with the original example, this takes between four and five seconds to complete; but working on the theory that more is better, what happens if we increase the number of threads per core? Since we’re on a four-core system, let’s try 25 threads per core – that will create 100 threads, one for each iteration of the loop:

var mgr = new TaskManager(
    new TaskManagerPolicy(
        TaskManager.Default.Policy.MinProcessors,
        TaskManager.Default.Policy.IdealProcessors,
        25, // TaskManager.Default.Policy.IdealThreadsPerProcessor
        TaskManager.Default.Policy.MaxStackSize,
        TaskManager.Default.Policy.ThreadPriority
    )
);

This had a slightly negative effect on the overall performance. Why? Well, because even if we have 100 threads running, they still have to share time on four cores, so there’s going to be contention as the system tries to allocate each core’s time fairly among twenty-five threads.
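For completeness, here’s roughly how the custom manager gets attached to the work. This is a sketch only: the CTP exposed Task.Create overloads that accepted a TaskManager, but the exact overloads varied between preview drops, and DoWork here is just a hypothetical stand-in for the loop body – check the CTP documentation for your build before leaning on any of it:

```csharp
// Run 100 tasks on our 25-threads-per-core manager instead of the default.
// Task.Create(Action<object>, TaskManager) is the CTP-era overload assumed here.
var tasks = new Task[100];
for (int i = 0; i < tasks.Length; i++)
{
    int iteration = i; // capture the loop variable per task
    tasks[i] = Task.Create(_ => DoWork(iteration), mgr);
}
Task.WaitAll(tasks);
```

Note that even with a dedicated task per iteration, the scheduler is still time-slicing those 100 threads across the same four cores, which is exactly where the contention comes from.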

The next thing someone might want to change is the ThreadPriority. For this test, I started the Parallel Extensions Ray Tracer example in the background. This application uses some significant processing resources, so all cores are close to 100% utilization while it’s running, and there’s a significant performance degradation in the test application.

By changing the ThreadPriority, we can affect the performance either positively or negatively.
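Boosting the worker threads is just a matter of swapping the last constructor argument – same CTP-era constructor as before, with ThreadPriority coming from System.Threading:

```csharp
using System.Threading; // for ThreadPriority

var highPriorityMgr = new TaskManager(
    new TaskManagerPolicy(
        TaskManager.Default.Policy.MinProcessors,
        TaskManager.Default.Policy.IdealProcessors,
        TaskManager.Default.Policy.IdealThreadsPerProcessor,
        TaskManager.Default.Policy.MaxStackSize,
        ThreadPriority.Highest // instead of TaskManager.Default.Policy.ThreadPriority
    )
);
```

The same trick works in the other direction: passing ThreadPriority.BelowNormal would make your work yield to everything else on the box.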

But while setting the Highest thread priority brought some performance back to our application, it had a very negative impact on the RayTracer running in the background, dropping the framerate from 5.2 to 2.8 frames per second.

Doing this without the user’s permission is rather bad form, since it steals CPU cycles from other applications that the user may be actively working in or relying on.

Conclusion

TaskManager (or TaskScheduler, if it’s renamed to that) exists, but its power shouldn’t be used arbitrarily, because the effects can be undesirable. There are times when you might have perfectly legitimate needs to modify these defaults, but if you’re going to take your application on a ride across that 60-degree slope, make sure you know what you’re doing and test thoroughly so you don’t wind up on YouTube with people laughing at you.

1 comment:

Nice article. This essentially shows you why it's called "Parallel Extensions", as this gives you the power of true parallelism: threads on cores, core affinity, thread priorities, etc. In multi-threading these topics are largely ignored/abstracted. I think this and PLINQ are going to be great additions for scalability/performance on large systems.