A pet hate of mine is end users (journalists especially) or—even worse—developers who don’t understand how process priorities are supposed to work. Often, both in the press and elsewhere, people write that increasing the process priority makes the program in question run faster. This is a fallacy.
What process priority actually controls is which task gets the processor when more than one is ready to run: a higher-priority task can preempt a lower-priority one. If you give a process a low priority and then do nothing else with the machine, it will still take the same amount of time to complete. Process priority only has an effect when something else is competing for the processor at the same time.
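To make this concrete, here is a minimal sketch of how a process lowers its own priority on a POSIX system. It uses Python's `os.nice`, which wraps the underlying `nice(2)` call: a higher "niceness" means a lower scheduling priority, and an unprivileged process can only raise its niceness, never lower it.

```python
import os

# os.nice(increment) adds `increment` to the calling process's
# niceness and returns the new value. Higher niceness means the
# scheduler favours other runnable tasks over this one; it does
# not make this process run any slower on an otherwise idle machine.
current = os.nice(0)   # an increment of 0 just reads the current niceness
lowered = os.nice(5)   # politely yield the CPU to competing tasks
print(current, lowered)
```

Note that after this call the process still gets the whole CPU if nothing else wants it; the change only matters under contention.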
The right way to use process priorities is shown in the following table:
| Type of task | Examples | Appropriate priority |
|---|---|---|
| Safety-critical real-time | Controlling a chemical plant; monitoring nuclear reactor temperature | Highest |
| Real-time | Playing audio or video files; video game rendering; web browser page rendering | High |
| Background processing | Checking for email; manipulating large data sets | Low |
Whilst the table above is very general, it makes the point clear: user interaction (for instance) should take precedence over background processing. If I, as an end user, wish to word-process on my machine whilst I wait for a 3D rendering to complete, I should be able to do so. If, on the other hand, I want the rendering to finish as quickly as possible, I can leave the machine alone. Inappropriate priority settings (such as running the 3D rendering at high priority) make the machine unusable because the interactive response time becomes unacceptable. They don't make the rendering run any faster!
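Putting that advice into practice, here is a hedged sketch of launching a CPU-bound batch job at low priority on a POSIX system, so interactive programs stay responsive. It uses `subprocess.Popen`'s `preexec_fn` hook, which runs in the child between `fork()` and `exec()`, so only the child's niceness is raised. The child command here merely reports its own niceness; substitute your real renderer or data cruncher.

```python
import os
import subprocess

# Start the batch job with its niceness raised by 10, leaving the
# parent (and every interactive program) at normal priority. The
# child process will only lose the CPU when something else wants it;
# on an idle machine it finishes just as fast as at normal priority.
result = subprocess.run(
    ["python3", "-c", "import os; print(os.nice(0))"],
    preexec_fn=lambda: os.nice(10),
    capture_output=True,
    text=True,
)
print(result.stdout.strip())  # the child's niceness, 10 above the parent's
```

On the shell the equivalent is simply `nice -n 10 render_scene`, where `render_scene` stands in for whatever long-running job you want to push into the background.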