PARALLELISM
11/9/96
Parallelism is a technique for speeding up data processing by doing simultaneously things that would otherwise be done serially. It is roughly the equivalent of assigning two people to a job previously done by one, and it has much the same advantages and disadvantages. Often it gets the job done twice as fast. Sometimes it is less effective. In the worst case, the negotiations and interlocks needed to keep the parallel entities from interfering with one another actually slow things down.

Parallelism takes several forms. Offloading moves work from a heavily loaded "bottleneck" element to a less heavily loaded one; for example, many common display functions are offloaded from the CPU to the display system. Pipelining -- introduced into mainframes in the 1960s and into Intel CPUs with the 486 -- breaks jobs into discrete stages and moves tasks through those stages much as manufactured goods move down a production line. Full parallel processing of computer instructions was introduced into mainframes in the 1970s and into PCs starting with the Pentium, where it is heavily constrained. It is much more fully developed in the Pentium Pro, Cyrix 6x86, and other CPUs of the same vintage.
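As a rough illustration of the two-people-on-one-job analogy, the sketch below (not from the original article; the prime-counting workload, the two-worker split, and all names are illustrative assumptions) times the same CPU-bound job run serially and then split across two worker processes. Starting the extra process and merging the partial results stand in for the coordination overhead that can keep the speedup below 2x.

import time
from concurrent.futures import ProcessPoolExecutor

def count_primes_in_range(lo, hi):
    # Trial-division prime counting: deliberately CPU-bound busywork.
    count = 0
    for n in range(max(lo, 2), hi):
        if all(n % d for d in range(2, int(n ** 0.5) + 1)):
            count += 1
    return count

if __name__ == "__main__":
    LIMIT = 200_000

    # Serial: one worker scans the whole range.
    start = time.perf_counter()
    serial_total = count_primes_in_range(2, LIMIT)
    serial_secs = time.perf_counter() - start

    # Parallel: the same range split between two worker processes.
    # Process startup and result collection are the "negotiations and
    # interlocks"; for a tiny job they can erase the gain entirely.
    start = time.perf_counter()
    with ProcessPoolExecutor(max_workers=2) as pool:
        halves = pool.map(count_primes_in_range,
                          [2, LIMIT // 2],
                          [LIMIT // 2, LIMIT])
        parallel_total = sum(halves)
    parallel_secs = time.perf_counter() - start

    print(f"serial:   {serial_total} primes in {serial_secs:.2f}s")
    print(f"parallel: {parallel_total} primes in {parallel_secs:.2f}s")

Note that splitting the range in half does not divide the work evenly, since larger numbers cost more to test; uneven division of labor is one of the ordinary reasons parallel execution falls short of a full 2x speedup.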
Copyright 1994-2008 by Donald Kenney.