In computing, a pipeline is a set of data processing elements connected in series, where the output of one element is the input of the next. The elements of a pipeline are often executed in parallel or in time-sliced fashion; in that case, some amount of buffer storage is often inserted between elements.

Computer-related pipelines include:

- Instruction pipelines, such as the classic RISC pipeline, which are used in central processing units (CPUs) to allow overlapping execution of multiple instructions with the same circuitry. The circuitry is usually divided into stages, including instruction decoding, arithmetic, and register fetching stages, wherein each stage processes one instruction at a time.
- Graphics pipelines, found in most graphics processing units (GPUs), which consist of multiple arithmetic units, or complete CPUs, that implement the various stages of common rendering operations (perspective projection, window clipping, color and light calculation, rendering, etc.).
- Software pipelines, where commands can be written so that the output of one operation is automatically fed to the next operation. The Unix system call pipe is a classic example of this concept, although other operating systems support pipes as well.
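The core idea above, each element's output becoming the next element's input, can be sketched in Python with generators. This is an illustrative toy, not any particular system's implementation; the stage names (`numbers`, `square`, `total`) are made up for the example. Because generators are lazy, items flow through the stages one at a time, much like instructions moving through pipeline stages:

```python
def numbers(n):
    """First stage: produce the integers 0..n-1, one at a time."""
    yield from range(n)

def square(stream):
    """Middle stage: transform each item as it arrives from upstream."""
    for x in stream:
        yield x * x

def total(stream):
    """Final stage: consume the whole stream and reduce it to one value."""
    return sum(stream)

# Connect the stages in series: each one's output feeds the next.
result = total(square(numbers(5)))  # squares of 0..4 summed: 0+1+4+9+16
print(result)
```

In a Unix shell, the same pattern appears as `producer | transformer | consumer`, with the kernel's pipe buffer playing the role of the buffer storage mentioned above.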