A computer monitor screen contains a lot of data; a single 600x400 screen at 16 colors requires that roughly 1,000,000 bits be supplied many times a second. That demands a great deal of bandwidth from the video card and monitor, and high bandwidth costs money. Although most monitors handle low resolution screens comfortably, higher resolution screens require sending more data -- which means either greater bandwidth or fewer refreshes. If the screen isn't refreshed often enough -- 60 times a second or more -- it fades perceptibly between refreshes, causing flicker. Perception of flicker is somewhat subjective and also depends on the material being displayed. Many users find flicker annoying even at a 60Hz refresh rate and prefer 70Hz or higher.
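The arithmetic behind the 1,000,000-bit figure can be sketched quickly: 16 colors need 4 bits per pixel, and a 600x400 screen has 240,000 pixels. A small illustrative calculation (the function name and figures are this sketch's own, not from any datasheet):

```python
# Back-of-the-envelope estimate for the 600x400, 16-color example above.
import math

def bits_per_frame(width, height, colors):
    """Bits needed to describe one full screen at the given color count."""
    bits_per_pixel = math.ceil(math.log2(colors))  # 16 colors -> 4 bits
    return width * height * bits_per_pixel

frame_bits = bits_per_frame(600, 400, 16)   # 960,000 bits, roughly 1,000,000
refresh_hz = 60
bandwidth = frame_bits * refresh_hz         # bits that must move every second
print(frame_bits, bandwidth)                # 960000 57600000
```

At a 60Hz refresh that is nearly 58 million bits per second, which is why the text says bandwidth, and hence cost, rises quickly with resolution and refresh rate.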
One way to do more frequent updates without more bandwidth is to interlace the screen by transmitting all the even numbered lines in one pass and all the odd numbered lines in the next. Fading still occurs, but each fading line is flanked by two newly written lines. The effect is noticeable, but nowhere near as annoying as flicker due to slow refresh.
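The even/odd split described above can be sketched in a few lines. This is a toy illustration of the scan-out order, not any actual video hardware interface; the tiny 8-line "screen" is invented for the example:

```python
# Interlaced scan-out sketch: one "field" carries the even-numbered lines,
# the next carries the odd-numbered lines, so each pass sends only half
# the data of a full (non-interlaced) frame.
lines = list(range(8))        # line numbers of a tiny 8-line screen

even_field = lines[0::2]      # first pass:  lines 0, 2, 4, 6
odd_field = lines[1::2]       # second pass: lines 1, 3, 5, 7

print(even_field)             # [0, 2, 4, 6]
print(odd_field)              # [1, 3, 5, 7]
```

Each field refreshes at the full rate but carries half the lines, which is how interlacing doubles the apparent refresh rate without raising bandwidth.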
Video signals are usually interlaced, and video material may look best on a computer monitor when displayed in an interlaced mode at the same refresh rate as the video source, even though that rate is slow for ordinary computer material.
Another problem with interlacing is that, to optimize appearance, interlaced lines are intentionally made wider than the same lines would be if not interlaced, which slightly reduces image sharpness. Yet another problem is that horizontal lines jitter slightly.
Most users consider more expensive non-interlaced monitors to be better than interlaced monitors, but also consider interlace to be better than flicker.
Copyright 1994-2002 by Donald Kenney.