Refresh Rates and Interlacing
The image you see on the monitor's screen is formed by dots of phosphorescent material. Each dot is illuminated to a specific intensity based on the video signal, and the use of red, green and blue dots allows the creation of a large number of different colors. Each time a dot of phosphor is struck by the electron beam in the CRT, it glows for a fraction of a second and then fades.
In order to maintain a stable image, the electron beam must sweep the entire surface of the screen and then return to redraw it, many times per second. This process is called refresh or refreshing of the screen. If the electron beam takes too long to return and redraw a pixel, the pixel will begin to fade in brightness and then return to full brightness when redrawn. Over the full surface of the screen, this becomes visible as a "flicker" in the image, which can be both distracting and hard on the eyes.
In order to avoid flicker, the screen image must be redrawn quickly enough that the eye cannot tell that refresh is going on. The refresh rate is the number of times per second that the screen is refreshed. The refresh rate necessary to avoid flicker varies with the individual, because it depends on the eye's ability to notice the repainting of the image many times per second; some people are more sensitive to this than others. Note that flicker also depends on the size of the monitor: it is easier to see on a larger monitor than on a small one, because more of the screen falls within the user's field of vision.
Higher refresh rates are preferred for better comfort in viewing the monitor, although above a certain point there is no appreciable difference. Certainly most people cannot tell the difference between refresh rates above 80 Hz, and I know of nobody who can distinguish rates above 100 Hz. Bear in mind also that environmental factors, including the lighting level in the room, can affect perceived flicker as well.
The maximum refresh rate possible depends on the resolution of the image. Higher resolution images generally max out at lower refresh rates than lower resolution images, because the monitor has more surface area to cover with each sweep. Support for a given refresh rate requires two things: support from the video card to generate the refresh signal at the appropriate speed, and support from the monitor to handle the refresh. Several factors determine the video card's ability to generate high refresh rates; this can be somewhat complicated, so see the companion section on refresh in the video card chapter for more details. The monitor's ability to handle a given refresh rate is a function of its design and size, and also its input bandwidth: the limit on how much information it can accept from the video card each second. The input bandwidth must be at least as high as the bandwidth being output from the video card.
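The relationship between resolution, refresh rate and bandwidth can be sketched with a back-of-the-envelope calculation. This is only an illustration: the overhead factor for blanking intervals (the time the beam spends retracing rather than drawing) is an assumed, typical value, not an exact figure for any particular mode.

```python
# Rough estimate of the video bandwidth (pixel clock) a display mode requires.
# The 1.32 blanking-overhead factor is an assumption typical of CRT timings,
# not an exact value for any specific monitor.
def required_bandwidth_mhz(h_pixels, v_pixels, refresh_hz, overhead=1.32):
    """Approximate required bandwidth in MHz for a given mode."""
    return h_pixels * v_pixels * refresh_hz * overhead / 1e6

# Example: 1024x768 at 75 Hz needs roughly 78 MHz of bandwidth,
# while the same resolution at 85 Hz needs roughly 88 MHz.
print(round(required_bandwidth_mhz(1024, 768, 75)))  # 78
print(round(required_bandwidth_mhz(1024, 768, 85)))  # 88
```

The numbers show why higher resolutions max out at lower refresh rates: every increase in either resolution or refresh rate multiplies the bandwidth the monitor and card must sustain.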
Every monitor should include as part of its specifications a list of the resolutions it supports and what the maximum refresh rate is for each resolution. This means that you don't have to concern yourself with the details of bandwidth, scan speeds and the like--all you need to do is to read the manual and determine what the maximum refresh is at whatever resolutions you want to use. The video card must be set not to attempt to refresh at a rate that exceeds the limit for the monitor.
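The "read the manual" step amounts to a simple table lookup. The sketch below models it with a hypothetical spec table; the resolutions and limits shown are illustrative only, not taken from any real monitor.

```python
# Hypothetical monitor spec: maximum refresh rate (Hz) per supported
# resolution, as a manual might list it. All values are illustrative.
MONITOR_MAX_REFRESH = {
    (640, 480): 100,
    (800, 600): 85,
    (1024, 768): 75,
    (1280, 1024): 60,
}

def mode_is_safe(width, height, refresh_hz):
    """True if the spec table lists this resolution with a high enough limit."""
    limit = MONITOR_MAX_REFRESH.get((width, height))
    return limit is not None and refresh_hz <= limit

print(mode_is_safe(1024, 768, 75))  # within the listed limit
print(mode_is_safe(1024, 768, 85))  # exceeds the listed maximum
```

This is exactly the check the video card's configuration should enforce: never select a refresh rate above the limit the monitor's specifications list for the chosen resolution.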
In fact, it's even easier than that now; to eliminate potential problems, many video cards now include setup utilities that are pre-programmed with information about popular monitors. You select a monitor and the video card automatically knows what resolutions are supported, as well as the refresh limits for each. Windows 95 extends this a step by supporting Plug and Play for monitors; you plug the monitor in and Windows will detect it, set the correct display type and choose the optimal refresh rate, all automatically.
Warning: Setting the refresh rate too high for the monitor can in theory damage it. In practice, this does not usually happen; if you set the refresh too high, the monitor appears to go "haywire" but returns to normal when the refresh rate is dropped. Still, better safe than sorry.
Since the higher the resolution, the harder it is to maintain a decent refresh rate, most monitors (and video cards) reach a point where they cannot maintain a refresh rate high enough to allow an acceptable display. Some monitors, at this point, use a technique called interlacing to cheat a bit and allow themselves to display at a higher resolution than they otherwise could. Instead of redrawing every line of the screen on each pass, in an interlaced mode the electron guns sweep alternate lines: on the first pass the odd-numbered lines are refreshed, on the next pass the even-numbered lines, and so on. This allows the refresh rate to be "doubled", because only half the screen is redrawn at a time. The usual refresh rate for interlaced operation is 87 Hz, which corresponds to 43.5 Hz of "real" refresh given the half-screen interlacing.
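The alternating-line sweep can be made concrete with a short sketch that splits a frame's scan lines into the two interlaced passes (fields):

```python
# Sketch of interlaced scanning: odd-numbered lines on one pass (field),
# even-numbered lines on the next. Scan lines are numbered from 1.
def interlaced_fields(total_lines):
    odd_field = list(range(1, total_lines + 1, 2))   # first pass
    even_field = list(range(2, total_lines + 1, 2))  # second pass
    return odd_field, even_field

odd, even = interlaced_fields(8)
print(odd)   # [1, 3, 5, 7]
print(even)  # [2, 4, 6, 8]

# An 87 Hz interlaced rate redraws each individual line only half as
# often: 87 / 2 = 43.5 times per second of "real" refresh.
print(87 / 2)  # 43.5
```

Because each individual line is redrawn at only half the quoted rate, the quoted 87 Hz figure overstates how fresh any given part of the image actually is.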
While running at 87 Hz interlaced produces less flicker than the large amount you would see running at 43.5 Hz non-interlaced, it still produces flicker compared to even a regular 60 Hz non-interlaced mode. Many people find interlaced images difficult to view; combined with the fact that interlacing is normally coupled with high resolutions that result in very small features on the screen, the result is almost always very hard on the eyes.
Next: Monitor Size