For example, most movie projectors advance from one frame to the next 24 times each second, but each frame is illuminated two or three times before the next frame is projected, using a shutter in front of the lamp. As a result, the movie projector runs at 24 frames per second but has a 48 or 72 Hz refresh rate.
On CRT displays, increasing the refresh rate decreases flickering, thereby reducing eye strain. However, specifying a refresh rate beyond what is recommended for the display can damage it.
For computer programs or telemetry, the term is also applied to how frequently a datum is updated with a new external value from another source (for example, a shared public spreadsheet or hardware feed).
The refresh rate can be calculated from the horizontal scan rate by dividing by the number of horizontal lines and multiplying the result by 0.95 (since about 5% of the time it takes to scan the screen is spent moving the electron beam back to the top). For instance, a monitor with a horizontal scanning frequency of 96 kHz at a resolution of 1280 × 1024 results in a refresh rate of 96,000 / 1024 × 0.95 = 89 Hz (rounded down).
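As a rough illustration, the calculation above can be sketched in a few lines of Python (the function name and the 0.95 blanking factor follow the approximation in the text, not any exact specification):

```python
def refresh_rate(horizontal_scan_hz, vertical_lines, blanking_factor=0.95):
    """Approximate the vertical refresh rate from the horizontal scan rate.

    blanking_factor accounts for the roughly 5% of each frame spent
    moving the electron beam back to the top of the screen.
    """
    return horizontal_scan_hz / vertical_lines * blanking_factor

# Example from the text: 96 kHz horizontal scan at 1280 x 1024
print(int(refresh_rate(96_000, 1024)))  # -> 89 (rounded down)
```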
The closest thing liquid crystal shutters have to a refresh rate is their response time, while nearly all LCD backlights (most notably cold cathode fluorescent lamps, which commonly pulse at ~200 Hz) have a separate figure known as flicker, which describes how many times a second the backlight pulses on and off.
Different operating systems set the default refresh rate differently. Windows 95 and Windows 98 (SE) set the highest refresh rate that they believe the display supports. Windows NT-based operating systems, such as Windows 2000 and its descendants Windows XP and Windows Vista, by default set the refresh rate to the lowest supported, usually 60 Hz. The many variations of Linux usually have the user configure the display during installation and then use those preferred settings, although with XFree86 a default option is usually included. Many full-screen applications, such as games, are expected to allow the user to reconfigure the refresh rate before entering full-screen mode. Some poorly designed applications instead launch directly into full-screen mode at an out-of-range setting, forcing the user to reconfigure their video settings "blind".
Old monitors could be damaged if a user set the video card to a higher refresh rate than the monitor supported. Most current monitors instead simply display a notice that the video signal uses an unsupported refresh rate.
When the cathode ray tube was developed in the 1920s, technology limitations of the time made it difficult to run monitors at anything other than a multiple of the AC line frequency used to power the set. Producers thus had little choice but to run sets at 60 Hz in America and 50 Hz in Europe. These rates formed the basis for the NTSC (60 Hz) and PAL and SECAM (50 Hz) sets used today. This accident of history was widely perceived to give European sets an advantage, because the slower 50 Hz refresh rate gave the CRT time to scan more detail, allowing PAL sets to offer higher resolution than their NTSC counterparts (768 × 576 for PAL/SECAM versus 640 × 480 for NTSC). However, the lower scan rate can introduce more flicker on high-speed motion, so sets that use digital technology to double the refresh rate to 100 Hz are now very popular.
Another difference between the 50 Hz and 60 Hz standards is the way motion pictures (film sources, as opposed to video camera sources) are transferred or presented. 35 mm film typically runs at 24 frame/s. For 50 Hz PAL this allows film sources to be transferred easily by accelerating the film by 4%. The resulting picture is perfectly smooth, although there is a slight shift in the pitch of the audio that normally goes unnoticed. NTSC sets display both 24 frame/s and 25 frame/s material without any speed shift by using a technique called 3:2 pulldown, but at the expense of introducing unsmooth playback in the form of telecine judder.
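As a quick sanity check on the 4% figure, a minimal sketch (the semitone conversion is standard audio arithmetic, not taken from the text):

```python
import math

speedup = 25 / 24  # 24 frame/s film played at PAL's 25 frame/s
print(f"{(speedup - 1) * 100:.1f}% faster")        # -> 4.2% faster
print(f"{12 * math.log2(speedup):.2f} semitones")  # pitch rises ~0.71 semitone
```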
Unlike computer monitors, HDTV, and some DVDs, analog television systems use interlacing, which increases flicker compared to a progressive scan image at the same refresh rate. The amount of extra flicker depends largely on the motion content of the image and the brightness of the screen. Many newer televisions avoid this flicker by means of 100 Hz technology.
As movies are usually filmed at 24 frames per second, while TV sets operate at different rates, some conversion is necessary. Different techniques exist to give the viewer an optimal experience.
The combination of content production, playback device, and display device processing may also introduce unnecessary artifacts. A display device with a fixed 60 frame/s rate cannot display a 24 frame/s movie at an even, judder-free rate. Usually a 3:2 pulldown is used, giving slightly uneven motion.
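A minimal sketch of why 3:2 pulldown produces uneven motion (the numbers follow directly from the 24-to-60 ratio; the helper function is purely illustrative):

```python
# 3:2 pulldown: successive film frames are held for 3 and 2 display
# refreshes alternately, so 4 film frames fill 10 refreshes (24 -> 60).
def pulldown_counts(num_film_frames):
    return [3 if i % 2 == 0 else 2 for i in range(num_film_frames)]

counts = pulldown_counts(4)
print(counts)                                     # -> [3, 2, 3, 2]
print([round(c / 60 * 1000, 1) for c in counts])  # on-screen time in ms
# -> [50.0, 33.3, 50.0, 33.3]  the alternating durations are the judder
```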
While common multisync CRT computer monitors have been capable of running at even multiples of 24 Hz since the early 1990s, recent "120 Hz" LCD displays have been produced for the purpose of achieving smoother, more fluid motion. As 120 is an even multiple of 24, it is possible to present a 24 frame/s sequence without judder on a well-designed 120 Hz display. If the 120 Hz rate is produced by frame-doubling a 60 frame/s 3:2 pulldown signal, however, the uneven motion can still be visible.
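Continuing the sketch above, the difference between a native 120 Hz presentation and naive frame-doubling of a 3:2 signal (the doubling behaviour is assumed for illustration):

```python
# On a 120 Hz display, 24 frame/s content divides evenly: every film
# frame is held for exactly 120 / 24 = 5 refreshes -> perfectly even.
print([120 // 24] * 4)                 # -> [5, 5, 5, 5]

# Frame-doubling a 60 Hz 3:2 pulldown signal just doubles the counts,
# so the 3:2 cadence becomes 6:4 -- still uneven, judder preserved.
print([2 * c for c in [3, 2, 3, 2]])   # -> [6, 4, 6, 4]
```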
"50Hz" tv-sets (when fed with "50Hz" content) usually get a movie that is slightly faster than normal, avoiding any problems with uneven pulldown.