The National Television System Committee was established in 1940 by the United States Federal Communications Commission (FCC) to resolve the conflicts that had arisen between companies over the introduction of a nationwide analog television system in the United States. In March 1941, the committee issued a technical standard for black-and-white television that built upon a 1936 recommendation made by the Radio Manufacturers Association (RMA). Technical advances in the vestigial sideband technique had made it possible to increase the image resolution broadcast to consumer televisions. The NTSC compromised between RCA's desire to keep the 441-scan-line standard (already in use by RCA's NBC TV network) and Philco's desire to increase the count to between 605 and 800 scan lines; a 525-line transmission standard was selected. Other technical standards in the final recommendation were a frame rate (image rate) of 30 frames per second, each frame consisting of two interlaced fields of 262.5 lines (2:1 interlacing) for 60 fields per second, an aspect ratio of 4:3, and frequency modulation (FM), which was quite new at the time, for the sound signal.
In January 1950 the Committee was reconstituted to standardize color television. In December 1953, it unanimously approved what is now called simply the NTSC color television standard (later defined as RS-170a). The updated standard retained full backward compatibility ("compatible color") with older black-and-white television sets. Color information was added to the black-and-white image by adding a color subcarrier of 4.5 × 455/572 MHz (approximately 3.58 MHz) to the video signal. To minimize interference between the chrominance signal and the FM sound carrier, the addition of the color subcarrier also required a slight reduction of the frame rate from 30 frames per second to 30/1.001 (very close to 29.97) frames per second, and a corresponding change in line frequency from 15,750 Hz to 15,734.26 Hz.
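These relationships can be checked with a little exact arithmetic. The sketch below (Python, using exact fractions; the variable names are ours) reproduces the color subcarrier, line, and frame frequencies from the 4.5 MHz sound intercarrier:

```python
from fractions import Fraction

# NTSC color timing, derived exactly from the 4.5 MHz sound intercarrier.
SOUND_CARRIER_HZ = 4_500_000

# Color subcarrier: 4.5 MHz x 455/572, the spec's exact ratio.
subcarrier = Fraction(SOUND_CARRIER_HZ) * Fraction(455, 572)
print(float(subcarrier))        # 3579545.4545... Hz, i.e. approximately 3.58 MHz

# Line frequency: the sound carrier divided by the integer 286.
line_freq = Fraction(SOUND_CARRIER_HZ, 286)
print(float(line_freq))         # 15734.265... Hz, down from 15,750 Hz

# Frame rate: 525 lines per frame.
frame_rate = line_freq / 525
print(float(frame_rate))        # 29.97002997... frames per second
assert frame_rate == Fraction(30_000, 1001)   # exactly 30/1.001
```

Using `Fraction` rather than floating point makes the final identity exact: the color frame rate is precisely 30/1.001 frames per second, not a rounded 29.97.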
The FCC had briefly approved a different color television standard, starting in October 1950, which was developed by CBS. However, this standard was incompatible with black-and-white broadcasts. It used a rotating color wheel (a technique re-used in the first DLP projectors developed in the late 1980s), reduced the number of scan lines from 525 to 405, and increased the field rate from 60 to 144 (but had an effective frame rate of only 24 frames a second). Legal action by rival RCA kept commercial use of the system off the air until June 1951, and regular broadcasts lasted only a few months before manufacture of all color television sets was banned by the Office of Defense Mobilization (ODM) in October, ostensibly due to the Korean War. CBS abandoned its system in March 1953, and on December 17, 1953 the FCC replaced it with the NTSC color standard, which had been cooperatively developed by several companies (including RCA and Philco). The first publicly announced network TV broadcast of a program using the NTSC "compatible color" system was an episode of NBC's Kukla, Fran and Ollie on August 30, 1953, although it was viewable in color only at the network's headquarters. The first nationwide view of NTSC color came on the following January 1 with the coast-to-coast broadcast of the Tournament of Roses Parade, viewable on prototype color receivers at special presentations across the country.
The first color NTSC television camera was the RCA TK-40, used for experimental broadcasts in 1953; an improved version, the TK-40A, introduced in March 1954, was the first commercially available color TV camera. It was replaced later that year by an improved version, the TK-41, which became the standard camera used throughout much of the 1960s.
The NTSC standard has been adopted by other countries, including most of the Americas and Japan. With the advent of digital television, analog broadcasts are being phased out. NTSC broadcasts are mandated by the FCC to end in the United States on February 17, 2009.
The NTSC field refresh frequency was originally exactly 60 Hz in the black-and-white system, chosen because it matched the nominal 60 Hz frequency of alternating current power used in the United States. Matching the field refresh rate to the power source avoided wave interference which produces rolling bars on the screen. Synchronization of the refresh rate to the power incidentally helped kinescope cameras record early live television broadcasts, as it was very simple to synchronize a film camera to capture one frame of video on each film frame by using the alternating current frequency as a shutter trigger.
The figure of 525 lines was chosen as a consequence of the limitations of the vacuum-tube-based technologies of the day. In early TV systems, a master voltage-controlled oscillator was run at twice the horizontal line frequency, and this frequency was divided down by the number of lines used (in this case 525) to give the field frequency (60 Hz in this case). This frequency was then compared with the 60 Hz power-line frequency and any discrepancy corrected by adjusting the frequency of the master oscillator.
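The arithmetic of that arrangement is easy to verify; this is an illustrative sketch in Python (not period hardware, of course), showing how the master oscillator frequency, line count, and field rate relate:

```python
# Black-and-white timing chain: a master oscillator runs at twice the
# line frequency, and dividing by the line count yields the field rate.
LINES_PER_FRAME = 525
LINE_FREQ_HZ = 15_750

master_osc = 2 * LINE_FREQ_HZ            # 31,500 Hz
field_rate = master_osc / LINES_PER_FRAME
print(field_rate)                        # 60.0 Hz, matching the US mains frequency
```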
The only practical method of frequency division available at the time was the use of multivibrators, which could only divide by small numbers. For interlaced scanning an odd number of lines per frame was required in order to make the vertical retrace distance identical for the odd and even fields; an extra odd line means that the same distance is covered in retracing from the final odd line to the first even line as from the final even line to the first odd line, so simplifying the retrace circuitry. This meant that a chain of multivibrators was needed, each of which had to divide by a small, odd number. (Note that an odd number is never integrally divisible by any even number). The closest practical sequence to 500 was 3 × 5 × 5 × 7 = 525. Similarly, 625-line PAL & SECAM uses 5 × 5 × 5 × 5. The British 405-line system used 3 × 3 × 3 × 3 × 5, the French 819-line system used 3 × 3 × 7 × 13. Although other values were theoretically possible, all of them involved division by unacceptably large numbers, which produced reliability problems. Modern systems derive all their frequencies from the color subcarrier frequency (see below).
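Why these particular line counts work can be seen by factoring them into chains of small odd divide stages, one per multivibrator. A small sketch (Python; the function name is ours):

```python
def odd_small_factor_chain(n):
    """Decompose n into a chain of small odd divide stages (3, 5, 7, 11, 13),
    one per multivibrator; returns None if n has any other prime factor."""
    stages = []
    for p in (3, 5, 7, 11, 13):
        while n % p == 0:
            stages.append(p)
            n //= p
    return stages if n == 1 else None

# Line counts of the classic analog standards and their divider chains:
for lines in (405, 525, 625, 819):
    print(lines, odd_small_factor_chain(lines))
# 405 [3, 3, 3, 3, 5]   (British 405-line system)
# 525 [3, 5, 5, 7]      (NTSC)
# 625 [5, 5, 5, 5]      (PAL/SECAM)
# 819 [3, 3, 7, 13]     (French 819-line system)
```

An even number such as 512, or an odd number with a large prime factor, returns `None`: no practical multivibrator chain exists for it.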
In the color system the refresh frequency was shifted slightly downward to 59.94 Hz to eliminate stationary dot patterns in the color carrier, as explained below in "Color encoding".
The system used in North America is NTSC. Western Europe, Australia, and Eastern South America use PAL. Eastern Europe used SECAM, but switched to PAL after the change of the political regimes there. France still uses SECAM. Generally, a device (such as a television) can only read or display video encoded to a standard which the device is designed to support; otherwise, the source must be converted (such as when European programs are broadcast in North America or vice versa).
This table illustrates the differences:
|                            | NTSC M       | PAL B,G,H      | PAL I          | PAL N        | PAL M        | SECAM B,G,H           | SECAM D,K,K',L        |
|----------------------------|--------------|----------------|----------------|--------------|--------------|-----------------------|-----------------------|
| Horizontal frequency       | 15.734 kHz   | 15.625 kHz     | 15.625 kHz     | 15.625 kHz   | 15.750 kHz   | 15.625 kHz            | 15.625 kHz            |
| Vertical frequency         | 60 Hz        | 50 Hz          | 50 Hz          | 50 Hz        | 60 Hz        | 50 Hz                 | 50 Hz                 |
| Color subcarrier frequency | 3.579545 MHz | 4.43361875 MHz | 4.43361875 MHz | 3.582056 MHz | 3.575611 MHz | 4.40625/4.25 MHz (FM) | 4.40625/4.25 MHz (FM) |
| Video bandwidth            | 4.2 MHz      | 5.0 MHz        | 5.5 MHz        | 4.2 MHz      | 4.2 MHz      | 5.0 MHz               | 6.0 MHz               |
| Sound carrier              | 4.5 MHz      | 5.5 MHz        | 5.9996 MHz     | 4.5 MHz      | 4.5 MHz      | 5.5 MHz               | 6.5 MHz               |
For backward compatibility with black-and-white television, NTSC uses a luminance-chrominance encoding system invented in 1938 by Georges Valensi. Luminance (derived mathematically from the composite color signal) takes the place of the original monochrome signal. Chrominance carries color information. This allows black-and-white receivers to display NTSC signals simply by ignoring the chrominance.
The original chromaticities of the NTSC color primaries were R=[0.67,0.33], G=[0.21,0.71], B=[0.14,0.08], yielding a far larger gamut than most of today's monitors. Over the decades, however, desire for a brighter picture prompted TV manufacturers to deviate from that specification, sacrificing saturation for increased brightness. This deviation from the standard, which happened at both the receiver and broadcaster stages, was the source of considerable color variation in the 1960s. As a result, in 1968 the SMPTE recommended a new set of phosphor primaries for studio use, which in 1979 became part of SMPTE 170M, the engineering standard describing the American broadcasting system. Although the old 1953 NTSC specifications are still part of the United States Code of Federal Regulations, all modern broadcast equipment follows the SMPTE 170M standard instead and thus encodes a signal for the SMPTE "C" set of phosphor primaries.
In NTSC, chrominance is encoded using two 3.579545 MHz signals which are 90 degrees out of phase, known as I (in-phase) and Q (quadrature) QAM. Mathematically, the combination of two sine waves 90 degrees out of phase with each other, with varying respective amplitudes, can be viewed as a single sine wave with varying phase relative to a reference, and varying amplitude. In essence, the phase represents the instantaneous color hue captured by a TV camera and the amplitude represents the color saturation.
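That phase/amplitude view can be verified numerically. This sketch (Python, with arbitrary illustrative I and Q values) shows the two quadrature components collapsing into a single sine wave of fixed amplitude (saturation) and phase (hue):

```python
import math

# Two subcarrier components 90 degrees apart, with amplitudes I and Q,
# sum to one sine wave. (The I and Q values here are arbitrary examples.)
I, Q = 0.6, -0.3
amplitude = math.hypot(I, Q)     # sqrt(I^2 + Q^2): the saturation
phase = math.atan2(Q, I)         # hue angle relative to the reference

f_sc = 3_579_545  # Hz, the NTSC color subcarrier
for t in (0.0, 1e-7, 2e-7):
    w = 2 * math.pi * f_sc * t
    quadrature_sum = I * math.cos(w) + Q * math.sin(w)
    single_sine = amplitude * math.cos(w - phase)
    assert math.isclose(quadrature_sum, single_sine, abs_tol=1e-12)
```

The assertion holds at every instant t because A·cos(w − φ) expands to (A·cos φ)·cos w + (A·sin φ)·sin w, which is exactly I·cos w + Q·sin w.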
For a TV or a display to recover hue information from the I/Q phase as just described, it must know the reference for it (i.e. what phase is zero). It also needs a reference against which to compare the amplitude to make saturation sense out of it. So the NTSC signal includes a short sample of this reference signal, known as the color burst, located on the 'back porch' of each horizontal line (the time between the end of the horizontal synchronization pulse and of the blanking pulse on each line). The color burst consists of a minimum of eight cycles of the unmodulated (fixed phase and amplitude) color subcarrier. By comparing the reference signal derived from color burst to the chrominance signal's amplitude and phase at a particular point in the scan, the device determines what chrominance to assign to the pixel then being displayed. Combining that with the amplitude of the luminance signal, the receiver calculates exactly what color to make the pixel.
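A minimal sketch of the recovery side, assuming an ideal reference already locked to the color burst (illustrative Python, not a real decoder): multiplying the chroma signal by in-phase and quadrature references and averaging over whole subcarrier cycles recovers I and Q.

```python
import math

# Synchronous (quadrature) demodulation sketch: multiply the chroma signal
# by burst-locked cos/sin references, then average over whole subcarrier
# cycles to low-pass away the double-frequency terms.
f_sc = 3_579_545.0
I_true, Q_true = 0.4, 0.25       # illustrative chroma amplitudes

N, cycles = 1000, 10             # 1000 samples spanning 10 subcarrier cycles
i_acc = q_acc = 0.0
for n in range(N):
    t = cycles / f_sc * n / N
    w = 2 * math.pi * f_sc * t
    chroma = I_true * math.cos(w) + Q_true * math.sin(w)
    i_acc += chroma * math.cos(w)    # reference in phase with the burst axis
    q_acc += chroma * math.sin(w)    # reference 90 degrees later
I_rec, Q_rec = 2 * i_acc / N, 2 * q_acc / N
assert math.isclose(I_rec, I_true, abs_tol=1e-2)
assert math.isclose(Q_rec, Q_true, abs_tol=1e-2)
```

A real receiver performs this continuously in analog circuitry, with the burst on each line re-synchronizing the reference oscillator's phase.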
When a transmitter broadcasts an NTSC signal, it amplitude-modulates a radio-frequency carrier with the NTSC signal just described, while it frequency-modulates a carrier 4.5 MHz higher with the audio signal. If non-linear distortion happens to the broadcast signal, the 3.58 MHz color carrier may beat with the sound carrier to produce a dot pattern on the screen. To make the resulting pattern less noticeable, designers adjusted the original 60 Hz field rate down by a factor of 1.001 (0.1%), to approximately 59.94 fields per second.
The 59.94 rate is derived from the following calculations. Designers chose to make the chrominance subcarrier frequency an n + 0.5 multiple of the line frequency to minimize interference between the luminance signal and the chrominance signal. They then chose to make the audio subcarrier frequency an integer multiple of the line frequency to minimize interference between the audio signal and the chrominance signal. The original black-and-white standard, with its 15,750 Hz line frequency and 4.5 MHz audio subcarrier, does not meet these requirements, so designers had either to raise the audio subcarrier frequency or lower the line frequency. Raising the audio subcarrier frequency would prevent existing receivers from properly tuning in the audio signal. Lowering the line frequency is comparatively innocuous, because the horizontal and vertical synchronization information in the NTSC signal allows a receiver to tolerate a substantial amount of slop in the line frequency. So that is the route the color standard took. In the black-and-white standard, the ratio of audio subcarrier frequency to line frequency is 4.5 MHz / 15,750 Hz ≈ 285.71. In the color standard, this is rounded to the integer 286, which means the color standard's line rate is 4.5 MHz / 286 ≈ 15,734 lines per second. Dividing by 262.5 lines per field gives approximately 59.94 fields per second.
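The derivation can be followed step by step:

```python
# Deriving the 59.94 Hz color field rate from the fixed 4.5 MHz sound carrier.
audio_subcarrier = 4.5e6              # Hz, kept from the black-and-white standard

bw_ratio = audio_subcarrier / 15_750  # ratio in the black-and-white standard
print(bw_ratio)                       # 285.714..., not an integer

line_rate = audio_subcarrier / 286    # ratio rounded up to the nearest integer
print(line_rate)                      # 15734.265... lines per second

field_rate = line_rate / 262.5        # 262.5 lines per field
print(round(field_rate, 4))           # 59.9401 fields per second
```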
An NTSC television channel as transmitted occupies a total bandwidth of 6 MHz. A guard band, which does not carry any signals, occupies the lowest 250 kHz of the channel to avoid interference between the video signal of one channel and the audio signals of the next channel down. The actual video signal, which is amplitude-modulated, is transmitted between 500 kHz and 5.45 MHz above the lower bound of the channel. The video carrier is 1.25 MHz above the lower bound of the channel. Like most AM signals, the video carrier generates two sidebands, one above the carrier and one below. The sidebands are each 4.2 MHz wide. The entire upper sideband is transmitted, but only 750 kHz of the lower sideband, known as a vestigial sideband, is transmitted. The color subcarrier, as noted above, is 3.579545 MHz above the video carrier, and is quadrature-amplitude-modulated with suppressed carrier. The highest 250 kHz of each channel contains the audio signal, which is frequency-modulated, making it compatible with the audio signals broadcast by FM radio stations in the 88–108 MHz band. The main audio carrier is 4.5 MHz above the video carrier. Sometimes a channel may contain an MTS signal, which is simply more than one audio signal. This is normally the case when stereo audio and/or second audio program signals are used.
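As a worked example (using US channel 2, 54–60 MHz, as the illustration), the carrier placements fall out directly from these offsets:

```python
# Frequencies inside one 6 MHz NTSC channel, worked for US channel 2 (54-60 MHz).
channel_low = 54.0e6                                # lower bound of channel 2, Hz

video_carrier = channel_low + 1.25e6                # 55.25 MHz
color_subcarrier = video_carrier + 3.579545e6       # ~58.83 MHz
audio_carrier = video_carrier + 4.5e6               # 59.75 MHz

print(video_carrier / 1e6)                          # 55.25
print(round(color_subcarrier / 1e6, 2))             # 58.83
print(audio_carrier / 1e6)                          # 59.75 (250 kHz below the 60 MHz channel top)
```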
To convert 24 frame/s film for 60 field/s NTSC transmission, a process called "3:2 pulldown" is used. One film frame is transmitted for three video fields (1.5 video frame times), and the next frame is transmitted for two video fields (1 video frame time). Two 24 frame/s film frames are therefore transmitted in five 60 Hz video fields, for an average of 2.5 video fields per film frame. The average frame rate is thus 60 / 2.5 = 24 frame/s, so the average film speed is exactly what it should be. There are drawbacks, however. Still-framing on playback can display a video frame with fields from two different film frames, so any motion between the frames will appear as a rapid back-and-forth flicker. There can also be noticeable jitter/"stutter" during slow camera pans.
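A minimal sketch of the resulting field sequence (Python; the function name is ours):

```python
def three_two_pulldown(film_frames):
    """Map 24 frame/s film frames onto 60 Hz video fields by alternating
    3 fields, then 2 fields, per film frame (hence '3:2')."""
    fields = []
    for i, frame in enumerate(film_frames):
        repeat = 3 if i % 2 == 0 else 2
        fields.extend([frame] * repeat)
    return fields

# Four film frames A-D become ten video fields, i.e. five interlaced frames:
print(three_two_pulldown(list("ABCD")))
# ['A', 'A', 'A', 'B', 'B', 'C', 'C', 'C', 'D', 'D']
```

Pairing those ten fields into five interlaced frames gives AA, AB, BC, CC, DD; the AB and BC frames each mix fields from two different film frames, which is the source of the still-frame flicker described above.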
To avoid 3:2 pulldown, film shot specifically for NTSC television is often taken at 30 frame/s.
For viewing native PAL or SECAM material (such as European television series and some European movies) on NTSC equipment, a standards conversion has to take place. There are basically two ways to accomplish this.
Because satellite power is severely limited, analog video transmission through satellites differs from terrestrial TV transmission. AM is a linear modulation method, so a given demodulated signal-to-noise ratio (SNR) requires an equally high received RF SNR. The SNR of studio quality video is over 50 dB, so AM would require prohibitively high powers and/or large antennas.
Wideband FM is used instead to trade RF bandwidth for reduced power. Increasing the channel bandwidth from 6 to 36 MHz allows an RF SNR of only 10 dB or less. The wider noise bandwidth reduces this 40 dB power saving by 36 MHz / 6 MHz ≈ 8 dB, for a substantial net reduction of 32 dB.
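The decibel bookkeeping in the paragraph above works out as follows (the 40 dB figure is the power saving cited in the text):

```python
import math

# Wideband FM trade-off: FM improvement minus the noise-bandwidth penalty
# of widening the channel from 6 MHz to 36 MHz.
fm_improvement_db = 40                           # power saving cited above
bandwidth_penalty_db = 10 * math.log10(36e6 / 6e6)
print(round(bandwidth_penalty_db, 1))            # 7.8 dB, i.e. roughly 8 dB

net_db = fm_improvement_db - round(bandwidth_penalty_db)
print(net_db)                                    # 32 dB net reduction
```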
Sound is on an FM subcarrier as in terrestrial transmission, but frequencies above 4.5 MHz are used to reduce aural/visual interference; 6.8, 5.8 and 6.2 MHz are commonly used. Stereo can be multiplex or discrete, and unrelated audio and data signals may be placed on additional subcarriers.
A triangular 60 Hz energy dispersal waveform is added to the composite baseband signal (video plus audio and data subcarriers) before modulation. This limits the satellite downlink power spectral density in case the video signal is lost. Otherwise the satellite might transmit all of its power on a single frequency, interfering with terrestrial microwave links in the same frequency band.
In half transponder mode, the frequency deviation of the composite baseband signal is reduced to 18 MHz to allow another signal in the other half of the 36 MHz transponder. This reduces the FM benefit somewhat, and the recovered SNRs are further reduced because the combined signal power must be "backed off" to avoid intermodulation distortion in the satellite transponder. A single FM signal is constant amplitude, so it can saturate a transponder without distortion.
However, the mismatch between NTSC's 30 frames per second and film's 24 frames per second is neatly overcome by a process that capitalizes on the field rate of the interlaced NTSC signal, thus avoiding the film playback speedup used for PAL and SECAM at 25 frames per second (which causes the accompanying audio to rise slightly in pitch). See Framerate conversion above.
There is no question the NTSC system reflects the technology of its originating era, but its compatibility and flexibility have been the key to its longevity over seven decades. The coming of digital television and high-definition television will end the need for analog television systems; NTSC broadcasts are mandated by the FCC to end in the United States on February 17, 2009.
The similarities of NTSC-M and NTSC-N can be seen in the ITU identification scheme table for broadcast television systems, which is reproduced here:
| System | Lines | Frame rate | Channel b/w (MHz) | Visual b/w (MHz) | Sound offset (MHz) | Vestigial sideband (MHz) | Vision mod. | Sound mod. | Notes |
|--------|-------|------------|-------------------|------------------|--------------------|--------------------------|-------------|------------|-------|
| M | 525 | 29.97 | 6 | 4.2 | +4.5 | 0.75 | Neg. | FM | Most of the Americas and Caribbean, Philippines, South Korea, Taiwan (all NTSC-M) and Brazil (PAL-M). |
| N | 625 | 25 | 6 | 4.2 | +4.5 | 0.75 | Neg. | FM | Argentina, Paraguay, Uruguay (all PAL-N). Greater number of lines results in higher quality. |
As the table shows, aside from the number of lines and frames per second, the systems are identical. NTSC-N/PAL-N/PAL-Nc equipment is compatible at baseband with sources such as game consoles, VHS/Betamax VCRs, and DVD players. However, the two systems are not compatible with each other's over-the-air broadcasts (received over an antenna), though some newer sets come with baseband NTSC 3.58 support (NTSC 3.58 being the frequency for color modulation in NTSC: 3.58 MHz).
The NTSC 4.43 system, while not a broadcast format, appears most often as a playback function of PAL cassette format VCRs, beginning with the Sony 3/4" U-Matic format and then following onto Betamax and VHS format machines. Because Hollywood supplied most of the world's prerecorded cassette software (movies and television series), and not all releases were made available in PAL formats, a means of playing NTSC-format cassettes was highly desirable.
Multi-standard video monitors were already in use in Europe to accommodate broadcast and professional needs regarding PAL, SECAM, and NTSC video formats from sources dedicated to just one of those formats. The heterodyne color-under process of U-Matic, Betamax and VHS lent itself to minor modification of VCR players to accommodate NTSC format cassettes. The color-under format of VHS uses a 629 kHz subcarrier while U-Matic and Betamax use a 688 kHz subcarrier to carry an amplitude-modulated chroma signal for both NTSC and PAL formats. Since the VCR was already able to play the color portion of the NTSC recording using PAL color mode, the PAL scanner and capstan speeds only had to be adjusted upward from PAL's slower 50 Hz field rate to match NTSC's 59.94 Hz field rate and its faster linear tape speed.
Although easier to do than explain, the changes to the PAL VCR are very minor thanks to the existing VCR recording formats. The output of the VCR when playing an NTSC cassette in NTSC 4.43 mode is 525 lines/29.97 frames per second with PAL compatible heterodyned color. The multi-standard receiver is already set to support the NTSC H & V frequencies; it just needs to do so while receiving PAL color.
The existence of those multi-standard receivers was probably part of the need for region coding of DVDs. As the color signals are component on disc for all display formats almost no changes would be required for PAL DVD players to play NTSC (525/29.97) discs as long as the display was frame-rate compatible.
VIR (or Vertical interval reference), widely adopted in the 1980s, attempts to correct some of the color problems with NTSC video by adding studio-inserted reference data for luminance and chrominance levels on line 19. Suitably-equipped television sets could then employ these data in order to adjust the display to a closer match of the original studio image. The actual VIR signal contains three sections, the first having 70 percent luminance and the same chrominance as the color burst signal, and the other two having 50 percent and 7.5 percent luminance respectively.
A less-used successor to VIR, GCR, also added ghost (multipath interference) removal capabilities.
The remaining vertical blanking interval lines are typically used for datacasting or ancillary data such as video editing timestamps (vertical interval timecodes or SMPTE timecodes on lines 12–14), test data on lines 17–18, a network source code on line 20, and closed captioning, XDS, and V-chip data on line 21. Early teletext applications also used vertical blanking interval lines 14–18 and 20, but teletext over NTSC was never widely adopted by viewers.
Many stations transmit TV Guide On Screen (TVGOS) data for an electronic program guide on VBI lines. The primary station in a market will broadcast four lines of data, and backup stations will broadcast one line. In most markets the PBS station is the primary host. TVGOS data can occupy any line from 10–25, but in practice it is limited to lines 11–18, 20, and 22; line 22 is used by only two broadcasters, DirecTV and CFPL-TV.
The horizontal resolution numbers in the following tables and graphs may not reflect reality when video is transmitted over an analog medium in NTSC format.