How Are Radio Waves Detected?

Jiuguang Wang/CC-BY-SA 2.0

Radio waves are detected using electrical circuits in which an antenna picks up the electromagnetic signal, a tuned circuit of capacitors and inductors selects the desired frequency, and the signal is then demodulated before emerging as sound in a speaker. Radio frequencies range from about 3 kilohertz up to 300 gigahertz. Since radio waves are electromagnetic rather than acoustic, they must be converted into sound by electronic devices before humans can hear them.
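The receive-and-demodulate chain described above can be sketched in code for the simplest case, AM envelope detection, where rectifying and smoothing the received signal recovers the audio. This is a minimal illustration, not a real receiver; the carrier, tone, sample rate and smoothing window below are all assumed values.

```python
import math

def am_demodulate(samples, window=50):
    """Envelope detection: rectify, then smooth with a moving average.
    This mimics what a diode plus RC filter does in a simple receiver."""
    rectified = [abs(s) for s in samples]
    out = []
    for i in range(len(rectified)):
        lo = max(0, i - window + 1)
        out.append(sum(rectified[lo:i + 1]) / (i + 1 - lo))
    return out

# Synthesize an AM signal: a 1 kHz audio tone on a 100 kHz carrier,
# sampled at 1 MHz (illustrative numbers only).
fs, fc, fa = 1_000_000, 100_000, 1_000
t = [n / fs for n in range(2000)]
envelope = [1.0 + 0.5 * math.sin(2 * math.pi * fa * x) for x in t]
signal = [e * math.cos(2 * math.pi * fc * x) for e, x in zip(envelope, t)]

# The recovered waveform tracks the audio envelope, not the carrier.
recovered = am_demodulate(signal)
```

The moving-average window here spans several carrier cycles but only a small fraction of an audio cycle, which is why the fast carrier averages out while the slow audio envelope survives.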

Antennas that receive radio signals are generally sized to match the wavelength they are designed to receive: longer antennas pick up longer wavelengths. Communications antennas are usually about one-quarter of the wavelength, a compact design common in automobiles, satellite television and cellular phones. Receiving antennas can be made smaller simply to save space, because transmitters are powerful enough that detection remains easy.
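The quarter-wavelength sizing rule above is easy to compute: wavelength is the speed of light divided by frequency, and the antenna is a quarter of that. The example frequencies below are illustrative assumptions, not from the article.

```python
C = 299_792_458  # speed of light in m/s

def quarter_wave_length(freq_hz):
    """Return the quarter-wavelength antenna size in meters."""
    wavelength = C / freq_hz
    return wavelength / 4

# An FM broadcast signal near 100 MHz calls for roughly a 0.75 m antenna,
# while a 900 MHz cellular signal needs only about 8 cm.
fm = quarter_wave_length(100e6)
cell = quarter_wave_length(900e6)
```

This is why car radio whips are under a meter long while a phone's antenna fits inside its case: the higher the frequency, the shorter the quarter-wave element.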

In March 2014, the Niels Bohr Institute announced a new method of detecting radio waves using a laser that operates at room temperature. Very weak radio signals are normally measured in extremely cold settings, a few degrees above absolute zero, to reduce the background noise created by heat. Because that noise distorts readings, the laser technique reduces background noise and allows scientists to take more accurate measurements. Instead of conventional electronics reading the capacitor in the circuit, a laser reads the capacitor's signal, converting the radio waves into light rather than an electrical signal.