What Is the Difference Between an Analog and a Digital Computer?


The basic difference between analog and digital computers lies in how they handle data. Analog computers work directly with continuously varying inputs, such as voltage changes and temperature fluctuations. Digital computers must first have their inputs reduced to a binary representation in order to model the world accurately.
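That reduction to binary can be sketched with a simple quantizer. The following is a minimal illustration, not any particular hardware's method: the function name, bit depth, and voltage range are assumptions chosen for the example.

```python
import math

def quantize(value, v_min=-1.0, v_max=1.0, bits=8):
    """Map a continuous value onto one of 2**bits discrete levels,
    returned as a binary string -- the kind of reduction a digital
    computer performs on an analog input. (Illustrative sketch.)"""
    levels = 2 ** bits
    # Clamp to the representable range, then scale to an integer level.
    clamped = min(max(value, v_min), v_max)
    level = round((clamped - v_min) / (v_max - v_min) * (levels - 1))
    return format(level, f"0{bits}b")

# A smoothly varying voltage becomes a sequence of 8-bit codes.
samples = [math.sin(t / 4) for t in range(5)]
codes = [quantize(v) for v in samples]
```

Whatever detail falls between two adjacent levels is lost, which is why digital precision depends on how many bits the converter uses.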

Analog systems were once favored by engineers who lacked modern digital technology for running calculations. As digital technology advanced, however, it came to be preferred for its greater precision and for the flexibility of the programs digital machines could run. Modern electronic computers are virtually all digital, expressing their models in terms of 1s and 0s.

Analog computers haven’t died out, however. Oscilloscopes of the kind used by electrical and sound engineers are examples of analog computing technology. Slide rules are another example of a computation device that operates on an essentially analog input and output system. Perhaps the most ubiquitous example of an analog computer in the world is the brain itself. Brains are actually a combination of digital and analog systems: while the firing of a single neuron can be regarded as a simple on/off function, communication between neurons is accomplished chemically by means of neurotransmitters that vary in concentration and intensity.
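The slide rule's trick is worth spelling out: it multiplies by adding physical lengths proportional to logarithms, since log(a) + log(b) = log(a·b). A short sketch mimics that analog operation numerically; the function name is invented for illustration.

```python
import math

def slide_rule_multiply(a, b):
    """Multiply two positive numbers the way a slide rule does:
    'slide' two log-scaled lengths end to end, then read the result
    back off the log scale. (Illustrative sketch.)"""
    combined_length = math.log10(a) + math.log10(b)
    return 10 ** combined_length

product = slide_rule_multiply(2.0, 3.0)  # close to 6, within float error
```

On a physical slide rule the answer is read off a scale by eye, so its precision is limited by how finely the scale is engraved, which is exactly the continuous-versus-discrete trade-off described above.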