A voltmeter measures the difference in voltage between two points on an electrical circuit. The two points are usually the positive and negative input terminals. Both digital and analog voltmeters display their readings in volts, the unit of electric potential difference, which conveys how strongly a circuit pushes electric charge between two points.
To obtain a reading, the user connects test leads between the voltmeter and the two points, such as the input terminals, on the circuit. When the voltmeter is activated, it measures the potential difference between those points, then displays the reading in numerical form (on a digital unit) or with a needle that points to a number (on an analog unit). Modern voltmeters are also designed with a high input resistance to minimize the disturbance, or loading, that attaching the meter creates in the circuit, which improves accuracy.
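The loading effect mentioned above can be illustrated with a short calculation: the meter's input resistance forms a voltage divider with the circuit's own source resistance, so a meter with low input resistance pulls the reading down. The resistance values below are illustrative assumptions, not the specifications of any real meter.

```python
def measured_voltage(v_source: float, r_source: float, r_meter: float) -> float:
    """Voltage the meter actually sees: the meter's input resistance forms
    a voltage divider with the circuit's source resistance."""
    return v_source * r_meter / (r_source + r_meter)

v_true = 5.0          # true open-circuit voltage, volts (assumed)
r_circuit = 10_000.0  # circuit's source resistance, ohms (assumed)

# A digital meter with 10 megohm input resistance barely loads the circuit:
v_digital = measured_voltage(v_true, r_circuit, 10_000_000.0)

# An analog meter with only 100 kilohm input resistance reads noticeably low:
v_analog = measured_voltage(v_true, r_circuit, 100_000.0)

print(f"digital meter reads {v_digital:.3f} V")  # about 4.995 V
print(f"analog meter reads  {v_analog:.3f} V")   # about 4.545 V
```

This is why a high input resistance matters: the digital meter's reading is within a fraction of a percent of the true voltage, while the low-resistance analog meter is off by roughly nine percent in this example.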
Electricians use voltmeters to test and fine-tune residential and commercial electrical systems. Voltmeters can be used to diagnose nearly any electrical device, however, such as the alternator on a vehicle or a home appliance. Accuracy and calibration are the two greatest hurdles when using voltmeters, especially with analog models.
Digital voltmeters have existed since the mid-1950s, and they are typically priced according to the voltage ranges they can measure and the accuracy they offer. Most digital voltmeters rely on an analog-to-digital converter to translate the measured voltage into the numerical value shown on the display.
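The conversion step can be sketched in a few lines: the input voltage is quantized into a discrete count, then scaled back into a number for the display. The full-scale range and bit depth below are assumptions chosen for illustration, not the parameters of any particular meter.

```python
def adc_reading(voltage: float, full_scale: float = 20.0, bits: int = 12) -> float:
    """Quantize `voltage` to an integer count, then convert it back to volts,
    mimicking the analog-to-digital conversion inside a digital voltmeter."""
    levels = 2 ** bits
    step = full_scale / levels              # volts represented by one count
    count = round(voltage / step)           # the digitized value
    count = max(0, min(levels - 1, count))  # clamp to the converter's range
    return count * step                     # the value shown on the display

print(adc_reading(12.3456))  # nearest representable voltage: 12.34375
```

The step size shows why more bits mean a finer-grained display: with 12 bits over a 20-volt range, the meter can only distinguish voltages about 5 millivolts apart.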