What Is Ohm's Law in Electromagnetic Theory?

Ohm's law describes the relationship between voltage, current, and resistance between two points of a conductor. It states that V = IR, where V is the voltage, I is the current, and R is the resistance. Voltage is measured in volts, current in amperes, and resistance in ohms.
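The three equivalent rearrangements of the law can be sketched in a few lines of Python (the function names here are illustrative, not from any standard library):

```python
def voltage(current_a: float, resistance_ohm: float) -> float:
    """Voltage drop across a resistance: V = I * R."""
    return current_a * resistance_ohm

def current(voltage_v: float, resistance_ohm: float) -> float:
    """Current through a resistance: I = V / R."""
    return voltage_v / resistance_ohm

def resistance(voltage_v: float, current_a: float) -> float:
    """Resistance from measured voltage and current: R = V / I."""
    return voltage_v / current_a

# Example: 2 A flowing through a 5-ohm resistor drops 10 V.
print(voltage(2.0, 5.0))  # → 10.0
```

Knowing any two of the three quantities determines the third.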

The law treats resistance as a constant, independent of both voltage and current; voltage and current are therefore directly proportional to each other. It is worth noting, however, that resistance can change as a function of temperature, and resistors tend to heat up as current passes through them. In general, as a resistor warms, its resistance increases. The law was determined empirically and is widely accepted as an accurate approximation for the vast majority of situations. It holds in both direct current (DC) and alternating current (AC) circuits, provided the circuit contains no inductors or capacitors.
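The temperature dependence mentioned above is often modeled with a linear approximation, R(T) = R0 · (1 + α(T − T0)), where α is the material's temperature coefficient of resistance. A brief sketch under that assumption (the copper coefficient of roughly 0.00393 per °C is a commonly quoted value):

```python
def resistance_at_temp(r0_ohm: float, alpha_per_c: float,
                       t_c: float, t0_c: float = 20.0) -> float:
    """Linear approximation: R(T) = R0 * (1 + alpha * (T - T0)).

    r0_ohm is the resistance measured at the reference temperature t0_c.
    """
    return r0_ohm * (1.0 + alpha_per_c * (t_c - t0_c))

# A copper winding measuring 100 ohms at 20 °C, heated to 80 °C
# (alpha ≈ 0.00393 per °C for copper):
r_hot = resistance_at_temp(100.0, 0.00393, 80.0)
print(round(r_hot, 2))  # → 123.58
```

This is why a circuit that obeys Ohm's law at one temperature can drift away from its nominal values as it heats under load.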

Georg Ohm published the law in 1827, in a form more complex than the widely known V = IR. More general versions of the equation apply to electromagnetics as well as circuit analysis, and their derivatives are used to relate current density, electric field, and material conductivity.
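The best-known general version is the vector (microscopic) form J = σE, where J is the current density in A/m², σ the conductivity in S/m, and E the electric field in V/m. A minimal sketch of the scalar case, using a commonly quoted conductivity for copper (the value is an assumption for illustration):

```python
def current_density(sigma_s_per_m: float, e_field_v_per_m: float) -> float:
    """Microscopic Ohm's law (scalar form): J = sigma * E."""
    return sigma_s_per_m * e_field_v_per_m

# Copper (sigma ≈ 5.96e7 S/m) in an electric field of 0.01 V/m:
j = current_density(5.96e7, 0.01)
print(j)  # ≈ 5.96e5 A/m^2
```

Dividing both sides of V = IR by the conductor's geometry (length and cross-sectional area) recovers this form, which is why the two statements are regarded as the same law at different scales.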