Variance measures how the data in a given variable distribution are spread relative to the mean. Operationally, it is the average squared distance from the mean, and it equals the square of the standard deviation (equivalently, the standard deviation is the square root of the variance). Conceptually, it indicates how much variation exists within a given sample.
In a variance calculation, the mean is subtracted from each data point in the distribution, and the resulting difference is squared. The squared differences are then summed and divided by the number of points in the distribution (or by one less than that number when estimating the variance of a population from a sample). Therefore, variance decreases when data points cluster near the mean, and it increases when the data are more scattered across the distribution.
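The calculation described above can be sketched in a few lines of Python; the function name and sample data here are illustrative, and the result is checked against the standard library's statistics module:

```python
import statistics

def population_variance(data):
    """Average squared distance of each point from the mean."""
    mean = sum(data) / len(data)
    return sum((x - mean) ** 2 for x in data) / len(data)

data = [2, 4, 4, 4, 5, 5, 7, 9]  # mean is 5
print(population_variance(data))            # 4.0
print(statistics.pvariance(data))           # 4.0 (matches)
print(statistics.variance(data))            # ~4.57 (sample version, divides by n - 1)
```

Note that `statistics.variance` divides by n − 1 rather than n, which is the unbiased estimator typically used when the data are a sample rather than the whole population.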