In statistics, the sum of squared deviations measures how much a set of sample values varies, or disperses, from a reference value, usually the mean of the set. When analyzing data, it is important to look at both the overall variability and the degree of consistency among the observed scores.
To arrive at the sum of squared deviations, or more simply, the sum of squares (SS), first compute the deviation of each number by subtracting the mean of all the numbers from it. Each deviation is then squared, and the squares are added together. That total is itself the sum of squares for the data set; an equivalent computational shortcut is SS = Σx² − (Σx)²/n, which avoids calculating the mean first.
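The steps above can be sketched in Python (the function names here are illustrative, not from any particular library):

```python
def sum_of_squares(values):
    """Sum of squared deviations of values from their mean."""
    mean = sum(values) / len(values)
    return sum((x - mean) ** 2 for x in values)

def sum_of_squares_shortcut(values):
    """Equivalent computational form: SS = sum(x^2) - (sum(x))^2 / n."""
    n = len(values)
    return sum(x ** 2 for x in values) - sum(values) ** 2 / n

# Example: the mean of [2, 4, 5, 9] is 5, so the deviations are
# -3, -1, 0, 4, and their squares sum to 9 + 1 + 0 + 16 = 26.
print(sum_of_squares([2, 4, 5, 9]))           # 26.0
print(sum_of_squares_shortcut([2, 4, 5, 9]))  # 26.0
```

Both functions give the same result; the shortcut form is often used in hand calculation because it requires only running totals of x and x².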