"Percent uncertainty" is a measure of the uncertainty of a measurement compared to the size of the measurement, expressed as a percentage. It is calculated by dividing the uncertainty of the measurement by the measured value and multiplying the result by 100.
For example, consider Andrew, who steps on a scale with an absolute uncertainty of plus/minus 5 pounds. Due to the potential inaccuracy of the scale, if Andrew weighs in at 180 pounds, he could theoretically weigh as little as 175 or as much as 185 pounds. Dividing the uncertainty (5) by Andrew's measured weight (180) gives 0.02777..., or a percent uncertainty of about 2.8 percent.
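The calculation above can be sketched in a few lines of Python; the function name `percent_uncertainty` is just an illustrative choice, not part of any standard library.

```python
def percent_uncertainty(uncertainty, measurement):
    """Return (uncertainty / measured value) * 100."""
    return uncertainty / measurement * 100

# Andrew's example: a 180-pound reading with +/- 5 pounds of uncertainty
result = percent_uncertainty(5, 180)
print(round(result, 2))  # about 2.78 percent
```

The same formula works for any measurement: a smaller reading with the same absolute uncertainty yields a larger percent uncertainty.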