
Accuracy vs. Precision

Your quality assurance manager can put you to sleep explaining the difference between these two terms, but you really need to know it.


Accuracy describes “close-to-true value.” Precision describes “repeatability.”
Accuracy in measurement describes how closely the measurement from your system matches the actual or true value of the thing being measured. It is the difference between the observed average of measurements and the true average.
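As a minimal sketch of that definition (in Python, with hypothetical readings and a hypothetical certified reference value), the accuracy error, or bias, is simply the observed average minus the true value:

```python
# Minimal sketch: accuracy (bias) of a gage, assuming a reference part
# whose true value is known from a calibrated standard (values hypothetical).
measurements = [10.02, 10.01, 10.03, 10.02, 10.00]  # repeated readings, mm
true_value = 10.00                                   # certified reference, mm

observed_average = sum(measurements) / len(measurements)
bias = observed_average - true_value  # observed average minus true value
print(f"Observed average: {observed_average:.3f} mm, bias: {bias:+.3f} mm")
```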

Think of accuracy as the “trustworthiness” of a measurement system.


Precision in measurement describes how consistently a measurement system returns the same value when measuring the same thing; that is its repeatability.
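A minimal sketch of the same idea, again with hypothetical readings: repeatability shows up as the spread (standard deviation) of repeated measures of the same part by the same operator:

```python
# Minimal sketch: precision (repeatability) as the spread of repeated
# readings of the same part by the same operator (values hypothetical).
import statistics

readings = [10.02, 10.01, 10.03, 10.02, 10.00]  # repeated readings, mm
repeatability = statistics.stdev(readings)       # sample standard deviation
print(f"Repeatability (1 sigma): {repeatability:.4f} mm")
```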

As the classic target diagrams illustrate, it is important to be both accurate and precise if you are to get usable information from your measurement system.

But measurement system variation has two components: that of the measurement device (gage) itself and that of the operator(s). The variation resulting from different operators using the same measurement device is called reproducibility.

In our shops, we cannot tell whether our measurement system has repeatability or reproducibility issues without doing a long-form Gage R&R study.

Gage repeatability and reproducibility (GR&R) studies use statistical techniques to identify and separate the sources of variation in our measurement system: is it the gage, or is it the operator?
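A simplified sketch of that separation is below. A real long-form study uses the AIAG average-and-range or ANOVA methods; this illustration, with hypothetical data, only shows where the two variance components come from:

```python
# Simplified sketch of separating repeatability (equipment) from
# reproducibility (operators). Not the AIAG long-form method itself;
# it only illustrates the idea. All data are hypothetical.
import statistics

# readings[operator][part] = list of repeated trials (mm)
readings = {
    "op_a": {"part1": [10.02, 10.01, 10.03], "part2": [9.98, 9.99, 9.97]},
    "op_b": {"part1": [10.06, 10.05, 10.07], "part2": [10.02, 10.03, 10.01]},
}

# Repeatability: pooled within-cell variance (same operator, same part)
cell_vars = [statistics.variance(trials)
             for parts in readings.values() for trials in parts.values()]
repeatability_sd = (sum(cell_vars) / len(cell_vars)) ** 0.5

# Reproducibility: spread of the operator averages (simplified; the
# standard methods also subtract the repeatability contribution)
op_means = [statistics.mean([x for trials in parts.values() for x in trials])
            for parts in readings.values()]
reproducibility_sd = statistics.stdev(op_means)

print(f"Repeatability (gage):       {repeatability_sd:.4f} mm")
print(f"Reproducibility (operator): {reproducibility_sd:.4f} mm")
```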

Gage error determined by the GR&R is expressed as a percentage of the tolerance that you are trying to hold.

Typically, gage error of 10 percent or less is considered acceptable, and more than 30 percent is unacceptable; between 10 and 30 percent, gage error may be acceptable depending on the application.
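As a sketch, using the common (but not universal) convention of a 6-sigma gage spread, the percentage of tolerance and the acceptance bands look like this; the sigma and tolerance values are hypothetical:

```python
# Sketch: gage error as a percentage of the tolerance, with the usual
# 10% / 30% acceptance guidelines. The 6-sigma spread is an assumption;
# some studies use a 5.15-sigma convention instead.
def percent_gage_error(sigma_grr: float, tolerance: float) -> float:
    return 100.0 * (6.0 * sigma_grr) / tolerance

def classify(pct: float) -> str:
    if pct <= 10.0:
        return "acceptable"
    if pct <= 30.0:
        return "may be acceptable, depending on the application"
    return "unacceptable"

pct = percent_gage_error(sigma_grr=0.004, tolerance=0.25)  # hypothetical, mm
print(f"Gage error: {pct:.1f}% of tolerance -> {classify(pct)}")
```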

Regardless, any level of gage error is an opportunity for continuous improvement.

— Precision Machined Products Association