What are Accuracy Ratios?

Many users overlook the requirement in both ISO 9001 & AS9100D that states:

7.1.5 Monitoring and measuring resources
7.1.5.1 General
The organization shall determine and provide the resources needed to ensure valid and reliable results when monitoring or measuring is used to verify the conformity of products and services to requirements.

To “ensure valid and reliable results”, the company must provide personnel with “monitoring or measuring” instruments (resources) that possess sufficient accuracy, range, and resolution. A key point here is that the instruments possess a sufficient “Accuracy Ratio”.

An “Accuracy Ratio” is defined in SAE AS13003, "Measurement Systems Analysis Requirements for the Aero Engine Supply Chain", as:

AS13003, sec. 2.2, “Definitions”
ACCURACY RATIO: The ratio between the total part tolerance and the total calibration tolerance of the measurement equipment.

Machinists typically apply what they call the “Ten to One Rule”: the measuring instrument chosen must be accurate (not just discriminate) to at least 1/10 of the tolerance being measured. For example, if you have a dimensional feature with a tolerance of 0.010", your measuring instrument must be accurate to within 0.001". The calculation is:

Dimension tolerance being measured ÷ Accuracy of measuring device used = Accuracy Ratio

Using another example, assume you have a micrometer with an accuracy of ±0.0001", and you're using it to measure a dimensional feature with a tolerance of ±0.005". Divide 0.005 by 0.0001 to obtain your accuracy ratio. In this scenario, you'd have an accuracy ratio of 50:1.
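
As a quick sketch of this calculation in Python (the function and variable names are illustrative, not taken from AS13003):

```python
def accuracy_ratio(tolerance, instrument_accuracy):
    """Accuracy Ratio = part tolerance / accuracy of the measuring device."""
    return tolerance / instrument_accuracy

# Ten-to-One Rule example: 0.010" tolerance measured with a 0.001" instrument
print(accuracy_ratio(0.010, 0.001))   # ~10.0 -> a 10:1 ratio

# Micrometer example: ±0.005" tolerance measured with a ±0.0001" micrometer
print(accuracy_ratio(0.005, 0.0001))  # ~50.0 -> a 50:1 ratio
```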

Management of accuracy ratio is a “risk control”

The greater the accuracy ratio, the smaller the likelihood that an Out-of-Tolerance condition will impact the feature measured. Conversely, the lower the accuracy ratio, the greater the likelihood that an Out-of-Tolerance condition will impact the feature measured (and the part).

Calibration Labs typically apply a minimum 4:1 “Test Accuracy Ratio” (TAR) rule based on the same concept. Generally speaking, 4:1 is the lowest accuracy ratio that should be maintained in order to “ensure valid and reliable results” (readings).
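
A minimal sketch of that 4:1 TAR check (names are illustrative, and the ±0.000020" reference-standard accuracy is a made-up example value):

```python
def meets_tar(device_tolerance, standard_accuracy, min_ratio=4.0):
    """Check whether the Test Accuracy Ratio (TAR) between the device
    being calibrated and the reference standard meets the 4:1 floor."""
    return (device_tolerance / standard_accuracy) >= min_ratio

# A ±0.0001" instrument verified against a ±0.000020" reference standard (5:1)
print(meets_tar(0.0001, 0.000020))  # True
```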

Let's suppose that a measuring device is found Out-of-Tolerance. A measuring device found Out-of-Tolerance by ≥200% of its stated accuracy is considered “significantly” Out-of-Tolerance (SOOT). Assume that a dimensional feature with a tolerance of ±0.001" was measured using an instrument accurate to ±0.0001". If this measuring instrument was found to be 200% Out-of-Tolerance (a SOOT condition), multiply the instrument accuracy (0.0001") by 2. Then determine your new accuracy ratio using the calculation 0.001 ÷ 0.0002, which results in a new accuracy ratio of 5:1. Even with the measuring device “significantly” Out-of-Tolerance, the accuracy ratio (in this instance) is still above 4:1, so there is a high degree of probability that conforming product was still delivered to the customer.

However, suppose the dimensional feature had a tolerance of ±0.001" and was measured using an instrument accurate to ±0.00015". Multiply that instrument accuracy (0.00015") by 2, then determine your new accuracy ratio using the calculation 0.001 ÷ 0.0003, which results in a new accuracy ratio of 3.3:1. Since this accuracy ratio is below 4:1, there is an increased likelihood that nonconforming product was delivered to the customer.
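
Both SOOT scenarios can be recomputed with a small helper like the one below (a sketch with illustrative names; it simply scales the stated accuracy by the reported Out-of-Tolerance percentage):

```python
def soot_adjusted_ratio(tolerance, stated_accuracy, oot_percent):
    """Recompute the accuracy ratio using the as-found accuracy.

    oot_percent is how far Out-of-Tolerance the device was found;
    e.g., 200 means the as-found error was 2x the stated accuracy.
    """
    as_found_accuracy = stated_accuracy * (oot_percent / 100.0)
    return tolerance / as_found_accuracy

# First example: ±0.001" tolerance, ±0.0001" instrument, 200% OOT
print(soot_adjusted_ratio(0.001, 0.0001, 200))   # ~5.0  -> still above 4:1

# Second example: ±0.001" tolerance, ±0.00015" instrument, 200% OOT
print(soot_adjusted_ratio(0.001, 0.00015, 200))  # ~3.33 -> below 4:1
```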

Out-of-Tolerance Impact Analysis

Many users also overlook the requirement in both ISO 9001 & AS9100D that states:

7.1.5.2 Measurement traceability
The organization shall determine if the validity of previous measurement results has been adversely affected when measuring equipment is found to be unfit for its intended purpose, and shall take appropriate action as necessary.

The best and easiest way to “determine if the validity of previous measurement results has been adversely affected” is, as described above, to recalculate the accuracy ratio based on the “As Found” condition reported by the Calibration Lab.

If the diminished “Accuracy Ratio” remained above 4:1, there is a reasonable degree of confidence that the OOT (Out-of-Tolerance) condition did not “adversely” affect the delivered product or service. However, if the diminished “Accuracy Ratio” fell below 4:1, there is a reasonable degree of uncertainty as to whether the OOT condition “adversely” affected the delivered product or service. And the further the “Accuracy Ratio” falls below 4:1, the greater the probability that the delivered product or service was “adversely” affected.
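
Putting the impact analysis together, a simple decision-rule sketch might look like the following (names and wording are ours; the 4:1 floor is the TAR threshold discussed above):

```python
MIN_RATIO = 4.0  # the 4:1 TAR floor discussed above

def oot_impact(tolerance, as_found_accuracy):
    """Classify the likely impact of an OOT condition on previous
    measurement results, based on the as-found accuracy ratio."""
    ratio = tolerance / as_found_accuracy
    if ratio >= MIN_RATIO:
        return ratio, "low risk: prior results likely not adversely affected"
    return ratio, "investigate: prior results may have been adversely affected"

print(oot_impact(0.001, 0.0002))  # (~5.0, low risk ...)
print(oot_impact(0.001, 0.0003))  # (~3.33, investigate ...)
```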