
# Accuracy Definition

Accuracy is how close an approximation is to an actual value. In other terms, for a set of measurements, accuracy refers to the closeness of the measurements to a specific (true) value, while precision refers to the closeness of the measurements to each other. Precision is thus a description of random errors, a measure of statistical variability.

Given a set of data points from repeated measurements of the same quantity, the set can be said to be accurate if their average is close to the true value of the quantity being measured, while the set can be said to be precise if the values are close to each other.
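Under these definitions, accuracy can be gauged by the distance of the sample mean from the true value, and precision by the spread of the sample. A minimal Python sketch (the example data and variable names are illustrative):

```python
import statistics

true_value = 10.0

# Repeated measurements of the same quantity.
measurements = [10.1, 9.9, 10.0, 10.2, 9.8]

# Accuracy: how close the average is to the true value.
bias = statistics.mean(measurements) - true_value

# Precision: how close the measurements are to each other.
spread = statistics.stdev(measurements)

print(f"bias = {bias:+.3f}, spread = {spread:.3f}")
```

Here the average is exactly the true value (high accuracy) even though the individual readings scatter around it (limited precision).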

More commonly, accuracy describes systematic errors, a measure of statistical bias; low accuracy produces a systematic difference between the results and the true value. ISO calls closeness to the true value trueness. Alternatively, ISO defines accuracy as a combination of both types of observational error above (random and systematic), so that high accuracy requires both high precision and high trueness.

## Overview

In the fields of science and engineering, the accuracy of a measurement system is the degree of closeness of measurements of a quantity to that quantity's true value. The precision of a measurement system, related to reproducibility and repeatability, is the degree to which repeated measurements under unchanged conditions show the same results. Although the two words precision and accuracy can be synonymous in colloquial use, they are deliberately contrasted in the context of the scientific method.

The field of statistics, where the interpretation of measurements plays a central role, prefers to use the terms bias and variability instead of accuracy and precision: bias is the amount of inaccuracy and variability is the amount of imprecision.

A measurement system can be accurate but not precise, precise but not accurate, neither, or both. For example, if an experiment contains a systematic error, then increasing the sample size generally increases precision but does not improve accuracy. The result would be a consistent yet inaccurate string of results from the flawed experiment. Eliminating the systematic error improves accuracy but does not change precision.
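This behaviour can be simulated: with a fixed systematic offset, a larger sample shrinks the scatter of the estimated mean but leaves the bias untouched. A sketch under assumed values (the offset and noise level are arbitrary choices):

```python
import random
import statistics

random.seed(42)

true_value = 10.0
systematic_error = 0.5   # the flawed instrument reads 0.5 too high

def measure(n):
    """Return n noisy readings from the biased instrument."""
    return [true_value + systematic_error + random.gauss(0, 0.2)
            for _ in range(n)]

for n in (10, 10_000):
    sample = measure(n)
    bias = statistics.mean(sample) - true_value
    print(f"n={n:>6}: bias ~ {bias:+.3f}")

# The estimated bias stays near +0.5 regardless of n; increasing the
# sample size only tightens the scatter of the estimate (precision),
# it cannot remove the systematic offset (accuracy).
```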

A measurement system is considered valid if it is both accurate and precise. Related terms include bias (non-random or directed effects caused by a factor or factors unrelated to the independent variable) and error (random variability).

In addition to accuracy and precision, measurements may also have a measurement resolution, which is the smallest change in the underlying physical quantity that produces a response in the measurement.
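For instance, an instrument with a resolution of 0.5 units reports every input quantized to the nearest 0.5, so a change smaller than the step may produce no response at all. A hypothetical sketch:

```python
RESOLUTION = 0.5  # smallest step the instrument can display

def reading(x):
    """Quantize the true quantity x to the instrument's resolution."""
    return round(x / RESOLUTION) * RESOLUTION

print(reading(9.6))   # 9.5
print(reading(9.74))  # 9.5 -- a change of 0.14 produced no response
print(reading(9.8))   # 10.0
```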

In numerical analysis, accuracy is likewise the nearness of a calculation to the true value, while precision is the resolution of the representation, typically defined by the number of decimal or binary digits.

In military terms, accuracy refers primarily to the accuracy of fire, the precision of fire expressed by the closeness of a grouping of shots at and around the center of the target.

## Examples

Examples of accuracy vs. precision:

• Accuracy: 3.14 is a fairly accurate approximation of pi (π).

• Precision: 3.1915 is a more precise approximation (it carries more digits), but it is less accurate.
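The comparison can be checked directly: 3.1915 is stated to more decimal places, yet its absolute error against π is roughly thirty times larger than that of 3.14. A quick check in Python:

```python
import math

error_314  = abs(3.14   - math.pi)  # about 0.0016
error_3192 = abs(3.1915 - math.pi)  # about 0.0499

print(error_314, error_3192)
assert error_3192 > error_314  # more digits, yet less accurate
```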

### Sources

“Accuracy and Precision.” Wikipedia, Wikimedia Foundation, 2 May 2020, en.wikipedia.org/wiki/Accuracy_and_precision.
