Percent Error Calculator

Enter your observed (experimental) value and true (theoretical) value into the Percent Error Calculator to find out how far off your measurement is. You'll get the percent error, the absolute difference, and the relative error, helping you quickly assess measurement accuracy in science, lab work, or everyday estimation.

The value you measured or observed in your experiment.

The accepted, expected, or known true value.

Choose whether percent error should always be positive or retain its sign.

Results

Percent Error


Absolute Difference |Observed − True|


Relative Error (decimal)


Observed Value


True Value


Chart: Observed Value vs True Value

Frequently Asked Questions

What is percent error?

Percent error is a measure of how far off an observed or experimental value is from the true, accepted, or theoretical value, expressed as a percentage. It helps you assess the accuracy of a measurement. A small percent error indicates the observed value is close to the true value, while a large percent error signals a significant discrepancy.

What is the percent error formula?

The formula is: Percent Error = ((Observed Value − True Value) / True Value) × 100%. Most applications use the absolute value of the numerator so the result is always positive, but in some scientific contexts the sign is retained to indicate whether the measurement was an overestimate or underestimate.
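The formula above can be sketched in a few lines of Python. This is a minimal illustration, not the calculator's own code; the function name, the `signed` option, and the 9.5 g / 10 g example numbers are assumptions chosen for clarity:

```python
def percent_error(observed, true_value, signed=False):
    """Percent error of an observed measurement relative to a known true value.

    signed=False (default) returns the magnitude; signed=True keeps the sign,
    so underestimates come back negative and overestimates positive.
    """
    if true_value == 0:
        # Division by zero: percent error is undefined when the true value is 0.
        raise ZeroDivisionError("true value must be nonzero")
    error = 100.0 * (observed - true_value) / true_value
    return error if signed else abs(error)

# Hypothetical reading: a 10.0 g reference mass measured as 9.5 g.
print(percent_error(9.5, 10.0))               # 5.0 (magnitude only)
print(percent_error(9.5, 10.0, signed=True))  # -5.0 (an underestimate)
```

The `signed` flag mirrors the calculator's sign option: most reports use the magnitude, while the signed form shows the direction of the error.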

How do I calculate percent error step by step?

1) Subtract the true value from the observed value. 2) Divide that difference by the true value. 3) Multiply the result by 100 to convert to a percentage. 4) Optionally, take the absolute value if you only want the magnitude of the error.
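The four steps above can be traced one at a time in Python. The flask volumes are a made-up example, not data from the calculator:

```python
# Hypothetical reading: a 250.0 mL flask measured as 240.0 mL.
observed = 240.0
true_value = 250.0

difference = observed - true_value   # step 1: -10.0
relative = difference / true_value   # step 2: -0.04 (relative error as a decimal)
percent = relative * 100             # step 3: -4.0 (an underestimate)
magnitude = abs(percent)             # step 4: 4.0 (magnitude only)
```

Note that step 2 is exactly the "Relative Error (decimal)" shown in the results panel; multiplying by 100 converts it to percent.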

Can percent error be negative?

Yes, if you keep the sign. A negative percent error means your observed value was lower than the true value (an underestimate). A positive percent error means your observed value was higher (an overestimate). In many scientific disciplines the absolute value is used so the error is always reported as a positive number.

What is the difference between percent error and margin of error?

Percent error compares a single measured value to a known true value, showing how accurate that specific measurement is. Margin of error is a statistical concept used in surveys and polling that expresses the range within which the true population value likely falls, based on sample data and confidence levels.

What is the difference between percent error and standard error?

Percent error measures the discrepancy between one measurement and a known value. Standard error is a statistical measure of how much a sample mean is expected to vary from the true population mean across repeated samples — it reflects variability and precision of an estimate rather than accuracy of a single measurement.

What causes percent error in measurements?

Percent error can arise from human error (misreading instruments, rounding), instrument limitations (calibration issues, resolution), environmental factors (temperature, pressure), and methodological assumptions. Even careful measurements introduce some error, which is why percent error is routinely reported in experimental science.

What is considered a good (acceptable) percent error?

It depends on the field. In many physics or chemistry experiments, a percent error below 5% is generally considered acceptable. Engineering applications may require errors well below 1%. The acceptable threshold always depends on the precision required by the specific measurement context.
