What Is Absolute Accuracy at Full Scale for DAQ Devices?

Updated Apr 20, 2023

Reported In

Hardware

  • Multifunction I/O Device
  • C Series Multifunction I/O Module

Issue Details

While reading the specifications of my DAQ device, I found a terminology called Absolute Accuracy at Full Scale. What is the meaning of Absolute Accuracy at Full Scale and how can I calculate it?

Solution

Accuracy refers to how close a measurement is to the correct value. Absolute Accuracy at Full Scale is a calculated theoretical accuracy that assumes the value being measured is the maximum voltage supported in a given range. Because the accuracy of a measurement changes with the value being measured, accuracy at full scale is used so that devices can be compared on equal terms. Note that absolute accuracy at full scale makes assumptions about environmental variables, such as a 25 °C operating temperature, that may differ in practice.
In the specifications, Absolute Accuracy at Full Scale is typically calculated using the maximum value of the Nominal Range Positive Full Scale as the measured value.

Example 1.

The NI PXIe-6363 has a ± 0.5 V range. Its absolute accuracy at full scale is calculated assuming the signal being measured is 0.5 V. For the ± 0.5 V range, the absolute accuracy at full scale is 100 µV.

Example 2.
[NI 9223 accuracy table: for calibrated data, the maximum errors over the full operating temperature range are a gain error of ± 0.20% of reading and an offset error of ± 0.10% of range.]
The NI 9223 publishes the table above in the accuracy section of its specifications. For example, when the module is configured to return calibrated data on the ± 5 VDC range and 5 VDC is being measured, the maximum expected gain error over the full operating temperature range is ± 0.20% × 5 V (the value actually read) = ± 10 mV. The offset error is ± 0.10% × 10 V (the full voltage range) = ± 10 mV. The total uncertainty when measuring a 5 V signal is therefore 5 V ± 20 mV, so the absolute accuracy at full scale is 20 mV.
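The Example 2 arithmetic can be sketched in Python. This is a simplified illustration, not an NI API: the function name and parameters are invented for this example, and the noise and temperature-drift terms that appear in NI's full accuracy formula are omitted.

```python
def absolute_accuracy(reading_v, full_range_v, gain_err_pct, offset_err_pct):
    """Worst-case uncertainty in volts for a single reading.

    Simplified model: gain error scales with the value read,
    offset error scales with the full voltage range.
    """
    gain_term = abs(reading_v) * gain_err_pct / 100.0     # e.g. 0.20% of 5 V = 10 mV
    offset_term = full_range_v * offset_err_pct / 100.0   # e.g. 0.10% of 10 V = 10 mV
    return gain_term + offset_term

# NI 9223, calibrated data, +/- 5 VDC range, measuring 5 VDC:
uncertainty = absolute_accuracy(5.0, 10.0, 0.20, 0.10)
print(f"{uncertainty * 1000:.0f} mV")  # prints "20 mV"
```

Note that the offset term uses the full 10 V span of the ± 5 V range, while the gain term uses the actual reading, matching the worked example above.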