
Accuracy of Software-Timed Applications in LabVIEW

Updated Oct 11, 2022

Reported In

Software

  • LabVIEW
  • LabWindows/CVI

Issue Details

I have a software-timed application that performs data acquisition. The acquisition occurs in a loop which is controlled by a LabVIEW (or LabWindows/CVI) timer function. Why can I not use a time resolution better than 1 ms and why is the time interval between each iteration of the loop not constant?

Solution

LabVIEW or LabWindows/CVI™ timer functions use the operating system timers. The time resolution of these timers depends on the operating system. A list of typical timing resolutions on different operating systems is given below.
 
  • Windows 3.x: 55 ms
  • Windows 9x/Me/NT/2000/XP/Vista/7/8/10: 1 ms
  • Macintosh 68k: 17 ms
  • PowerMac: 1 ms
  • SUN Solaris: 10 ms
  • Linux: 10 ms
  • Mac OS X (PowerPC): 10 ms

For example, any software-timed operation would have an accuracy of +/- 1 ms on a Windows 7 operating system.
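As a rough way to observe this granularity yourself, the following C sketch polls a software timestamp source in a tight loop and records the smallest nonzero step it sees. It assumes the LabWindows/CVI Utility Library's Timer() function (declared in utility.h) as the timestamp source; any software timer with similar semantics could be substituted, and the result you see depends on how that function is implemented on your operating system.

/* Rough sketch: estimate the granularity of a software timestamp source by
   polling it in a tight loop and recording the smallest nonzero step seen.
   Assumes the LabWindows/CVI Utility Library's Timer() function (utility.h),
   which returns elapsed time in seconds as a double. */
#include <stdio.h>
#include <utility.h>

int main (void)
{
    double minStep = 1.0;   /* smallest nonzero increment observed, in seconds */
    double previous, now, step;
    int    i;

    previous = Timer ();
    for (i = 0; i < 100000; i++) {
        now  = Timer ();
        step = now - previous;
        if (step > 0.0 && step < minStep)
            minStep = step;
        previous = now;
    }

    printf ("Smallest observed timer step: %.3f ms\n", minStep * 1000.0);
    return 0;
}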

If you are using the LabVIEW or LabWindows/CVI timer functions to control a loop, you can expect the time interval between iterations to vary depending on what other processes are running on the computer at that instant. For example, if you have several windows open and are switching between them during your data acquisition, the additional load on the Central Processing Unit (CPU) can slow down the loop that is performing the acquisition.
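This jitter is easy to observe by logging how long each iteration of a software-timed loop actually takes. The sketch below is a hypothetical LabWindows/CVI-style loop that requests a 1 ms delay on each iteration, using the Utility Library's Delay() and Timer() functions, and prints the measured interval; under CPU load the printed values will drift away from 1 ms.

/* Sketch: log the actual interval between iterations of a software-timed loop.
   Assumes the LabWindows/CVI Utility Library's Delay() and Timer() functions;
   the requested 1 ms period is only as accurate as the operating system allows,
   so the printed intervals will vary, especially under CPU load. */
#include <stdio.h>
#include <utility.h>

int main (void)
{
    double previous, now;
    int    i;

    previous = Timer ();
    for (i = 0; i < 100; i++) {
        Delay (0.001);             /* request a 1 ms software-timed delay       */
        /* ... acquire one point here (placeholder for the actual DAQ read) ... */
        now = Timer ();
        printf ("Iteration %d: %.3f ms\n", i, (now - previous) * 1000.0);
        previous = now;
    }
    return 0;
}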

For this type of timed application, hardware timing is preferable to software timing because it is more accurate and more reliable. For example, if you want to scan a channel every 1 ms, configure the hardware to scan at 1000 samples/second (equivalent to a 1 ms interval between scans) instead of timing each scan in software. This achieves the same result without relying on the variable timing of the operating system; a sketch of such a configuration follows.
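As an illustration only, the following C sketch configures a hardware-timed analog input task at 1000 samples/second with the NI-DAQmx C API. The device name "Dev1", the channel "ai0", and the +/-10 V range are hypothetical placeholders that would need to match your hardware, and error checking is omitted for brevity.

/* Sketch: hardware-timed acquisition at 1 kS/s using the NI-DAQmx C API.
   "Dev1/ai0" and the +/-10 V range are placeholders for your own hardware.
   The device's sample clock, not the OS, paces the 1 ms spacing between
   samples. Error checking is omitted for brevity. */
#include <stdio.h>
#include <NIDAQmx.h>

int main (void)
{
    TaskHandle task = 0;
    float64    data[1000];
    int32      read = 0;

    DAQmxCreateTask ("", &task);
    DAQmxCreateAIVoltageChan (task, "Dev1/ai0", "", DAQmx_Val_Cfg_Default,
                              -10.0, 10.0, DAQmx_Val_Volts, NULL);

    /* 1000 S/s on the device's sample clock = one sample every 1 ms, in hardware */
    DAQmxCfgSampClkTiming (task, "", 1000.0, DAQmx_Val_Rising,
                           DAQmx_Val_FiniteSamps, 1000);

    DAQmxStartTask (task);
    DAQmxReadAnalogF64 (task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                        data, 1000, &read, NULL);
    printf ("Read %d hardware-timed samples\n", (int)read);

    DAQmxClearTask (task);
    return 0;
}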

Hardware timing also becomes necessary in applications that require better precision than the 1 ms resolution of the system timer on many operating systems.

Additional Information

A real-time operating system performs with the same timing resolution listed above, but it runs with fewer interruptions during execution, so loop timing is more consistent.