Solution
LabVIEW and LabWindows/CVI™ timer functions use the operating system's timers, so their resolution depends on the operating system. Typical timing resolutions for different operating systems are listed below.
| Operating System | Typical Timing Resolution (ms) |
|---|---|
| Windows 3.x | 55 |
| Windows 9x/Me/NT/2000/XP/Vista/7/8/10 | 1 |
| Macintosh 68k | 17 |
| PowerMac | 1 |
| SUN Solaris | 10 |
| Linux | 10 |
| Mac OS X (PowerPC) | 10 |
For example, on Windows 7 a software-timed operation can be accurate only to within ±1 ms, the resolution of the system timer.
If you use the LabVIEW or LabWindows/CVI timer functions to control a loop, expect the interval between iterations to vary depending on what other processes are running on the computer at that instant. For example, if you have several windows open and switch between them during data acquisition, the extra load on the Central Processing Unit (CPU) can slow the loop that performs the acquisition, as the sketch below illustrates.
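The following is a minimal sketch of such a software-timed loop, assuming the LabWindows/CVI Utility Library's Timer() and Delay() functions; the 1 ms target period and the iteration count are illustrative only. Because the wait is handled in software, the measured period can drift whenever the operating system is busy elsewhere.

```c
/* Software-timed loop sketch (LabWindows/CVI Utility Library).
 * Targets a 1 ms period and prints how far each iteration drifts
 * from that target. The jitter reflects OS scheduling, not the code. */
#include <utility.h>   /* Timer(), Delay() */
#include <stdio.h>

int main(void)
{
    double target = 0.001;   /* desired loop period: 1 ms */
    double start, elapsed;
    int    i;

    for (i = 0; i < 100; i++)
    {
        start = Timer();     /* seconds since an arbitrary reference */

        /* ... acquire or process one data point here ... */

        Delay(target);       /* software wait: at the mercy of the
                                OS scheduler and other processes    */
        elapsed = Timer() - start;
        printf("iteration %3d: %.3f ms (jitter %+.3f ms)\n",
               i, elapsed * 1000.0, (elapsed - target) * 1000.0);
    }
    return 0;
}
```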
For this kind of timed application, hardware timing is preferable to software timing because it improves both accuracy and reliability. For example, if you want to scan a channel every 1 ms, configure the hardware to scan at a rate of 1000 samples/second, which is equivalent to a 1 ms interval between scans, instead of using software timing. This achieves the same result without relying on the variable timing of the operating system, as the sketch below shows.
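As one hedged illustration of the hardware-timed approach, the sketch below uses the NI-DAQmx C API to let the device's onboard sample clock pace a finite acquisition at 1000 samples/second. The physical channel name "Dev1/ai0" is an assumption (substitute your own device), and error handling is reduced to a single macro for brevity.

```c
/* Hardware-timed acquisition sketch (NI-DAQmx C API).
 * The board's sample clock paces the acquisition at 1000 S/s,
 * i.e. one sample every 1 ms, independent of CPU load.
 * "Dev1/ai0" is an assumed channel name; adjust for your device. */
#include <NIDAQmx.h>
#include <stdio.h>

#define CHK(call) if ((call) < 0) { printf("DAQmx error\n"); goto done; }

int main(void)
{
    TaskHandle task = 0;
    float64    data[1000];
    int32      read = 0;

    CHK(DAQmxCreateTask("", &task));
    CHK(DAQmxCreateAIVoltageChan(task, "Dev1/ai0", "",
                                 DAQmx_Val_Cfg_Default, -10.0, 10.0,
                                 DAQmx_Val_Volts, NULL));
    /* Hardware timing: the onboard sample clock runs at 1000 samples/s. */
    CHK(DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                              DAQmx_Val_FiniteSamps, 1000));
    CHK(DAQmxStartTask(task));

    /* Read one second's worth of hardware-timed samples. */
    CHK(DAQmxReadAnalogF64(task, 1000, 10.0, DAQmx_Val_GroupByChannel,
                           data, 1000, &read, NULL));
    printf("Read %d hardware-timed samples\n", (int)read);

done:
    if (task) DAQmxClearTask(task);
    return 0;
}
```

Because the sample clock lives on the device, the samples are spaced by the hardware itself; the host only needs to read the buffered data, so momentary CPU load no longer affects the sample spacing.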
Hardware timing also becomes necessary in applications that require finer precision than the 1 ms system timer available on many operating systems.