I am measuring a signal with a frequency of 5 kHz using on-demand sampling, with a timed loop executing at a rate of 1 kHz.
Note: This image is a LabVIEW snippet, which includes LabVIEW code that you can reuse in your project. To use a snippet, right-click the image, save it to your computer, and drag the file onto your LabVIEW diagram.
Although the signal has a frequency of 5 kHz, the frequency I am reading varies between 4 kHz and 6 kHz. The frequency is calculated by taking the difference between the edge counts read in one loop iteration and the next, and dividing it by the time elapsed between the two successive executions of the DAQmx Read function (the actual time the loop takes to execute).
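The calculation described above can be sketched in ordinary code. This is a minimal illustration of the math only; the function and variable names (`frequency_from_counts`, `counts`, `times`) are hypothetical and do not correspond to any DAQmx API call.

```python
def frequency_from_counts(counts, times):
    """Estimate frequency between successive counter reads.

    counts: cumulative edge counts returned by each read
    times:  timestamps (in seconds) of each read
    """
    freqs = []
    for i in range(1, len(counts)):
        dt = times[i] - times[i - 1]    # actual loop period (s)
        dn = counts[i] - counts[i - 1]  # edges accumulated this iteration
        freqs.append(dn / dt)           # estimated frequency in Hz
    return freqs

# Example: an ideal 5 kHz signal read exactly every 1 ms accumulates
# 5 edges per iteration, so each entry comes out at about 5000 Hz.
counts = [0, 5, 10, 15]
times = [0.000, 0.001, 0.002, 0.003]
print(frequency_from_counts(counts, times))
```

Note that because the edge count is an integer while the loop period is not, any jitter in the loop period shows up as whole-count steps in the numerator, which is the source of the error discussed next.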
The timed loop should execute with a maximum period of 1 ms, but the actual period varies from iteration to iteration; in this specific example it ranged from 0.4 ms to 0.9 ms. Because the timed loop does not execute at a constant rate, the number of edges counted per iteration varies as well: with a 5 kHz input, an iteration lasting 0.4 ms accumulates about 2 edges, while one lasting 0.9 ms accumulates 4 or 5.

Since each iteration counts only 2 to 5 edges, an error of a single count has a large impact on the frequency calculation. To reduce this impact, simply accumulate more counts in each iteration, which can be done by reducing the rate at which the timed loop executes.
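The scaling argument above is easy to check numerically: a miscount of ±1 edge over a measurement period T corresponds to a frequency error of ±1/T, so lengthening the loop period directly shrinks the error. The helper name below is illustrative, not part of any library.

```python
def worst_case_error_hz(loop_period_s, miscount_edges=1):
    """Frequency error caused by miscounting a fixed number of edges
    over one loop period."""
    return miscount_edges / loop_period_s

# 1 ms loop period: a one-count error is 1000 Hz, which explains
# readings spread between 4 kHz and 6 kHz for a 5 kHz signal.
print(worst_case_error_hz(0.001))  # 1000.0

# 100 ms loop period: the same one-count error is only 10 Hz.
print(worst_case_error_hz(0.1))    # 10.0
```

This is why slowing the timed loop improves the measurement: the signal itself has not changed, but each count error is amortized over many more accumulated edges.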