Increasing Interchannel Delay on DAQ Devices for Longer Settling Times

Updated Apr 24, 2024

Reported In

Software

  • LabVIEW

Driver

  • NI-DAQmx
  • Traditional NI-DAQ

Programming Language

  • C
  • C# .NET

Issue Details

I would like to eliminate ghosting, leakage, noise, erroneous or unexpected voltages, or capacitance issues between DAQ channels. My application requires longer settling times than those automatically chosen by the NI-DAQmx or Traditional NI-DAQ (Legacy) driver. How do I increase the interchannel delay using the NI-DAQmx API or the Traditional NI-DAQ (Legacy) driver?

Solution

NI-DAQmx

  • Using the LabVIEW API

    You can manually set your convert (channel) clock using the DAQmx Timing property node. After placing the DAQmx Timing property node, left-click the property selector and select More>>AI Convert>>Rate. Then right-click the property and select Change to Write so that the property can be written.
    Note: If you are acquiring analog input with a CompactDAQ module, you must first set the active device to the module you are acquiring with. To do this, select the timing property More>>AI Convert>>Active Devices. Once you have selected the active device, you can expand the property node and select the More>>AI Convert>>Rate property.
 
Note: If you do not see the property listed but your device's user manual says it is supported, you can display it by following these steps: right-click the property node >> Select Filter... >> Show All Attributes.
 
  • Using the ANSI C API

    You can manually set your convert (channel) clock by calling: 
    int32 __CFUNC DAQmxSetAIConvRate(TaskHandle taskHandle, float64 data);
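    The call above can be placed after the task's sample clock is configured. The sketch below shows the idea; the device name Dev1, the channel list, and the rates are placeholder values, so substitute values valid for your hardware.

```c
/* Sketch: lengthen the interchannel delay by lowering the AI convert
 * clock rate with the NI-DAQmx ANSI C API. "Dev1", the channel list,
 * and the rates are placeholders for illustration only. */
#include <NIDAQmx.h>
#include <stdio.h>

int main(void)
{
    TaskHandle task = 0;
    int32 err = DAQmxCreateTask("", &task);

    /* Four voltage channels, +/-10 V. */
    if (err == 0)
        err = DAQmxCreateAIVoltageChan(task, "Dev1/ai0:3", "",
                                       DAQmx_Val_Cfg_Default,
                                       -10.0, 10.0, DAQmx_Val_Volts, NULL);

    /* Sample clock: 1 kS/s per channel, continuous acquisition. */
    if (err == 0)
        err = DAQmxCfgSampClkTiming(task, "", 1000.0, DAQmx_Val_Rising,
                                    DAQmx_Val_ContSamps, 1000);

    /* Lower the convert clock to 10 kHz, i.e. 100 us between successive
     * channel conversions. This still satisfies the rule
     * sampling rate <= convert rate / number of channels (1000 <= 10000/4). */
    if (err == 0)
        err = DAQmxSetAIConvRate(task, 10000.0);

    if (err < 0) {
        char msg[2048];
        DAQmxGetExtendedErrorInfo(msg, sizeof msg);
        fprintf(stderr, "DAQmx error: %s\n", msg);
    }
    if (task != 0)
        DAQmxClearTask(task);
    return err < 0;
}
```

    You can read the property back with DAQmxGetAIConvRate to confirm the value the driver actually coerced to.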
  • Using the .NET API

    You can manually set your convert (channel) clock by setting the AIConvertRate property of the NationalInstruments.DAQmx.Timing class.

Traditional NI-DAQ (Legacy)

  • Using the LabVIEW API

    The interchannel delay is specified in the AI Config VI for Traditional NI-DAQ. If left unwired, LabVIEW will scan the successive channels as fast as the board allows.
  • Using the ANSI C API

    You can manually set the interchannel delay by calling: 
    status = SCAN_Start (deviceNumber, buffer, count, sampTimebase, sampInterval, scanTimebase, scanInterval);

    The sampInterval parameter specifies the amount of time that elapses between successive A/D conversions within a scan sequence.
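    As a sketch, the call might look like the following. The device number, channel and gain vectors, and timebase codes are placeholder values; in particular, timebase code 1 is assumed here to select a 1 MHz (1 µs) timebase, which is the case on many E Series boards, so check your device documentation before using these numbers.

```c
/* Sketch: set a 100 us interchannel delay with the Traditional NI-DAQ
 * (Legacy) C API. All numeric values are illustrative placeholders. */
#include "nidaq.h"

#define NUM_CHANS 4
#define SCANS     1000

int main(void)
{
    i16 deviceNumber = 1;                       /* placeholder device */
    i16 chanVector[NUM_CHANS] = {0, 1, 2, 3};
    i16 gainVector[NUM_CHANS] = {1, 1, 1, 1};
    static i16 buffer[NUM_CHANS * SCANS];
    i16 status;

    status = SCAN_Setup(deviceNumber, NUM_CHANS, chanVector, gainVector);
    if (status == 0)
        /* sampTimebase = 1 (assumed 1 us ticks), sampInterval = 100
         *   -> 100 us between A/D conversions within each scan.
         * scanTimebase = 1, scanInterval = 1000 -> one scan every 1 ms. */
        status = SCAN_Start(deviceNumber, buffer, NUM_CHANS * SCANS,
                            1, 100, 1, 1000);
    return status != 0;
}
```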

Additional Information

In the NI-DAQmx driver (version 7.3 and later) and in the Traditional NI-DAQ (Legacy) driver, the default convert clock period (interchannel delay) is the period of the device's fastest sampling rate plus 10 µs. In some cases it is beneficial to decrease the AI convert clock rate further to allow additional settling time. Before you can reduce the AI convert clock rate, you may first need to reduce the sampling rate so that there is still time for each channel in your scan list. Follow this general rule:
AI Sampling Rate ≤ AI Convert Rate / Number of Channels