Minimum Voltage Change for Analog Triggers on a Digitizer

Updated Feb 1, 2018

Reported In


  • Oscilloscope Device
  • PXI-5112
  • PCI-5112
  • PXIe-5122
  • PCI-5122
  • PXI-5122



Issue Details

I'm setting up an analog trigger for my NI High-Speed Digitizer. What is the minimum voltage change that can be detected and used for triggering?


Solution

NI High-Speed Digitizers allow you to trigger off of one of the input channels or off of the TRIG input. The minimum voltage change required to trigger depends on the trigger source and can be determined from two values in the digitizer's specifications document.

For legacy digitizers, the minimum detectable trigger voltage change is calculated by multiplying the vertical range by the DC accuracy of the trigger channel. Page 1 of the NI PXI/PCI-5112 Specifications lists vertical input range values between 25 mV and 25 V, and page 3 gives a DC accuracy of 2.5% of full scale for Channels 0 and 1, or ±500 mV for the TRIG input.
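The legacy calculation above (vertical range × DC accuracy) can be sketched as a quick computation. This is a minimal illustration only; the function name is hypothetical and not part of any NI driver API:

```python
def min_trigger_change(vertical_range_v, dc_accuracy_pct):
    # Minimum detectable trigger voltage change for a legacy digitizer:
    # vertical range multiplied by the DC accuracy of the trigger channel.
    return vertical_range_v * dc_accuracy_pct / 100.0

# NI 5112 example from the specifications: 25 V vertical range, 2.5% DC accuracy
print(min_trigger_change(25.0, 2.5))  # 0.625 V
```

On the smallest 25 mV range, the same formula gives 0.625 mV, so the detectable change scales directly with the selected vertical range.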

Non-legacy digitizers have two relevant specifications: edge trigger sensitivity and level accuracy. Edge trigger sensitivity is the smallest voltage change the device can edge trigger on, while level accuracy specifies how close the actual trigger voltage will be to the level you set.

As seen on page 3 of the NI PXI/PXIe/PCI-5122 Specifications, the NI 5122 offers multiple range selections, and page 14 lists an edge trigger sensitivity of 2.5% up to 50 MHz for CH 0 and CH 1. If the trigger source is CH 0 and the full-scale voltage range is 1 V peak-to-peak, then the analog trigger sensitivity is 2.5% × 1 V = 0.025 V. This means that if we set the trigger level to 0 V, the card may actually trigger at any voltage between -0.025 V and +0.025 V.
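The trigger window in the example above can be computed the same way. Again a minimal sketch; `trigger_window` is a hypothetical helper, not an NI-SCOPE function:

```python
def trigger_window(level_v, full_scale_vpp, sensitivity_pct):
    # The device may fire anywhere within +/- (sensitivity x full scale)
    # of the programmed trigger level.
    band = full_scale_vpp * sensitivity_pct / 100.0
    return (level_v - band, level_v + band)

# NI 5122 example: 0 V level, 1 V peak-to-peak range, 2.5% edge trigger sensitivity
print(trigger_window(0.0, 1.0, 2.5))  # (-0.025, 0.025)
```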

Additional Information

This information is available in the Datasheet & Specifications document on the product page for each digitizer.
