Distortion in Time and Frequency Domain Data in NI RF Devices

Updated Oct 5, 2024

Reported In

Hardware

  • USRP Software Defined Radio Device
  • PXI Vector Signal Analyzer
  • PXI Vector Signal Transceiver

Issue Details

The time- and frequency-domain plots of the data I receive with my NI RF device appear distorted. How can I resolve this issue?
 

Solution

The most common cause of this kind of distortion is clipping. Clipping occurs when the amplitude of the input signal exceeds the acceptable input range of the RF analog front end and/or the ADC. As a result, the time-domain data is capped at a fixed value. In the frequency domain, the effect appears as spectral spreading around the fundamental frequency as well as high-power harmonics.
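A quick way to confirm clipping is to check how many received samples sit at or near full scale before any further analysis. The following is a minimal sketch, assuming the received I/Q samples are available as a NumPy array normalized so that full scale corresponds to a magnitude of 1.0; the threshold and ratio values are illustrative, not NI specifications.

import numpy as np

def looks_clipped(iq, full_scale=1.0, threshold=0.99, max_ratio=0.001):
    """Return True if an unusually large share of samples sit at or above full scale."""
    near_full_scale = np.abs(iq) >= threshold * full_scale
    return np.mean(near_full_scale) > max_ratio

# Example: a tone well within full scale versus one pinned at full scale.
t = np.arange(10_000)
clean = 0.5 * np.exp(2j * np.pi * 0.01 * t)              # within range
overdriven = 1.5 * np.exp(2j * np.pi * 0.01 * t)
clipped = (np.clip(overdriven.real, -1.0, 1.0)
           + 1j * np.clip(overdriven.imag, -1.0, 1.0))   # capped, like an overdriven ADC
print(looks_clipped(clean))    # False
print(looks_clipped(clipped))  # True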
 
The RF analog gain plays an important role in avoiding clipping. To set the correct gain, you need to know the expected power of the input signal. The following are suitable ways to avoid this kind of distortion:
  1. Reduce the power of the input signal, if possible.
  2. Reduce the RF analog gain (see the sketch after this list).
  3. Use an external attenuator with the correct power rating.
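For USRP devices, the RX gain is set explicitly by the user through the UHD driver. The following is a minimal sketch of lowering the RX gain with the UHD Python API; the device address, channel, and gain value are illustrative and assume the UHD Python bindings (the uhd module) are installed.

import uhd

# Minimal sketch (USRP only): lower the RX analog gain to keep the input within range.
# The address, channel, and gain value are illustrative; choose the gain from the
# expected input power at the antenna port.
usrp = uhd.usrp.MultiUSRP("addr=192.168.10.2")
channel = 0
usrp.set_rx_gain(10.0, channel)   # dB
print("RX gain is now", usrp.get_rx_gain(channel), "dB")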
There can be several other reasons for distortion in RF devices. Some of these are covered in the KnowledgeBase article discussing drift in constellations.

 

Additional Information

Please note that very high input power may also damage the RF signal path of your device.
In calibrated NI RF products (e.g., VSA, VST), the driver adjusts the gain automatically based on the ‘Reference Level’ set by the user. In USRP products, however, the user needs to set the gain directly. The available gain range may vary with frequency.
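For USRP devices, the usable gain range can be checked programmatically after tuning. Below is a short sketch using the UHD Python API; the device address and center frequency are illustrative assumptions.

import uhd

# Minimal sketch (USRP only): query the available RX gain range after tuning,
# since the usable range can differ across frequencies. The address and frequency
# below are illustrative.
usrp = uhd.usrp.MultiUSRP("addr=192.168.10.2")
channel = 0
usrp.set_rx_freq(uhd.types.TuneRequest(2.4e9), channel)
gain_range = usrp.get_rx_gain_range(channel)
print("RX gain range: %.1f dB to %.1f dB" % (gain_range.start(), gain_range.stop()))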