Degradation in Phase Synchronization Performance at Lower Input Power Levels in NI USRP 2945

Updated Apr 28, 2023

Reported In

Hardware

  • USRP-2945
  • USRP-2955

Issue Details

In my multichannel NI USRP-2945 phase-synchronized setup (shared LO, reference clock, and start trigger), the phase performance is good when the input power is high (> -20 dBm). However, as the input power or pre-amplification gain decreases, the phase performance degrades, resulting in an increased standard deviation of the phase and amplitude deltas between the input channels. What could be the reason, and how can I improve the performance at lower power levels?
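
For reference, the channel-to-channel metrics described above can be computed from complex baseband captures roughly as in the following sketch. This is not NI driver code; the NumPy usage and the file names channel0_iq.npy and channel1_iq.npy are assumptions made for illustration.

    import numpy as np

    # Hypothetical complex baseband captures from two phase-synchronized channels;
    # both arrays are assumed to have the same length and time alignment.
    ch0 = np.load("channel0_iq.npy")
    ch1 = np.load("channel1_iq.npy")

    # Per-sample delta between the two channels
    delta = ch0 * np.conj(ch1)

    phase_delta_deg = np.degrees(np.angle(delta))
    amplitude_delta_db = 20 * np.log10(np.abs(ch0) / np.abs(ch1))

    print("Phase delta std (deg):", np.std(phase_delta_deg))
    print("Amplitude delta std (dB):", np.std(amplitude_delta_db))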
 

Solution

The phase performance is affected by several factors, including the signal-to-noise ratio (SNR), LO leakage, and in-band spurs. At the center frequency, USRP devices are known to exhibit significant LO leakage. Because the amplitude and phase of the LO leakage vary considerably, the leakage adds to the measured phase and amplitude of the signal as well. However, this impact only becomes significant when the input signal power becomes comparable to the LO leakage power.
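
The following minimal simulation sketch (not NI code) illustrates this behavior: a tone at the center frequency appears at DC in the baseband, where an additive LO-leakage term with random phase also sits. The leakage amplitude and the relative signal power levels below are assumed values chosen only to show the trend.

    import numpy as np

    rng = np.random.default_rng(0)
    trials = 10_000
    leak_amp = 0.01                      # assumed LO-leakage amplitude (linear)

    for sig_db_rel in (0, -20, -40):     # signal power relative to an arbitrary reference
        sig_amp = 10 ** (sig_db_rel / 20)
        # LO leakage modeled as an additive DC term with random phase per capture
        leak = leak_amp * np.exp(1j * rng.uniform(0, 2 * np.pi, trials))
        measured = sig_amp + leak        # tone at DC plus additive leakage
        phase_err_deg = np.degrees(np.angle(measured))
        print(sig_db_rel, "dB ->", np.std(phase_err_deg), "deg phase std")

As the signal power approaches the leakage power, the standard deviation of the measured phase grows sharply, which matches the degradation observed at low input power.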
To mitigate the degradation caused by LO leakage (or any other spur), the contaminant has to be moved out of the band of interest. A typical way of doing this is a digital frequency shift followed by digital filtering and decimation, as sketched below. The NI USRP RIO architecture already contains both of these IP blocks in its FPGA code. Refer to the article related to LO leakage suppression for more details.
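
As a host-side illustration of the same idea (the actual USRP RIO implementation performs the shift and decimation in the FPGA), here is a minimal sketch assuming NumPy/SciPy, a hypothetical capture file, and arbitrary sample-rate and offset values.

    import numpy as np
    from scipy.signal import firwin, lfilter

    fs = 10e6          # assumed sample rate (Hz)
    f_offset = 2e6     # assumed offset between the RF LO and the signal of interest (Hz)

    # Hypothetical capture: the signal of interest sits at +f_offset in the
    # digitized baseband, while the LO leakage sits at DC (0 Hz).
    x = np.load("channel0_iq.npy")

    # 1) Digital frequency shift: move the signal of interest to DC;
    #    the LO leakage moves to -f_offset.
    n = np.arange(len(x))
    x_shifted = x * np.exp(-2j * np.pi * f_offset / fs * n)

    # 2) Low-pass filter and decimate by 4; the leakage at -f_offset now
    #    falls outside the +/- fs/8 pass-band and is suppressed.
    taps = firwin(129, cutoff=fs / 8, fs=fs)
    y = lfilter(taps, 1.0, x_shifted)[::4]

The key design choice is that the RF LO is tuned away from the signal of interest by f_offset, so the leakage never overlaps the signal band and the decimation filter can remove it.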
 

Additional Information

Note that suppressing the LO leakage by moving it out of the band reduces the maximum usable bandwidth of the device, as explained in the above-referenced article.