Effects of Radar Interference on LTE Base Station Receiver Performance

Abstract: In response to proposals to introduce new radio systems into the 3550–3650 MHz radio spectrum in the United States, the authors have performed measurements and analysis of the effects of interference from a variety of radar waveforms on the performance of a Long Term Evolution (LTE) base station receiver. This work has been prompted by the possibility that LTE base station receivers may eventually share spectrum with radar operations in this spectrum range. The base station receiver that was tested operated in time division duplex (TDD) mode. Radar pulse parameters used in this testing spanned the range of both existing and anticipated future radar systems in the 3100–3650 MHz spectrum range. LTE base station receiver data throughput rates, block error rates (BLER), and internal noise levels have been measured as functions of radar pulse parameters and of the incident power level of radar pulses at the base station receiver. The authors do not determine the acceptability of radar interference effects on LTE base station performance. Rather, these data are presented for the use of spectrum managers and engineers, who can use this information as a building block in the construction of frequency-and-distance separation curves for radar transmitters and LTE base station receivers, supporting possible future spectrum sharing at 3.5 GHz.

Note: This report was reissued in May 2014 to correct the duty cycles of four radar interference waveforms that were misstated in the original version of this report. The error was due to a mistake in the equations on page 8, now corrected, in which a pulse repetition rate (PRR) variable was used instead of a pulse repetition interval (PRI) variable. The waveforms’ pulse widths, pulse repetition rates, and chirp bandwidths were correctly reported.
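The distinction behind the correction above is that PRI is the reciprocal of PRR, so duty cycle equals pulse width divided by PRI (equivalently, pulse width multiplied by PRR); substituting PRR where PRI belongs scales the result by the square of the repetition rate. A minimal sketch of the corrected relationship, using illustrative pulse parameters that are not taken from the report:

```python
def duty_cycle(pulse_width_s: float, prr_hz: float) -> float:
    """Duty cycle = pulse width / PRI, where PRI = 1 / PRR.

    pulse_width_s: radar pulse width in seconds
    prr_hz: pulse repetition rate in pulses per second
    """
    pri_s = 1.0 / prr_hz           # pulse repetition interval (s)
    return pulse_width_s / pri_s   # equivalently: pulse_width_s * prr_hz

# Illustrative example: a 1-microsecond pulse at 1000 pulses per second
# has a PRI of 1 ms and therefore a duty cycle of 0.001 (0.1%).
print(duty_cycle(1e-6, 1000.0))
```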