Due to the proliferation of wireless communications systems using LTE, LTE-A and, soon, 5G, the number of assigned frequency bands between 400 MHz and 6 GHz has increased dramatically. In 2011, the 3GPP standards body had defined 11 bands; there are now more than 55 assignments for worldwide deployment. Of particular concern is the coexistence of these wireless services with UHF, L-, S- and C-Band radar and geolocation services (see Tables 1 and 2). While the topic of coexistence is not new, this article focuses on unresolved issues and the potential impact to sensitive radar receivers, demonstrating a proposed methodology for assessing cooperative radar receivers.

DEFINING COEXISTENCE

Numerous studies and ITU-R recommendations have focused on the impact of radar transmissions on the receivers in wireless communications systems. These studies have resulted in measurement procedures and recommended practices for predicting mitigation distances between the systems. This has been enabled by an accepted methodology to measure the power of the radar (ITU-R M.1177-4)1 and the 3GPP technical and test specifications2,3 for the minimum acceptable performance of the wireless base station and user equipment. The receivers in radar and wireless communications systems have approximately the same sensitivity, −115 dBm. By means of the 3GPP standards, base station receivers are designed to physically coexist on the same antenna tower, a few feet apart, with minimal frequency separation. Since the power of the radar can be orders of magnitude higher than the typical 40 W base station carrier, it would seem sensible to focus only on the impact of the radar transmission on the “victim” wireless base station receiver.

Figure 1 Spectrum components from wireless communications and ATC radar receiver.

However, the performance of radar receivers is not subject to international or commercially assessed requirements, and the lack of standard performance profiles limits the availability of data demonstrating the impact on the radar receiver from the transmissions of a wireless communications system. While not the focus of this article, it should be noted that the recent Radio Equipment Directive (RED) in the European Union, Article 3.2, now requires the radio receivers of global navigation satellite systems (GNSS) to be tested for interference from terrestrial broadcast systems. Per EN 303 413, minimum performance requirements for GNSS receivers will enable the licensing of wireless communications systems in bands adjacent to services like GPS. This should serve as a proof point that the lack of standards on radar receivers will not prevent sovereign nations from licensing spectrum in bands adjacent to radar infrastructure.

The few standards on commercial radar receiver performance have tended to focus on coexistence and interoperability with similar systems. There are important examples, such as automotive and maritime radar interoperability, intended to prevent degraded performance when two radars operate in close proximity. The IEC 62388 international standard, “Maritime navigation and radiocommunication equipment and systems,” defines such test methods and performance requirements for commercial S-Band and X-Band systems. Referring to Figure 1, when one considers the coexistence of the wireless systems in the frequency bands shown in Table 1 (Bands 7 and 69) with the target radar systems from Table 2 (S-Band air traffic control (ATC) radar), test cases have not been defined, nor are there required performance standards for the radar systems. Further, the radar systems pre-date the wireless communications standards by decades.

Figure 2 Radar echo generator.

The measure of the radar’s immunity to interference is defined as the frequency dependent rejection (FDR).4,5 The FDR is determined by the receiver IF selectivity and is a function of the performance of the low noise amplifier (LNA) and the noise power through the down-conversion, filtering and signal processing. In a radar receiver, the two main interference parameters influencing receiver sensitivity are blocking and selectivity. Blocking is the measure of gain compression in the front-end LNA, caused by a strong signal driving the LNA into nonlinear operation. Selectivity is the measure of the noise introduced into the receiver front-end while it is not in compression, which reduces the signal-to-noise ratio (SNR) of the receiver.
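
Expressed compactly, following the definition in ITU-R SM.337 (Reference 5), the FDR combines these effects as a ratio of interferer power before and after the receiver selectivity:

```latex
\mathrm{FDR}(\Delta f) = 10 \log_{10}
  \frac{\displaystyle \int_{0}^{\infty} P(f)\, df}
       {\displaystyle \int_{0}^{\infty} P(f)\, H(f + \Delta f)\, df}
```

where P(f) is the power spectral density of the interfering signal, H(f) is the receiver’s power transfer (selectivity) response and Δf is the frequency offset between the interferer and the radar’s tuned frequency.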

In the interest of developing a standard method to assess the coexistence of radar and wireless communications systems, a CW tone can be used to represent the blocking signal. The CW source should be able to generate high power with low phase noise and low harmonics, so unintentional artifacts of the signal generator do not influence the test results. For a selectivity test, a noise-like signal is required. Since the challenge of coexistence in this case is primarily the mix of cellular and radar signals, the noise-like signal used to assess selectivity performance can be an LTE test model signal.2,3 To study the performance of the radar receiver in the presence of an LTE network, a standard method of assessing the FDR of the radar, covering both blocking and selectivity behavior, needs to be defined. A cooperative radar system is a radar whose service duty will not be impaired by performing the testing assessment; during the test, the radar can be in a decommissioned state, on a test range under emulated conditions or otherwise operating. While not expected to be in service during the test, the radar should be fully functioning to allow observation of its performance.
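
As an illustration of the two stimulus types, the following sketch generates a CW blocking tone and a band-limited noise stand-in at complex baseband. This is a minimal model with an assumed sample rate and unit convention; a conformance-grade assessment would use a calibrated signal generator producing a standard test model waveform.2,3

```python
import numpy as np

FS = 122.88e6  # baseband sample rate, Hz (assumed for this sketch)

def dbm_to_w(p_dbm):
    return 1e-3 * 10 ** (p_dbm / 10)

def cw_blocker(f_offset_hz, p_dbm, n):
    """CW blocking tone at an offset from the radar's tuned frequency,
    scaled so the mean of |s|^2 equals the requested power in watts."""
    t = np.arange(n) / FS
    return np.sqrt(dbm_to_w(p_dbm)) * np.exp(2j * np.pi * f_offset_hz * t)

def noise_interferer(bw_hz, f_offset_hz, p_dbm, n):
    """Band-limited Gaussian noise as a stand-in for an LTE test model."""
    spectrum = np.fft.fft(np.random.randn(n) + 1j * np.random.randn(n))
    freqs = np.fft.fftfreq(n, 1 / FS)
    spectrum[np.abs(freqs - f_offset_hz) > bw_hz / 2] = 0.0  # confine to channel
    sig = np.fft.ifft(spectrum)
    return sig * np.sqrt(dbm_to_w(p_dbm) / np.mean(np.abs(sig) ** 2))
```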

Figure 3 Example radar PPI and range scope displays showing cascading echoes.

ASSESSMENT METHODOLOGY

The functional performance of a cooperative radar should be assessed over-the-air (OTA) or in a test chamber; this is important to assure that all components contributing to radar performance, including the antenna and LNA, are part of the measured system. Where allowed by law, OTA testing in situ provides the most realistic results. While the most common tool for assessing the functional performance of a radar is a single dihedral corner reflector or an array of reflectors at fixed locations, this method is less capable than test tools that provide a number of scaled-amplitude, delayed echoes. Common tools with the ability to regenerate scaled echoes in an OTA RF environment include digital radio frequency memory (DRFM) and radar echo generators (REG), as shown in Figure 2. Utilizing digital delay taps, these tools deliver a controlled series of radar echoes, i.e., the transmitted radar signal delayed in time and attenuated to various levels representing radar cross sections. This is important for assessing basic radar receiver capabilities such as delay time (range), signal amplitude (resolution) and even the Doppler rate of an echo. In a test lab, while it is common practice to test functioning radars with fiber optic delay lines (FODL) or coaxial delay lines (CDL), these may not have the flexibility to create multiple targets at different delays and attenuation levels. Further, they may bypass critical RF components such as the antenna and LNA, which can skew the results.
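
The delay-tap principle behind a DRFM or REG reduces to a few lines of signal processing. The sketch below is illustrative only, with assumed names and parameters; a real REG implements this in real-time hardware with calibrated delay, attenuation and Doppler shift:

```python
import numpy as np

C = 3e8  # propagation speed, m/s

def generate_echoes(tx_pulse, fs, targets):
    """Return a sum of scaled, delayed, Doppler-shifted copies of the
    captured radar pulse; targets is a list of
    (range_m, attenuation_db, doppler_hz) tuples."""
    max_delay = int(round(2 * max(r for r, _, _ in targets) / C * fs))
    out = np.zeros(len(tx_pulse) + max_delay, dtype=complex)
    t = np.arange(len(tx_pulse)) / fs
    for range_m, atten_db, doppler_hz in targets:
        delay = int(round(2 * range_m / C * fs))   # two-way propagation delay
        scale = 10 ** (-atten_db / 20)             # amplitude representing RCS
        echo = scale * tx_pulse * np.exp(2j * np.pi * doppler_hz * t)
        out[delay:delay + len(tx_pulse)] += echo
    return out

# Three cascading echoes, as in the Figure 3 demonstration (values illustrative):
# rx = generate_echoes(pulse, 100e6, [(2e3, 40, 0), (4e3, 46, 0), (6e3, 52, 0)])
```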

The test method and results in this article use a REG constructed from commercially available, metrology grade instruments.6,7 With this added functionality, the REG can also create the RF interference signals required for testing, including CW, LTE or even arbitrary waveform signals. The baseline performance level for a selected mode of operation can be set with the REG to approximate a range of echo returns reliably detected by the radar system. The level and number of returns will depend on the quantitative thoroughness desired for the assessment. The baseline performance of the radar should be observed through a user interface that represents the actual operation expected by the end-user.

Figure 4 Asymmetric spectrum of a radar near the upper limit of its tuning range.

In the example shown in Figure 3, three echo returns are shown in the user interface. For this demonstration, the REG is connected to an RF input port while the radar is in scan mode, so each radar echo appears as a concentric circle on the user display. While the levels of the echo returns are not enumerated in the user interface, a development mode display of amplitude versus range is shown in the upper right corner of the figure. As the interference signals are introduced, the radar receiver becomes impaired due to LNA compression (blocking) or increased noise into the IF (selectivity), and the number of echoes seen by the user decreases. This decrease is the measure used to determine the susceptibility of the radar.

Some important considerations for the process and procedure for testing radar susceptibility are:

Occupied Channel—Using ITU-R M.1177-4,1 it is necessary to determine the occupied bandwidth of the signal using the approximate 40 dB spectrum (e.g., ~10 MHz). The occupied channel can vary depending on the mode of operation (i.e., PPI rate) and tuning frequency. The example in Figure 4 shows the asymmetric transmit mask of a radar at the upper end of its tuning range. Since the performance results are assessed against the fractional bandwidth of the carrier frequency, translating the occupied channel to fractional bandwidth contributes important guidelines for policy.
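
Given a measured spectrum trace, reading the occupied channel and translating it to fractional bandwidth reduces to a simple calculation. The sketch below is a simplified reading of the approximate 40 dB points, not a substitute for the full procedure of Reference 1; the function names are illustrative:

```python
import numpy as np

def occupied_bandwidth_40db(freqs_hz, psd_db):
    """Width between the outermost spectrum bins within 40 dB of the peak."""
    above = np.where(psd_db >= psd_db.max() - 40.0)[0]
    return freqs_hz[above[-1]] - freqs_hz[above[0]]

def fractional_bandwidth_pct(occupied_hz, carrier_hz):
    """Translate an occupied channel to fractional bandwidth in percent."""
    return 100.0 * occupied_hz / carrier_hz

# e.g., a ~10 MHz occupied channel at a 2.7 GHz carrier is ~0.37 percent
```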

Figure 5 Representative performance of a 3GPP base station receiver at 2 GHz.

For example, in 3GPP, a 500 MHz system with an LTE signal having an occupied channel of 5 MHz has a signal that occupies 1 percent fractional bandwidth. Per the 3GPP standard, the expected blocking performance at a 20 MHz offset is 100 dB, representing a 4 percent fractional bandwidth offset. The same 5 MHz communications system operating at 5 GHz occupies 0.1 percent fractional bandwidth. While 3GPP requires the same blocking performance at a 20 MHz offset, filter requirements and system receiver performance must now deliver 100 dB blocking performance at a 0.4 percent fractional bandwidth offset. Figure 5 shows the representative selectivity and blocking performance of an LTE base station receiver at a 2 GHz nominal center frequency.
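
The fractional bandwidth arithmetic behind these numbers reduces to a single ratio, as this minimal check shows:

```python
def frac_pct(hz, carrier_hz):
    return 100.0 * hz / carrier_hz

frac_pct(5e6, 500e6)    # 1.0: 5 MHz channel at a 500 MHz carrier
frac_pct(20e6, 500e6)   # 4.0: 20 MHz blocking offset at 500 MHz
frac_pct(5e6, 5e9)      # 0.1: the same channel at 5 GHz
frac_pct(20e6, 5e9)     # 0.4: the same blocking offset at 5 GHz
```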

FDR—The FDR is the measure of the rejection of an unwanted emission produced by the receiver’s selectivity. The FDR has two important components: the on-tune rejection (OTR) and the off-frequency rejection (OFR). Within the OFR, there are two components, OFRinband and OFRoutband, which categorize the performance within the full tuning and operating range of the radar.

The OTR reflects occupied channel testing at the tuned center frequency. Some radars have the ability to reject non-coherent and CW interference signals; testing the OTR helps separate the RF performance of the radar front-end from the processing gain of the digital signal processor (DSP) in the radar. The OFR will show different performance, based on the cascaded filter and amplifier receiver chain, as referenced in Figure 3. Therefore, it is necessary to provide enough test points to determine the behavior of the receiver considering these elements. These terms separate into OFRinband and OFRoutband to distinguish how these assessments might be determined.
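
In the notation of Reference 5, these terms combine as

```latex
\mathrm{FDR}(\Delta f) = \mathrm{OTR} + \mathrm{OFR}(\Delta f)
```

where the OTR is the rejection with the interferer on the tuned frequency (Δf = 0) and the OFR is the additional rejection produced by the frequency offset.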

Figure 6 Radar selectivity with increasing interference, from (a) to (c), at a fixed frequency offset.

RESULTS

Figure 7 Selectivity vs. offset frequency with −50 dBm amplitude.

To exercise the test methodology, a maritime radar provides the test case for this assessment. The results for selectivity demonstrate the frequency and amplitude offsets of the radar’s FDR. Again, for the purposes of demonstration, the REG is connected directly to the RF input port while the radar is set to scan mode. Figure 6a represents the baseline performance, with three cascading echoes delayed in time and near the sensitivity of the radar. Judging the SNR would be rather subjective if the echoes represented just a blip on the screen; with the REG connected directly to the RF input port, as in the prior example, the echoes appear as full rings. In Figures 6b and 6c, an interference signal is coupled to the radar echo return, offset in frequency and increasing in amplitude. The decreasing SNR due to the interference appears as an increase in the baseline noise. In Figure 6c, while the echoes are still visible in the development mode amplitude versus time display, a user would need to adjust the noise level of the radar to discern any objects on the display.

To provide a reference of the radar’s FDR, a set of tests was conducted to plot the selectivity versus offset frequency at a fixed amplitude of −50 dBm (see Figure 7) and the selectivity versus amplitude at a fixed frequency offset (see Figure 8). The results are expressed as the fractional bandwidth offset from the center frequency and the interference level relative to the receiver sensitivity. While typical radars have a receiver sensitivity between −90 and −120 dBm, the actual receiver sensitivity of the radar used in this test is proprietary. With coupling losses accounted for, the level of the interference signal was −50 dBm at the receiver input port. Without disclosing the additional coupling loss to the radar receiver, this represents a level approximately 50 dB above the input sensitivity. Three targets at range bins 270, 287 and 302 are shown in the “reference” measurement, where no interference was present. This shows that even at a modest interference level of −50 dBm at the receiver input, with a frequency offset between 2 and 3 percent fractional bandwidth, the echoes will not have enough SNR to be detected by the radar.

Figure 8 Selectivity vs. amplitude with a fixed frequency offset (−3.733 percent fractional bandwidth).

Comparing these results to the standard performance of a wireless base station (see Figure 5), the base station can reject a +63 dB signal at a 0.25 percent fractional bandwidth offset. Clearly, the selectivity of the radar is far more sensitive, and at much greater frequency offsets. This affects the frequency allocation guard band between the radar and wireless services.

Using the values in Figure 7 and calculating the free space loss, the potential impact on a victim radar can be assessed. Assuming a cellular base station in Band 41 (see Table 1) transmitting 40 W (+46 dBm), the signal would experience a free space attenuation of approximately 116 dB at a distance of 6 km. A possible Band 41 downlink signal at 2690 MHz represents a −0.37 percent offset for a radar with a center frequency of 2.7 GHz. Knowing the FDR behavior of the victim radar, a 3 percent fractional bandwidth guard would dictate that the radar not be operated at a frequency below 2780 MHz at this 6 km distance.
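
These numbers follow directly from the Friis free space relation; a minimal sketch that reproduces them (assuming isotropic antennas and ignoring clutter, cable and antenna gains):

```python
import math

def fspl_db(distance_m, freq_hz):
    """Free space path loss: 20*log10(4*pi*d/lambda)."""
    wavelength = 3e8 / freq_hz
    return 20 * math.log10(4 * math.pi * distance_m / wavelength)

p_tx_dbm = 46.0                                 # 40 W Band 41 carrier
loss_db = fspl_db(6e3, 2.69e9)                  # ~116.6 dB at 6 km
p_at_radar_dbm = p_tx_dbm - loss_db             # ~-70.6 dBm at the victim radar
offset_pct = 100 * (2690.0 - 2700.0) / 2700.0   # ~-0.37 percent fractional offset
```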

The test of selectivity versus amplitude provides guidance on the physical separation distance required for coexistence at a defined offset frequency. Because of the selectivity versus frequency performance, the test for selectivity versus amplitude uses a large frequency offset (−3.733 percent fractional bandwidth). Figure 8 shows the results as the interference power is increased from −50 to −26 dBm at the radar receiver port. While these values do not directly relate to the absolute sensitivity, they are roughly 50 and 74 dB above the receiver’s sensitivity.
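
Inverting the same free space relation turns a tolerable interference level into a rough mitigation distance. The thresholds in this sketch are purely illustrative, since the radar’s absolute sensitivity is proprietary:

```python
import math

def mitigation_distance_m(p_tx_dbm, p_max_dbm, freq_hz):
    """Distance at which free space loss attenuates the transmit power
    down to the maximum tolerable interference level at a given offset."""
    required_loss_db = p_tx_dbm - p_max_dbm
    wavelength = 3e8 / freq_hz
    return (wavelength / (4 * math.pi)) * 10 ** (required_loss_db / 20)

# Illustrative only: for a 40 W (+46 dBm) carrier at 2.69 GHz, tolerating
# -26 dBm at the receiver port implies ~35 m of separation; tolerating
# -50 dBm pushes the required distance to ~560 m.
print(mitigation_distance_m(46.0, -26.0, 2.69e9))
print(mitigation_distance_m(46.0, -50.0, 2.69e9))
```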

Since the sensitivities of the radar receiver and the wireless base station receiver are very similar, comparing these results with Figure 5 reinforces the need for additional frequency guard band or stronger guidelines on mitigation distances for mobile services.

CONCLUSION

LTE base stations and other fixed communications systems are designed for co-siting and coexistence, and substantial blocking and selectivity immunity in the base station receiver has been specified by the 3GPP. When the transmit mask of the LTE base station (i.e., the adjacent channel leakage ratio, ACLR) and the receiver performance are compared, there is relative reciprocity between the out-of-band emissions and the blocking and selectivity performance.

The radar transmit mask and the radar receiver FDR curves demonstrate a substantial difference in out-of-band behavior. While the radar emission mask shows a sharp and substantial roll-off in out-of-channel emissions (see Figure 4), the radar receiver used in this study clearly has an FDR that makes it highly susceptible to interference from a wireless network at a close-in frequency. It is clear that frequency guard bands and separation distances for radar and wireless systems need to consider the impact to the radar receiver. A standard methodology and approach will enable a baseline performance measurement, so these issues receive the attention needed to determine appropriate guidelines for frequency allocations.

References

  1. ITU-R Recommendation M.1177-4, Techniques for Measurement of Unwanted Emissions of Radar Systems, Switzerland, 2004.
  2. 3GPP TS 25.141 V14.1.0, “3rd Generation Partnership Project; Technical Specification Group Radio Access Network; Base Station (BS) Conformance Testing (FDD), (Release 14),” Test Specification, Valbonne, France, December 2016.
  3. 3GPP TS 25.104 V14.1.0, “3rd Generation Partnership Project; Technical Specification Group Radio Access Network; Base Station (BS) Radio Transmission and Reception (FDD), (Release 14),” Technical Specification, Valbonne, France, December 2016.
  4. ITU-R Recommendation M.1461, Procedures for Determining the Potential for Interference Between Radars Operating in the Radiodetermination Service and Systems in Other Services, Switzerland, 2000.
  5. ITU-R Recommendation SM.337-6, Frequency and Distance Separations, Switzerland, 2008.
  6. S. Heuel, “Real-Time Radar Target Generator,” Application Note 1MA256_0e, Rohde & Schwarz, Germany, November 2014.
  7. K. Shibli and S. Heuel, “Radar Echo Generator,” Application Note 1MA283_0e, Rohde & Schwarz, Germany, August 2016.