From Wi-Fi devices to LTE handsets, today’s wireless devices require several key measurements to characterize the receiver’s ability to demodulate an incoming signal from a base station or access point without errors. Although many engineers are familiar with the notion of a sensitivity measurement (historically, the lowest signal level that can be received), it is just one of several measurements required to characterize receiver performance. In practice, engineers rely on six key measurements to evaluate receiver performance across a wide range of operating conditions.
Although this article describes these six receiver measurements in the context of an LTE receiver, the concepts apply to any wireless receiver; the 802.11 specifications define similar measurements for wireless LAN radios.
LTE receiver metrics, shown in Table 1, are defined in section 7 of the 3GPP TS 36.521 specifications. Although Table 1 actually lists seven measurements, spurious response is functionally similar to blocking characteristics, so they’ll be combined in this article.
When testing LTE receiver performance, the primary figure of merit is receiver throughput. Each of the measurements listed in Table 1 defines conditions under which the device must achieve at least 95 percent of its maximum throughput.
Reference Sensitivity Level
The reference sensitivity level describes the receiver’s ability to operate under low signal power conditions. Unlike GSM and W-CDMA receivers, which often use bit error rate (BER) to define sensitivity requirements, LTE defines minimum performance in terms of throughput. Sensitivity is therefore defined as the lowest average power level at which the receiver can achieve 95 percent of the maximum throughput when using QPSK modulation.
The hardware configuration for LTE reference sensitivity level requires a vector signal generator (VSG) directly connected to the receiver. It is common to use a fixed attenuator between the instrument and the device under test (DUT) or user equipment (UE) to improve the impedance match. As illustrated in Figure 1, the VSG produces an LTE downlink signal and the receiver reports its throughput through a digital interface.
The power level at which an LTE receiver must meet the required throughput varies according to the E-UTRA frequency band and configured channel bandwidth. These conditions are defined in Table 7.3.3-1 of the 3GPP TS 36.521 specifications. The power levels range from -106.2 dBm for Bands 35 and 36 in a 1.4 MHz bandwidth to -90 dBm for Band 20 in a 20 MHz bandwidth.
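To make the procedure concrete, the following Python sketch shows one way a sensitivity search might be automated. The set_downlink_power() and measure_throughput() helpers are stand-ins rather than calls from any particular instrument driver; here they simulate a DUT whose throughput collapses below -97 dBm, while in a real system they would wrap the VSG driver and the UE’s digital test interface.

    # Illustrative sensitivity search. The two helpers below simulate a
    # DUT whose throughput collapses below -97 dBm; in a real system they
    # would wrap the VSG driver and the UE's digital test interface.

    _current_level = 0.0

    def set_downlink_power(dbm):
        global _current_level
        _current_level = dbm  # a real implementation would program the VSG

    def measure_throughput():
        # Simulated DUT: full throughput above -97 dBm, none below.
        return 1.0 if _current_level >= -97.0 else 0.0

    def find_sensitivity(start_dbm=-80.0, floor_dbm=-110.0, step_db=0.5):
        """Reduce the downlink power until throughput drops below 95
        percent of maximum; the last passing level is the sensitivity."""
        level, last_pass = start_dbm, None
        while level >= floor_dbm:
            set_downlink_power(level)
            if measure_throughput() >= 0.95:
                last_pass = level   # still meeting 95 percent; keep going down
                level -= step_db
            else:
                break               # first failing level ends the search
        return last_pass

    print(find_sensitivity())  # -97.0 dBm with the simulated DUT above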
Maximum Input Level
A second key metric of LTE receiver performance is maximum input level. Where the reference sensitivity level characterizes operation at the lowest signal levels, the maximum input level characterizes the receiver’s ability to meet the minimum throughput requirements at large signal levels.
Receiver performance at relatively high power levels is primarily determined by the linearity of the front end, which is usually dominated by components such as the first low-noise amplifier (LNA) in the receive chain. The minimum conformance standards for maximum input level require that the receiver achieve at least 95 percent of the maximum throughput in the presence of signal powers up to -25 dBm for all bands and all channel bandwidths.
The test configuration for maximum input level is almost identical to reference sensitivity level. One minor difference is that it is not necessary to use substantial attenuation between the RF signal generator and the DUT when testing maximum input level. Instead, the signal generator is either connected directly to the DUT or through a small attenuator used for impedance matching.
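Assuming the same hypothetical instrument hooks as the sensitivity sketch above, the maximum input level test reduces to a single pass/fail check at -25 dBm:

    # Pass/fail check at the maximum input level, reusing the
    # set_downlink_power() and measure_throughput() stand-ins defined in
    # the sensitivity sketch above.

    MAX_INPUT_DBM = -25.0  # applies to all bands and channel bandwidths

    def check_max_input_level():
        set_downlink_power(MAX_INPUT_DBM)
        return measure_throughput() >= 0.95  # at least 95 percent of maximum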
Adjacent Channel Selectivity
Adjacent channel selectivity (ACS) is a third metric for LTE receiver performance. ACS measures a receiver’s ability to achieve minimum throughput requirements in the presence of an adjacent channel signal, i.e., a signal at a specific frequency offset from the assigned channel. This measurement is particularly useful in determining the receiver’s performance at the band edge, where higher power out-of-band signals from other base stations are present. Formally, ACS is defined as the ratio (in dB) of the receive filter’s attenuation at the assigned channel frequency to its attenuation at the adjacent channel frequency.
The test configuration for ACS requires two signal generators connected to a power combiner. One VSG produces the reference LTE signal, which is demodulated by the receiver; the other produces the interfering LTE signal at an offset frequency, as illustrated in Figure 2.
The outputs of both the primary downlink and interfering signal generators are combined to form a composite input to the UE. The specific requirements for LTE receiver ACS depend on the configured channel bandwidth and range from 33 dB in a 1.4 MHz channel to 27 dB in a 20 MHz channel.
Testing ACS generally involves two test configurations, one close to the sensitivity limit and one at the maximum input power of the receiver. When testing ACS at the lower end of the input power range, the primary RF signal generator produces a reference channel 14 dB above the receiver’s sensitivity limit. The interfering RF signal generator produces an LTE signal at a higher output power, where the specific power level depends on the bandwidth of the transmission.
When testing ACS at the higher end of the receiver’s input power range, the interfering RF signal generator produces an interferer at the maximum input level of -25 dBm. Then, the primary RF signal generator is configured to produce a reference channel that is substantially lower than the interfering signal. In this test, the absolute power level of the reference channel depends on the channel bandwidth.
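The Python sketch below computes the two ACS stimulus pairs described above. Both the sensitivity value and the interferer-to-reference spacing (DELTA_DB) are placeholders for illustration; the exact figures are tabulated per band and channel bandwidth in 3GPP TS 36.521.

    # Sketch of the two ACS test points. SENSITIVITY_DBM and DELTA_DB are
    # placeholders; the real values depend on the band and channel
    # bandwidth and are tabulated in 3GPP TS 36.521.

    SENSITIVITY_DBM = -100.0  # example reference sensitivity level
    DELTA_DB = 31.5           # placeholder interferer-to-reference spacing

    # Case 1: near the sensitivity limit.
    ref_low = SENSITIVITY_DBM + 14.0  # reference channel, 14 dB above sensitivity
    intf_low = ref_low + DELTA_DB     # higher-power adjacent-channel interferer

    # Case 2: near the maximum input level.
    intf_high = -25.0                 # interferer at the maximum input level
    ref_high = intf_high - DELTA_DB   # reference channel well below the interferer

    print(f"low-power test:  reference {ref_low} dBm, interferer {intf_low} dBm")
    print(f"high-power test: reference {ref_high} dBm, interferer {intf_high} dBm")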
Blocking Characteristics
Blocking characteristics are a measure of the receiver’s ability to accurately demodulate LTE signals in the presence of a wide range of interference. The specifications for LTE provide a more comprehensive range of interferers than the ACS measurement, including both continuous wave and modulated signals.
Figure 3 illustrates the range of blocking signals: continuous wave (CW) signals close to the band of interest (narrowband blocking), CW signals farther from the band of interest (out-of-band blocking) and modulated signals relatively close to the band of interest (in-band blocking).
A test configuration similar to that used for ACS, combining the outputs of two RF signal generators, is used to measure blocking characteristics. If the generator producing the interfering signal cannot produce both CW and modulated signals, a separate CW signal generator is required.
Similar to sensitivity and ACS, the blocking characteristics measurements require the receiver to achieve at least 95 percent of its maximum throughput under each of the in-band, out-of-band and narrowband blocking conditions.
In-band blocking is a measure of the receiver’s performance in the presence of an unwanted interfering signal in the UE receive band or in the first 15 MHz below or above the receive band. When performing this measurement, the primary VSG is configured to produce an LTE signal 6 to 9 dB above the reference sensitivity limit. The in-band interferer is a modulated LTE signal at -56, -44 or -30 dBm, depending on the frequency offset.
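Selecting the in-band blocker level can be sketched as a simple lookup. The -56, -44 and -30 dBm levels are quoted above, but the offset breakpoints in this sketch are illustrative placeholders; the actual offset ranges are tabulated per channel bandwidth in 3GPP TS 36.521.

    # In-band blocker level versus frequency offset. The three power
    # levels come from the text above; the 7.5 MHz and 12.5 MHz
    # breakpoints are placeholders, not values from the specification.

    def inband_blocker_level_dbm(offset_mhz):
        if offset_mhz < 7.5:     # placeholder breakpoint
            return -56.0
        elif offset_mhz < 12.5:  # placeholder breakpoint
            return -44.0
        else:
            return -30.0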
By comparison, out-of-band blocking characteristics evaluate the receiver’s performance in the presence of higher power out-of-band signals. Unlike in-band blocking, which uses a modulated interferer, the out-of-band interfering signal is a CW tone at +6 dBm, so the interference signal generator must be configured to produce a CW signal.
When testing out-of-band blocking, the reference LTE signal is generated at a power level that is 6 to 9 dB above the reference sensitivity level of the receiver, with the precise power level dependent on the bandwidth configuration. Figure 4 shows that in the 5 MHz bandwidth configuration in E-UTRA Band 1, where the reference sensitivity requirement is -100 dBm, the test signal for out-of-band blocking is -94 dBm, which is 6 dB higher in power.
The final blocking measurement, narrowband blocking, is a measure of the LTE receiver’s ability to achieve minimum throughput in the presence of an unwanted narrowband interferer at a frequency offset less than the channel spacing. Similar to the out-of-band blocking measurement, narrowband blocking requires a test configuration that uses both LTE and CW signals.
Because the interferer for narrowband blocking is close in frequency to the band of interest, its power level is much closer to that of the reference signal. Here, the reference LTE signal ranges from 16 to 22 dB above the sensitivity level of the receiver, while the interferer is generated at -55 dBm for all bandwidth configurations and is placed at a frequency offset just over 200 kHz beyond the band edge of the signal of interest.
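A minimal sketch of the narrowband blocking stimulus, using the values quoted above: the 16 dB margin is the low end of the 16 to 22 dB range, and the 207.5 kHz offset is an illustrative stand-in for “just over 200 kHz” beyond the band edge.

    def narrowband_blocking_setup(sensitivity_dbm):
        """Return generator settings for the narrowband blocking test.
        The 16 dB margin and the 207.5 kHz offset are illustrative."""
        return {
            "reference_dbm": sensitivity_dbm + 16.0,  # reference LTE downlink
            "interferer_dbm": -55.0,                  # fixed for all bandwidths
            "interferer_offset_khz": 207.5,           # beyond the band edge
        }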
Intermodulation Characteristics
The intermodulation characteristics measurement mimics the effect of the intermodulation products that arise when a receiver experiences multiple interferers simultaneously. To perform this measurement, two interfering signals are injected at the same time, producing third-order distortion products in the receiver’s front end.
As shown in Figure 5, the frequency offset between the two interfering signals, a CW interferer and a modulated interferer, is equal to the frequency spacing between the CW interferer and the reference signal. As a result, the third-order distortion product falls directly on the reference signal.
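The arithmetic is easy to verify. In the sketch below, the center frequency and the interferer spacing are illustrative values, but the equal-spacing relationship guarantees that the third-order product lands exactly on the reference channel.

    # Third-order product arithmetic for the intermodulation test. With
    # the CW interferer offset by delta from the reference channel and the
    # modulated interferer offset by 2 * delta (equal spacing, as in
    # Figure 5), the 2*f_cw - f_mod product lands on the reference.

    f_ref = 2140.0  # MHz, example downlink center frequency (illustrative)
    delta = 7.5     # MHz, example interferer spacing (illustrative)

    f_cw = f_ref + delta       # CW interferer
    f_mod = f_ref + 2 * delta  # modulated interferer, spaced delta beyond the CW

    imd3 = 2 * f_cw - f_mod    # third-order intermodulation product
    assert imd3 == f_ref       # the distortion falls directly on the reference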
The test setup for intermodulation characteristics requires three RF signal generators (two VSGs and one CW signal generator) and a three-way RF power combiner.
To pass the intermodulation characteristics measurement, a receiver must achieve 95 percent of its maximum throughput with the reference signal at power levels ranging from 6 to 12 dB above the sensitivity limit, depending on the channel bandwidth.
Spurious Emissions
The final critical LTE receiver measurement is spurious emissions. Unlike the other measurements, which define a receiver’s ability to achieve a specified throughput under a range of signal conditions, the spurious emissions measurement characterizes the unwanted emissions the receiver itself produces at its antenna port. It is the only receiver measurement that does not reference throughput.
The hardware requirement for spurious emissions is straightforward: a spectrum analyzer connected to the receiver’s antenna port. The analyzer measures emissions from 30 MHz to 12.75 GHz using a measurement bandwidth of either 100 kHz or 1 MHz, with the bandwidth dependent on frequency (some exceptions apply to Bands 22, 42 and 43). The emissions limit is -57 dBm in a 100 kHz measurement bandwidth and -47 dBm in a 1 MHz measurement bandwidth.
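The limit selection can be expressed as a simple lookup, sketched below. The two limits and measurement bandwidths are quoted above, but the 1 GHz crossover frequency is an assumption, since the requirement states only that the measurement bandwidth depends on frequency; the Band 22, 42 and 43 exceptions are ignored here.

    # Spurious emissions limit lookup. The 1 GHz crossover is an assumed
    # breakpoint for illustration; exceptions for Bands 22, 42 and 43 are
    # not modeled.

    def spurious_limit(freq_hz):
        """Return (limit_dbm, measurement_bandwidth_hz) for a frequency
        in the 30 MHz to 12.75 GHz measurement range."""
        if not 30e6 <= freq_hz <= 12.75e9:
            raise ValueError("outside the 30 MHz to 12.75 GHz range")
        if freq_hz < 1e9:        # assumed crossover point
            return -57.0, 100e3  # -57 dBm in 100 kHz
        return -47.0, 1e6        # -47 dBm in 1 MHz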
Conclusion
Although many engineers are most familiar with sensitivity as a metric for receiver performance, real-world environments demand additional measurements. As the LTE receiver performance metrics indicate, receivers must operate in the presence of a wide range of interfering signals, which places requirements on both noise floor and linearity. When testing an LTE radio for conformance with the 3GPP requirements, radio designers must consider the full range of receiver performance metrics, from reference sensitivity and blocking to intermodulation and spurious emissions.
Note: This article is an abridged section from an application note entitled “Introduction to LTE Device Testing: From Theory to Transmitter and Receiver Measurements.” For the full application note, visit: http://download.ni.com/evaluation/rf/Introduction_to_LTE_Device_Testing.pdf.