With mobile operators fast-tracking their 5G deployment plans, the realization of 5G is rapidly approaching. To keep pace, chipset and device manufacturers will need to accelerate their own development activities. If these efforts pay off, 5G will deliver new and powerful capabilities supporting use cases that require much faster data rates, ultra-reliable low latency communications (uRLLC) and massive machine-type communications (mMTC). That is easier said than done, however, given the challenges of testing 5G’s high data rates.
5G USE CASES
There are three main use cases for 5G: enhanced mobile broadband (eMBB), uRLLC and mMTC. The eMBB use case is targeted in the Verizon 5G Technical Forum (5GTF) specification, as well as phase 1 of the 3GPP new radio (NR) specification. Due to strong industry demand, this use case and its definition have been accelerated: 3GPP has agreed to the early completion of the non-standalone (NSA) 5G NR mode for eMBB. In NSA mode, the connection is anchored in LTE, with 5G NR carriers used to boost data rates and reduce latency. Data rates up to 20 Gbps in the downlink and 10 Gbps in the uplink are on the horizon for network rollouts in the next few years.
The 5G eMBB use case provides the functionality to support high data rates, improved connectivity and greater system capacity. That is critical, since consumers want to connect to the network wherever they are, whether attending a sports event, traveling in a car or riding a high speed train or other public transport. High data rates and greater capacity are essential for virtual reality (VR) and augmented reality (AR) applications, which include new video formats with increased resolutions (8K+) and higher frame rates (HFR). For interactive AR and VR applications, low latency is a key requirement. As growing numbers of users simultaneously consume or share premium content, 4G networks will struggle to provide the capacity, underscoring the need for the improved capacity of a 5G network.
To achieve higher data rates, improved connectivity and greater system capacity for eMBB than sub-6 GHz frequencies can provide, 5G is also being deployed in the higher frequency mmWave spectrum, which offers significantly greater bandwidth. While LTE operates at frequencies up to 6 GHz, mmWave frequencies up to 100 GHz are under consideration for 5G. The 28 and 39 GHz bands covered by the 5GTF specifications are also being considered by other operators. At higher frequencies, propagation and penetration losses increase. To overcome the high path loss and improve connectivity for users at the cell edge, beamforming techniques will be employed. By providing high gain in specific spatial directions, beamforming increases the signal level received by a device, resulting in a higher signal-to-noise ratio.
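The scale of the mmWave loss problem can be illustrated with the standard free-space path loss formula; the short Python sketch below (frequencies and the 100 m distance are chosen only for illustration) shows that 28 GHz suffers roughly 23 dB more free-space loss than 2 GHz, a gap that beamforming gain must help close in the link budget.

```python
import math

def fspl_db(freq_hz, dist_m):
    """Free-space path loss in dB: 20*log10(4*pi*d*f/c)."""
    c = 3e8  # speed of light, m/s
    return 20 * math.log10(4 * math.pi * dist_m * freq_hz / c)

for f_ghz in (2.0, 28.0, 39.0):
    print(f"{f_ghz:5.1f} GHz @ 100 m: {fspl_db(f_ghz * 1e9, 100):.1f} dB")
# 28 GHz incurs about 20*log10(28/2) = 22.9 dB more free-space loss
# than 2 GHz before any penetration or blockage effects are counted.
```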
eMBB TEST CHALLENGES
The introduction of beamforming in the 5GTF and 3GPP NR specifications creates several new test challenges. Adding to these are the changes in the physical layer (PHY) made to support the eMBB use case: the frame structure, new reference signals and new scheduling and transmission modes. Understanding the new frame structure and beamforming concept is critical. To aid in this discussion, consider Table 1, which compares the PHY characteristics of the LTE, 5GTF and 3GPP 5G NR specifications. Note that all changes from the LTE standard are denoted in red. As shown in the table, the 5GTF frame structure parameters (e.g., subcarrier spacing and carrier bandwidth) are fixed, while the 3GPP NR values are scalable to accommodate a wider range of use cases. As previously mentioned, 5GTF targets the eMBB use case. In this specification, wider subcarrier spacing, greater carrier bandwidth and use of higher frequencies all contribute to a higher data rate and improved connectivity, compared to LTE.
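Where 5GTF fixes its numerology, 3GPP NR scales it from the 15 kHz LTE baseline: subcarrier spacing is 15 kHz × 2^μ, and the 14-symbol slot duration shrinks by the same factor. A minimal sketch of that scaling rule:

```python
# 3GPP NR numerology: subcarrier spacing scales as 15 kHz * 2^mu,
# and the 14-symbol slot duration shrinks by the same factor.
for mu in range(5):
    scs_khz = 15 * 2**mu   # 15, 30, 60, 120, 240 kHz
    slot_ms = 1.0 / 2**mu  # 1, 0.5, 0.25, 0.125, 0.0625 ms
    print(f"mu={mu}: {scs_khz:3d} kHz SCS, {slot_ms:.4f} ms slot")
```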
The radio frame size in LTE and 5GTF is the same: 10 ms (see Figure 1). In LTE, each frame contains 10 subframes and 20 slots, compared with 50 subframes and 100 slots in 5GTF, meaning that 5GTF slots (0.1 ms) are shorter than LTE slots. A resource block is the smallest entity that can be assigned to a device. Both LTE and 5G resource blocks consist of one slot in the time domain and 12 subcarriers in the frequency domain. LTE has a subcarrier spacing of 15 kHz, compared to 75 kHz in 5GTF. The maximum carrier bandwidth in LTE is 20 MHz, compared to 100 MHz in 5GTF when 100 resource blocks are used. The wider 5GTF and 5G NR bandwidths result in higher data rates and improved network capacity.
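These figures can be cross-checked with a few lines of arithmetic. The sketch below reproduces the 0.1 ms slot and shows that 100 resource blocks at 75 kHz spacing occupy 90 MHz of the nominal 100 MHz channel:

```python
# Cross-checking the 5GTF frame arithmetic quoted above.
frame_ms, slots_per_frame = 10.0, 100
slot_ms = frame_ms / slots_per_frame             # 0.1 ms per slot
scs_khz, sc_per_rb, n_rb = 75, 12, 100
occupied_mhz = scs_khz * sc_per_rb * n_rb / 1e3  # 90 MHz occupied
print(f"{slot_ms} ms slot; {occupied_mhz} MHz occupied of the 100 MHz channel")
```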
The 5GTF specification supports the use of carrier aggregation (CA) in the downlink and uplink with a maximum of eight component carriers (CC). With full CA, the aggregate bandwidth is 800 MHz (8 × 100 MHz). The throughput rate is calculated using the transport block size (TBS), the number of bits transmitted in each transmit time interval (TTI). TBS depends on the number of resource blocks allocated to the user equipment (UE), as well as the modulation and coding scheme (MCS) used. In the 5GTF specification, the highest MCS is 64-QAM, and the highest TBS is 66,392 bits. This results in a throughput rate of 663.92 Mbps per CC. If eight CCs are used, the throughput becomes 5.3 Gbps per UE (663.92 Mbps × 8).
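This peak-rate arithmetic is easy to reproduce. The sketch below assumes one transport block per 0.1 ms slot, the TTI implied by the figures above:

```python
# Peak 5GTF throughput from the numbers above, assuming one
# transport block of 66,392 bits per 0.1 ms TTI.
tbs_bits, tti_s, n_cc = 66_392, 100e-6, 8
per_cc_mbps = tbs_bits / tti_s / 1e6
print(f"{per_cc_mbps:.2f} Mbps per CC")              # 663.92 Mbps
print(f"{per_cc_mbps * n_cc / 1e3:.2f} Gbps per UE") # ~5.31 Gbps with 8 CCs
```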
TESTING eMBB
While beamforming and mmWave are being utilized to maximize the capabilities of the available spectrum, these technologies make 5G implementation and testing all the more challenging. When testing 5G data throughput at higher frequencies, for example, a test setup equipped with additional hardware is required. An example of the type of 5G test system needed for eMBB data throughput testing is shown in Figure 2. The system consists of a 5G network emulator connected to and controlled by a PC running software for prototyping advanced 5G protocol features, like beamforming at mmWave frequencies. In this setup, the UE is connected to the test system using a mmWave connection, to support the high frequency link required to achieve the high data throughput of eMBB. Because incorporating antenna connectors in chipsets and handsets is impractical at these high frequencies, any data throughput testing must be done over-the-air (OTA).
To ensure ease of use, the optimal 5G test system should allow designers to create, edit, configure and run tests (scripts) directly from a graphical user interface (GUI). With this capability, the script elements for activating, deactivating and reconfiguring 5G cells, inserting radio resource control (RRC) and non-access stratum (NAS) messages and inserting user prompts and verdicts can be easily dragged and dropped into an editor and then configured. If sample scripts are available within the tool, these can be loaded into the editor and modified as needed. Data throughput test cases are created by loading scripts and configuring script elements. Examples of parameters that can be configured are power levels for synchronization and reference signals, beamforming parameters and resource blocks for transmitting and receiving control information and data.
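As a purely hypothetical illustration of what such a GUI-built script might serialize to, the sketch below captures the parameter categories named above; none of these field names come from an actual tool’s API.

```python
# Hypothetical test-script configuration; all field names are
# illustrative, not taken from any real test tool.
throughput_test = {
    "cell": {
        "frequency_ghz": 28.0,
        "sync_signal_power_dbm": -85.0,  # synchronization signal level
        "ref_signal_power_dbm": -85.0,   # reference signal level
        "beam_index": 3,                 # beamforming parameter
    },
    "scheduling": {
        "dl_data_rbs": 100,              # resource blocks for DL data
        "ul_control_rbs": 4,             # resource blocks for UL control
        "mcs_index": 28,                 # 64-QAM region
    },
    "steps": [
        "activate_cell", "wait_attach_complete",
        "start_dl_data", "log_kpis", "deactivate_cell",
    ],
}
```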
During testing, dynamic control points (DCP) are used to allow the 5G network emulator state machine to behave like a live network until a certain exit condition is met. The exit condition might be the device sending a particular message (e.g., attach complete), the user performing some action (e.g., sending data for data throughput testing) or the expiration of a configured guard timer. During a script run, at a DCP, the user should be able to modify parameters using an L1/L2 configuration application (see Figure 3); a sketch of this DCP control flow follows the list below. Some of the parameters that can be configured include:
- Scheduling subframes for the connected cell, including the subframes to be used for uplink (UL) and downlink (DL) control information and those to be used for UL and DL data.
- Layer 2 parameters (e.g., frequency, beam reference signal (BRS) transmission period, BRS transmit power, system information block (xSIB) default configuration and physical broadcast channel (xPBCH) transmit periodicity).
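A minimal sketch of the DCP wait loop described above, using an invented emulator interface (poll_uplink_message, user_action_pending and the rest are illustrative, not a real emulator API):

```python
import time

def run_dcp(emulator, exit_message="ATTACH_COMPLETE", guard_timer_s=30.0):
    """Behave like a live network until one of the exit conditions fires."""
    start = time.monotonic()
    while True:
        # Exit condition 1: the device sent the trigger message.
        if emulator.poll_uplink_message() == exit_message:
            return "message"
        # Exit condition 2: the user performed an action (e.g., started data).
        if emulator.user_action_pending():
            return "user_action"
        # Exit condition 3: the configured guard timer expired.
        if time.monotonic() - start > guard_timer_s:
            return "guard_timer"
```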
To ensure rapid device development, the optimal 5G test system should provide access to detailed logs and log analysis tools that help diagnose issues quickly, reliably and efficiently. A good log application should provide message decoding, enhanced search facilities and rapid navigation tools for easily finding records of interest. A tool that supports bookmarks to facilitate troubleshooting, along with log export features, is highly recommended.
A graphical key performance indicator (KPI) view may be helpful, enabling designers to make informed decisions more quickly (see Figure 4). For data throughput, typical KPIs include graphs of data rates at different layers (PHY, MAC, RLC, PDCP and application), channel quality information (CQI), MCS, block error rate (BLER) and acknowledgment/negative-acknowledgment (ACK/NACK) versus time. It is also important to measure the quality of the signal since that affects data throughput. KPIs like beam state information (BSI) and beam refinement information (BRI) are used to check that the UE has selected the strongest beam, as reported by the network.
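Of these KPIs, BLER is derived directly from the HARQ feedback stream: NACKs as a fraction of all feedback received. A minimal sketch, with counts made up for illustration:

```python
# BLER derived from HARQ feedback: NACKs over total ACK/NACK reports.
def bler(ack_count, nack_count):
    total = ack_count + nack_count
    return nack_count / total if total else 0.0

print(f"{bler(ack_count=9_650, nack_count=350):.1%}")  # 3.5%
```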
SUMMARY
Achieving 5 Gbps and higher data rates is an exciting prospect, although it presents unique implementation and test challenges. Addressing these demands requires test methods and platforms that can handle very high data rates without time-consuming, costly and complex test programming. Performing 5G testing efficiently and accurately is important; just as important is easy access to 5G network parameters and test results. With this information, design changes can be made quickly and efficiently, helping ensure a smooth transition from prototype to product. With the right test methods and platforms, engineers can better ride the wave to successful 5G development.