November 2008
Moray Rumney obtained his BSc in Electronics from Heriot-Watt University in Edinburgh in 1984. He then joined Hewlett-Packard / Agilent Technologies, where he has worked for 24 years. During this time Moray has been involved in the architecture and design of the RF signal generators, signal analyzers and system simulators used in the cellular communications R&D and manufacturing test industries. Moray started his wireless standardisation activities for HP in 1991 when he joined ETSI and contributed to the standardisation of the GSM air interface and type approval tests. In 1999 he joined 3GPP and has been a significant contributor to the development of the W-CDMA radio specifications and the corresponding conformance tests. This standardisation work has evolved to incorporate HSDPA, HSUPA and now LTE. In addition to standards work, Moray is a technology advisor within Agilent, responsible for its next-generation RF and system simulator products.
When I was growing up my favourite book was undoubtedly the “Guinness Book of World Records.” I used to dip in and out of it searching for the craziest extremes of human experience, whether the result of human achievement or simply of extremes of natural human variation. It was not until many years later that I discovered a much less well-known but arguably much more important book called the “Mackeson Book of Averages.” Mackeson is also a brewer of stout, and hence a rival to the much larger Guinness company, so they thought they would poke fun at Guinness’ second most famous product, the Book of Records. It has to be said that averages are far less exciting than records, but there is also a human fascination in comparing ourselves to the norm of society, be it salary, height, or the size of some other part of our anatomy. The other thing in favour of a book of averages is that it does not need to be reprinted nearly so often, since the averages change much more slowly than the records.
And so it is with wireless. An examination of the growth in peak data rates from the introduction of GSM in 1992 up until IMT-Advanced arrives around 2015 shows a staggering 100,000x growth, from 9.6 kbps to around 1 Gbps. This growth is plotted in the top black line of Figure 1. To the casual observer this is proof that Moore’s law must be driving wireless performance, but with the doubling occurring every 16 months rather than every two years. If we look further, this growth in peak data rates is driven by two factors: an approximate 100x increase in peak spectral efficiency and a 1,000x increase in channel bandwidths. The peak efficiency gains are primarily due to higher-order modulation, more efficient channel coding and techniques such as adaptive modulation and coding (AMC) and hybrid automatic repeat request (HARQ).
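As a quick sanity check on those numbers, the doubling period follows directly from the two endpoints quoted above. A minimal sketch in Python (the rates and dates are the illustrative figures from this paragraph, not exact standards values):

```python
import math

gsm_peak_bps = 9.6e3      # GSM circuit-switched data rate, 1992
imt_adv_peak_bps = 1e9    # IMT-Advanced target, around 2015
years = 2015 - 1992

growth = imt_adv_peak_bps / gsm_peak_bps        # ~100,000x
doublings = math.log2(growth)                   # ~16.7 doublings
months_per_doubling = years * 12 / doublings    # ~16.6 months

print(f"{growth:,.0f}x growth, doubling roughly every {months_per_doubling:.1f} months")
```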
Figure 1 Growth of peak data rate and cell capacity.
This all sounds very impressive, but there is a snag in turning these peak figures into typical or average performance. Once an electromagnetic signal is launched into the ether it behaves according to the theory of electromagnetic propagation developed in the 1860s by James Clerk Maxwell, a son of my home town of Edinburgh, Scotland. In these days of ever-decreasing cell sizes, his infamous equations - which tormented us as students - predict that signals will often travel further than we might want, and into the next cell. At this point Claude Shannon enters the picture. In the 1940s, building on earlier work by Hartley, he developed the Shannon-Hartley capacity theorem, which predicts the error-free capacity of a communications channel as a logarithmic function of the signal-to-noise ratio (SNR). In cellular systems the dominant source of noise is unwanted signals from other users, primarily from adjacent cells sharing the same frequency and time. The difficulty this creates is that most of the techniques used to drive up peak spectral efficiency also rely on ever-higher SNR, which means the peak spectral efficiency figures can only be realised in an ever-shrinking area of low interference near the cell centre.
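For reference, the Shannon-Hartley theorem referred to here can be written as

$$ C = B \log_2\left(1 + \frac{S}{N}\right) $$

where C is the error-free channel capacity in bits per second, B is the channel bandwidth in hertz and S/N is the ratio of signal power to noise (or, in a cellular system, interference-plus-noise) power.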
A CDF showing the typical distribution of interference in an urban macrocell is given in Figure 2. The median geometry factor - the ratio of the wanted signal to unwanted signals plus noise - is 5 dB, yet many high-efficiency scenarios require geometry factors of 10 dB, 15 dB or even higher. The distribution shows that these levels are experienced by fewer than 10% of users. Conversely, users further out, and particularly those unfortunate enough to be on a cell boundary, experience much worse performance than is possible at the cell centre. Like it or not, this variation in performance is a law of physics which will not change until someone invents the spatially aware electromagnetic wave. This would conveniently stop in its tracks at the nominal cell boundary or, better still, bounce back in again to create the rich multipath needed for MIMO. A Faraday cage around the cell would achieve this, but there are implications for inter-cell mobility.
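To put some rough numbers on this, the sketch below evaluates the Shannon bound at a few geometry factors, treating the geometry factor as the effective SNR and ignoring all implementation losses; the results are upper bounds rather than predictions for any particular receiver.

```python
import math

def shannon_bound(snr_db: float) -> float:
    """Ideal spectral-efficiency limit (bps/Hz) at a given SNR in dB."""
    return math.log2(1 + 10 ** (snr_db / 10))

for g_db in (0, 5, 10, 15, 20):
    print(f"G = {g_db:>2} dB -> at most {shannon_bound(g_db):.1f} bps/Hz")

# G =  0 dB -> at most 1.0 bps/Hz
# G =  5 dB -> at most 2.1 bps/Hz   (the median user)
# G = 10 dB -> at most 3.5 bps/Hz
# G = 15 dB -> at most 5.0 bps/Hz   (seen by fewer than 10% of users)
# G = 20 dB -> at most 6.7 bps/Hz
```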
Figure 2 Distribution of geometry factor in a typical urban cell.
Back in the real world, the fact is that the average efficiency of the cell remains much lower than the peaks. The average efficiency is a highly significant quantity since, when multiplied by the available spectrum, it predicts the capacity of the cell. The red trace in Figure 1 plots the growth in average efficiency alongside the blue trace, which shows the growth in available spectrum [1]. Both traces are normalized to single-band GSM in 1992. The product of the two represents the growth in cell capacity and is shown in the yellow trace. For the period from 1992 to around 2002, when EDGE was established, the growth in system capacity matched the growth in peak data rates, which over that period was a massive 50x, comprising 7x gains in both efficiency and spectrum. The actual number of users and their data rates also depend on the bandwidth allocated per user, but if we hold this constant for illustrative purposes, the system would support seven times as many users, each at a rate of around 70 kbps – quite an achievement.
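Holding the per-user bandwidth allocation constant as described, the arithmetic behind that 1992-to-2002 comparison is simply the following (the 7x figures are the illustrative gains quoted above):

```python
efficiency_gain = 7          # growth in average spectral efficiency, GSM to the EDGE era
spectrum_gain = 7            # growth in available spectrum over the same period

capacity_gain = efficiency_gain * spectrum_gain     # ~50x growth in cell capacity

# With the bandwidth per user held constant, the extra spectrum carries 7x more
# users and the extra efficiency lifts each of them from 9.6 kbps to ~70 kbps.
users_gain = spectrum_gain
per_user_rate_kbps = 9.6 * efficiency_gain          # ~67 kbps

print(capacity_gain, users_gain, round(per_user_rate_kbps, 1))
```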
From 2002 to the present day there have been further rises in average efficiency and spectrum, but a 10x gap has opened up between the growth in cell capacity and the growth in peak rates. This means that in a loaded cell the typical user will on average experience only 10% of the new, higher peak data rate. From the present day through 2015 this gap grows to around 90x, meaning the typical user will experience just over 1% of the peak data rates for which the system was designed. In motoring terms, that’s like being sold a supercar that can travel at 180 mph, only to find that when you take it out on the road the traffic restricts you to 2 mph. The exception would be if you were the only car on the road (the only user in the cell) and came across a nice straight stretch (excellent radio conditions) - then you could put your foot down and get what you paid for! For the rest of the time, however, you would have to sit in traffic moving no faster than cheaper cars designed with a top speed of perhaps 10 mph. Offering users the possibility of high performance, charging them for the infrastructure to support it, and then not being able to consistently deliver the rates or coverage does not make for a sustainable business model for mass adoption. In lightly loaded systems much higher peaks will be seen by the lucky few, but this is not commercially sustainable.
So what is to be done? Is mobile broadband really out of reach, or are there alternatives? To investigate this, let’s consider the following HSDPA simulation [2]. We start with a cell supporting the maximum 15 codes using 64QAM and a mobile with a single receiver and an equalizer. This combination has a peak capacity of around 20 Mbps for the lucky single user in the right radio conditions. Next we load the cell with 34 randomly distributed users. What now is the aggregate capacity of the cell and the median throughput per user? Having bought his 20 Mbps-capable phone, Joe Public probably expects such performance irrespective of other users in the cell. His view might suggest an aggregate capacity of 34 x 20 Mbps = 680 Mbps. If only! Using our understanding of the air interface we know that the peak cell capacity is shared amongst the users by a scheduling algorithm. Without further analysis we might then reasonably conclude that the aggregate capacity remains at 20 Mbps, providing a median throughput per user of around 588 kbps – quite attractive.
Unfortunately it’s not that simple. The random distribution of users means that very few are in ideal radio conditions and many are in quite poor conditions near the cell edge. Taking the typical interference distribution into account, this particular simulation concluded that the aggregate cell capacity was only 1.3 Mbps (0.26 bps/Hz), giving a median data rate of only 40 kbps per Joe. That’s less than dial-up!
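The gap between expectation and outcome is easier to see with the numbers side by side. A minimal sketch using the figures quoted above, and assuming the standard 5 MHz HSDPA carrier (consistent with the quoted 0.26 bps/Hz):

```python
peak_rate_bps = 20e6      # single-user HSDPA peak: 15 codes, 64QAM, equalizer
num_users = 34
carrier_bw_hz = 5e6       # assumed HSDPA carrier bandwidth

joe_public_view = num_users * peak_rate_bps       # 680 Mbps: everyone at the peak
equal_share = peak_rate_bps / num_users           # ~588 kbps: the peak shared out by the scheduler

# What the simulation actually found once the interference distribution is included:
simulated_aggregate_bps = 1.3e6
avg_efficiency = simulated_aggregate_bps / carrier_bw_hz   # 0.26 bps/Hz
mean_per_user = simulated_aggregate_bps / num_users        # ~38 kbps, in line with the ~40 kbps median

print(f"{equal_share / 1e3:.0f} kbps hoped for vs ~{mean_per_user / 1e3:.0f} kbps delivered")
```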
Let’s now consider adding a femtocell layer to the macrocell. In the simulation 96 femtocells, each with the same 20 Mbps peak rate as the macrocell, were randomly distributed. This resulted in 24 of the mobiles switching from the macrocell to a femtocell, leaving the other 72 femtocells unconnected. The effect on performance is quite staggering. Twenty-four mobiles now have their own private base station in close proximity, enabling their median data rate to rise 200x to 8 Mbps. The aggregate throughput of the macrocell and femtocells is now 270 Mbps. The 10 mobiles remaining on the macrocell see their median throughput rise from 50 kbps to 170 kbps – so everyone is a winner. A CDF showing the distribution of throughput with and without femtocells is shown in Figure 3, with expanded detail in Figure 4.
Figure 3 Distribution of throughput for cell with and without femtocell layer.
Figure 4 Expanded distribution of throughput for cell with and without femtocell layer.
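Again, the before-and-after accounting is straightforward; a quick summary in the same style, using only the figures quoted above:

```python
users_total = 34
users_on_femto = 24                               # mobiles that switch to a nearby femtocell
users_on_macro = users_total - users_on_femto     # the 10 that remain on the macrocell

median_macro_only_bps = 40e3    # median per user with no femtocell layer
median_on_femto_bps = 8e6       # median for the offloaded users

print(median_on_femto_bps / median_macro_only_bps)    # 200x improvement in median rate
print(f"{users_on_femto} of 96 femtocells in use")    # most of the femtocells sit idle
```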
This simulation clearly demonstrates how femtocells can dramatically boost data rates and overall system capacity for nomadic users. Clearly there remains room for improvement in the macrocell with enhancements such as interference cancellation, receive diversity, spatial multiplexing (MIMO) and beamforming, as well as gains from further spectrum growth. Even so, the upside from all these techniques, many of which are costly to implement, complex to operate and power-hungry, might be of the order of 6x, and cannot begin to compete with the potential released by the divide-and-conquer approach of femtocells.
Conclusion
It is relatively easy to increase the peak performance of wireless, much like the top speed of cars on an open road. The challenge, however, is to improve the average. In motoring terms this would be like trying to double the average speed of traffic during rush hour – a task involving redesign of the entire transport ecosystem rather than just the design of the car itself. The days when improving macrocell average efficiency was relatively easy may be nearly over. However, adding user-financed femtocells to the network, which benefit from much better radio conditions, is a very attractive alternative for providing a true broadband experience for the nomadic user. The offloading of traffic from the macrocell also means that users who need wide-area coverage and mobility will see improved performance. Finally, although femtocells offer a new paradigm in wireless, they are not without their challenges. Interference mitigation, security, backhaul neutrality and the business model are some of the more prominent issues the industry needs to solve before femtocells become a preferred solution.
References
1. Spectrum and efficiency figures vary widely by geography; these figures are broadly indicative of the industry. Spectrum growth is based on a typical European model and includes the growth identified at WRC-07.
2. 3GPP RAN WG4 Tdoc R4-081344