Prioritising efficiencies: energy versus spectrum

Efforts to improve energy efficiency in cellular have focussed mostly on the electrical efficiency of modules such as the power amplifiers or the cooling. But the higher level network design has largely escaped attention until recently. Simon Tonks, expert in Communications and Electronic Systems at PA Consulting Group, investigates.

Mike Hibberd

April 19, 2011

There has been a focus on energy efficiency in sectors such as transport and domestic use for many years, driven mainly by energy cost and finite resources. More recently CO2 emissions have become a concern and the sectors under scrutiny have expanded to cover ICT systems including cellular networks.

Efforts to improve energy efficiency in cellular have focussed mostly on the electrical efficiency of modules such as the power amplifiers or the cooling. But the higher level network design has largely escaped attention until recently. Initiatives such as the GSMA’s Green Manifesto and Mobile VCE’s Green Radio are now starting to look at this area.

But PA’s analysis reveals that, far from contributing to energy efficiency, trends in RAN design towards greater spectrum efficiency have produced progressively less energy efficient networks.

Technical solutions to improve energy efficiency already exist. What is needed is the financial incentive to implement them. The recent Carbon Reduction Commitment and related carbon trading scheme are unlikely to achieve this. What is required instead is a rethink of spectrum licensing processes to prioritise energy efficiency over spectrum efficiency. Today the latter is far more important to operators than the former.

Operators continually have to look at how to increase their networks' capacity as more subscribers join and usage per subscriber grows, driven by trends such as the recent data boom. This applies both in the short term, when expanding an existing network, and in the long term, when considering the next generation of network to deploy. Broadly, there are three options:

  • Increase the number of base stations

  • Increase the number of carriers and hence the amount of spectrum used

  • Increase the spectrum efficiency

With macro base station costs of around $160,000 per site and 3G licences costing a total of £22 billion in the UK, the first two options are very costly. So there has understandably been a lot of focus on the third option: spectrum efficiency.
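To get a rough sense of the sums involved, the back-of-envelope calculation below compares the two routes. Only the per-site cost and the UK 3G licence total come from the figures above; the network size, degree of densification and licence split are assumptions made purely for illustration.

    # Back-of-envelope scale of the first two options. Only the per-site cost and
    # the UK 3G licence total are from the article; the rest are assumptions.
    site_capex_usd = 160_000             # macro site cost (article figure)
    uk_3g_total_gbp = 22e9               # total UK 3G licence fees (article figure)

    network_sites = 12_000               # assumed national macro network size
    densification = 0.20                 # assumed 20 per cent more sites for capacity
    licence_share_gbp = uk_3g_total_gbp / 5   # assumed even split across five licences

    extra_sites_usd = network_sites * densification * site_capex_usd
    print(f"Extra sites:   ${extra_sites_usd / 1e6:.0f}m")     # roughly $384m
    print(f"Licence share: £{licence_share_gbp / 1e9:.1f}bn")  # roughly £4.4bn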

Unfortunately there is a hidden penalty with the pursuit of this option. Higher spectrum efficiency tends to create lower energy efficiency. But the cost of energy, even with carbon taxes, is such that the network cost is still dominated by the spectrum licence, base station Capex and other non-energy Opex.

Pursuing spectrum efficiency at the expense of energy efficiency is a poor trade-off. Energy usage is becoming an increasingly pressing issue for several reasons: climate change from CO2 emissions; security of supply of oil and gas; and ever-increasing production costs as reserves of fossil fuels diminish.

Legislation and financial incentives are coming into this area too. The Carbon Reduction Commitment in the UK, for example, requires large users of electricity – cellular networks would certainly qualify – to buy credits for their energy usage according to how much CO2 is emitted.

By contrast, spectrum is like land – there is a finite supply but it can be re-used ad infinitum. In addition, the evolution of technology means that the amount of spectrum that can be used in an economically viable way keeps growing. Looking at the big picture, energy efficiency should be prioritised over spectrum efficiency. Spectrum can be re-used, energy cannot.

Spectrum efficiency is usually measured as the number of bits per second that are transmitted per Hz of spectrum. Primarily this is determined by the modulation scheme in use. Analogue FSK, as used by telemetry systems for example, will have a spectrum efficiency of around 0.2bps per Hz. LTE systems are looking to average 2bps per Hz. By using high-order modulation schemes, systems such as DSL and fixed microwave links reach 6bps per Hz or more. But this approach hurts energy efficiency in two ways.

Firstly, to be able to receive high-order modulation it is necessary to have a good signal-to-noise ratio, and that in turn means transmitting at a higher power than for signals with lower spectrum efficiency. Conversely, if more spectrum is used to send the same signal then a lower signal power can be used. This relationship is defined by the well-known Shannon's Law.
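The effect can be sketched numerically. The short calculation below simply evaluates the Shannon-Hartley bound, C = B log2(1 + SNR), at the spectrum efficiencies quoted above, assuming an ideal AWGN channel with no implementation margin.

    import math

    def min_snr(bps_per_hz):
        """Minimum linear SNR for a given spectrum efficiency on an ideal
        AWGN channel, from the Shannon-Hartley law C = B * log2(1 + SNR)."""
        return 2 ** bps_per_hz - 1

    def min_eb_n0(bps_per_hz):
        """Corresponding minimum energy per bit over noise density, Eb/N0."""
        return min_snr(bps_per_hz) / bps_per_hz

    # Spectrum efficiencies quoted above (bps per Hz)
    for eta in (0.2, 2.0, 6.0):
        snr_db = 10 * math.log10(min_snr(eta))
        ebn0_db = 10 * math.log10(min_eb_n0(eta))
        print(f"{eta:>4} bps/Hz -> SNR >= {snr_db:5.1f} dB, Eb/N0 >= {ebn0_db:5.1f} dB")

On these idealised figures, moving from 2bps per Hz to 6bps per Hz raises the minimum energy per bit by around 8dB, a factor of roughly seven.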

The second effect compounds this problem. To send a data rate (in bits per second) greater than the channel bandwidth (in Hz), a high-order modulation scheme is necessary, and that means a scheme with a high peak-to-average power ratio. The power required by Shannon's Law is the average power, but the base station power amplifier must be specified for the peak power to avoid distorting the signal. It is this peak power rating that effectively determines the amplifier's power consumption, and so the peak-to-average ratio means the power consumption rises even faster than the signal power. In a macrocell the power amplifier can account for the majority of the base station's power consumption.
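To illustrate the peak-to-average point, the sketch below generates a random OFDM-style waveform with 64-QAM subcarriers and measures its peak-to-average power ratio, which is the headroom the amplifier must be dimensioned for. The subcarrier count and modulation order are assumptions chosen for illustration rather than a description of any particular system.

    import numpy as np

    rng = np.random.default_rng(0)
    n_subcarriers = 600        # assumed, roughly a 10MHz LTE-like carrier
    n_ofdm_symbols = 1000

    # Random 64-QAM symbols on each subcarrier, normalised to unit average power
    levels = np.array([-7, -5, -3, -1, 1, 3, 5, 7]) / np.sqrt(42)
    qam = (rng.choice(levels, (n_ofdm_symbols, n_subcarriers))
           + 1j * rng.choice(levels, (n_ofdm_symbols, n_subcarriers)))

    # OFDM-style time-domain waveform: IFFT across the subcarriers
    tx = np.fft.ifft(qam, axis=1) * np.sqrt(n_subcarriers)

    power = np.abs(tx) ** 2
    papr_db = 10 * np.log10(power.max() / power.mean())
    print(f"Peak-to-average power ratio: {papr_db:.1f} dB")   # typically 10-12dB here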

Thus the increase in signal power needed to achieve high spectrum efficiency is liable to create a disproportionate increase in the power consumption of the base stations.

We have seen that a move towards wider bandwidths and lower modulation depths will deliver the same capacity with lower energy per bit. But as well as the air interface technology, there is the matter of the network design.

Providing wide-area coverage requires a number of base stations. With large cells, only a small number of base stations is needed, but each must transmit at high power to achieve the necessary signal level at the cell edge. Smaller cells require more base stations to cover the same area, but their reduced powers mean that the overall power consumption is lower than for large cells.

In addition, having more base stations means that the load on each cell decreases. This gives the additional benefit of reducing the bit rate needed in the same bandwidth, further reducing the power required. Of course, more cells means more backhaul links, which themselves consume power, but PA's analysis shows that there is still a net reduction in the energy per bit.
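The scaling behind this argument can be sketched with a simple path-loss model. Everything in the snippet below is an assumption chosen for illustration (path-loss exponent, reference power and coverage area), and per-site overheads and backhaul are deliberately left out; the point is only that the per-cell transmit power falls much faster than the cell count grows.

    # Sketch of total radiated power versus cell radius over a fixed coverage area,
    # assuming edge-limited cells and a simple path-loss exponent model. Exponent,
    # reference power and area are assumed; per-site overhead and backhaul are not
    # modelled here.
    PATHLOSS_EXP = 3.5        # assumed urban path-loss exponent
    AREA_KM2 = 100.0          # assumed coverage area
    P_REF_W = 20.0            # assumed Tx power to reach the edge of a 1km cell

    def radiated_power(radius_km):
        cells = AREA_KM2 / (3.14159 * radius_km ** 2)     # cells needed for coverage
        p_per_cell = P_REF_W * radius_km ** PATHLOSS_EXP  # edge-limited Tx power
        return cells, cells * p_per_cell

    for r in (1.0, 0.5, 0.25):
        cells, total = radiated_power(r)
        print(f"radius {r:4.2f} km: {cells:5.0f} cells, total radiated {total:6.1f} W")

With a path-loss exponent of n, the total radiated power over a fixed area scales roughly as R^(n-2), so it falls as the cell radius R shrinks whenever n exceeds 2, even before the load-sharing benefit is counted.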

So if the energy efficiency and associated costs would benefit from a dense network of small cells operating at a modest bps per Hz level, there must be a reason why it isn’t done currently. After all, there is no new technology needed to achieve this. To proceed with the energy efficiency improvements discussed requires not just the technical solution but the business case.

A look at the cost involved in deploying a large number of cell sites shows that it dominates over any cost saving from reduced energy consumption. The extra cost of the sites means that the energy savings would have to accumulate for more than 20 years to repay the capital alone. To that can be added financing cost and increases in other Opex related to the number of sites. In an industry where a new generation of network is deployed approximately every ten years that does not make commercial sense.
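The shape of that payback calculation is simple enough to show. In the example below only the per-site cost is the figure quoted earlier; the number of extra sites, the size of the existing network, the energy saving per site and the electricity price are all assumptions chosen purely to illustrate the arithmetic.

    # Simple payback check for extra sites versus energy savings. Only the per-site
    # cost is from the article; every other input is an assumption for illustration.
    site_capex_usd = 160_000        # macro site cost (article figure)
    extra_sites = 1_000             # assumed extra sites for a denser layout
    legacy_sites = 10_000           # assumed existing network size
    saving_mwh_per_site = 5.0       # assumed annual energy saving per existing site
    electricity_usd_mwh = 90.0      # assumed electricity price per MWh

    capex = extra_sites * site_capex_usd
    annual_saving = legacy_sites * saving_mwh_per_site * electricity_usd_mwh
    print(f"capex ${capex / 1e6:.0f}m, saving ${annual_saving / 1e6:.1f}m a year, "
          f"simple payback {capex / annual_saving:.0f} years")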

The introduction of carbon trading has increased the sensitivity of the business case to energy consumption, but not significantly so. Just before prices shot up in response to the Japanese nuclear crisis they were trading at a price equivalent to €0.008 per kWh of electricity, around 12 per cent of the price of the electricity itself. Increasing energy costs to the point where they dominate over the Capex of building extra sites would create a huge increase in the price of mobile telecom services to consumers.
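For scale, that 12 per cent figure can be reproduced with a single line of arithmetic; the carbon cost of €0.008 per kWh is the figure quoted above, while the electricity price of around €0.065 per kWh is an assumption.

    carbon_eur_per_kwh = 0.008          # carbon price quoted above
    electricity_eur_per_kwh = 0.065     # assumed electricity price
    uplift = 100 * carbon_eur_per_kwh / electricity_eur_per_kwh
    print(f"Carbon credits add roughly {uplift:.0f} per cent to the energy bill")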

An alternative approach based on linking the spectrum cost to energy usage and not just to the bandwidth would give a much more positive incentive.

PA believes that regulators should prioritise energy efficiency over spectrum efficiency. The thrust of many telecoms regulators over the last decade has been to maximise use of the finite radio spectrum. Typically this has been encouraged by means of spectrum pricing, either through auctions or bandwidth-based licence fees. There has been little incentive in any other direction.

Whilst it is still important not to waste spectrum, licence applications and fees should be assessed on grounds that factor in the energy needed to provide the service. This should be given at least as much weight as the spectrum efficiency factor, and preferably more.

The cost of energy already gives a small incentive in the right direction. The disincentives are the site costs and the spectrum cost, and these are far stronger. Regulators are not in a position to have much impact on the site costs but they do determine the spectrum costs. Linking the cost of the spectrum licence to the energy efficiency of the network being licensed and reducing the dependence on the bandwidth would help to offset the investment needed to create a more efficient network.
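One hypothetical shape such a linkage could take, offered purely as an illustration rather than a proposal from PA or any regulator, is a bandwidth charge scaled by how the licensee's energy per bit compares with a benchmark set by the regulator. All of the names and numbers in the sketch below are invented.

    # Hypothetical licence-fee formula of the kind described above: a baseline
    # bandwidth charge scaled by the network's energy per bit relative to a
    # regulator-set benchmark. All names and numbers are invented for illustration.
    def annual_fee_gbp(bandwidth_mhz, energy_j_per_bit,
                       benchmark_j_per_bit=1e-6, rate_gbp_per_mhz=200_000.0,
                       weight=0.5):
        """Fee = bandwidth charge * ((1 - weight) + weight * energy / benchmark)."""
        energy_factor = energy_j_per_bit / benchmark_j_per_bit
        return bandwidth_mhz * rate_gbp_per_mhz * ((1 - weight) + weight * energy_factor)

    # A network twice as energy-hungry as the benchmark pays 50 per cent more
    print(annual_fee_gbp(20, 2e-6), annual_fee_gbp(20, 1e-6))

Under a scheme of this shape, a licensee that halves its energy per bit sees its fee fall, which is precisely the incentive a bandwidth-only charge fails to provide.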

If the dynamics of the spectrum marketplace can be tilted in this way then operators will have a greater incentive to find lower power means of providing their services. This in turn will be passed on to manufacturers and their representatives on the standards bodies. For years researchers have concentrated on maximising bit rate over a given channel. In future, given the right incentives from regulators, this emphasis should move towards minimising the energy per bit rather than maximising bits per Hz.

What is needed is a change of mindset from spectrum efficiency to energy efficiency and a change of financial incentives to encourage it.

www.paconsulting.com/wireless

[email protected]

About the Author(s)

Mike Hibberd

Mike Hibberd was previously editorial director at Telecoms.com, Mobile Communications International magazine and Banking Technology | Follow him @telecomshibberd
