How LTE deals with inter-cell interference

For some reason, all Russian-language posts devoted to LTE discuss only the principles of the basic physical-layer technologies - OFDMA [1], SC-FDMA [2], a little MIMO [3], [4], some aspects of the architecture [5] and VoLTE [6]. All of this is, of course, very important and useful, but that's not all! Besides everything listed above, LTE is packed with very interesting solutions for distributing time-frequency resources in the uplink and downlink (various scheduler algorithms), for adapting modulation, coding and bandwidth to radio conditions, for medium-access procedures, new types of handovers, and so on - far from trivial approaches are used there. But there is one more curious question that the Habr community for some reason ignores: how does an LTE network work in the complete absence of frequency-territorial planning (Frequency Reuse Factor = 1)? Consider an older network, say GSM (see below):


The entire frequency range was divided into subbands, and the main planning rule was to use different frequency bands in neighboring cells; otherwise, the signals of neighboring cells would interfere with each other. In UMTS (WCDMA), everything was somewhat more complicated: all base stations (NodeBs) used the same time-frequency resource, and scrambling with orthogonal or pseudo-orthogonal sequences was used to separate the signals of different cells, or the signals of different subscribers within the same cell.
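As a small illustration of that GSM planning rule, here is a toy reuse-3 "coloring" of a hexagonal cell grid (the axial-coordinate scheme and the function name are my own illustrative choices, not taken from any real planning tool):

```python
def gsm_reuse3_group(i, j):
    """Which of 3 frequency subbands a cell at axial hex coordinates
    (i, j) gets under a classic reuse-3 plan.  The coloring
    (i + 2*j) mod 3 guarantees that no two adjacent hexagonal cells
    share a subband - a toy sketch, not a planning algorithm."""
    return (i + 2 * j) % 3

# All six neighbors of cell (0, 0) land in a different group than it:
neighbors = [(1, 0), (-1, 0), (0, 1), (0, -1), (1, -1), (-1, 1)]
assert all(gsm_reuse3_group(i, j) != gsm_reuse3_group(0, 0)
           for i, j in neighbors)
```

The whole point of the planning effort was exactly this property: no cell ever borders a cell using the same subband, at the price of each cell getting only a third of the spectrum.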

One way or another, the problem of inter-cell interference (ICI, Inter-Cell Interference) was not a simple one in GSM and UMTS networks. What do we see in LTE? Not only is the same frequency band used in all cells, there is also (in the general case) no scrambling of signals by orthogonal sequences. What does this mean? If two neighboring base stations (eNBs) allocate resource blocks in the same frequency band at the same time for their subscribers to transmit data, then with some probability those subscribers will interfere with each other. The most unpleasant situation occurs at the edges of the cell:



In this case, the probability of collision (the probability of packet corruption due to the simultaneous allocation of the same resource by two or more base stations to their users) is obviously affected by two factors: 1) the distance between the subscribers, or rather, their proximity to the BS (if the subscribers are close to the BS, the Power Control mechanism kicks in, which will most likely force the phone to lower its transmit power, so the overall level of inter-cell interference will decrease); 2) the load in the cell (also a fairly obvious factor: the higher the load, the greater the likelihood that the same resource block is simultaneously allocated to subscribers at the edges of neighboring cells). If you simulate the operation of such a primitive scheduler, which knows nothing about the load in the neighboring cell, you get the following picture:
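To get a feel for why the load factor matters so much, here is a minimal Monte Carlo sketch of such a "blind" scheduler (my own toy model, not from the article's simulation: two cells pick resource blocks uniformly at random and independently of each other; all names and parameter values are illustrative):

```python
import random

def collision_probability(n_rb=50, load=0.7, trials=10000):
    """Estimate the fraction of resource blocks scheduled by one cell
    that are simultaneously scheduled by its neighbor, for a naive
    scheduler that picks blocks uniformly at random and knows nothing
    about the neighboring cell.  Toy model: 50 RBs roughly matches a
    10 MHz LTE carrier; 'load' is the fraction of RBs in use."""
    k = int(n_rb * load)            # blocks allocated per cell per TTI
    hits, total = 0, 0
    for _ in range(trials):
        a = set(random.sample(range(n_rb), k))
        b = set(random.sample(range(n_rb), k))
        hits += len(a & b)          # blocks both cells used at once
        total += k
    return hits / total
```

For two independent uniform schedulers the expected collision rate per block simply equals the neighbor's load (the chance a given block is among the neighbor's k of n blocks is k/n), so at 70% load about 70% of a cell-edge user's blocks are also in use next door.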



A load of one, i.e. the maximum load, corresponds to the situation when all blocks of the time-frequency resource are allocated. To say that such packet corruption probabilities are huge is to say nothing. This is a blatantly poor interference picture, and, of course, it is unlikely that anyone would have released LTE to the market with such characteristics.

So what has been done in LTE to avoid this catastrophic inter-cell interference without resorting to frequency reuse?

First, LTE runs a mechanism called ICIC (Inter-Cell Interference Coordination) - an interesting thing, I must say. For those who are interested, a detailed description with all the calculations can be found in section 12.5 of the wonderful book listed at the end of this article. The gist of the feature is that neighboring eNBs (base stations) exchange information about their load in the form of an Overload Indicator (OI) over the X2 interface. Thus, they effectively get the opportunity to agree among themselves on which of them will use which subband at which point in time. In this case, the frequency-territorial distribution looks something like this:



That is, subscribers located closer to the antenna can be given any resource blocks by the eNB, while those farther away get blocks depending on the OI indicator. This is by no means classic frequency reuse; it is an adaptive resource distribution that tracks the load of neighboring cells, and it is the main way of reducing inter-cell interference (reducing, but not completely eliminating it, of course).
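That decision logic can be sketched roughly like this (a hypothetical helper written for illustration: here the OI is reduced to a single float per subband, whereas the real indicator exchanged over X2 is a coded per-resource indication, and real schedulers are far more elaborate):

```python
def choose_subband(neighbor_oi, cell_edge):
    """Toy ICIC decision.  neighbor_oi maps subband index -> the
    neighbor's reported load (0.0 .. 1.0, an illustrative stand-in
    for the Overload Indicator received over X2).  cell_edge says
    whether the user is at the cell edge."""
    if not cell_edge:
        # Cell-centre users: power control already keeps their
        # interference contribution low, so any subband will do.
        return next(iter(neighbor_oi))
    # Cell-edge users: steer them to the subband the neighbor
    # currently loads the least.
    return min(neighbor_oi, key=neighbor_oi.get)

oi = {0: 0.9, 1: 0.2, 2: 0.6}      # neighbor's reported per-subband load
edge_band = choose_subband(oi, cell_edge=True)   # -> subband 1
```

The key point the sketch captures: the "reuse pattern" is not fixed at planning time; it re-forms every time the neighbor's reported load changes.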

In addition to such direct mechanisms, LTE provides indirect ways to reduce interference. For example, Fractional Power Control: whereas classical power control aims at full compensation of the signal's propagation loss (Path Loss compensation), fractional power control compensates such losses only partially.



The parameter that sets the Path Loss compensation coefficient is called Alpha in the standard (it takes values from 0 to 1). How it works: an alpha value of 0.8 (80% compensation of signal loss) reduces the level of inter-cell interference by 10-20%! At the same time, subscribers at the cell edges do not experience noticeable problems from the incomplete Path Loss compensation.
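In simplified form, the uplink power setting in the standard (3GPP TS 36.213, PUSCH power control) is P = min(P_max, P0 + 10·log10(M) + α·PL), where M is the number of allocated resource blocks and PL is the measured path loss. A toy sketch of that simplified formula, with the MCS-dependent and closed-loop correction terms dropped and illustrative parameter values:

```python
from math import log10

def pusch_power_dbm(p0=-80.0, alpha=0.8, path_loss=110.0,
                    n_prb=10, p_max=23.0):
    """Simplified PUSCH transmit power (dBm), after TS 36.213 with the
    MCS and closed-loop terms omitted.  alpha < 1 means fractional
    (partial) path-loss compensation; p0 and path_loss values here
    are illustrative, p_max = 23 dBm is the usual UE power class."""
    return min(p_max, p0 + 10 * log10(n_prb) + alpha * path_loss)
```

With alpha = 1 a 110 dB path loss would be fully compensated (and the result clipped at P_max anyway); with alpha = 0.8 the edge user transmits 0.2 x 110 = 22 dB less toward the neighbor, which is exactly where the inter-cell interference reduction comes from.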

There are many more parameters that can be adjusted so that the cells interfere less, but ICIC and Fractional Power Control are perhaps the two most powerful mechanisms.

Very useful LTE book:
Stefania Sesia (ST-Ericsson, France), Issam Toufik (ETSI, France), Matthew Baker (Alcatel-Lucent), LTE - The UMTS Long Term Evolution: From Theory to Practice
