Monday, May 16, 2011

The term radio resource management (RRM) is generally used in wireless systems in a broad sense to cover all functions related to the assignment and sharing of radio resources among the users (e.g., mobile terminals, radio bearers, user sessions) of the wireless network. The type of resource control, resource sharing, and assignment methods required is primarily determined by the fundamentals of the multiple access technology, such as frequency division multiple access (FDMA), time division multiple access (TDMA), or code division multiple access (CDMA), and the feasible combinations thereof. Likewise, the smallest unit in which radio resources are assigned and distributed among the entities (e.g., power, time slots, frequency bands/carriers, or codes) also varies depending on the multiple access technology employed on the radio interface. The placement and distribution of the RRM functions across the network entities of the radio access network (RAN), including the functional split between the terminal and the network as well as the protocols and interfaces between the different entities, constitute the RAN architecture.

The radio interface of LTE is based on OFDM technology, in which the radio resource appears as one common shared channel, shared by all users in the cell. The scheduler, which is located in the eNode B, assigns time-frequency blocks to UEs within the cell in an orthogonal manner, so that no two UEs in the cell are assigned the same time-frequency resource; intracell interference is thus avoided. One exception, though, is multiuser spatial multiplexing, also called multiuser MIMO (multiple input, multiple output), in which multiple UEs with spatially separated channels share the same time-frequency resource in the uplink of LTE.

Such a scheduler function is needed for both the uplink (UL) and the downlink (DL), and it applies to both the frequency division duplexing (FDD) and time division duplexing (TDD) modes. The illustration below shows the resource grid of the uplink and downlink shared channels. The smallest unit in the resource grid is the resource element (RE), which corresponds to one subcarrier during one symbol duration. These resource elements are organized into larger blocks in time and in frequency: seven such symbol durations constitute one slot of length 0.5 ms, and 12 subcarriers during one slot form the so-called resource block (RB).
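The grid dimensions above can be captured in a few lines of arithmetic. The following is an illustrative sketch using the quantities from the text (7 symbols per 0.5 ms slot, 12 subcarriers per RB); the 50-RB carrier width chosen for the example corresponds to a 10 MHz LTE carrier.

```python
# Bookkeeping for the LTE resource grid described in the text
# (normal cyclic prefix: 7 OFDM symbols per slot).
SYMBOLS_PER_SLOT = 7       # OFDM symbol durations in one 0.5 ms slot
SUBCARRIERS_PER_RB = 12    # subcarriers spanning one resource block
SLOTS_PER_SUBFRAME = 2     # two consecutive slots form a 1 ms subframe
SUBFRAMES_PER_FRAME = 10   # ten subframes form a 10 ms frame

# One resource element (RE) = one subcarrier during one symbol duration,
# so one RB contains 7 x 12 resource elements.
res_elements_per_rb = SYMBOLS_PER_SLOT * SUBCARRIERS_PER_RB

# Illustrative carrier: 50 RBs across frequency (a 10 MHz LTE carrier).
rbs_in_frequency = 50
res_elements_per_subframe = (res_elements_per_rb * SLOTS_PER_SUBFRAME
                             * rbs_in_frequency)

print(res_elements_per_rb)        # 84 REs per RB
print(res_elements_per_subframe)  # 8400 REs per 1 ms subframe
```

These counts are what the scheduler ultimately carves up among the UEs, although, as noted below, it does so at RB-pair granularity rather than per RE.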

Two consecutive time slots are called a subframe, and 10 such subframes create a frame, which is 10 ms long. The scheduler can assign resource blocks only in pairs of two consecutive RBs (in time); that is, the smallest unit of resource that can be assigned is two RBs. There is, however, one important difference between the feasible assignments on the UL and DL shared channels. Because the UL modulation uses the single carrier FDMA (SC-FDMA) concept, the RBs allocated to a UE must be consecutive in frequency. The SC-FDMA modulation basically corresponds to a discrete Fourier transform (DFT) precoded OFDM signal, where the modulation symbols are mapped to consecutive OFDM carriers. The primary motivation for using the SC-FDMA scheme in the UL is to achieve a lower peak-to-average power ratio. Because the LTE physical layer is defined such that it supports various multiantenna MIMO schemes, such as transmit diversity and spatial multiplexing, the virtual space of radio resources is extended with a third dimension corresponding to the antenna port, in addition to the classical time and frequency domains. This essentially means that a time-frequency resource grid is available per antenna port. In the downlink, the system supports multistream transmission on up to four transmit antennas. In the uplink, no multistream transmission is supported from the same UE, but multiuser MIMO transmission is possible.
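The UL/DL asymmetry above amounts to a simple constraint on the frequency-domain RB indices given to one UE. A minimal sketch of such a validity check (the function name and interface are hypothetical, not from the specification):

```python
def valid_allocation(rb_indices, uplink):
    """Check whether a set of frequency-domain RB indices is a feasible
    per-UE allocation: the downlink scheduler may pick arbitrary RBs,
    while SC-FDMA in the uplink requires a contiguous block."""
    rbs = sorted(set(rb_indices))
    if not rbs:
        return False
    if not uplink:
        return True                       # DL: any set of RBs is allowed
    # UL: indices must be consecutive in frequency (no gaps)
    return rbs[-1] - rbs[0] + 1 == len(rbs)

print(valid_allocation([4, 5, 6, 7], uplink=True))   # True  (contiguous)
print(valid_allocation([2, 5, 6], uplink=True))      # False (gap at 3-4)
print(valid_allocation([2, 5, 6], uplink=False))     # True  (DL allows gaps)
```

The DL freedom to scatter RBs across frequency is what enables frequency-selective scheduling gains, while the UL contiguity restriction is the price paid for the lower peak-to-average power ratio of SC-FDMA.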

Defining the abstract resource element in LTE as the three-tuple [time, frequency, antenna port], the generic radio resource assignment problem in LTE can be formulated as finding an optimal allocation of the [time, frequency, antenna port] resource units to UEs so that the QoS requirements of the radio bearers are satisfied while minimizing the use of the radio resources. A closely related function to resource assignment is link adaptation (LA), which selects the transport format—that is, the modulation and coding scheme (MCS)—and allocates power to the assigned [time, frequency, antenna port] resource. Primarily, the scheduler in the eNode B executes the preceding resource assignment function, although the antenna configuration can be seen as a somewhat separate function from the generic scheduler operation. The scheduler selects the time-frequency resource to assign to a particular UE based on the channel conditions and the QoS needs of that UE. Then the LA function selects the MCS and allocates power to the selected time-frequency resources.
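The two-step operation described above can be sketched in toy form: a channel-aware scheduler picks which UE gets each RB pair, and a link adaptation step maps the reported channel quality to an MCS. The CQI-to-MCS table, the max-CQI policy, and the UE reports below are all illustrative assumptions, not values from the specification; real eNode B schedulers also weigh QoS class, buffer state, and fairness.

```python
# Toy CQI-to-MCS mapping: (minimum CQI, modulation, coding rate).
# The thresholds and rates are invented for illustration only.
MCS_TABLE = [
    (13, "64QAM", 0.75),
    (10, "64QAM", 0.55),
    (7,  "16QAM", 0.50),
    (4,  "QPSK",  0.40),
    (0,  "QPSK",  0.12),
]

def link_adaptation(cqi):
    """Select a modulation-and-coding scheme for a reported channel
    quality indicator (CQI): pick the most aggressive MCS whose
    threshold the report clears."""
    for min_cqi, modulation, rate in MCS_TABLE:
        if cqi >= min_cqi:
            return modulation, rate
    return MCS_TABLE[-1][1], MCS_TABLE[-1][2]

def schedule(cqi_reports, num_rb_pairs):
    """Assign each RB pair to the UE with the best CQI on that RB pair
    (a max-CQI scheduler), then run link adaptation on the assignment.
    cqi_reports: {ue_id: [cqi for each RB pair]}"""
    grid = []
    for rb in range(num_rb_pairs):
        ue = max(cqi_reports, key=lambda u: cqi_reports[u][rb])
        grid.append((ue, link_adaptation(cqi_reports[ue][rb])))
    return grid

reports = {"ue1": [14, 6, 9], "ue2": [8, 11, 5]}
for rb, (ue, mcs) in enumerate(schedule(reports, 3)):
    print(rb, ue, mcs)   # each RB pair goes to the UE with the higher CQI
```

The separation into `schedule` and `link_adaptation` mirrors the text: resource selection is driven by per-UE channel conditions, while MCS and power setting are applied afterwards to whatever resources were selected.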

The antenna configuration, such as the MIMO mode and its corresponding parameters (e.g., the precoding matrix), can be controlled essentially separately from the time-frequency assignments of the scheduler, although the two operations are not totally independent.

In an ideal case, the assignment of [time, frequency, antenna port] resources and the allocation of MCS and power would be done network-wide, based on global knowledge of all cells, in order to obtain the network-wide optimum assignment. However, this is infeasible in practice, because such a solution would require a global “super scheduler” function operating on global information. Therefore, in practice, the resource assignment is performed by distributed entities operating on a cell level in the individual eNode Bs. This does not preclude some coordination between the distributed entities in neighbor eNode Bs—an important aspect of the RRM architecture that needs to be considered in LTE. Such neighbor eNode B coordination can be useful for various RRM functions such as intercell interference coordination (ICIC).