Abstract
The Long Term Evolution (LTE) system is designed to provide high-quality data service to fast-moving mobile users. It is based on Orthogonal Frequency Division Multiplexing (OFDM) and bases its channel estimation on training samples systematically embedded in the transmitted data. Training samples are distributed either as a preamble or in a lattice pattern; the latter is better suited to multipath fading channels whose channel frequency response (CFR) fluctuates rapidly with time. In the lattice-type structure, the CFR is estimated by a least squares estimate (LSE) at each pilot sample, followed by interpolation in both the time and frequency domains to fill in channel estimates for the subcarriers carrying data samples. All interpolation schemes rely on the pilot estimates alone, so their performance is bounded by the quality of those estimates. However, additive noise causes strong fluctuation in the pilot estimates, especially in communication environments with a low signal-to-noise ratio. These fluctuations appear as alternating high values of the first forward differences (FFD) between adjacent pilot estimates. In this paper, we statistically analyze these FFD values and propose a postprocessing algorithm that suppresses the high fluctuations in noisy pilot estimates. The proposed method is based on localized adaptive moving-average filtering. Its performance is verified in a multipath environment specified in the 3GPP LTE standard, where the mean-squared error (MSE) between the actual CFR and the pilot estimates is reduced by up to 68% relative to the noisy pilot estimates.
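To make the FFD-based postprocessing idea concrete, the sketch below shows one plausible reading of the abstract: compute the first forward differences between adjacent pilot LSEs, flag pilots where the FFD alternates with large magnitude on both sides (a signature of a noise spike rather than a genuine channel change), and replace each flagged pilot with a localized moving average. The function name, threshold rule, and window parameter are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def suppress_pilot_fluctuations(pilot_est, threshold, window=2):
    """Hypothetical sketch of FFD-based pilot smoothing.

    pilot_est : 1-D complex array of LSE channel estimates at pilot subcarriers.
    threshold : FFD magnitude above which a pilot is treated as a noise spike
                (an assumed decision rule; the paper derives its own criterion
                from the FFD statistics).
    window    : half-width of the localized moving-average window (assumed).
    """
    # First forward differences (FFD) between adjacent pilot estimates.
    ffd = np.diff(pilot_est)

    smoothed = pilot_est.copy()
    for k in range(1, len(pilot_est) - 1):
        # Large FFD magnitudes on both sides of pilot k suggest an
        # isolated noise-driven fluctuation at that pilot.
        if abs(ffd[k - 1]) > threshold and abs(ffd[k]) > threshold:
            lo = max(0, k - window)
            hi = min(len(pilot_est), k + window + 1)
            # Replace the flagged pilot by a local moving average,
            # leaving well-behaved pilots untouched (hence "localized").
            smoothed[k] = pilot_est[lo:hi].mean()
    return smoothed
```

Smoothing only the flagged pilots, rather than filtering the whole pilot sequence, preserves genuine CFR variation across subcarriers while damping the alternating spikes that additive noise produces at low SNR.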