Saturday, 28 December 2013

Bentham eBooks: The Whole Story behind Blind Adaptive Equalizers / Blind Deconvolution

Bentham Science aims to publish eBooks in all major areas of technology, social sciences, medicine and the humanities. Its eBooks provide professionals, corporate researchers, academics, graduates and undergraduates worldwide with the most current information in their subject areas of interest.
It is well known that intersymbol interference (ISI) is a limiting factor in many communication environments, where it causes an irreducible degradation of the bit error rate (BER) and thus imposes an upper limit on the data symbol rate. To overcome the ISI problem, an equalizer is implemented in those systems. Among the three types of equalizers - non-blind, semi-blind and blind - the blind equalizer has the benefit of saving bandwidth, since it needs no training phase. Blind equalization algorithms are essentially adaptive filtering algorithms designed so that they do not require an externally supplied desired response to generate the error signal at the output of the adaptive equalization filter. Instead, the algorithms generate an estimate of the desired response by applying a nonlinear transformation to sequences involved in the adaptation process. This nonlinearity is designed to minimize a cost function that is implicitly based on higher order statistics (HOS) according to one approach, or calculated directly according to Bayes' rule.
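To make the idea concrete, here is a minimal sketch of one classic HOS-based blind equalizer of the kind described above: the constant modulus algorithm (CMA, Godard p = 2). The channel taps, step size, filter length and QPSK source below are illustrative assumptions, not values from the book; the point is only that the error signal is formed from the equalizer output itself (via the nonlinearity y(|y|² − R₂)), with no training sequence.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Simulated QPSK source through a dispersive channel (illustrative values) ---
n = 20000
symbols = (rng.choice([-1.0, 1.0], n) + 1j * rng.choice([-1.0, 1.0], n)) / np.sqrt(2)
channel = np.array([1.0, 0.35 + 0.2j, -0.15])          # hypothetical ISI channel
received = np.convolve(symbols, channel)[:n]
received += 0.01 * (rng.standard_normal(n) + 1j * rng.standard_normal(n))

# --- Constant Modulus Algorithm: blind, no desired response supplied ---
taps = 11
w = np.zeros(taps, dtype=complex)
w[taps // 2] = 1.0                                      # center-spike initialization
R2 = np.mean(np.abs(symbols) ** 4) / np.mean(np.abs(symbols) ** 2)  # Godard radius
mu = 1e-3                                               # step size (assumed)

for k in range(taps, n):
    x = received[k - taps:k][::-1]                      # regressor, newest sample first
    y = np.vdot(w, x)                                   # equalizer output y = w^H x
    e = y * (np.abs(y) ** 2 - R2)                       # nonlinear "error" built from y itself
    w -= mu * np.conj(e) * x                            # stochastic-gradient tap update

# Dispersion E[(|y|^2 - R2)^2] should shrink once the taps have adapted
out = np.array([np.vdot(w, received[k - taps:k][::-1]) for k in range(taps, n)])
disp_before = np.mean((np.abs(received[taps:]) ** 2 - R2) ** 2)
disp_after = np.mean((np.abs(out[-5000:]) ** 2 - R2) ** 2)
```

The update minimizes the Godard dispersion cost E[(|y|² − R₂)²], which depends on fourth-order moments of the output - the "higher order statistics" route mentioned above, as opposed to the Bayesian route that computes the cost directly.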
The complete review is available at:


