HBP: improvement in BP algorithm for an adaptive MLP decision feedback equalizer

S. S. Yang, C. L. Ho, C. M. Lee · IEEE Transactions on Circuits and Systems II: Express Briefs, 2006 · ieeexplore.ieee.org
Though the decision feedback equalizer (DFE) with a multilayer perceptron (MLP) structure can be trained effectively by the backpropagation (BP) algorithm, training is often hampered by the problem of local minima. To mitigate the local-minimum problem of the BP algorithm and to improve its performance under the same MLP structure, we combine the hierarchical approach with the BP algorithm to implement the MLP DFE, and we call the new scheme the hierarchical BP (HBP) algorithm. In the hierarchical approach, proceeding from the input layer to the output layer of the MLP, every two layers of neural nodes (with one hidden layer) are trained with an individual BP algorithm. The entire MLP is therefore trained by several independent BP algorithms, unlike the standard BP algorithm, which trains the whole MLP structure with a single BP pass. The results of the performance evaluation indicate that the HBP algorithm not only strongly reduces the mean squared error but also yields a much lower bit-error rate than the standard BP algorithm does at equal computational cost and under the same conditions.
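The training scheme described above can be sketched in a few lines of NumPy. This is a minimal illustration, not the paper's implementation: it assumes each two-layer block is trained as a small one-hidden-layer network by its own BP loop, with the block's hidden activations then serving as the input features for the next block. The choice of training target for the intermediate blocks (here, the same final target) and the toy XOR-style task are assumptions, since the abstract does not specify them.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_block(X, y, n_hidden, lr=0.5, epochs=2000):
    """Train ONE two-layer block (input -> one hidden layer -> output)
    with its own independent BP loop, minimizing MSE against y."""
    n_in = X.shape[1]
    W1 = rng.normal(0, 0.5, (n_in, n_hidden))
    b1 = np.zeros(n_hidden)
    W2 = rng.normal(0, 0.5, (n_hidden, 1))
    b2 = 0.0
    for _ in range(epochs):
        H = np.tanh(X @ W1 + b1)                 # hidden activations
        out = np.tanh(H @ W2 + b2).ravel()       # block output
        err = out - y
        d_out = (err * (1 - out**2))[:, None]    # BP through output tanh
        dW2 = H.T @ d_out / len(y)
        db2 = d_out.mean()
        d_hid = (d_out @ W2.T) * (1 - H**2)      # BP through hidden tanh
        dW1 = X.T @ d_hid / len(y)
        db1 = d_hid.mean(axis=0)
        W2 -= lr * dW2; b2 -= lr * db2
        W1 -= lr * dW1; b1 -= lr * db1
    H = np.tanh(X @ W1 + b1)
    out = np.tanh(H @ W2 + b2).ravel()
    return H, out

# Toy stand-in for the equalizer decision task: XOR-like +/-1 targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([-1, 1, 1, -1], dtype=float)

# Hierarchical BP: each block runs an independent BP loop; the next
# block takes the previous block's hidden activations as its input,
# so the full MLP is trained by several separate BP algorithms.
H1, _ = train_block(X, y, n_hidden=4)
H2, out = train_block(H1, y, n_hidden=4)

mse = np.mean((out - y) ** 2)
print(mse)
```

The contrast with standard BP is that no gradient flows between the two `train_block` calls: each block's weights are fixed once its own BP loop finishes, which is the sense in which the whole network is trained by several independent BP algorithms rather than one end-to-end pass.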