Training Radial Basis Neural Networks with the Extended Kalman Filter

    Research output: Contribution to journal › Article › peer-review

    Abstract

    Radial basis function (RBF) neural networks provide attractive possibilities for solving signal processing and pattern classification problems. Several algorithms have been proposed for choosing the RBF prototypes and training the network. The selection of the RBF prototypes and the network weights can be viewed as a system identification problem. As such, this paper proposes the use of the extended Kalman filter for the learning procedure. After the user chooses how many prototypes to include in the network, the Kalman filter simultaneously solves for the prototype vectors and the weight matrix. A decoupled extended Kalman filter is then proposed in order to decrease the computational effort of the training algorithm. Simulation results are presented on reformulated radial basis neural networks as applied to the Iris classification problem. It is shown that the use of the Kalman filter results in better learning than conventional RBF networks and faster learning than gradient descent.
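As a rough illustration of the approach described above, the following sketch treats the RBF prototypes and output weights as a single state vector and updates it with a standard extended Kalman filter, one training sample at a time. This is a minimal reconstruction from the abstract, not the paper's exact algorithm: the function names, the Gaussian basis with a fixed width `sigma`, and the noise parameters `P0`, `R`, and `Q` are all illustrative assumptions.

```python
import numpy as np

def rbf_forward(x, centers, weights, sigma):
    # Gaussian basis activations for one input vector x, and the scalar output
    phi = np.exp(-np.sum((centers - x) ** 2, axis=1) / (2 * sigma ** 2))
    return phi @ weights, phi

def ekf_train_rbf(X, d, n_proto, sigma=1.0, P0=10.0, R=0.1, Q=1e-6, seed=0):
    """Jointly estimate RBF prototypes and weights with an EKF (sketch)."""
    rng = np.random.default_rng(seed)
    dim = X.shape[1]
    # Initialize prototypes from random training samples, small random weights
    centers = X[rng.choice(len(X), n_proto, replace=False)].astype(float)
    weights = rng.normal(scale=0.1, size=n_proto)
    n = n_proto * dim + n_proto            # state dimension
    P = P0 * np.eye(n)                     # state error covariance
    for x, target in zip(X, d):
        y, phi = rbf_forward(x, centers, weights, sigma)
        # Jacobian of the output w.r.t. the state [centers.ravel(), weights]:
        # dy/dc_j = w_j * phi_j * (x - c_j) / sigma^2,  dy/dw_j = phi_j
        dc = (weights * phi)[:, None] * (x - centers) / sigma ** 2
        H = np.concatenate([dc.ravel(), phi])[None, :]   # 1 x n
        S = H @ P @ H.T + R                # innovation variance
        K = P @ H.T / S                    # Kalman gain, n x 1
        theta = np.concatenate([centers.ravel(), weights])
        theta = theta + (K * (target - y)).ravel()
        centers = theta[: n_proto * dim].reshape(n_proto, dim)
        weights = theta[n_proto * dim:]
        P = P - K @ H @ P + Q * np.eye(n)  # covariance update with process noise
    return centers, weights
```

Because the full state couples every prototype to every weight through `P`, each update costs O(n²); the decoupled EKF mentioned in the abstract reduces this by partitioning the state (e.g., per prototype) and maintaining a smaller covariance block for each partition.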

    Original language: American English
    Journal: Neurocomputing
    Volume: 48
    State: Published - Oct 1, 2002

    Keywords

    • Radial basis function (RBF)
    • Training
    • Optimization
    • Gradient descent
    • Kalman filter

    Disciplines

    • Digital Communications and Networking
    • Electrical and Computer Engineering

    Cite this