Abstract
A novel random-gradient-based algorithm is developed for online tracking of the minor component (MC) associated with the smallest eigenvalue of the autocorrelation matrix of the input vector sequence. The five available learning algorithms for tracking one MC are extended to algorithms for tracking multiple MCs or the minor subspace (MS). To overcome the dynamical divergence of some available random-gradient-based algorithms, we propose a modification of the Oja-type algorithms, called OJAm, which works satisfactorily. The averaging differential equation and the energy function associated with OJAm are given. It is shown that the averaging differential equation globally asymptotically converges to an invariance set. The corresponding energy or Lyapunov functions exhibit a unique global minimum, attained if and only if their state matrices span the MS of the autocorrelation matrix of the vector data stream; all other stationary points are saddle (unstable) points. The global convergence of OJAm is also studied. OJAm provides efficient online learning for tracking the MS: it can track an orthonormal basis of the MS, whereas the other five available algorithms cannot. The performance of these algorithms is illustrated via computer simulations.
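To make the flavor of these stochastic updates concrete, the sketch below implements a generic anti-Hebbian Oja-type minor-component rule with an explicit renormalization step on a synthetic data stream. It is a minimal illustration, not the OJAm update itself: the step size, the data model, and the per-step renormalization (which stands in here for the stabilizing modification the paper introduces) are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (not from the paper): a fixed SPD autocorrelation
# matrix C and an i.i.d. stream x(k) with E[x x^T] = C.
n = 5
A = rng.standard_normal((n, n))
C = A @ A.T + 0.1 * np.eye(n)
L = np.linalg.cholesky(C)

w = rng.standard_normal(n)
w /= np.linalg.norm(w)
eta = 0.005  # assumed step size

for k in range(50_000):
    x = L @ rng.standard_normal(n)     # sample with autocorrelation C
    y = float(w @ x)
    w -= eta * (y * x - y * y * w)     # anti-Hebbian Oja-type MC update
    w /= np.linalg.norm(w)             # renormalize to curb divergence

# Offline EVD as a reference for the true minor component.
eigvals, eigvecs = np.linalg.eigh(C)
mc = eigvecs[:, 0]                     # eigenvector of the smallest eigenvalue
print(f"|<w, minor component>| = {abs(w @ mc):.4f}")  # near 1.0 when tracking succeeds
```

Run repeatedly with different seeds, the alignment score stays near one for well-separated smallest eigenvalues; without the renormalization line, the plain rule exhibits the divergence behavior the abstract refers to.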
| Original language | English |
| --- | --- |
| Number of pages | 9 |
| Journal | IEEE Transactions on Neural Networks |
| Publication status | Published - 2005 |
Keywords
- Lyapunov functions
- convergence
- eigenvalue decomposition (EVD)
- energy function
- invariance set
- learning algorithm