Sparse model construction using coordinate descent optimization

Xia Hong, Yi Guo, Sheng Chen, Junbin Gao

Research output: Chapter in Book / Conference Paper › Conference Paper › peer-review

1 Citation (Scopus)

Abstract

We propose a new sparse model construction method aimed at maximizing a model's generalisation capability for a large class of linear-in-the-parameters models. The coordinate descent optimization algorithm is employed with a modified l1-penalized least squares cost function in order to estimate a single parameter and its regularization parameter simultaneously based on the leave-one-out mean square error (LOOMSE). Our original contribution is to derive a closed form of the optimal LOOMSE regularization parameter for a single-term model, for which we show that the LOOMSE can be computed analytically without actually splitting the data set, leading to a very simple parameter estimation method. We then integrate the new results within the coordinate descent optimization algorithm to update the model parameters one at a time for linear-in-the-parameters models. Consequently, a fully automated procedure is achieved without resorting to any other validation data set for iterative model evaluation. Illustrative examples are included to demonstrate the effectiveness of the new approaches.
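The abstract outlines coordinate descent over a linear-in-the-parameters model in which each coordinate's weight and regularization parameter are chosen by minimising a leave-one-out criterion. The sketch below is a minimal illustration of that idea in Python, not the authors' method: it substitutes a ridge-style single-term penalty (so the standard leave-one-out identity applies exactly) in place of the paper's modified l1-penalized cost and its closed-form optimal LOOMSE regularization parameter, and it searches a hypothetical grid of candidate lambdas; all function names, the grid, and the stopping rule are assumptions for illustration.

```python
import numpy as np

def loo_mse_single_term(r, phi, lam):
    """LOO mean square error for fitting residual r with a single regressor phi
    under a ridge-style penalty lam (illustrative stand-in; not the paper's
    closed-form LOOMSE solution)."""
    denom = phi @ phi + lam
    w = (phi @ r) / denom          # penalized single-parameter estimate
    h = phi**2 / denom             # per-sample leverage for this term
    e = r - phi * w                # in-sample residuals
    e_loo = e / (1.0 - h)          # standard leave-one-out identity
    return np.mean(e_loo**2), w

def coordinate_descent_sparse(Phi, y, lam_grid=None, n_sweeps=10, tol=1e-8):
    """Coordinate descent for y ~ Phi @ w, updating one parameter at a time and
    choosing each coordinate's regularization by minimising the LOO criterion."""
    if lam_grid is None:
        lam_grid = np.logspace(-6, 2, 25)  # hypothetical candidate penalties
    n, m = Phi.shape
    w = np.zeros(m)
    for _ in range(n_sweeps):
        w_old = w.copy()
        for k in range(m):
            # residual with the k-th term's current contribution removed
            r = y - Phi @ w + Phi[:, k] * w[k]
            # pick the (LOO MSE, weight) pair with the smallest LOO MSE
            best = min((loo_mse_single_term(r, Phi[:, k], lam) for lam in lam_grid),
                       key=lambda t: t[0])
            w[k] = best[1]
        if np.max(np.abs(w - w_old)) < tol:
            break
    return w
```

In this sketch, terms whose LOO-optimal single-parameter fit stays near zero contribute little to the model, mimicking (but not reproducing) the sparsity-inducing behaviour described in the abstract, and no separate validation set is needed because the LOO error is computed analytically from the training data.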
Original language: English
Title of host publication: 2013 18th International Conference on Digital Signal Processing, DSP 2013, Santorini, Greece, 1-3 July, 2013
Publisher: IEEE
Number of pages: 6
ISBN (Print): 9781467358057
DOIs
Publication status: Published - 2013
Event: International Conference on Digital Signal Processing
Duration: 1 Jul 2013 → …

Conference

Conference: International Conference on Digital Signal Processing
Period: 1/07/13 → …

Keywords

  • algorithms
  • linear models (statistics)
  • regularization
