Abstract
In this paper, we propose an algorithm encouraging group sparsity under a convex constraint. It stems from applications where the regression coefficients are subject to constraints, for example nonnegativity, and where the explanatory variables are not suitable to be orthogonalized within groups. It takes the form of the group LASSO based on a linear regression model, where an L1/L2 norm is imposed on the group coefficients to achieve group sparsity. It differs from the original group LASSO in two ways. First, the regression coefficients must obey some convex constraints. Second, there is no requirement for orthogonality of the variables within individual groups. For these reasons, simple blockwise coordinate descent over all group coefficients is no longer applicable, and a special treatment of the constraint is necessary. The algorithm we propose in this paper is an alternating direction method, and both exact and inexact solutions are provided. The inexact version simplifies the computation while retaining practical convergence. As an approximation to group L0, it can be applied to data analysis where group fitting is essential and the coefficients are constrained. It may serve as a screening procedure to reduce the number of groups when the total number of groups is prohibitively large.
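The abstract's setting can be illustrated with a minimal sketch: group-sparse least squares with a nonnegativity constraint, solved by an ADMM-style alternating direction split. This is an assumed, simplified instance of the problem class described above, not the authors' exact algorithm; the function name, parameters, and the particular splitting are illustrative choices. The z-update exploits the fact that the proximal operator of the group L2 norm plus the nonnegativity indicator is a projection onto the nonnegative orthant followed by group soft-thresholding.

```python
import numpy as np

def group_lasso_nonneg_admm(X, y, groups, lam, rho=1.0, n_iter=200):
    """Group-sparse nonnegative least squares via an ADMM-style split.

    minimize 0.5*||y - X b||^2 + lam * sum_g ||b_g||_2   s.t.  b >= 0

    `groups` is a list of index arrays, one per group. Illustrative
    sketch only -- not the paper's exact (exact/inexact) algorithm.
    """
    n, p = X.shape
    b = np.zeros(p)
    z = np.zeros(p)   # copy of b carrying the penalty and the constraint
    u = np.zeros(p)   # scaled dual variable for the consensus b = z
    A = X.T @ X + rho * np.eye(p)   # fixed matrix for the b-update
    Xty = X.T @ y
    for _ in range(n_iter):
        # b-update: unconstrained ridge-like least squares
        b = np.linalg.solve(A, Xty + rho * (z - u))
        # z-update: prox of lam*sum_g||.||_2 + indicator(z >= 0)
        w = b + u
        for g in groups:
            v = np.maximum(w[g], 0.0)        # project onto z >= 0
            nv = np.linalg.norm(v)
            # group soft-threshold: zeroes out an entire group at once
            z[g] = 0.0 if nv <= lam / rho else (1.0 - lam / (rho * nv)) * v
        # dual update
        u += b - z
    return z

# Tiny synthetic check: two groups, only the first is truly active.
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 6))
b_true = np.array([1.0, 2.0, 0.5, 0.0, 0.0, 0.0])
y = X @ b_true
groups = [np.arange(0, 3), np.arange(3, 6)]
z = group_lasso_nonneg_admm(X, y, groups, lam=1.0)
```

The inexact variant mentioned in the abstract would replace the exact linear solve in the b-update with a cheaper approximate step; the sketch above uses the exact solve for clarity.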
Original language | English |
---|---|
Title of host publication | AI 2012: Advances in Artificial Intelligence: 25th International Australasian Joint Conference, Sydney, Australia, December 4-7, 2012, Proceedings |
Publisher | Springer |
Pages | 433-444 |
Number of pages | 12 |
ISBN (Print) | 9783642351006 |
DOIs | |
Publication status | Published - 2012 |
Event | Australasian Joint Conference on Artificial Intelligence - Duration: 4 Dec 2012 → … |
Conference
Conference | Australasian Joint Conference on Artificial Intelligence |
---|---|
Period | 4/12/12 → … |
Keywords
- algorithm
- least squares
- linear models (statistics)