ABSTRACT
High-dimensional regression and classification remain important and challenging problems, especially when features are highly correlated. Feature selection, combined with additional structural information about the features, is a promising way to improve regression and classification performance. The graph-guided fused lasso (GFlasso) was recently proposed to perform feature selection while exploiting graph structure among the features. However, the GFlasso formulation relies on pairwise sample correlations to perform feature grouping, which can introduce additional estimation bias. In this paper, we propose three new feature grouping and selection methods that resolve this issue. The first method uses a convex function to penalize the pairwise l∞ norm of connected regression/classification coefficients, achieving simultaneous feature grouping and selection. The second method improves on the first by using a non-convex penalty to reduce estimation bias. The third extends the second with a truncated l1 regularization to reduce the bias further. All three methods combine feature grouping with feature selection to enhance estimation accuracy. We solve the proposed formulations using the alternating direction method of multipliers (ADMM) and difference of convex functions (DC) programming. Experimental results on synthetic data and two real datasets demonstrate the effectiveness of the proposed methods.
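The abstract only names the three penalties; the full formulations appear in the paper body. As a rough illustration of the penalty structures described above, the Python sketch below evaluates the convex pairwise l∞ grouping penalty (first method) and the truncated l1 penalty (third method), and shows the difference-of-convex split that DC programming exploits. The function names, the edge-list encoding of the feature graph, and the parameters lam and tau are our own illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def pairwise_linf_penalty(beta, edges, lam):
    """Convex grouping/selection penalty (first method): for each edge
    (i, j) of the feature graph, penalize the l-infinity norm of the
    connected coefficient pair, i.e. max(|beta_i|, |beta_j|)."""
    return lam * sum(max(abs(beta[i]), abs(beta[j])) for i, j in edges)

def truncated_l1(beta, tau):
    """Truncated l1 penalty (third method): min(|t|, tau) for each
    coefficient. Capping the penalty at tau means large coefficients
    receive no extra shrinkage, which reduces estimation bias."""
    return np.minimum(np.abs(beta), tau).sum()

def truncated_l1_dc_parts(beta, tau):
    """Difference-of-convex split used by DC programming:
    min(|t|, tau) = |t| - max(|t| - tau, 0).
    Each DC iteration linearizes the subtracted convex term at the
    current iterate, leaving a convex subproblem for ADMM."""
    convex_part = np.abs(beta)
    subtracted_part = np.maximum(np.abs(beta) - tau, 0.0)
    return convex_part, subtracted_part

# Hypothetical usage on a 4-feature problem with a chain graph.
beta = np.array([1.2, 1.2, -0.3, 0.0])
edges = [(0, 1), (1, 2), (2, 3)]
print(pairwise_linf_penalty(beta, edges, lam=0.5))
print(truncated_l1(beta, tau=1.0))
```

Intuitively, the max over each connected pair charges only for the larger coefficient in magnitude, so the smaller one can grow to match it at no extra cost; this is what encourages connected coefficients to share magnitudes (grouping) while the penalty's l1-like behavior near zero still performs selection.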
REFERENCES
- F. Bach, G. Lanckriet, and M. Jordan. Multiple kernel learning, conic duality, and the SMO algorithm. In ICML, 2004.
- H. Bondell and B. Reich. Simultaneous regression shrinkage, variable selection, and supervised clustering of predictors with OSCAR. Biometrics, 64(1):115--123, 2008.
- S. Boyd, N. Parikh, E. Chu, B. Peleato, and J. Eckstein. Distributed optimization and statistical learning via the alternating direction method of multipliers. Foundations and Trends in Machine Learning, 3(1):1--122, 2011.
- H. Chuang, E. Lee, Y. Liu, D. Lee, and T. Ideker. Network-based classification of breast cancer metastasis. Molecular Systems Biology, 3(1), 2007.
- H. Fei, B. Quanz, and J. Huan. Regularization and feature selection for networked features. In CIKM, pages 1893--1896, 2010.
- S. Huang, J. Li, L. Sun, J. Liu, T. Wu, K. Chen, A. Fleisher, E. Reiman, and J. Ye. Learning brain connectivity of Alzheimer's disease from neuroimaging data. NIPS, 22:808--816, 2009.
- L. Jacob, G. Obozinski, and J. Vert. Group lasso with overlap and graph lasso. In ICML, pages 433--440, 2009.
- R. Jenatton, J. Mairal, G. Obozinski, and F. Bach. Proximal methods for sparse hierarchical dictionary learning. In ICML, 2010.
- S. Kim and E. Xing. Statistical estimation of correlated genome associations to a quantitative trait network. PLoS Genetics, 5(8):e1000587, 2009.
- C. Li and H. Li. Network-constrained regularization and variable selection for analysis of genomic data. Bioinformatics, 24(9):1175--1182, 2008.
- J. Liu and J. Ye. Moreau-Yosida regularization for grouped tree structure learning. In NIPS, 2010.
- A. Rinaldo. Properties and refinements of the fused lasso. The Annals of Statistics, 37(5B):2922--2952, 2009.
- X. Shen and H. Huang. Grouping pursuit through a regularization solution surface. Journal of the American Statistical Association, 105(490):727--739, 2010.
- X. Shen, H. Huang, and W. Pan. Simultaneous supervised clustering and feature selection over a graph. Submitted.
- X. Shen and J. Ye. Adaptive model selection. Journal of the American Statistical Association, 97(457):210--221, 2002.
- P. Tao and L. An. Convex analysis approach to DC programming: theory, algorithms and applications. Acta Mathematica Vietnamica, 22(1):289--355, 1997.
- P. Tao and S. El Bernoussi. Duality in DC (difference of convex functions) optimization: subgradient methods. Trends in Mathematical Optimization, 84:277--293, 1988.
- R. Tibshirani. Regression shrinkage and selection via the lasso. Journal of the Royal Statistical Society: Series B, 58(1):267--288, 1996.
- R. Tibshirani, M. Saunders, S. Rosset, J. Zhu, and K. Knight. Sparsity and smoothness via the fused lasso. Journal of the Royal Statistical Society: Series B, 67(1):91--108, 2005.
- N. Tzourio-Mazoyer, B. Landeau, D. Papathanassiou, F. Crivello, O. Etard, N. Delcroix, B. Mazoyer, and M. Joliot. Automated anatomical labeling of activations in SPM using a macroscopic anatomical parcellation of the MNI MRI single-subject brain. NeuroImage, 15(1):273--289, 2002.
- L. Yuan, J. Liu, and J. Ye. Efficient methods for overlapping group lasso. In NIPS, 2011.
- M. Yuan and Y. Lin. Model selection and estimation in regression with grouped variables. Journal of the Royal Statistical Society: Series B, 68(1):49--67, 2006.
- T. Zhang. Multi-stage convex relaxation for feature selection. arXiv preprint, 2011.
- P. Zhao, G. Rocha, and B. Yu. The composite absolute penalties family for grouped and hierarchical variable selection. The Annals of Statistics, 37(6A):3468--3497, 2009.
- L. Zhong and J. Kwok. Efficient sparse modeling with automatic feature grouping. In ICML, 2011.
- Y. Zhu, X. Shen, and W. Pan. Simultaneous grouping pursuit and feature selection in regression over an undirected graph. Submitted.
- H. Zou and T. Hastie. Regularization and variable selection via the elastic net. Journal of the Royal Statistical Society: Series B, 67(2):301--320, 2005.
Index Terms
- Feature grouping and selection over an undirected graph