DOI: 10.1145/2339530.2339675

Feature grouping and selection over an undirected graph

Published: 12 August 2012

ABSTRACT

High-dimensional regression/classification continues to be an important and challenging problem, especially when features are highly correlated. Feature selection, combined with additional structural information on the features, has been considered promising for improving regression/classification performance. Graph-guided fused lasso (GFlasso) has recently been proposed to facilitate feature selection and graph structure exploitation when features exhibit certain graph structures. However, the GFlasso formulation relies on pairwise sample correlations to perform feature grouping, which can introduce additional estimation bias. In this paper, we propose three new feature grouping and selection methods to resolve this issue. The first method employs a convex function to penalize the pairwise l∞ norm of connected regression/classification coefficients, achieving simultaneous feature grouping and selection. The second method improves on the first by using a non-convex function to reduce the estimation bias. The third extends the second by using a truncated l1 regularization to further reduce the estimation bias. The proposed methods combine feature grouping and feature selection to enhance estimation accuracy. We employ the alternating direction method of multipliers (ADMM) and difference of convex functions (DC) programming to solve the proposed formulations. Experimental results on synthetic data and two real datasets demonstrate the effectiveness of the proposed methods.
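The abstract describes the penalties only in words. As an illustrative, non-authoritative sketch (the exact loss, weights, and surrogate below are assumptions inferred from the abstract, not taken from the paper), a convex pairwise l∞ grouping penalty over a feature graph G = (V, E), combined with lasso-style selection, could take the form

    % Hedged sketch of a graph-guided grouping-and-selection objective.
    % The squared loss, the two tuning parameters, and the truncated
    % surrogate J_tau mentioned afterwards are assumptions, not the
    % paper's exact formulation.
    \min_{\beta \in \mathbb{R}^p} \; \frac{1}{2}\,\lVert y - X\beta \rVert_2^2
        \;+\; \lambda_1 \sum_{j=1}^{p} \lvert \beta_j \rvert
        \;+\; \lambda_2 \sum_{(i,j) \in E} \max\{\lvert \beta_i \rvert,\, \lvert \beta_j \rvert\}

where the pairwise max term pushes coefficients of connected features toward a common magnitude (grouping) and the l1 term performs selection. For the non-convex variants, one plausible choice is a truncated surrogate J_τ(t) = min(t, τ) applied to these terms; capping the penalty on large coefficients reduces estimation bias and expresses the objective as a difference of convex functions, which matches the abstract's use of DC programming together with ADMM.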


Supplemental Material

311a_t_talk_10.mp4 (MP4, 186.7 MB)


Published in

KDD '12: Proceedings of the 18th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining
August 2012, 1616 pages
ISBN: 9781450314626
DOI: 10.1145/2339530
Copyright © 2012 ACM


Publisher

Association for Computing Machinery, New York, NY, United States

Publication History

Published: 12 August 2012


      Qualifiers

      • research-article

      Acceptance Rates

Overall Acceptance Rate: 1,133 of 8,635 submissions, 13%

