
2016 | OriginalPaper | Book Chapter

Hidden Space Neighbourhood Component Analysis for Cancer Classification

Authors: Li Zhang, Xiaojuan Huang, Bangjun Wang, Fanzhang Li, Zhao Zhang

Published in: Neural Information Processing

Publisher: Springer International Publishing


Abstract

Neighbourhood component analysis (NCA) is a method for learning a distance metric that maximizes the classification performance of the K nearest neighbour (KNN) classifier. However, NCA suffers from the small sample size problem, in which the number of samples is much smaller than the number of features. To remedy this, this paper proposes hidden space neighbourhood component analysis (HSNCA), a nonlinear extension of NCA. HSNCA first maps the data from the original space into a feature space using a set of nonlinear mapping functions, and then performs NCA in that feature space. Notably, the number of features in the feature space equals the number of samples, so HSNCA avoids the small sample size problem. Experimental results on DNA array datasets show that HSNCA is feasible and efficient.
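The pipeline described in the abstract can be illustrated with a minimal sketch: map each sample into a hidden space via an empirical nonlinear mapping over the training set, then run standard NCA and KNN in that space. The sketch below assumes an RBF kernel as the hidden-space mapping and uses scikit-learn's NCA implementation; the toy data, the `gamma` value, and the split sizes are illustrative stand-ins, not values from the paper.

```python
# HSNCA-style pipeline sketch (assumption: RBF kernel as the hidden-space mapping).
import numpy as np
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.neighbors import NeighborhoodComponentsAnalysis, KNeighborsClassifier

rng = np.random.default_rng(0)
# Toy stand-in for a DNA microarray dataset: few samples, many features.
X_train, y_train = rng.standard_normal((60, 2000)), rng.integers(0, 2, 60)
X_test,  y_test  = rng.standard_normal((20, 2000)), rng.integers(0, 2, 20)

# Step 1: map samples into the hidden (feature) space. With an empirical kernel
# map over the training set, each sample gets as many features as there are
# training samples, which sidesteps the small sample size problem.
gamma = 1.0 / X_train.shape[1]
H_train = rbf_kernel(X_train, X_train, gamma=gamma)  # shape (n_train, n_train)
H_test = rbf_kernel(X_test, X_train, gamma=gamma)    # shape (n_test, n_train)

# Step 2: learn the metric with standard (linear) NCA in the hidden space.
nca = NeighborhoodComponentsAnalysis(random_state=0)
nca.fit(H_train, y_train)

# Step 3: classify with KNN under the learned metric.
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(nca.transform(H_train), y_train)
print("test accuracy:", knn.score(nca.transform(H_test), y_test))
```

On real microarray data the kernel width and the number of neighbours would need tuning; the point of the sketch is only that the hidden-space mapping makes the NCA step operate on an n-by-n representation.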


Metadata
Title
Hidden Space Neighbourhood Component Analysis for Cancer Classification
Authors
Li Zhang
Xiaojuan Huang
Bangjun Wang
Fanzhang Li
Zhao Zhang
Copyright year
2016
DOI
https://doi.org/10.1007/978-3-319-46681-1_6
