
03-01-2021 | Original Article

A new formation of supervised dimensionality reduction method for moving vehicle classification

Authors: K. Silpaja Chandrasekar, P. Geetha

Published in: Neural Computing and Applications | Issue 13/2021

Abstract

Analyzing a large set of features for classification entails cost and complexity. To reduce this burden, dimensionality reduction is applied to the extracted features as a preprocessing step. Many existing dimensionality reduction algorithms, however, fail to handle high-dimensional data, increase information loss, and are sensitive to outliers. Therefore, this research proposes a new supervised dimensionality reduction method built on an improved formulation of linear discriminant analysis with diagonal eigenvalues (LDA-DE) that preserves information while addressing these issues in the classification process. The proposed framework reduces the dimension of the extracted feature set by computing scatter matrices from the class labels and a diagonal eigenvalue matrix. The newly developed LDA-DE method also includes steps to eliminate duplicate rows and columns, to avoid feature overwriting, and to remove outliers. The LDA-DE method, implemented with a fuzzy random forest classifier, is tested on two datasets, MIO-TCD and BIT-Vehicle, to classify moving vehicles, and its performance is compared with five state-of-the-art dimensionality reduction methods. The experimental confusion matrix results show that LDA-DE reduces the feature vectors of the objects to the greatest extent. Furthermore, LDA-DE achieves the best reduction results, with the strongest performance parameter values (lowest mean and standard deviation, highest f-measure and accuracy) and the shortest data processing time among the compared methods, which makes it promising for fast and effective dimensionality reduction in moving vehicle classification.
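As a rough illustration of the workflow described above (supervised reduction of extracted features followed by classification), the sketch below uses standard scikit-learn components as stand-ins rather than the authors' implementation: LinearDiscriminantAnalysis in place of the paper's LDA-DE formulation and RandomForestClassifier in place of the fuzzy random forest. The synthetic feature matrix is only a placeholder for MIO-TCD or BIT-Vehicle feature vectors, and the LDA-DE specifics (diagonal eigenvalue matrix, duplicate row/column elimination, outlier removal) are not reproduced here.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

# Placeholder features: 500 samples, 64-dimensional, 5 vehicle classes.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 64))
y = rng.integers(0, 5, size=500)

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

# Supervised reduction: LDA projects onto at most (n_classes - 1) discriminant
# axes derived from the between-class and within-class scatter matrices.
lda = LinearDiscriminantAnalysis(n_components=4)
X_train_red = lda.fit_transform(X_train, y_train)
X_test_red = lda.transform(X_test)

# Classify the reduced feature vectors.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train_red, y_train)
pred = clf.predict(X_test_red)

print("accuracy:", accuracy_score(y_test, pred))
print("f-measure (macro):", f1_score(y_test, pred, average="macro"))

The design point this sketch shares with classical LDA is that class labels drive the projection, so at most one fewer discriminant direction than the number of classes is retained before classification.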

Metadata
Title
A new formation of supervised dimensionality reduction method for moving vehicle classification
Authors
K. Silpaja Chandrasekar
P. Geetha
Publication date
03-01-2021
Publisher
Springer London
Published in
Neural Computing and Applications / Issue 13/2021
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-020-05524-z
