Published in: Pattern Analysis and Applications 1/2023

27.07.2022 | Theoretical Advances

A multiple classifiers system with roulette-based feature subspace selection for one-vs-one scheme

Authors: Zhong-Liang Zhang, Chen-Yue Zhang, Xing-Gang Luo, Qing Zhou


Abstract

Classification is one of the most important topics in machine learning. However, most existing work focuses on two-class classification (i.e., separating 'positive' from 'negative' samples), while multi-class classification remains comparatively understudied. In this study, we develop a novel multiple classifier system (MCS) methodology with a one-vs-one (OVO) scheme for the multi-class classification task. First, the multi-class problem is decomposed into all pairwise, easier-to-solve binary sub-problems. Subsequently, an optimal MCS is generated for each sub-problem using a roulette-based feature subspace selection and validation procedure. Finally, to identify the class of a query sample, an OVO aggregation strategy derives the final class from the confidence score matrix produced by the MCS. To verify the effectiveness and robustness of the proposed approach, a thorough experimental study is performed. The findings, supported by appropriate statistical analysis, indicate the strength of the proposed method relative to state-of-the-art methods for multi-class classification problems.
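The pipeline sketched in the abstract (pairwise OVO decomposition, per-pair ensembles trained on roulette-selected feature subspaces, and aggregation of a confidence score matrix) can be illustrated with a minimal sketch. Everything here is an assumption for illustration only: the nearest-centroid base learner, the mean-difference feature scores, the ensemble size, and all function names are stand-ins, not the components used in the paper.

```python
import numpy as np

def roulette_subspace(scores, k, rng):
    # Roulette-wheel selection: draw k distinct features with probability
    # proportional to a per-feature relevance score (scoring is illustrative).
    p = scores / scores.sum()
    return rng.choice(len(scores), size=k, replace=False, p=p)

class CentroidPairClassifier:
    # Stand-in binary base learner: nearest class centroid.
    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.centroids_ = np.array([X[y == c].mean(axis=0) for c in self.classes_])
        return self

    def confidence(self, X):
        # Distance of each sample to both class centroids, turned into a
        # confidence in [0, 1] for the first (smaller-labelled) class.
        d = np.linalg.norm(X[:, None, :] - self.centroids_[None, :, :], axis=2)
        return d[:, 1] / (d[:, 0] + d[:, 1] + 1e-12)

def ovo_predict(X_train, y_train, X_test, n_members=5, k=2, seed=0):
    rng = np.random.default_rng(seed)
    classes = np.unique(y_train)
    n = len(classes)
    # R is the confidence score matrix: one column of accumulated
    # pairwise confidences per class.
    R = np.zeros((len(X_test), n))
    for i in range(n):
        for j in range(i + 1, n):
            mask = np.isin(y_train, [classes[i], classes[j]])
            Xp, yp = X_train[mask], y_train[mask]
            # Illustrative relevance score: absolute difference of
            # class-conditional feature means for this pair.
            scores = np.abs(Xp[yp == classes[i]].mean(axis=0)
                            - Xp[yp == classes[j]].mean(axis=0)) + 1e-9
            conf = np.zeros(len(X_test))
            for _ in range(n_members):
                feats = roulette_subspace(scores, k, rng)
                clf = CentroidPairClassifier().fit(Xp[:, feats], yp)
                conf += clf.confidence(X_test[:, feats])
            conf /= n_members
            R[:, i] += conf        # confidence for class i from this pair
            R[:, j] += 1.0 - conf  # complementary confidence for class j
    # Weighted-voting aggregation: pick the class with the largest column sum.
    return classes[np.argmax(R, axis=1)]
```

The paper additionally validates each candidate subspace before admitting it to the ensemble; that validation step is omitted here for brevity.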


Metadata
Title
A multiple classifiers system with roulette-based feature subspace selection for one-vs-one scheme
Authors
Zhong-Liang Zhang
Chen-Yue Zhang
Xing-Gang Luo
Qing Zhou
Publication date
27.07.2022
Publisher
Springer London
Published in
Pattern Analysis and Applications / Issue 1/2023
Print ISSN: 1433-7541
Electronic ISSN: 1433-755X
DOI
https://doi.org/10.1007/s10044-022-01089-w
