Published in: Pattern Analysis and Applications 2/2017

19.07.2015 | Theoretical Advances

Locally adaptive k parameter selection for nearest neighbor classifier: one nearest cluster

Authors: Faruk Bulut, Mehmet Fatih Amasyali



Abstract

The k nearest neighbors (k-NN) classification technique is widely known for its simplicity, effectiveness, and robustness. As a lazy learner, k-NN is a versatile algorithm used in many fields. In this classifier, the parameter k is generally chosen by the user, and the optimal value is found by experiment. The chosen constant k is then used throughout the entire classification phase. Using the same k for every test sample can reduce overall prediction performance, since the optimal k should vary from one test sample to another to obtain more accurate predictions. In this study, a method that selects a dynamic k value for each instance is proposed. This improved classification method employs a simple clustering procedure. In the experiments, more accurate results are obtained, and the reasons for this success are analyzed and presented.
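To make the idea concrete, the following is a minimal Python sketch of what a locally adaptive k-NN of this kind could look like. It assumes, for illustration only, that the "one nearest cluster" procedure amounts to clustering the training set (here with k-means), locating the single cluster nearest to each test sample, and deriving that sample's k from the size of that cluster; the specific choices (n_clusters, the cap on k) are assumptions, not the authors' published implementation.

```python
# Minimal sketch of a locally adaptive k-NN ("one nearest cluster" style).
# Assumption: the training set is clustered once, and each test sample's k is
# derived from its single nearest cluster. This is an illustrative reading of
# the abstract, not the authors' exact algorithm.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Step 1: a simple clustering procedure over the training data.
# n_clusters=10 is an arbitrary illustrative choice.
kmeans = KMeans(n_clusters=10, n_init=10, random_state=0).fit(X_train)
cluster_labels = kmeans.labels_

y_pred = []
for x in X_test:
    # Step 2: find the one nearest cluster for this test sample.
    c = kmeans.predict(x.reshape(1, -1))[0]
    cluster_size = int(np.sum(cluster_labels == c))
    # Step 3: choose a per-sample k from the nearest cluster
    # (here: its size, capped at 15 for robustness -- an assumption).
    k = max(1, min(cluster_size, 15))
    knn = KNeighborsClassifier(n_neighbors=k).fit(X_train, y_train)
    y_pred.append(knn.predict(x.reshape(1, -1))[0])

print("locally adaptive k-NN accuracy:",
      round(float(np.mean(np.array(y_pred) == y_test)), 3))
```

Refitting a classifier per test sample is intentionally naive; in practice one would build a single neighbor index and query it with a per-sample neighbor count.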


Metadata
Title
Locally adaptive k parameter selection for nearest neighbor classifier: one nearest cluster
Authors
Faruk Bulut
Mehmet Fatih Amasyali
Publication date
19.07.2015
Publisher
Springer London
Published in
Pattern Analysis and Applications / Issue 2/2017
Print ISSN: 1433-7541
Electronic ISSN: 1433-755X
DOI
https://doi.org/10.1007/s10044-015-0504-0
