Published in: Arabian Journal for Science and Engineering 11/2019

11 July 2019 | Research Article - Computer Engineering and Computer Science

A Novel Distance Metric Based on Differential Evolution

Author: Ömer Faruk Ertuğrul


Abstract

Distance has been employed as a representation of similarity for half a century. Many distance metrics have been proposed over this period, such as the Euclidean, Manhattan, Minkowski, and weighted Euclidean metrics, each with its own characteristics and formulation. In this paper, a novel distance metric with a high adaptation capability is proposed. To increase its adaptability, the parameters of the metric are optimized for the employed dataset by differential evolution (DE), a metaheuristic optimization method. The proposed metric was employed in the k-nearest neighbor classifier, and 30 benchmark datasets were used to evaluate the approach. The parameters of the novel distance metric and of DE were each assessed based on the obtained accuracies. The results validate the applicability of the proposed distance metric and of the overall approach.
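As a rough illustration of the approach described in the abstract, the sketch below tunes a parameterized distance for a k-nearest-neighbor classifier with differential evolution. The specific metric form (a weighted Minkowski-style distance with per-feature weights and an exponent), the parameter bounds, the DE settings, and the use of SciPy and scikit-learn are assumptions made for illustration only; the paper's actual metric and experimental protocol may differ.

```python
# Hedged sketch: optimize the parameters of an assumed weighted
# Minkowski-style distance with differential evolution (DE), then use
# the tuned distance inside a k-nearest-neighbor classifier.
import numpy as np
from scipy.optimize import differential_evolution
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)
n_features = X.shape[1]

def make_metric(params):
    # Build a callable distance from DE parameters:
    # per-feature weights w and a common exponent p (assumed metric form).
    w, p = params[:-1], params[-1]
    return lambda a, b: float(np.sum(w * np.abs(a - b) ** p) ** (1.0 / p))

def neg_accuracy(params):
    # DE minimizes, so return the negative validation accuracy of
    # k-NN equipped with the candidate distance.
    knn = KNeighborsClassifier(n_neighbors=3, algorithm="brute",
                               metric=make_metric(params))
    knn.fit(X_tr, y_tr)
    return -knn.score(X_val, y_val)

# One weight per feature plus the exponent p; bounds are illustrative assumptions.
bounds = [(0.01, 5.0)] * n_features + [(1.0, 5.0)]
result = differential_evolution(neg_accuracy, bounds, maxiter=15, popsize=10, seed=0)
print("tuned parameters:", result.x)
print("validation accuracy:", -result.fun)
```

In a proper evaluation, the tuned parameters would be assessed with cross-validation or a separate held-out test set rather than the same split used for tuning.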


Metadata
Title
A Novel Distance Metric Based on Differential Evolution
Author
Ömer Faruk Ertuğrul
Publication date
11 July 2019
Publisher
Springer Berlin Heidelberg
Published in
Arabian Journal for Science and Engineering / Issue 11/2019
Print ISSN: 2193-567X
Electronic ISSN: 2191-4281
DOI
https://doi.org/10.1007/s13369-019-04003-5

