Published in: Journal of Intelligent Information Systems | Issue 2/2013

01.04.2013

UniDis: a universal discretization technique

Authors: Yu Sang, Yingwei Jin, Keqiu Li, Heng Qi

Abstract

Discretization techniques play an important role in machine learning and data mining, as many methods in these areas require that the training data set contain only discrete attributes. Data discretization unification (DDU), one of the state-of-the-art discretization techniques, trades off classification errors against the number of discretized intervals and unifies existing discretization criteria. However, it suffers from two deficiencies. First, DDU is inefficient: it must search over a large number of parameter settings to find good results, and even then an optimal solution is not guaranteed. Second, DDU does not take into account the number of inconsistent records produced by discretization, which leads to unnecessary information loss. To overcome these deficiencies, this paper presents a Universal Discretization technique, UniDis. We first develop a non-parametric normalized discretization criterion that avoids the effect that a relatively large difference between classification errors and the number of discretized intervals would otherwise have on discretization results. In addition, we define a new entropy-based measure of inconsistency for multi-dimensional variables that effectively controls information loss while producing a concise summarization of continuous variables. Finally, we propose a heuristic algorithm that yields better discretization based on the non-parametric normalized criterion and the entropy-based inconsistency measure. Besides theoretical analysis, experimental results with the J4.8 decision tree and the Naive Bayes classifier demonstrate that, under a popular statistical test, our approach is statistically comparable to DDU, and that it yields a discretization scheme which significantly improves classification accuracy over previously known discretization methods other than DDU.
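
The abstract describes, but does not formally define, the entropy-based inconsistency measure for multi-dimensional variables. The Python sketch below illustrates one plausible way such a measure can be computed, under the assumption that inconsistency is quantified as the class entropy within each group of identical discretized records, weighted by group frequency. The function names and the exact formula are illustrative assumptions made here, not the definition given in the paper.

# Illustrative sketch only: a simple entropy-based inconsistency measure for a
# discretized data set. The weighting scheme below is an assumption for
# illustration, not the measure defined by UniDis.
from collections import Counter, defaultdict
from math import log2

def entropy(counts):
    # Shannon entropy (in bits) of a list of class counts.
    total = sum(counts)
    return -sum((c / total) * log2(c / total) for c in counts if c > 0)

def inconsistency(discretized_rows, labels):
    # Group records that share the same discretized feature vector, then sum
    # the class entropy of each group, weighted by the group's relative size.
    # A group whose records all share one class contributes 0; mixed groups
    # contribute according to how evenly their class labels are split.
    groups = defaultdict(Counter)
    for row, label in zip(discretized_rows, labels):
        groups[tuple(row)][label] += 1
    n = len(labels)
    return sum(sum(cnt.values()) / n * entropy(list(cnt.values()))
               for cnt in groups.values())

# Toy usage: the first two records fall into the same discretized cell but
# disagree on the class label, so the measure is positive.
rows = [(0, 1), (0, 1), (1, 2)]
labels = ["a", "b", "a"]
print(round(inconsistency(rows, labels), 3))  # 0.667

A measure of this kind makes the information loss caused by merging intervals explicit: the more records that become indistinguishable yet carry different class labels, the larger the penalty.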


Metadata
Title
UniDis: a universal discretization technique
Authors
Yu Sang
Yingwei Jin
Keqiu Li
Heng Qi
Publication date
01.04.2013
Publisher
Springer US
Published in
Journal of Intelligent Information Systems / Issue 2/2013
Print ISSN: 0925-9902
Electronic ISSN: 1573-7675
DOI
https://doi.org/10.1007/s10844-012-0228-1
