
22-04-2022 | Original Article

A heuristically self-organised Linguistic Attribute Deep Learning for edge intelligence

Authors: Hongmei He, Zhenhuan Zhu

Published in: International Journal of Machine Learning and Cybernetics | Issue 9/2022

Abstract

With the development of the Internet of Things (IoT), IoT intelligence is emerging as a new technology. The "curse of dimensionality" in data fusion on edge devices is a barrier to the success of IoT intelligence. Deep Learning has attracted much attention recently due to its successful application in fields such as image processing and natural language processing, but its success relies heavily on Graphics Processing Unit (GPU) computing. As with conventional Deep Learning, the computational complexity of optimising Linguistic Attribute Hierarchies (LAHs) blocks their application. Unlike conventional Deep Learning, however, an LAH embedded with Linguistic Decision Trees (LDTs) can overcome the lack of interpretability by providing transparent information propagation through the rules created by the LDTs in the LAH. To address the challenge of constructing high-performance LAHs, we propose a heuristic approach that builds an LAH embedded with LDTs for decision making or classification, using the distance correlations between attributes and between each attribute and the target variable. The set of attributes is divided into attribute clusters, which are then heuristically organised into an LAH. The proposed approach has been validated on benchmark decision-making and classification problems from the UCI machine learning repository. The experimental results show that the proposed self-organisation algorithm can create an effective and efficient LAH. Such a self-organised LAH (SOLAH) embedded with LDTs can not only efficiently address the "curse of dimensionality" that a single LDT faces in data fusion with massive numbers of attributes, but also achieve better or comparable performance in decision making or classification compared to a single LDT for the problem to be solved. The main contributions of this work are: (1) an LAH provides a new attribute Deep Learning approach with good transparency, extending conventional Deep Learning, which usually denotes a black-box deep neural network; (2) the self-organisation algorithm for the SOLAH is much more efficient than the genetic algorithm in a wrapper (GAW) for optimising LAHs, and the SOLAH achieves results comparable to those of a GAW-optimised LAH. This efficiency is crucial for edge intelligence and makes it feasible to embed the SOLAH and the self-organisation algorithm in edge devices for IoT applications. Hence, this research takes a step towards edge intelligence.
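
To make the heuristic concrete, the following is a minimal sketch (in Python, not the authors' implementation) of the self-organisation idea described in the abstract: compute distance correlations between attributes and between each attribute and the target, group the attributes into small clusters that could each feed a linguistic decision tree, and place the most target-relevant clusters closest to the root of the hierarchy. The function names, the greedy clustering scheme and the cluster size are illustrative assumptions.

```python
# Sketch of self-organising attributes into a linguistic attribute hierarchy (LAH)
# using distance correlation. Cluster size, the greedy grouping and the ordering
# heuristic are assumptions for illustration only.
import numpy as np


def distance_correlation(x: np.ndarray, y: np.ndarray) -> float:
    """Szekely-Rizzo distance correlation between two 1-D samples."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    a = np.abs(x[:, None] - x[None, :])                  # pairwise distances of x
    b = np.abs(y[:, None] - y[None, :])                  # pairwise distances of y
    A = a - a.mean(0) - a.mean(1)[:, None] + a.mean()    # double-centred distances
    B = b - b.mean(0) - b.mean(1)[:, None] + b.mean()
    dcov2 = max(float((A * B).mean()), 0.0)              # sample distance covariance^2
    denom = np.sqrt((A * A).mean() * (B * B).mean())
    return float(np.sqrt(dcov2 / denom)) if denom > 0 else 0.0


def self_organise_lah(X: np.ndarray, y: np.ndarray, cluster_size: int = 3):
    """Group attributes into small clusters (candidate LDT inputs) and order the
    clusters by their strongest relevance to the target variable."""
    n_attr = X.shape[1]
    relevance = np.array([distance_correlation(X[:, i], y) for i in range(n_attr)])

    # Greedy clustering: seed each cluster with the most target-relevant unused
    # attribute, then attach the attributes most strongly correlated with the seed.
    unused, clusters = set(range(n_attr)), []
    while unused:
        seed = max(unused, key=lambda i: relevance[i])
        unused.remove(seed)
        partners = sorted(
            unused,
            key=lambda j: distance_correlation(X[:, seed], X[:, j]),
            reverse=True,
        )[: cluster_size - 1]
        unused -= set(partners)
        clusters.append([seed, *partners])

    # Most target-relevant cluster sits closest to the root of the hierarchy.
    clusters.sort(key=lambda c: relevance[c].max(), reverse=True)
    return clusters, relevance


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 6))                        # synthetic attributes
    y = (X[:, 0] + 0.5 * X[:, 3] > 0).astype(float)      # synthetic target
    clusters, rel = self_organise_lah(X, y)
    print("attribute clusters (root-first):", clusters)
    print("attribute-target distance correlations:", np.round(rel, 3))
```

Each resulting cluster would be the input set of one LDT, and the cluster ordering determines where each LDT sits in the cascade; the paper's actual organisation rules may differ from this greedy scheme.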

Metadata
Title
A heuristically self-organised Linguistic Attribute Deep Learning for edge intelligence
Authors
Hongmei He
Zhenhuan Zhu
Publication date
22-04-2022
Publisher
Springer Berlin Heidelberg
Published in
International Journal of Machine Learning and Cybernetics / Issue 9/2022
Print ISSN: 1868-8071
Electronic ISSN: 1868-808X
DOI
https://doi.org/10.1007/s13042-022-01544-4
