Published in: Arabian Journal for Science and Engineering 4/2020

21.11.2019 | Research Article - Special Issue - Intelligent Computing and Interdisciplinary Applications

An Integrated Word Embedding-Based Dual-Task Learning Method for Sentiment Analysis

Authors: Yanping Fu, Yun Liu, Sheng-Lung Peng

Abstract

Sentiment analysis aims to automate the task of determining the sentiment tendency of a textual review, classifying it as positive, negative, or neutral. In general, the basic feature-extraction technique for sentiment analysis is word embedding, which captures only contextual or global semantic information and ignores the sentiment polarity of the text. Word embedding therefore produces biased results, especially for words that share the same semantic context but carry opposite sentiments. In this paper, we propose an integrated sentiment embedding method that combines context and sentiment information through a dual-task learning algorithm. First, we propose three sentiment language models that encode the sentiment information of texts into word embeddings, built on three existing semantic models: continuous bag-of-words, prediction, and log-bilinear. Next, based on the semantic language models and the proposed sentiment language models, we present a dual-task learning algorithm that generates a hybrid word embedding, named integrated sentiment embedding, in which a joint learning method and a parallel learning method are applied to process the two tasks together. Experiments on sentence-level and document-level sentiment classification tasks demonstrate that the proposed integrated sentiment embedding achieves better classification performance than basic word embedding methods.
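The joint-learning idea described in the abstract can be sketched as follows: a shared embedding matrix is updated by two tasks at once, a CBOW-style context objective and a sentiment-polarity objective, with a weight balancing the two losses. This is a minimal illustrative sketch, not the authors' implementation; all names, shapes, and the weighting parameter `alpha` are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
V, D = 10, 4                        # toy vocabulary size, embedding dimension
E = rng.normal(0, 0.1, (V, D))      # shared embedding matrix (trained by both tasks)
W_ctx = rng.normal(0, 0.1, (D, V))  # context-task output weights
w_sent = rng.normal(0, 0.1, D)      # sentiment-task weights (binary logistic)

def softmax(z):
    z = z - z.max()                 # numerically stable softmax
    e = np.exp(z)
    return e / e.sum()

def joint_step(context_ids, center_id, sentiment, alpha=0.5, lr=0.1):
    """One SGD step on the combined loss alpha*L_context + (1-alpha)*L_sentiment."""
    global E, W_ctx, w_sent
    h = E[context_ids].mean(axis=0)          # averaged context vector (CBOW-style)

    # Context task: predict the center word from the averaged context.
    p = softmax(h @ W_ctx)
    loss_ctx = -np.log(p[center_id])
    dz = p.copy(); dz[center_id] -= 1.0      # softmax cross-entropy gradient
    dh_ctx = W_ctx @ dz

    # Sentiment task: predict polarity (1 = positive, 0 = negative) from h.
    q = 1.0 / (1.0 + np.exp(-h @ w_sent))
    loss_sent = -(sentiment * np.log(q) + (1 - sentiment) * np.log(1 - q))
    dh_sent = (q - sentiment) * w_sent

    # The weighted gradient of both tasks flows into the shared embeddings.
    dh = alpha * dh_ctx + (1 - alpha) * dh_sent
    W_ctx -= lr * alpha * np.outer(h, dz)
    w_sent -= lr * (1 - alpha) * (q - sentiment) * h
    E[context_ids] -= lr * dh / len(context_ids)
    return alpha * loss_ctx + (1 - alpha) * loss_sent

# Repeatedly train on one toy (context, center word, sentiment) example.
losses = [joint_step([1, 2, 4], center_id=3, sentiment=1) for _ in range(50)]
print(losses[0] > losses[-1])   # prints True: the joint loss decreases
```

Because the context and sentiment gradients both reach the shared matrix `E`, two words with similar contexts but opposite polarities are pushed apart, which is the motivation stated in the abstract.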


Metadata
Title
An Integrated Word Embedding-Based Dual-Task Learning Method for Sentiment Analysis
Authors
Yanping Fu
Yun Liu
Sheng-Lung Peng
Publication date
21.11.2019
Publisher
Springer Berlin Heidelberg
Published in
Arabian Journal for Science and Engineering / Issue 4/2020
Print ISSN: 2193-567X
Electronic ISSN: 2191-4281
DOI
https://doi.org/10.1007/s13369-019-04241-7
