Published in: Neural Computing and Applications 23/2020

23.04.2020 | Original Article

Electronic word-of-mouth effects on studio performance leveraging attention-based model

Authors: Yang Liu, Hao Fei, Qingguo Zeng, Bobo Li, Lili Ma, Donghong Ji, Joaquín Ordieres Meré

Abstract

While existing studies have established the relationship between electronic word-of-mouth (eWOM) and studio performance, limited research has demonstrated how attention-based models apply to the motion picture industry. In this study, using a review corpus covering seven Hollywood studios, we show that deep learning with an attention mechanism achieves the best accuracy in predicting both eWOM and stock price movement. We present a hierarchical two-layer attention network and a hierarchical convoluted attention network (HCAN), which quantify the importance of crucial eWOM features and capture valuable information from audience members' reviews. Further, a comparison across the two case studies shows that the HCAN model outperforms both machine learning baselines and the other attention-based models. Our work highlights the business value of attention-based models and has implications for studios' business decisions.
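As a rough illustration of the kind of architecture the abstract refers to, the sketch below implements a hierarchical two-level (word- and sentence-level) attention network for review classification in PyTorch. All layer sizes, module names, and the binary output label are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch of a hierarchical (word- and sentence-level) attention network
# for classifying audience reviews. Hyperparameters and names are assumptions.
import torch
import torch.nn as nn


class AttentionPool(nn.Module):
    """Additive attention that pools a sequence of hidden states into one vector."""

    def __init__(self, hidden_dim: int):
        super().__init__()
        self.proj = nn.Linear(hidden_dim, hidden_dim)
        self.context = nn.Linear(hidden_dim, 1, bias=False)

    def forward(self, h):                                   # h: (batch, seq_len, hidden_dim)
        scores = self.context(torch.tanh(self.proj(h)))     # (batch, seq_len, 1)
        weights = torch.softmax(scores, dim=1)               # attention weights over the sequence
        return (weights * h).sum(dim=1)                      # (batch, hidden_dim)


class HierarchicalAttentionNet(nn.Module):
    def __init__(self, vocab_size=20000, embed_dim=100, hidden_dim=64, num_classes=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        self.word_gru = nn.GRU(embed_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.word_attn = AttentionPool(2 * hidden_dim)
        self.sent_gru = nn.GRU(2 * hidden_dim, hidden_dim, batch_first=True, bidirectional=True)
        self.sent_attn = AttentionPool(2 * hidden_dim)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, docs):                                 # docs: (batch, n_sents, n_words) word ids
        b, n_sents, n_words = docs.shape
        words = self.embed(docs.view(b * n_sents, n_words))  # embed every sentence independently
        word_h, _ = self.word_gru(words)
        sent_vecs = self.word_attn(word_h).view(b, n_sents, -1)  # word-level attention -> sentence vectors
        sent_h, _ = self.sent_gru(sent_vecs)
        doc_vec = self.sent_attn(sent_h)                      # sentence-level attention -> review vector
        return self.classifier(doc_vec)                       # e.g. eWOM polarity logits


if __name__ == "__main__":
    model = HierarchicalAttentionNet()
    fake_reviews = torch.randint(1, 20000, (4, 6, 20))        # 4 reviews, 6 sentences, 20 words each
    print(model(fake_reviews).shape)                          # torch.Size([4, 2])
```

The two attention layers expose per-word and per-sentence weights, which is what allows this family of models to quantify which eWOM features contribute most to a prediction; the paper's HCAN variant additionally combines convolutional feature extraction with such attention pooling.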


Metadata
Title
Electronic word-of-mouth effects on studio performance leveraging attention-based model
Authors
Yang Liu
Hao Fei
Qingguo Zeng
Bobo Li
Lili Ma
Donghong Ji
Joaquín Ordieres Meré
Publication date
23.04.2020
Publisher
Springer London
Published in
Neural Computing and Applications / Issue 23/2020
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-020-04937-0
