2021 | OriginalPaper | Chapter
Published in:
Recent Innovations in Computing
With the expansion of the Internet, social media platforms enable users to share their views on various products, people, and topics. These platforms are no longer limited to text: users can also post images and videos to express their opinions and other social activity. Multimodal sentiment analysis extends sentiment analysis to mine these heterogeneous types of unstructured data together. This paper reviews sentiment analysis and the studies contributing to the field of multimodal sentiment analysis, and discusses important research challenges in the sentiment analysis of social media data.
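To make the idea of mining heterogeneous modalities together concrete, the toy sketch below combines a text sentiment score with an image sentiment score through weighted decision-level (late) fusion. The lexicon, the stub image scorer, the weights, and the thresholds are illustrative assumptions only, not the method of this chapter or of any work it covers.

```python
# Hypothetical illustration: decision-level (late) fusion of per-modality
# sentiment scores. All scorers, weights, and labels are placeholders.

POSITIVE_WORDS = {"good", "great", "love", "excellent", "happy"}
NEGATIVE_WORDS = {"bad", "terrible", "hate", "awful", "sad"}


def text_sentiment(text: str) -> float:
    """Crude lexicon score in [-1, 1] based on word counts."""
    words = text.lower().split()
    pos = sum(w in POSITIVE_WORDS for w in words)
    neg = sum(w in NEGATIVE_WORDS for w in words)
    total = pos + neg
    return 0.0 if total == 0 else (pos - neg) / total


def image_sentiment(image_score: float) -> float:
    """Stand-in for a visual classifier; assumes a score already in [-1, 1]."""
    return max(-1.0, min(1.0, image_score))


def fuse(text_score: float, image_score: float,
         w_text: float = 0.6, w_image: float = 0.4) -> str:
    """Weighted decision-level fusion of the two modality scores."""
    combined = w_text * text_score + w_image * image_score
    if combined > 0.1:
        return "positive"
    if combined < -0.1:
        return "negative"
    return "neutral"


if __name__ == "__main__":
    post_text = "I love this phone, great camera"
    visual_score = 0.3  # pretend output of an image sentiment model
    print(fuse(text_sentiment(post_text), image_sentiment(visual_score)))  # positive
```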
- Title
- Multimodal Sentiment Analysis of Social Media Data: A Review
- DOI
- https://doi.org/10.1007/978-981-15-8297-4_44
- Authors
- Priyavrat, Nonita Sharma, Geeta Sikka
- Publisher
- Springer Singapore
- Sequence number
- 44