Published in: Neural Computing and Applications 11/2020

31.10.2018 | Multi-Source Data Understanding (MSDU)

Cross-domain aspect/sentiment-aware abstractive review summarization by combining topic modeling and deep reinforcement learning

Authors: Min Yang, Qiang Qu, Ying Shen, Kai Lei, Jia Zhu


Abstract

Review text has been widely studied in traditional tasks such as sentiment analysis and aspect extraction. To date, however, no work has addressed end-to-end abstractive review summarization, which is essential for business organizations and individual consumers to make informed decisions. This study is the first to investigate aspect/sentiment-aware abstractive review summarization in a domain adaptation scenario. Our novel model, Abstractive review Summarization with Topic modeling and Reinforcement deep learning (ASTR), leverages the benefits of supervised deep neural networks, reinforcement learning, and an unsupervised probabilistic generative model to strengthen aspect/sentiment-aware review representation learning. ASTR is a multi-task learning system that simultaneously optimizes two coupled objectives: domain classification (the auxiliary task) and abstractive review summarization (the primary task), with a document modeling module shared across tasks. The main purpose of this multi-task design is to strengthen document representation learning and safeguard the performance of cross-domain abstractive review summarization. Specifically, ASTR consists of two key components: (1) a domain classifier, which works on datasets of both the source and target domains to recognize the domain information of texts and transfer knowledge from the source domain to the target domain. In particular, we propose a weakly supervised LDA model to learn domain-specific aspect and sentiment lexicon representations, which are then fed into the neural hidden states of given reviews to form aspect/sentiment-aware review representations; and (2) an abstractive review summarizer, which shares the document modeling module with the domain classifier. The learned aspect/sentiment-aware review representations are fed into a pointer-generator network, trained with a reinforcement learning algorithm, to generate aspect/sentiment-aware abstractive summaries of the given reviews.
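The core of the summarizer is a pointer-generator network, whose defining step is blending a fixed-vocabulary generation distribution with a copy distribution over the source review. The sketch below illustrates only that mixing step in plain Python; the names (`p_gen`, `attention`) and the toy numbers are illustrative assumptions, not the paper's architecture.

```python
def pointer_generator_dist(p_gen, vocab_dist, attention, source_tokens):
    """Blend vocabulary generation and copying into one output distribution.

    p_gen         -- scalar in [0, 1]: probability of generating vs. copying
    vocab_dist    -- dict token -> probability over the fixed vocabulary
    attention     -- attention weights, one per source position (sum to 1)
    source_tokens -- source review tokens aligned with `attention`
    """
    # Scale the generation distribution by p_gen.
    final = {tok: p_gen * p for tok, p in vocab_dist.items()}
    # Add copy probability mass from the attention over the source review.
    for weight, tok in zip(attention, source_tokens):
        # Copying lets the model emit out-of-vocabulary words, e.g. product names.
        final[tok] = final.get(tok, 0.0) + (1.0 - p_gen) * weight
    return final

dist = pointer_generator_dist(
    p_gen=0.8,
    vocab_dist={"great": 0.6, "battery": 0.4},
    attention=[0.7, 0.3],
    source_tokens=["kindle", "battery"],
)
# "kindle" is out-of-vocabulary yet receives copy mass (1 - 0.8) * 0.7 = 0.14
```

Because both input distributions sum to one, the blended output remains a valid probability distribution, which is what allows the summarizer to be trained end to end.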
We conduct extensive experiments on real-life Amazon reviews to evaluate the effectiveness of our model. Quantitatively, ASTR outperforms state-of-the-art summarization methods in terms of ROUGE score and human evaluation in both out-of-domain and in-domain setups. Qualitatively, our model generates better sentiment-aware summaries for reviews across different categories and aspects.
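The quantitative comparison above is reported in ROUGE score. For intuition, here is a minimal sketch of ROUGE-1 F1 (clipped unigram overlap between candidate and reference); published evaluations use the full ROUGE toolkit with stemming and multiple variants, so this is an illustrative simplification only.

```python
from collections import Counter

def rouge_1_f1(candidate, reference):
    """ROUGE-1 F1: harmonic mean of unigram precision and recall."""
    cand, ref = Counter(candidate.split()), Counter(reference.split())
    overlap = sum((cand & ref).values())  # clipped unigram matches
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

score = rouge_1_f1("great battery life", "the battery life is great")
# 3 overlapping unigrams: precision 3/3, recall 3/5, F1 = 0.75
```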


Footnotes
1
We select their RL + ML model, which obtains the second-highest ROUGE score but produces the most readable summaries.
 
Metadata
Publisher
Springer London
Print ISSN: 0941-0643
Electronic ISSN: 1433-3058
DOI
https://doi.org/10.1007/s00521-018-3825-2
