Published in: Neural Processing Letters 3/2022

29.01.2022

Supervised Shallow Multi-task Learning: Analysis of Methods

Authors: Stanley Ebhohimhen Abhadiomhen, Royransom Chimela Nzeh, Ernest Domanaanmwi Ganaa, Honour Chika Nwagwu, George Emeka Okereke, Sidheswar Routray


Abstract

The last decade has witnessed a continuous boom in the application of machine learning techniques to pattern recognition, with most of the focus on single-task learning models. However, the increasing amount of multimedia data in the real world suggests that these single-task learning models have become unsuitable for complex problems. Hence, multi-task learning (MTL), which leverages the commonalities shared between related tasks to improve a specific model's performance, has grown popular in recent years, and several studies have sought a robust MTL method in either the supervised or the unsupervised learning paradigm, using a shallow or a deep approach. This paper provides an analysis of supervised shallow multi-task learning methods. To begin, we present a rationale for MTL with a basic example that is easy to understand. Next, we formulate a supervised MTL problem to describe the various methods utilized to learn task relationships. We also present an overview of deep learning methods for supervised MTL to compare shallow with non-shallow approaches. Finally, we highlight the challenges and future research opportunities of supervised MTL.
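To make the shared-structure idea concrete, below is a minimal sketch of one classical shallow supervised MTL formulation: joint feature learning with an l2,1-norm penalty, which couples T related regression tasks by encouraging entire feature rows of the weight matrix to be zero across all tasks. This is an illustrative implementation via proximal gradient descent on synthetic data, not the paper's own formulation; all names (`l21_prox`, `mtl_feature_learning`) and the hyperparameters are assumptions chosen for the example.

```python
import numpy as np

def l21_prox(W, t):
    # Row-wise group soft-thresholding: the proximal operator of t * ||W||_{2,1}.
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    scale = np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12))
    return W * scale

def mtl_feature_learning(Xs, ys, lam=0.05, lr=0.1, iters=1000):
    """l2,1-regularized joint least squares over T related tasks.

    W has shape (d, T): column t holds task t's weights, and the
    l2,1 penalty ties the tasks together through shared row sparsity.
    """
    d, T = Xs[0].shape[1], len(Xs)
    W = np.zeros((d, T))
    for _ in range(iters):
        G = np.zeros_like(W)
        for t in range(T):
            residual = Xs[t] @ W[:, t] - ys[t]
            G[:, t] = Xs[t].T @ residual / len(ys[t])
        W = l21_prox(W - lr * G, lr * lam)  # gradient step, then prox step
    return W

# Two synthetic tasks that share the same 3 relevant features out of 10.
rng = np.random.default_rng(0)
d, n = 10, 200
w_true = np.zeros((d, 2))
w_true[:3, 0] = [1.0, -2.0, 0.5]
w_true[:3, 1] = [1.2, -1.8, 0.4]
Xs = [rng.standard_normal((n, d)) for _ in range(2)]
ys = [Xs[t] @ w_true[:, t] + 0.05 * rng.standard_normal(n) for t in range(2)]
W = mtl_feature_learning(Xs, ys)
# Rows 3..9 of W should be driven to (near) zero for both tasks at once,
# which is exactly the shared structure a single-task lasso cannot enforce.
```

Learning the two tasks jointly recovers the common support more reliably than fitting them separately, which is the basic argument the survey develops for shallow MTL.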


Metadata
Title
Supervised Shallow Multi-task Learning: Analysis of Methods
Authors
Stanley Ebhohimhen Abhadiomhen
Royransom Chimela Nzeh
Ernest Domanaanmwi Ganaa
Honour Chika Nwagwu
George Emeka Okereke
Sidheswar Routray
Publication date
29.01.2022
Publisher
Springer US
Published in
Neural Processing Letters / Issue 3/2022
Print ISSN: 1370-4621
Electronic ISSN: 1573-773X
DOI
https://doi.org/10.1007/s11063-021-10703-7
