
2024 | Original Paper | Book Chapter

Conjugate Gradient Method for finding Optimal Parameters in Linear Regression

Authors: Vishal Menon, V. Ashwin, G. Gopakumar

Published in: Big Data, Machine Learning, and Applications

Publisher: Springer Nature Singapore


Abstract

Linear regression is one of the most widely used approaches for modeling the relationship between independent and dependent variables in a prediction problem, with applications in domains such as weather data analysis, price estimation, and bioinformatics. Various computational approaches have been devised for finding the best model parameters. In this work, we explore and establish the applicability of the conjugate gradient method for finding the optimal parameters of a regression model, demonstrated on the house price prediction problem using the Boston housing dataset. The efficiency of the conjugate gradient method over the pseudo-inverse and gradient descent methods in terms of computational requirements is discussed. We show that the weights obtained by the conjugate gradient method are accurate and that the parameter vector converges to the optimal value in fewer iterations than with gradient descent. Hence, the conjugate gradient method proves to be a faster approach to linear regression in the ordinary least squares setting.
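
As a rough illustration of the approach described in the abstract (a minimal sketch, not the authors' code), the example below solves the ordinary least squares problem by applying the conjugate gradient method to the normal equations X^T X w = X^T y and checks the result against the pseudo-inverse solution. The Boston housing data are replaced by a small synthetic regression problem so the example stays self-contained; the function name conjugate_gradient, the tolerance, and all data values are illustrative assumptions.

# Sketch: conjugate gradient for ordinary least squares, compared with the
# pseudo-inverse baseline. Synthetic data stands in for the Boston dataset.
import numpy as np

def conjugate_gradient(A, b, tol=1e-10, max_iter=None):
    """Solve A w = b for symmetric positive-definite A via conjugate gradients."""
    n = b.shape[0]
    max_iter = max_iter or n
    w = np.zeros(n)
    r = b - A @ w            # initial residual
    p = r.copy()             # initial search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)        # step length along p
        w += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:        # residual small enough: converged
            break
        p = r + (rs_new / rs_old) * p    # next A-conjugate direction
        rs_old = rs_new
    return w

# Synthetic regression problem (design matrix with a bias column)
rng = np.random.default_rng(0)
X = np.hstack([np.ones((200, 1)), rng.normal(size=(200, 5))])
true_w = np.array([1.0, 2.0, -3.0, 0.5, 4.0, -1.5])
y = X @ true_w + 0.1 * rng.normal(size=200)

# Normal equations: (X^T X) w = X^T y, with X^T X symmetric positive definite
A, b = X.T @ X, X.T @ y
w_cg = conjugate_gradient(A, b)
w_pinv = np.linalg.pinv(X) @ y           # pseudo-inverse baseline

print(np.allclose(w_cg, w_pinv, atol=1e-6))  # both recover the same weights

In exact arithmetic the conjugate gradient iteration reaches the exact solution in at most as many steps as there are parameters, which is what makes it attractive compared with plain gradient descent on the same least squares objective.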


Metadata
Title
Conjugate Gradient Method for finding Optimal Parameters in Linear Regression
Authors
Vishal Menon
V. Ashwin
G. Gopakumar
Copyright year
2024
Publisher
Springer Nature Singapore
DOI
https://doi.org/10.1007/978-981-99-3481-2_15
