2018 | OriginalPaper | Chapter

Accelerated Gradient and Block-Wise Gradient Methods for Big Data Factorization

Authors : M. Reza Peyghami, Kevin Yang, Shengyuan Chen, Zijiang Yang, Masoud Ataei

Published in: Advances in Artificial Intelligence

Publisher: Springer International Publishing


Abstract

A problem often encountered in the analysis of large-scale data is the approximation of a given matrix \(A\in \mathbf {R}^{m\times n}\) by \(UV^T\), where \(U\in \mathbf {R}^{m\times r}\), \(V\in \mathbf {R}^{n\times r}\) and \(r < \min \{ m, n \}\). This paper tackles the problem by proposing an accelerated gradient descent algorithm together with its stochastic counterpart. These frameworks are suitable candidates for circumventing the computational difficulties of computing the SVD of big matrices. Moreover, since big data are usually presented and stored in fixed-size blocks, we further propose a block-wise gradient descent algorithm for their low-rank approximation. A stochastic block-wise gradient method is also suggested to enhance computational efficiency when a large number of blocks are present in the problem. Under standard assumptions, we investigate the convergence properties of the block-wise approach. Computational results on both synthetic and real-world data are provided.
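As a point of reference for the factorization problem described above, the plain (non-accelerated, full-batch) gradient descent baseline can be sketched as follows. This is a minimal illustration of minimizing \(\Vert A - UV^T\Vert _F^2\); the function name `gd_factorize`, the step size, and the iteration count are assumptions for the sketch, not the paper's accelerated or block-wise schemes.

```python
import numpy as np

def gd_factorize(A, r, lr=1e-3, iters=2000, seed=0):
    """Approximate A (m x n) by U @ V.T, with U (m x r) and V (n x r),
    via plain gradient descent on f(U, V) = ||A - U V^T||_F^2.
    A minimal baseline sketch, not the paper's accelerated method."""
    rng = np.random.default_rng(seed)
    m, n = A.shape
    U = 0.1 * rng.standard_normal((m, r))
    V = 0.1 * rng.standard_normal((n, r))
    for _ in range(iters):
        R = U @ V.T - A  # residual matrix
        # simultaneous gradient step: grad_U = 2 R V, grad_V = 2 R^T U
        U, V = U - lr * 2 * R @ V, V - lr * 2 * R.T @ U
    return U, V

# Recover a synthetic rank-2 matrix.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 2)) @ rng.standard_normal((2, 40))
U, V = gd_factorize(A, r=2)
err = np.linalg.norm(A - U @ V.T) / np.linalg.norm(A)
```

The accelerated variant in the paper replaces the plain step with a momentum-type update, and the block-wise variants update `U` and `V` using one stored block of `A` at a time rather than the full residual.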


Metadata
Copyright Year
2018
DOI
https://doi.org/10.1007/978-3-319-89656-4_20
