
On greedy randomized block Gauss–Seidel method with averaging for sparse linear least-squares problems

Authors: Yimou Liao, Tianxiu Lu

Published in: Calcolo | Issue 4/2023


Abstract

This paper presents a greedy randomized average block Gauss–Seidel (GRABGS) method for solving sparse linear least-squares problems. The GRABGS method uses a novel probability criterion to select the controlling index set of coordinates, aiming to capture subvectors whose norms are relatively large, and minimizes the convex quadratic objective by performing multiple exact line searches per iteration and averaging the resulting updates. Additionally, the GRABGS method belongs to the family of randomized block Gauss–Seidel methods and is well suited to parallel implementation. The convergence analysis covers two types of extrapolation stepsize: constant and adaptive. It is proved that the GRABGS method converges to the unique solution of the sparse linear least-squares problem when the coefficient matrix has full column rank. Numerical examples demonstrate the superiority of this method over the greedy randomized coordinate descent method and several existing state-of-the-art block Gauss–Seidel methods.
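The abstract outlines the main ingredients: a probability criterion that favors coordinates whose residual subvectors have relatively large norms, exact per-coordinate line searches, an averaged block update, and an extrapolation stepsize. Purely as a hedged illustration of how such an iteration could be assembled, the NumPy sketch below combines these ingredients; the threshold rule, block size, uniform averaging weights, and the constant stepsize `omega` are assumptions made for exposition and do not reproduce the GRABGS method or its adaptive stepsize from the paper.

```python
import numpy as np

# Hedged sketch of a greedy randomized average block Gauss-Seidel iteration
# for min_x ||A x - b||_2^2. The selection threshold, block size, weights,
# and stepsize are illustrative assumptions, not the published GRABGS method.
def average_block_gauss_seidel(A, b, n_iters=500, omega=1.5, theta=0.5,
                               max_block=20, seed=0):
    rng = np.random.default_rng(seed)
    m, n = A.shape
    col_norms2 = np.einsum("ij,ij->j", A, A)   # ||A_j||^2 for every column j
    x = np.zeros(n)
    r = b - A @ x                              # current residual
    for _ in range(n_iters):
        # per-coordinate "gain" |A_j^T r|^2 / ||A_j||^2: the objective
        # decrease obtained by updating coordinate j alone exactly
        g = (A.T @ r) ** 2 / col_norms2
        # greedy criterion: keep coordinates whose gain is relatively large
        # (a convex mix of the largest and the average gain; theta assumed)
        tau = theta * g.max() + (1.0 - theta) * g.mean()
        J = np.flatnonzero(g >= tau)
        # randomly sample a working block from J, proportional to the gains
        p = g[J] / g[J].sum()
        block = rng.choice(J, size=min(max_block, J.size), replace=False, p=p)
        # exact line search for each selected coordinate ...
        alphas = (A[:, block].T @ r) / col_norms2[block]
        # ... then average the single-coordinate updates and apply an
        # extrapolation stepsize omega (a constant-stepsize variant)
        x[block] += omega * alphas / block.size
        r = b - A @ x                          # refresh the residual
    return x
```

For a quick sanity check on a small full-column-rank matrix, the output can be compared against `np.linalg.lstsq(A, b, rcond=None)[0]`; in the paper's setting the fixed `omega` would be replaced by the adaptive extrapolation stepsize analysed there.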
Metadata
Title
On greedy randomized block Gauss–Seidel method with averaging for sparse linear least-squares problems
Authors
Yimou Liao
Tianxiu Lu
Publication date
01-11-2023
Publisher
Springer International Publishing
Published in
Calcolo / Issue 4/2023
Print ISSN: 0008-0624
Electronic ISSN: 1126-5434
DOI
https://doi.org/10.1007/s10092-023-00549-x
