Abstract
The development of software for minimization problems is often based on a line search method. We consider line search methods that satisfy sufficient decrease and curvature conditions, and formulate the problem of determining a point that satisfies these two conditions as that of finding a point in a set T(μ). We describe a search algorithm for this problem that produces a sequence of iterates converging to a point in T(μ) and that, except in pathological cases, terminates in a finite number of steps. Numerical results for an implementation of the search algorithm on a set of test functions show that the algorithm terminates within a small number of iterations.
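To make the two conditions concrete, the following sketch checks the sufficient decrease (Armijo) condition and the (strong) curvature condition along a search direction, and runs a naive bracket-and-bisect search for a step in the acceptable set. This is not the paper's algorithm; the names (`phi`, `dphi`, `c1`, `c2`) and the search strategy are illustrative assumptions.

```python
# Illustrative sketch only, not the paper's algorithm. T(mu) is, roughly,
# the set of step lengths alpha satisfying both conditions below.

def satisfies_conditions(phi, dphi, alpha, c1=1e-4, c2=0.9):
    """True if step length alpha satisfies both line search conditions.

    phi(a)  = f(x + a*d): the objective restricted to the direction d
    dphi(a) = derivative of phi; dphi(0) must be negative (descent)
    """
    decrease = phi(alpha) <= phi(0) + c1 * alpha * dphi(0)  # sufficient decrease
    curvature = abs(dphi(alpha)) <= c2 * abs(dphi(0))       # curvature condition
    return decrease and curvature


def find_step(phi, dphi, c1=1e-4, c2=0.9, maxiter=100):
    """Naive bracket-and-bisect search for an acceptable step length."""
    lo, hi, alpha = 0.0, None, 1.0
    for _ in range(maxiter):
        if phi(alpha) > phi(0) + c1 * alpha * dphi(0):
            hi = alpha                       # too long: decrease fails
        elif dphi(alpha) < -c2 * abs(dphi(0)):
            lo = alpha                       # too short: slope still steep
        elif dphi(alpha) > c2 * abs(dphi(0)):
            hi = alpha                       # overshot the minimizer
        else:
            return alpha                     # both conditions hold
        alpha = 2.0 * alpha if hi is None else 0.5 * (lo + hi)
    raise RuntimeError("no acceptable step found")
```

For example, with `phi = lambda a: (a - 4.0) ** 2` the curvature condition rejects very small steps (the slope is still steep) while sufficient decrease rejects overly long ones; `find_step` returns a step satisfying both.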