CONVERGENCE AND CONVERGENCE RATES OF DAMPED NEWTON METHODS
Abstract
In this paper, we study the convergence and convergence rates of damped Newton algorithms for solving unconstrained optimization problems whose objective functions are twice continuously differentiable. Under the assumption that the Hessian matrix of the objective function is positive definite on an open set containing the level set determined by the objective value at the starting point, we prove that the sequence generated by the damped Newton algorithm remains in that open set and that the corresponding sequence of objective function values is monotonically decreasing. If the iterative sequence has a limit point, then that limit point is a strong local minimizer of the objective function, and the whole sequence converges globally to this minimizer at a superlinear rate. Furthermore, if the Hessian matrix of the objective function is Lipschitz continuous, the iterative sequence attains a quadratic convergence rate.
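For readers who wish to experiment with the scheme described above, the following Python sketch implements one standard form of the damped Newton method: the full Newton direction combined with an Armijo backtracking line search. It is an illustrative sketch only, not the exact algorithm or notation of the paper; the callables f, grad_f, hess_f and the parameters sigma and beta are placeholders chosen for the example.

import numpy as np

def damped_newton(f, grad_f, hess_f, x0, tol=1e-8, max_iter=100, beta=0.5, sigma=1e-4):
    """Damped Newton method with Armijo backtracking (illustrative sketch)."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad_f(x)
        if np.linalg.norm(g) <= tol:   # stationarity test
            break
        H = hess_f(x)
        # Newton direction; assumes H is positive definite on the relevant
        # level set, mirroring the standing assumption of the paper.
        d = np.linalg.solve(H, -g)
        # Damping: backtrack until the Armijo sufficient-decrease condition holds.
        t, fx = 1.0, f(x)
        while f(x + t * d) > fx + sigma * t * g.dot(d):
            t *= beta
        x = x + t * d
    return x

# Usage on a strictly convex quadratic f(x) = 0.5 x^T A x - b^T x,
# whose unique minimizer solves A x = b.
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
x_star = damped_newton(lambda x: 0.5 * x @ A @ x - b @ x,
                       lambda x: A @ x - b,
                       lambda x: A,
                       x0=np.zeros(2))

On this quadratic example the first full Newton step already reaches the minimizer, so the backtracking loop accepts the unit step; on nonquadratic problems the damping is what produces the monotone decrease of objective values stated in the abstract.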
Keywords
convergence rates, damped Newton algorithm, global convergence, positive definiteness, quadratic, superlinear