GRADIENT-BASED METHODS FOR NOISY QUADRATIC OPTIMIZATION

Huỳnh Trâm Bùi, Lam Thuan Le
Ho Chi Minh City University of Education


Abstract

In this paper, we study the problem of minimizing a noisy multivariate quadratic function within an inexact gradient descent framework, where gradients are approximated by central finite difference schemes. Two noise models are analyzed: (i) the quadratic matrix is known, and noise enters the linear and constant terms; (ii) both the matrix and the linear term are known, and noise enters only the constant term. We derive explicit error bounds, establish convergence rates using Polyak's estimates (Polyak, 1987), and determine the iteration complexity of the inexact gradient descent method with a fixed step size.
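To make the setting concrete, the following sketch (not the authors' code; the test problem, step size, and noise level are illustrative assumptions) minimizes a small quadratic under noise model (ii), where only the constant term of each function evaluation is perturbed, using a central-difference gradient inside fixed-step gradient descent:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical test problem: f(x) = 0.5 x^T A x - b^T x, with a small
# additive perturbation of the constant term in every evaluation
# (noise model (ii) of the abstract).
A = np.array([[3.0, 1.0], [1.0, 2.0]])
b = np.array([1.0, -1.0])
noise_level = 1e-8  # assumed noise magnitude

def f_noisy(x):
    delta = noise_level * rng.uniform(-1.0, 1.0)
    return 0.5 * x @ A @ x - b @ x + delta

def central_diff_grad(f, x, h=1e-4):
    """Inexact gradient via central finite differences with step h."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
    return g

# Fixed step size 1/L, with L the largest eigenvalue of A
# (the Lipschitz constant of the exact gradient).
L = np.linalg.eigvalsh(A).max()
x = np.zeros(2)
for _ in range(200):
    x = x - (1.0 / L) * central_diff_grad(f_noisy, x)

x_star = np.linalg.solve(A, b)  # exact minimizer, for comparison
print(np.linalg.norm(x - x_star))
```

As the error bounds suggest, the iterates do not converge exactly: the gradient error is of order `noise_level / h`, so the final distance to the minimizer stalls at a noise-dependent floor rather than decreasing to zero.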


References

Berahas, A. S., Cao, L., Choromanski, K., & Scheinberg, K. (2022). A theoretical and empirical comparison of gradient approximations in derivative-free optimization. Foundations of Computational Mathematics, 22(2), 507–560. https://doi.org/10.1007/s10208-021-09513-z
Izmailov, A. F., & Solodov, M. V. (2014). Newton-type methods for optimization and variational problems (Vol. 1). Springer. https://doi.org/10.1007/978-3-319-04247-3
Polyak, B. T. (1987). Introduction to optimization. Optimization Software.
Nocedal, J., & Wright, S. J. (2006). Numerical optimization (2nd ed.). Springer. https://doi.org/10.1007/978-0-387-40065-5