GRADIENT-BASED METHODS FOR NOISY QUADRATIC OPTIMIZATION
Abstract
In this paper, we study the problem of minimizing a noisy multivariate quadratic function within an inexact gradient descent framework, where the gradients are approximated via central finite difference schemes. Two noise models are analyzed: (i) the quadratic matrix is known, with noise in the linear and constant terms; (ii) both the matrix and the linear term are known, with noise only in the constant term. We derive explicit error bounds, establish convergence rates using Polyak’s estimates (Polyak, 1987), and determine the iteration complexity of the iterates generated by the inexact gradient descent method with a fixed step size.
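The setting described in the abstract can be sketched in a few lines of Python. This is an illustrative sketch only: the specific matrix `A`, vector `b`, scalar `c`, noise level `sigma`, difference step `h`, and step size `alpha` are assumptions chosen for demonstration, not the paper's exact formulation; the noise is placed in the constant term, corresponding to noise model (ii).

```python
import numpy as np

# Hypothetical problem data (assumptions, not from the paper):
# minimize f(x) = 0.5 x^T A x + b^T x + c, with only noisy function values available.
rng = np.random.default_rng(0)
A = np.array([[3.0, 1.0], [1.0, 2.0]])   # known positive definite quadratic matrix
b = np.array([1.0, -1.0])
c = 0.5
sigma = 1e-4                             # noise level in the constant term

def f_noisy(x):
    """Noisy evaluation of the quadratic (noise only in the constant term)."""
    return 0.5 * x @ A @ x + b @ x + c + sigma * rng.standard_normal()

def central_diff_grad(func, x, h=1e-3):
    """Central finite difference approximation of the gradient of func at x."""
    g = np.zeros_like(x)
    for i in range(x.size):
        e = np.zeros_like(x)
        e[i] = h
        g[i] = (func(x + e) - func(x - e)) / (2.0 * h)
    return g

def inexact_gradient_descent(func, x0, alpha=0.1, iters=200):
    """Fixed-step gradient descent driven by inexact (finite difference) gradients."""
    x = x0.copy()
    for _ in range(iters):
        x -= alpha * central_diff_grad(func, x)
    return x

x_star = np.linalg.solve(A, -b)          # exact minimizer, for comparison
x_hat = inexact_gradient_descent(f_noisy, np.zeros(2))
print(np.linalg.norm(x_hat - x_star))    # residual driven by the noise and the step h
```

Because the evaluations are noisy, the iterates do not converge exactly to the minimizer; they settle in a neighborhood whose radius is governed by the gradient approximation error, which is the kind of behavior the paper's error bounds quantify.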
Keywords
noisy multivariate quadratic functions, derivative-free optimization, central finite difference scheme, inexact gradient descent method
References
Izmailov, A. F., & Solodov, M. V. (2014). Newton-type methods for optimization and variational problems (Vol. 1). Springer. https://doi.org/10.1007/978-3-319-04247-3
Polyak, B. T. (1987). Introduction to optimization. https://doi.org/10.1007/978-0-387-40065-5
Nocedal, J., & Wright, S. J. (2006). Numerical optimization (2nd ed.). Springer.