Finite Difference Gradient Approximation: To Randomize or Not?

Cited by: 8
Authors
Scheinberg, Katya [1 ]
Affiliations
[1] Cornell Univ, Sch Operat Res & Informat Engn, Ithaca, NY 14853 USA
Keywords
finite difference approximation; gradient descent; randomized; stochastic approximation; optimization
DOI
10.1287/ijoc.2022.1218
CLC Number
TP39 [Computer Applications];
Discipline Codes
081203; 0835
Abstract
We discuss two classes of methods for approximating gradients of noisy black box functions: the classical finite difference method and the recently popular randomized finite difference methods. Despite the popularity of the latter, we argue that it is unclear whether the randomized schemes have an advantage over the traditional methods when employed inside an optimization method. We point to theoretical and practical evidence showing that the opposite is true, at least in a general optimization setting. We then pose the question of whether a particular setting exists in which the advantage of the new methods can be clearly shown, at least numerically. The larger underlying challenge is the development of black box optimization methods that scale well with the problem dimension.
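The two estimator classes contrasted in the abstract can be summarized in a short sketch. The following is a minimal illustration, not the paper's implementation: the function names are hypothetical, the classical scheme is shown as forward differences, and the randomized scheme is shown in its Gaussian-smoothing form; the paper's analysis covers the general classes, not these specific choices.

```python
import numpy as np

def fd_gradient(f, x, h=1e-6):
    """Classical finite differences: one perturbation per coordinate
    direction, so n + 1 function evaluations in dimension n."""
    n = x.size
    fx = f(x)
    g = np.empty(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        g[i] = (f(x + e) - fx) / h
    return g

def randomized_fd_gradient(f, x, h=1e-6, m=1, rng=None):
    """Randomized finite differences (Gaussian-smoothing estimator):
    average m directional difference quotients along random Gaussian
    directions, using m + 1 evaluations regardless of dimension."""
    rng = np.random.default_rng() if rng is None else rng
    fx = f(x)
    g = np.zeros(x.size)
    for _ in range(m):
        u = rng.standard_normal(x.size)
        g += (f(x + h * u) - fx) / h * u
    return g / m

# Example on a quadratic whose true gradient is x itself:
f = lambda x: 0.5 * np.dot(x, x)
x = np.array([1.0, -2.0, 3.0])
print(fd_gradient(f, x))                    # close to [1, -2, 3]
print(randomized_fd_gradient(f, x, m=50))   # noisy estimate of the same
```

The sketch makes the cost/accuracy trade-off at issue concrete: the classical estimator pays n + 1 evaluations for a deterministic, componentwise-accurate approximation, while the randomized estimator can be cheaper per step (m can be much smaller than n) but returns a high-variance estimate, which is exactly why its advantage inside an optimization loop is debatable.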
Pages: 2384-2388
Number of pages: 5