Global random optimization by simultaneous perturbation stochastic approximation

Cited by: 0
Authors
Maryak, JL [1 ]
Chin, DC [1 ]
Affiliations
[1] Johns Hopkins Univ, Appl Phys Lab, Laurel, MD 20723 USA
Keywords
stochastic optimization; global convergence; stochastic approximation; simultaneous perturbation stochastic approximation (SPSA); recursive annealing
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
A desire with iterative optimization techniques is that the algorithm reach the global optimum rather than become stranded at a local optimum. In this paper, we examine the theoretical and numerical global convergence properties of a "gradient-free" stochastic approximation algorithm called SPSA, which has performed well in complex optimization problems. We establish two theorems on the global convergence of SPSA. The first provides conditions under which SPSA converges in probability to a global optimum using the well-known method of injected noise; the injected noise prevents the algorithm from converging prematurely to a local optimum. In the second theorem, we show that, under different conditions, "basic" SPSA without injected noise can also achieve convergence in probability to a global optimum. This occurs because of the noise effectively (and automatically) introduced into the algorithm by the special form of the SPSA gradient approximation. Global convergence without injected noise can have important benefits in the setup (tuning) and performance (rate of convergence) of the algorithm. The discussion is supported by numerical studies showing favorable comparisons of SPSA to simulated annealing and genetic algorithms.
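The "basic" SPSA recursion the abstract refers to is easy to state concretely. Below is a minimal Python sketch, assuming the standard two-measurement SPSA gradient estimate with Bernoulli perturbations and commonly quoted guideline gain exponents; the function name, hyperparameter values, and test function are illustrative assumptions, not settings taken from the paper.

```python
import numpy as np

def spsa_minimize(loss, theta0, n_iter=5000,
                  a=0.1, c=0.1, A=100.0, alpha=0.602, gamma=0.101,
                  seed=0):
    """Minimal sketch of basic SPSA (no injected noise).

    Hypothetical helper: gain-sequence values follow commonly quoted
    guideline settings, not the configuration used in the paper.
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float).copy()
    for k in range(n_iter):
        a_k = a / (k + 1 + A) ** alpha   # decaying step-size gain
        c_k = c / (k + 1) ** gamma       # decaying perturbation size
        # Simultaneous +-1 Bernoulli perturbation of every coordinate.
        delta = rng.choice([-1.0, 1.0], size=theta.shape)
        # Two loss evaluations estimate the whole gradient at once; the
        # estimation error acts like the "automatic" noise the abstract
        # credits for global convergence without injected noise.
        y_plus = loss(theta + c_k * delta)
        y_minus = loss(theta - c_k * delta)
        g_hat = (y_plus - y_minus) / (2.0 * c_k * delta)
        theta -= a_k * g_hat
        # The injected-noise variant of the first theorem would add a
        # slowly decaying random term here, e.g.
        # theta += b_k * rng.standard_normal(theta.shape).
    return theta

# Illustrative use on a simple multimodal test function (not from the paper).
if __name__ == "__main__":
    rastrigin = lambda x: 10 * x.size + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))
    print(spsa_minimize(rastrigin, theta0=np.full(5, 3.0)))
```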
Pages: 756-762
Number of pages: 7