A Second-Order Finite-Difference Method for Derivative-Free Optimization

Cited: 1
Authors
Chen, Qian [1 ]
Wang, Peng [1 ,2 ]
Zhu, Detong [3 ]
Affiliations
[1] Hainan Normal Univ, Math & Stat Coll, Haikou 570203, Hainan, Peoples R China
[2] Hainan Normal Univ, Key Lab, Minist Educ, Haikou 570203, Hainan, Peoples R China
[3] Shanghai Normal Univ, Math & Sci Coll, Shanghai 200234, Peoples R China
Funding
National Natural Science Foundation of China;
Keywords
GRADIENT; CONVEX;
DOI
10.1155/2024/1947996
Chinese Library Classification
O1 [Mathematics];
Discipline Codes
0701; 070101;
Abstract
In this paper, a second-order finite-difference method is proposed for finding second-order stationary points of derivative-free nonconvex unconstrained optimization problems. Forward-difference or central-difference schemes are used to approximate the gradient and the Hessian matrix of the objective function. Within the traditional trust-region framework, the search direction is obtained by minimizing a trust-region subproblem built from these approximations. Global convergence of the algorithm is established without the fully quadratic assumption. Numerical results demonstrate the effectiveness of the algorithm with both forward-difference and central-difference approximations.
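The ingredients named in the abstract, finite-difference gradient and Hessian approximations fed into a trust-region subproblem, can be sketched as follows. This is a generic illustration under assumptions, not the authors' algorithm: the function names, step sizes `h`, and the crude regularized subproblem solver are all illustrative choices.

```python
import numpy as np

def fd_gradient(f, x, h=1e-5, scheme="central"):
    """Approximate the gradient of f at x by forward or central differences."""
    n = x.size
    g = np.zeros(n)
    for i in range(n):
        e = np.zeros(n)
        e[i] = h
        if scheme == "central":
            g[i] = (f(x + e) - f(x - e)) / (2.0 * h)
        else:  # forward difference
            g[i] = (f(x + e) - f(x)) / h
    return g

def fd_hessian(f, x, h=1e-4):
    """Approximate the Hessian of f at x by central differences (symmetrized)."""
    n = x.size
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            ei = np.zeros(n); ei[i] = h
            ej = np.zeros(n); ej[j] = h
            H[i, j] = (f(x + ei + ej) - f(x + ei - ej)
                       - f(x - ei + ej) + f(x - ei - ej)) / (4.0 * h * h)
    return 0.5 * (H + H.T)

def trust_region_step(g, H, delta):
    """Crudely minimize g^T p + 0.5 p^T H p subject to ||p|| <= delta
    by Levenberg-style regularization (small dense problems only)."""
    lam_min = np.linalg.eigvalsh(H)[0]
    lam = max(0.0, -lam_min) + 1e-8  # shift to make H + lam*I positive definite
    p = -g
    for _ in range(200):
        p = np.linalg.solve(H + lam * np.eye(g.size), -g)
        if np.linalg.norm(p) <= delta:
            return p
        lam *= 2.0  # increase regularization until the step fits the radius
    return p * (delta / np.linalg.norm(p))
```

On a convex quadratic such as f(x) = x1^2 + 2*x2^2, the central-difference gradient and Hessian are exact up to rounding, and with a large trust radius the step reduces to the Newton step. A production method would additionally manage the trust radius via the ratio of actual to predicted reduction, which this sketch omits.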
Pages: 12