The adjoint Newton algorithm for large-scale unconstrained optimization in meteorology applications

Cited by: 20
Authors
Wang, Z [1 ]
Droegemeier, K
White, L
Affiliations
[1] Univ Oklahoma, Ctr Anal & Predict Storms, Norman, OK 73019 USA
[2] Univ Oklahoma, Sch Meteorol, Norman, OK 73019 USA
[3] Univ Oklahoma, Dept Math, Norman, OK 73019 USA
Keywords
adjoint Newton algorithm; variational data assimilation; LBFGS; truncated Newton algorithm; large-scale unconstrained minimization;
DOI
10.1023/A:1018321307393
Chinese Library Classification (CLC)
C93 [Management Science]; O22 [Operations Research]
Discipline Classification Codes
070105; 12; 1201; 1202; 120202
Abstract
A new algorithm is presented for carrying out the large-scale unconstrained optimization required in variational data assimilation using the Newton method. The algorithm is referred to as the adjoint Newton algorithm. It is based on first- and second-order adjoint techniques, which allow the Newton line-search direction to be obtained by integrating the tangent linear model backwards in time (starting from a final condition with negative time steps). The error introduced by approximating the Hessian (the matrix of second-order derivatives) of the cost function with respect to the control variables, as is done in quasi-Newton-type algorithms, is thus completely eliminated, and the storage problem associated with the Hessian disappears because the explicit Hessian is never required. The adjoint Newton algorithm is applied to three one-dimensional models and to a two-dimensional limited-area shallow water equations model with both model-generated and First Global Geophysical Experiment data. We compare the performance of the adjoint Newton algorithm with that of the truncated Newton, adjoint truncated Newton, and LBFGS methods. Our numerical tests indicate that the adjoint Newton algorithm is very efficient and locates the minima within three or four iterations for the problems tested here. In the case of the two-dimensional shallow water equations model, the adjoint Newton algorithm improves upon the efficiency of the truncated Newton and LBFGS methods by a factor of at least 14 in terms of the CPU time required to satisfy the same convergence criterion. The Newton, truncated Newton, and LBFGS methods are general-purpose unconstrained minimization methods. The adjoint Newton algorithm is only useful for optimal control problems where the model equations serve as strong constraints and whose corresponding tangent linear model can be integrated backwards in time. When the backward integration of the tangent linear model is ill-posed in the sense of Hadamard, the adjoint Newton algorithm may fail, so it must be used with some caution. A possible solution to this current weakness of the adjoint Newton algorithm is proposed.
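
To make the Hessian-free idea in the abstract concrete, the following is a minimal, hypothetical sketch (Python/NumPy) of a matrix-free Newton step for a toy strong-constraint variational assimilation problem with a linear model and an identity observation operator. The gradient comes from an adjoint sweep and Hessian-vector products from paired tangent-linear and adjoint sweeps, so no explicit Hessian is ever formed or stored; the Newton direction is then obtained by conjugate gradients. This illustrates the general first- and second-order adjoint machinery only; it is not the paper's specific construction, which obtains the Newton direction directly from a backward-in-time integration of the tangent linear model. All names (forward, gradient, hessian_vector, newton_direction) and the toy model matrix M are assumptions introduced for illustration.

# Illustrative sketch (not the paper's algorithm): a matrix-free Newton step
# for a toy strong-constraint variational assimilation problem.
#
# Model: linear damped upwind advection on a periodic domain, x_{k+1} = M x_k.
# Cost:  J(x0) = 0.5 * sum_k || x_k(x0) - y_k ||^2  (observation operator = identity).
# The gradient is computed with an adjoint sweep and Hessian-vector products
# with tangent-linear/adjoint sweeps, so the explicit Hessian is never formed;
# the Newton direction solves H d = -grad(J) by conjugate gradients.

import numpy as np

n, nsteps = 40, 10
# Toy linear "forecast model": damped upwind advection step (hypothetical choice)
M = 0.6 * np.eye(n) + 0.4 * np.roll(np.eye(n), 1, axis=0)

def forward(x0):
    """Integrate the model forward, returning the trajectory x_0 .. x_nsteps."""
    traj = [x0]
    for _ in range(nsteps):
        traj.append(M @ traj[-1])
    return traj

def gradient(x0, obs):
    """Gradient of J via one backward (adjoint) sweep: sum_k (M^T)^k (x_k - y_k)."""
    traj = forward(x0)
    lam = np.zeros(n)                     # adjoint variable
    for k in range(nsteps, -1, -1):
        lam = lam + (traj[k] - obs[k])    # forcing by the innovation at step k
        if k > 0:
            lam = M.T @ lam               # adjoint model step backwards in time
    return lam

def hessian_vector(v):
    """Hessian-vector product via a tangent-linear (forward) plus adjoint (backward)
    sweep. For this linear model and quadratic cost, H v = sum_k (M^k)^T (M^k v)."""
    dtraj = forward(v)                    # tangent-linear model equals the model (M is linear)
    lam = np.zeros(n)
    for k in range(nsteps, -1, -1):
        lam = lam + dtraj[k]
        if k > 0:
            lam = M.T @ lam
    return lam

def newton_direction(g, tol=1e-10, maxiter=200):
    """Solve H d = -g with conjugate gradients, using only Hessian-vector products."""
    d, r = np.zeros(n), -g.copy()
    p, rs = r.copy(), r @ r
    for _ in range(maxiter):
        Hp = hessian_vector(p)
        alpha = rs / (p @ Hp)
        d += alpha * p
        r -= alpha * Hp
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs) * p
        rs = rs_new
    return d

# Twin experiment: observations generated from a known truth, first guess of zero.
truth = np.sin(2 * np.pi * np.arange(n) / n)
obs = forward(truth)
x0 = np.zeros(n)
for it in range(3):
    g = gradient(x0, obs)
    x0 = x0 + newton_direction(g)         # exact Newton step for this quadratic cost
    print(f"iter {it + 1}: |grad| = {np.linalg.norm(gradient(x0, obs)):.2e}")

Because this toy cost is quadratic, a single exact Newton step reaches the minimum up to the conjugate-gradient tolerance; the few-iteration convergence reported in the abstract reflects the same exploitation of exact second-order information, obtained there through the backward integration of the tangent linear model rather than through an inner conjugate-gradient solve.
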
Pages: 283-320
Number of pages: 38