Distributionally Constrained Black-Box Stochastic Gradient Estimation and Optimization

Cited by: 0
Authors
Lam, Henry [1 ]
Zhang, Junhui [2 ]
Affiliations
[1] Columbia Univ, Dept Ind Engn & Operat Res, New York, NY 10027 USA
[2] Columbia Univ, Dept Appl Phys & Appl Math, New York, NY 10027 USA
Funding
U.S. National Science Foundation;
Keywords
zeroth-order gradient estimation; finite difference; simultaneous perturbation; distributionally robust optimization; stochastic approximation; ROBUST; SIMULATION; SENSITIVITY; RISK;
DOI
10.1287/opre.2021.0307
Chinese Library Classification (CLC)
C93 [Management Science];
Discipline Classification Codes
12; 1201; 1202; 120202;
Abstract
We consider stochastic gradient estimation using only black-box function evaluations, where the function argument lies within a probability simplex. This problem is motivated by gradient-descent optimization procedures in multiple applications in distributionally robust analysis and inverse model calibration involving decision variables that are probability distributions. We are especially interested in obtaining gradient estimators where one or a few sample observations or simulation runs apply simultaneously to all directions. Conventional zeroth-order gradient schemes such as simultaneous perturbation face challenges because the moment conditions required to "cancel" higher-order biases cannot be satisfied without violating the simplex constraints. We investigate a new set of required conditions on the random perturbation generator, which leads us to a class of implementable gradient estimators using Dirichlet mixtures. We study the statistical properties of these estimators and their utility in constrained stochastic approximation. We demonstrate the effectiveness of our procedures and compare them with benchmarks via several numerical examples.
Pages: 16
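
The abstract describes a simultaneous-perturbation-style scheme whose random directions are generated from Dirichlet distributions so that perturbed points remain on the probability simplex. The sketch below is a minimal illustration of that general idea under stated assumptions, not the paper's actual Dirichlet-mixture estimator or its bias-cancellation conditions: it draws a Dirichlet point D, perturbs along D - p (so that (1 - c)p + cD stays on the simplex), forms a forward-difference estimate from two black-box evaluations, and plugs it into a projected stochastic-approximation loop. All function names, step-size schedules, and parameters (sp_gradient_estimate, alpha_scale, a0, c0) are illustrative assumptions.

```python
import numpy as np


def project_simplex(v):
    """Euclidean projection onto the probability simplex {p >= 0, sum(p) = 1}."""
    u = np.sort(v)[::-1]                       # components in decreasing order
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > css - 1.0)[0][-1]
    theta = (css[rho] - 1.0) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)


def sp_gradient_estimate(f, p, c, alpha, rng):
    """Forward-difference, simultaneous-perturbation-style estimate at p.

    The direction D - p (with D ~ Dirichlet(alpha)) sums to zero, so
    p + c * (D - p) = (1 - c) * p + c * D stays on the simplex for c in (0, 1].
    Illustrative single-direction estimator only, not the paper's construction.
    """
    delta = rng.dirichlet(alpha) - p
    df = f(p + c * delta) - f(p)               # two black-box evaluations
    return (df / c) * delta / np.dot(delta, delta)


def constrained_sa(f, p0, n_iter=500, a0=0.5, c0=0.1, alpha_scale=5.0, seed=0):
    """Projected stochastic approximation driven by the estimator above."""
    rng = np.random.default_rng(seed)
    p = np.asarray(p0, dtype=float)
    for k in range(1, n_iter + 1):
        a_k = a0 / k                           # step-size schedule (assumed)
        c_k = min(c0 / k ** 0.25, 1.0)         # perturbation size kept within (0, 1]
        g = sp_gradient_estimate(f, p, c_k, alpha_scale * np.ones_like(p), rng)
        p = project_simplex(p - a_k * g)
    return p


if __name__ == "__main__":
    # Toy objective: noisy quadratic loss minimized at the target distribution q.
    q = np.array([0.5, 0.3, 0.2])
    noise_rng = np.random.default_rng(1)
    f = lambda p: np.sum((p - q) ** 2) + 0.01 * noise_rng.normal()
    print(constrained_sa(f, np.ones(3) / 3))
```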