Private (Stochastic) Non-Convex Optimization Revisited: Second-Order Stationary Points and Excess Risks

Cited by: 0
Authors:
Ganesh, Arun [1]
Liu, Daogao [2]
Oh, Sewoong [1,2]
Thakurta, Abhradeep [3]
Affiliations:
[1] Google Research, Mountain View, CA 94043, USA
[2] University of Washington, Seattle, WA, USA
[3] Google DeepMind, Mountain View, CA, USA
Keywords: none listed
DOI: not available
CLC number: TP18 [Artificial Intelligence Theory]
Discipline codes: 081104; 0812; 0835; 1405
Abstract
We revisit the challenge of non-convex optimization under the constraint of differential privacy. Building on the variance-reduced algorithm SpiderBoost, we propose a framework that employs two types of gradient oracles: one that estimates the gradient at a single point and a more cost-effective one that estimates the gradient difference between two points. The framework maintains accurate gradient estimates throughout the run, which in turn improves the rates for identifying second-order stationary points. We additionally consider the more challenging task of locating the global minima of a non-convex objective via the exponential mechanism, with almost no assumptions. Our preliminary results suggest that the regularized exponential mechanism can match previous empirical and population risk bounds for algorithms with polynomial running time while removing the smoothness assumption. Furthermore, when running time is not a concern, the exponential mechanism achieves a promising population risk bound, and we provide a nearly matching lower bound.
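To make the two-oracle idea in the abstract concrete, here is a minimal Python sketch (not the authors' code) of a SpiderBoost-style variance-reduced loop with differentially private gradient estimates: oracle O1 privately estimates the gradient at a single point, and the cheaper oracle O2 privately estimates the gradient difference between two consecutive iterates. All function names, batch sizes, clipping thresholds, and noise scales below are illustrative assumptions, and the privacy accounting is omitted; the paper's actual algorithm, parameters, and guarantees differ.

```python
import numpy as np

def private_grad(grad_fn, batch, w, clip, sigma, rng):
    """Oracle O1: noisy mean gradient at a single point w (Gaussian mechanism)."""
    g = np.stack([grad_fn(x, w) for x in batch])                       # per-example gradients
    g = g / np.maximum(1.0, np.linalg.norm(g, axis=1, keepdims=True) / clip)   # per-example clipping
    return g.mean(axis=0) + rng.normal(0.0, sigma * clip / len(batch), size=w.shape)

def private_grad_diff(grad_fn, batch, w_new, w_old, clip, sigma, rng):
    """Oracle O2: noisy mean of grad(w_new) - grad(w_old). For smooth losses its
    sensitivity scales with ||w_new - w_old||, so a much smaller clip (and hence
    less noise) suffices when steps are small -- this is the cheaper oracle."""
    d = np.stack([grad_fn(x, w_new) - grad_fn(x, w_old) for x in batch])
    d = d / np.maximum(1.0, np.linalg.norm(d, axis=1, keepdims=True) / clip)
    return d.mean(axis=0) + rng.normal(0.0, sigma * clip / len(batch), size=w_new.shape)

def dp_spider_boost(grad_fn, data, w0, steps=200, period=20, lr=0.05,
                    clip_full=1.0, clip_diff=0.1, sigma=1.0, seed=0):
    """SpiderBoost-style loop: refresh the estimate with O1 every `period` steps,
    otherwise update it by adding a privately estimated gradient difference (O2)."""
    rng = np.random.default_rng(seed)
    w = w0.astype(float).copy()
    w_prev = w.copy()
    g = np.zeros_like(w)
    for t in range(steps):
        batch = data[rng.choice(len(data), size=min(64, len(data)), replace=False)]
        if t % period == 0:
            g = private_grad(grad_fn, batch, w, clip_full, sigma, rng)
        else:
            g = g + private_grad_diff(grad_fn, batch, w, w_prev, clip_diff, sigma, rng)
        w_prev, w = w, w - lr * g
    return w

# Toy usage (illustrative only): non-convex per-example loss ||w - x||^2 + 0.1 * sum(sin(w)).
data = np.random.default_rng(1).normal(size=(1000, 5))
grad_fn = lambda x, w: 2.0 * (w - x) + 0.1 * np.cos(w)
w_hat = dp_spider_boost(grad_fn, data, w0=np.zeros(5))
```

The design point this sketch illustrates is that the periodic full-gradient refresh pays the larger privacy cost, while the intermediate steps reuse the running estimate and only pay for low-sensitivity gradient differences, which is how the framework keeps the gradient estimate accurate at every iterate.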
Pages: 24