Stability and Differential Privacy of Stochastic Gradient Descent for Pairwise Learning with Non-Smooth Loss

Citations: 0
Authors
Yang, Zhenhuan [1 ]
Lei, Yunwen [2 ]
Lyu, Siwei [3 ]
Ying, Yiming [1 ]
Affiliations
[1] SUNY Albany, Albany, NY 12222 USA
[2] Univ Birmingham, Birmingham, W Midlands, England
[3] Univ Buffalo SUNY, Buffalo, NY USA
Keywords
RANKING; ALGORITHMS
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory]
Discipline Codes
081104; 0812; 0835; 1405
Abstract
Pairwise learning has recently received increasing attention since it subsumes many important machine learning tasks (e.g., AUC maximization and metric learning) into a unifying framework. In this paper, we give the first known stability and generalization analysis of stochastic gradient descent (SGD) for pairwise learning with non-smooth loss functions, which are widely used (e.g., Ranking SVM with the hinge loss). We introduce a novel decomposition in the stability analysis to decouple the pairwise dependent random variables, and derive generalization bounds consistent with those in the pointwise-learning setting. Furthermore, we apply our stability analysis to develop differentially private SGD for pairwise learning, for which our utility bounds match those of the state-of-the-art output perturbation method (Huai et al., 2020) with smooth losses. Finally, we illustrate the results using specific examples of AUC maximization and similarity metric learning. As a byproduct, we provide an affirmative answer to an open question on the advantage of the nuclear-norm constraint over the Frobenius-norm constraint in similarity metric learning.
Pages: 11
Related Papers (50 in total)
  • [1] Tight analyses for non-smooth stochastic gradient descent
    Harvey, Nicholas J. A.
    Liaw, Christopher
    Plan, Yaniv
    Randhawa, Sikander
    [J]. CONFERENCE ON LEARNING THEORY, VOL 99, 2019, 99
  • [2] Stability and optimization error of stochastic gradient descent for pairwise learning
    Shen, Wei
    Yang, Zhenhuan
    Ying, Yiming
    Yuan, Xiaoming
    [J]. ANALYSIS AND APPLICATIONS, 2020, 18 (05) : 887 - 927
  • [3] Convergence of Constant Step Stochastic Gradient Descent for Non-Smooth Non-Convex Functions
    Bianchi, Pascal
    Hachem, Walid
    Schechtman, Sholom
    [J]. SET-VALUED AND VARIATIONAL ANALYSIS, 2022, 30 (03) : 1117 - 1147
  • [4] Differential Privacy Stochastic Gradient Descent with Adaptive Privacy Budget Allocation
    Xie, Yun
    Li, Peng
    Wu, Chao
    Wu, Qiuling
    [J]. 2021 IEEE INTERNATIONAL CONFERENCE ON CONSUMER ELECTRONICS AND COMPUTER ENGINEERING (ICCECE), 2021, : 227 - 231
  • [5] Simple Stochastic and Online Gradient Descent Algorithms for Pairwise Learning
    Yang, Zhenhuan
    Lei, Yunwen
    Wang, Puyu
    Yang, Tianbao
    Ying, Yiming
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 34 (NEURIPS 2021), 2021, 34
  • [6] Online Stochastic Gradient Descent with Arbitrary Initialization Solves Non-smooth, Non-convex Phase Retrieval
    Tan, Yan Shuo
    Vershynin, Roman
    [J]. JOURNAL OF MACHINE LEARNING RESEARCH, 2023, 24
  • [7] A Stochastic Gradient Descent Algorithm Based on Adaptive Differential Privacy
    Deng, Yupeng
    Li, Xiong
    He, Jiabei
    Liu, Yuzhen
    Liang, Wei
    [J]. COLLABORATIVE COMPUTING: NETWORKING, APPLICATIONS AND WORKSHARING, COLLABORATECOM 2022, PT II, 2022, 461 : 133 - 152
  • [8] Fast Proximal Gradient Descent for A Class of Non-convex and Non-smooth Sparse Learning Problems
    Yang, Yingzhen
    Yu, Jiahui
    [J]. 35TH UNCERTAINTY IN ARTIFICIAL INTELLIGENCE CONFERENCE (UAI 2019), 2020, 115 : 1253 - 1262
  • [9] Existence and stability of weak solutions to stochastic differential equations with non-smooth coefficients
    Stramer, O
    Tweedie, RL
    [J]. STATISTICA SINICA, 1997, 7 (03) : 577 - 593