50 records in total
- [1] Learning with Gradient Descent and Weakly Convex Losses [J]. 24th International Conference on Artificial Intelligence and Statistics (AISTATS), 2021, 130.
- [2] A new inexact gradient descent method with applications to nonsmooth convex optimization [J]. Optimization Methods & Software, 2024.
- [3] A New Inexact Gradient Descent Method with Applications to Nonsmooth Convex Optimization [J]. arXiv, 2023.
- [7] Towards stability and optimality in stochastic gradient descent [J]. Artificial Intelligence and Statistics, Vol. 51, 2016, 51: 1290-1298.
- [8] Stability and Generalization of Decentralized Stochastic Gradient Descent [J]. Thirty-Fifth AAAI Conference on Artificial Intelligence (AAAI 2021), 2021, 35: 9756-9764.
- [9] Global Convergence and Stability of Stochastic Gradient Descent [J]. Advances in Neural Information Processing Systems 35 (NeurIPS 2022), 2022.