Scalable and Efficient Pairwise Learning to Achieve Statistical Accuracy

Cited by: 0
Authors
Gu, Bin [1 ]
Huo, Zhouyuan [2 ]
Huang, Heng [1 ,2 ]
Affiliations
[1] JDDGlobal Com, Westlake Village, CA USA
[2] Univ Pittsburgh, Dept Elect & Comp Engn, Pittsburgh, PA 15260 USA
Source
THIRTY-THIRD AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE / THIRTY-FIRST INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE / NINTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE | 2019
Keywords
GENERALIZATION BOUNDS; ALGORITHMS; RANKING;
DOI
Not available
CLC number
TP18 [Artificial Intelligence Theory];
Discipline classification codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Pairwise learning is an important topic in the machine learning community, where the loss function involves pairs of samples (e.g., AUC maximization and metric learning). Existing pairwise learning algorithms do not achieve generality, scalability, and efficiency simultaneously. To address these challenges, in this paper we first analyze the relationship between statistical accuracy and the regularized empirical risk for pairwise losses. Based on this relationship, we propose a scalable and efficient adaptive doubly stochastic gradient algorithm (AdaDSG) for generalized regularized pairwise learning problems. More importantly, we prove that the overall computational cost of AdaDSG for achieving statistical accuracy on a full training set of size n is O(n), which, to the best of our knowledge, is the best theoretical result for pairwise learning. Experimental results on a variety of real-world datasets not only confirm the effectiveness of our AdaDSG algorithm, but also show that AdaDSG has significantly better scalability and efficiency than existing pairwise learning algorithms.
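To make the notion of a pairwise loss concrete, the sketch below shows a single doubly stochastic subgradient step for a regularized AUC-style pairwise hinge loss, where each update samples one random positive and one random negative example. This is an illustrative toy, not the authors' AdaDSG algorithm: the function name, the specific hinge objective, and all step-size/regularization values are assumptions for demonstration.

```python
import numpy as np

def pairwise_sgd_step(w, X, y, lr=0.01, lam=0.01, rng=None):
    """One doubly stochastic update for the regularized pairwise hinge loss
    max(0, 1 - w.(x_pos - x_neg)) + (lam / 2) * ||w||^2  (an AUC-style objective).
    'Doubly stochastic' here: both members of the pair are sampled at random,
    so each step touches only two samples instead of all O(n^2) pairs."""
    rng = rng if rng is not None else np.random.default_rng()
    pos = rng.choice(np.flatnonzero(y == 1))    # random positive example
    neg = rng.choice(np.flatnonzero(y == -1))   # random negative example
    diff = X[pos] - X[neg]
    # Subgradient: regularizer always contributes; hinge term only when the
    # pairwise margin w.(x_pos - x_neg) is below 1.
    grad = lam * w - (diff if w @ diff < 1.0 else 0.0)
    return w - lr * grad
```

Each step costs O(d) for d features, so a pass of n steps costs O(nd); this is the flavor of per-pair sampling behind the O(n) overall cost claimed for AdaDSG, though AdaDSG's adaptive scheme and its analysis are not reproduced here.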
Pages: 3697-3704
Number of pages: 8
Related papers
50 records in total
  • [31] Active pairwise distance learning for efficient labeling of large datasets by human experts
    Joris Pries
    Sandjai Bhulai
    Rob van der Mei
    Applied Intelligence, 2023, 53 : 24689 - 24708
  • [32] Co-Attention Graph Pooling for Efficient Pairwise Graph Interaction Learning
    Lee, Junhyun
    Kim, Bumsoo
    Jeon, Minji
    Kang, Jaewoo
    IEEE ACCESS, 2023, 11 : 78549 - 78560
  • [33] Efficient computation of comprehensive statistical information of large OWL datasets: a scalable approach
    Mohamed, Heba
    Fathalla, Said
    Lehmann, Jens
    Jabeen, Hajira
    ENTERPRISE INFORMATION SYSTEMS, 2023, 17 (07)
  • [34] Granular ball computing classifiers for efficient, scalable and robust learning
    Xia, Shuyin
    Liu, Yunsheng
    Ding, Xin
    Wang, Guoyin
    Yu, Hong
    Luo, Yuoguo
    INFORMATION SCIENCES, 2019, 483 : 136 - 152
  • [35] Nexus: Bringing Efficient and Scalable Training to Deep Learning Frameworks
    Wang, Yandong
    Zhang, Li
    Ren, Yufei
    Zhang, Wei
    2017 IEEE 25TH INTERNATIONAL SYMPOSIUM ON MODELING, ANALYSIS, AND SIMULATION OF COMPUTER AND TELECOMMUNICATION SYSTEMS (MASCOTS), 2017, : 12 - 21
  • [36] Sparker: Efficient Reduction for More Scalable Machine Learning with Spark
    Yu, Bowen
    Cao, Huanqi
    Shan, Tianyi
    Wang, Haojie
    Tang, Xiongchao
    Chen, Wenguang
    50TH INTERNATIONAL CONFERENCE ON PARALLEL PROCESSING, 2021,
  • [37] Auxo: Efficient Federated Learning via Scalable Client Clustering
    Liu, Jiachen
    Lai, Fan
    Dai, Yinwei
    Akella, Aditya
    Madhyastha, Harsha V.
    Chowdhury, Mosharaf
    PROCEEDINGS OF THE 2023 ACM SYMPOSIUM ON CLOUD COMPUTING, SOCC 2023, 2023, : 125 - 141
  • [38] Statistical physics of pairwise probability models
    Roudi, Yasser
    Aurell, Erik
    Hertz, John A.
    FRONTIERS IN COMPUTATIONAL NEUROSCIENCE, 2009, 3
  • [39] Pairwise statistical comparisons of multiple algorithms
    Bin-Bin Jia
    Jun-Ying Liu
    Min-Ling Zhang
    Frontiers of Computer Science, 2025, 19 (12):
  • [40] An Efficient and Scalable Deep Learning Approach for Road Damage Detection
    Naddaf-Sh, Sadra
    Naddaf-Sh, M-Mahdi
    Kashani, Amir R.
    Zargarzadeh, Hassan
    2020 IEEE INTERNATIONAL CONFERENCE ON BIG DATA (BIG DATA), 2020, : 5602 - 5608