Learning Supervised PageRank with Gradient-Based and Gradient-Free Optimization Methods

Cited by: 0
|
Authors
Bogolubsky, Lev [1 ,2 ]
Gusev, Gleb [1 ,6 ]
Raigorodskii, Andrei [1 ,2 ,3 ,6 ]
Tikhonov, Aleksey [1 ]
Zhukovskii, Maksim [1 ,6 ]
Dvurechensky, Pavel [4 ,5 ]
Gasnikov, Alexander [5 ,6 ]
Nesterov, Yurii [7 ,8 ]
Affiliations
[1] Yandex, Moscow, Russia
[2] Moscow MV Lomonosov State Univ, Moscow, Russia
[3] Buryat State Univ, Ulan Ude, Russia
[4] Weierstrass Inst, Berlin, Germany
[5] Inst Informat Transmiss Problems RAS, Moscow, Russia
[6] Moscow Inst Phys & Technol, Moscow, Russia
[7] Ctr Operat Res & Econometr, Louvain La Neuve, Belgium
[8] Higher Sch Econ, Moscow, Russia
Funding
Russian Science Foundation;
Keywords
DERIVATIVES;
DOI
None
CLC classification
TP18 [Artificial intelligence theory];
Discipline codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
In this paper, we consider the non-convex loss-minimization problem of learning Supervised PageRank models, which can account for features of nodes and edges. We propose gradient-based and random gradient-free methods to solve this problem. Both algorithms are based on the concept of an inexact oracle, and, unlike the state-of-the-art gradient-based method, we provide theoretical convergence-rate guarantees for each of them. Finally, we compare the performance of the proposed optimization methods with the state of the art on a ranking task.
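The gradient-free approach mentioned in the abstract can be illustrated with a minimal sketch. The edge-feature parameterization below (`transition_matrix`, the parameter vector `phi`, and the smoothing step `mu`) is purely illustrative and not the paper's actual model; the two-point random finite-difference step is the standard building block of random gradient-free methods.

```python
import numpy as np

def pagerank(P, alpha=0.15, iters=100):
    """Power iteration for the stationary distribution of a random
    walk on row-stochastic P with uniform restart probability alpha."""
    n = P.shape[0]
    pi = np.full(n, 1.0 / n)
    for _ in range(iters):
        pi = alpha / n + (1.0 - alpha) * pi @ P
    return pi

def transition_matrix(W, phi):
    """Hypothetical parameterization: each edge weight is a linear
    function of its feature vector (W has shape n x n x d),
    clipped to be positive and normalized row-wise."""
    A = np.maximum(W @ phi, 0.0) + 1e-9
    return A / A.sum(axis=1, keepdims=True)

def loss(phi, W, target):
    """Squared deviation of the model's PageRank vector from a
    target distribution (a stand-in for a ranking loss)."""
    pi = pagerank(transition_matrix(W, phi))
    return float(np.sum((pi - target) ** 2))

def gradient_free_step(phi, W, target, mu=1e-4, lr=0.2, rng=None):
    """One step of a two-point random gradient-free method: a finite
    difference along a random Gaussian direction e estimates the
    gradient of a smoothed version of the loss."""
    rng = np.random.default_rng() if rng is None else rng
    e = rng.standard_normal(phi.shape)
    g = (loss(phi + mu * e, W, target) - loss(phi, W, target)) / mu * e
    return phi - lr * g
```

Each iteration uses only two (possibly inexact) evaluations of the loss and never its gradient, which is what makes such methods attractive when PageRank vectors can only be computed approximately.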
Pages: 9