Efficient regularized least-squares algorithms for conditional ranking on relational data

Cited by: 25
Authors
Pahikkala, Tapio [1 ,2 ]
Airola, Antti [1 ,2 ]
Stock, Michiel [3 ]
De Baets, Bernard [3 ]
Waegeman, Willem [3 ]
Affiliations
[1] Univ Turku, Dept Informat Technol, Turku 20014, Finland
[2] Univ Turku, Turku Ctr Comp Sci, Turku 20014, Finland
[3] Univ Ghent, Dept Math Modelling Stat & Bioinformat, B-9000 Ghent, Belgium
Funding
Academy of Finland;
Keywords
Reciprocal relations; Symmetric relations; Learning to rank; Kernel methods; Regularized least-squares; NETWORK INFERENCE; KERNEL; EXAMPLES; INVERSE; MODEL; TIME;
DOI
10.1007/s10994-013-5354-7
Chinese Library Classification (CLC)
TP18 [Artificial Intelligence Theory];
Subject classification codes
081104; 0812; 0835; 1405
Abstract
In domains such as bioinformatics, information retrieval and social network analysis, one encounters learning tasks in which the goal is to infer a ranking of objects conditioned on a particular target object. We present a general kernel framework for learning conditional rankings from various types of relational data, where rankings can be conditioned on unseen data objects. We propose efficient algorithms for conditional ranking by optimizing squared regression and ranking loss functions. We show theoretically that learning with the ranking loss is likely to generalize better than with the regression loss. Further, we prove that symmetry or reciprocity properties of relations can be efficiently enforced in the learned models. Experiments on synthetic and real-world data illustrate that the proposed methods deliver state-of-the-art performance in terms of predictive power and computational efficiency. Moreover, we show empirically that incorporating symmetry or reciprocity properties can improve the generalization performance.
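The abstract describes a kernel-based regularized least-squares framework over pairs of objects. As a rough illustration of that general idea only, the following is a minimal NumPy sketch (not the authors' algorithms) combining a Kronecker product pairwise kernel with the standard dual RLS solve (G + lambda*I)a = y. All names and parameters (rbf, fit_pairwise_rls, predict, lam, gamma) are illustrative assumptions, and the naive n^2 x n^2 linear solve shown here is the kind of cost the paper's efficient algorithms are designed to avoid.

import numpy as np

def rbf(X, Z, gamma=1.0):
    """Gaussian kernel between the rows of X and Z (illustrative choice)."""
    sq = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def fit_pairwise_rls(X, Y, lam=1.0, gamma=1.0):
    """Dual RLS over all ordered object pairs with a Kronecker pairwise kernel.

    X : (n, d) features of the n base objects
    Y : (n, n) observed relation values, Y[i, j] = y(x_i, x_j)
    Returns one dual coefficient per ordered training pair.
    """
    K = rbf(X, X, gamma)                      # object-level kernel, n x n
    G = np.kron(K, K)                         # pairwise kernel, n^2 x n^2
    return np.linalg.solve(G + lam * np.eye(G.shape[0]), Y.ravel())

def predict(a, X_train, X_cond, X_rank, gamma=1.0):
    """Score candidate objects X_rank, conditioned on a single object X_cond."""
    K1 = rbf(X_cond, X_train, gamma)          # 1 x n
    K2 = rbf(X_rank, X_train, gamma)          # m x n
    return np.kron(K1, K2) @ a                # higher score = ranked higher

# Toy usage: rank the first three objects conditioned on object 0.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
Y = X @ X.T                                   # synthetic relation values
a = fit_pairwise_rls(X, Y, lam=0.1)
print(predict(a, X, X[:1], X[:3]))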
Pages: 321-356
Number of pages: 36