Efficient regularized least-squares algorithms for conditional ranking on relational data

Cited by: 0
Authors
Tapio Pahikkala
Antti Airola
Michiel Stock
Bernard De Baets
Willem Waegeman
Affiliations
[1] University of Turku,Department of Information Technology and Turku Centre for Computer Science
[2] Ghent University,Department of Mathematical Modelling, Statistics and Bioinformatics
Source
Machine Learning | 2013, Vol. 93
Keywords
Reciprocal relations; Symmetric relations; Learning to rank; Kernel methods; Regularized least-squares;
DOI
Not available
Abstract
In domains like bioinformatics, information retrieval and social network analysis, one can find learning tasks where the goal consists of inferring a ranking of objects, conditioned on a particular target object. We present a general kernel framework for learning conditional rankings from various types of relational data, where rankings can be conditioned on unseen data objects. We propose efficient algorithms for conditional ranking by optimizing squared regression and ranking loss functions. We show theoretically that learning with the ranking loss is likely to generalize better than with the regression loss. Further, we prove that symmetry or reciprocity properties of relations can be efficiently enforced in the learned models. Experiments on synthetic and real-world data illustrate that the proposed methods deliver state-of-the-art performance in terms of predictive power and computational efficiency. Moreover, we show empirically that incorporating symmetry or reciprocity properties can improve the generalization performance.
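The abstract describes algorithms built on regularized least-squares (RLS) with kernels. As a toy illustration of the basic RLS machinery the paper extends (not the paper's conditional-ranking algorithm itself), the following sketch solves the dual RLS system for a linear kernel; the function name, data, and regularization value are illustrative assumptions:

```python
import numpy as np

def rls_fit(K, y, lam):
    """Minimal RLS sketch: solve (K + lam * I) a = y for dual coefficients a.

    K   -- n x n kernel (Gram) matrix
    y   -- length-n target vector
    lam -- regularization parameter (lambda > 0)
    """
    n = K.shape[0]
    return np.linalg.solve(K + lam * np.eye(n), y)

# Illustrative data: a noiseless linear target, so predictions should
# nearly interpolate when lam is small.
rng = np.random.default_rng(0)
X = rng.standard_normal((20, 3))
w_true = np.array([1.0, -2.0, 0.5])
y = X @ w_true

K = X @ X.T                      # linear kernel Gram matrix
a = rls_fit(K, y, lam=1e-6)
y_hat = K @ a                    # in-sample predictions
print(np.allclose(y_hat, y, atol=1e-3))  # → True
```

The closed-form dual solution is what makes RLS variants amenable to the efficiency tricks the paper studies (e.g. exploiting Kronecker structure of pairwise kernels), since training reduces to solving one regularized linear system.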
Pages: 321–356 (35 pages)