Learning From Weights: Cost-Sensitive Approach For Retrieval

Cited by: 0
Authors
Begwani, Nikit [1 ]
Harsola, Shrutendra [1 ,2 ]
Agrawal, Rahul [1 ]
Affiliations
[1] Microsoft AI & R, Redmond, WA 98052 USA
[2] Intuit AI, Mountain View, CA USA
Keywords
Neural Networks; Cost-Sensitive Learning; Sponsored Ads; Model
DOI
10.1145/3371158.3371178
CLC Number
TP301 [Theory and Methods]
Discipline Code
081202
Abstract
In sponsored search, the top few relevant ads must be retrieved from millions of ad copies within milliseconds. Running deep learning or interaction models online is computationally expensive and hence infeasible for retrieval, yet cost-effective online retrieval models based on representation learning have received little attention. In this paper we discuss one such improvement: incorporating cost-sensitive training. Online retrieval models trained on click-through data treat each clicked query-document pair as equivalent. While training on click-through data is reasonable, this paper argues that it is sub-optimal because of the data's noisy and long-tail nature (especially in sponsored search). We discuss the impact of including or disregarding long-tail pairs in the training set, and we propose a weighting-based strategy with which semantic representations can be learned for tail pairs without compromising retrieval quality. Online A/B testing on live search-engine traffic showed improvements in clicks (11.8% higher CTR) as well as in quality (8.2% lower bounce rate) compared to the unweighted model. To demonstrate the efficacy of the approach, we also ran offline experiments with a Bi-LSTM-based representation model.
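The abstract does not specify the exact weighting scheme, only that clicked query-ad pairs should not all count equally. As a hypothetical sketch (the function `pair_weight`, the sublinear exponent `alpha`, and the logistic loss are all assumptions, not the paper's method), a cost-sensitive training loss could weight each clicked pair by a sublinear function of its click count, so that head pairs do not drown out the long tail:

```python
import math

def pair_weight(click_count, alpha=0.5):
    # Hypothetical sublinear weight: a pair clicked 100x the tail pair
    # gets only 10x the influence (for alpha = 0.5), preserving tail signal.
    return click_count ** alpha

def weighted_loss(similarities, labels, click_counts, alpha=0.5):
    """Cost-sensitive logistic loss over query-ad similarity scores.

    similarities: raw model scores (e.g. scaled cosine similarities)
    labels:       1 for clicked (positive) pairs, 0 for negatives
    click_counts: observed click frequency of each pair in the log
    """
    total, norm = 0.0, 0.0
    for s, y, c in zip(similarities, labels, click_counts):
        w = pair_weight(c, alpha)
        p = 1.0 / (1.0 + math.exp(-s))           # sigmoid of the score
        total += w * -(y * math.log(p) + (1 - y) * math.log(1 - p))
        norm += w
    return total / norm                           # weight-normalized mean
```

Setting `alpha = 0` recovers the unweighted baseline the abstract compares against, while `alpha = 1` weights pairs by raw click counts; intermediate values trade off head accuracy against tail coverage.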
Pages: 170-174
Page count: 5