A Dual Input-aware Factorization Machine for CTR Prediction

Cited by: 0
Authors
Lu, Wantong [1 ]
Yu, Yantao [1 ]
Chang, Yongzhe [1 ,2 ]
Wang, Zhen [3 ]
Li, Chenhui [1 ]
Yuan, Bo [1 ]
Affiliations
[1] Tsinghua Univ, Shenzhen Int Grad Sch, Shenzhen, Peoples R China
[2] Data61, CSIRO, Sydney, NSW, Australia
[3] Univ Sydney, UBTECH Sydney AI Ctr, Sydney, NSW, Australia
Keywords
DOI
Not available
Chinese Library Classification
TP18 [Artificial Intelligence Theory];
Discipline Classification Codes
081104 ; 0812 ; 0835 ; 1405 ;
Abstract
Factorization Machines (FMs) are a class of general predictors that work with real-valued feature vectors. They are well known for their ability to estimate model parameters under significant sparsity and have found successful applications in many areas, such as click-through rate (CTR) prediction. However, standard FMs produce only a single fixed representation for each feature across different input instances, which may limit the CTR model's expressive and predictive power. Inspired by the success of Input-aware Factorization Machines (IFMs), which aim to learn more flexible and informative representations of a given feature according to different input instances, we propose a novel model named Dual Input-aware Factorization Machines (DIFMs) that can adaptively reweight the original feature representations at the bit-wise and vector-wise levels simultaneously. Furthermore, DIFMs strategically integrate various components, including Multi-Head Self-Attention, Residual Networks and DNNs, into a unified end-to-end model. Comprehensive experiments on two real-world CTR prediction datasets show that the DIFM model consistently outperforms several state-of-the-art models.
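The core idea in the abstract, reweighting each feature's embedding per input instance before the FM interaction, can be illustrated with a minimal numpy sketch. This is not the paper's architecture (which uses Multi-Head Self-Attention, Residual Networks and DNNs to produce the weights); here the bit-wise and vector-wise reweighting factors are simply generated by hypothetical dense layers, and all sizes and weight matrices are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def fm_pairwise(V):
    """Second-order FM term: sum over field pairs i<j of <v_i, v_j>."""
    s = V.sum(axis=0)
    return 0.5 * float(s @ s - (V * V).sum())

# Toy instance: 4 active fields with embedding size 5 (hypothetical sizes).
n_fields, k = 4, 5
V = rng.normal(size=(n_fields, k))  # base embeddings (fixed per feature in a plain FM)

# Vector-wise reweighting: one scalar per field, produced from the flattened
# input by a stand-in dense layer (the paper uses attention/DNN components).
W_vec = 0.1 * rng.normal(size=(n_fields * k, n_fields))
m_vec = 1.0 + np.tanh(V.reshape(-1) @ W_vec)          # shape (n_fields,)

# Bit-wise reweighting: one scalar per embedding dimension of every field.
W_bit = 0.1 * rng.normal(size=(n_fields * k, n_fields * k))
m_bit = 1.0 + np.tanh(V.reshape(-1) @ W_bit)          # shape (n_fields * k,)

# Refined embeddings: rescaled per instance at both granularities, then fed
# to the ordinary FM interaction.
V_refined = (V * m_vec[:, None]) * m_bit.reshape(n_fields, k)

y_plain = fm_pairwise(V)         # fixed representations
y_difm = fm_pairwise(V_refined)  # input-aware representations
```

Because `m_vec` and `m_bit` depend on the input itself, the same feature contributes a different effective embedding in different instances, which is the flexibility a standard FM lacks.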
Pages: 3139-3145 (7 pages)
Related Papers
50 items in total
  • [1] An Input-aware Factorization Machine for Sparse Prediction
    Yu, Yantao
    Wang, Zhen
    Yuan, Bo
    PROCEEDINGS OF THE TWENTY-EIGHTH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2019, : 1466 - 1472
  • [2] Hierarchical Attention Factorization Machine for CTR Prediction
    Long, Lianjie
    Yin, Yunfei
    Huang, Faliang
    DATABASE SYSTEMS FOR ADVANCED APPLICATIONS, DASFAA 2022, PT II, 2022, : 343 - 358
  • [3] Field-aware Factorization Machines for CTR Prediction
    Juan, Yuchin
    Zhuang, Yong
    Chin, Wei-Sheng
    Lin, Chih-Jen
    PROCEEDINGS OF THE 10TH ACM CONFERENCE ON RECOMMENDER SYSTEMS (RECSYS'16), 2016, : 43 - 50
  • [4] Input Enhanced Logarithmic Factorization Network for CTR Prediction
    Li, Xianzhuang
    Wang, Zhen
    Wu, Xuesong
    Yuan, Bo
    Wang, Xueqian
    ADVANCES IN KNOWLEDGE DISCOVERY AND DATA MINING, PAKDD 2022, PT III, 2022, 13282 : 42 - 54
  • [5] Input-Aware Approximate Computing
    Piri, Ali
    Saeedi, Sepide
    Barbareschi, Mario
    Deveautour, Bastien
    Di Carlo, Stefano
    O'Connor, Ian
    Savino, Alessandro
    Traiola, Marcello
    Bosio, Alberto
    PROCEEDINGS OF 2022 IEEE INTERNATIONAL CONFERENCE ON AUTOMATION, QUALITY AND TESTING, ROBOTICS (AQTR 2022), 2022, : 71 - 76
  • [6] Input-Aware Dynamic Backdoor Attack
    Nguyen, Tuan Anh
    Tran, Tuan Anh
    ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 33, NEURIPS 2020, 2020, 33
  • [7] Factorization Machine Based on Bitwise Feature Importance for CTR Prediction
    Li, Hao
    Li, Caimao
    Hou, Yuquan
    Lin, Hao
    Chen, Qiuhong
    DATA SCIENCE (ICPCSEE 2022), PT I, 2022, 1628 : 29 - 40
  • [8] Adaptive Input-aware Compilation for Graphics Engines
    Samadi, Mehrzad
    Hormati, Amir
    Mehrara, Mojtaba
    Lee, Janghaeng
    Mahlke, Scott
    ACM SIGPLAN NOTICES, 2012, 47 (06) : 13 - 22
  • [9] Input-aware accuracy characterization for approximate circuits
    Piri, Ali
    Pappalardo, Salvatore
    Barone, Salvatore
    Barbareschi, Mario
    Deveautour, Bastien
    Traiola, Marcello
    O'Connor, Ian
    Bosio, Alberto
    2023 53RD ANNUAL IEEE/IFIP INTERNATIONAL CONFERENCE ON DEPENDABLE SYSTEMS AND NETWORKS WORKSHOPS, DSN-W, 2023, : 179 - 182
  • [10] HoAFM: A High-order Attentive Factorization Machine for CTR Prediction
    Tao, Zhulin
    Wang, Xiang
    He, Xiangnan
    Huang, Xianglin
    Chua, Tat-Seng
    INFORMATION PROCESSING & MANAGEMENT, 2020, 57 (06)