Inductive Representation Learning on Feature Rich Complex Networks for Churn Prediction in Telco

Cited by: 1
Authors
Oskarsdottir, Maria [1 ]
Cornette, Sander [2 ]
Deseure, Floris [2 ]
Baesens, Bart [2 ,3 ]
Affiliations
[1] Reykjavik Univ, Dept Comp Sci, Menntavegi 1, IS-101 Reykjavik, Iceland
[2] Katholieke Univ Leuven, Dept Decis Sci & Informat Management, Naamsestr 69, B-3000 Leuven, Belgium
[3] Univ Southampton, Dept Decis Analyt & Risk, Southampton, Hants, England
Keywords
Call network; Churn prediction; Representation learning; Supervised learning;
DOI
10.1007/978-3-030-36687-2_70
CLC Number
TP39 [Computer Applications];
Discipline Classification Codes
081203; 0835
Abstract
In the mobile telecommunication industry, call networks have been used with great success to predict customer churn. These social networks are complex and rich in features, because telecommunication operators hold extensive information about their customers. In this paper we leverage a novel framework called GraphSAGE for inductive representation learning on networks, with the goal of predicting customer churn. The technique has an advantage over previously proposed representation learning techniques because it incorporates node features in the learning process. It also offers a supervised learning process, which can be used to predict churn directly, as well as an unsupervised variant which produces an embedding. We study how the number of node features impacts the predictive performance of the churn models, as well as the benefit of the complete, end-to-end learning process compared to combining the unsupervised embedding with standard supervised machine learning techniques. Finally, we compare the performance of GraphSAGE to that of standard local models.
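To make the approach described in the abstract concrete, the sketch below shows how a GraphSAGE-style node classifier could be set up for churn prediction on a call network. This is not the authors' code: the use of PyTorch Geometric, the GraphSAGEChurn class, the toy call network, and all hyperparameters are illustrative assumptions, intended only to show the general shape of a supervised GraphSAGE churn model.

```python
# A minimal sketch (not the paper's implementation): a two-layer GraphSAGE
# node classifier for churn prediction, using PyTorch Geometric.
# The call network, node features and churn labels below are random placeholders.
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import SAGEConv

# Toy call network: 100 subscribers, 16 node features (e.g. usage statistics),
# 400 directed call edges, binary churn labels. All values are synthetic.
num_nodes, num_features = 100, 16
x = torch.randn(num_nodes, num_features)
edge_index = torch.randint(0, num_nodes, (2, 400))
y = torch.randint(0, 2, (num_nodes,))
data = Data(x=x, edge_index=edge_index, y=y)

class GraphSAGEChurn(torch.nn.Module):
    """Two-layer GraphSAGE encoder followed by a churn/no-churn output layer."""
    def __init__(self, in_channels, hidden_channels, num_classes=2):
        super().__init__()
        self.conv1 = SAGEConv(in_channels, hidden_channels)
        self.conv2 = SAGEConv(hidden_channels, hidden_channels)
        self.classifier = torch.nn.Linear(hidden_channels, num_classes)

    def forward(self, x, edge_index):
        # Each SAGEConv layer aggregates feature information from a node's
        # neighbours; because the model learns aggregation functions rather
        # than per-node embeddings, it is inductive and can score customers
        # that were not seen during training.
        h = F.relu(self.conv1(x, edge_index))
        h = F.relu(self.conv2(h, edge_index))
        return self.classifier(h)

model = GraphSAGEChurn(num_features, hidden_channels=32)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

model.train()
for epoch in range(50):
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)
    loss = F.cross_entropy(out, data.y)   # supervised churn objective
    loss.backward()
    optimizer.step()
```

The unsupervised variant mentioned in the abstract would instead train the two aggregation layers with a link-based loss and feed the resulting embeddings into a separate supervised classifier; that variant is omitted from this sketch.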
Pages: 845-853
Number of pages: 9
Related Papers
50 records in total
  • [21] A novel deep learning model based on convolutional neural networks for employee churn prediction
    Ozmen, Ebru Pekel
    Ozcan, Tuncay
    JOURNAL OF FORECASTING, 2022, 41 (03) : 539 - 550
  • [22] Inductive Representation Learning in Temporal Networks via Mining Neighborhood and Community Influences
    Liu, Meng
    Liu, Yong
    SIGIR '21 - PROCEEDINGS OF THE 44TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, 2021, : 2202 - 2206
  • [23] Detecting Fake News Spreaders in Social Networks using Inductive Representation Learning
    Rath, Bhavtosh
    Salecha, Aadesh
    Srivastava, Jaideep
    2020 IEEE/ACM INTERNATIONAL CONFERENCE ON ADVANCES IN SOCIAL NETWORKS ANALYSIS AND MINING (ASONAM), 2020, : 182 - 189
  • [24] Inductive Representation Learning via CNN for Partially-Unseen Attributed Networks
    Zhao, Zhongying
    Zhou, Hui
    Qi, Liang
    Chang, Liang
    Zhou, MengChu
    IEEE TRANSACTIONS ON NETWORK SCIENCE AND ENGINEERING, 2021, 8 (01): : 695 - 706
  • [25] Distance-Aware Learning for Inductive Link Prediction on Temporal Networks
    Pan, Zhiqiang
    Cai, Fei
    Liu, Xinwang
    Chen, Honghui
    IEEE TRANSACTIONS ON NEURAL NETWORKS AND LEARNING SYSTEMS, 2025, 36 (01) : 978 - 990
  • [26] tcc2vec: RFM-informed representation learning on call graphs for churn prediction
    Mitrovic, Sandra
    Baesens, Bart
    Lemahieu, Wilfried
    De Weerdt, Jochen
    INFORMATION SCIENCES, 2021, 557 : 270 - 285
  • [27] Representation Learning Based on Path Selection in Complex Networks
    Liu, Q.-X.
    Long, H.
    Zheng, P.-X.
    Beijing Institute of Technology, (40) : 282 - 289
  • [28] Model Optimization Analysis of Customer Churn Prediction Using Machine Learning Algorithms with Focus on Feature Reductions
    Mirabdolbaghi, Seyed Mohammad Sina
    Amiri, Babak
    DISCRETE DYNAMICS IN NATURE AND SOCIETY, 2022, 2022
  • [29] Feature-rich networks: going beyond complex network topologies
    Interdonato, Roberto
    Atzmueller, Martin
    Gaito, Sabrina
    Kanawati, Rushed
    Largeron, Christine
    Sala, Alessandra
    APPLIED NETWORK SCIENCE, 2019, 4 (01) : 1 - 13