A Crowdsourcing Framework for On-Device Federated Learning

Cited by: 185
Authors
Pandey, Shashi Raj [1 ]
Tran, Nguyen H. [2 ]
Bennis, Mehdi [3 ,4 ]
Tun, Yan Kyaw [1 ]
Manzoor, Aunas [1 ]
Hong, Choong Seon [1 ]
Institutions
[1] Kyung Hee Univ, Dept Comp Sci & Engn, Yongin 17104, South Korea
[2] Univ Sydney, Sch Comp Sci, Sydney, NSW 2006, Australia
[3] Univ Oulu, Ctr Wireless Commun, Oulu 90014, Finland
[4] Kyung Hee Univ, Dept Comp Sci & Engn, Seoul 17104, South Korea
Funding
National Research Foundation of Singapore;
Keywords
Decentralized machine learning; federated learning (FL); mobile crowdsourcing; incentive mechanism; Stackelberg game; OPTIMIZATION;
DOI
10.1109/TWC.2020.2971981
Chinese Library Classification
TM [Electrical Engineering]; TN [Electronics and Communication Technology];
Discipline Classification Code
0808; 0809;
Abstract
Federated learning (FL) rests on the notion of training a global model in a decentralized manner. Under this setting, mobile devices perform computations on their local data before uploading the required updates to refine the global model. However, when the participating clients adopt uncoordinated computation strategies, the difficulty lies in maintaining communication efficiency (i.e., the number of communication rounds per iteration) while exchanging model parameters during aggregation. A key challenge in FL is therefore how to motivate users to participate in building a high-quality global model in a communication-efficient way. We tackle this issue by formulating a utility maximization problem and propose a novel crowdsourcing framework that leverages FL while accounting for communication efficiency during parameter exchange. First, we model the incentive-based interaction between the crowdsourcing platform and the participating clients, each of which independently chooses its strategy for training the global learning model so as to maximize its own benefit. We formulate a two-stage Stackelberg game to analyze this scenario and find the game's equilibria. Second, we formalize an admission control scheme for participating clients to ensure a target level of local accuracy. Simulation results demonstrate the efficacy of the proposed solution, with up to a 22% gain in the offered reward.
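As a rough sketch of the two-stage Stackelberg interaction described in the abstract (the notation, the linear reward term, and the utility forms below are illustrative assumptions, not the paper's exact formulation): in Stage 1 the platform announces a per-unit reward rate r, in Stage 2 each participating client k best-responds with a local accuracy level \theta_k, and the equilibrium is obtained by backward induction.

\begin{align}
\text{Stage 1 (platform):} \quad & \max_{r \ge 0} \; U(r) = g\!\big(\theta_1^*(r), \dots, \theta_K^*(r)\big) - r \sum_{k=1}^{K} \theta_k^*(r), \\
\text{Stage 2 (client } k\text{):} \quad & \theta_k^*(r) = \arg\max_{\theta_k} \; u_k(\theta_k; r) = r\,\theta_k - c_k(\theta_k),
\end{align}

where g(\cdot) is the platform's (assumed concave) valuation of the clients' achieved local accuracy levels and c_k(\cdot) is client k's (assumed convex) computation and communication cost for attaining accuracy \theta_k.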
Pages: 3241-3256
Number of pages: 16