Secure and Efficient Federated Transfer Learning

Cited by: 0
Authors
Sharma, Shreya [1 ]
Xing, Chaoping [2 ,3 ]
Liu, Yang [4 ]
Kang, Yan [4 ]
Affiliations
[1] Indian Inst Technol BHU Varanasi, Dept Elect Engn, Varanasi, Uttar Pradesh, India
[2] Shanghai Jiao Tong Univ, Sch Elect Informat & Elect Engn, Shanghai, Peoples R China
[3] Nanyang Technol Univ, Sch Phys & Math Sci, Singapore, Singapore
[4] Webank, Shenzhen, Peoples R China
Keywords
DOI: not available
CLC Classification
TP18 [Artificial Intelligence Theory];
Subject Classification Codes
081104; 0812; 0835; 1405;
Abstract
Machine learning models require vast amounts of data for accurate training. In practice, most data is scattered across different organizations and cannot easily be integrated due to legal and practical constraints. Federated Transfer Learning (FTL) was introduced in [1] to improve statistical models under a data federation, allowing knowledge to be shared without compromising user privacy and enabling complementary knowledge to be transferred across the network. As a result, a target-domain party can build more flexible and powerful models by leveraging rich labels from a source-domain party. However, the excessive computational overhead of the security protocol involved in this model rendered it impractical. In this work, we aim to enhance the efficiency and security of existing models for practical collaborative training under a data federation by incorporating Secret Sharing (SS). In the literature, only the semi-honest model for Federated Transfer Learning has been considered. In this paper, we improve upon the previous solution and additionally allow malicious players who may arbitrarily deviate from the protocol in our FTL model. This is a much stronger guarantee than the semi-honest model, in which parties are assumed to follow the protocol precisely. We do so using SPDZ, a practical MPC protocol, so our model can be efficiently extended to any number of parties, even in the case of a dishonest majority. In addition, the models evaluated in our setting significantly outperform previous work in terms of both runtime and communication cost. A single iteration of our model executes in 0.8 seconds in the semi-honest case and 1.4 seconds in the malicious case for 500 samples, compared to 35 seconds for the previous implementation.
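The secret-sharing machinery summarized in the abstract can be illustrated with a minimal sketch. The snippet below shows additive secret sharing over a prime field, the basic primitive underlying SPDZ-style MPC; the modulus, party count, and function names are illustrative assumptions and do not come from the paper's implementation.

```python
import secrets

# Illustrative only: additive secret sharing over a prime field, the basic
# building block behind SPDZ-style MPC. The modulus and party count are
# arbitrary choices for this example, not values taken from the paper.
P = 2**61 - 1  # a Mersenne prime used here as the field modulus

def share(x: int, n_parties: int = 3) -> list[int]:
    """Split x into n additive shares such that sum(shares) % P == x % P."""
    shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
    shares.append((x - sum(shares)) % P)
    return shares

def reconstruct(shares: list[int]) -> int:
    """Recover the secret by summing all shares modulo P."""
    return sum(shares) % P

# Linear operations can be performed locally on shares: each party adds its
# shares of x and y to obtain a share of x + y, with no communication.
x_shares = share(42)
y_shares = share(100)
sum_shares = [(sx + sy) % P for sx, sy in zip(x_shares, y_shares)]
assert reconstruct(sum_shares) == 142
```

Multiplying two shared values, by contrast, requires interaction (typically via precomputed Beaver triples), and SPDZ additionally attaches information-theoretic MACs to each share so that arbitrary deviations by malicious parties are detected; this is, in broad strokes, how the malicious security claimed in the abstract is usually obtained.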
Pages: 2569 - 2576
Number of pages: 8
Related Papers (50 in total)
  • [1] Liu, Yang; Kang, Yan; Xing, Chaoping; Chen, Tianjian; Yang, Qiang. A Secure Federated Transfer Learning Framework. IEEE Intelligent Systems, 2020, 35(4): 70-82.
  • [2] Liu, Tao; Wang, Zhi; He, Hui; Shi, Wei; Lin, Liangliang; An, Ran; Li, Chenhao. Efficient and Secure Federated Learning for Financial Applications. Applied Sciences-Basel, 2023, 13(10).
  • [3] Deng, Jieren; Wang, Chenghong; Meng, Xianrui; Wang, Yijue; Li, Ji; Lin, Sheng; Han, Shuo; Miao, Fei; Rajasekaran, Sanguthevar; Ding, Caiwen. A Secure and Efficient Federated Learning Framework for NLP. 2021 Conference on Empirical Methods in Natural Language Processing (EMNLP 2021), 2021: 7676-7682.
  • [4] Dong, Ye; Chen, Xiaojun; Shen, Liyan; Wang, Dakui. EaSTFLy: Efficient and secure ternary federated learning. Computers & Security, 2020, 94.
  • [5] Putra, Made Adi Paramartha; Rachmawati, Syifa Maliah; Abisado, Mideth; Sampedro, Gabriel Avelino. HFTL: Hierarchical Federated Transfer Learning for Secure and Efficient Fault Classification in Additive Manufacturing. IEEE Access, 2023, 11: 54795-54807.
  • [6] Li, Yinglong; Zhang, Zhenjiang; Zhang, Zhiyuan; Kao, Yi-Chih. Secure Federated Learning with Efficient Communication in Vehicle Network. Journal of Internet Technology, 2020, 21(7): 2075-2084.
  • [7] Miao, Yinbin; Xie, Rongpeng; Li, Xinghua; Liu, Zhiquan; Choo, Kim-Kwang Raymond; Deng, Robert H. Efficient and Secure Federated Learning Against Backdoor Attacks. IEEE Transactions on Dependable and Secure Computing, 2024, 21(5): 4619-4636.
  • [8] Song, Cheng; Wang, Zhichao; Peng, Weiping; Yang, Nannan. Secure and Efficient Federated Learning Schemes for Healthcare Systems. Electronics, 2024, 13(13).
  • [9] Ergun, Irem; Sami, Hasin Us; Guler, Basak. Communication-Efficient Secure Aggregation for Federated Learning. 2022 IEEE Global Communications Conference (GLOBECOM 2022), 2022: 3881-3886.
  • [10] Liu, Wei; Zhang, Yinghui; Han, Gang; Cao, Jin; Cui, Hui; Zheng, Dong. Secure and Efficient Smart Healthcare System Based on Federated Learning. International Journal of Intelligent Systems, 2023, 2023.