Optimal and Efficient Distributed Online Learning for Big Data

Cited by: 3
Authors:
Sayin, Muhammed O. [1 ]
Vanli, N. Denizcan [1 ]
Delibalta, Ibrahim [2 ,3 ]
Kozat, Suleyman S. [1 ]
Affiliations:
[1] Bilkent Univ, Dept Elect & Elect Engn, Ankara, Turkey
[2] AVEA Commun Serv Inc, AveaLabs, Istanbul, Turkey
[3] Koc Univ, Grad Sch Social Sci & Humanities, Istanbul, Turkey
Keywords:
distributed processing; online learning; optimal and efficient; static state estimation; Big Data; smart grid; DIFFUSION STRATEGIES; STATE ESTIMATION; CONSENSUS; NETWORKS; SCHEME;
DOI
10.1109/BigDataCongress.2015.27
CLC Classification:
TP301 [Theory and Methods];
Subject Classification Code:
081202;
Abstract:
We propose optimal and efficient distributed online learning strategies for Big Data applications. Here, we consider optimal state estimation over a distributed network of autonomous data sources. The autonomous data sources can generate and process data locally, irrespective of any centralized control unit. We seek to enhance the learning rate through the distributed control of these autonomous data sources. We emphasize that although this problem has attracted significant attention and has been extensively studied in different fields, including the services computing and machine learning disciplines, all well-known strategies achieve suboptimal online learning performance in the mean-square-error sense. To this end, we introduce the oracle algorithm as the optimal distributed online learning strategy. We also propose an optimal and efficient distributed online learning algorithm that reduces the communication load tremendously, i.e., it requires the undirected disclosure of only a single scalar. Finally, we demonstrate the significant performance gains of the proposed strategies with respect to state-of-the-art approaches.
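As a rough illustration of the setting described in the abstract, and not the paper's oracle or optimal algorithm, the sketch below runs a simple LMS-style distributed estimator in which each node adapts on its own streaming data and, per iteration, discloses only a single scalar (a projection of its local estimate onto a commonly known random direction) to its neighbors. The network model, step sizes, and all names are illustrative assumptions.

```python
# Hypothetical sketch (not the algorithm proposed in the paper): distributed
# online estimation of a static parameter vector over a network of nodes.
# Each node runs a local LMS update on its own stream and shares only a
# single scalar with its neighbors per iteration.
import numpy as np

rng = np.random.default_rng(0)
p, n_nodes, n_iter = 5, 10, 2000
w_true = rng.standard_normal(p)                      # static state to estimate

# Random undirected network (assumed connectivity model)
adjacency = rng.random((n_nodes, n_nodes)) < 0.3
adjacency = np.triu(adjacency, 1)
adjacency = adjacency | adjacency.T

w = np.zeros((n_nodes, p))                           # local estimates
mu, eta = 0.01, 0.05                                 # assumed step sizes

for t in range(n_iter):
    # Common random direction, assumed known to all nodes (e.g., shared seed)
    direction = rng.standard_normal(p)
    direction /= np.linalg.norm(direction)

    for k in range(n_nodes):
        x = rng.standard_normal(p)                   # node k's regressor
        d = x @ w_true + 0.1 * rng.standard_normal() # noisy local observation
        w[k] += mu * (d - x @ w[k]) * x              # local LMS adaptation

    # Single-scalar exchange: node k discloses only direction @ w[k]
    scalars = w @ direction
    for k in range(n_nodes):
        nbrs = np.flatnonzero(adjacency[k])
        if nbrs.size:
            # Nudge the local estimate toward the neighborhood average
            # along the shared direction only
            w[k] += eta * (scalars[nbrs].mean() - scalars[k]) * direction

mse = np.mean(np.sum((w - w_true) ** 2, axis=1))
print(f"average MSE across nodes after {n_iter} steps: {mse:.4f}")
```

The scalar exchange here stands in for the paper's low-communication idea only in spirit: each node transmits one number per iteration instead of its full parameter vector.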
Pages: 126-133
Number of pages: 8