Scalable learning of large networks

Cited by: 1
Authors
Roy, S. [1 ]
Plis, S. [1 ]
Werner-Washburne, M. [2 ]
Lane, T. [1 ]
Affiliations
[1] Univ New Mexico, Dept Comp Sci, Albuquerque, NM 87131 USA
[2] Univ New Mexico, Dept Biol, Albuquerque, NM 87131 USA
Keywords
NONQUIESCENT CELLS; QUIESCENT; TIME
DOI
10.1049/iet-syb.2008.0161
Chinese Library Classification
Q2 [Cell Biology]
Discipline Classification Codes
071009; 090102
Abstract
Cellular networks inferred from condition-specific microarray data can capture the functional rewiring of cells in response to different environmental conditions. Unfortunately, many algorithms for inferring cellular networks do not scale to whole-genome data with thousands of variables. We propose a novel approach for scalable learning of large networks: cluster and infer networks (CIN). CIN learns network structures in two steps: (a) partition variables into smaller clusters, and (b) learn networks per cluster. We optionally revisit the cluster assignment of variables with poor neighbourhoods. Results on networks with known topologies suggest that CIN has substantial speed benefits without substantial performance loss. We applied our approach to microarray compendia of glucose-starved yeast cells. The inferred networks had a significantly higher number of subgraphs representing meaningful biological dependencies than random graphs. Analysis of subgraphs identified biological processes that agreed well with existing information about yeast populations under glucose starvation, and also implicated novel pathways not previously known to be associated with these populations.
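The abstract only outlines the two-step cluster-and-infer idea; the record contains no code. The following is a minimal illustrative sketch of that idea under stated assumptions: k-means is used as a stand-in for the partitioning step and a simple correlation-threshold network as a stand-in for the per-cluster network learner (the paper's actual clustering and structure-learning methods, and the optional reassignment of poorly connected variables, are not reproduced here). All function and parameter names are hypothetical.

```python
# Sketch of the cluster-and-infer (CIN) idea from the abstract.
# Assumptions (not from the paper): k-means partitions the variables,
# and absolute-correlation thresholding stands in for the per-cluster
# network learner. The optional cluster-reassignment step is omitted.
import numpy as np
from sklearn.cluster import KMeans


def cluster_and_infer(expr, n_clusters=10, corr_threshold=0.6, seed=0):
    """expr: (n_genes, n_conditions) expression matrix.
    Returns {cluster_id: (gene_indices, adjacency_matrix)}."""
    # Step (a): partition variables (genes) into smaller clusters.
    labels = KMeans(n_clusters=n_clusters, n_init=10,
                    random_state=seed).fit_predict(expr)

    networks = {}
    for c in range(n_clusters):
        genes = np.flatnonzero(labels == c)
        if genes.size < 2:
            continue
        # Step (b): learn a network over this cluster's variables only,
        # so the cost grows with cluster size rather than genome size.
        corr = np.corrcoef(expr[genes])          # gene-gene correlation
        adj = (np.abs(corr) > corr_threshold).astype(int)
        np.fill_diagonal(adj, 0)                 # no self-edges
        networks[c] = (genes, adj)
    return networks


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    expr = rng.standard_normal((200, 30))        # toy 200 genes x 30 conditions
    nets = cluster_and_infer(expr, n_clusters=5)
    for c, (genes, adj) in nets.items():
        print(f"cluster {c}: {genes.size} genes, {adj.sum() // 2} edges")
```

The point of the sketch is the cost structure: each per-cluster learner sees only its own variables, which is what gives the approach its scalability to thousands of genes.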
Pages: 404 / U157
Page count: 16