ADMoE: Anomaly Detection with Mixture-of-Experts from Noisy Labels

Cited: 0
Authors
Zhao, Yue [1 ,2 ,3 ]
Zheng, Guoqing [2 ]
Mukherjee, Subhabrata [2 ]
McCann, Robert [2 ]
Awadallah, Ahmed [2 ]
Affiliations
[1] Carnegie Mellon Univ, Pittsburgh, PA 15213 USA
[2] Microsoft, Redmond, WA 98052 USA
[3] Microsoft Res, Redmond, WA USA
Keywords
DOI
None available
CLC Number
TP18 [Artificial Intelligence Theory];
Discipline Codes
081104; 0812; 0835; 1405;
Abstract
Existing works on anomaly detection (AD) rely on clean labels from human annotators, which are expensive to acquire in practice. In this work, we propose a method that leverages weak/noisy labels (e.g., risk scores generated by machine rules for detecting malware), which are cheaper to obtain, for anomaly detection. Specifically, we propose ADMoE, the first framework that enables anomaly detection algorithms to learn from noisy labels. In a nutshell, ADMoE leverages a Mixture-of-Experts (MoE) architecture to encourage specialized and scalable learning from multiple noisy sources. It captures the similarities among noisy labels by sharing most model parameters, while encouraging specialization by building "expert" sub-networks. To further extract signal from the noisy labels, ADMoE uses them as input features to facilitate expert learning. Extensive results on eight datasets (including a proprietary enterprise security dataset) demonstrate the effectiveness of ADMoE, which brings up to a 34% performance improvement over not using it. It also outperforms a total of 13 leading baselines with equivalent network parameters and FLOPS. Notably, ADMoE is model-agnostic, enabling any neural-network-based detection method to handle noisy labels; we showcase its results on both a multi-layer perceptron (MLP) and the leading AD method DeepSAD.
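To make the abstract's three architectural ideas concrete (most parameters shared across noisy sources, "expert" sub-networks mixed by a gating network, and noisy labels concatenated onto the input features), here is a minimal PyTorch sketch. It is not the authors' implementation: the class name, layer sizes, soft gating, and training setup are illustrative assumptions.

import torch
import torch.nn as nn

class ADMoESketch(nn.Module):
    # Sketch of the abstract's ideas: (1) a shared backbone so most
    # parameters are shared across noisy sources, (2) "expert" sub-networks
    # combined by a gating network for specialization, and (3) noisy labels
    # concatenated onto the input features.
    def __init__(self, n_features, n_noisy_sources, n_experts=4, hidden=64):
        super().__init__()
        in_dim = n_features + n_noisy_sources  # noisy labels as extra inputs
        self.backbone = nn.Sequential(
            nn.Linear(in_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
        )
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(hidden, hidden), nn.ReLU(),
                          nn.Linear(hidden, 1))
            for _ in range(n_experts)
        ])
        self.gate = nn.Linear(hidden, n_experts)  # soft routing over experts

    def forward(self, x, noisy_labels):
        h = self.backbone(torch.cat([x, noisy_labels], dim=-1))
        weights = torch.softmax(self.gate(h), dim=-1)             # (B, E)
        scores = torch.cat([e(h) for e in self.experts], dim=-1)  # (B, E)
        return (weights * scores).sum(dim=-1)  # one anomaly score per sample

# Toy usage: 256 samples, 10 features, binary labels from 3 noisy sources.
x = torch.randn(256, 10)
noisy = torch.randint(0, 2, (256, 3)).float()
model = ADMoESketch(n_features=10, n_noisy_sources=3)
print(model(x, noisy).shape)  # torch.Size([256])

One plausible way to encourage expert specialization in this sketch would be to train the mixed score against each noisy source with a per-source loss; the paper's exact objective is not reproduced here.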
Pages: 4937-4945
Page count: 9
Related Papers
50 in total (first 10 shown)
  • [1] Spatial Mixture-of-Experts
    Dryden, Nikoli
    Hoefler, Torsten
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [2] Asymptotic properties of mixture-of-experts models
    Olteanu, M.
    Rynkiewicz, J.
    [J]. NEUROCOMPUTING, 2011, 74 (09) : 1444 - 1449
  • [3] Mixture-of-Experts with Expert Choice Routing
    Zhou, Yanqi
    Lei, Tao
    Liu, Hanxiao
    Du, Nan
    Huang, Yanping
    Zhao, Vincent Y.
    Dai, Andrew
    Chen, Zhifeng
    Le, Quoc
    Laudon, James
    [J]. ADVANCES IN NEURAL INFORMATION PROCESSING SYSTEMS 35 (NEURIPS 2022), 2022,
  • [4] Efficient Routing in Sparse Mixture-of-Experts
    Shamsolmoali, Pourya
    [J]. Institute of Electrical and Electronics Engineers Inc.
  • [5] MoDE: A Mixture-of-Experts Model with Mutual Distillation among the Experts
    Xie, Zhitian
    Zhang, Yinger
    Zhuang, Chenyi
    Shi, Qitao
    Liu, Zhining
    Gu, Jinjie
    Zhang, Guannan
    [J]. THIRTY-EIGHTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, VOL 38 NO 14, 2024 : 16067 - 16075
  • [6] A Universal Approximation Theorem for Mixture-of-Experts Models
    Nguyen, Hien D.
    Lloyd-Jones, Luke R.
    McLachlan, Geoffrey J.
    [J]. NEURAL COMPUTATION, 2016, 28 (12) : 2585 - 2593
  • [7] Semi-supervised mixture-of-experts classification
    Karakoulas, G
    Salakhutdinov, R
    [J]. FOURTH IEEE INTERNATIONAL CONFERENCE ON DATA MINING, PROCEEDINGS, 2004, : 138 - 145
  • [8] A mixture-of-experts framework for adaptive Kalman filtering
    Chaer, WS
    Bishop, RH
    Ghosh, J
    [J]. IEEE TRANSACTIONS ON SYSTEMS MAN AND CYBERNETICS PART B-CYBERNETICS, 1997, 27 (03) : 452 - 464
  • [9] A Multilevel Mixture-of-Experts Framework for Pedestrian Classification
    Enzweiler, Markus
    Gavrila, Dariu M.
    [J]. IEEE TRANSACTIONS ON IMAGE PROCESSING, 2011, 20 (10) : 2967 - 2979
  • [10] Collaborative Mixture-of-Experts Model for Multi-Domain Fake News Detection
    Zhao, Jian
    Zhao, Zisong
    Shi, Lijuan
    Kuang, Zhejun
    Liu, Yazhou
    [J]. ELECTRONICS, 2023, 12 (16)