Attentive Multi-Layer Perceptron for Non-autoregressive Generation

Cited by: 0
Authors
Jiang, Shuyang [1 ]
Zhang, Jun [2 ]
Feng, Jiangtao [2 ]
Zheng, Lin [3 ]
Kong, Lingpeng [3 ]
Affiliations
[1] Shanghai Jiao Tong Univ, Shanghai, Peoples R China
[2] Shanghai Artificial Intelligence Lab, Shanghai, Peoples R China
[3] Univ Hong Kong, Hong Kong, Peoples R China
Keywords
AMLP; Multi-Layer Perceptron; Attention Mechanism; Non-Autoregressive Model; TRANSLATION;
DOI
10.1007/978-3-031-43415-0_36
CLC number
TP18 [Artificial Intelligence Theory]
Discipline codes
081104; 0812; 0835; 1405
Abstract
Autoregressive (AR) generation dominates sequence generation owing to its efficacy. Recently, non-autoregressive (NAR) generation has gained increasing popularity for its efficiency and steadily improving efficacy. However, its efficiency is still bottlenecked by the quadratic complexity of attention in sequence length, which is prohibitive for scaling to long-sequence generation, and little work has been done to mitigate this problem. In this paper, we propose a novel MLP variant, the Attentive Multi-Layer Perceptron (AMLP), which yields a generation model with linear time and space complexity. Unlike a classic MLP with static, learnable projection matrices, AMLP uses adaptive projections computed from the inputs in an attentive manner. These sample-aware adaptive projections enable communication among tokens in a sequence and model the similarity between the query and key spaces. Furthermore, we combine AMLP with popular NAR models, deriving a highly efficient NAR-AMLP architecture with linear time and space complexity. Empirical results show that this combined architecture surpasses competitive efficient NAR models by a significant margin on text-to-speech synthesis and machine translation. We also evaluate AMLP's self- and cross-attention abilities separately with extensive ablation experiments, finding them comparable or even superior to other efficient models. An efficiency analysis further shows that AMLP greatly reduces memory cost relative to vanilla non-autoregressive models on long sequences.
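
The adaptive-projection idea can be illustrated with a short sketch. The snippet below is a minimal, hypothetical PyTorch layer assuming a slot-based scheme: a small set of m learned seeds attends over the n input tokens to form sample-aware projections, and every token then reads back from those m slots, so time and memory grow as O(n*m) rather than O(n^2). The class name AttentiveMLPSketch, the slot count m_slots, and the pooling details are illustrative assumptions, not the paper's exact AMLP formulation.

    import torch
    import torch.nn as nn

    class AttentiveMLPSketch(nn.Module):
        """Hypothetical linear-complexity mixing layer (not the paper's exact AMLP)."""

        def __init__(self, d_model: int, m_slots: int = 64):
            super().__init__()
            # Learned seeds that attend over the input to build
            # sample-aware (adaptive) projections, unlike a static MLP weight.
            self.seeds = nn.Parameter(torch.randn(m_slots, d_model) / d_model ** 0.5)
            self.to_key = nn.Linear(d_model, d_model)
            self.to_val = nn.Linear(d_model, d_model)
            self.out = nn.Linear(d_model, d_model)

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            # x: (batch, n, d); every step below costs O(n * m), linear in n.
            k = self.to_key(x)                                        # (b, n, d)
            v = self.to_val(x)                                        # (b, n, d)
            # Seeds attend over all n tokens: adaptive projection weights.
            pool = torch.softmax(self.seeds @ k.transpose(1, 2), dim=-1)  # (b, m, n)
            slots = pool @ v                                          # (b, m, d)
            # Each token reads back from the m slots, so tokens communicate
            # without ever materializing an n x n attention map.
            read = torch.softmax(k @ slots.transpose(1, 2), dim=-1)  # (b, n, m)
            return self.out(read @ slots)                             # (b, n, d)

For a 10,000-token sequence with m = 64 slots, the (m, n) and (n, m) maps above hold 640,000 entries each instead of the 100,000,000 an n x n attention map would require, which is consistent with the abstract's claim of sharply reduced memory on long sequences.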
Pages: 612-629
Number of pages: 18