Learning the Long-term Memory Effect of Power Amplifiers Using Temporal Convolutional Network

Cited by: 0
Authors
Akram, Iqra [1 ]
Ma, Yi [1 ]
He, Ziming [2 ]
Tong, Fei [2 ]
Affiliations
[1] Univ Surrey, Inst Commun Syst, 5G & 6G Innovat Ctr, Guildford GU2 7XH, Surrey, England
[2] Samsung Elect, Samsung Cambridge Solut Ctr, Syst LSI, Cambridge CB4 0DS, England
Keywords
Power amplifier (PA); digital pre-distortion (DPD); neural network (NN); temporal convolutional network (TCN); real-valued time-delay temporal convolutional network (RVTDTCN); long-term memory effects; 5G
DOI
10.1109/VTC2023-Fall60731.2023.10333622
CLC number
TP [automation and computer technology]
Discipline code
0812
Abstract
The objective of this work is to address one of the recent ITU AI/machine-learning challenges, specifically RF power amplifier (PA) behavioral modelling. This paper presents a novel model, the real-valued time-delay temporal convolutional network (RVTDTCN), to learn the long-term memory effect of PAs in 5G base-station transmitters. The proposed model employs dilated convolutions to incorporate information from multiple time steps and capture intricate temporal dependencies, without requiring additional coefficients. In our experiments, the ITU ML5G-PS-007 datasets collected from commercial 5G base-stations are used for both training and testing. The performance of the learned PA behavior is evaluated using both the normalized mean square error (NMSE) and the adjacent channel error power ratio (ACEPR). The results show that the RVTDTCN model achieves performance competitive with state-of-the-art feedforward neural network (FNN) models while reducing the computational complexity by 60%. Moreover, the proposed model outperforms the state-of-the-art convolutional neural network (CNN) model by 2-3 dB in NMSE and ACEPR at a similar complexity.
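The record does not include the authors' implementation; as an illustrative sketch of the dilated-convolution mechanism the abstract describes, the minimal NumPy example below shows how stacking causal 1-D convolutions with growing dilation enlarges the temporal receptive field (i.e., the span of past samples each output depends on) without adding kernel coefficients per layer. All function and variable names here are assumptions for illustration, not the RVTDTCN itself.

```python
import numpy as np

def causal_dilated_conv1d(x, w, dilation=1):
    """Causal dilated 1-D convolution: y[t] depends only on
    x[t], x[t-d], x[t-2d], ... (sequence left-padded with zeros)."""
    k = len(w)
    pad = (k - 1) * dilation
    xp = np.concatenate([np.zeros(pad), x])
    return np.array([
        sum(w[i] * xp[t + pad - i * dilation] for i in range(k))
        for t in range(len(x))
    ])

# Stacking layers with dilations 1, 2, 4, ... grows the receptive
# field exponentially while each layer keeps the same kernel size.
x = np.arange(8, dtype=float)
w = np.array([0.5, 0.5])                       # 2-tap kernel
y1 = causal_dilated_conv1d(x, w, dilation=1)   # receptive field: 2 samples
y2 = causal_dilated_conv1d(y1, w, dilation=2)  # receptive field: 4 samples
```

With a kernel of size k and dilations doubling per layer, L layers cover roughly (k - 1)(2^L - 1) + 1 past samples, which is how a TCN can model long-term PA memory without extra coefficients.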
Pages: 6