Classification, Denoising, and Deinterleaving of Pulse Streams With Recurrent Neural Networks

Cited by: 104
Authors
Liu, Zhang-Meng [1 ]
Yu, Philip S. [2 ]
Affiliations
[1] Natl Univ Def Technol, State Key Lab Complex Electromagnet Environm Effe, Changsha 410073, Hunan, Peoples R China
[2] Univ Illinois, Dept Comp Sci, Chicago, IL 60607 USA
Funding
US National Science Foundation;
Keywords
IMPROVED ALGORITHM; RADAR; PREDICTION; SIGNALS; TRAINS; GAME; GO;
DOI
10.1109/TAES.2018.2874139
CLC number
V [Aeronautics and Astronautics];
Discipline classification code
08 ; 0825 ;
Abstract
Pulse streams from many emitters exhibit flexible features and complicated patterns, which makes them difficult to identify or further process from a purely statistical perspective. In this paper, we introduce recurrent neural networks (RNNs) to mine and exploit long-term temporal patterns in such streams and to solve the problems of sequential pattern classification, denoising, and deinterleaving of pulse streams. The RNNs learn temporal patterns from previously collected streams of known classes via supervised learning. The learned patterns are stored in the trained RNNs, which can then be used to recognize patterns-of-interest in test streams and assign them to different classes, and to predict the features of upcoming pulses from the features of preceding ones. Because the predicted features contain sufficient information for distinguishing pulses-of-interest from noise or interfering pulses, they are then used to denoise and deinterleave noise-contaminated and interleaved streams. Detailed descriptions of the methods, together with illustrative simulation results, are presented to explain the procedures and behaviors of the RNNs on these problems. Statistical results show the satisfactory performance of the proposed methods.
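The predict-then-threshold idea summarized in the abstract can be illustrated with a short sketch. The snippet below is a minimal illustration, not the authors' implementation: it assumes PyTorch, a two-dimensional pulse feature vector (e.g., pulse repetition interval and carrier frequency), and a fixed Euclidean deviation threshold. The model is assumed to have been trained beforehand with a regression loss (e.g., nn.MSELoss) on clean streams of the class of interest, in line with the supervised learning described above; the class names, window size, and threshold are illustrative assumptions.

```python
import torch
import torch.nn as nn

class PulsePredictor(nn.Module):
    """GRU that maps a window of preceding pulse features to the next pulse's features."""
    def __init__(self, feat_dim=2, hidden_dim=64):
        super().__init__()
        self.rnn = nn.GRU(feat_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, feat_dim)

    def forward(self, x):                 # x: (batch, window, feat_dim)
        out, _ = self.rnn(x)
        return self.head(out[:, -1])      # predicted features of the next pulse

def filter_stream(model, stream, window=16, threshold=0.5):
    """Keep pulses whose observed features lie close to the RNN prediction;
    the rest are treated as noise or interfering pulses (denoising/deinterleaving)."""
    kept = [list(p) for p in stream[:window]]          # warm-up: accept the first pulses
    for pulse in stream[window:]:
        context = torch.tensor([kept[-window:]], dtype=torch.float32)
        with torch.no_grad():
            predicted = model(context)[0]              # prediction from preceding pulses
        observed = torch.tensor(pulse, dtype=torch.float32)
        if torch.norm(observed - predicted) < threshold:
            kept.append(list(pulse))                   # pulse-of-interest; otherwise discarded
    return kept
```

Running one trained PulsePredictor per emitter class and assigning each incoming pulse to the predictor that fits it best would sketch the deinterleaving use case in the same hypothetical setup.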
Pages: 1624 - 1639
Page count: 16
Related papers
50 records in total
  • [1] Deinterleaving of Pulse Streams With Denoising Autoencoders
    Li, Xueqiong
    Liu, Zhangmeng
    Huang, Zhitao
    IEEE TRANSACTIONS ON AEROSPACE AND ELECTRONIC SYSTEMS, 2020, 56 (06) : 4767 - 4778
  • [2] Deinterleaving Radar Pulse Train Using Neural Networks
    Erdogan, Alex
    George, Kiran
    2019 22ND IEEE INTERNATIONAL CONFERENCE ON COMPUTATIONAL SCIENCE AND ENGINEERING (IEEE CSE 2019) AND 17TH IEEE INTERNATIONAL CONFERENCE ON EMBEDDED AND UBIQUITOUS COMPUTING (IEEE EUC 2019), 2019, : 147 - 153
  • [3] Pulse Deinterleaving for Multifunction Radars With Hierarchical Deep Neural Networks
    Liu, Zhang-Meng
    IEEE TRANSACTIONS ON AEROSPACE AND ELECTRONIC SYSTEMS, 2021, 57 (06) : 3585 - 3599
  • [4] Mesh Denoising Based on Recurrent Neural Networks
    Xing, Yan
    Tan, Jieqing
    Hong, Peilin
    He, Yeyuan
    Hu, Min
    SYMMETRY-BASEL, 2022, 14 (06)
  • [5] An intelligent radar signal classification and deinterleaving method with unified residual recurrent neural network
    Al-Malahi, Abdulrahman
    Farhan, Abubaker
    Feng, HanCong
    Almaqtari, Omar
    Tang, Bin
    IET RADAR SONAR AND NAVIGATION, 2023, 17 (08) : 1259 - 1276
  • [6] COMPLEX RECURRENT NEURAL NETWORKS FOR DENOISING SPEECH SIGNALS
    Osako, Keiichi
    Singh, Rita
    Raj, Bhiksha
    2015 IEEE WORKSHOP ON APPLICATIONS OF SIGNAL PROCESSING TO AUDIO AND ACOUSTICS (WASPAA), 2015,
  • [7] Denoising of Radar Pulse Streams With Autoencoders
    Li, Xueqiong
    Liu, Zhang-Meng
    Huang, Zhitao
    IEEE COMMUNICATIONS LETTERS, 2020, 24 (04) : 797 - 801
  • [8] Convolutional Neural Networks for Noise Classification and Denoising of Images
    Sil, Dibakar
    Dutta, Arindam
    Chandra, Aniruddha
    PROCEEDINGS OF THE 2019 IEEE REGION 10 CONFERENCE (TENCON 2019): TECHNOLOGY, KNOWLEDGE, AND SOCIETY, 2019, : 447 - 451
  • [9] Signal Denoising with Recurrent Spiking Neural Networks and Active Tuning
    Ciurletti, Melvin
    Traub, Manuel
    Karlbauer, Matthias
    Butz, Martin V.
    Otte, Sebastian
    ARTIFICIAL NEURAL NETWORKS AND MACHINE LEARNING, ICANN 2021, PT V, 2021, 12895 : 220 - 232
  • [10] Image Denoising and Restoration Using Pulse Coupled Neural Networks
    Wen, Haijiao
    Wen, Jie
    2013 6TH INTERNATIONAL CONGRESS ON IMAGE AND SIGNAL PROCESSING (CISP), VOLS 1-3, 2013, : 282 - 287