Multitask-Learning-Based Deep Neural Network for Automatic Modulation Classification

Cited by: 59
Authors
Chang, Shuo [1 ]
Huang, Sai [1 ]
Zhang, Ruiyun [1 ]
Feng, Zhiyong [1 ]
Liu, Liang [2 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Minist Educ, Key Lab Universal Wireless Commun, Beijing 100876, Peoples R China
[2] Beijing Univ Posts & Telecommun, Beijing Key Lab Intelligent Telecommun Software &, Beijing 100876, Peoples R China
Funding
National Natural Science Foundation of China; Beijing Natural Science Foundation;
Keywords
Feature extraction; Signal to noise ratio; Deep learning; Task analysis; Modulation; Internet of Things; Long short term memory; Automatic modulation classification (AMC); bidirectional gated recurrent unit (BiGRU); convolutional neural network (CNN); deep neural network; multitask learning; step attention fusion network (SAFN); IDENTIFICATION; PERFORMANCE; LSTM;
DOI
10.1109/JIOT.2021.3091523
CLC Classification Number
TP [Automation Technology, Computer Technology];
Discipline Code
0812;
Abstract
Automatic modulation classification (AMC) identifies the modulation type of a received signal and plays a vital role in ensuring physical-layer security for Internet of Things (IoT) networks. Inspired by the great success of deep learning in pattern recognition, convolutional neural networks (CNNs) and recurrent neural networks (RNNs) have been introduced into AMC. Two data formats are popular in AMC: the in-phase/quadrature (I/Q) representation and the amplitude/phase (A/P) representation. However, most AMC algorithms focus on structural innovations, leaving the differences and characteristics of I/Q and A/P unanalyzed. In this article, many popular AMC algorithms are reproduced and evaluated on the same data set using I/Q and A/P, respectively, for comparison. The experimental results show that: 1) CNN-RNN-like algorithms using A/P as input data are superior to those using I/Q at high signal-to-noise ratio (SNR), while the opposite holds at low SNR and 2) the features extracted from I/Q and A/P are complementary to each other. Motivated by these findings, a multitask learning-based deep neural network (MLDNN) is proposed, which effectively fuses I/Q and A/P. The MLDNN also has a novel backbone made up of three blocks that extract discriminative features: a CNN block, a bidirectional gated recurrent unit (BiGRU) block, and a step attention fusion network (SAFN) block. Unlike most CNN-RNN-like algorithms, which use only the last step outputs of the RNN, the MLDNN effectively utilizes all step outputs of the BiGRU with the help of the SAFN. Extensive simulations verify that the proposed MLDNN achieves superior performance on the public benchmark.
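Two ingredients of the abstract can be sketched concretely: the I/Q-to-A/P conversion that produces the two complementary input formats, and an attention-weighted fusion over all RNN step outputs in the spirit of the SAFN (rather than keeping only the last step). The sketch below is a minimal numpy illustration, not the paper's implementation; the attention vector `w` stands in for a learned parameter and is purely hypothetical.

```python
import numpy as np

def iq_to_ap(iq):
    """Convert I/Q samples of shape (N, 2) to the A/P representation:
    amplitude = sqrt(I^2 + Q^2), phase = arctan2(Q, I)."""
    i, q = iq[:, 0], iq[:, 1]
    amp = np.sqrt(i ** 2 + q ** 2)
    phase = np.arctan2(q, i)          # quadrant-aware phase in (-pi, pi]
    return np.stack([amp, phase], axis=1)

def step_attention_fusion(h, w):
    """Fuse all T step outputs h (shape (T, d)) into one (d,) vector.
    Scores come from a hypothetical learned vector w (shape (d,));
    a softmax over time steps weights each step's contribution, so no
    step output is discarded (unlike last-step-only RNN readouts)."""
    scores = h @ w                    # (T,) unnormalized attention scores
    alpha = np.exp(scores - scores.max())
    alpha /= alpha.sum()              # softmax over the T time steps
    return alpha @ h                  # attention-weighted sum of steps
```

With `w` set to zeros the softmax is uniform and the fusion reduces to the mean of all step outputs, which makes the contrast with a last-step-only readout easy to see.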
Pages: 2192-2206
Page count: 15
Related Papers (50 total)
  • [41] Deep geometric convolutional network for automatic modulation classification
    Li, Rundong
    Song, Chengtian
    Song, Yuxuan
    Hao, Xiaojun
    Yang, Shuyuan
    Song, Xiyu
    [J]. SIGNAL IMAGE AND VIDEO PROCESSING, 2020, 14 (06) : 1199 - 1205
  • [43] Automatic Defect Classification for Infrared Thermography in CFRP based on Deep Learning Dense Convolutional Neural Network
    Liu, Guozeng
    Gao, Weicheng
    Liu, Wei
    Chen, Yijiao
    Wang, Tianlong
    Xie, Yongzhi
    Bai, Weiliang
    Li, Zijing
    [J]. JOURNAL OF NONDESTRUCTIVE EVALUATION, 2024, 43 (03)
  • [44] A Hybrid Neural Network for Fast Automatic Modulation Classification
    Lin, Rendeng
    Ren, Wenjuan
    Sun, Xian
    Yang, Zhanpeng
    Fu, Kun
    [J]. IEEE ACCESS, 2020, 8 : 130314 - 130322
  • [45] Automatic Modulation Classification with Genetic Backpropagation Neural Network
    Zhou, Qianlin
    Lu, Hui
    Jia, Liwei
    Mao, Kefei
    [J]. 2016 IEEE CONGRESS ON EVOLUTIONARY COMPUTATION (CEC), 2016, : 4626 - 4633
  • [46] Chromosome Classification with Convolutional Neural Network based Deep Learning
    Zhang, Wenbo
    Song, Sifan
    Bai, Tianming
    Zhao, Yanxin
    Ma, Fei
    Su, Jionglong
    Yu, Limin
    [J]. 2018 11TH INTERNATIONAL CONGRESS ON IMAGE AND SIGNAL PROCESSING, BIOMEDICAL ENGINEERING AND INFORMATICS (CISP-BMEI 2018), 2018,
  • [47] Automatic Modulation Classification: A Deep Learning Enabled Approach
    Meng, Fan
    Chen, Peng
    Wu, Lenan
    Wang, Xianbin
    [J]. IEEE TRANSACTIONS ON VEHICULAR TECHNOLOGY, 2018, 67 (11) : 10760 - 10772
  • [48] Hardware Implementation of Automatic Modulation Classification with Deep Learning
    Kumar, Satish
    Singh, Anurag
    Mahapatra, Rajarshi
    [J]. 13TH IEEE INTERNATIONAL CONFERENCE ON ADVANCED NETWORKS AND TELECOMMUNICATION SYSTEMS (IEEE ANTS), 2019,
  • [49] A Hybrid Deep Learning Model for Automatic Modulation Classification
    Kim, Seung-Hwan
    Moon, Chang-Bae
    Kim, Jae-Woo
    Kim, Dong-Seong
    [J]. IEEE WIRELESS COMMUNICATIONS LETTERS, 2022, 11 (02) : 313 - 317
  • [50] Low-Rank Deep Convolutional Neural Network for Multitask Learning
    Su, Fang
    Shang, Hai-Yang
    Wang, Jing-Yan
    [J]. COMPUTATIONAL INTELLIGENCE AND NEUROSCIENCE, 2019, 2019