LEARN Codes: Inventing Low-Latency Codes via Recurrent Neural Networks

Cited by: 22
Authors
Jiang, Yihan [1 ]
Kim, Hyeji [2 ]
Asnani, Himanshu [3 ]
Kannan, Sreeram [1 ]
Oh, Sewoong [4 ]
Viswanath, Pramod [5 ,6 ]
Affiliations
[1] Department of Electrical and Computer Engineering, University of Washington, Seattle, WA 98195, United States
[2] Samsung AI Center Cambridge, Cambridge CB1 2JD, United Kingdom
[3] School of Technology and Computer Science, Tata Institute of Fundamental Research, Mumbai 400005, India
[4] Allen School of Computer Science & Engineering, University of Washington, Seattle, WA 98195, United States
[5] Coordinated Science Laboratory, University of Illinois at Urbana–Champaign, Champaign, IL 61801, United States
[6] Department of Electrical Engineering, University of Illinois at Urbana–Champaign, Champaign, IL 61801, United States
Funding
U.S. National Science Foundation;
Keywords
5G mobile communication systems - Asymptotic analysis - Channel coding - Convolution - Convolutional codes - Decoding - Gaussian noise (electronic) - Network coding - White noise;
DOI
10.1109/JSAIT.2020.2988577
Abstract
Designing channel codes under low-latency constraints is one of the most demanding requirements in 5G standards. However, a sharp characterization of the performance of traditional codes is available only in the large block-length limit. Guided by such asymptotic analysis, code designs require large block lengths, and hence high latency, to achieve the desired error rate. Tail-biting convolutional codes and other recent state-of-the-art short block codes, while promising reduced latency, are neither robust to channel mismatch nor adaptive to varying channel conditions. When codes designed for one channel (e.g., the Additive White Gaussian Noise (AWGN) channel) are used for another (e.g., non-AWGN channels), heuristics are necessary to achieve non-trivial performance. In this paper, we first propose an end-to-end learned neural code, obtained by jointly designing a Recurrent Neural Network (RNN) based encoder and decoder. This code outperforms the canonical convolutional code in the block-coding setting. We then leverage this experience to propose a new class of codes under low-latency constraints, which we call Low-latency Efficient Adaptive Robust Neural (LEARN) codes. These codes outperform state-of-the-art low-latency codes and exhibit robustness and adaptivity properties. LEARN codes show the potential to design new, versatile, and universal codes for future communications via tools of modern deep learning coupled with communication-engineering insights. © 2020 IEEE.
Pages: 207 - 216
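For readers unfamiliar with the setup the abstract describes, the sketch below shows, in rough outline, how an RNN-based encoder and decoder can be trained end to end over a simulated AWGN channel. It is a minimal illustration only: the GRU layer sizes, rate-1/2 mapping, power normalization, SNR, and training loop are assumptions for exposition, not the authors' exact LEARN architecture or hyperparameters.

# Minimal end-to-end sketch (illustrative, not the authors' exact LEARN design):
# a GRU encoder maps message bits to power-normalized channel symbols, an AWGN
# channel adds noise, and a bidirectional GRU decoder recovers the bits.
import torch
import torch.nn as nn

class RNNEncoder(nn.Module):
    def __init__(self, hidden=64, rate_inv=2):
        super().__init__()
        self.rnn = nn.GRU(input_size=1, hidden_size=hidden, batch_first=True)
        self.lin = nn.Linear(hidden, rate_inv)   # 2 coded symbols per bit -> rate 1/2

    def forward(self, bits):                     # bits: (batch, block_len, 1) in {0, 1}
        h, _ = self.rnn(2.0 * bits - 1.0)        # map bits to +/-1 before encoding
        x = self.lin(h)
        # normalize so the codeword satisfies an average power constraint
        return (x - x.mean()) / (x.std() + 1e-8)

class RNNDecoder(nn.Module):
    def __init__(self, hidden=64, rate_inv=2):
        super().__init__()
        self.rnn = nn.GRU(input_size=rate_inv, hidden_size=hidden,
                          batch_first=True, bidirectional=True)
        self.lin = nn.Linear(2 * hidden, 1)

    def forward(self, y):                        # y: noisy symbols (batch, block_len, 2)
        h, _ = self.rnn(y)
        return self.lin(h)                       # one logit per message bit

def train_step(enc, dec, opt, batch=256, block_len=100, snr_db=0.0):
    bits = torch.randint(0, 2, (batch, block_len, 1)).float()
    x = enc(bits)
    sigma = 10 ** (-snr_db / 20)                 # AWGN noise std for unit signal power
    y = x + sigma * torch.randn_like(x)
    logits = dec(y)
    loss = nn.functional.binary_cross_entropy_with_logits(logits, bits)
    opt.zero_grad(); loss.backward(); opt.step()
    return loss.item()

enc, dec = RNNEncoder(), RNNDecoder()
opt = torch.optim.Adam(list(enc.parameters()) + list(dec.parameters()), lr=1e-3)
for step in range(10):                           # toy loop; real training runs far longer
    train_step(enc, dec, opt)

Note that the bidirectional decoder above sees the whole block, which corresponds to the block-coding setting; the paper's low-latency LEARN codes additionally constrain how far the encoder and decoder may look ahead.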