All You Need Is Feedback: Communication with Block Attention Feedback Codes

Cited by: 8
Authors
Ozfatura E. [1 ]
Shao Y. [1 ]
Perotti A.G. [2 ]
Popovic B.M. [2 ]
Gunduz D. [1 ]
Affiliations
[1] Imperial College London, Information Processing and Communications Lab, Department of Electrical and Electronic Engineering, London
[2] Huawei Technologies Sweden AB, Radio Transmission Technology Lab, Kista
Funding
European Union Horizon 2020
Keywords
attention mechanism; channel coding; deep learning; deep neural networks; feedback code; self-attention; transformer; ultra-reliable short-packet communications
DOI
10.1109/JSAIT.2022.3223901
Abstract
Deep neural network (DNN)-based channel code designs have recently gained interest as an alternative to conventional coding schemes, particularly for channels in which existing codes do not provide satisfactory performance. Coding in the presence of feedback is one such problem, for which promising results have recently been obtained by various DNN-based coding architectures. In this paper, we introduce a novel learning-aided feedback code design, dubbed generalized block attention feedback (GBAF) codes, that achieves orders-of-magnitude improvements in block error rate (BLER) compared to existing solutions. Sequence-to-sequence encoding and block-by-block processing of the message bits are the two important design principles of GBAF codes, which not only reduce the communication overhead, thanks to fewer interactions between the transmitter and the receiver, but also enable flexible coding rates. GBAF codes also have a modular structure that can be implemented using different neural network architectures. In this work, we employ the popular transformer architecture, which outperforms all prior DNN-based code designs in terms of BLER in the low signal-to-noise ratio regime when the feedback channel is noiseless. © 2020 IEEE.
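The two design principles highlighted in the abstract lend themselves to a brief illustration. The sketch below (Python/PyTorch) is not the authors' implementation: it only shows, under assumed dimensions and an assumed feedback format, how message bits could be split into blocks and passed through a transformer encoder whose self-attention operates across blocks, with previously received feedback concatenated to each block before the next batch of channel symbols is produced. The class name `BlockFeedbackEncoder`, all hyperparameters, and the unit-power normalization are illustrative assumptions.

```python
# Minimal, illustrative sketch of block-by-block, attention-based feedback
# encoding (not the GBAF reference implementation). All names and sizes
# are assumptions chosen for clarity.
import torch
import torch.nn as nn

class BlockFeedbackEncoder(nn.Module):
    def __init__(self, block_size=4, d_model=32, n_layers=2, n_heads=4):
        super().__init__()
        self.block_size = block_size
        # Each block of message bits, concatenated with the feedback received
        # for that block, is embedded into a d_model-dimensional token.
        self.embed = nn.Linear(2 * block_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        # One real-valued channel symbol per block per interaction round.
        self.to_symbol = nn.Linear(d_model, 1)

    def forward(self, bits, feedback):
        # bits:     (batch, k) message bits in {0, 1}
        # feedback: (batch, k) feedback observations aligned with the bits
        #           (all zeros in the first interaction round).
        b = bits.size(0)
        blocks = bits.view(b, -1, self.block_size)      # (batch, k/bs, bs)
        fb = feedback.view(b, -1, self.block_size)
        tokens = self.embed(torch.cat([blocks, fb], dim=-1))
        ctx = self.encoder(tokens)                       # self-attention across blocks
        x = self.to_symbol(ctx).squeeze(-1)              # (batch, k/bs) symbols
        # Normalize to unit average transmit power.
        return x / x.pow(2).mean(dim=-1, keepdim=True).sqrt()

# Example: k = 16 message bits, block size 4 -> 4 blocks, 4 symbols per round.
enc = BlockFeedbackEncoder()
bits = torch.randint(0, 2, (8, 16)).float()
symbols = enc(bits, torch.zeros_like(bits))              # first interaction round
```

In the actual scheme the encoder would be applied over several interaction rounds and trained jointly with a decoder; the sketch covers only a single encoding pass to illustrate how block-wise processing and self-attention fit together.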
Pages: 587-602
Number of pages: 15
Related Papers
50 items in total
  • [41] Is attention to bounding boxes all you need for pedestrian action prediction?
    Achaji, Lina
    Moreau, Julien
    Fouqueray, Thibault
    Aioun, Francois
    Charpillet, Francois
    2022 IEEE INTELLIGENT VEHICLES SYMPOSIUM (IV), 2022, : 895 - 902
  • [42] NEED FOR FEEDBACK TO AUTHORS
Gardos, G.
    AMERICAN JOURNAL OF PSYCHIATRY, 1973, 130 (07): : 824 - 825
  • [43] Space-Time Block Codes with Limited Feedback Using Antenna Grouping
    Chae, Chan-Byoung
    Shim, Seijoon
    Heath, Robert W., Jr.
    IEICE TRANSACTIONS ON COMMUNICATIONS, 2008, E91B (10) : 3387 - 3390
  • [44] Performance Bounds for Erasure, List, and Decision Feedback Schemes With Linear Block Codes
    Hof, Eran
    Sason, Igal
    Shamai, Shlomo
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2010, 57 (08) : 3754 - 3778
  • [45] The Truncated Transmission of Spinal Codes with Imperfect Feedback in Block-Fading Channel
    Li, Zhiyuan
    Liu, Rongke
    Duan, Reifeng
    Hou, Yi
    2015 INTERNATIONAL CONFERENCE ON WIRELESS COMMUNICATIONS & SIGNAL PROCESSING (WCSP), 2015,
  • [46] Error Exponents for variable-length block codes with feedback and cost constraints
    Nakiboglu, Baris
    Gallager, Robert G.
    Win, Moe Z.
    2006 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, VOLS 1-6, PROCEEDINGS, 2006, : 74 - +
  • [47] Error exponents for variable-length block codes with feedback and cost constraints
    Nakiboglu, Baris
    Gallager, Robert G.
    IEEE TRANSACTIONS ON INFORMATION THEORY, 2008, 54 (03) : 945 - 963
  • [48] Limited feedback unitary precoding for orthogonal space-time block codes
    Love, DJ
    Heath, RW
    IEEE TRANSACTIONS ON SIGNAL PROCESSING, 2005, 53 (01) : 64 - 73
  • [49] Sphere-packing Bound for Block-codes with Feedback and Finite Memory
    Como, Giacomo
    Nakiboglu, Baris
    2010 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY, 2010, : 251 - 255
  • [50] Are you getting the feedback you deserve?
    Koonce, R
    TRAINING & DEVELOPMENT, 1998, 52 (07): : 18 - 18