All You Need Is Feedback: Communication with Block Attention Feedback Codes

Cited by: 8
Authors
Ozfatura E. [1 ]
Shao Y. [1 ]
Perotti A.G. [2 ]
Popovic B.M. [2 ]
Gunduz D. [1 ]
Affiliations
[1] Imperial College London, Information Processing and Communications Lab, Department of Electrical and Electronic Engineering, London
[2] Huawei Technologies Sweden AB, Radio Transmission Technology Lab, Kista
Funding
European Union Horizon 2020;
Keywords
attention mechanism; channel coding; deep learning; deep neural networks; feedback code; self-attention; transformer; ultra-reliable short-packet communications;
DOI
10.1109/JSAIT.2022.3223901
Abstract
Deep neural network (DNN)-based channel code designs have recently gained interest as an alternative to conventional coding schemes, particularly for channels in which existing codes do not provide satisfactory performance. Coding in the presence of feedback is one such problem, for which promising results have recently been obtained by various DNN-based coding architectures. In this paper, we introduce a novel learning-aided feedback code design, dubbed generalized block attention feedback (GBAF) codes, that achieves orders-of-magnitude improvements in block error rate (BLER) compared to existing solutions. Sequence-to-sequence encoding and block-by-block processing of the message bits are the two key design principles of GBAF codes; they not only reduce the communication overhead, owing to fewer interactions between the transmitter and the receiver, but also enable flexible coding rates. GBAF codes also have a modular structure that can be implemented using different neural network architectures. In this work, we employ the popular transformer architecture, which outperforms all prior DNN-based code designs in terms of BLER in the low signal-to-noise ratio regime when the feedback channel is noiseless. © 2022 IEEE.
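To make the design principles in the abstract concrete, below is a minimal, self-contained PyTorch sketch of block-by-block, sequence-to-sequence encoding with a transformer backbone. All module names, dimensions, and the feedback model here are illustrative assumptions for exposition, not the authors' exact GBAF architecture.

# A minimal sketch of the block-by-block feedback-coding idea described in
# the abstract. All sizes and the knowledge-vector layout are assumptions,
# NOT the authors' exact GBAF implementation.
import torch
import torch.nn as nn

class FeedbackBlockEncoder(nn.Module):
    """Maps a sequence of bit blocks, plus past transmissions and feedback,
    to one new channel symbol per block (sequence-to-sequence encoding)."""
    def __init__(self, block_size=4, d_model=32, n_heads=4, n_layers=2):
        super().__init__()
        # Each "knowledge vector" stacks a block's bits, what was already
        # sent for that block, and the feedback received so far (assumed).
        self.embed = nn.Linear(3 * block_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, n_heads,
                                           dim_feedforward=64,
                                           batch_first=True)
        self.backbone = nn.TransformerEncoder(layer, n_layers)
        self.head = nn.Linear(d_model, 1)  # one parity symbol per block

    def forward(self, bits, sent, feedback):
        # bits, sent, feedback: (batch, n_blocks, block_size)
        x = torch.cat([bits, sent, feedback], dim=-1)
        h = self.backbone(self.embed(x))   # self-attention across blocks
        return torch.tanh(self.head(h))    # (batch, n_blocks, 1)

# Toy forward pass: K = 8 message bits split into 2 blocks of 4.
torch.manual_seed(0)
enc = FeedbackBlockEncoder()
bits = torch.randint(0, 2, (1, 2, 4)).float() * 2 - 1  # BPSK-mapped bits
sent = torch.zeros(1, 2, 4)   # past transmissions (none yet)
fb = torch.zeros(1, 2, 4)     # noiseless feedback of past symbols
symbols = enc(bits, sent, fb)
print(symbols.shape)          # torch.Size([1, 2, 1])

In this toy setup, each interaction round would append the newly transmitted symbols and the corresponding feedback to the knowledge vectors before the next forward pass, so fewer, larger rounds directly translate into lower interaction overhead.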
Pages: 587-602 (15 pages)