All You Need Is Feedback: Communication with Block Attention Feedback Codes

Cited by: 8
Authors
Ozfatura E. [1 ]
Shao Y. [1 ]
Perotti A.G. [2 ]
Popovic B.M. [2 ]
Gunduz D. [1 ]
Affiliations
[1] Imperial College London, Information Processing and Communications Lab, Department of Electrical and Electronic Engineering, London
[2] Huawei Technologies Sweden AB, Radio Transmission Technology Lab, Kista
Funding
European Union Horizon 2020
Keywords
attention mechanism; channel coding; deep learning; deep neural networks; feedback codes; self-attention; transformer; ultra-reliable short-packet communications
DOI
10.1109/JSAIT.2022.3223901
Abstract
Deep neural network (DNN)-based channel code designs have recently gained interest as an alternative to conventional coding schemes, particularly for channels in which existing codes do not provide satisfactory performance. Coding in the presence of feedback is one such problem, for which promising results have recently been obtained by various DNN-based coding architectures. In this paper, we introduce a novel learning-aided feedback code design, dubbed generalized block attention feedback (GBAF) codes, that achieves orders-of-magnitude improvements in block error rate (BLER) compared to existing solutions. Sequence-to-sequence encoding and block-by-block processing of the message bits are the two key design principles of GBAF codes, which not only reduce the communication overhead, due to fewer interactions between the transmitter and receiver, but also enable flexible coding rates. GBAF codes also have a modular structure that can be implemented using different neural network architectures. In this work, we employ the popular transformer architecture, which outperforms all prior DNN-based code designs in terms of BLER in the low signal-to-noise ratio regime when the feedback channel is noiseless. © 2020 IEEE.
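The abstract's two design principles (block-by-block processing and repeated transmitter-receiver interaction rounds yielding a flexible rate) can be illustrated with a minimal, hypothetical sketch. The function name, the placeholder per-block "encoder", and all parameters below are illustrative assumptions, not the paper's actual GBAF implementation, which uses a trained transformer in place of the placeholder statistic:

```python
import random

def simulate_gbaf_style_exchange(message_bits, block_size, rounds,
                                 noise_std=0.5, seed=0):
    """Split K message bits into blocks and run a fixed number of
    interaction rounds, one channel use per block per round.
    Returns the effective coding rate K / (total channel uses)."""
    rng = random.Random(seed)
    k = len(message_bits)
    # Block-by-block processing: partition the K message bits into blocks.
    blocks = [message_bits[i:i + block_size] for i in range(0, k, block_size)]
    channel_uses = 0
    feedback = [0.0] * len(blocks)  # receiver observations fed back per block
    for _ in range(rounds):
        for b, block in enumerate(blocks):
            # Placeholder "encoder": an antipodal sum of the block's bits
            # combined with the feedback so far; the real scheme learns
            # this mapping with a DNN (transformer).
            symbol = sum(2 * bit - 1 for bit in block) + 0.1 * feedback[b]
            received = symbol + rng.gauss(0.0, noise_std)  # forward AWGN channel
            feedback[b] = received  # noiseless feedback, as in the paper's setup
            channel_uses += 1
    # Flexible rate: k / (rounds * number_of_blocks); fewer rounds or
    # larger blocks raise the rate.
    return k / channel_uses
```

For example, 8 message bits in blocks of 2 over 3 interaction rounds give 4 blocks x 3 rounds = 12 channel uses, i.e., a rate of 8/12; changing `rounds` or `block_size` changes the rate without changing the protocol structure.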
Pages: 587–602 (15 pages)
Related papers
50 results
  • [21] ATTENTION IN A LITTLE NETWORK IS ALL YOU NEED TO GO GREEN
    Dewan, Dipayan
    Borthakur, Anupam
    Sheet, Debdoot
    2023 IEEE 20TH INTERNATIONAL SYMPOSIUM ON BIOMEDICAL IMAGING, ISBI, 2023,
  • [22] A Transcription Is All You Need: Learning to Align Through Attention
    Torras, Pau
    Ali Souibgui, Mohamed
    Chen, Jialuo
    Fornes, Alicia
    DOCUMENT ANALYSIS AND RECOGNITION, ICDAR 2021 WORKSHOPS, PT I, 2021, 12916 : 141 - 146
  • [24] Channel Attention Is All You Need for Video Frame Interpolation
    Choi, Myungsub
    Kim, Heewon
    Han, Bohyung
    Xu, Ning
    Lee, Kyoung Mu
    THIRTY-FOURTH AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, THE THIRTY-SECOND INNOVATIVE APPLICATIONS OF ARTIFICIAL INTELLIGENCE CONFERENCE AND THE TENTH AAAI SYMPOSIUM ON EDUCATIONAL ADVANCES IN ARTIFICIAL INTELLIGENCE, 2020, 34 : 10663 - 10671
  • [25] Equity-premium prediction: Attention is all you need
    Lima, Luiz Renato
    Godeiro, Lucas Lucio
    JOURNAL OF APPLIED ECONOMETRICS, 2023, 38 (01) : 105 - 122
  • [26] Looking at CTR Prediction Again: Is Attention All You Need?
    Cheng, Yuan
    Xue, Yanbo
    SIGIR '21 - PROCEEDINGS OF THE 44TH INTERNATIONAL ACM SIGIR CONFERENCE ON RESEARCH AND DEVELOPMENT IN INFORMATION RETRIEVAL, 2021, : 1279 - 1287
  • [27] Yes, "Attention Is All You Need", for Exemplar based Colorization
    Yin, Wang
    Lu, Peng
    Zhao, Zhaoran
    Peng, Xujun
    PROCEEDINGS OF THE 29TH ACM INTERNATIONAL CONFERENCE ON MULTIMEDIA, MM 2021, 2021, : 2243 - 2251
  • [28] Rateless Feedback Codes
    Sorensen, Jesper H.
    Koike-Akino, Toshiaki
    Orlik, Philip
    2012 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS (ISIT), 2012,
  • [29] Limited feedback precoding for orthogonal space-time block codes
    Love, DJ
    Heath, RW
    GLOBECOM '04: IEEE GLOBAL TELECOMMUNICATIONS CONFERENCE, VOLS 1-6, 2004, : 561 - 565
  • [30] Performance Bounds for Erasure, List and Feedback Schemes with Linear Block Codes
    Hof, Eran
    Sason, Igal
    Shamai, Shlomo
    2009 IEEE INFORMATION THEORY WORKSHOP (ITW 2009), 2009, : 303 - 307