All You Need Is Feedback: Communication with Block Attention Feedback Codes

Cited by: 8
Authors
Ozfatura E. [1 ]
Shao Y. [1 ]
Perotti A.G. [2 ]
Popovic B.M. [2 ]
Gunduz D. [1 ]
Affiliations
[1] Imperial College London, Information Processing and Communications Lab, Department of Electrical and Electronic Engineering, London
[2] Huawei Technologies Sweden AB, Radio Transmission Technology Lab, Kista
Funding
European Union Horizon 2020;
Keywords
attention mechanism; channel coding; deep learning; deep neural networks; feedback code; self-attention; transformer; ultra-reliable short-packet communications;
DOI
10.1109/JSAIT.2022.3223901
Abstract
Deep neural network (DNN)-based channel code designs have recently gained interest as an alternative to conventional coding schemes, particularly for channels in which existing codes do not provide satisfactory performance. Coding in the presence of feedback is one such problem, for which promising results have recently been obtained by various DNN-based coding architectures. In this paper, we introduce a novel learning-aided feedback code design, dubbed generalized block attention feedback (GBAF) codes, that achieves orders-of-magnitude improvements in block error rate (BLER) compared to existing solutions. Sequence-to-sequence encoding and block-by-block processing of the message bits are the two important design principles of the GBAF codes, which not only reduce the communication overhead, thanks to fewer interactions between the transmitter and the receiver, but also enable flexible coding rates. GBAF codes also have a modular structure that can be implemented using different neural network architectures. In this work, we employ the popular transformer architecture, which outperforms all the prior DNN-based code designs in terms of BLER in the low signal-to-noise ratio regime when the feedback channel is noiseless. © 2020 IEEE.
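To make the block-by-block design principle concrete, the sketch below illustrates one way a transformer encoder could map message-bit blocks, previously transmitted symbols, and noisy feedback to the next channel symbol per block. This is a minimal illustrative sketch, not the authors' implementation: the class name, layer sizes, block length, and number of interaction rounds are all assumptions chosen for readability.

```python
# Minimal sketch of a block-attention feedback encoder (illustrative only;
# not the GBAF architecture from the paper). At each interaction round, a
# transformer self-attends across message blocks, conditioning on the bits,
# the symbols already sent, and the noisy feedback, to emit one new symbol
# per block. All dimensions below are illustrative assumptions.

import torch
import torch.nn as nn

class BlockFeedbackEncoder(nn.Module):
    def __init__(self, block_len=4, n_rounds=9, d_model=32, n_heads=2, n_layers=2):
        super().__init__()
        # Per-block input: message bits + past transmitted symbols + past feedback.
        in_dim = block_len + 2 * (n_rounds - 1)
        self.embed = nn.Linear(in_dim, d_model)
        layer = nn.TransformerEncoderLayer(
            d_model, n_heads, dim_feedforward=64, batch_first=True
        )
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.to_symbol = nn.Linear(d_model, 1)  # one real symbol per block per round

    def forward(self, bits, past_tx, past_fb):
        # bits:    (batch, n_blocks, block_len), entries in {0, 1}
        # past_tx: (batch, n_blocks, n_rounds - 1), zero-padded transmitted symbols
        # past_fb: (batch, n_blocks, n_rounds - 1), zero-padded noisy feedback
        x = torch.cat([2 * bits - 1, past_tx, past_fb], dim=-1)
        h = self.encoder(self.embed(x))       # self-attention across blocks
        return self.to_symbol(h).squeeze(-1)  # next channel symbol for every block

if __name__ == "__main__":
    enc = BlockFeedbackEncoder()
    bits = torch.randint(0, 2, (8, 16, 4)).float()  # 8 codewords, 16 blocks of 4 bits
    past_tx = torch.zeros(8, 16, 8)
    past_fb = torch.zeros(8, 16, 8)
    print(enc(bits, past_tx, past_fb).shape)        # torch.Size([8, 16])
```

In this reading, fewer interaction rounds (each producing one symbol per block) directly translate into the reduced communication overhead mentioned in the abstract, while the block partitioning sets the coding rate.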
Pages: 587-602
Page count: 15
Related papers
50 in total
  • [31] Orthogonal Space-Time Block Codes for Analog Channel Feedback
    Chen, Jinhui
    Slock, Dirk T. M.
    2008 IEEE INTERNATIONAL SYMPOSIUM ON INFORMATION THEORY PROCEEDINGS, VOLS 1-6, 2008, : 1473 - 1477
  • [32] Feedback Strategies for Online Fountain Codes With Limited Feedback
    Cai, Peixiang
    Zhang, Yu
    Wu, Yichen
    Chang, Xiaohua
    Pan, Changyong
    IEEE COMMUNICATIONS LETTERS, 2020, 24 (09) : 1870 - 1874
  • [33] Attention is all you need: utilizing attention in AI-enabled drug discovery
    Zhang, Yang
    Liu, Caiqi
    Liu, Mujiexin
    Liu, Tianyuan
    Lin, Hao
    Huang, Cheng-Bing
    Ning, Lin
    BRIEFINGS IN BIOINFORMATICS, 2024, 25 (01)
  • [34] "Click it, when you need it": On-Demand Feedback for Online Settings
    Topali, Paraskevi
    Hilgemann, Rene
    Chounta, Irene-Angelica
    30TH INTERNATIONAL CONFERENCE ON COMPUTERS IN EDUCATION, ICCE 2022, VOL 2, 2022, : 641 - 643
  • [35] Attention is not all you need: pure attention loses rank doubly exponentially with depth
    Dong, Yihe
    Cordonnier, Jean-Baptiste
    Loukas, Andreas
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [36] Diversified SLT Codes Based on Feedback for Communication over Wireless Networks
    Zhang, Lei
    Liao, Jianxin
    Wang, Jingyu
    Qi, Qi
    Xu, Tong
    Tian, Shengwen
    Liao, Minyan
    2013 GLOBAL INFORMATION INFRASTRUCTURE SYMPOSIUM, 2013,
  • [38] Attention And Positional Encoding Are (Almost) All You Need For Shape Matching
    Raganato, Alessandro
    Pasi, Gabriella
    Melzi, Simone
    COMPUTER GRAPHICS FORUM, 2023, 42 (05)
  • [39] Is Space-Time Attention All You Need for Video Understanding?
    Bertasius, Gedas
    Wang, Heng
    Torresani, Lorenzo
    INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 139, 2021, 139
  • [40] Fake News Spreaders Detection: Sometimes Attention Is Not All You Need
    Siino, Marco
    Di Nuovo, Elisa
    Tinnirello, Ilenia
    La Cascia, Marco
    INFORMATION, 2022, 13 (09)