NOMA Codebook Optimization by Batch Gradient Descent

Cited by: 8
Authors
Si, Zhongwei [1 ]
Wen, Shaoguo [1 ]
Dong, Bing [1 ]
Affiliations
[1] Beijing Univ Posts & Telecommun, Key Lab Universal Wireless Commun, Minist Educ, Beijing 100876, Peoples R China
Funding
National Natural Science Foundation of China
Keywords
Batch gradient descent; codebook optimization; neural network; non-orthogonal multiple access; pairwise error probability;
DOI
10.1109/ACCESS.2019.2936483
Chinese Library Classification
TP [Automation & Computer Technology]
Subject Classification Code
0812
Abstract
Non-orthogonal multiple access (NOMA) has the potential to improve spectrum efficiency and user connectivity compared to orthogonal schemes. Codebook design is crucial for the performance of a NOMA system. In this paper, we comprehensively investigate NOMA codebook design involving characteristics from multiple signal domains. Minimizing the pairwise error probability is taken as the optimization objective. A neural network framework is explored for the optimization, in which the mapping functions on the edges are treated as weights. Batch gradient descent is applied to optimize the weights and, correspondingly, the codebook. Simulation results reveal that the optimized codebook significantly improves the error performance compared to schemes in the literature.
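The abstract's approach, batch gradient descent driven by a pairwise-error-probability objective, can be sketched in miniature. The snippet below is an illustrative assumption, not the authors' implementation: it descends a Chernoff-style upper bound on the pairwise error probability over AWGN, sum over pairs of exp(-||c_i - c_j||^2 / (4 N0)), for a small real-valued codebook, renormalizing total power after each full-batch step. All names (`pep_proxy_and_grad`, `optimize_codebook`, `n0`) are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def pep_proxy_and_grad(C, n0):
    """Chernoff-style PEP proxy  0.5 * sum_{i != j} exp(-|ci-cj|^2 / (4 n0))
    and its gradient with respect to the codebook C (K x D real array)."""
    diff = C[:, None, :] - C[None, :, :]           # (K, K, D) pairwise differences
    d2 = np.sum(diff ** 2, axis=-1)                # squared pairwise distances
    w = np.exp(-d2 / (4.0 * n0))                   # per-pair error-probability bound
    np.fill_diagonal(w, 0.0)                       # exclude self-pairs
    loss = 0.5 * np.sum(w)                         # each unordered pair counted once
    grad = -(w[:, :, None] * diff).sum(axis=1) / (2.0 * n0)
    return loss, grad

def optimize_codebook(C, n0=0.5, lr=0.1, steps=500):
    """Batch gradient descent: every step uses all codeword pairs,
    then projects back to unit average power per codeword."""
    C = C.copy()
    for _ in range(steps):
        _, g = pep_proxy_and_grad(C, n0)
        C -= lr * g
        C *= np.sqrt(C.shape[0] / np.sum(C ** 2))  # renormalize total power to K
    return C

K, D = 8, 2                                        # 8 codewords in 2 real dimensions
C0 = rng.standard_normal((K, D))
C0 *= np.sqrt(K / np.sum(C0 ** 2))
C1 = optimize_codebook(C0)
```

Because the gradient is accumulated over every codeword pair before a single update, this is the batch (as opposed to stochastic) flavor of gradient descent referenced in the title; the power projection plays the role of the codebook's transmit-power constraint.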
Pages: 117274 - 117281
Page count: 8
Related Papers
50 records
  • [41] Hybrid Gradient Descent for Robust Global Optimization on the Circle
    Strizic, Tom
    Poveda, Jorge I.
    Teel, Andrew R.
    [J]. 2017 IEEE 56TH ANNUAL CONFERENCE ON DECISION AND CONTROL (CDC), 2017
  • [42] A Cost-based Optimizer for Gradient Descent Optimization
    Kaoudi, Zoi
    Quiane-Ruiz, Jorge-Arnulfo
    Thirumuruganathan, Saravanan
    Chawla, Sanjay
    Agrawal, Divy
    [J]. SIGMOD'17: PROCEEDINGS OF THE 2017 ACM INTERNATIONAL CONFERENCE ON MANAGEMENT OF DATA, 2017, : 977 - 992
  • [43] ON THE PRIVACY OF NOISY STOCHASTIC GRADIENT DESCENT FOR CONVEX OPTIMIZATION
    Altschuler, Jason M.
    Bok, Jinho
    Talwar, Kunal
    [J]. SIAM JOURNAL ON COMPUTING, 2024, 53 (04) : 969 - 1001
  • [44] A DESCENT PRP CONJUGATE GRADIENT METHOD FOR UNCONSTRAINED OPTIMIZATION
    Nosratipour, H.
    Amini, K.
    [J]. TWMS JOURNAL OF APPLIED AND ENGINEERING MATHEMATICS, 2019, 9 (03): 535 - 548
  • [45] Automatic Prompt Optimization with Gradient Descent and Beam Search
    Pryzant, Reid
    Iter, Dan
    Li, Jerry
    Lee, Yin Tat
    Zhu, Chenguang
    Zeng, Michael
    [J]. EMNLP 2023 - 2023 Conference on Empirical Methods in Natural Language Processing, Proceedings, 2023, : 7957 - 7968
  • [46] Two descent hybrid conjugate gradient methods for optimization
    Zhang, Li
    Zhou, Weijun
    [J]. JOURNAL OF COMPUTATIONAL AND APPLIED MATHEMATICS, 2008, 216 (01) : 251 - 264
  • [47] Gradient Descent Optimization Algorithms for Decoding SCMA Signals
    Vidal-Beltran, Sergio
    Bonilla, Jose Luis Lopez
    Pinon, Fernando Martinez
    Yalja-Montiel, Jesus
    [J]. INTERNATIONAL JOURNAL OF COMPUTATIONAL INTELLIGENCE AND APPLICATIONS, 2021, 20 (01)
  • [48] Acceleration for Compressed Gradient Descent in Distributed and Federated Optimization
    Li, Zhize
    Kovalev, Dmitry
    Qian, Xun
    Richtarik, Peter
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 119, 2020, 119
  • [49] Evolutionary Gradient Descent for Non-convex Optimization
    Xue, Ke
    Qian, Chao
    Xu, Ling
    Fei, Xudong
    [J]. PROCEEDINGS OF THE THIRTIETH INTERNATIONAL JOINT CONFERENCE ON ARTIFICIAL INTELLIGENCE, IJCAI 2021, 2021, : 3221 - 3227
  • [50] An acceleration of gradient descent algorithm with backtracking for unconstrained optimization
    Andrei, Neculai
    [J]. Numerical Algorithms, 2006, 42 : 63 - 73