NOMA Codebook Optimization by Batch Gradient Descent

Cited by: 8
Authors
Si, Zhongwei [1]
Wen, Shaoguo [1]
Dong, Bing [1]
Affiliations
[1] Beijing Univ Posts & Telecommun, Key Lab Universal Wireless Commun, Minist Educ, Beijing 100876, Peoples R China
Source
IEEE ACCESS | 2019, Vol. 7
Funding
National Natural Science Foundation of China
Keywords
Batch gradient descent; codebook optimization; neural network; non-orthogonal multiple access; pairwise error probability
DOI
10.1109/ACCESS.2019.2936483
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Non-orthogonal multiple access (NOMA) has the potential to improve spectrum efficiency and user connectivity compared with orthogonal schemes. Codebook design is crucial to the performance of a NOMA system. In this paper, we comprehensively investigate NOMA codebook design involving characteristics from multiple signal domains. Minimizing the pairwise error probability is taken as the optimization target. A neural network framework is explored for the optimization, in which the mapping functions on the edges are treated as weights. Batch gradient descent is applied to optimize the weights and, correspondingly, the codebook. Simulation results reveal that the optimized codebook significantly improves the error performance compared with schemes in the literature.
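
To make the optimization concrete, below is a minimal sketch in Python/NumPy, not the authors' implementation: it optimizes a complex codebook by batch gradient descent on a Chernoff-style upper bound of the pairwise error probability over an AWGN channel. The constants M, K, N0, the learning rate, and the helper names normalize and pep_terms are illustrative assumptions, not values or functions from the paper.

    import numpy as np

    # Minimal sketch (assumed setup, not the paper's method): minimize the
    # Chernoff upper bound on the pairwise error probability,
    #     L = sum_{i<j} exp(-||x_i - x_j||^2 / (4*N0)),
    # over M complex codewords, with a unit-average-power projection.
    rng = np.random.default_rng(0)
    M, K = 4, 2        # M codewords, each a K-dimensional complex vector (assumed)
    N0 = 0.5           # assumed noise spectral density
    lr = 0.05          # assumed learning rate

    def normalize(X):
        """Project the codebook back to unit average power per codeword."""
        power = np.mean(np.sum(np.abs(X) ** 2, axis=1))
        return X / np.sqrt(power)

    def pep_terms(X):
        """Chernoff bound terms exp(-d_ij^2 / (4*N0)), zero on the diagonal."""
        diff = X[:, None, :] - X[None, :, :]          # d[i, j] = x_i - x_j
        dist2 = np.sum(np.abs(diff) ** 2, axis=2)     # squared pair distances
        w = np.exp(-dist2 / (4.0 * N0))
        np.fill_diagonal(w, 0.0)
        return w, diff

    X = normalize(rng.standard_normal((M, K)) + 1j * rng.standard_normal((M, K)))
    for step in range(2000):
        w, diff = pep_terms(X)
        # Gradient of L w.r.t. x_i: -(1/(2*N0)) * sum_j w[i,j] * (x_i - x_j);
        # the descent step therefore pushes nearby codewords apart.
        grad = -np.einsum('ij,ijk->ik', w, diff) / (2.0 * N0)
        X = normalize(X - lr * grad)                  # one batch update over all pairs

    w, _ = pep_terms(X)
    print("PEP upper bound after optimization:", w.sum() / 2.0)

Each batch update uses all codeword pairs at once, and closely spaced pairs dominate the exponential bound, so the descent spreads the codewords apart under the power constraint. The paper's actual method additionally learns the mapping functions on the edges of a neural network, which this sketch omits.
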
Pages: 117274-117281
Page count: 8
Related Papers
Showing 10 of 50
  • [1] Quantile Stein Variational Gradient Descent for Batch Bayesian Optimization
    Gong, Chengyue
    Peng, Jian
    Liu, Qiang
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019
  • [2] Adadb: Adaptive Diff-Batch Optimization Technique for Gradient Descent
    Khan, Muhammad U. S.
    Jawad, Muhammad
    Khan, Samee U.
    [J]. IEEE ACCESS, 2021, 9: 99581-99588
  • [3] GRADIENT DESCENT BATCH CLUSTERING FOR IMAGE CLASSIFICATION
    Park, Jae Sam
    [J]. IMAGE ANALYSIS & STEREOLOGY, 2023, 42(02): 133-144
  • [4] Asynchronous Mini-Batch Gradient Descent with Variance Reduction for Non-Convex Optimization
    Huo, Zhouyuan
    Huang, Heng
    [J]. THIRTY-FIRST AAAI CONFERENCE ON ARTIFICIAL INTELLIGENCE, 2017: 2043-2049
  • [5] Comparing Stochastic Gradient Descent and Mini-batch Gradient Descent Algorithms in Loan Risk Assessment
    Adigun, Abodunrin AbdulGafar
    Yinka-Banjo, Chika
    [J]. INFORMATICS AND INTELLIGENT APPLICATIONS, 2022, 1547: 283-296
  • [6] Solving Prediction Games with Parallel Batch Gradient Descent
    Grosshans, Michael
    Scheffer, Tobias
    [J]. MACHINE LEARNING AND KNOWLEDGE DISCOVERY IN DATABASES, ECML PKDD 2015, PT I, 2015, 9284: 152-167
  • [7] A Quantitative Analysis of the Effect of Batch Normalization on Gradient Descent
    Cai, Yongqiang
    Li, Qianxiao
    Shen, Zuowei
    [J]. INTERNATIONAL CONFERENCE ON MACHINE LEARNING, VOL 97, 2019
  • [8] The general inefficiency of batch training for gradient descent learning
    Wilson, D. R.
    Martinez, T. R.
    [J]. NEURAL NETWORKS, 2003, 16(10): 1429-1451
  • [9] Gain Optimization for SMCSPO with Gradient Descent
    Lee, Jin Hyeok
    Ryu, Hyeon Jae
    Lee, Min Cheol
    [J]. 2021 21ST INTERNATIONAL CONFERENCE ON CONTROL, AUTOMATION AND SYSTEMS (ICCAS 2021), 2021: 575-578
  • [10] Complexity of gradient descent for multiobjective optimization
    Fliege, J.
    Vaz, A. I. F.
    Vicente, L. N.
    [J]. OPTIMIZATION METHODS & SOFTWARE, 2019, 34(05): 949-959