An Empirical Analysis of Generative Adversarial Network Training Times with Varying Batch Sizes

Cited by: 0
Authors
Ghosh, Bhaskar [1]
Dutta, Indira Kalyan [1]
Carlson, Albert
Totaro, Michael [1]
Bayoumi, Magdy [1]
Affiliations
[1] Univ Louisiana Lafayette, Lafayette, LA 70504 USA
Keywords
Generative Adversarial Networks; Training; Hyper-parameter; Neural Networks; Artificial Intelligence
DOI
Not available
Chinese Library Classification
TP [Automation Technology, Computer Technology]
Discipline Code
0812
Abstract
Increasing the performance of a Generative Adversarial Network (GAN) requires experimentation in choosing suitable training hyper-parameters, namely the learning rate and batch size. There is no consensus on learning rates or batch sizes for GANs, which makes obtaining acceptable output a "trial-and-error" process, and researchers hold differing views on how batch size affects run time. This paper investigates the impact of these training hyper-parameters on the actual elapsed training time of GANs. In our initial experiments, we study the effects of batch size, learning rate, loss function, and optimization algorithm on training using the MNIST dataset over 30,000 epochs. The simplicity of the MNIST dataset makes it a suitable starting point for understanding whether these parameter changes have any significant impact on training times. The goal is to analyze and understand the results of varying loss functions, batch sizes, optimizer algorithms, and learning rates on GANs, and to address the key issue of batch size and learning rate selection.
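The experimental methodology described in the abstract, timing GAN training while sweeping the batch size at a fixed learning rate, can be sketched as follows. This is an illustrative toy, not the paper's code: the scalar generator/discriminator, the N(2, 1) target distribution, the step count, and the learning rate are all assumptions standing in for the actual MNIST setup.

```python
import math
import random
import time

def sigmoid(x):
    # Clamp the logit so math.exp cannot overflow.
    x = max(-30.0, min(30.0, x))
    return 1.0 / (1.0 + math.exp(-x))

def train_gan(batch_size, lr, steps=300, seed=0):
    """Run `steps` alternating D/G updates of a scalar GAN; return elapsed seconds."""
    rng = random.Random(seed)
    g, d = 0.1, 0.1  # generator scale and discriminator weight
    t0 = time.perf_counter()
    for _ in range(steps):
        real = [rng.gauss(2.0, 1.0) for _ in range(batch_size)]  # target data: N(2, 1)
        z = [rng.gauss(0.0, 1.0) for _ in range(batch_size)]     # latent noise
        fake = [g * zi for zi in z]
        # Discriminator ascends log D(x) + log(1 - D(G(z))) with vanilla SGD.
        grad_d = (sum(x * (1.0 - sigmoid(d * x)) for x in real)
                  - sum(x * sigmoid(d * x) for x in fake)) / batch_size
        d += lr * grad_d
        # Generator ascends the non-saturating objective log D(G(z)).
        grad_g = sum(zi * d * (1.0 - sigmoid(d * g * zi)) for zi in z) / batch_size
        g += lr * grad_g
    return time.perf_counter() - t0

# Sweep batch sizes at a fixed learning rate and report wall-clock time per run.
for bs in (32, 128, 512):
    secs = train_gan(bs, lr=0.05)
    print(f"batch_size={bs:4d}  elapsed={secs:.4f}s")
```

With a fixed number of update steps, larger batches do more arithmetic per step, so elapsed time grows with batch size here; the paper's question is how this trade-off plays out for real GANs, where batch size also changes gradient quality and the number of steps per epoch.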
Pages: 643-648
Page count: 6
Related Papers (50 total)
  • [1] Batch equalization with a generative adversarial network
    Qian, W. W.; Xia, C.; Venugopalan, S.; Narayanaswamy, A.; Dimon, M.; Ashdown, G. W.; Baum, J.; Peng, J.; Ando, D. M.
    Bioinformatics, 2020, 36: i875-i883
  • [2] Early prediction for mode anomaly in generative adversarial network training: An empirical study
    Guo, C.; Huang, D.; Zhang, J.; Xu, J.; Bai, G.; Dong, N.
    Information Sciences, 2020, 534: 117-138
  • [3] Spatial Coevolution for Generative Adversarial Network Training
    Hemberg, E.; Toutouh, J.; Al-Dujaili, A.; Schmiedlechner, T.; O'Reilly, U.-M.
    ACM Transactions on Evolutionary Learning and Optimization, 2021, 1(2)
  • [4] Training dataset reduction on generative adversarial network
    Nuha, F. U.; Afiahayati
    INNS Conference on Big Data and Deep Learning, 2018, 144: 133-139
  • [5] Safe batch constrained deep reinforcement learning with generative adversarial network
    Dong, W.; Liu, S.; Sun, S.
    Information Sciences, 2023, 634: 259-270
  • [6] Empirical Evaluation on Synthetic Data Generation with Generative Adversarial Network
    Lu, P.-H.; Wang, P.-C.; Yu, C.-M.
    Proceedings of the 9th International Conference on Web Intelligence, Mining and Semantics (WIMS 2019), 2019
  • [7] Soft Generative Adversarial Network: Combating Mode Collapse in Generative Adversarial Network Training via Dynamic Borderline Softening Mechanism
    Li, W.; Tang, Y.
    Applied Sciences-Basel, 2024, 14(2)
  • [8] Successive training of a generative adversarial network for the design of an optical cloak
    Blanchard-Dionne, A.-P.; Martin, O. J. F.
    OSA Continuum, 2021, 4(1): 87-95
  • [9] A Reconfigurable Accelerator for Generative Adversarial Network Training Based on FPGA
    Yin, T.; Mao, W.; Lu, J.; Wang, Z.
    2021 IEEE Computer Society Annual Symposium on VLSI (ISVLSI 2021), 2021: 144-149
  • [10] A generative adversarial network for travel times imputation using trajectory data
    Zhang, K.; He, Z.; Zheng, L.; Zhao, L.; Wu, L.
    Computer-Aided Civil and Infrastructure Engineering, 2021, 36(2): 197-212