Optimization of Choreography Teaching with Deep Learning and Neural Networks

Cited by: 0
Authors
Zhou, Qianling [1 ]
Tong, Yan [2 ]
Si, Hongwei [3 ]
Zhou, Kai [4 ]
Affiliations
[1] School of Music and Dance, Hunan Women's University, Changsha, Hunan 410004, China
[2] School of Music, South China Normal University, Guangzhou, Guangdong 510631, China
[3] Department of the History of Science, Tsinghua University, Beijing, China
[4] School of Social Development and Management, Hunan Women's University, Changsha, Hunan 410004, China
Keywords
Deep learning; Learning systems; Memory architecture; Network architecture; Stamping
DOI
Not available
Abstract
To advance intelligent dance education and networked choreography technology, this study focuses on a system that automatically generates continuous choreography using deep learning. First, it addresses a key limitation of traditional choreography-generation architectures: their separate motion-segmentation and processing stages cannot be optimized globally. Second, it proposes an end-to-end architecture for automatically generating continuous dance notation that incorporates temporal classification. On this basis, a dynamic time-stamping model is designed for frame clustering. Finally, experiments show that the model achieves high-performance movement time-stamping; combined with continuous motion recognition, the system recognizes movements globally and then marks the duration of each movement, enabling refined production of continuous choreography. This research realizes efficient, refined production of digital continuous choreography, provides advanced technical means for choreography education, and offers useful experience for school-based networked choreography teaching. © 2022 Qianling Zhou et al.
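The abstract does not specify how the time-stamping model converts frame-level output into movement durations. As a purely illustrative sketch (not the authors' method), one common post-processing step after per-frame movement classification is to collapse consecutive identical predictions into time-stamped segments, skipping blank frames much as CTC decoding does; the `timestamp_segments` function, movement labels, and frame rate below are all invented for illustration:

```python
from itertools import groupby

def timestamp_segments(frame_labels, fps=25.0):
    """Collapse per-frame movement labels into time-stamped segments.

    frame_labels: per-frame class labels; None marks a blank (no movement).
    Returns a list of (label, start_seconds, end_seconds) tuples.
    """
    segments = []
    t = 0  # current frame index
    for label, run in groupby(frame_labels):
        n = len(list(run))
        if label is not None:  # drop blank frames, as in CTC-style decoding
            segments.append((label, t / fps, (t + n) / fps))
        t += n
    return segments

# Per-frame predictions for a 10-frame clip at 25 fps
frames = ["plie", "plie", "plie", None, None,
          "jete", "jete", "jete", "jete", "jete"]
print(timestamp_segments(frames))
# → [('plie', 0.0, 0.12), ('jete', 0.2, 0.4)]
```

In a full pipeline, `frame_labels` would come from the network's per-frame argmax; marking each movement's duration this way is what the abstract refers to as time-stamping.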