A Study on Generative Model-based Dance Movement Creation and Virtual Dancer Generation
DOI: https://doi.org/10.70917/ijcisim-2025-0304

Keywords: dance movement generation; TransGAN; motion capture technology; virtual human driving algorithm

Abstract
With the development of digital entertainment, music-conditioned intelligent dance creation has become a popular research topic. Traditional dance creation suffers from long production cycles and limited creative inspiration. This study proposes an end-to-end generative framework that automatically generates dance movements from music and drives virtual dancers in real time. First, dance data is acquired with motion capture technology, and polygonal modeling is used to construct the virtual character model. Audio features and motion features are then extracted from the audio signal and the human motion sequences, respectively. The core of the framework is a TransGAN-based intelligent dance generation network, which uses a generative adversarial network as its framework and combines Transformer layers with upsampling layers to realize multi-level action encoding. Experiments show that the coherence and plausibility of the dance movements generated by this model exceed those of the comparison models. The hip-joint error of the virtual dancer's walking motion is only 1.006, enabling stable, accurate, and diverse dance performance. This study provides a feasible technical solution for promoting the automated production of dance content.
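The abstract describes the generator as a GAN whose generator stacks Transformer layers with upsampling layers to decode music features into motion at multiple temporal resolutions. As a rough illustration of that TransGAN-style generator idea (the discriminator is omitted), the following is a minimal sketch; all class names, dimensions, and layer counts are illustrative assumptions, not the paper's actual implementation:

```python
# Hypothetical sketch of a TransGAN-style music-to-dance generator:
# a Transformer encoder over audio features, followed by temporal
# upsampling to decode a longer joint-rotation sequence.
import torch
import torch.nn as nn

class DanceGenerator(nn.Module):
    def __init__(self, audio_dim=35, model_dim=64, n_joints=24, upsample=2):
        super().__init__()
        self.embed = nn.Linear(audio_dim, model_dim)
        enc_layer = nn.TransformerEncoderLayer(
            d_model=model_dim, nhead=4, batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=2)
        # Temporal upsampling: each encoded frame expands into
        # `upsample` motion frames (multi-level action decoding).
        self.up = nn.Upsample(scale_factor=upsample, mode="nearest")
        self.head = nn.Linear(model_dim, n_joints * 3)  # per-joint rotation

    def forward(self, audio):                 # (batch, T, audio_dim)
        h = self.encoder(self.embed(audio))   # (batch, T, model_dim)
        # Upsample along the time axis (nn.Upsample expects channels first)
        h = self.up(h.transpose(1, 2)).transpose(1, 2)  # (batch, T*up, model_dim)
        return self.head(h)                   # (batch, T*up, n_joints*3)

gen = DanceGenerator()
motion = gen(torch.randn(1, 30, 35))          # 30 audio frames in
print(motion.shape)                           # torch.Size([1, 60, 72])
```

In a full adversarial setup, a discriminator would score (music, motion) pairs for realism, and the generator above would be trained to fool it while matching the captured motion data.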
License
Copyright (c) 2025 Yiheng Li

This work is licensed under a Creative Commons Attribution 4.0 International License.