Performance-based control interface for character animation

Cited by: 28
Authors
Ishigaki, Satoru [1 ]
White, Timothy [1 ]
Zordan, Victor B. [2 ]
Liu, C. Karen [1 ]
Affiliations
[1] Georgia Tech, Atlanta, GA USA
[2] UC Riverside, Riverside, CA USA
Source
ACM TRANSACTIONS ON GRAPHICS | 2009 / Vol. 28 / No. 3
Funding
U.S. National Science Foundation
Keywords
Character animation; Motion capture;
DOI
10.1145/1531326.1531367
CLC Classification Number
TP31 [Computer Software]
Discipline Classification Codes
081202; 0835
Abstract
Most game interfaces today are largely symbolic, translating simplified input such as keystrokes into the choreography of full-body character movement. In this paper, we describe a system that directly uses human motion performance to provide a radically different and much more expressive interface for controlling virtual characters. Our system takes a data feed from a motion capture system as input and, in real time, translates the performance into corresponding actions in a virtual world. The difficulty with such an approach arises from the need to manage the discrepancy between the real and virtual world, leading to two important subproblems: 1) recognizing the user's intention, and 2) simulating the appropriate action based on the intention and virtual context. We address these subproblems by first enabling the virtual world's designer to specify possible activities in terms of prominent features of the world along with associated motion clips depicting interactions. We then integrate the prerecorded motions with online performance and dynamic simulation to synthesize seamless interaction of the virtual character in a simulated virtual world. The result is a flexible interface through which a user can make freeform control choices while the resulting character motion maintains both physical realism and the user's personal style.
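The control loop the abstract outlines (match live motion-capture features against designer-annotated example interactions, then combine the selected prerecorded clip with the online performance) can be sketched roughly as follows. This is a minimal illustrative sketch, not the paper's implementation: the class and function names, the feature-distance metric, and the fixed linear blend weight are all assumptions, and the dynamic simulation the paper integrates for physical realism is omitted.

```python
# Illustrative sketch only (not the authors' method): per-frame intention
# recognition by nearest feature signature, followed by a simple pose blend
# between the live performance and the matched prerecorded clip.
import numpy as np

class ExampleInteraction:
    """A designer-specified activity: a feature signature plus a motion clip."""
    def __init__(self, name, feature_signature, clip_poses):
        self.name = name
        self.signature = np.asarray(feature_signature, dtype=float)
        self.clip = np.asarray(clip_poses, dtype=float)   # frames x DOFs

def recognize_intention(live_features, examples):
    """Pick the example whose feature signature is closest to the live features."""
    dists = [np.linalg.norm(live_features - ex.signature) for ex in examples]
    return examples[int(np.argmin(dists))]

def blend_pose(live_pose, clip_pose, weight):
    """Linear blend of the online performance with the prerecorded clip pose.
    (The paper additionally enforces physics via simulation; omitted here.)"""
    return (1.0 - weight) * live_pose + weight * clip_pose

# Toy usage: two 3-DOF "activities" and one live frame (all values hypothetical).
examples = [
    ExampleInteraction("climb", [1.0, 0.0, 0.0], np.zeros((10, 3))),
    ExampleInteraction("swim",  [0.0, 1.0, 0.0], np.ones((10, 3))),
]
live_features = np.array([0.9, 0.1, 0.0])
live_pose = np.array([0.2, 0.2, 0.2])

match = recognize_intention(live_features, examples)
output = blend_pose(live_pose, match.clip[0], weight=0.5)
print(match.name, output)
```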
Pages: 8
Related Papers
50 records
  • [1] Performance-based Expressive Character Animation
    Aneja, Deepali
    [J]. ADJUNCT PUBLICATION OF THE 32ND ANNUAL ACM SYMPOSIUM ON USER INTERFACE SOFTWARE AND TECHNOLOGY (UIST'19 ADJUNCT), 2019, : 166 - 169
  • [2] Surface capture for performance-based animation
    Starck, Jonathan
    Hilton, Adrian
    [J]. IEEE COMPUTER GRAPHICS AND APPLICATIONS, 2007, 27 (03) : 21 - 31
  • [3] Realtime Performance-Based Facial Animation
    Weise, Thibaut
    Bouaziz, Sofien
    Li, Hao
    Pauly, Mark
    [J]. ACM TRANSACTIONS ON GRAPHICS, 2011, 30 (04):
  • [4] Interactive editing of performance-based facial animation
    Seol, Yeongho
    Cozens, Michael
    [J]. SA'19: SIGGRAPH ASIA 2019 TECHNICAL BRIEFS, 2019, : 61 - 64
  • [5] A Multimodal Interface for Virtual Character Animation Based on Live Performance and Natural Language Processing
    Lamberti, Fabrizio
    Gatteschi, Valentina
    Sanna, Andrea
    Cannavo, Alberto
    [J]. INTERNATIONAL JOURNAL OF HUMAN-COMPUTER INTERACTION, 2019, 35 (18) : 1655 - 1671
  • [6] Muscle-Based Control for Character Animation
    Ruiz, A. L. Cruz
    Pontonnier, C.
    Pronost, N.
    Dumont, G.
    [J]. COMPUTER GRAPHICS FORUM, 2017, 36 (06) : 122 - 147
  • [7] Interaction with a Virtual Character through Performance Based Animation
    Wu, Qiong
    Kazakevich, Maryia
    Taylor, Robyn
    Boulanger, Pierre
    [J]. SMART GRAPHICS, PROCEEDINGS, 2010, 6133 : 285 - 288
  • [8] Performance-Based Animation Using Constraints for Virtual Object Manipulation
    Hwang, Jaepyung
    Kim, Kwanguk
    Suh, Il Hong
    Kwon, Taesoo
    [J]. IEEE COMPUTER GRAPHICS AND APPLICATIONS, 2017, 37 (04) : 95 - 102
  • [9] MienCap: Performance-based Facial Animation with Live Mood Dynamics
    Pan, Ye
    Zhang, Ruisi
    Wang, Jingying
    Chen, Nengfu
    Qiu, Yilin
    Ding, Yu
    Mitchell, Kenny
    [J]. 2022 IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES ABSTRACTS AND WORKSHOPS (VRW 2022), 2022, : 645 - 646
  • [10] Control of motion in character animation
    Xiao, ZD
    Zhang, JJ
    Bell, S
    [J]. EIGHTH INTERNATIONAL CONFERENCE ON INFORMATION VISUALISATION, PROCEEDINGS, 2004, : 841 - 848