Modosc: A Library of Real-Time Movement Descriptors for Marker-Based Motion Capture

Cited: 3
Authors
Dahl, Luke [1]
Visi, Federico [2]
Affiliations
[1] Univ Virginia, Dept Mus, Charlottesville, VA 22903 USA
[2] Univ Hamburg, Inst Systemat Musicol, Hamburg, Germany
Funding
European Research Council
Keywords
Motion capture; motion descriptors; motion analysis; expressive movement; interaction design; Max; Open Sound Control; modosc;
DOI
10.1145/3212721.3212842
CLC Classification Number
TP3 [Computing Technology, Computer Technology]
Discipline Classification Code
0812
Abstract
Marker-based motion capture systems that stream precise movement data in real time afford interaction scenarios that can be subtle, detailed, and immediate. However, challenges to effectively utilizing this data include having to build bespoke processing systems which may not scale well, and a need for higher-level representations of movement and movement qualities. We present modosc, a set of Max abstractions for computing motion descriptors from raw motion capture data in real time. Modosc is designed to address the data handling and synchronization issues that arise when working with complex marker sets, and to structure data streams in a meaningful and easily accessible manner. This is achieved by adopting a multiparadigm programming approach using odot and Open Sound Control. We describe an initial set of motion descriptors, the addressing system employed, and design decisions and challenges.
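The abstract sketches modosc's core idea: compute movement descriptors from streamed marker data and expose them through a structured Open Sound Control (OSC) address space. As a rough illustration of that idea only (modosc itself is a set of Max abstractions; the Python code, the python-osc package, and the address namespace below are illustrative assumptions, not the library's actual API), the following sketch computes one descriptor, instantaneous marker speed, from two consecutive frames and publishes it on an OSC address:

    import math
    from pythonosc.udp_client import SimpleUDPClient

    # Assumed local OSC receiver, e.g. a Max patch listening on port 7400.
    client = SimpleUDPClient("127.0.0.1", 7400)

    def marker_speed(prev, curr, dt):
        """Euclidean speed (m/s) between two 3-D positions sampled dt seconds apart."""
        return math.dist(prev, curr) / dt

    # Two consecutive head-marker frames from a 100 Hz stream (dt = 0.01 s), in metres.
    prev_pos = (0.10, 1.52, 0.30)
    curr_pos = (0.11, 1.53, 0.30)

    # Hypothetical modosc-style address: /<library>/<marker>/<descriptor>
    client.send_message("/modosc/head/speed", marker_speed(prev_pos, curr_pos, 0.01))

Structuring the address space this way lets a receiver subscribe to a single descriptor of a single marker by matching the address, rather than parsing the entire data stream, which is the kind of meaningful, easily accessible organization the abstract describes.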
Pages: 4
Related Papers
50 records in total
  • [41] Reconstruction of occluded pelvis markers during marker-based motion capture with industrial exoskeletons
    Johns, Jasper
    Bender, A.
    Glitsch, U.
    Schmidt-Bleek, L.
    Dymke, J.
    Brandl, C.
    Damm, P.
    Heinrich, K.
    COMPUTER METHODS IN BIOMECHANICS AND BIOMEDICAL ENGINEERING, 2024: 79-89
  • [42] Comparison Of Marker-based And Markerless Motion Capture During Single And Double Leg Jumping
    Mullen, Kevin J.
    Taylor, Jeffrey B.
    Westbrook, Audrey E.
    Pexa, Brett S.
    Ford, Kevin R.
    MEDICINE & SCIENCE IN SPORTS & EXERCISE, 2022, 54 (09): 293-294
  • [43] From Motion Capture to Real-Time Character Animation
    Multon, Franck
    Kulpa, Richard
    Hoyet, Ludovic
    Komura, Taku
    MOTION IN GAMES, FIRST INTERNATIONAL WORKSHOP, MIG 2008, 2008, 5277: 72+
  • [44] Towards Scalable and Real-time Markerless Motion Capture
    Albanis, Georgios
    Chatzitofis, Anargyros
    Thermos, Spyridon
    Zioulis, Nikolaos
    Kolomvatsos, Kostas
    2022 IEEE CONFERENCE ON VIRTUAL REALITY AND 3D USER INTERFACES ABSTRACTS AND WORKSHOPS (VRW 2022), 2022: 715-716
  • [45] A Tandem Marker-Based Motion Capture Method for Dynamic Small Displacement Distribution Analysis
    Aliansyah, Zulhaj
    Shimasaki, Kohei
    Jiang, Mingjun
    Takaki, Takeshi
    Ishii, Idaku
    Yang, Hua
    Umemoto, Chikako
    Matsuda, Hiroshi
    JOURNAL OF ROBOTICS AND MECHATRONICS, 2019, 31 (05): 671-685
  • [46] Real-time video based motion capture system based on color and edge distributions
    Akazawa, Y
    Okada, Y
    Niijima, K
    IEEE INTERNATIONAL CONFERENCE ON MULTIMEDIA AND EXPO, VOL I AND II, PROCEEDINGS, 2002: A333-A336
  • [47] Robust Real-time Stereo-based Markerless Human Motion Capture
    Azad, Pedram
    Asfour, Tamim
    Dillmann, Ruediger
    2008 8TH IEEE-RAS INTERNATIONAL CONFERENCE ON HUMANOID ROBOTS (HUMANOIDS 2008), 2008: 347-354
  • [48] Real-time control of 7R manipulator based on motion capture
    College of Mechanical Engineering and Applied Electronics Technology, Beijing University of Technology, Beijing 100124, China
    Jixie Gongcheng Xuebao (Journal of Mechanical Engineering), (23): 68-73
  • [49] Real-Time Human Motion Capture Based on Wearable Inertial Sensor Networks
    Li, Jie
    Liu, Xiaofeng
    Wang, Zhelong
    Zhao, Hongyu
    Zhang, Tingting
    Qiu, Sen
    Zhou, Xu
    Cai, Huili
    Ni, Rongrong
    Cangelosi, Angelo
    IEEE INTERNET OF THINGS JOURNAL, 2022, 9 (11): 8953-8966
  • [50] MIMIC: Real-time marker-free motion capture system to create an agent in the virtual space
    Kim, SE
    Lee, RH
    Park, CJ
    Lee, IH
    INTERNATIONAL CONFERENCE ON COMPUTERS IN EDUCATION, VOLS I AND II, PROCEEDINGS, 2002: 48-49