Data-driven human motion synthesis based on angular momentum analysis

Ping Hu, Qi Sun, Xiangxu Meng, Jingliang Peng

    Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

    Abstract

    In this paper, we present a novel method for real-time synthesis of human motion under external perturbations. The proposed method is data-driven and based on angular momentum analysis. When an external force is applied to the virtual human body, we analyze the change in the joints' angular momenta over a short period of time, predict the body's response, retrieve an appropriate motion sequence from a pre-built motion capture (MoCap) database, and make a smooth transition between the current and retrieved motion sequences to obtain the synthesized motion. Our main contributions are a complete momentum-analysis solution for the human body and an effective MoCap data organization based on the major characteristics of the body motion and the external force. As a result, realistic, real-time human motion synthesis is achieved, as experimentally demonstrated with walking, running, and jumping sequences.
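    The abstract's core quantity, the change in the joints' angular momenta shortly after a perturbation, can be illustrated with a minimal sketch. The function names, the segment masses, and the simplification of using only the translational term L = m (r - r_ref) x v (omitting each segment's rotational inertia contribution) are assumptions for illustration, not the paper's actual formulation:

    ```python
    import numpy as np

    def segment_angular_momentum(mass, com_pos, com_vel, ref_point):
        # Translational angular momentum of one body segment about a
        # reference point: L = m * (r_com - r_ref) x v_com.
        r = np.asarray(com_pos, dtype=float) - np.asarray(ref_point, dtype=float)
        return mass * np.cross(r, np.asarray(com_vel, dtype=float))

    def body_angular_momentum(masses, com_positions, com_velocities, ref_point):
        # Total angular momentum of the articulated body: sum of the
        # per-segment contributions about the same reference point.
        return sum(
            segment_angular_momentum(m, p, v, ref_point)
            for m, p, v in zip(masses, com_positions, com_velocities)
        )

    # Hypothetical two-segment body; a push changes segment 2's velocity.
    # The resulting change delta_L over a short window is the kind of
    # signal matched against the MoCap database.
    masses     = [2.0, 1.0]
    positions  = [(1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
    vel_before = [(0.0, 3.0, 0.0), (0.0, 0.0, 0.0)]
    vel_after  = [(0.0, 3.0, 0.0), (1.5, 0.0, 0.0)]  # external force on segment 2
    ref = (0.0, 0.0, 0.0)

    L_before = body_angular_momentum(masses, positions, vel_before, ref)
    L_after  = body_angular_momentum(masses, positions, vel_after, ref)
    delta_L  = L_after - L_before  # response signature of the perturbation
    ```

    In a full treatment, each segment would also contribute a rotational term I·ω, and the momentum change would be tracked over several frames rather than a single before/after pair.
    
    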

    Original language: English (US)
    Title of host publication: 2013 IEEE International Symposium on Circuits and Systems, ISCAS 2013
    Pages: 929-932
    Number of pages: 4
    State: Published - 2013
    Event: 2013 IEEE International Symposium on Circuits and Systems, ISCAS 2013 - Beijing, China
    Duration: May 19, 2013 - May 23, 2013

    Publication series

    Name: Proceedings - IEEE International Symposium on Circuits and Systems
    ISSN (Print): 0271-4310


    ASJC Scopus subject areas

    • Electrical and Electronic Engineering
