Talking Face Analysis and Synthesis by Facial Motion Mapping

Takaaki Kuratate, Eric Vatikiotis-Bateson

    Research output: Contribution to journal › Article

    Abstract

    Natural face motion during speech is one of the most important factors not only in improving the realism of talking-face animation but also in inducing correct perception of auditory information. In this paper, we propose a new technique called Facial Motion Mapping that maps human face motion data to any target character based on the similarity of deformation characteristics between real people and target characters. The technique requires only a set of face postures sharing the same mesh topology within each set from each person or character, and it maps face motion to characters of different topologies according to the results of principal component analysis (PCA). Even a small number of mapping parameters controlling the deformation can create realistic talking-face animation using the deformation features already contained in each set. We demonstrate this technique on a 3D human face, a 3D dog face, and a 2D cartoon face.
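    The core idea described above — a per-character PCA basis built from a small posture set, with motion transferred by reusing component coefficients on a differently shaped target — can be sketched roughly as follows. This is an illustrative reconstruction under stated assumptions, not the authors' exact formulation: the function names, the plain SVD-based PCA, and the direct coefficient reuse (ignoring per-character scaling or deformation-similarity alignment that the actual method may apply) are all assumptions for illustration.

    ```python
    import numpy as np

    def pca_basis(postures, k):
        """postures: (n_postures, 3*n_vertices) flattened meshes, one
        topology per set. Returns the mean posture and the top-k
        principal components (plain SVD-based PCA; illustrative only)."""
        mean = postures.mean(axis=0)
        centered = postures - mean
        # Rows of vt are the principal directions of the posture set.
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        return mean, vt[:k]

    def map_motion(frame, src_mean, src_pcs, tgt_mean, tgt_pcs):
        """Project one source motion frame onto the source PCA basis,
        then reuse the same coefficients on the target basis. The two
        meshes may have entirely different topologies and vertex counts."""
        coeffs = src_pcs @ (frame - src_mean)   # shape (k,)
        return tgt_mean + coeffs @ tgt_pcs      # deformed target mesh

    # Toy data: 6 postures each; source mesh has 5 vertices, target has 8,
    # i.e. different topologies, as in the paper's human/dog/cartoon demo.
    rng = np.random.default_rng(0)
    src_set = rng.normal(size=(6, 15))
    tgt_set = rng.normal(size=(6, 24))
    src_mean, src_pcs = pca_basis(src_set, k=3)
    tgt_mean, tgt_pcs = pca_basis(tgt_set, k=3)
    mapped = map_motion(src_set[0], src_mean, src_pcs, tgt_mean, tgt_pcs)
    ```

    The sketch shows why only a small posture set per character is needed: each character's deformation repertoire lives in its own low-dimensional PCA space, and animation is driven by a handful of coefficients rather than by per-vertex correspondence.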

    Keywords

    • animation (cinematography)
    • computer simulation
    • facial expression
    • motion
    • principal components analysis
