Cross-subject face animation driven by facial motion mapping

Takaaki Kuratate, Eric Vatikiotis-Bateson, Hani Camille Yehia

    Research output: Chapter in Book / Conference Paper

    Abstract

    Here we present a new technique for efficiently generating high-quality talking head animations for multiple targets starting from one person’s data. We use time-varying values for a few locations on the face to control subject-specific deformation parameters, which are linearly derived from static face data. We describe how the motion data for one face can easily be used to control the deformation parameters of other faces by exploiting the similarity of static parameters across models. We present results of this method for realistic, cartoon, and animal faces.
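    The abstract describes a linear mapping from the time-varying positions of a few facial locations to subject-specific deformation parameters, with cross-subject transfer achieved by reusing those parameters in another face's static model. The sketch below is a hypothetical Python illustration of that idea, assuming a PCA-style static basis per subject and a least-squares fit; the function names, array shapes, and transfer step are our assumptions, not the authors' implementation.

```python
# Minimal sketch of motion-to-parameter mapping and cross-subject transfer,
# assuming a linear (PCA-style) static face model. Not the authors' code.
import numpy as np

def fit_linear_map(markers, params):
    """Least-squares map W from marker trajectories to deformation parameters.

    markers: (T, 3*m) time-varying positions of a few facial locations
    params:  (T, k)   subject-specific deformation parameters per frame
    Returns W such that params ~= markers @ W.
    """
    W, *_ = np.linalg.lstsq(markers, params, rcond=None)
    return W

def animate_target(markers_src, W_src, basis_tgt, mean_tgt):
    """Drive a target face with a source subject's motion.

    Assumes the cross-subject step amounts to reusing the estimated
    parameters in the target's static basis, relying on the similarity
    of static parameters across models.
    """
    params = markers_src @ W_src            # (T, k) deformation parameters
    return params @ basis_tgt + mean_tgt    # (T, n) target vertex coordinates

# Hypothetical sizes: 5 markers (15 coords), 10 parameters, 300 vertices.
T, m3, k, n = 100, 15, 10, 900
rng = np.random.default_rng(0)
markers = rng.normal(size=(T, m3))          # stand-in motion data
params = rng.normal(size=(T, k))            # stand-in training parameters
W = fit_linear_map(markers, params)
frames = animate_target(markers, W, rng.normal(size=(k, n)), np.zeros(n))
print(frames.shape)  # (100, 900): one deformed face per frame
```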
    Original language: English
    Title of host publication: Advanced Design, Production and Management Systems: Proceedings of the 10th ISPE International Conference on Concurrent Engineering: Research and Applications, 26-30 July 2003, Madeira, Portugal
    Publisher: Swets & Zeitlinger
    Number of pages: 9
    ISBN (Electronic): 9058095246
    ISBN (Print): 9058096238
    Publication status: Published - 2003
    Event: 10th ISPE International Conference on Concurrent Engineering: Research and Applications, Madeira, Portugal
    Duration: 26-30 July 2003

