Control of speech-related facial movements of an avatar from video

Guillaume Gibert, Catherine J. Stevens

Research output: Chapter in Book / Conference Paper › Conference Paper › peer-review

Abstract

Several puppetry techniques have recently been proposed to transfer emotional facial expressions from a user's video stream to an avatar. Correspondence functions have been proposed between landmarks extracted by a tracker and the MPEG-4 Facial Animation Parameters driving the 3D avatar's facial expressions [1]. More recently, Saragih and colleagues [2] proposed a real-time puppetry method requiring only a single image of the avatar and of the user.
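The record itself gives no implementation details; the following is a minimal sketch of the kind of correspondence function the abstract alludes to, assuming a simple linear map from tracked landmark displacements to MPEG-4 Facial Animation Parameters fitted by least squares. All names, dimensions, and the synthetic calibration data are illustrative assumptions, not the authors' method.

# Minimal sketch (not the authors' implementation): a linear correspondence
# function mapping tracked facial landmark displacements to MPEG-4 Facial
# Animation Parameters (FAPs). Dimensions and calibration data are hypothetical.
import numpy as np

N_LANDMARKS = 68      # assumed number of tracked 2D landmarks
N_FAPS = 66           # assumed number of low-level MPEG-4 FAPs

def fit_correspondence(landmark_disp, fap_values):
    """Least-squares fit of a linear map W such that faps ~= W @ landmarks.

    landmark_disp : (n_frames, 2 * N_LANDMARKS) landmark displacements
                    relative to the neutral face.
    fap_values    : (n_frames, N_FAPS) FAP values for the same frames.
    """
    W, *_ = np.linalg.lstsq(landmark_disp, fap_values, rcond=None)
    return W.T  # shape (N_FAPS, 2 * N_LANDMARKS)

def landmarks_to_faps(W, landmark_disp):
    """Apply the correspondence function to one frame of displacements."""
    return W @ landmark_disp

# Usage with purely synthetic calibration data (illustrative only).
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2 * N_LANDMARKS))   # tracked displacements
Y = rng.normal(size=(200, N_FAPS))            # corresponding FAP values
W = fit_correspondence(X, Y)
faps = landmarks_to_faps(W, X[0])             # FAPs driving one avatar frame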

Original language: English
Title of host publication: Intelligent Virtual Agents - 11th International Conference, IVA 2011, Proceedings
Pages: 445-446
Number of pages: 2
DOIs
Publication status: Published - 2011
Event: 11th International Conference on Intelligent Virtual Agents, IVA 2011 - Reykjavik, Iceland
Duration: 15 Sept 2011 - 17 Sept 2011

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 6895 LNAI
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 11th International Conference on Intelligent Virtual Agents, IVA 2011
Country/Territory: Iceland
City: Reykjavik
Period: 15/09/11 - 17/09/11
