TY - GEN
T1 - Control of speech-related facial movements of an avatar from video
AU - Gibert, Guillaume
AU - Stevens, Catherine J.
PY - 2011
Y1 - 2011
N2 - Several puppetry techniques have recently been proposed to transfer emotional facial expressions from a user's video stream to an avatar. One approach defines correspondence functions between landmarks extracted from tracking and the MPEG-4 Facial Animation Parameters driving the 3D avatar's facial expressions [1]. More recently, Saragih and colleagues [2] proposed a real-time puppetry method requiring only a single image of the avatar and of the user.
AB - Several puppetry techniques have recently been proposed to transfer emotional facial expressions from a user's video stream to an avatar. One approach defines correspondence functions between landmarks extracted from tracking and the MPEG-4 Facial Animation Parameters driving the 3D avatar's facial expressions [1]. More recently, Saragih and colleagues [2] proposed a real-time puppetry method requiring only a single image of the avatar and of the user.
UR - http://www.scopus.com/inward/record.url?scp=80053196631&partnerID=8YFLogxK
U2 - 10.1007/978-3-642-23974-8_55
DO - 10.1007/978-3-642-23974-8_55
M3 - Conference Paper
AN - SCOPUS:80053196631
SN - 9783642239731
T3 - Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
SP - 445
EP - 446
BT - Intelligent Virtual Agents - 11th International Conference, IVA 2011, Proceedings
T2 - 11th International Conference on Intelligent Virtual Agents, IVA 2011
Y2 - 15 September 2011 through 17 September 2011
ER -