Learning and use of facial motion style in avatar animation

Bibliographic details
Year of defense: 2014
Main author: Braun, Adriana
Advisor: Musse, Soraia Raupp
Defense committee: Not informed by the institution
Document type: Doctoral thesis
Access type: Open access
Language: Portuguese
Defending institution: Pontifícia Universidade Católica do Rio Grande do Sul
Graduate program: Programa de Pós-Graduação em Ciência da Computação
Department: Faculdade de Informática
Country: BR
Access link: http://tede2.pucrs.br/tede2/handle/tede/5267
Abstract: This work presents a methodology, named Persona, for learning and transferring the facial motion style of an actor in order to provide parameters for avatar facial animation. With this methodology, the facial animation of an avatar can be driven by the motion style of a particular actor through the performance of any user. The avatar thus expresses the facial movements that the user is performing, while replicating the particularities of the actor's facial movements, based on his or her Persona. To build the facial motion style model of an actor, points tracked on image sequences of the actor's performance are used as input data, together with a database of three-dimensional facial expressions annotated according to the Facial Action Coding System (FACS). Principal component analysis is performed on these data. Afterwards, artificial neural networks are trained to recognize action units in both the actor's and the user's performances. Given these classifiers, action units in the user's expression can be automatically recognized and mapped to the equivalent parameters in the actor's motion style model. The result of the process is the provision of these parameters to facial animation systems. The prototype developed as a proof of concept was used in case studies, whose results are discussed. Future work is also addressed.
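The pipeline the abstract describes (dimensionality reduction of tracked facial points followed by neural-network recognition of FACS action units) can be illustrated with a minimal sketch. This is not the thesis's actual implementation: the data here are synthetic stand-ins, and the landmark count, PCA dimensionality, and network size are assumptions chosen only for illustration.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)

# Synthetic stand-in for tracked facial landmarks: 200 frames, each with
# 68 (x, y) points flattened to a 136-dim vector. (The thesis uses real
# tracked points plus a FACS-annotated 3D expression database.)
n_frames, n_dims = 200, 136
X = rng.normal(size=(n_frames, n_dims))
# Hypothetical binary labels: whether a given action unit (AU) is active.
y = rng.integers(0, 2, size=n_frames)

# Step 1: principal component analysis compresses the landmark vectors
# into a low-dimensional expression space.
pca = PCA(n_components=10)
X_reduced = pca.fit_transform(X)

# Step 2: a small neural network is trained to recognize the action unit
# in the reduced space, analogous to the AU classifiers in Persona.
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500, random_state=0)
clf.fit(X_reduced, y)

# At runtime, a new frame is projected and classified; the recognized AU
# would then index into the actor's motion-style model to retrieve the
# equivalent animation parameters.
new_frame = rng.normal(size=(1, n_dims))
au_active = clf.predict(pca.transform(new_frame))
```

In a full system, one such classifier per action unit would run on both the actor's and the user's performances, so the user's recognized AUs can be matched to the actor's stylized versions of the same AUs.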