Abstract:
Facial expression synthesis is an important part of visual human-computer interaction. To build a highly realistic, self-adaptive, and automatic real-time facial expression synthesis system, this paper proposes an MPEG-4-based method for generating three-dimensional facial expression animation. The key steps are: first, mark feature points on the three-dimensional face and measure how these points move from the neutral expression to other expressions to obtain their animation parameters; second, compute the motility factor of the non-feature points of the face mesh from the feature-point animation parameters, using the interpolation algorithm presented in this paper; finally, simulate the movement of the feature points along the time axis to realize the facial expression animation. The synthesized facial expressions look real and natural, which confirms the validity of the proposed method. © 2011 IEEE.
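The abstract does not specify the paper's interpolation algorithm, so the following is only a minimal, hypothetical sketch of the general idea: propagating MPEG-4 feature-point displacements to the remaining mesh vertices with a distance-weighted (Gaussian falloff) blend. The function name and the `sigma` parameter are illustrative assumptions, not the paper's actual formulation.

```python
import numpy as np

def interpolate_vertex_displacements(vertices, feature_points, feature_displacements, sigma=0.05):
    """Hypothetical sketch: spread feature-point displacements (e.g. MPEG-4
    FAP-driven motion) over all mesh vertices with a Gaussian distance weight.

    vertices:              (N, 3) neutral-face mesh vertex positions
    feature_points:        (K, 3) positions of the marked feature points
    feature_displacements: (K, 3) displacement of each feature point for the
                           target expression frame
    sigma:                 falloff radius controlling each feature point's
                           region of influence (assumed parameter)
    """
    # Pairwise distances between every vertex and every feature point: (N, K)
    d = np.linalg.norm(vertices[:, None, :] - feature_points[None, :, :], axis=-1)

    # Gaussian weights: nearby feature points dominate a vertex's motion
    w = np.exp(-(d ** 2) / (2.0 * sigma ** 2))
    w /= w.sum(axis=1, keepdims=True) + 1e-12   # normalize weights per vertex

    # Each vertex moves by the weighted average of the feature displacements
    return w @ feature_displacements            # (N, 3)

# Animating one frame: displace the neutral mesh by the interpolated motion
# deformed = vertices + interpolate_vertex_displacements(vertices, fps, fap_disp)
```

In this kind of scheme, repeating the interpolation for each time step of the feature-point trajectories yields the per-frame mesh deformation that drives the animation.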
Year: 2011
Volume: 2
Page: 131-134
Language: English