
Automatic Synchronization of Separately-Captured Facial Expression and Motion Data  

Jeong, Tae-Wan (Department of Digital Contents, Sejong University)
Park, Sang-Il (Department of Digital Contents, Sejong University)
Abstract
In this paper, we present a new method for automatically synchronizing captured facial expression data with its corresponding body motion data. In a typical optical motion capture setup, detailed facial expressions cannot be captured simultaneously with the body motion, because the facial capture requires a higher resolution than the body motion capture. The two are therefore captured in separate sessions and must be synchronized in a post-process before they can be used to generate a convincing character animation. Based on the patterns of the actor's neck movement extracted from the two data sets, we present a non-linear time warping method for automatic synchronization. We demonstrate the viability of the method with actual examples.
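
The abstract does not spell out the warping algorithm, but the idea of non-linearly aligning two neck-movement signals can be sketched with standard dynamic time warping (DTW), a common choice for non-linear time warping; the authors' actual formulation may differ. In the minimal sketch below, the function name dtw_alignment and the synthetic per-frame neck angular-speed signals are hypothetical, used only to illustrate how a frame-to-frame correspondence between the two capture sessions could be recovered.

```python
import numpy as np

def dtw_alignment(a, b):
    """Dynamic time warping between two 1-D signals.

    Returns the optimal warping path as a list of (i, j) pairs mapping
    frames of `a` (e.g. neck movement from the body capture session)
    to frames of `b` (e.g. neck movement from the facial capture session).
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    # Accumulate the minimal alignment cost frame by frame.
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j - 1],  # diagonal step
                                 cost[i - 1, j],      # advance in a only
                                 cost[i, j - 1])      # advance in b only
    # Backtrack from the end to recover the warping path.
    path, i, j = [], n, m
    while i > 0 and j > 0:
        path.append((i - 1, j - 1))
        step = np.argmin([cost[i - 1, j - 1], cost[i - 1, j], cost[i, j - 1]])
        if step == 0:
            i, j = i - 1, j - 1
        elif step == 1:
            i -= 1
        else:
            j -= 1
    path.reverse()
    return path

# Hypothetical usage: per-frame neck angular speed extracted from each session.
body_neck_speed = np.abs(np.sin(np.linspace(0, 6, 200)))        # body capture
face_neck_speed = np.abs(np.sin(np.linspace(0, 6, 260) * 0.9))  # facial capture
path = dtw_alignment(body_neck_speed, face_neck_speed)
print(path[:5])
```

The recovered path can then be used to resample one data stream onto the timeline of the other, so that the facial expression frames line up with the corresponding body motion frames.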
Keywords
3D Character Animation; Motion Capture; Facial Expression Capture; Motion Editing; Automatic Synchronization;