Applying MetaHuman Facial Animation with MediaPipe: An Alternative Solution to Live Link iPhone

  • Balgum Song (Department of International College, Dongseo University)
  • Arminas Baronas (Department of International College, Dongseo University)
  • Received : 2024.07.25
  • Accepted : 2024.08.07
  • Published : 2024.09.30

Abstract

This paper presents an alternative solution for applying MetaHuman facial animation using MediaPipe, offering a versatile substitute for the iPhone-based Live Link system. Our approach captures facial expressions with a range of camera devices, including webcams, laptop cameras, and Android phones; processes the video for facial landmark detection; and applies the resulting landmarks in Unreal Engine Blueprints to animate MetaHuman characters in real time. Techniques such as the Eye Aspect Ratio (EAR) for blink detection and the One Euro Filter for data smoothing ensure accurate and responsive animation. Experimental results demonstrate that our system provides a cost-effective and flexible option for users without an iPhone, broadening access to advanced facial capture technology for applications in digital media and interactive environments. This research offers a practical, adaptable method for real-time facial animation, with future improvements aimed at integrating more sophisticated emotion detection features.
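The two signal-processing techniques named in the abstract can be sketched in a few lines of Python. The snippet below is a minimal, self-contained illustration, not the authors' implementation: the six-point eye model follows Soukupová and Čech's EAR formula, the filter follows the One Euro recursion published by Casiez et al., and the landmark coordinates, the 0.2 blink threshold, and the filter parameters are illustrative assumptions (a real pipeline would feed per-frame landmarks from MediaPipe Face Mesh).

```python
import math

def euclidean(p, q):
    """Euclidean distance between two 2D landmark points."""
    return math.hypot(p[0] - q[0], p[1] - q[1])

def eye_aspect_ratio(eye):
    """EAR from six eye landmarks ordered [outer corner, upper-left,
    upper-right, inner corner, lower-right, lower-left].  The ratio of
    the two vertical eyelid distances to the horizontal eye width falls
    toward zero as the eye closes, so a blink registers when EAR dips
    below a threshold."""
    v1 = euclidean(eye[1], eye[5])
    v2 = euclidean(eye[2], eye[4])
    h = euclidean(eye[0], eye[3])
    return (v1 + v2) / (2.0 * h)

def smoothing_factor(t_e, cutoff):
    """Exponential-smoothing weight for a first-order low-pass filter."""
    r = 2.0 * math.pi * cutoff * t_e
    return r / (r + 1.0)

class OneEuroFilter:
    """One Euro Filter (Casiez et al., 2012): a low-pass filter whose
    cutoff frequency rises with signal speed, reducing jitter when the
    signal is nearly still while keeping lag low during fast motion."""

    def __init__(self, t0, x0, min_cutoff=1.0, beta=0.0, d_cutoff=1.0):
        self.min_cutoff = min_cutoff   # base cutoff (Hz) at rest
        self.beta = beta               # speed coefficient
        self.d_cutoff = d_cutoff       # cutoff for the derivative estimate
        self.x_prev, self.dx_prev, self.t_prev = x0, 0.0, t0

    def __call__(self, t, x):
        t_e = t - self.t_prev
        # Smooth an estimate of the signal's rate of change.
        a_d = smoothing_factor(t_e, self.d_cutoff)
        dx = (x - self.x_prev) / t_e
        dx_hat = a_d * dx + (1.0 - a_d) * self.dx_prev
        # Raise the cutoff with speed, then filter the signal itself.
        cutoff = self.min_cutoff + self.beta * abs(dx_hat)
        a = smoothing_factor(t_e, cutoff)
        x_hat = a * x + (1.0 - a) * self.x_prev
        self.x_prev, self.dx_prev, self.t_prev = x_hat, dx_hat, t
        return x_hat

# Illustrative (made-up) eye landmarks; real coordinates would come
# from a face-landmark model such as MediaPipe Face Mesh.
open_eye   = [(0, 0), (1, 2), (2, 2), (3, 0), (2, -2), (1, -2)]
closed_eye = [(0, 0), (1, 0.2), (2, 0.2), (3, 0), (2, -0.2), (1, -0.2)]
BLINK_THRESHOLD = 0.2  # commonly used cutoff; tuned per subject/camera

print(eye_aspect_ratio(open_eye) > BLINK_THRESHOLD)    # True: eye open
print(eye_aspect_ratio(closed_eye) > BLINK_THRESHOLD)  # False: blink
```

In a live pipeline, one filter instance per landmark coordinate would be called with each frame's timestamp, and the smoothed values would then be mapped onto MetaHuman controls in the Unreal Engine Blueprint.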

References

  1. Yihao Zhang, Xiangzhen He, Yerong Hu, Jia Zeng, Huaiyuan Yang, and Shuaihang Zhou, "Face animation making method based on facial motion capture," in 2021 IEEE International Conference on Emergency Science and Information Technology (ICESIT), pp. 84-88, IEEE, 2021. DOI: https://doi.org/10.1109/icesit53460.2021.9696547.
  2. Xiaoting Wang, Lu Wang, and Guosheng Wu, "Body and Face Animation Based on Motion Capture," International Journal of Information Engineering and Electronic Business, Vol. 3, No. 2, pp. 28, 2011. DOI: https://doi.org/10.5815/ijieeb.2011.02.04.
  3. Joel McKim, "Animation without animators: from motion capture to MetaHumans," Animation Studies 2.0, 2022.
  4. Chris Bregler, "Motion capture technology for entertainment [in the spotlight]," IEEE Signal Processing Magazine, Vol. 24, No. 6, pp. 160-158, 2007. DOI: https://doi.org/10.1109/msp.2007.4317482.
  5. Nicole Dagnes, Federica Marcolin, Enrico Vezzetti, Francois-Regis Sarhan, Stephanie Dakpe, Frederic Marin, Francesca Nonis, and Khalil Ben Mansour, "Optimal marker set assessment for motion capture of 3D mimic facial movements," Journal of Biomechanics, Vol. 93, pp. 86-93, 2019. DOI: https://doi.org/10.1016/j.jbiomech.2019.06.012.
  6. Bernd Bickel, Mario Botsch, Roland Angst, Wojciech Matusik, Miguel Otaduy, Hanspeter Pfister, and Markus Gross, "Multi-scale capture of facial geometry and motion," ACM Transactions on Graphics (TOG), Vol. 26, No. 3, pp. 33-es, 2007.
  7. Demetri Terzopoulos and Keith Waters, "Techniques for realistic facial modeling and animation," in Computer Animation'91, pp. 59-74, Springer Japan, 1991. DOI: https://doi.org/10.1007/978-4-431-66890-9_5.
  8. Stefano Corazza, Lars Mundermann, Emiliano Gambaretto, Giancarlo Ferrigno, and Thomas P. Andriacchi, "Markerless motion capture through visual hull, articulated icp and subject specific model generation," International Journal of Computer Vision, Vol. 87, pp. 156-169, 2010. DOI: https://doi.org/10.1007/s11263-009-0284-3.
  9. M. Rahul, "Review on motion capture technology," Global Journal of Computer Science and Technology, Vol. 18, No. 1, pp. 23-26, 2018.
  10. Bradley Scott, Martin Seyres, Fraser Philp, Edward K. Chadwick, and Dimitra Blana, "Healthcare applications of single camera markerless motion capture: a scoping review," PeerJ, Vol. 10, e13517, 2022. DOI: https://doi.org/10.7717/peerj.13517.
  11. Carlos Vilchis, Sharon Ramirez, Armando Rodriguez, and Miguel Gonzalez, "Driving the future faces: Benchmarking state-of-the-art facial tracking technology for Digital Humans," 2022.
  12. Tereza Soukupova and Jan Cech, "Real-time eye blink detection using facial landmarks," in 21st Computer Vision Winter Workshop (CVWW), pp. 1-8, 2016.
  13. Gery Casiez, Nicolas Roussel, and Daniel Vogel, "1€ filter: a simple speed-based low-pass filter for noisy input in interactive systems," in Proceedings of the SIGCHI Conference on Human Factors in Computing Systems, pp. 2527-2530, 2012. DOI: https://doi.org/10.1145/2207676.2208639.