• Title/Summary/Keyword: Facial Avatar


Spectrum-Based Color Reproduction Algorithm for Makeup Simulation of 3D Facial Avatar

  • Jang, In-Su; Kim, Jae Woo; You, Ju-Yeon; Kim, Jin Seo
    • ETRI Journal / v.35 no.6 / pp.969-979 / 2013
  • Various simulation applications for hair, clothing, and makeup of a 3D avatar can provide more useful information to users before they select a hairstyle, clothes, or cosmetics. To enhance their reality, the shapes, textures, and colors of the avatars should be similar to those found in the real world. For a more realistic 3D avatar color reproduction, this paper proposes a spectrum-based color reproduction algorithm and color management process with respect to the implementation of the algorithm. First, a makeup color reproduction model is estimated by analyzing the measured spectral reflectance of the skin samples before and after applying the makeup. To implement the model for a makeup simulation system, the color management process controls all color information of the 3D facial avatar during the 3D scanning, modeling, and rendering stages. During 3D scanning with a multi-camera system, spectrum-based camera calibration and characterization are performed to estimate the spectrum data. During the virtual makeup process, the spectrum data of the 3D facial avatar is modified based on the makeup color reproduction model. Finally, during 3D rendering, the estimated spectrum is converted into RGB data through gamut mapping and display characterization.
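The final rendering stage described above, converting an estimated spectrum into RGB, can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the Gaussian curves stand in for the tabulated CIE 1931 color matching functions, the illuminant is equal-energy, gamut mapping is reduced to naive clipping, and the skin reflectance curve is a toy example.

```python
import numpy as np

# Wavelength samples across the visible range (nm), 10 nm steps.
wl = np.arange(400, 701, 10, dtype=float)
dl = 10.0

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

# Crude Gaussian stand-ins for the CIE 1931 color matching functions
# (the real CMFs are tabulated; these are illustrative only).
x_bar = 1.06 * gauss(wl, 600, 37) + 0.36 * gauss(wl, 446, 20)
y_bar = 1.00 * gauss(wl, 556, 45)
z_bar = 1.78 * gauss(wl, 449, 23)

def spectrum_to_rgb(reflectance, illuminant=None):
    """Integrate reflectance * illuminant against the CMFs, then map XYZ to RGB."""
    E = np.ones_like(wl) if illuminant is None else illuminant  # equal-energy light
    S = reflectance * E
    X, Y, Z = ((S * cmf).sum() * dl for cmf in (x_bar, y_bar, z_bar))
    norm = (E * y_bar).sum() * dl           # normalize so a perfect white gives Y = 1
    xyz = np.array([X, Y, Z]) / norm
    M = np.array([[ 3.2406, -1.5372, -0.4986],   # XYZ -> linear sRGB
                  [-0.9689,  1.8758,  0.0415],
                  [ 0.0557, -0.2040,  1.0570]])
    rgb = np.clip(M @ xyz, 0.0, 1.0)        # naive gamut clipping, not real gamut mapping
    return rgb ** (1 / 2.2)                 # approximate display characterization (gamma)

# Toy skin-like reflectance: low in the blue, rising toward long wavelengths.
skin = 0.25 + 0.4 / (1 + np.exp(-(wl - 580) / 25))
print(spectrum_to_rgb(skin))
```

The same pipeline shape applies when the makeup model modifies the spectral reflectance: the change happens in `skin` before the integration, so the display conversion stays untouched.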

Makeup Design and the Application of 3D Facial Avatar Makeup Simulation

  • Barng, Keejung
    • Journal of Fashion Business / v.18 no.6 / pp.57-66 / 2014
  • The purpose of this study is to design appropriate digital tools for the production of makeup designs. The study used a three-dimensional facial avatar simulation program developed by the Electronics and Telecommunications Research Institute (ETRI), and is based on the creation of three-dimensional CG digital makeup art for a facial avatar using simulation technology. First, the actual application and the tools for digital optimization and media features were examined and organized. Second, the formative elements of oriental colors were applied as the theoretical background for the design process. Makeup design elements include point, line, surface, color, and texture; effective makeup design was interpreted as resting on the representation of these elements according to the design principles of balance, proportion, rhythm, repetition, emphasis, contrast, harmony, and unity. In Asia, design draws on the visibility of red, blue, black, yellow, and white, the colors of the five elements, and on the use of points, lines, and shapes. The study examines how digital simulation and various three-dimensional designs can serve a wide range of applications, and how the findings can be disseminated through media as basic research. It applies the characteristics of the limited existing stereoscopic three-dimensional digital simulation programs to provide an empirical basis for implementing this research in a meaningful way. A follow-up study is needed to extend these findings and their theoretical foundation through continuous observation and in-depth technical development and research.

Realtime Facial Expression Control of 3D Avatar by PCA Projection of Motion Data (모션 데이터의 PCA투영에 의한 3차원 아바타의 실시간 표정 제어)

  • Kim Sung-Ho
    • Journal of Korea Multimedia Society / v.7 no.10 / pp.1478-1484 / 2004
  • This paper presents a method that controls the facial expression of a 3D avatar in real time by having the user select a sequence of facial expressions in a space of facial expressions. The expression space is created from about 2,400 frames of facial expressions. To represent the state of each expression, we use a distance matrix that holds the distances between pairs of feature points on the face; the set of distance matrices serves as the space of expressions. The facial expression of the 3D avatar is controlled in real time as the user navigates this space. To support navigation, we visualize the expression space in 2D using a Principal Component Analysis (PCA) projection. To evaluate the system, we had users control the facial expressions of a 3D avatar with it, and this paper reports the results.

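The core of the approach above, turning each frame's feature points into a pairwise-distance descriptor and projecting the collection to 2D with PCA, can be sketched as follows. The feature-point trajectories here are synthetic stand-ins; frame counts, point counts, and noise levels are illustrative, not from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

n_frames, n_points = 200, 10          # toy stand-ins for ~2,400 mocap frames
# Toy feature-point data: a neutral face plus per-frame perturbations.
base = rng.normal(size=(n_points, 3))
frames = base[None] + 0.05 * rng.normal(size=(n_frames, n_points, 3))

def distance_features(pts):
    """Upper triangle of the pairwise distance matrix, one vector per frame."""
    d = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
    iu = np.triu_indices(len(pts), k=1)
    return d[iu]

X = np.stack([distance_features(f) for f in frames])   # (frames, 45) feature matrix

# PCA via SVD: center, then project the expression space onto its
# first two principal components to get a navigable 2-D layout.
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
coords2d = Xc @ Vt[:2].T
print(coords2d.shape)
```

Navigating the 2D layout then amounts to picking nearby `coords2d` points and playing back the corresponding frames; the distance representation makes the space invariant to the head's rigid position.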

A Comic Facial Expression Using Cheeks and Jaws Movements for Intelligent Avatar Communications (지적 아바타 통신에서 볼과 턱 움직임을 사용한 코믹한 얼굴 표정)

  • ;;Yoshinao Aoki
    • Proceedings of the IEEK Conference / 2001.06c / pp.121-124 / 2001
  • In this paper, a method of generating facial-gesture CG animation on different avatar models is provided. First, to edit emotional expressions efficiently, the comic expression is regenerated on different polygonal mesh models, where the movements of the cheeks and jaws are modeled with numerical methods. Experimental results show that the method could be used for intelligent avatar communications between Korea and Japan.


Understanding the Importance of Presenting Facial Expressions of an Avatar in Virtual Reality

  • Kim, Kyulee; Joh, Hwayeon; Kim, Yeojin; Park, Sohyeon; Oh, Uran
    • International Journal of Advanced Smart Convergence / v.11 no.4 / pp.120-128 / 2022
  • While online social interactions have become more prevalent with the increased popularity of Metaverse platforms, little is known about the effects of facial expressions in virtual reality (VR), even though they are known to play a key role in social contexts. To understand the importance of presenting the facial expressions of a virtual avatar under different contexts, we conducted a user study with 24 participants who were asked to have a conversation and play a game of charades with an avatar, with and without facial expressions. The results show that participants tended to gaze at the face region for the majority of the time when having a conversation, or when trying to guess emotion-related keywords while playing charades, regardless of the presence of facial expressions. Still, we confirmed that participants prefer to see facial expressions in virtual reality, as in real-world scenarios, because they help them better understand the context and have more immersive and focused experiences.

Realistic Expression Factor to Visual Presence of Virtual Avatar in Eye Reflection (가상 아바타의 각막면에 비친 반사영상의 시각적 실재감에 대한 실감표현 요소)

  • Won, Myoung Ju; Lee, Eui Chul; Whang, Min-Cheol
    • The Journal of the Korea Contents Association / v.13 no.7 / pp.9-15 / 2013
  • In the VnR (Virtual and Real Worlds) convergence of recent virtual reality, the modeling of a realistic human face has focused on facial appearance, such as the shape of facial parts and muscle movement. However, facial changes caused by environmental factors, beyond these appearance factors, can also be important for effectively representing a virtual avatar. This study therefore evaluates users' visual responses to variation in the opacity of the eye reflection of a virtual avatar, considered here as a new parameter for representing a realistic avatar. The experimental results showed that a clearer eye reflection induced a more realistic visual impression in subjects. This result can serve as a basis for designing realistic virtual avatars by supplying a new visual realism factor (eye reflection) and its degree of representation (reflectance ratio).
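The opacity parameter studied above can be thought of as a simple alpha blend between the cornea's base color and the reflected scene. The sketch below shows that blend; the function name, the colors, and the opacity values are illustrative assumptions, not the study's rendering pipeline.

```python
import numpy as np

def composite_reflection(cornea, reflection, opacity):
    """Alpha-blend an environment reflection over the cornea color.

    opacity in [0, 1]: 0 = no reflection visible, 1 = fully mirrored.
    """
    cornea = np.asarray(cornea, dtype=float)
    reflection = np.asarray(reflection, dtype=float)
    return (1.0 - opacity) * cornea + opacity * reflection

dark_iris = [0.15, 0.10, 0.08]   # toy RGB for the eye's base color
window = [0.90, 0.90, 1.00]      # toy RGB for a bright reflected scene element

# Sweeping opacity reproduces the stimulus variation used in the study's setup.
for a in (0.0, 0.3, 0.6):
    print(a, composite_reflection(dark_iris, window, a))
```

Sweeping `opacity` from 0 toward 1 is the "reflectance ratio" knob: the finding above suggests higher values (a clearer reflection) read as more realistic.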

A Generation Methodology of Facial Expressions for Avatar Communications (아바타 통신에서의 얼굴 표정의 생성 방법)

  • Kim Jin-Yong; Yoo Jae-Hwi
    • Journal of the Korea Society of Computer and Information / v.10 no.3 s.35 / pp.55-64 / 2005
  • An avatar can be used as an auxiliary means of text and image communication in cyberspace. An intelligent communication method can also be used to achieve real-time communication, in which intelligently coded data (joint angles for arm gestures and action units for facial emotions) are transmitted instead of real or compressed pictures. In this paper, to support the action of arm and leg gestures, a method of generating facial expressions that can represent the sender's emotions is provided. A facial expression can be represented by Action Units (AUs); we suggest a methodology for finding appropriate AUs in avatar models of various shapes and structures. To maximize the efficiency of emotional expression, a comic-style facial model having only eyebrows, eyes, a nose, and a mouth is employed. The generation of facial emotion animation from these parameters is also investigated.

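The idea of driving a simplified comic-style face from Action Units can be sketched as a small weighted mapping. Everything here is illustrative: the AU names follow the standard FACS numbering, but the per-region weights, the emotion table, and the four-region face (brow, eye, nose, mouth) are assumptions, not the paper's model.

```python
import numpy as np

# Hypothetical AU -> per-region deformation weights for a comic-style face
# that has only eyebrows, eyes, a nose, and a mouth.
# Column order: [brow, eye, nose, mouth]; values are illustrative.
AU_BLENDSHAPES = {
    "AU1_inner_brow_raise": np.array([0.0, 0.8, 0.0, 0.0]),
    "AU4_brow_lower":       np.array([0.9, 0.2, 0.0, 0.0]),
    "AU12_lip_corner_pull": np.array([0.0, 0.1, 0.0, 1.0]),
}

# Emotion -> AU activation strengths (illustrative values).
EMOTIONS = {
    "happy": {"AU12_lip_corner_pull": 1.0, "AU1_inner_brow_raise": 0.3},
    "angry": {"AU4_brow_lower": 1.0},
}

def emotion_to_blendshapes(emotion):
    """Combine weighted AUs into per-region deformation weights, clipped to [0, 1]."""
    total = np.zeros(4)
    for au, strength in EMOTIONS[emotion].items():
        total += strength * AU_BLENDSHAPES[au]
    return np.clip(total, 0.0, 1.0)

print(emotion_to_blendshapes("happy"))
```

Because only the AU activations are transmitted, each receiving client can keep its own `AU_BLENDSHAPES` table tuned to its avatar model's shape and structure, which is the point of coding emotions intelligently rather than sending pictures.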

Gesture Communications Between Different Avatar Models Using FBML (FBML을 이용한 서로 다른 아바타 모델간의 제스처 통신)

  • ;;Yoshiki Arakawa
    • Proceedings of the IEEK Conference / 2003.11b / pp.57-60 / 2003
  • To overcome the limitations imposed by different avatar models, in this paper we propose gesture communication between different avatar models using FBML (Facial Body Markup Language). The experimental results demonstrate that the proposed method could serve as an efficient means of overcoming this limitation.

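The abstract does not specify FBML's actual schema, so the sketch below only illustrates the general idea: a model-independent XML description of a gesture and facial state that each client maps onto its own avatar. All tag and attribute names here are hypothetical.

```python
import xml.etree.ElementTree as ET

def build_gesture_message(gesture, action_units):
    """Build a hypothetical FBML-style message describing a gesture plus
    facial Action Unit activations, independent of any concrete avatar model."""
    msg = ET.Element("fbml")
    body = ET.SubElement(msg, "body")
    ET.SubElement(body, "gesture", name=gesture)
    face = ET.SubElement(msg, "face")
    for au, weight in action_units.items():
        ET.SubElement(face, "au", id=au, weight=str(weight))
    return ET.tostring(msg, encoding="unicode")

# Sender encodes the intent; each receiver maps it to its own mesh and rig.
xml_text = build_gesture_message("wave", {"AU12": "1.0"})
print(xml_text)
```

Transmitting such a description instead of model-specific vertex data is what lets two clients with entirely different avatar meshes exchange gestures.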

A Study on Interactive Avatar in Mobile device using facial expression of Animation Character (모바일 기기에서 애니메이션 캐릭터의 얼굴표현을 이용한 인터랙티브 아바타에 관한 연구)

  • Oh Jeong-Seok; Youn Ho-Chang; Jeon Hong-Jun
    • Proceedings of the Korea Contents Association Conference / 2005.05a / pp.229-236 / 2005
  • This paper is a study of an emotional interactive avatar on a cellular phone. When the user asks the avatar what he or she wants to know, it answers with a facial expression based on an animation character, so the user can relate to the avatar in a friendlier way.


Action-Based Audit with Relational Rules to Avatar Interactions for Metaverse Ethics

  • Bang, Junseong; Ahn, Sunghee
    • Smart Media Journal / v.11 no.6 / pp.51-63 / 2022
  • Metaverse provides a simulated environment where a large number of users can participate in various activities. For Metaverse to be sustainable, it is necessary to study ethics that can be applied to a Metaverse service platform. In this paper, Metaverse ethics and the rules for applying them to the platform are explored. To judge the ethicality of avatar actions in a social Metaverse, the identity, interaction, and relationship of an avatar are investigated. An action-based audit approach to avatar interactions (e.g., dialogues, gestures, facial expressions) is then introduced for two cases: when an avatar enters a digital world, and when an avatar requests an audit of other subjects, e.g., avatars controlled by human users, artificial intelligence (AI) avatars (e.g., conversational bots), and virtual objects. Pseudocode for performing the two cases in a system is presented and examined based on the description of the avatars' actions.
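The paper presents its audit approach as pseudocode; a minimal Python sketch of the underlying idea, checking an avatar's actions against relational rules, might look as follows. The rule table, action fields, and content labels are all illustrative assumptions, not the paper's actual rules.

```python
# Relational rules: (action type, relationship to the target) -> forbidden
# content labels. The table is a toy example, not the paper's rule set.
FORBIDDEN = {
    ("gesture", "stranger"): {"obscene_gesture"},
    ("dialogue", "stranger"): {"harassment", "hate_speech"},
}

def audit_action(action_type, content_label, relationship):
    """Return True if the action passes the relational rules, False otherwise.

    Content classification (e.g., labeling a dialogue as "harassment") is
    assumed to happen upstream; the audit only applies the rules.
    """
    banned = FORBIDDEN.get((action_type, relationship), set())
    return content_label not in banned

# A short action log: each entry is (action type, content label, relationship).
log = [
    ("dialogue", "greeting", "stranger"),
    ("gesture", "obscene_gesture", "stranger"),
]
verdicts = [audit_action(*action) for action in log]
print(verdicts)
```

Keying the rules on the relationship between the avatars is what makes the audit relational rather than a flat content filter: the same gesture may be acceptable between friends but flagged toward a stranger.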