• Title/Summary/Keyword: Facial Color Control (얼굴 색 제어)

5 search results

Facial Color Control based on Emotion-Color Theory (정서-색채 이론에 기반한 게임 캐릭터의 동적 얼굴 색 제어)

  • Park, Kyu-Ho; Kim, Tae-Yong
    • Journal of Korea Multimedia Society, v.12 no.8, pp.1128-1141, 2009
  • Graphical expressions are continuously improving, spurred by the astonishing growth of the game technology industry. Despite such improvements, users still demand a more natural gaming environment and true reflections of human emotions. In real life, people can read a person's mood from facial color and expression. Hence, interactive facial colors in game characters provide a deeper level of reality. In this paper we propose a facial color adaptive technique that combines an emotional model based on human emotion theory, emotional expression patterns using the colors of animation contents, and an emotional reaction speed function based on human personality theory, as opposed to past methods that expressed emotion through blood flow, pulse, or skin temperature. Experiments show that expression through the Facial Color Model, based on the facial color adaptive technique and the expression patterns of animation contents, is effective in conveying character emotions. Moreover, the proposed facial color adaptive technique can be applied not only to 2D games but to 3D games as well.

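As a rough illustration of the kind of mechanism this abstract describes, a per-frame facial color update whose speed depends on a personality parameter, the sketch below may help. The emotion palette, the trait-to-speed mapping, and all constants are illustrative assumptions, not values from the paper:

```python
# Hypothetical facial color adaptive step: the paper's actual emotion model,
# color palette, and reaction-speed function are not given in the abstract,
# so every constant here is an illustrative assumption.

# Illustrative emotion -> target RGB facial tint (assumed, not from the paper).
EMOTION_COLORS = {
    "anger":   (255, 80, 80),    # reddish flush
    "fear":    (170, 170, 200),  # pale bluish
    "neutral": (224, 172, 145),  # base skin tone
}

def reaction_speed(extraversion: float) -> float:
    """Map a 0..1 personality trait to a per-frame interpolation rate."""
    return 0.05 + 0.25 * extraversion

def step_facial_color(current, emotion, extraversion, dt=1.0):
    """Move the current face color one frame toward the emotion's target tint."""
    target = EMOTION_COLORS[emotion]
    k = min(1.0, reaction_speed(extraversion) * dt)
    return tuple(c + k * (t - c) for c, t in zip(current, target))
```

Calling `step_facial_color` once per rendered frame gives a gradual color shift whose speed varies with the personality trait, which is the general shape of an "emotional reaction speed function".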
Real-time Expression Control of Vision Based 3 Dimensional Face Model (비전 기반 3차원 얼굴 모델의 실시간 표정 제어)

  • 김정기;민경필;전준철
    • Proceedings of the Korean Information Science Society Conference, 2004.10b, pp.748-750, 2004
  • This paper studies a method for controlling the expression of a 3D face model in real time by extracting facial feature regions from continuously input 2D face images. Hue and Saturation values are used to detect the face in a 2D image: the face region is extracted by separating skin color from background color using these two color values. From the extracted face, the positions of feature regions such as the eyes, nose, and lips are located with extraction methods suited to each region, and motion information for each region is obtained by comparing the regions frame by frame. Applying this information to a 3D face model allows the facial expression of the subject captured in the 2D video to be reproduced on the 3D face model in real time.

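The skin/background separation step described above can be sketched with simple Hue and Saturation thresholds. The ranges below are common rule-of-thumb values for OpenCV-style HSV encoding (H in 0..179, S in 0..255), not the thresholds used by the authors:

```python
import numpy as np

# Minimal sketch of separating skin from background using Hue and
# Saturation; the threshold ranges are assumed rule-of-thumb values.

def skin_mask(hsv: np.ndarray,
              hue_range=(0, 25), sat_range=(40, 200)) -> np.ndarray:
    """Return a boolean mask of skin-colored pixels in an HSV image
    (H in 0..179 as in OpenCV, S and V in 0..255)."""
    h, s = hsv[..., 0], hsv[..., 1]
    return ((h >= hue_range[0]) & (h <= hue_range[1]) &
            (s >= sat_range[0]) & (s <= sat_range[1]))
```

The face region would then be taken as the largest connected component of this mask; the paper's exact post-processing is not specified in the abstract.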
Color and Blinking Control to Support Facial Expression of Robot for Emotional Intensity (로봇 감정의 강도를 표현하기 위한 LED 의 색과 깜빡임 제어)

  • Kim, Min-Gyu; Lee, Hui-Sung; Park, Jeong-Woo; Jo, Su-Hun; Chung, Myung-Jin
    • Proceedings of the HCI Society of Korea Conference, 2008.02a, pp.547-552, 2008
  • Humans and robots will have closer relations in the future, and we can expect the interaction between them to become more intense. To take advantage of people's innate communication abilities, researchers have so far concentrated on facial expression. But for a robot to express emotional intensity, other modalities such as gesture, movement, sound, and color are also needed. This paper suggests that the intensity of emotion can be expressed with color and blinking, so that the result can be applied to LEDs. Color and emotion are clearly related; however, previous results are difficult to implement due to the lack of quantitative data. In this paper, we determined the color and blinking period to express the six basic emotions (anger, sadness, disgust, surprise, happiness, fear). These were implemented on an avatar, and the intensities of the emotions were evaluated through a survey. We found that color and blinking helped to express the intensity of emotion for sadness, disgust, and anger. For fear, happiness, and surprise, color and blinking did not play an important role; however, they may be improved by adjusting the color or blinking.

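As an illustration of mapping an emotion and its intensity to an LED color and blinking period, here is a minimal sketch. The base colors and the period bounds are assumptions for illustration, not the quantitative values derived in the paper:

```python
# Hedged sketch: map one of the six basic emotions plus an intensity in
# 0..1 to an LED color and blinking period. Base colors and period bounds
# are illustrative assumptions.

BASE_COLORS = {
    "anger": (255, 0, 0), "sadness": (0, 0, 255), "disgust": (0, 128, 0),
    "surprise": (255, 255, 0), "happiness": (255, 128, 0), "fear": (128, 0, 128),
}

def led_signal(emotion: str, intensity: float):
    """Scale color brightness with intensity; blink faster as intensity grows."""
    r, g, b = BASE_COLORS[emotion]
    scale = 0.3 + 0.7 * intensity      # keep the LED visible at low intensity
    period_s = 2.0 - 1.5 * intensity   # 2.0 s (calm) down to 0.5 s (intense)
    color = tuple(int(c * scale) for c in (r, g, b))
    return color, period_s
```

The returned color would drive an RGB LED's duty cycles and the period a blink timer; the paper evaluates such mappings per emotion through a user survey.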
Development of Smart-Car Safety Management System Focused on Drunk Driving Control (음주제어를 중심으로 한 스마트 자동차 안전 관리 시스템 개발)

  • Lee, Se-Hwan;Cho, Dong-Uk
    • The Journal of Korean Institute of Communications and Information Sciences, v.37 no.7C, pp.565-575, 2012
  • In modern everyday life, cars account for the largest share of devices requiring smart features, and a variety of smart devices and methods for them have been developed. As part of a smart-car safety management system centered on controlling drunk and drowsy driving, this paper develops a system that automatically prevents the car from being driven if the driver has been drinking when the car is started. For this, a method is proposed that analyzes the facial color of the person in the driver's seat through image processing to determine whether they have been drinking. In particular, the developed system determines drinking solely from the characteristic color changes that appear in the face after drinking, without needing a pre-drinking reference image of the face, so it can be applied effectively to the alcohol-control core of a smart-car safety management system. The experiment was conducted on 30 subjects whose facial color changed after drinking, and a statistical significance analysis of the experimental results verifies the effectiveness of the proposed method.
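The core idea of judging drinking from facial color alone can be sketched as a simple redness statistic over the face region. The statistic and the threshold below are illustrative assumptions; the paper's actual color analysis is not specified in the abstract:

```python
import numpy as np

# Illustrative sketch: decide drinking from the characteristic reddening
# of the face alone, without a pre-drinking reference image. The redness
# statistic and threshold are assumptions, not the paper's method.

def redness_index(face_rgb: np.ndarray) -> float:
    """Mean excess of the R channel over the G/B average across the face."""
    r = face_rgb[..., 0].astype(float)
    gb = face_rgb[..., 1:3].astype(float).mean(axis=-1)
    return float((r - gb).mean())

def looks_drunk(face_rgb: np.ndarray, threshold: float = 40.0) -> bool:
    """Flag a face crop whose redness exceeds the (assumed) threshold."""
    return redness_index(face_rgb) > threshold
```

In a real system the threshold would have to be calibrated against ground truth (e.g. breathalyzer readings), which is presumably what the paper's 30-subject experiment and significance analysis address.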

Direction Recognition of Tongue through Pixel Distribution Estimation after Preprocessing Filtering (전처리 필터링 후 픽셀 분포 평가를 통한 혀 방향 인식)

  • Kim, Chang-dae; Lee, Jae-sung
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference, 2013.10a, pp.73-76, 2013
  • This paper proposes a tongue and tongue-direction recognition algorithm that compares and evaluates the pixel distribution in the mouth area. As smartphones grow in size, facial-gesture control technology for them is needed. First, the nose area is detected, and the mouth area is then detected based on the nose-to-mouth ratio. After detection, the mouth area is divided into a grid pattern, and the distribution of pixels with colors similar to the tongue is evaluated for each segment. The recognition rate was nearly 80% in experiments performed with five researchers from our laboratory.

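The grid-based vote described in the abstract can be sketched as follows, assuming a binary mask of tongue-colored pixels and a simplified 1x3 grid; the paper's actual grid pattern and tongue-color test are not given in the abstract:

```python
import numpy as np

# Sketch of the grid-vote step: split the mouth ROI into three columns and
# report the direction of the column holding the most tongue-colored pixels.
# The 1x3 grid is a simplifying assumption.

def tongue_direction(mask: np.ndarray) -> str:
    """mask: boolean array over the mouth ROI, True where a pixel matches
    the tongue color; returns the column with the largest pixel count."""
    cols = np.array_split(mask, 3, axis=1)          # left / center / right
    counts = [int(c.sum()) for c in cols]
    return ("left", "center", "right")[int(np.argmax(counts))]
```

Extending the grid to a finer pattern (and adding up/down rows) follows the same counting idea, which is presumably how the full algorithm distinguishes more directions.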