• Title/Summary/Keyword: 주의 깜빡임 (attentional blink)

Search Results: 6

Color and Blinking Control to Support Facial Expression of Robot for Emotional Intensity (로봇 감정의 강도를 표현하기 위한 LED의 색과 깜빡임 제어)

  • Kim, Min-Gyu; Lee, Hui-Sung; Park, Jeong-Woo; Jo, Su-Hun; Chung, Myung-Jin
    • Proceedings of the HCI Society of Korea Conference / 2008.02a / pp.547-552 / 2008
  • Humans and robots will have a closer relationship in the future, and we can expect their interaction to become more intense. To take advantage of people's innate communication abilities, researchers have so far concentrated on facial expression. But for a robot to express emotional intensity, other modalities such as gesture, movement, sound, and color are also needed. This paper suggests that the intensity of emotion can be expressed with color and blinking, so that the result can be applied to LEDs. Color and emotion are clearly related; however, previous results are difficult to implement because of the lack of quantitative data. In this paper, we determined the color and blinking period to express the six basic emotions (anger, sadness, disgust, surprise, happiness, fear). The scheme was implemented on an avatar, and the intensities of the emotions were evaluated through a survey. We found that color and blinking helped to express the intensity of emotion for sadness, disgust, and anger. For fear, happiness, and surprise, color and blinking did not play an important role; however, they may be improved by adjusting the color or blinking.
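To make the idea concrete, a minimal sketch of an emotion-to-color/blink mapping follows. The RGB values, blink periods, and the intensity-scaling rule are illustrative placeholders, not the quantitative values determined in the paper.

```python
# Illustrative sketch only: colors and blink periods below are placeholders,
# not the values measured in the paper.
from dataclasses import dataclass

@dataclass
class EmotionCue:
    color: tuple           # base RGB, 0-255
    blink_period_s: float  # seconds per on/off cycle

# Hypothetical lookup table for the six basic emotions.
CUES = {
    "anger":     EmotionCue((255, 0, 0),   0.3),
    "sadness":   EmotionCue((0, 0, 180),   2.0),
    "disgust":   EmotionCue((90, 140, 30), 1.5),
    "surprise":  EmotionCue((255, 200, 0), 0.5),
    "happiness": EmotionCue((255, 120, 0), 0.8),
    "fear":      EmotionCue((120, 0, 160), 0.4),
}

def led_command(emotion: str, intensity: float) -> dict:
    """Scale brightness and blink rate with emotional intensity (0.0-1.0)."""
    cue = CUES[emotion]
    r, g, b = (round(c * intensity) for c in cue.color)
    # Assumed rule: stronger emotion -> faster blinking (shorter period), clamped at 0.1 s.
    period = max(0.1, cue.blink_period_s * (1.5 - intensity))
    return {"rgb": (r, g, b), "blink_period_s": period}

print(led_command("anger", 0.9))
```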


Real-time Notification System of Webcam Monitor for Preventing Computer Vision Syndrome (컴퓨터시각증후군 예방을 위한 웹캠모니터의 실시간알림 시스템)

  • Ha, Sangwon; Yoo, Dohyeob; Moon, Mikyeong
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference / 2015.05a / pp.754-755 / 2015
  • Computer Vision Syndrome (CVS) is a temporary condition that results from focusing the eyes on a computer display for protracted, uninterrupted periods of time. To prevent CVS, users need to blink frequently and keep an adequate distance from the monitor. This paper describes a real-time notification system for preventing CVS that uses the monitor's webcam to check, in real time, the user's distance from the screen and blink frequency.
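The abstract outlines the monitoring loop without implementation details; the sketch below shows one plausible way to approximate it with OpenCV Haar cascades. The distance proxy (face width in pixels), both thresholds, and the blink heuristic (a transition from detected to undetected eyes) are assumptions, not the authors' method.

```python
# A minimal sketch, not the authors' implementation: approximate distance from face
# size and count blinks from frames in which no eyes are detected inside the face.
import time
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade  = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

REF_FACE_WIDTH_PX = 180   # hypothetical face width in pixels at a comfortable distance
MIN_BLINKS_PER_MIN = 12   # hypothetical threshold below which a reminder is shown

cap = cv2.VideoCapture(0)
blink_count, eyes_were_open, start = 0, True, time.time()

while time.time() - start < 60:                   # monitor one minute per cycle
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, 1.3, 5)
    if len(faces) == 0:
        continue
    x, y, w, h = max(faces, key=lambda f: f[2])   # largest detected face
    if w > REF_FACE_WIDTH_PX:
        print("Too close to the monitor - move back.")
    eyes = eye_cascade.detectMultiScale(gray[y:y + h, x:x + w])
    eyes_open = len(eyes) > 0
    if eyes_were_open and not eyes_open:          # open -> closed transition counts as a blink
        blink_count += 1
    eyes_were_open = eyes_open

cap.release()
if blink_count < MIN_BLINKS_PER_MIN:
    print(f"Only {blink_count} blinks in the last minute - remember to blink.")
```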


A Test of Attentional Blink: Hemifield Independence and Interaction (주의 깜박임 현상의 검증: 주의 자원의 반시야 독립성과 상호작용)

  • Kim, Jung-Yul; Lee, Guk-Hee; Lee, Hyung-Chul O.; Kim, ShinWoo
    • Science of Emotion and Sensibility / v.20 no.2 / pp.127-136 / 2017
  • Attentional blink is observed in identification tasks with multiple targets during rapid serial visual presentation (RSVP): performance on a second target (T2) that follows within 500 ms of the first (T1) decreases systematically, although performance on T1 remains highly accurate. Theories accounting for attentional blink fall into two broad categories: resource-depletion models and disruption-of-input-filter models. Meanwhile, visual attention capacity shows hemifield independence between the left and right visual fields, and many studies have reported a bilateral advantage in a range of visual working memory tasks. The current research tested the two major theories of attentional blink using this hemifield independence of attentional capacity. To this end, we conducted two experiments in which two RSVP streams were presented either bilaterally or unilaterally. Experiment 1 presented two RSVP streams containing both T1 and T2 in either bilateral or unilateral visual fields and tested the interaction between attentional blink and the bilateral advantage. Experiment 2 removed T1 from one of the two streams to test whether attentional blink occurs when identification of T1 and T2 draws on independent pools of attention across the two visual fields. Subjects were more accurate when the two streams were presented in bilateral visual fields (i.e., a bilateral advantage), although there was no interaction between attentional blink and the bilateral advantage (Experiment 1). In addition, attentional blink for T2 was observed in a T1-absent stream even when the two streams were presented in bilateral visual fields (Experiment 2). These results support the disruption-of-input-filter model rather than the resource-depletion model.
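The sketch below illustrates how dual-RSVP trials of the kind described (bilateral vs. unilateral stream placement, T1-T2 lags inside and outside the 500 ms window) might be specified. The item rate, stream length, lag values, and placement labels are illustrative assumptions, not the parameters used in the experiments.

```python
# Schematic trial generator for a dual-stream RSVP attentional-blink design (assumed values).
import random
import string

ITEM_DURATION_MS = 100      # ~10 items/s, so lags 1-5 span the 500 ms blink window
LAGS = [1, 2, 3, 5]         # T1-T2 separation in items

def make_trial(layout: str, lag: int, stream_len: int = 12) -> dict:
    """layout: 'bilateral' = one stream per hemifield, 'unilateral' = both streams in one hemifield."""
    letters = random.sample(string.ascii_uppercase, k=stream_len * 2)
    streams = {"A": letters[:stream_len], "B": letters[stream_len:]}
    locations = ({"A": "left", "B": "right"} if layout == "bilateral"
                 else {"A": "left-upper", "B": "left-lower"})
    t1_pos = random.randint(3, stream_len - max(LAGS) - 1)
    # T1 and T2 may appear in the same or different streams (an assumption of this sketch).
    t1_stream, t2_stream = random.choice([("A", "B"), ("B", "A"), ("A", "A"), ("B", "B")])
    return {
        "layout": layout,
        "locations": locations,
        "streams": streams,
        "T1": {"stream": t1_stream, "position": t1_pos},
        "T2": {"stream": t2_stream, "position": t1_pos + lag},
        "T1_T2_SOA_ms": lag * ITEM_DURATION_MS,
    }

print(make_trial("bilateral", lag=2))
```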

An Efficient Video Dehazing Without Flickering Artifacts (비디오에서 플리커 현상이 없는 효율적인 안개제거)

  • Kim, Young Min; Park, Ki Tae; Lee, Dong Seok; Choi, Wonju; Moon, Young Shik
    • Journal of the Institute of Electronics and Information Engineers / v.51 no.8 / pp.51-57 / 2014
  • In this paper, we propose a novel method to effectively eliminate the flickering artifacts caused by dehazing video sequences. When a dehazing technique is applied independently to each frame of a video, flickering artifacts may occur because atmospheric light values are calculated without considering the relation between adjacent frames. Although some existing methods reduce flickering by computing highly correlated transmission values between adjacent frames, flickering artifacts may still occur. Therefore, to reduce them effectively, we propose an approach that uses temporal averages of the atmospheric light values calculated from adjacent frames. Experimental results show that the proposed method achieves better video dehazing performance with fewer flickering artifacts than existing methods.
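A simplified dark-channel-prior sketch of the key idea follows: the per-frame atmospheric light is replaced by a temporal average over adjacent frames so that the dehazed output does not flicker. This illustrates the principle only, not the paper's exact algorithm; the patch size, omega, t0, and averaging window are assumed values.

```python
# Sketch: dark-channel dehazing with temporally averaged atmospheric light (assumed parameters).
from collections import deque
import numpy as np
import cv2

def dark_channel(img_u8, patch=15):
    """Per-pixel minimum over channels followed by a local minimum filter."""
    dc = img_u8.min(axis=2)
    return cv2.erode(dc, np.ones((patch, patch), np.uint8))

def atmospheric_light(frame, dc):
    """Mean color of the brightest 0.1% of dark-channel pixels."""
    n = max(1, dc.size // 1000)
    idx = np.argpartition(dc.ravel(), -n)[-n:]
    return frame.reshape(-1, 3)[idx].mean(axis=0)

def dehaze_video(frames, omega=0.95, t0=0.1, window=10):
    """frames: iterable of float32 BGR images in [0, 1]; yields dehazed frames."""
    recent_A = deque(maxlen=window)                 # atmospheric lights of adjacent frames
    for frame in frames:
        dc = dark_channel((frame * 255).astype(np.uint8))
        recent_A.append(atmospheric_light(frame, dc))
        A = np.maximum(np.mean(recent_A, axis=0), 1e-3)   # temporally averaged atmospheric light
        t = 1.0 - omega * dark_channel(((frame / A).clip(0, 1) * 255).astype(np.uint8)) / 255.0
        t = np.maximum(t, t0)[..., None]
        yield ((frame - A) / t + A).clip(0, 1)
```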

Comparative Study on Eye-Tracking Evaluation and Landscape Adjectives Evaluation - Focusing on the Nightscape of a University Campus - (아이트래킹 평가 방법과 경관 형용사 평가 비교 연구 - 대학 캠퍼스 야간경관을 대상으로 -)

  • Kang, Young-Eun; Kim, Song-Yi; Baek, Jae-Bong
    • Journal of the Korean Institute of Landscape Architecture / v.46 no.1 / pp.38-48 / 2018
  • The purpose of this study is to improve the understanding of visual perception and to extend the scope of landscape evaluation by comparing eye-tracking evaluation and landscape-adjective evaluation methods across various types of nightscapes. The study showed that the eye-tracking measures 'blink count', 'fixation duration average', and 'saccade duration average' correlate significantly with the landscape adjectives 'beautiful', 'interesting', 'accessible', 'satisfying', and 'safe'. In addition, the areas of interest (AOIs) varied across the 12 nightscapes, showing that gaze was fixated on certain landscape elements such as 'doors' and 'signs'. These results suggest that eye tracking is an effective tool for evaluating 'landscape elements' rather than the 'whole landscape' and can provide empirical support for landscape preference theories that have so far been presented only conceptually. In this way, the study demonstrates the potential of eye tracking as an objective landscape evaluation technique, and the accumulation of further results can yield specific implications for landscape planning.
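The reported relationships amount to correlating per-scene eye-tracking metrics with per-scene adjective ratings; a minimal sketch of such an analysis is shown below. The synthetic data, column names, and rating scale are hypothetical stand-ins for the study's measured values.

```python
# Sketch: correlating eye-tracking metrics with adjective ratings across scenes (synthetic data).
import numpy as np
import pandas as pd
from scipy.stats import pearsonr

# Hypothetical per-scene means for 12 nightscapes (the study used measured data).
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "blink_count":           rng.uniform(5, 30, 12),
    "fixation_duration_avg": rng.uniform(150, 400, 12),  # ms
    "saccade_duration_avg":  rng.uniform(20, 60, 12),    # ms
    "beautiful":   rng.uniform(1, 7, 12),
    "interesting": rng.uniform(1, 7, 12),
    "accessible":  rng.uniform(1, 7, 12),
    "satisfying":  rng.uniform(1, 7, 12),
    "safe":        rng.uniform(1, 7, 12),
})

metrics    = ["blink_count", "fixation_duration_avg", "saccade_duration_avg"]
adjectives = ["beautiful", "interesting", "accessible", "satisfying", "safe"]

for m in metrics:
    for a in adjectives:
        r, p = pearsonr(df[m], df[a])
        print(f"{m:>22} vs {a:<12} r = {r:+.2f}  (p = {p:.3f})")
```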

Pupil Data Measurement and Social Emotion Inference Technology by using Smart Glasses (스마트 글래스를 활용한 동공 데이터 수집과 사회 감성 추정 기술)

  • Lee, Dong Won; Mun, Sungchul; Park, Sangin; Kim, Hwan-jin; Whang, Mincheol
    • Journal of Broadcast Engineering / v.25 no.6 / pp.973-979 / 2020
  • This study aims to determine the social emotion of empathy objectively and quantitatively by collecting pupillary responses. Fifty-two subjects (26 men, 26 women) voluntarily participated in the experiment. After a 30-second reference measurement, the experiment was divided into an imitation task and a spontaneous self-expression task. Pairs of subjects interacted through facial expressions while their pupil images were recorded. The pupil data were processed with binarization and a circular edge detection algorithm, and an outlier detection and removal technique was used to reject eye blinks. The effect of empathy on pupil size was tested for statistical significance with a normality test and an independent-samples t-test. Pupil size differed significantly between the empathy (M ± SD = 0.050 ± 1.817) and non-empathy (M ± SD = 1.659 ± 1.514) conditions (t(92) = -4.629, p = 0.000). A rule for inferring empathy from pupil size was defined through discriminant analysis and verified on 12 new subjects (6 men and 6 women, mean age ± SD = 22.84 ± 1.57 years), with an estimation accuracy of 75%. The proposed method is a non-contact camera technology and is expected to be used in various virtual reality applications with smart glasses.
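A minimal sketch along the lines of the described pipeline appears below: binarize the eye image, locate the pupil with a circular (Hough) edge detector, reject blink outliers, and compare conditions with a normality check and an independent-samples t-test. The thresholds, the z-score blink criterion, and the synthetic demo samples are assumptions, not the authors' parameters.

```python
# Sketch of the pupil-measurement and comparison pipeline (assumed thresholds, synthetic data).
import numpy as np
import cv2
from scipy import stats

def pupil_radius(gray_eye):
    """Return an estimated pupil radius in pixels, or None if no circle is found."""
    blur = cv2.medianBlur(gray_eye, 5)
    _, binary = cv2.threshold(blur, 60, 255, cv2.THRESH_BINARY_INV)   # pupil is the dark region
    circles = cv2.HoughCircles(binary, cv2.HOUGH_GRADIENT, dp=1, minDist=50,
                               param1=100, param2=15, minRadius=5, maxRadius=60)
    return float(circles[0, 0, 2]) if circles is not None else None

def reject_blinks(values, z=2.5):
    """Drop missing frames and outliers (eye closed or half closed) from a pupil-size series."""
    v = np.array([x for x in values if x is not None], dtype=float)
    return v[np.abs(v - v.mean()) < z * v.std()]

# Synthetic per-trial pupil-size samples standing in for the two conditions.
empathy     = reject_blinks(np.random.normal(0.05, 1.8, 200))
non_empathy = reject_blinks(np.random.normal(1.66, 1.5, 200))

print(stats.shapiro(empathy))                 # normality check
print(stats.ttest_ind(empathy, non_empathy))  # independent-samples t-test
```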