• Title/Summary/Keyword: Human computer interaction


Development of Facial Expression Recognition System based on Bayesian Network using FACS and AAM (FACS와 AAM을 이용한 Bayesian Network 기반 얼굴 표정 인식 시스템 개발)

  • Ko, Kwang-Eun;Sim, Kwee-Bo
    • Journal of the Korean Institute of Intelligent Systems / v.19 no.4 / pp.562-567 / 2009
  • Facial expression is a key mechanism of human emotional interaction and a powerful tool in Human-Robot Interface (HRI) and Human-Computer Interaction (HCI). By reading a facial expression, an HCI system can respond to the emotional state of the user, and a service agent such as an intelligent robot can infer which services suit the user. This article addresses expressive face modeling using an advanced Active Appearance Model (AAM) for facial emotion recognition. We consider the six universal emotion categories defined by Ekman. In the human face, emotions are most visibly expressed by the eyes and mouth, so recognizing emotion from a facial image requires extracting feature points such as Ekman's Action Units (AUs). The AAM is one of the most commonly used methods for facial feature extraction and can be applied to construct AUs. Because the traditional AAM depends on the setting of its initial model parameters, this paper introduces a facial emotion recognition method that combines an advanced AAM with a Bayesian network. First, we obtain reconstruction parameters for a new gray-scale image by sample-based learning, use them to reconstruct the shape and texture of the image, and compute the initial AAM parameters from the reconstructed facial model. We then reduce the distance error between the model and the target contour by adjusting the model parameters, obtaining a model matched to the facial feature outline after several iterations. Finally, the matched features are used to recognize the facial emotion with a Bayesian network.
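The AU-to-emotion inference stage described in this abstract can be sketched as a tiny Bayesian classifier. Everything below (the three example AUs, the probability tables, the uniform prior) is an illustrative assumption, not data from the paper; a real FACS/AAM pipeline would supply the AU evidence.

```python
# Hypothetical sketch of the Bayesian-network stage: given binary Action Unit
# (AU) activations extracted by an AAM fit, score Ekman's six emotion
# categories with a naive-Bayes factorization P(E) * prod P(AU_i | E).
# The AU list and probability tables are made up for illustration.

EMOTIONS = ["happiness", "sadness", "surprise", "anger", "fear", "disgust"]

# P(AU active | emotion) for three example AUs (cheek raiser, lip corner
# puller, brow lowerer); all values are assumed, not from the paper.
P_AU_GIVEN_E = {
    "happiness": [0.9, 0.95, 0.05],
    "sadness":   [0.1, 0.05, 0.60],
    "surprise":  [0.2, 0.10, 0.05],
    "anger":     [0.1, 0.05, 0.90],
    "fear":      [0.3, 0.10, 0.70],
    "disgust":   [0.4, 0.10, 0.60],
}

def posterior(au_active):
    """Return normalized P(emotion | AU evidence) under a uniform prior."""
    scores = {}
    for e in EMOTIONS:
        p = 1.0
        for prob, active in zip(P_AU_GIVEN_E[e], au_active):
            p *= prob if active else (1.0 - prob)
        scores[e] = p
    total = sum(scores.values())
    return {e: p / total for e, p in scores.items()}

post = posterior([True, True, False])  # smiling-like AU evidence
print(max(post, key=post.get))  # → happiness
```

A full Bayesian network would also model dependencies between AUs; the naive factorization here is only the simplest instance of the idea.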

Developing Interactive Game Contents using 3D Human Pose Recognition (3차원 인체 포즈 인식을 이용한 상호작용 게임 콘텐츠 개발)

  • Choi, Yoon-Ji;Park, Jae-Wan;Song, Dae-Hyeon;Lee, Chil-Woo
    • The Journal of the Korea Contents Association / v.11 no.12 / pp.619-628 / 2011
  • Vision-based 3D human pose recognition is commonly used to convey human gestures in HCI (Human-Computer Interaction). Recognition based on a 2D pose model can handle only simple poses in constrained environments, whereas a 3D pose model, which describes the 3D skeletal structure of the human body, can recognize more complex poses because it can exploit joint angles and the shape of each body part. In this paper, we describe the development of interactive game contents that use a pose recognition interface based on 3D body joint information. The system is designed so that users can control the game contents with body motion alone, without any additional equipment. Poses are recognized by comparing the current input pose with predefined pose templates, each consisting of the 3D positions of 14 body joints. We implemented the game contents with our pose recognition system and confirmed the efficiency of the proposed system. In future work, we will improve the system so that poses can be recognized robustly in a wider range of environments.
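The template-matching step described here (current pose vs. predefined 14-joint templates) can be sketched as a nearest-template search over summed joint distances. The template names, coordinates, and acceptance threshold below are illustrative assumptions, not values from the paper.

```python
# Sketch of pose recognition by template matching: a pose is the 3D positions
# of 14 body joints; the input is assigned to the closest template, or to no
# pose if even the closest is beyond a distance threshold (value assumed).
import math

def pose_distance(a, b):
    """Sum of Euclidean distances between corresponding 3D joints."""
    return sum(math.dist(pa, pb) for pa, pb in zip(a, b))

def recognize(pose, templates, threshold=0.5):
    """Return the name of the closest template, or None if none is close."""
    best_name, best_d = None, float("inf")
    for name, tmpl in templates.items():
        d = pose_distance(pose, tmpl)
        if d < best_d:
            best_name, best_d = name, d
    return best_name if best_d <= threshold else None

# Two toy 14-joint templates (real templates would come from captured data).
templates = {
    "stand": [(0.0, 0.0, 0.0)] * 14,
    "jump":  [(1.0, 1.0, 1.0)] * 14,
}
noisy_stand = [(0.01, 0.0, 0.0)] * 14  # input slightly off the template
print(recognize(noisy_stand, templates))  # → stand
```

Normalizing poses for body size and orientation before the comparison would be needed in practice; the sketch omits that step.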

An Art-Robot Expressing Emotion with Color Light and Behavior by Human-Object Interaction

  • Kwon, Yanghee;Kim, Sangwook
    • Journal of Multimedia Information System / v.4 no.2 / pp.83-88 / 2017
  • The fourth industrial revolution, which will bring a great wave of change in the 21st century, is an age of hyper-connection linking humans to humans, objects to objects, and humans to objects. In the evolving smart city and smart space, emotional engineering is an interdisciplinary research field that continues to attract attention as technology develops. This paper proposes an emotional-object prototype that explores the possibility of emotional interaction between human and object. By presenting emotional objects that produce color changes and movements through human-object interaction, set against the loneliness of modern people as a current social issue, we examine how objects can influence our lives. We expect that emotional objects approached from this fundamental viewpoint can serve as viable cultural intermediaries in our future living spaces.

3D Gaze-based Stereo Image Interaction Technique (3차원 시선기반 입체영상 인터랙션 기법)

  • Ki, Jeong-Seok;Jeon, Kyeong-Won;Jo, Sang-Woo;Kwon, Yong-Moo;Kim, Sung-Kyu
    • Proceedings of the Korean HCI Society Conference (한국HCI학회 학술대회논문집) / 2007.02a / pp.512-517 / 2007
  • Several studies have investigated 2D gaze-tracking techniques on 2D screens for human-computer interaction, but gaze-based interaction with stereo images or contents has not been reported. 3D display techniques are now emerging for reality services, and 3D interaction techniques are all the more necessary in 3D content service environments. This paper addresses gaze-based 3D interaction techniques on stereo displays, such as parallax-barrier or lenticular stereo displays, and presents our research on 3D gaze estimation and gaze-based interaction with stereo displays.


A study on human performance in graphic-aided scheduling tasks

  • Baek, Dong-Hyun;Oh, Sang-Yoon;Yoon, Wan-Chul
    • Proceedings of the Korean Operations and Management Science Society Conference / 1994.04a / pp.357-363 / 1994
  • In many industrial situations the human acts as the primary scheduler, since there often exist various constraints and considerations that cannot be defined mathematically or quantitatively. For the proper design of interactive scheduling systems, it should be investigated how human strategy and performance are affected by the mode of human-computer interaction at various levels of task complexity. In this study, two scheduling experiments were conducted. The first showed that human schedulers could outperform simple heuristic rules on each of the typical performance measures, such as average machine utilization, average tardiness, and maximum tardiness. In the second experiment, the effect of providing a computer-generated initial solution was investigated. The result was that on complex problems the subjects performed significantly better when they generated the initial solutions themselves, evidencing the importance of continuity in the strategic search through the problem.
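As a reference point for the "simple heuristic rules" the human schedulers were compared against, here is a minimal single-machine dispatch heuristic, Earliest Due Date (EDD), evaluated with the same tardiness measures the experiment used. The job data is made up; the paper does not specify its rules or instances.

```python
# Sketch of a dispatch-rule baseline: sequence jobs by Earliest Due Date on
# one machine and report average and maximum tardiness, two of the
# performance measures named in the abstract.

def edd_schedule(jobs):
    """jobs: list of (processing_time, due_date).
    Returns (average tardiness, maximum tardiness) of the EDD sequence."""
    t = 0
    tardiness = []
    for p, d in sorted(jobs, key=lambda j: j[1]):  # order by due date
        t += p                      # job finishes at time t
        tardiness.append(max(0, t - d))
    return sum(tardiness) / len(tardiness), max(tardiness)

jobs = [(3, 4), (2, 2), (4, 10)]  # (processing_time, due_date), invented
avg_t, max_t = edd_schedule(jobs)
print(avg_t, max_t)
```

EDD minimizes maximum lateness on a single machine, which is why it is a standard baseline for tardiness-oriented comparisons like this one.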

Wireless EMG-based Human-Computer Interface for Persons with Disability

  • Lee, Myoung-Joon;Moon, In-Hyuk;Kim, Sin-Ki;Mun, Mu-Seong
    • Proceedings of the Institute of Control, Robotics and Systems Conference (제어로봇시스템학회 학술대회논문집) / 2003.10a / pp.1485-1488 / 2003
  • This paper proposes a wireless EMG-based human-computer interface (HCI) for persons with disabilities. For the HCI, four interaction commands are defined by combining three shoulder-elevation motions: left, right, and both-shoulder elevation. The motions are recognized by comparing EMG signals from the levator scapulae muscles against double thresholds. Real-time EMG processing hardware is implemented for acquiring the EMG signals and recognizing the motions. To achieve real-time processing, high-pass, low-pass, band-pass, and band-rejection filters, together with a full rectifier and a mean-absolute-value circuit, are embedded on a board with a high-speed microprocessor. The recognition results are transferred to a wireless client system, such as a mobile robot, via a Bluetooth module. Experimental results with the implemented real-time EMG processing hardware show that the proposed wireless EMG-based HCI is feasible for the disabled.
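The shoulder-motion decision can be sketched roughly as follows. Note the paper's double-threshold test is simplified here to a single mean-absolute-value (MAV) threshold per channel, and the threshold value and command names are assumptions, not figures from the paper.

```python
# Sketch of the command decision: compute the MAV of each shoulder's EMG
# window, mark a channel active when its MAV exceeds a threshold, and map
# the left/right/both activation pattern to an interface command.

ONSET = 0.3  # per-channel MAV activation threshold (assumed value)

def mav(samples):
    """Mean absolute value of an EMG window."""
    return sum(abs(s) for s in samples) / len(samples)

def classify(left_win, right_win):
    """Map the activation pattern of the two channels to a command name."""
    left = mav(left_win) > ONSET
    right = mav(right_win) > ONSET
    if left and right:
        return "both_elevation"
    if left:
        return "left_elevation"
    if right:
        return "right_elevation"
    return "rest"

quiet = [0.02, -0.03, 0.01, -0.02]   # resting-level samples (synthetic)
active = [0.5, -0.6, 0.4, -0.5]      # contraction-level samples (synthetic)
print(classify(active, quiet))  # → left_elevation
```

In hardware, the rectifier and MAV circuit mentioned in the abstract compute the same quantity that `mav` computes here in software.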


Automatic Human Emotion Recognition from Speech and Face Display - A New Approach (인간의 언어와 얼굴 표정에 통하여 자동적으로 감정 인식 시스템 새로운 접근법)

  • Luong, Dinh Dong;Lee, Young-Koo;Lee, Sung-Young
    • Proceedings of the Korean Information Science Society Conference / 2011.06b / pp.231-234 / 2011
  • Audiovisual human emotion recognition is a promising approach for multimodal human-computer interaction, but optimal fusion of the multimodal information remains a challenge. To overcome these limitations and make the interface robust, we propose a framework for automatic human emotion recognition from speech and face display. In this paper, we develop a new approach to model-level information fusion, based on the relationship between speech and facial expression, that detects temporal segments automatically and performs multimodal information fusion.

Real-time Finger Gesture Recognition (실시간 손가락 제스처 인식)

  • Park, Jae-Wan;Song, Dae-Hyun;Lee, Chil-Woo
    • Proceedings of the Korean HCI Society Conference (한국HCI학회 학술대회논문집) / 2008.02a / pp.847-850 / 2008
  • Today, humans increasingly operate machines through mutual communication with them. In vision-based HCI (Human-Computer Interaction) systems, techniques for recognizing and tracking fingers are important. To segment the finger, this paper uses background subtraction, which separates foreground from background, so that the finger can be segmented from both simple and cluttered backgrounds. The finger is then recognized by template matching against identified fingertip images, and recognized gestures are compared with the gesture tracked after the finger is detected. By obtaining a region of interest first and performing background subtraction and template matching only within that region, processing time and reaction time are reduced. We thus propose a technique that recognizes gestures more effectively.
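The two image-processing steps named in this abstract, background subtraction and template matching, can be sketched on tiny synthetic grayscale grids. The sum-of-absolute-differences (SAD) matching score and the threshold value are illustrative assumptions; a real system would work on camera frames.

```python
# Sketch of the pipeline's two steps: (1) background subtraction produces a
# binary foreground mask; (2) template matching by SAD locates the best
# position of a fingertip template inside the image.

def subtract_background(frame, background, thresh=5):
    """Binary mask: 1 where the frame differs from the background model."""
    return [[1 if abs(f - b) > thresh else 0 for f, b in zip(fr, br)]
            for fr, br in zip(frame, background)]

def match_template(image, template):
    """Return (row, col) of the lowest-SAD placement of template in image."""
    th, tw = len(template), len(template[0])
    best, best_pos = float("inf"), (0, 0)
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            sad = sum(abs(image[r + i][c + j] - template[i][j])
                      for i in range(th) for j in range(tw))
            if sad < best:
                best, best_pos = sad, (r, c)
    return best_pos

image = [[0, 0, 0, 0],
         [0, 9, 9, 0],
         [0, 9, 9, 0],
         [0, 0, 0, 0]]  # a bright 2x2 "fingertip" on a dark background
print(match_template(image, [[9, 9], [9, 9]]))  # → (1, 1)
```

Restricting `match_template` to the foreground region from `subtract_background` is exactly the region-of-interest speedup the abstract describes.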


Hand Motion Recognition Algorithm Using Skin Color and Center of Gravity Profile (피부색과 무게중심 프로필을 이용한 손동작 인식 알고리즘)

  • Park, Youngmin
    • The Journal of the Convergence on Culture Technology / v.7 no.2 / pp.411-417 / 2021
  • The field that studies how humans and computers communicate and exchange information is called HCI (human-computer interaction). This study concerns hand-gesture recognition for human interaction: it examines the problems of existing recognition methods and proposes an algorithm that improves the recognition rate. From an image containing the shape of a human hand, the hand region is extracted using skin-color information, and a center-of-gravity profile is computed using principal component analysis. We propose increasing the gesture recognition rate by comparing this profile with predefined shapes. The existing center-of-gravity profile misrecognizes gestures when the hand is deformed by rotation; in this study, the profile is made robust by taking as its starting point the contour point farthest from the center of gravity. Hand gestures are recognized without gloves, sensor-attached markers, or a separate blue screen. To resolve misrecognition, the nearest feature vector is found and an appropriate threshold is obtained to distinguish success from failure.
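The rotation-robust center-of-gravity profile described in this abstract can be sketched as follows: compute the contour centroid, take the distance from each contour point to it, and rotate the profile so it starts at the farthest point. The synthetic square contour is illustrative; the real system extracts the contour from skin-color segmentation.

```python
# Sketch of the center-of-gravity profile with a rotation-invariant start
# point: the profile begins at the contour point farthest from the centroid,
# so rotating the hand (i.e., shifting the contour ordering) does not change
# the profile that is compared against the predefined shapes.
import math

def cog_profile(contour):
    """Distance-to-centroid profile, rotated to start at the max distance."""
    cx = sum(x for x, _ in contour) / len(contour)
    cy = sum(y for _, y in contour) / len(contour)
    dists = [math.hypot(x - cx, y - cy) for x, y in contour]
    start = dists.index(max(dists))        # farthest point = start point
    return dists[start:] + dists[:start]

# A square contour: corners lie farthest from the centroid, so the profile
# starts at a corner no matter where the point list itself begins.
square = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2), (1, 2), (0, 2), (0, 1)]
profile = cog_profile(square)
print(profile[0] == max(profile))  # → True
```

Shifting the contour ordering (as a rotation of the hand would) yields the same profile, which is the robustness property the paper claims for this starting-point choice.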

Effects of the Interaction with Computer Agents on Users' Psychological Experiences (컴퓨터 에이전트와의 상호작용이 사용자의 심리적 경험에 미치는 영향)

  • Park, Joo-Yeon
    • Science of Emotion and Sensibility / v.10 no.2 / pp.155-168 / 2007
  • As the influence of computer agents grows and human-agent interaction comes to resemble interpersonal interaction, the social and psychological experiences of human-agent interaction are becoming more important than task-oriented efficiency. Many previous studies have sought to increase social presence in human-agent interaction, and thereby elicit positive psychological experiences in users, by applying factors of interpersonal communication to the agents' verbal and non-verbal communication. This study examined the effects of exchanging mutual self-disclosure, one of the most important communicative acts in interpersonal communication, between users and interface agents. Users' attachment styles were also studied in relation to perceived social presence, evaluations of the agents, user experience, and intentions for future interaction, and the mediating role of social presence in these dependent variables was examined. The results showed that exchanging self-disclosures with an agent increased perceived social presence, friendly evaluations of the agent, positive user experience, and intentions for future interaction. Participants' attachment styles also affected these variables, and the effects of both self-disclosure exchange and attachment style were mediated by perceived social presence toward the agent. These findings imply that the social and communicative aspects of agents need serious consideration in their design, and suggest that the psychological effects of agents on users may differ according to the users' personalities.
