• Title/Summary/Keyword: Human-Computer Interaction

Search Results: 623

A Structured Analysis Model of Customer Loyalty in Online Games (고객 충성도(Customer Loyalty)에 영향을 미치는 온라인 게임의 중요 요소에 대한 LISREL 모델 분석)

  • Choi, Dong-Seong;Park, Sung-June;Kim, Jin-Woo
    • Asia pacific journal of information systems
    • /
    • v.11 no.3
    • /
    • pp.1-21
    • /
    • 2001
  • In recent years, the market for online computer games has become an important part of the entertainment industry. New online games are introduced every month and the number of players has grown rapidly, yet only a few online games manage to turn a good profit. Why do most players play only a few online games repeatedly? To answer this question, this research focuses on customer loyalty and on players' optimal experience (flow) while playing specific online games. We hypothesize that customer loyalty toward a specific online game can be increased by the player's optimal experience (flow) while playing it, and that this optimal experience arises from mechanic and social interaction in the game. To validate the hypothesis, we analyze online survey data from players of various online games. According to the survey results, players' optimal experience is affected by their mechanic interaction with the online game system and by their social interaction with other players participating in the game, and this optimal experience in turn affects the degree of customer loyalty to the game. The paper closes with conclusions drawn from the survey results and a discussion of the study's limitations.

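The path structure hypothesized above (mechanic and social interaction → flow → loyalty) can be sketched as two least-squares regressions on synthetic data; the coefficients, sample size, and noise levels below are illustrative assumptions, not estimates from the paper's LISREL analysis.

```python
import numpy as np

def fit_path(X, y):
    """Ordinary least squares for y ~ 1 + X; returns [intercept, slopes...]."""
    X1 = np.column_stack([np.ones(len(X)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    return beta

# Synthetic data following the hypothesized structure.
rng = np.random.default_rng(1)
mechanic = rng.normal(size=300)        # player-system (mechanic) interaction
social = rng.normal(size=300)          # player-player (social) interaction
flow = 0.5 * mechanic + 0.4 * social + rng.normal(scale=0.3, size=300)
loyalty = 0.7 * flow + rng.normal(scale=0.3, size=300)

# Path 1: flow ~ mechanic + social; Path 2: loyalty ~ flow.
b_flow = fit_path(np.column_stack([mechanic, social]), flow)
b_loyalty = fit_path(flow.reshape(-1, 1), loyalty)
```

A full LISREL/SEM analysis would estimate both paths simultaneously with latent variables; the two separate regressions here only illustrate the direction of the hypothesized effects.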

Trends on Human/Robot Interface Research (휴먼/로봇 인터페이스 연구동향 분석)

  • Im, Chang-Ju;Im, Chi-Hwan
    • Journal of the Ergonomics Society of Korea
    • /
    • v.21 no.2
    • /
    • pp.101-111
    • /
    • 2002
  • The intelligent robots developed recently are no longer the conventional robots widely known as industrial robots. An intelligent robot is a computer system embedded in a machine, and it uses the machine as a medium not only for communication between the human and the computer but also for physical interaction among the human, the computer, and their environment. Recent advances in computer technology have made it possible to create several new types of human-computer interaction realized through intelligent machines. There is a continuing need to better understand how to design human/robot interfaces (HRI) so that information and feedback flow more naturally and efficiently between robot systems and their users in both directions. In this paper, we explain the concept and scope of HRI and review current domestic and foreign HRI research trends. Recommended near-term research directions are also discussed, based on a comparative study of domestic and foreign HRI technology.

Pictorial Model of Upper Body based Pose Recognition and Particle Filter Tracking (그림모델과 파티클필터를 이용한 인간 정면 상반신 포즈 인식)

  • Oh, Chi-Min;Islam, Md. Zahidul;Kim, Min-Wook;Lee, Chil-Woo
    • 한국HCI학회:학술대회논문집
    • /
    • 2009.02a
    • /
    • pp.186-192
    • /
    • 2009
  • In this paper, we present a recognition method for frontal human upper-body poses. In HCI (Human Computer Interaction) and HRI (Human Robot Interaction), the user typically faces the computer or robot and uses hand gestures during interaction, so we focus on frontal upper-body poses. There are two main difficulties: first, the human pose consists of many parts, resulting in high degrees of freedom (DOF) that make pose modeling difficult; second, matching image features against the model information is difficult. Using a pictorial model, we model the main poses that occupy most of the frontal upper-body pose space and recognize them against a database of main poses. The parameters of the recognized main pose then initialize a particle filter, which predicts the posterior distribution over the pose parameters and determines a more specific pose by updating the model parameters from the particle with the maximum likelihood. By recognizing the main poses and then tracking the specific pose, we recognize frontal human upper-body poses.

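The particle-filter stage described in this abstract can be sketched as a standard bootstrap predict-weight-resample step; the pose parameterization, Gaussian diffusion model, and likelihood function are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def particle_filter_step(particles, likelihood, noise_std=0.05, rng=None):
    """One bootstrap predict-weight-resample step over pose hypotheses.

    particles  : (N, D) array, each row a hypothesized pose-parameter vector
    likelihood : callable mapping an (N, D) array to N non-negative scores
    Returns the resampled particles and the maximum-likelihood particle.
    """
    rng = np.random.default_rng() if rng is None else rng
    n = len(particles)

    # Predict: diffuse each hypothesis with Gaussian noise (simple motion model).
    particles = particles + rng.normal(0.0, noise_std, particles.shape)

    # Update: weight each particle by how well it explains the image features.
    weights = likelihood(particles)
    weights = weights / weights.sum()
    best = particles[np.argmax(weights)]      # most specific pose estimate

    # Resample: draw N particles in proportion to their weights.
    particles = particles[rng.choice(n, size=n, p=weights)]
    return particles, best
```

Iterating this step with a likelihood peaked at the matched main pose concentrates the particles, so the maximum-likelihood particle converges to a refined pose estimate.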

Using Spatial Ontology in the Semantic Integration of Multimodal Object Manipulation in Virtual Reality

  • Irawati, Sylvia;Calderon, Daniela;Ko, Hee-Dong
    • 한국HCI학회:학술대회논문집
    • /
    • 2006.02a
    • /
    • pp.884-892
    • /
    • 2006
  • This paper describes a framework for multimodal object manipulation in virtual environments. The gist of the proposed framework is the semantic integration of multimodal input using a spatial ontology and the user context, merging the interpretation results of the individual inputs into a single one. The spatial ontology, which describes the spatial relationships between objects, is used together with the current user context to resolve ambiguities in the user's commands. These commands are used to reposition objects in the virtual environment. We discuss how the spatial ontology is defined and used to help the user place objects in the virtual environment as they would in the real world.

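The ontology-plus-context disambiguation described above can be sketched minimally; the relation triples, object names, and "last selected object" context rule below are invented for illustration and are not the paper's actual ontology.

```python
# Spatial ontology as (object, relation, reference) triples.
relations = {
    ("lamp", "on", "desk"),
    ("book", "on", "shelf"),
    ("cup", "on", "desk"),
}

def resolve(candidates, relation, reference):
    """Keep only candidates that satisfy the spatial relation in the ontology."""
    return [obj for obj in candidates if (obj, relation, reference) in relations]

# "Move the object on the desk" is ambiguous: both lamp and cup qualify.
matches = resolve(["lamp", "book", "cup"], "on", "desk")   # ["lamp", "cup"]

# User context (e.g. the most recently selected object) resolves the ambiguity.
last_selected = "cup"
target = last_selected if last_selected in matches else matches[0]
```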

A Brain-Computer Interface Based Human-Robot Interaction Platform (Brain-Computer Interface 기반 인간-로봇상호작용 플랫폼)

  • Yoon, Joongsun
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.16 no.11
    • /
    • pp.7508-7512
    • /
    • 2015
  • We propose a brain-machine interface (BMI) based human-robot interaction (HRI) platform that operates machines by capturing brain waves and interpreting the user's intentions. The platform consists of capture, processing/mapping, and action parts: a noninvasive brain-wave sensor, a PC, and a robot avatar/LEDs/motors serve as the capture, processing/mapping, and action parts, respectively. Various investigations were conducted to establish the relationship between intentions and the sensed brain waves. Case studies, including an interactive game, on-off control of LEDs, and motor control, are presented to illustrate the design and implementation of the proposed BMI-based HRI platform.
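The capture → processing/mapping → action structure can be sketched as a minimal pipeline; the attention score, threshold rule, and LED commands below are illustrative assumptions, not the authors' platform.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class BmiPlatform:
    """Minimal capture -> processing/mapping -> action pipeline."""
    capture: Callable[[], float]     # e.g. an attention score from a headset
    mapping: Callable[[float], str]  # maps the score to a device command
    action: Callable[[str], None]    # e.g. drives an LED or a motor

    def step(self) -> str:
        command = self.mapping(self.capture())
        self.action(command)
        return command

# Example wiring: turn an LED on when the score crosses a threshold.
def threshold_mapping(score: float, threshold: float = 0.6) -> str:
    return "LED_ON" if score >= threshold else "LED_OFF"

log: List[str] = []
samples = iter([0.2, 0.7, 0.9, 0.4])             # fake sensor readings
platform = BmiPlatform(capture=lambda: next(samples),
                       mapping=threshold_mapping,
                       action=log.append)
for _ in range(4):
    platform.step()
# log is now ["LED_OFF", "LED_ON", "LED_ON", "LED_OFF"]
```

Swapping the `action` callable for a motor or robot-avatar driver changes the output device without touching the capture or mapping stages, which mirrors the platform's three-part design.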

HIML(Human Interaction Markup Language) Middleware for Context Awareness in Home Network (홈네트워크 시스템상에서 상황인식을 위한 HIML(Human Interaction Markup Language) 미들웨어)

  • Kim, Joon-Hyung;Son, Min-Woo;Shin, Dong-Kyoo;Shin, Dong-Il
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2006.10b
    • /
    • pp.38-43
    • /
    • 2006
  • With the development of ubiquitous computing, it has become possible to easily recognize the user's situation in a home network environment and to provide more intelligent services based on the user's context information. To implement intelligent services effectively, context information must be represented objectively; context information means any information that can be used to characterize the situation of an entity. In this paper, to provide home network services effectively, we classify context information into user context, device context, and proximity context (the distance between the user and a device), and design HIML (Human Interaction Markup Language), an XML-based language that can represent this context information effectively. We also design middleware that operates on various platforms through HIML documents and enables interaction among home appliances and sensor devices, and we test its functionality.


Tactile Sensation Display with Electrotactile Interface

  • Yarimaga, Oktay;Lee, Jun-Hun;Lee, Beom-Chan;Ryu, Je-Ha
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 2005.06a
    • /
    • pp.145-150
    • /
    • 2005
  • This paper presents an Electrotactile Display System (ETCS). One of the most important human sensory systems for human-computer interaction is the sense of touch, which can be conveyed to humans through tactile output devices. To create the sense of touch, an electrotactile display produces controlled, localized touch sensations on the skin by passing small electric currents. In electrotactile stimulation, the mechanoreceptors in the skin can be stimulated individually to display sensations of vibration, touch, itch, tingle, pressure, etc. on the finger, palm, arm, or any other suitable location on the body, using appropriate electrodes and waveforms. We developed an ETCS and investigated its effectiveness in terms of the perception of surface roughness, by stimulating the palmar side of the hand with different waveforms, and the perception of direction and location information through the forearm. Positive and negative pulse trains were tested with different current intensities and electrode switching times on the user's forearm or finger, using an electrode-embedded armband, to investigate how subjects recognize displayed patterns and directions of stimulation.

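The positive and negative pulse trains used in the experiments can be sketched as a simple waveform generator; the pulse width, frequency, amplitude, and sample rate below are illustrative assumptions, not the parameters used by the authors.

```python
import numpy as np

def biphasic_pulse_train(duration_s, freq_hz, pulse_width_s, amplitude_ma,
                         sample_rate=100_000):
    """Generate a biphasic (positive then negative) current pulse train.

    Returns (t, i): time stamps in seconds and current samples in mA.
    Each period starts with a positive pulse followed immediately by an
    equal-width negative pulse, which keeps the net delivered charge at zero.
    """
    n = int(duration_s * sample_rate)
    period = int(round(sample_rate / freq_hz))        # samples per period
    width = int(round(pulse_width_s * sample_rate))   # samples per phase
    pos_in_period = np.arange(n) % period
    i = np.zeros(n)
    i[pos_in_period < width] = amplitude_ma                              # + phase
    i[(pos_in_period >= width) & (pos_in_period < 2 * width)] = -amplitude_ma
    t = np.arange(n) / sample_rate
    return t, i

# 10 ms of a 200 Hz train of 200 microsecond pulses at 2 mA.
t, i = biphasic_pulse_train(0.01, 200, 0.0002, 2.0)
```

Charge balancing (equal positive and negative phases) is the standard safety practice in electrical stimulation; varying `amplitude_ma` and `freq_hz` corresponds to the intensity and waveform variations tested in the paper.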

Human-Computer Interaction Survey for Intelligent Robot (지능형 로봇을 위한 인간-컴퓨터 상호작용(HCI) 연구동향)

  • Hong, Seok-Ju;Lee, Chil-Woo
    • Proceedings of the Korea Contents Association Conference
    • /
    • 2006.11a
    • /
    • pp.507-511
    • /
    • 2006
  • An intelligent robot is defined as a system that judges autonomously based on sensory input such as sight and hearing, analogously to a human. Humans communicate using nonverbal means such as gestures in addition to language, and if a robot understands such nonverbal communication, it may become more familiar to humans. HCI (Human Computer Interaction) technologies, including face recognition and gesture recognition, are being studied vigorously, but many problems must still be solved under real-world conditions. In this paper, we introduce the importance of this field and give application examples, focusing on recent research results in gesture recognition as one of the most natural means of communicating with humans.


Research on the Influence of Interaction Factors of mobile Phone Dance Live Broadcast on User's Intention of Use -Centered on Perceived Usefulness and Perceived Accessibility

  • Wu, Nuowa
    • Journal of the Korea Society of Computer and Information
    • /
    • v.24 no.8
    • /
    • pp.51-58
    • /
    • 2019
  • Starting from the characteristics of mobile phone dance live-broadcast platforms and the second-generation technology acceptance model (TAM2), this paper establishes a user acceptance model for such platforms, aiming to study the factors that influence users' acceptance of them. The model is validated through an empirical analysis of a user survey, and the relationships among the variables in the model are clarified. It is confirmed that human-computer interaction, scene interaction, relationship interaction, and other factors affect users' acceptance of mobile phone dance live-broadcast platforms. Based on the relationships among the variables found in the research, the paper also analyzes how the variables affect one another in the actual practice of such platforms. In addition, video design and marketing strategies for the further development of mobile phone dance live-broadcast platforms are given, to help platforms and dance creators promote their work better on mobile devices. Finally, the paper summarizes the shortcomings of this study and points out directions for future research, providing a reference for researchers in the field of platform acceptance.

Recognition and Generation of Facial Expression for Human-Robot Interaction (로봇과 인간의 상호작용을 위한 얼굴 표정 인식 및 얼굴 표정 생성 기법)

  • Jung Sung-Uk;Kim Do-Yoon;Chung Myung-Jin;Kim Do-Hyoung
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.12 no.3
    • /
    • pp.255-263
    • /
    • 2006
  • Over the last decade, face analysis (e.g., face detection, face recognition, and facial expression recognition) has been a lively and expanding research field. As computer-animated agents and robots bring a social dimension to human-computer interaction, interest in this field is increasing rapidly. In this paper, we introduce an artificial emotion-mimicking system that can recognize human facial expressions and also generate the recognized facial expression. To recognize human facial expressions in real time, we propose a facial expression classification method based on weak classifiers obtained using new rectangular feature types. In addition, we generate artificial facial expressions using a robotic system developed on the basis of biological observation. Finally, experimental results on facial expression recognition and generation are shown to validate our robotic system.
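Rectangular (Haar-like) features of the kind mentioned in this abstract are usually evaluated with an integral image; the two-rectangle feature and decision stump below are a generic illustration of that technique, not the authors' new feature types.

```python
import numpy as np

def integral_image(img):
    """Summed-area table with a zero row/column prepended."""
    return np.pad(img, ((1, 0), (1, 0))).cumsum(0).cumsum(1)

def rect_sum(ii, r, c, h, w):
    """Sum of img[r:r+h, c:c+w] in O(1) using the integral image."""
    return ii[r + h, c + w] - ii[r, c + w] - ii[r + h, c] + ii[r, c]

def two_rect_feature(ii, r, c, h, w):
    """Left-minus-right two-rectangle feature (responds to vertical edges)."""
    half = w // 2
    return rect_sum(ii, r, c, h, half) - rect_sum(ii, r, c + half, h, half)

def weak_classifier(feature_value, threshold, polarity=1):
    """Decision stump: predicts +1 or -1 from a single feature value."""
    return 1 if polarity * feature_value < polarity * threshold else -1
```

In a boosted classifier of this style, many such stumps (one per selected rectangular feature) are combined by weighted vote to classify each facial expression.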