• Title/Summary/Keyword: Human-computer Interaction


Comparing Initiating and Responding Joint Attention as a Social Learning Mechanism: A Study Using Human-Avatar Head/Hand Interaction (사회 학습 기제로서 IJA와 RJA의 비교: 인간-아바타 머리/손 상호작용을 이용한 연구)

  • Kim, Mingyu;Kim, So-Yeon;Kim, Kwanguk
    • Journal of KIISE
    • /
    • v.43 no.6
    • /
    • pp.645-652
    • /
    • 2016
  • Joint Attention (JA) has been known to play a key role in human social learning. However, the relative impact of different interaction types has yet to be rigorously examined because of the limitations of existing methodologies in simulating human-to-human interaction. In the present study, we designed a new JA paradigm employing human-avatar interaction and virtual reality technologies, and tested the paradigm in two experiments with healthy adults. Our results indicated that the initiating JA (IJA) condition was more effective than the responding JA (RJA) condition for social learning in both head and hand interactions. Moreover, the hand interaction involved better information processing than the head interaction. The implications of the results, the validity of the new paradigm, and the limitations of this study are discussed.

A study on the increase of user gesture recognition rate using data preprocessing (데이터 전처리를 통한 사용자 제스처 인식률 증가 방안)

  • Kim, Jun Heon;Song, Byung Hoo;Shin, Dong Ryoul
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2017.07a
    • /
    • pp.13-16
    • /
    • 2017
  • Gesture recognition is a technique actively studied in the fields of HCI (Human-Computer Interaction) and HRI (Human-Robot Interaction), and accurately identifying a user's gesture by extracting features from gesture data and classifying them accordingly has become an important task. This paper describes a method for analyzing user hand-gesture data measured with an EMG (electromyography) sensor. To remove noise from the collected data and to bring out its distinguishing features, the data were passed through a preprocessing step that converted them into continuous data, and were then classified with a machine-learning algorithm. The performance on the raw data and on the preprocessed data was compared using a decision-tree algorithm.
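The pipeline described in the abstract above (denoise the raw EMG windows in a preprocessing step, classify, and compare raw versus preprocessed performance) can be sketched with synthetic data. This is a minimal illustration, not the paper's code: the signals, the moving-average window, and the peak-amplitude rule standing in for the decision-tree classifier are all assumptions.

```python
import random
random.seed(0)

def moving_average(signal, window=5):
    """Noise-reduction preprocessing: sliding-window mean."""
    out = []
    for i in range(len(signal)):
        lo = max(0, i - window + 1)
        out.append(sum(signal[lo:i + 1]) / (i + 1 - lo))
    return out

def make_sample(amplitude, n=50):
    """Synthetic 'EMG-like' window: a flat envelope plus Gaussian noise."""
    return [amplitude + random.gauss(0, 0.8) for _ in range(n)]

def classify(signal, threshold=1.5):
    """Toy peak-amplitude rule standing in for the paper's decision tree."""
    return "gesture" if max(signal) > threshold else "rest"

samples = ([(make_sample(2.0), "gesture") for _ in range(20)]
           + [(make_sample(0.0), "rest") for _ in range(20)])

# Compare classification accuracy on raw vs. preprocessed windows.
raw_acc = sum(classify(s) == y for s, y in samples) / len(samples)
pre_acc = sum(classify(moving_average(s)) == y
              for s, y in samples) / len(samples)
print(raw_acc, pre_acc)
```

Smoothing shrinks the noise peaks in the "rest" windows, so the same threshold rule makes fewer false detections on the preprocessed data, which mirrors the raw-versus-preprocessed comparison the abstract describes.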


A Novel Computer Human Interface to Remotely Pick up Moving Human's Voice Clearly by Integrating Real-time Face Tracking and Microphones Array

  • Hiroshi Mizoguchi;Takaomi Shigehara;Yoshiyasu Goto;Hidai, Ken-ichi;Taketoshi Mishima
    • Proceedings of the Institute of Control, Robotics and Systems (ICROS) Conference
    • /
    • 1998.10a
    • /
    • pp.75-80
    • /
    • 1998
  • This paper proposes a novel computer-human interface, named Virtual Wireless Microphone (VWM), which utilizes computer vision and signal processing. It integrates real-time face tracking and sound-signal processing. VWM is intended to be used as a speech-signal input method for human-computer interaction, especially for an autonomous intelligent agent that interacts with humans, such as a digital secretary. Utilizing VWM, the agent can clearly hear its human master's voice remotely, as if a wireless microphone were placed just in front of the master.
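The idea above, remotely picking up a moving speaker's voice, combines a tracked speaker position with microphone-array processing. A common building block for this is delay-and-sum beamforming; the sketch below is a toy illustration with synthetic signals and known integer-sample delays (in the paper the delays would be derived from the tracked head position), not the VWM implementation.

```python
import numpy as np

def delay_and_sum(signals, delays):
    """Align each microphone channel by its per-mic delay (in samples)
    and average, reinforcing sound arriving from the steered direction
    while averaging down uncorrelated noise."""
    n_mics, n = signals.shape
    out = np.zeros(n)
    for ch, d in zip(signals, delays):
        out += np.roll(ch, -d)  # advance channel by d samples
    return out / n_mics

# Toy demo: the same periodic waveform arrives at four mics with
# different delays, each channel corrupted by independent noise.
rng = np.random.default_rng(0)
t = np.arange(400)
clean = np.sin(2 * np.pi * t / 40)          # 10 full periods, wrap-safe
delays = [0, 3, 7, 12]                       # assumed known (from tracking)
mics = np.stack([np.roll(clean, d) + rng.normal(0, 0.5, t.size)
                 for d in delays])
enhanced = delay_and_sum(mics, delays)
```

Averaging four aligned channels cuts the noise power by roughly a factor of four, so `enhanced` is closer to the clean waveform than any single microphone channel.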


Development of TTS for a Human-Robot Interface (휴먼-로봇 인터페이스를 위한 TTS의 개발)

  • Bae Jae-Hyun;Oh Yung-Hwan
    • Proceedings of the KSPS conference
    • /
    • 2006.05a
    • /
    • pp.135-138
    • /
    • 2006
  • The communication method between human and robot is one of the important parts of human-robot interaction, and speech is an easy and intuitive communication method for human beings. By using speech to communicate with a robot, we can interact with it in a familiar way. In this paper, we developed a TTS system for human-robot interaction. The synthesis algorithms were modified for efficient utilization of the robot's restricted resources, and the synthesis database was reconstructed for efficiency. As a result, we could reduce the computation time with only slight degradation of speech quality.


Discriminant Analysis of Human's Implicit Intent based on Eyeball Movement (안구운동 기반의 사용자 묵시적 의도 판별 분석 모델)

  • Jang, Young-Min;Mallipeddi, Rammohan;Kim, Cheol-Su;Lee, Minho
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.50 no.6
    • /
    • pp.212-220
    • /
    • 2013
  • Recently, there has been a tremendous increase in human-computer/machine interaction systems, where the goal is to provide an appropriate service to the user at the right time with minimal human input, toward human augmented cognition systems. To develop an efficient human augmented cognition system based on human-computer/machine interaction, it is important to interpret the user's implicit intention, which is vague, in addition to the explicit intention. According to cognitive visual-motor theory, human eye movements and pupillary responses are rich sources of information about human intention and behavior. In this paper, we propose a novel approach to identifying human implicit visual search intention based on eye-movement patterns and pupillary analysis, including pupil size, the gradient of pupil-size variation, and fixation length/count for the area of interest. The proposed model classifies the human's implicit intention into three types: navigational intent generation, informational intent generation, and informational intent disappearance. Navigational intent refers to a search to find something interesting in an input scene with no specific instructions, while informational intent refers to a search to find a particular target object at a specific location in the input scene. In the present study, based on the eye-movement patterns and pupillary analysis, we used a hierarchical support vector machine that can detect the transitions between the different implicit intents: from navigational intent generation to informational intent generation, and informational intent disappearance.
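The hierarchical classifier described above (first separate navigational from informational intent, then split informational-intent generation from disappearance) can be sketched with toy two-dimensional features. Nearest-centroid rules stand in for the paper's SVMs, and the feature values here are invented purely for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical 2-D feature vectors (e.g. pupil-size gradient, fixation
# count) for the three intent classes named in the abstract.
def make_class(center, n=30):
    return rng.normal(loc=center, scale=0.3, size=(n, 2))

nav  = make_class([0.0, 0.0])   # navigational intent generation
info = make_class([2.0, 0.0])   # informational intent generation
gone = make_class([2.0, 2.0])   # informational intent disappearance

c_nav, c_info, c_gone = nav.mean(0), info.mean(0), gone.mean(0)
c_informational = np.vstack([info, gone]).mean(0)

def classify(x):
    """Two-stage (hierarchical) decision; nearest-centroid rules stand
    in for the SVMs used in the paper."""
    # Stage 1: navigational vs. informational intent
    if np.linalg.norm(x - c_nav) < np.linalg.norm(x - c_informational):
        return "navigational"
    # Stage 2: generation vs. disappearance of informational intent
    if np.linalg.norm(x - c_info) < np.linalg.norm(x - c_gone):
        return "informational-generation"
    return "informational-disappearance"
```

The hierarchy mirrors the transitions the abstract mentions: a sample is first routed at the coarse navigational/informational boundary, and only informational samples reach the finer generation/disappearance decision.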

Robot Gesture Recognition System based on PCA algorithm (PCA 알고리즘 기반의 로봇 제스처 인식 시스템)

  • Youk, Yui-Su;Kim, Seung-Young;Kim, Sung-Ho
    • Proceedings of the Korean Institute of Intelligent Systems Conference
    • /
    • 2008.04a
    • /
    • pp.400-402
    • /
    • 2008
  • Human-computer interaction (HCI) technology, which has played an important role in the exchange of information between human beings and computers, is a key field of information technology. Recently, studies in which robots and control devices are controlled by the movements of a person's body or hands, without conventional input devices such as a keyboard and mouse, have been carried out from diverse angles, and their importance has been steadily increasing. This study proposes a method of recognizing a user's gestures by applying measurements from an acceleration sensor to the PCA algorithm.
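Projecting sensor windows onto their principal components, as the abstract describes, can be sketched in a few lines. The data below are synthetic stand-ins for acceleration-sensor feature windows; only the PCA step itself follows the standard algorithm (center, eigendecompose the covariance, keep the top components).

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic feature windows (hypothetical stand-in for accelerometer
# data): 100 windows x 60 features, with one high-variance direction.
X = rng.normal(size=(100, 60))
X[:, 0] *= 5.0

def pca_project(X, k):
    """Project rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)                  # center the data
    cov = Xc.T @ Xc / (len(Xc) - 1)          # sample covariance
    vals, vecs = np.linalg.eigh(cov)         # ascending eigenvalues
    comps = vecs[:, ::-1][:, :k]             # top-k eigenvectors
    return Xc @ comps

Z = pca_project(X, k=5)
print(Z.shape)  # (100, 5)
```

The reduced vectors `Z` would then feed a downstream gesture classifier; by construction the first projected coordinate captures the most variance, which is the dimensionality-reduction role PCA plays in the system described above.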


Opportunities and Future Directions of Human-Metaverse Interaction (휴먼-메타버스 인터랙션의 기회와 발전방향)

  • Yoon, Hyoseok;Park, ChangJu;Park, Jung Yeon
    • Smart Media Journal
    • /
    • v.11 no.6
    • /
    • pp.9-17
    • /
    • 2022
  • In the COVID-19 pandemic era, demand for contactless services grew, and the use of extended reality and metaverse services increased rapidly in various applications. In this paper, we analyze the Gather.town, ifland, Roblox, and ZEPETO metaverse platforms in terms of user interaction, avatar-based interaction, and virtual-world authoring. In particular, we distinguish among user input techniques that occur in the real world, avatar representation techniques that represent users in the virtual world, and interaction types that create a virtual world through user participation. Based on this work, we highlight the current trends and needs of human-metaverse interaction and forecast future opportunities and research directions.

The human factors in user interface design of computer graphics (컴퓨터 그래픽 User Interface 설계에서의 Human Factor)

  • 최윤철
    • Journal of the Ergonomics Society of Korea
    • /
    • v.6 no.2
    • /
    • pp.29-37
    • /
    • 1987
  • This paper discusses the general principles to be considered in the design of user interfaces for graphics packages and presents a top-down design process in a systematic way. Effective and convenient user interfaces are analyzed based on human-factors criteria, and we discuss the properties and application requirements of typical interaction techniques that support primitive tasks. The choice of an interaction technique carries a set of input-device prerequisites to be met.


Survey: Gesture Recognition Techniques for Intelligent Robot (지능형 로봇 구동을 위한 제스처 인식 기술 동향)

  • Oh Jae-Yong;Lee Chil-Woo
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.10 no.9
    • /
    • pp.771-778
    • /
    • 2004
  • Recently, various applications of robot systems have become more popular in accordance with the rapid development of computer hardware/software, artificial intelligence, and automatic control technology. Formerly, robots were mainly used in the industrial field; nowadays, however, it is said that robots will play an important role in home-service applications. To make robots more useful, further research is required on natural communication methods between humans and robot systems, and on autonomous behavior generation. Gesture recognition is one of the most convenient techniques for natural human-robot interaction, so it must be solved for the implementation of intelligent robot systems. In this paper, we describe the state of the art of gesture recognition technologies for intelligent robots according to four approaches: sensor-based, feature-based, appearance-based, and 3D-model-based methods. We also discuss some open problems and real applications in this research field.

Psychomotorik-based Play Activities for Children by In-home Social Robot (어린이를 위한 소셜 로봇의 심리운동 기반 놀이 활동 개발)

  • Kim, Da-Young;Choi, Jihwan;Kim, Juhyun;Kim, Min-Gyu;Chung, Jae Hee;Seo, Kap-Ho;Lee, WonHyong
    • The Journal of Korea Robotics Society
    • /
    • v.17 no.4
    • /
    • pp.447-454
    • /
    • 2022
  • This paper presents psychomotorik-based play activities executed by a social robot at home to help children's social and emotional development. Based on the theory and practice of psychomotorik therapy, the play activities were implemented in close collaboration between psychomotorik experts, service designers, and robotics engineers. The designed play activities are classified into four categories depending on the main areas of child development. The robotic system, which can express verbal and nonverbal behaviors, was developed not only to play games with children but also to keep children continuously interested during the play activities. Finally, the psychomotorik-based play service scenario and interactive robot system were validated by an expert group from the domain of child psychotherapy. The evaluation results showed that, from the experts' point of view, the play service and the robot system were appropriately developed for children.