• Title/Abstract/Keyword: human computer interface & interaction

Search results: 156 items; processing time: 0.03 seconds

Brain-Computer Interface 기반 인간-로봇상호작용 플랫폼 (A Brain-Computer Interface Based Human-Robot Interaction Platform)

  • 윤중선
    • 한국산학기술학회논문지 / Vol. 16, No. 11 / pp.7508-7512 / 2015
  • We propose a human-robot interaction (HRI) platform based on a brain-computer interface (BCI), which operates a machine by reading the user's intent from brain waves. We describe the design, operation, and implementation of a platform that performs capture, processing, and execution: capturing a person's intent as EEG signals, extracting or associating the intent from the captured signals, and operating a device according to the extracted intent. As implementation examples of the proposed platform, an interactive game running on the processing unit and the control of external devices through the processing unit are described. We also introduce various attempts to secure reliability between intent and sensing in the BCI-based platform. The proposed platform and its implementation examples are expected to extend toward the realization of new BCI-based methods of device control.
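The capture-process-execute loop this abstract describes can be sketched in a few lines. This is a purely illustrative skeleton, not the paper's implementation: the function names, the band-power heuristic, and the threshold value are all invented for the example.

```python
# Hypothetical sketch of a BCI capture -> process -> execute pipeline.
# All names and thresholds here are illustrative, not from the paper.

def capture_eeg(window):
    """Stand-in for EEG acquisition: returns a list of signal samples."""
    return window  # a real system would read from a headset here

def extract_intent(samples, threshold=0.5):
    """Map a signal window to a discrete intent via simple power thresholding."""
    power = sum(s * s for s in samples) / len(samples)
    return "activate" if power > threshold else "idle"

def actuate(intent):
    """Drive the device according to the decoded intent."""
    return {"activate": "motor_on", "idle": "motor_off"}[intent]

window = capture_eeg([0.9, -0.8, 0.7, -0.9])
command = actuate(extract_intent(window))
```

Real BCI pipelines replace the thresholding step with signal filtering, feature extraction, and a trained classifier, but the three-stage structure is the same.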

Tactile Sensation Display with Electrotactile Interface

  • Yarimaga, Oktay;Lee, Jun-Hun;Lee, Beom-Chan;Ryu, Je-Ha
    • 제어로봇시스템학회: Conference Proceedings / 2005 ICCAS / pp.145-150 / 2005
  • This paper presents an Electrotactile Display System (ETCS). One of the most important human sensory systems for human-computer interaction is the sense of touch, which can be conveyed to humans through tactile output devices. To realize the sense of touch, an electrotactile display produces controlled, localized touch sensations on the skin by passing small electric currents. In electrotactile stimulation, the mechanoreceptors in the skin may be stimulated individually in order to display sensations of vibration, touch, itch, tingle, pressure, etc. on the finger, palm, arm, or any suitable location of the body by using appropriate electrodes and waveforms. We developed an ETCS and investigated the effectiveness of the proposed system in terms of the perception of surface roughness, by stimulating the palmar side of the hand with different waveforms, and the perception of direction and location information through the forearm. Positive and negative pulse trains were tested with different current intensities and electrode switching times on the forearm or finger of the user with an electrode-embedded armband, in order to investigate how subjects recognize displayed patterns and directions of stimulation.

  • PDF
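The positive and negative pulse trains mentioned in the abstract are easy to picture as sample sequences. The sketch below is illustrative only; the amplitudes, pulse widths, and gaps are made-up values, not the paper's stimulation parameters.

```python
# Illustrative generation of positive/negative pulse trains of the kind
# used to drive electrotactile electrodes. All values are invented.

def pulse_train(n_pulses, amplitude, pulse_width, gap, polarity=+1):
    """Return a sample list: each pulse is `pulse_width` samples at
    `polarity * amplitude`, followed by `gap` zero samples."""
    samples = []
    for _ in range(n_pulses):
        samples += [polarity * amplitude] * pulse_width
        samples += [0.0] * gap
    return samples

# A biphasic pattern: a positive train followed by a negative train.
train = pulse_train(2, 1.5, 3, 2, +1) + pulse_train(2, 1.5, 3, 2, -1)
```

Varying `amplitude` (current intensity) and the timing parameters is exactly the experimental axis the abstract describes.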

휴먼/로봇 인터페이스 연구동향 분석 (Trends on Human/Robot Interface Research)

  • 임창주;임치환
    • 대한인간공학회지 / Vol. 21, No. 2 / pp.101-111 / 2002
  • An intelligent robot, which has been developed recently, is no longer a conventional robot widely known as an industrial robot. It is a computer system embedded in a machine, and it utilizes the machine as a medium not only for communication between the human and the computer but also for physical interaction among the human, the computer, and their environment. Recent advances in computer technology have made it possible to create several new types of human-computer interaction realized by utilizing intelligent machines. There is a continuing need for a better understanding of how to design human/robot interfaces (HRI) to achieve a more natural and efficient flow of information and feedback between robot systems and their users in both directions. In this paper, we explain the concept and the scope of HRI and review current research trends in domestic and foreign HRI. Recommended research directions for the near future are also discussed based upon a comparative study of domestic and foreign HRI technology.

Human-Computer Interaction Based Only on Auditory and Visual Information

  • Sha, Hui;Agah, Arvin
    • Transactions on Control, Automation and Systems Engineering / Vol. 2, No. 4 / pp.285-297 / 2000
  • One of the research objectives in the area of multimedia human-computer interaction is the application of artificial intelligence and robotics technologies to the development of computer interfaces. This involves utilizing many forms of media, integrating speech input, natural language, graphics, hand pointing gestures, and other methods for interactive dialogues. Although current human-computer communication methods include computer keyboards, mice, and other traditional devices, the two basic ways by which people communicate with each other are voice and gesture. This paper reports on research focusing on the development of an intelligent multimedia interface system modeled on the manner in which people communicate. This work explores interaction between humans and computers based only on the processing of speech (words uttered by the person) and the processing of images (hand pointing gestures). The purpose of the interface is to control a pan/tilt camera, pointing it to a location specified by the user through the utterance of words and the pointing of the hand. The system utilizes another, stationary camera to capture images of the user's hand and a microphone to capture the user's words. Upon processing the images and sounds, the system responds by pointing the camera. Initially, the interface uses hand pointing to locate the general position the user is referring to; it then uses voice commands provided by the user to fine-tune the location and to change the zoom of the camera, if requested. The image of the location is captured by the pan/tilt camera and sent to a color TV monitor to be displayed. This type of system has applications in tele-conferencing and other remote operations, where the system must respond to the user's commands in a manner similar to how the user would communicate with another person. The advantage of this approach is the elimination of the traditional input devices that the user must utilize in order to control a pan/tilt camera, replacing them with more "natural" means of interaction. A number of experiments were performed to evaluate the interface system with respect to its accuracy, efficiency, reliability, and limitations.

  • PDF
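The coarse-pointing-then-voice-refinement control flow described above can be sketched as a toy fusion loop. The command vocabulary, the angle scaling, and the state layout below are all invented for illustration; the paper's actual mapping is not specified in the abstract.

```python
# A toy sketch of "point to select, speak to refine" camera control.
# Command names, scaling factors, and step sizes are hypothetical.

def point_to_angles(hand_vec):
    """Map a 2D pointing direction (image-plane offset) to coarse pan/tilt degrees."""
    dx, dy = hand_vec
    return {"pan": dx * 30.0, "tilt": dy * 20.0, "zoom": 1.0}

def apply_voice(state, command):
    """Refine the camera state with a recognized word."""
    steps = {"left": ("pan", -2.0), "right": ("pan", +2.0),
             "up": ("tilt", +2.0), "down": ("tilt", -2.0)}
    if command == "zoom":
        state["zoom"] *= 2.0
    elif command in steps:
        axis, delta = steps[command]
        state[axis] += delta
    return state

state = point_to_angles((0.5, -0.25))   # coarse target from the pointing gesture
for word in ["left", "up", "zoom"]:     # fine-tuning by voice commands
    state = apply_voice(state, word)
```

The two-stage design mirrors how the abstract divides labor: gesture gives a fast, approximate target; speech provides precise, incremental corrections.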

Intelligent Emotional Interface for Personal Robot and Its Application to a Humanoid Robot, AMIET

  • Seo, Yong-Ho;Jeong, Il-Woong;Jung, Hye-Won;Yang, Hyun-S.
    • 제어로봇시스템학회: Conference Proceedings / 2004 ICCAS / pp.1764-1768 / 2004
  • In the near future, robots will be used for personal purposes. To provide useful services to humans, robots will need to understand human intentions. Consequently, the development of emotional interfaces for robots is an important expansion of human-robot interaction. We designed and developed an intelligent emotional interface for a robot and applied it to our humanoid robot, AMIET. Subsequent human-robot interaction demonstrated that our intelligent emotional interface is very intuitive and friendly.

  • PDF

Integrated Approach of Multiple Face Detection for Video Surveillance

  • Kim, Tae-Kyun;Lee, Sung-Uk;Lee, Jong-Ha;Kee, Seok-Cheol;Kim, Sang-Ryong
    • 대한전자공학회: 2003 Summer Conference Proceedings Ⅳ / pp.1960-1963 / 2003
  • For applications such as video surveillance and human-computer interfaces, we propose an efficiently integrated method to detect and track faces. Various visual cues are combined in the algorithm: motion, skin color, global appearance, and facial pattern detection. ICA (Independent Component Analysis)-SVM (Support Vector Machine)-based pattern detection is performed on the candidate regions extracted using motion, color, and global appearance information. Simultaneous execution of detection and short-term tracking also increases the rate and accuracy of detection. Experimental results show that our detection rate is 91% with very few false alarms, running at about 4 frames per second for 640 by 480 pixel images on a 1 GHz Pentium IV.

  • PDF
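The cascade structure the abstract outlines, where cheap cues (motion, skin color) gate candidate regions before an expensive pattern classifier confirms them, can be sketched schematically. The cue tests and the classifier below are placeholders standing in for the paper's ICA-feature extraction and SVM decision, and the region fields are invented.

```python
# Schematic of a cue-gated detection cascade. The heuristics here are
# stand-ins for the paper's motion/color analysis and ICA-SVM classifier.

def motion_cue(region):
    return region["frame_diff"] > 10        # enough inter-frame change

def skin_cue(region):
    r, g, b = region["mean_rgb"]
    return r > g > b                        # crude skin-tone heuristic

def pattern_classifier(region):
    # Placeholder for the ICA-feature + SVM decision.
    return region["score"] > 0.0

def detect_faces(regions):
    candidates = [r for r in regions if motion_cue(r) and skin_cue(r)]
    return [r for r in candidates if pattern_classifier(r)]

regions = [
    {"frame_diff": 25, "mean_rgb": (180, 120, 90), "score": 0.7},   # moving face
    {"frame_diff": 2,  "mean_rgb": (180, 120, 90), "score": 0.9},   # static region
    {"frame_diff": 30, "mean_rgb": (90, 120, 180), "score": 0.8},   # not skin-toned
]
faces = detect_faces(regions)
```

The point of the cascade is that only regions passing the cheap cues ever reach the expensive classifier, which is what makes near-real-time rates feasible on 2003-era hardware.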

Stereo-Vision-Based Human-Computer Interaction with Tactile Stimulation

  • Yong, Ho-Joong;Back, Jong-Won;Jang, Tae-Jeong
    • ETRI Journal / Vol. 29, No. 3 / pp.305-310 / 2007
  • If a virtual object in a virtual environment represented by a stereo vision system could be touched by a user with some tactile feeling on his/her fingertip, the sense of reality would be heightened. To create a visual impression as if the user were directly pointing to a desired point on a virtual object with his/her own finger, we need to align virtual space coordinates and physical space coordinates. Also, if there is no tactile feeling when the user touches a virtual object, the virtual object would seem to be a ghost. Therefore, a haptic interface device is required to give some tactile sensation to the user. We have constructed such a human-computer interaction system in the form of a simple virtual reality game using a stereo vision system, a vibro-tactile device module, and two position/orientation sensors.

  • PDF
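The abstract notes that virtual and physical space coordinates must be aligned so the user appears to touch the virtual object directly. A minimal way to picture such an alignment is a per-axis scale-and-offset fit from calibration point pairs; this is only a sketch of the idea, not the paper's calibration method, and the sample coordinates are fabricated.

```python
# Minimal per-axis affine alignment of physical -> virtual coordinates
# from calibration pairs. Illustrative only; not the paper's method.

def fit_alignment(phys_pts, virt_pts):
    """Estimate (scale, offset) per axis by least squares."""
    n = len(phys_pts)
    params = []
    for axis in range(3):
        p = [pt[axis] for pt in phys_pts]
        v = [pt[axis] for pt in virt_pts]
        pm, vm = sum(p) / n, sum(v) / n
        scale = sum((pi - pm) * (vi - vm) for pi, vi in zip(p, v)) / \
                sum((pi - pm) ** 2 for pi in p)
        params.append((scale, vm - scale * pm))
    return params

def to_virtual(params, pt):
    """Map a physical-space point into virtual-space coordinates."""
    return tuple(s * x + o for (s, o), x in zip(params, pt))

# Fabricated calibration pairs: physical sensor readings vs. virtual positions.
phys = [(0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1)]
virt = [(10, 20, 30), (12, 20, 30), (10, 22, 30), (10, 20, 32)]
params = fit_alignment(phys, virt)
fingertip = to_virtual(params, (0.5, 0.5, 0.5))
```

A real stereo-vision setup would use a full rigid or projective transform estimated from many point pairs, but the calibrate-then-map structure is the same.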

Tangible Space Initiative

  • Ahn, Chong-Keun;Kim, Lae-Hyun;Ha, Sung-Do
    • 한국정밀공학회: 2004 Fall Conference Proceedings / pp.1053-1056 / 2004
  • Research in Human-Computer Interface (HCI) is moving toward the development of application environments that can handle interactions between humans and computers more intuitively and efficiently. This can be achieved by bridging the gap between the synthetic virtual environment and the natural physical environment. To this end, a project called the Tangible Space Initiative (TSI) has been launched by KIST. TSI is subdivided into the Tangible Interface (TI), which controls 3D cyber space from the user's perspective; Responsive Cyber Space (RCS), which creates and controls the virtual environment; and the Tangible Agent (TA), which senses and acts upon the physical interface environment on behalf of any component of TSI or the user. This paper is a brief introduction to a new generation of human-computer interfaces that will bring users into a new era of interaction with computers.

  • PDF

Technology Requirements for Wearable User Interface

  • Cho, Il-Yeon
    • 대한인간공학회지 / Vol. 34, No. 5 / pp.531-540 / 2015
  • Objective: The objective of this research is to investigate the fundamentals of human-computer interaction for wearable computers and derive technology requirements. Background: A wearable computer can be worn at any time, with the support of unrestricted communications and a variety of services that provide maximum capability of information use. Key challenges in developing such wearable computers are a level of comfort such that users do not feel what they are wearing, and an easy, intuitive user interface. The research presented in this paper examines user interfaces for wearable computers. Method: In this research, we classified wearable user interface technologies and analyzed their advantages and disadvantages from the user's point of view. Based on this analysis, we identified the user interface technologies on which to focus research and development for commercialization. Results: Technology requirements for commercializing wearable computers are derived. Conclusion: User interface technology for wearable systems must start from an understanding of the ergonomic characteristics of the end user, because users wear the system on their bodies. Developers should not attempt to build state-of-the-art technology without a requirement analysis of the end users: if people do not use a technology, it cannot survive in the market. Currently, there is no dominant wearable user interface, so this area invites new challenges beyond the traditional interface paradigm through various approaches and attempts. Application: The findings in this study are expected to be used in designing user interfaces for wearable systems, such as digital clothes and fashion apparel.

Human-Computer Natural User Interface Based on Hand Motion Detection and Tracking

  • Xu, Wenkai;Lee, Eung-Joo
    • 한국멀티미디어학회논문지 / Vol. 15, No. 4 / pp.501-507 / 2012
  • Human body motion is a non-verbal channel of interaction that can be used to connect the real world and the virtual world. In this paper, we present a study on a natural user interface (NUI) for human hand motion recognition using RGB color information and depth information from a Microsoft Kinect camera. So that hand tracking and gesture recognition have no major dependencies on the work environment, lighting, or the user's skin color, we used libraries for natural interaction with the Kinect device, which provides RGB images of the environment and a depth map of the scene. An improved Camshift tracking algorithm is used to track hand motion; experimental results show that it outperforms the standard Camshift algorithm, with higher stability and accuracy.
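At the core of Camshift is an iterated mean-shift step that moves a search window toward the centroid of a probability map (for hand tracking, a skin-color back-projection, here imagined as gated by Kinect depth, which is the kind of improvement the abstract alludes to). The sketch below shows one such mean-shift loop on a toy map; the map values are fabricated, and a full Camshift would also adapt the window size and orientation each frame.

```python
# A bare mean-shift loop of the kind Camshift iterates. The probability
# map is a toy stand-in for a depth-gated skin-color back-projection.

def mean_shift(prob, window, iters=10):
    """prob: 2D list of weights; window: (x, y, w, h).
    Repeatedly shift the window toward the weighted centroid of the
    probabilities it covers, until it stops moving."""
    x, y, w, h = window
    for _ in range(iters):
        m00 = m10 = m01 = 0.0
        for j in range(y, y + h):
            for i in range(x, x + w):
                p = prob[j][i]
                m00 += p
                m10 += p * i       # first moment in x
                m01 += p * j       # first moment in y
        if m00 == 0:
            break                  # no mass under the window
        cx, cy = m10 / m00, m01 / m00
        nx, ny = int(round(cx - w / 2)), int(round(cy - h / 2))
        if (nx, ny) == (x, y):
            break                  # converged
        x, y = nx, ny
    return x, y, w, h

# 8x8 map with a bright blob centered near (5.5, 5.5); window starts at (2, 2).
prob = [[0.0] * 8 for _ in range(8)]
for j in (5, 6):
    for i in (5, 6):
        prob[j][i] = 1.0
track = mean_shift(prob, (2, 2, 4, 4))
```

In practice one would use OpenCV's `cv2.CamShift` on a back-projected histogram rather than hand-rolling the loop; the sketch only makes the centroid-climbing behavior concrete.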