• Title/Summary/Keyword: Gesture Interface

A Study on the Practical Human Robot Interface Design for the Development of Shopping Service Support Robot (쇼핑 서비스 지원 로봇 개발을 위한 실체적인 Human Robot Interface 디자인 개발에 관한 연구)

  • Hong Seong-Soo;Heo Seong-Cheol;Kim Eok;Chang Young-Ju
    • Archives of design research
    • /
    • v.19 no.4 s.66
    • /
    • pp.81-90
    • /
    • 2006
  • Robot design serves as the crucial link between a human and a robot, a piece of cutting-edge technology. The importance of robot design will only grow as the consumer robot market matures. For humans and robots to coexist, human-friendly interface design and robot design that take human interaction into account need to be developed. This research extracts the set of required functions from a series of case studies for the planning and design of a 'Shopping Support Robot'. The robot is planned according to HRI design aspects, and the design process follows. Concrete results are derived by applying a series of HRI aspects such as gestures, expressions, and sound. To verify the effectiveness of applying these HRI aspects, this research proposes a unified human-robot interaction built from motion capture, animation, brain waves, and sound.

A New Eye Tracking Method as a Smartphone Interface

  • Lee, Eui Chul;Park, Min Woo
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.7 no.4
    • /
    • pp.834-848
    • /
    • 2013
  • To make effective use of these functions, many kinds of human-phone interfaces are used, such as touch, voice, and gesture. However, the most important of these, the touch interface, cannot be used by people with hand disabilities or when both hands are busy. Although eye tracking is a superb human-computer interface method, it has not been applied to smartphones because of the small screen size, the frequently changing geometric relation between the user's face and the phone screen, and the low resolution of front-facing cameras. In this paper, a new eye tracking method is proposed to act as a smartphone user interface. To maximize eye image resolution, a zoom lens and three infrared LEDs are adopted. The proposed method has the following novelties. First, appropriate camera specifications and image resolution are analyzed for smartphone-based gaze tracking. Second, facial movement is allowed as long as one eye region is included in the image. Third, the method operates in both landscape and portrait screen modes. Fourth, only two LED reflection positions are used to calculate the gaze position, based on the 2D geometric relation between the reflection rectangle and the screen. Fifth, a prototype mock-up module was built to confirm the feasibility of applying the method to an actual smartphone. Experimental results showed that the gaze estimation error was about 31 pixels at a screen resolution of 480×800, and the average hit ratio on a 5×4 icon grid was 94.6%.
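
The abstract's fourth point, estimating gaze from two LED reflection positions via a 2D geometric relation, can be illustrated with a deliberately simplified sketch. This is not the paper's actual calibration model; the function name, the midpoint-offset mapping, and the `gain` values are all illustrative assumptions.

```python
# Hedged sketch: mapping a pupil-center offset to screen coordinates using
# two corneal glint (LED reflection) positions. A simplified stand-in for
# the 2D geometric relation described in the abstract; all values illustrative.

def gaze_to_screen(pupil, glint_left, glint_right,
                   screen_w=480, screen_h=800, gain=(6.0, 6.0)):
    """Estimate the on-screen gaze point from one eye image.

    pupil, glint_left, glint_right: (x, y) pixel positions in the eye image.
    The two glints fix a head-pose-tolerant reference frame; the pupil's
    offset from their midpoint is scaled to screen pixels by `gain`.
    """
    mid_x = (glint_left[0] + glint_right[0]) / 2.0
    mid_y = (glint_left[1] + glint_right[1]) / 2.0
    # Normalize by the inter-glint distance so the estimate tolerates
    # changes in camera-to-eye distance (zoom).
    base = max(abs(glint_right[0] - glint_left[0]), 1e-6)
    dx = (pupil[0] - mid_x) / base
    dy = (pupil[1] - mid_y) / base
    x = screen_w / 2.0 + gain[0] * dx * screen_w
    y = screen_h / 2.0 + gain[1] * dy * screen_h
    # Clamp the estimate to the screen bounds.
    return (min(max(x, 0), screen_w - 1), min(max(y, 0), screen_h - 1))
```

With the pupil exactly at the glint midpoint, the sketch reports the screen center; a real system would replace the fixed `gain` with per-user calibration.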

Emotional Interface Technologies for Service Robot (서비스 로봇을 위한 감성인터페이스 기술)

  • Yang, Hyun-Seung;Seo, Yong-Ho;Jeong, Il-Woong;Han, Tae-Woo;Rho, Dong-Hyun
    • The Journal of Korea Robotics Society
    • /
    • v.1 no.1
    • /
    • pp.58-65
    • /
    • 2006
  • An emotional interface is an essential technology for a robot to provide proper service to the user. In this research, we developed emotional components for a service robot: a neural-network-based facial expression recognizer, emotion expression technologies based on 3D graphical facial expressions and joint movements that take the user's reaction into account, and behavior selection technology for emotion expression. We used our humanoid robots, AMI and AMIET, as test-beds for the emotional interface, and studied the emotional interaction between a service robot and a user by integrating the developed technologies. Emotional interface technology can make interaction with a service robot friendlier, increase the diversity of its services and its added value for humans, accelerate market growth, and contribute to the popularization of robots.

Design of dataglove based multimodal interface for 3D object manipulation in virtual environment (3 차원 오브젝트 직접조작을 위한 데이터 글러브 기반의 멀티모달 인터페이스 설계)

  • Lim, Mi-Jung;Park, Peom
    • Proceedings of the HCI Society of Korea Conference
    • /
    • 2006.02a
    • /
    • pp.1011-1018
    • /
    • 2006
  • A multimodal interface is a cognition-based technology that interprets and encodes information about natural human behaviors such as gestures, gaze, hand movements, behavioral patterns, speech, and physical location. In this paper, we design and implement a 3D-object-based multimodal interface using gesture, voice, and touch. The service domain is the smart home: through direct manipulation of 3D objects, users can remotely monitor and control objects in the home. Because multimodal input and output require several modalities to be recognized and processed in parallel, the combination of modalities, their encoding methods, and the input/output formats become issues. Based on an analysis of the characteristics of each modality and of human cognitive structure, this study presents an input combination scheme for the gesture, voice, and touch modalities and designs an efficient multimodal prototype for 3D object interaction.
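
The input-combination problem the abstract raises, fusing parallel modalities into one command, can be sketched as a simple time-window fusion rule: voice supplies the action, a gesture or touch supplies the target. This is a minimal illustration, not the paper's scheme; the event format and the one-second window are assumptions.

```python
# Hedged sketch of multimodal input combination: a voice command supplies
# the action, a pointing gesture or touch supplies the target object, and
# the two are fused when they arrive within a short time window.
# Event format and window length are illustrative assumptions.

def fuse(events, window=1.0):
    """events: list of (timestamp, modality, value) tuples, e.g.
    (0.1, 'voice', 'turn_on') or (0.4, 'touch', 'lamp').
    Returns (action, target) commands whose parts arrived within `window` s."""
    commands = []
    pending_action = None   # (time, action) from voice
    pending_target = None   # (time, target) from gesture/touch
    for t, modality, value in sorted(events):
        if modality == "voice":
            pending_action = (t, value)
        else:  # 'gesture' or 'touch' select an object
            pending_target = (t, value)
        if pending_action and pending_target:
            if abs(pending_action[0] - pending_target[0]) <= window:
                commands.append((pending_action[1], pending_target[1]))
                pending_action = pending_target = None
    return commands
```

So saying "turn on" while touching the lamp yields `("turn_on", "lamp")`, while parts separated by more than the window are left unfused.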

A Deep Learning-based Hand Gesture Recognition Robust to External Environments (외부 환경에 강인한 딥러닝 기반 손 제스처 인식)

  • Oh, Dong-Han;Lee, Byeong-Hee;Kim, Tae-Young
    • The Journal of Korean Institute of Next Generation Computing
    • /
    • v.14 no.5
    • /
    • pp.31-39
    • /
    • 2018
  • Recently, there have been active studies on providing a user-friendly interface in virtual reality environments by recognizing user hand gestures with deep learning. However, most studies use separate sensors to obtain hand information or require preprocessing for efficient learning, and they fail to account for changes in the external environment, such as changes in lighting or partial occlusion of the hand. This paper proposes a deep-learning-based hand gesture recognition method that is robust to external environments, without preprocessing of the RGB images obtained from an ordinary webcam. We modify the VGGNet and GoogLeNet structures and compare the performance of each. The modified VGGNet and GoogLeNet achieved recognition rates of 93.88% and 93.75%, respectively, on data containing dim, partially obscured, or partially out-of-view hand images. In terms of memory and speed, GoogLeNet used about 3 times less memory than VGGNet, and its processing speed was 10 times faster. The method runs in real time and can be used as a hand gesture interface in areas such as games, education, and medical services in virtual reality environments.
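
Whatever backbone produces the class scores, a recognizer like this ends with a softmax-and-threshold step, and rejecting low-confidence frames is one simple way to cope with dim or occluded hands. The sketch below shows only that final step; the gesture labels, the logits, and the 0.5 threshold are illustrative, and a real system would obtain the logits from the trained network.

```python
import numpy as np

# Hedged sketch: the classification step at the end of a CNN-based hand
# gesture recognizer. A real model would compute the logits from an RGB
# frame; here the labels, logits, and threshold are placeholders.

GESTURES = ["fist", "palm", "point", "ok", "none"]  # illustrative label set

def classify(logits, threshold=0.5):
    """Softmax over class logits; return (label, confidence).
    Frames whose top probability falls below `threshold` are rejected,
    which helps with dim or partially occluded hands."""
    z = np.asarray(logits, dtype=float)
    z = z - z.max()                       # shift for numerical stability
    p = np.exp(z) / np.exp(z).sum()       # softmax probabilities
    i = int(p.argmax())
    if p[i] < threshold:
        return "none", float(p[i])
    return GESTURES[i], float(p[i])
```

A strong logit for one class yields that label with high confidence; a flat logit vector falls below the threshold and is rejected.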

Augmented Reality Authoring Tool with Marker & Gesture Interactive Features (마커 및 제스처 상호작용이 가능한 증강현실 저작도구)

  • Shim, Jinwook;Kong, Minje;Kim, Hayoung;Chae, Seungho;Jeong, Kyungho;Seo, Jonghoon;Han, Tack-Don
    • Journal of Korea Multimedia Society
    • /
    • v.16 no.6
    • /
    • pp.720-734
    • /
    • 2013
  • In this paper, we present an augmented reality authoring tool with which users can easily create augmented reality content using hand-gesture and marker-based interaction methods. Previous augmented reality authoring tools focus on augmenting a virtual object, and interaction with such content relies on markers or sensors. We address this limited interaction by combining a marker-based interaction method with a gesture interaction method using a depth-sensing camera, the Kinect. With the proposed system, users can easily develop simple marker-based augmented reality content through its interface. Moreover, rather than providing only fragmentary content, the system offers ways for users to interact actively with the augmented content. Two interaction methods are provided: a marker-based method using two markers, and one utilizing marker occlusion. In addition, by recognizing and tracking the user's bare hand, the system provides gesture interactions that can zoom in, zoom out, move, and rotate an object. A heuristic evaluation of the authoring tool and a usability comparison of the marker and gesture interactions both yielded positive results.
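
The marker-occlusion interaction mentioned above amounts to a small state machine: a marker that was being tracked and then disappears for several consecutive frames (e.g. covered by the user's hand) fires a "press" event. The sketch below is an assumed reconstruction of that idea, not the paper's implementation; the class name and frame count are illustrative.

```python
# Hedged sketch of marker-occlusion interaction: a tracked marker that
# vanishes for `hold_frames` consecutive frames is treated as "pressed".
# Class name and frame threshold are illustrative assumptions.

class OcclusionButton:
    def __init__(self, hold_frames=3):
        self.hold_frames = hold_frames
        self.seen = False      # marker must be tracked before it can fire
        self.missing = 0       # consecutive frames without a detection

    def update(self, marker_visible):
        """Feed one frame's detection result; returns True on a press event."""
        if marker_visible:
            self.seen = True
            self.missing = 0
            return False
        if not self.seen:
            return False       # never tracked yet: absence means nothing
        self.missing += 1
        # Fire exactly once, on the frame the threshold is reached.
        return self.missing == self.hold_frames
```

Requiring several missing frames filters out single-frame tracking dropouts, so brief detection noise does not trigger a press.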

A Study on the Implementation of a Physical Interface (피지컬 인터페이스의 구현에 관한 연구)

  • 오병근
    • Archives of design research
    • /
    • v.16 no.2
    • /
    • pp.131-140
    • /
    • 2003
  • The input for computer interaction design is very limited when users can control the interface only with a keyboard and mouse. However, with basic electrical engineering, the input design can differ from the existing method. Interactive art using computer technology has recently emerged; it is completed by people's participation. The electric signals transmitted in digital and analog form from a human-controlled interface to the computer can be used in multimedia interaction design. Electric circuit design is necessary to transmit stable electric signals from the interface. Electric switches, sensors, and camera technologies can be applied to input interface design as alternative physical interactions that need no computer keyboard or mouse. This type of interaction design, using human body language and gesture, conveys the richness of humanity.

Gesture based HTPC Interface (제스처 기반의 HTPC 인터페이스)

  • 권경수;김상호;장재식;김항준
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2004.10b
    • /
    • pp.715-717
    • /
    • 2004
  • In this paper, we propose a gesture-based interface for controlling an HTPC. With the proposed interface, users can easily control the HTPC from a distance. To recognize gestures, the interface detects the user's hand in real-time continuous images and extracts hand motion, shape, and position information. The extracted information and HMMs are used to recognize the user's gestures. Experimental results show that the proposed interface enables interactive access between the user and the HTPC not only in multimedia applications but also in other kinds of computer applications.
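
The HMM-based recognition step the abstract describes typically scores the observed hand-motion sequence against one model per gesture and picks the likeliest. A minimal sketch of that scoring with the forward algorithm, using toy discrete observations; the gesture names and all probability values are illustrative, not the paper's trained parameters.

```python
# Hedged sketch of HMM-based gesture recognition: each gesture has its own
# discrete-observation HMM, and the interface picks the model giving the
# observed motion sequence the highest likelihood. Toy parameters throughout.

def forward_likelihood(obs, pi, A, B):
    """Forward algorithm: P(obs | HMM) for a discrete-observation HMM.
    pi: initial state probs; A: state transition matrix;
    B: emission probs, B[state][symbol]."""
    n = len(pi)
    alpha = [pi[s] * B[s][obs[0]] for s in range(n)]
    for o in obs[1:]:
        alpha = [sum(alpha[s] * A[s][t] for s in range(n)) * B[t][o]
                 for t in range(n)]
    return sum(alpha)

def recognize(obs, models):
    """models: {gesture_name: (pi, A, B)}; returns the best-scoring gesture."""
    return max(models, key=lambda g: forward_likelihood(obs, *models[g]))
```

For example, with a one-state "swipe" model that emits symbol 0 with probability 0.9 and a "circle" model that favors symbol 1, the sequence [0, 0, 0] is attributed to "swipe".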

State-of-the-Art on Gesture Sensing Technology Based on Infrared Proximity Sensor (적외선 근접센서 기반 제스처 센싱기술 동향)

  • Suk, J.H.;Jeon, Y.D.;Lyuh, C.G.
    • Electronics and Telecommunications Trends
    • /
    • v.30 no.6
    • /
    • pp.31-41
    • /
    • 2015
  • People touch and operate devices, and receive services, through a user interface (UI). Because most input devices require the user's touch, it is difficult to convey the user's intent to a device when touching it is impossible. This article introduces gesture sensing technology based on infrared proximity sensors, one of the technologies that enable user input without contact.
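
A common pattern in the IR-proximity gesture systems this survey covers is inferring a swipe from the order in which neighboring sensor channels detect the hand. The sketch below is an assumed minimal illustration of that idea, not any specific system from the survey; the sample values and threshold are made up.

```python
# Hedged sketch: detecting a left/right swipe from two IR proximity channels
# by comparing which channel first exceeds the proximity threshold.
# Threshold and sample values are illustrative.

def detect_swipe(left_samples, right_samples, threshold=100):
    """Return 'left-to-right', 'right-to-left', or None.
    Each argument is a time-ordered list of raw proximity readings
    (larger value = hand closer) from one sensor."""
    def first_hit(samples):
        for i, v in enumerate(samples):
            if v > threshold:
                return i
        return None
    l, r = first_hit(left_samples), first_hit(right_samples)
    if l is None or r is None or l == r:
        return None            # no swipe, or direction ambiguous
    return "left-to-right" if l < r else "right-to-left"
```

Real multi-sensor parts extend the same ordering idea to four channels to cover up/down swipes as well.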

State-of-the-Art on Gesture Sensing Technology Based on Infrared Proximity Sensor (스마트폰 시장동향 - 적외선 근접센서 기반 제스처 센싱기술 동향)

  • Suk, J.H.;Jeon, J.D.;Lyuh, C.G.
    • The Optical Journal
    • /
    • s.161
    • /
    • pp.58-73
    • /
    • 2016
  • People touch and operate devices, and receive services, through a user interface (UI). Because most input devices require the user's touch, it is difficult to convey the user's intent to a device when touching it is impossible. This article introduces gesture sensing technology based on infrared proximity sensors, one of the technologies that enable contact-free user input.