• Title/Abstract/Keywords: Touch gesture

73 search results

적외선 영상센서를 이용한 스마트 터치 프로젝터 시스템 기술 연구 (A Study on Smart Touch Projector System Technology Using Infrared (IR) Imaging Sensor)

  • 이국선;오상헌;전국휘;강성수;유동희;김병규
    • 한국멀티미디어학회논문지 / Vol. 15, No. 7 / pp.870-878 / 2012
  • With recent advances in computing combined with advances in sensor technology, a wide variety of user interface (UI) technologies based on user experience have emerged. This study introduces research on a smart touch projector system using infrared imaging and presents its results. In the proposed system, when the user generates an event with an infrared pen while using a beam projector, the event is recognized through an infrared imaging sensor and converted into a mouse event. Movement event patterns were designed based on extracting and tracking the motion of the input pen, and because the resolution of the input image differs from that of the actual hardware, a screen coordinate calibration algorithm is proposed to minimize this error. With only a simple processor mounted on the beam projector, this technology allows meetings and presentations to be held at any time without a separate laptop, making it a next-generation Human-Computer Interaction (HCI) technology.
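
The abstract does not spell out the calibration procedure itself, but a common way to map IR-pen positions detected in the camera image onto projector screen coordinates is a four-point homography. The sketch below is illustrative only and assumes NumPy; the corner coordinates are hypothetical, not values from the paper.

```python
import numpy as np

def fit_homography(src_pts, dst_pts):
    """Estimate a 3x3 homography mapping src_pts -> dst_pts (4+ point pairs, DLT)."""
    A = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        A.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        A.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(A, dtype=float))
    return vt[-1].reshape(3, 3)          # null-space vector as 3x3 matrix

def camera_to_screen(H, point):
    """Map an IR-pen position from camera pixels to projector screen pixels."""
    x, y = point
    u, v, w = H @ np.array([x, y, 1.0])
    return u / w, v / w

# Calibration: four projected corner markers as seen by the IR camera
# (hypothetical values) paired with the projector's screen corners.
camera_corners = [(52, 38), (598, 45), (590, 410), (60, 402)]
screen_corners = [(0, 0), (1280, 0), (1280, 800), (0, 800)]
H = fit_homography(camera_corners, screen_corners)
print(camera_to_screen(H, (320, 240)))   # pen position -> screen coordinate
```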

Automatic Gesture Recognition for Human-Machine Interaction: An Overview

  • Nataliia, Konkina
    • International Journal of Computer Science & Network Security / Vol. 22, No. 1 / pp.129-138 / 2022
  • With the increasing reliance on computing systems in our everyday lives, there is a constant need to improve the ways users can interact with such systems in a more natural, effective, and convenient manner. In the initial computing revolution, interaction between humans and machines was limited; the machines were not necessarily meant to be intelligent. This created the need for systems that can automatically identify and interpret our actions. Automatic gesture recognition is one of the popular methods by which users can control systems with their gestures, covering various kinds of tracking, including the whole body, hands, head, and face. We also touch upon a different line of work, including Brain-Computer Interfaces (BCI) and Electromyography (EMG), as potential additions to the gesture recognition regime. In this work, we present an overview of several applications of automated gesture recognition systems and a brief look at the popular methods employed.

Interacting with Touchless Gestures: Taxonomy and Requirements

  • Kim, Huhn
    • 대한인간공학회지 / Vol. 31, No. 4 / pp.475-481 / 2012
  • Objective: The aim of this study is to build a taxonomy for classifying diverse touchless gestures and to establish the design requirements that should be considered when selecting suitable gestures in gesture-based interaction design. Background: Recently, the applicability of touchless gestures has been increasing as the relevant technologies advance. However, before touchless gestures are widely applied to various devices or systems, an understanding of the nature of human gestures and their standardization are prerequisites. Method: In this study, diverse gesture types from the literature were collected and, based on these, a new taxonomy for classifying touchless gestures was proposed. Many gesture-based interaction design cases and studies were then analyzed. Results: The proposed taxonomy consists of two dimensions: shape (deictic, manipulative, semantic, or descriptive) and motion (static or dynamic). The case analysis based on the taxonomy showed that manipulative and dynamic gestures are widely applied. Conclusion: Four core requirements for valuable touchless gestures were identified: intuitiveness, learnability, convenience, and discriminability. Application: The gesture taxonomy can be applied to produce alternatives for applicable touchless gestures, and the four design requirements can be used as criteria for evaluating those alternatives.
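
As a reading aid, the two-dimensional taxonomy described above (shape x motion) can be captured in a small data structure. The sketch below is illustrative only: the class labels come from the abstract, while the example gestures and the `classify` helper are hypothetical.

```python
from enum import Enum

class Shape(Enum):
    DEICTIC = "deictic"            # pointing at an object or location
    MANIPULATIVE = "manipulative"  # grabbing, rotating, dragging
    SEMANTIC = "semantic"          # conventional meaning, e.g. thumbs-up
    DESCRIPTIVE = "descriptive"    # tracing a size, path, or outline

class Motion(Enum):
    STATIC = "static"    # a held posture
    DYNAMIC = "dynamic"  # a moving gesture

def classify(name, shape, motion):
    """Tag a candidate gesture with its position in the taxonomy."""
    return {"gesture": name, "shape": shape.value, "motion": motion.value}

print(classify("swipe-to-next-page", Shape.MANIPULATIVE, Motion.DYNAMIC))
print(classify("point-at-icon", Shape.DEICTIC, Motion.STATIC))
```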

풀터치 휴대폰의 플릭(Flick) 성능에 대한 평가 및 가이드라인 (Guidelines for Satisfactory Flick Performances in Touch Screen Mobile Phone)

  • 김헌
    • 대한인간공학회지 / Vol. 29, No. 4 / pp.541-546 / 2010
  • The 'flick' gesture is the most fundamental and important element for efficient interaction on the touch screens that are being extensively applied to mobile phones. This study investigated users' satisfaction with the flick operation in representative touch phones and measured their performance with three established measures: the gap between the finger and the initial cursor, the number of list items moved per 0.2 seconds, and the number of list items moved after ten continuous flicks. The measurements were performed with a high-speed camera and motion analysis software. In the mobile phones with high user satisfaction, the gap between the finger and cursor positions was smaller, and the scrolling speed peaked quickly within 0.6 seconds and then slowed down drastically. In addition, the maximum and typical time intervals between continuous flicks were measured in an experiment. Based on the evaluation and measurements, several design guidelines for efficient flick performance were suggested.
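
For concreteness, the three flick measures named above could be computed from time-stamped tracking samples roughly as follows. This is a minimal sketch assuming a hypothetical sample format (timestamp, finger and cursor y-positions, topmost visible list index); it is not the paper's actual measurement pipeline.

```python
def finger_cursor_gap(samples):
    """Gap (pixels) between the finger and the initial cursor position at flick start."""
    first = samples[0]
    return abs(first["finger_y"] - first["cursor_y"])

def lists_per_02s(samples):
    """Number of list items scrolled during each 0.2-second window."""
    counts = []
    window_start, start_index = samples[0]["t"], samples[0]["list_index"]
    for s in samples[1:]:
        if s["t"] - window_start >= 0.2:
            counts.append(abs(s["list_index"] - start_index))
            window_start, start_index = s["t"], s["list_index"]
    return counts

def lists_after_n_flicks(flick_logs, n=10):
    """Total list items moved after n continuous flicks (one sample log per flick)."""
    return sum(abs(log[-1]["list_index"] - log[0]["list_index"]) for log in flick_logs[:n])
```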

제스처 형태의 한글입력을 위한 오토마타에 관한 연구 (A Study on the Automata for Hangul Input of Gesture Type)

  • 임양원;임한규
    • 한국산업정보학회논문지 / Vol. 16, No. 2 / pp.49-58 / 2011
  • With the widespread adoption of touch-screen smart devices, Hangul input methods have also diversified. This paper surveys and analyzes Hangul input methods suitable for smart devices and, using automata theory, presents a simple and efficient automaton that can be used in a gesture-type Hangul input method suited to touch UIs.
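
The paper's own automaton for gesture-type input is not reproduced in the abstract, but any such input method ultimately rests on a syllable-composition automaton over initial/medial/final jamo. The sketch below shows that standard composition logic in Python; the jamo tables follow the Unicode Hangul composition formula, and, as noted in the comments, compound vowels and re-splitting a final consonant before a following vowel are omitted for brevity.

```python
CHO  = list("ㄱㄲㄴㄷㄸㄹㅁㅂㅃㅅㅆㅇㅈㅉㅊㅋㅌㅍㅎ")             # 19 initial consonants
JUNG = list("ㅏㅐㅑㅒㅓㅔㅕㅖㅗㅘㅙㅚㅛㅜㅝㅞㅟㅠㅡㅢㅣ")         # 21 vowels
JONG = [""] + list("ㄱㄲㄳㄴㄵㄶㄷㄹㄺㄻㄼㄽㄾㄿㅀㅁㅂㅄㅅㅆㅇㅈㅊㅋㅌㅍㅎ")  # 28 finals (incl. none)

def compose(cho, jung, jong=""):
    """Combine initial/medial/final jamo into one precomposed Hangul syllable."""
    return chr(0xAC00 + (CHO.index(cho) * 21 + JUNG.index(jung)) * 28 + JONG.index(jong))

def automaton(jamo_stream):
    """States: EMPTY -> CHO -> CHO+JUNG -> CHO+JUNG+JONG; flush the syllable
    when the next jamo cannot extend the current state. Compound vowels and
    final-consonant re-splitting are omitted for brevity."""
    out, cho, jung, jong = [], None, None, ""

    def flush():
        nonlocal cho, jung, jong
        if cho and jung:
            out.append(compose(cho, jung, jong))
        elif cho:
            out.append(cho)
        cho, jung, jong = None, None, ""

    for j in jamo_stream:
        if j in JUNG:                               # vowel
            if cho is not None and jung is None:
                jung = j
            else:
                flush()
                out.append(j)                       # vowel with no initial: emit as-is
        else:                                       # consonant
            if cho is None:
                cho = j
            elif jung is not None and jong == "" and j in JONG:
                jong = j
            else:
                flush()
                cho = j
    flush()
    return "".join(out)

print(automaton("ㅎㅏㄴㄱㅡㄹ"))   # -> "한글"
```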

FTIR 멀티터치 테이블에서 효율적인 사용자 환경 개발 (The Design of Efficient User Environment on the FTIR Multi Touch)

  • 박상봉;안중서
    • 한국인터넷방송통신학회논문지 / Vol. 12, No. 2 / pp.85-94 / 2012
  • In this paper, finger gestures were developed for an FTIR multi-touch table that allow the screen to be controlled in a simpler way than with the gestures provided by Windows. In addition, the system was implemented so that a mobile device placed on the table can be recognized through infrared camera control, without using fingers. Because the FTIR multi-touch table environment does not provide an HID interface, a conventional Bluetooth connection was difficult to establish; the data transmission method using a mobile device proposed in this paper removes this inconvenience and enables effective data transfer. While a conventional Bluetooth connection involves a complicated pairing process, this study simplifies the pairing process and develops user-friendly gestures that make it possible to connect to a specific device. The approach was verified through tests on a custom-built FTIR multi-touch table.

A Comparison of the Characteristics between Single and Double Finger Gestures for Web Browsers

  • Park, Jae-Kyu;Lim, Young-Jae;Jung, Eui-S.
    • 대한인간공학회지 / Vol. 31, No. 5 / pp.629-636 / 2012
  • Objective: The purpose of this study is to compare the characteristics of single and double finger gestures for web browsers and to extract appropriate finger gestures. Background: As electronic equipment is miniaturized to improve portability, various interfaces are being developed as input devices. As electronic devices become smaller, gesture recognition technology using touch-based interfaces is favored for easy editing. In addition, users focus primarily on the simplicity of intuitive interfaces, which propels further research on gesture-based interfaces. In particular, finger gestures in these intuitive interfaces are simple, fast, and user-friendly. Recently, single and double finger gestures have become more popular, so more applications for these gestures are being developed. However, systems and software that employ such finger gestures lack consistency, and their standards and guidelines remain unclear. Method: To learn how these gestures are applied, we used the sketch map method, a memory elicitation technique. In addition, we used the MIMA (Meaning in Mediated Action) method to evaluate the gesture interface. Results: This study created gestures appropriate for intuitive judgment. We conducted a usability test consisting of single and double finger gestures. The results showed that double finger gestures required less performance time than single finger gestures. Single finger gestures showed a wide gap in satisfaction between similar and different function types: they can be judged intuitively for similar types, but it is difficult to associate them with functions of different types. Conclusion: This study found that double finger gestures are effective for associating functions in web navigation. In particular, double finger gestures can be effective for associating complex forms such as curve-shaped gestures. Application: This study is intended to facilitate the design of products that utilize finger and hand gestures.

Study on Gesture and Voice-based Interaction in Perspective of a Presentation Support Tool

  • Ha, Sang-Ho;Park, So-Young;Hong, Hye-Soo;Kim, Nam-Hun
    • 대한인간공학회지 / Vol. 31, No. 4 / pp.593-599 / 2012
  • Objective: This study aims to implement a non-contact gesture-based interface for presentation purposes and to analyze the effect of the proposed interface as an information transfer support device. Background: Recently, research on control devices using gesture or speech recognition has been conducted alongside rapid technological growth in the UI/UX area and the appearance of smart service products that require a new human-machine interface. However, relatively few quantitative studies on the practical effects of the new interface types have been conducted, while system implementation efforts are very popular. Method: The system presented in this study is implemented with the KINECT® sensor offered by Microsoft Corporation. To investigate whether the proposed system is effective as a presentation support tool, we conducted experiments by giving several lectures to 40 participants in both a traditional lecture room (keyboard-based presentation control) and a non-contact gesture-based lecture room (KINECT-based presentation control), evaluating their interest and immersion with respect to the lecture contents and lecturing methods, and analyzing their understanding of the lecture contents. Results: Using ANOVA, we examined whether the gesture-based presentation system can play an effective role as a presentation support tool depending on the difficulty level of the contents. Conclusion: A non-contact gesture-based interface is a meaningful supportive tool when delivering easy and simple information; however, its effect can vary with the contents and the difficulty level of the information provided. Application: The results presented in this paper might help in designing new human-machine (computer) interfaces for communication support tools.
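
To illustrate the kind of mapping such a presentation support tool performs, the sketch below turns a horizontal hand swipe, as tracked by a depth sensor, into next/previous-slide commands. The thresholds, the sample format, and the function names are assumptions for illustration, not the authors' actual KINECT implementation.

```python
SWIPE_DISTANCE = 0.35   # metres of lateral hand travel required
SWIPE_WINDOW = 0.5      # seconds within which the travel must happen

def detect_swipe(samples, now):
    """samples: list of (timestamp, hand_x) tuples, newest last.
    Returns "next", "previous", or None."""
    recent = [(t, x) for t, x in samples if now - t <= SWIPE_WINDOW]
    if len(recent) < 2:
        return None
    dx = recent[-1][1] - recent[0][1]
    if dx > SWIPE_DISTANCE:
        return "next"       # swipe to the right -> next slide
    if dx < -SWIPE_DISTANCE:
        return "previous"   # swipe to the left -> previous slide
    return None

# Example: hand moved ~0.4 m to the right within 0.3 s -> "next"
track = [(10.0, 0.10), (10.1, 0.22), (10.3, 0.48)]
print(detect_swipe(track, now=10.3))
```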

Implementation of Gesture Interface for Projected Surfaces

  • Park, Yong-Suk;Park, Se-Ho;Kim, Tae-Gon;Chung, Jong-Moon
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 9, No. 1 / pp.378-390 / 2015
  • Image projectors can turn any surface into a display. Integrating a surface projection with a user interface transforms it into an interactive display with many possible applications. Hand gesture interfaces are often used with projector-camera systems. Hand detection through color image processing is affected by the surrounding environment. The lack of illumination and color details greatly influences the detection process and drops the recognition success rate. In addition, there can be interference from the projection system itself due to image projection. In order to overcome these problems, a gesture interface based on depth images is proposed for projected surfaces. In this paper, a depth camera is used for hand recognition and for effectively extracting the area of the hand from the scene. A hand detection and finger tracking method based on depth images is proposed. Based on the proposed method, a touch interface for the projected surface is implemented and evaluated.
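
A minimal sketch of the depth-based touch detection idea described above: the hand is segmented as the region lying slightly above a pre-captured background depth map of the projected surface, and a touch fires when a convex-hull point (candidate fingertip) comes close enough to the surface. OpenCV and NumPy are assumed; the thresholds and the depth-frame source are illustrative, not the paper's exact method.

```python
import cv2
import numpy as np

TOUCH_MM, HOVER_MM = 5, 60   # height-above-surface band treated as a hand

def detect_touches(depth_mm, background_mm):
    """Return (x, y) pixel positions where a fingertip touches the projected surface."""
    # Height of each pixel above the pre-captured background surface.
    height = background_mm.astype(np.int32) - depth_mm.astype(np.int32)
    hand_mask = ((height > TOUCH_MM) & (height < HOVER_MM)).astype(np.uint8) * 255
    hand_mask = cv2.medianBlur(hand_mask, 5)
    contours, _ = cv2.findContours(hand_mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    touches = []
    for c in contours:
        if cv2.contourArea(c) < 300:          # drop small noise blobs
            continue
        hull = cv2.convexHull(c)
        for pt in hull.reshape(-1, 2):        # hull points as candidate fingertips
            x, y = int(pt[0]), int(pt[1])
            if TOUCH_MM < height[y, x] <= 2 * TOUCH_MM:
                touches.append((x, y))        # close enough to count as a touch
    return touches
```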

멀티터치 기술과 영상인식 기술 기반의 스마트 팩토리 플랫폼 (Smart Factory Platform based on Multi-Touch and Image Recognition Technologies)

  • 홍요훈;송승준;장광문;노정규
    • 한국인터넷방송통신학회논문지 / Vol. 18, No. 1 / pp.23-28 / 2018
  • In this study, a platform was developed that facilitates status monitoring and event management of a factory workplace by providing events and data collected from various multi-touch-based sensors installed on the shop floor. Using image recognition technology, the faces of people in the workplace are recognized to provide worker-specific customized content, and content security is strengthened through individual worker authentication via face recognition. A content control function based on gesture recognition was built so that workers can easily browse documents, and a face recognition function was also implemented on mobile devices to provide content for workers. The results of this study can be used to improve workplace safety, content security, and worker convenience, and can serve as a foundation technology for future smart factory construction.