• Title/Summary/Keyword: Touch gesture


Design and Implementation of PC-Mechanic Education Application System Using Image Processing (영상처리를 이용한 PC 내부구조 학습 어플리케이션 설계 및 구현)

  • Kim, Won-Jin;Kim, Hyung-Ook;Jo, Sung-Eun;Jang, Soo-Jeong;Moon, Il-Young
    • The Journal of Korean Institute for Practical Engineering Education
    • /
    • v.3 no.2
    • /
    • pp.93-99
    • /
    • 2011
  • We introduce an application that uses a multi-touch table for studying toward the PC-mechanic certification. These days people increasingly interact through gestures rather than only the mouse and keyboard, and multi-touch tables have become popular. We additionally present graphics and images, built for the multi-touch table with 3ds Max and C#. The application helps learners prepare for the certification by scaling and dragging components in the camera view, and includes domestic PC-mechanic exam questions.


A Study on Touchless Panel based Interactive Contents Service using IrDA Matrix

  • Lee, Minwoo;Lee, Dongwoo;Kim, Daehyeon;Ann, Myungsuk;Lee, Junghoon;Lee, Seungyoun;Cho, Juphil;Shin, Jaekwon;Cha, Jaesang
    • International Journal of Internet, Broadcasting and Communication
    • /
    • v.7 no.2
    • /
    • pp.73-78
    • /
    • 2015
  • Touch panels have mainly been of the pressure (resistive) type, but these suffer from low recognition rates and errors during long-term use. Capacitive touch panels are partly adopted to compensate for these problems, but they can exhibit the same problems when contaminated. Various other touch technologies have been developed but are not widely used because of high cost and difficult installation. In this paper, we therefore propose an input method for a touchless panel using an IrDA matrix composed of depth sensors, which makes it possible to offer a variety of contents to multiple users. The proposed technology requires a high-sensitivity sensing method, high-speed processing of position information for seamless operation control, and high-precision drive technology. We also propose seamless user recognition for an interactive contents service through the touchless panel using the IrDA matrix.
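
The abstract does not give the localization details, but the general idea of an IR beam matrix can be sketched as follows: a grid of emitter/receiver pairs reports which horizontal and vertical beams a hand interrupts, and the intersection of the blocked beams gives a touchless "touch" position. All names and the geometry below are illustrative assumptions, not the paper's method.

```python
# Simplified sketch: locating a hand in an IR emitter/receiver matrix.
# An n_rows x n_cols grid of beams; each blocked beam reports its index,
# and the intersection of blocked row/column beams gives the position.

def locate(blocked_rows, blocked_cols, panel_w, panel_h, n_rows, n_cols):
    """Return the (x, y) centre of the blocked-beam intersection, or None."""
    if not blocked_rows or not blocked_cols:
        return None  # nothing interrupts the matrix
    # Average the blocked indices so a hand spanning several beams
    # still yields a single stable point.
    row = sum(blocked_rows) / len(blocked_rows)
    col = sum(blocked_cols) / len(blocked_cols)
    # Beam i sits at the centre of cell i, i.e. at (i + 0.5) / count.
    x = (col + 0.5) / n_cols * panel_w
    y = (row + 0.5) / n_rows * panel_h
    return (x, y)
```

A real implementation would also debounce the sensor readings and reject isolated blocked beams as noise.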

Development of Multi Card Touch based Interactive Arcade Game System (멀티 카드 터치기반 인터랙티브 아케이드 게임 시스템 구현)

  • Lee, Dong-Hoon;Jo, Jae-Ik;Yun, Tae-Soo
    • Journal of Korea Entertainment Industry Association
    • /
    • v.5 no.2
    • /
    • pp.87-95
    • /
    • 2011
  • Recently, tangible game environments have become a major topic due to developments in interactive interfaces. In this paper, we propose a multi-card-touch based interactive arcade system using a marker recognition interface and a multi-touch interaction interface. In our system, each card's location and orientation are recognized through a DI-based recognition algorithm, and the user's tracked hand gestures drive various interaction metaphors. The system's higher engagement offers the user a new experience, and we expect it to be used in tangible arcade game machines.

Design of Gesture based Interfaces for Controlling GUI Applications (GUI 어플리케이션 제어를 위한 제스처 인터페이스 모델 설계)

  • Park, Ki-Chang;Seo, Seong-Chae;Jeong, Seung-Moon;Kang, Im-Cheol;Kim, Byung-Gi
    • The Journal of the Korea Contents Association
    • /
    • v.13 no.1
    • /
    • pp.55-63
    • /
    • 2013
  • NUI (Natural User Interfaces) evolved through CLI (Command Line Interfaces) and GUI (Graphical User Interfaces). NUIs use many different input modalities, including multi-touch, motion tracking, voice, and stylus. To adopt an NUI in a legacy GUI application, a developer must add device libraries, modify the relevant source code, and debug it. In this paper, we propose a gesture-based interface model that can be applied without modifying existing event-based GUI applications, and we present an XML schema for specifying the proposed model. We demonstrate a method of using the model through a prototype.
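
The core idea in the abstract, driving an unmodified event-based application by translating gestures into the events it already handles, can be sketched as a simple mapping table. The gesture names and event payloads below are illustrative assumptions; the paper specifies its mapping in XML rather than in code.

```python
# Minimal sketch of a gesture-to-event translation layer. A legacy
# event-driven GUI app keeps handling ordinary events, while this front
# layer converts recognized gestures into those synthetic events.

GESTURE_MAP = {
    "swipe_left":  {"type": "key", "key": "PageDown"},
    "swipe_right": {"type": "key", "key": "PageUp"},
    "pinch_in":    {"type": "command", "name": "zoom_out"},
    "pinch_out":   {"type": "command", "name": "zoom_in"},
}

def translate(gesture):
    """Map a recognized gesture to the synthetic GUI event the app expects."""
    event = GESTURE_MAP.get(gesture)
    if event is None:
        raise KeyError(f"no mapping declared for gesture {gesture!r}")
    return event
```

In the paper's design such a table would be loaded from an XML document conforming to the proposed schema, so the mapping can change without touching application code.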

A Study on the VR Payment System using Hand Gesture Recognition (손 제스쳐 인식을 활용한 VR 결제 시스템 연구)

  • Kim, Kyoung Hwan;Lee, Won Hyung
    • Journal of the Korean Society for Computer Game
    • /
    • v.31 no.4
    • /
    • pp.129-135
    • /
    • 2018
  • Electronic signatures, QR codes, and bar codes are used in everyday payment systems, and research has begun on payment systems implemented in VR environments. This paper proposes a VR electronic signature system that uses hand gesture recognition to bring an existing payment system into VR, where one cannot type on a keyboard or click a mouse. A payment system can be driven by a VR controller in several ways; electronic signing by hand gesture recognition is one of them, and hand gesture recognition methods can be classified into warping, statistical, and template matching methods. In this paper, the payment system was configured in VR using the $P algorithm, which belongs to the template matching family. To create the VR environment, we implemented a PayPal flow in which an actual payment is made, using Unity3D and Vive equipment.
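
As background on the template matching family mentioned above, a toy recognizer in the spirit of $P can be sketched: gestures are unordered point clouds, normalized for position and scale, and the template with the smallest cloud distance wins. The real $P algorithm also resamples strokes and uses a greedy alignment; the symmetric nearest-neighbour distance below is a simplified stand-in for illustration only.

```python
import math

def normalize(points):
    """Translate a point cloud to its centroid and scale it to unit size."""
    cx = sum(x for x, _ in points) / len(points)
    cy = sum(y for _, y in points) / len(points)
    shifted = [(x - cx, y - cy) for x, y in points]
    scale = max(math.hypot(x, y) for x, y in shifted) or 1.0
    return [(x / scale, y / scale) for x, y in shifted]

def cloud_distance(a, b):
    """Symmetric average nearest-neighbour distance between two clouds."""
    def one_way(p, q):
        return sum(min(math.hypot(x - u, y - v) for u, v in q)
                   for x, y in p) / len(p)
    return (one_way(a, b) + one_way(b, a)) / 2

def recognize(points, templates):
    """Return the name of the template closest to the drawn gesture."""
    cloud = normalize(points)
    return min(templates,
               key=lambda n: cloud_distance(cloud, normalize(templates[n])))
```

Because both clouds are normalized, the same signature drawn larger or elsewhere on the panel still matches its template.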

A Study on Implementing Kinect-Based Control for LCD Display Contents (LCD Display 설비 Contents의 Kinect기반 동작제어 기술 구현에 관한 연구)

  • Rho, Jungkyu
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.63 no.4
    • /
    • pp.565-569
    • /
    • 2014
  • Recently, various kinds of new computer-controlled devices have been introduced in a wide range of areas, and convenient user interfaces for controlling them are strongly needed. To implement natural user interfaces (NUIs) on top of such devices, technologies like touch screens, the Wii Remote, wearable interfaces, and Microsoft Kinect have been presented. This paper presents a natural and intuitive gesture-based model for controlling the contents of an LCD display. The Microsoft Kinect sensor and its SDK are used to recognize human gestures, and the gestures are interpreted into corresponding commands to be executed. A command dispatch model is also proposed in order to handle the commands more naturally. I expect the proposed interface can be used in various fields, including display content control.
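
The command dispatch model is not detailed in the abstract, but a common shape for it can be sketched: the recognizer posts commands into a queue, and a dispatcher drains the queue and invokes whatever handler is registered for each command. The command names below are assumptions, not the paper's vocabulary.

```python
from collections import deque

# Illustrative sketch of a gesture-command dispatcher: recognition and
# execution are decoupled through a queue, so commands arrive in order
# and unknown commands are dropped harmlessly.

class CommandDispatcher:
    def __init__(self):
        self.handlers = {}
        self.queue = deque()

    def register(self, command, handler):
        self.handlers[command] = handler

    def post(self, command):
        self.queue.append(command)      # called by the gesture recognizer

    def dispatch_all(self):
        results = []
        while self.queue:
            command = self.queue.popleft()
            handler = self.handlers.get(command)
            if handler is not None:     # silently drop unmapped commands
                results.append(handler())
        return results
```

Decoupling recognition from execution this way also lets the display content define its own handlers without changes to the recognizer.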

Development of Direct Teaching Control using ITO Touch Panel (ITO 터치 패널 이용한 교시 제어 연구)

  • Yoon, Jae Seok;Nam, Sang Yep;Kim, Ki Eun;Kim, Dong-Han
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.52 no.3
    • /
    • pp.206-212
    • /
    • 2015
  • This paper proposes a physical human-robot interaction method that controls robot arms using an ITO touch panel sensor as the robot's skin. To implement physical human-robot interaction, methods using force/torque sensors and tactile sensors built by arranging small sensing elements have been studied; however, these sensors each have pros and cons in terms of price and performance. This study aims to combine the economy of element-type sensors with the accuracy of force/torque sensors by using a touch panel as the robot's skin, constructing the overall system, and demonstrating it experimentally. The experiments covered controlling the robot arm through an installed end-effector and through gestures drawn with respect to a reference point on the touch panel. Through these experiments, the feasibility of teaching control using the touch panel was confirmed.

Image Processing Algorithms for DI-method Multi Touch Screen Controllers (DI 방식의 대형 멀티터치스크린을 위한 영상처리 알고리즘 설계)

  • Kang, Min-Gu;Jeong, Yong-Jin
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.48 no.3
    • /
    • pp.1-12
    • /
    • 2011
  • Large multi-touch screens are usually built with infrared light, because the other approaches, such as existing resistive overlays, capacitive overlays, or acoustic waves, face technical constraints or cost problems at large sizes. Infrared multi-touch screens are easy to build but still have technical limits. To make up for these problems, two methods were suggested through Surface, a next-generation user-interface concept from Microsoft, both based on infrared cameras: Frustrated Total Internal Reflection (FTIR) and Diffuse Illumination (DI). Both are easy to implement on large screens and are not influenced by the number of touch points. Although FTIR has an advantage in detecting touch points, it also has many disadvantages, such as screen-size limits, material-quality requirements, the infrared LED array module, and high power consumption. DI, on the other hand, has difficulty detecting touch points because of its structural problems, but makes it possible to solve FTIR's problems. In this thesis, we study algorithms for effectively correcting the distortion of the optical lens and image-processing algorithms that solve the touch-detection problem of the original DI method. Moreover, we suggest calibration algorithms for improving multi-touch accuracy and a new tracking technique for accurately following the movement and gestures of the touching device. To verify our approach, we implemented a table-based multi-touch screen.
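
The tracking step mentioned at the end can be sketched in its simplest form: touch points detected in the current camera frame are associated with existing tracks by nearest-neighbour distance, so a moving finger keeps one stable id across frames. The greedy strategy and the gating threshold below are illustrative simplifications, not the thesis's technique.

```python
import math

def associate(tracks, detections, max_dist=30.0):
    """tracks: {id: (x, y)} from the previous frame; detections: [(x, y)].
    Returns the updated {id: (x, y)}; unmatched detections get new ids."""
    updated, used = {}, set()
    next_id = max(tracks, default=-1) + 1
    for det in detections:
        # pick the closest unclaimed track within the gating threshold
        best, best_d = None, max_dist
        for tid, pos in tracks.items():
            d = math.hypot(det[0] - pos[0], det[1] - pos[1])
            if tid not in used and d < best_d:
                best, best_d = tid, d
        if best is None:          # too far from every track: a new touch
            best = next_id
            next_id += 1
        used.add(best)
        updated[best] = det
    return updated
```

Tracks absent from the returned dictionary correspond to lifted fingers; a production tracker would usually keep them alive for a few frames before deleting them.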

Depth Camera Based Hand Gesture Spatial Touch System Implementation (깊이 카메라 기반 손동작 공간터치 시스템 구현)

  • Ahn, Yang-Keun;Jung, Kwang-Mo
    • Annual Conference of KIPS
    • /
    • 2015.10a
    • /
    • pp.1679-1680
    • /
    • 2015
  • This paper proposes a method for recognizing the tip of the index finger with a depth camera and recognizing spatial-touch hand gestures. The proposed method morphologically estimates and corrects the thumb tip, and uses the estimated index and thumb tip positions to implement mouse movement and clicking, which were applied to a text input system. To test the proposed method, we assembled a hardware setup of an actual display with a depth camera and built mouse-based content for the evaluation.
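
The fingertip-detection idea can be sketched very roughly: threshold the depth image to the interaction volume in front of the display, then take the nearest in-range pixel as the fingertip candidate, since an extended finger is the closest part of the hand. The paper's morphological estimation and correction steps are omitted, and all values are illustrative.

```python
def find_fingertip(depth, near=400, far=800):
    """depth: 2D list of millimetre values from a depth camera.
    Returns the (row, col) of the nearest in-range pixel, or None."""
    tip, tip_depth = None, far
    for r, row in enumerate(depth):
        for c, d in enumerate(row):
            if near <= d < tip_depth:  # in range and closer than best so far
                tip, tip_depth = (r, c), d
    return tip
```

A real pipeline would smooth the depth map and segment the hand region first, so sensor noise near the threshold does not jump the detected tip around.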

A 3D Parametric CAD System for Smart Devices (스마트 디바이스를 위한 3D 파라메트릭 CAD 시스템)

  • Kang, Yuna;Han, Soonhung
    • Korean Journal of Computational Design and Engineering
    • /
    • v.19 no.2
    • /
    • pp.191-201
    • /
    • 2014
  • A 3D CAD system that can be used on a smart device is proposed. Smart devices are now a part of everyday life and are widely used across industry domains. The proposed 3D CAD system allows rapid, intuitive modeling on a smart device when an engineer builds a 3D model of a product while moving about an engineering site. Several obstacles stand in the way of a 3D CAD system on a smart device: low computing power, small screens, imprecise touch input, and the transfer of created 3D models between PCs and smart devices. The authors discuss the system design, covering the selection of modeling operations, the assignment of touch gestures to these operations, and the construction of a geometric kernel that creates both meshes and a procedural CAD model. The proposed CAD system has been implemented and validated through user tests and case studies on test examples. Using the proposed system, an engineer can produce an editable 3D model swiftly and simply in a smart-device environment, reducing design time.
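
The assignment of touch gestures to modeling operations can be illustrated with one common pairing: a two-finger pinch driving uniform scaling, with the factor taken as the ratio of the current to the initial finger spread. This particular mapping is an assumption for illustration, not the paper's actual gesture table.

```python
import math

def pinch_scale(start_touches, current_touches):
    """Each argument is [(x0, y0), (x1, y1)], the two finger positions.
    Returns the uniform scale factor implied by the pinch gesture."""
    (ax, ay), (bx, by) = start_touches
    (cx, cy), (dx, dy) = current_touches
    d0 = math.hypot(bx - ax, by - ay)   # initial finger spread
    d1 = math.hypot(dx - cx, dy - cy)   # current finger spread
    return d1 / d0 if d0 else 1.0       # guard against coincident touches
```

Applying the factor to the selected feature's dimensions each frame gives the continuous, direct-manipulation feel that touch modeling interfaces aim for.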