• Title/Summary/Keyword: Motion based interface


Motion-based Controlling 4D Special Effect Devices to Activate Immersive Contents (실감형 콘텐츠 작동을 위한 모션 기반 4D 특수효과 장치 제어)

  • Kim, Kwang Jin;Lee, Chil Woo
    • Smart Media Journal
    • /
    • v.8 no.1
    • /
    • pp.51-58
    • /
    • 2019
  • This paper describes a gesture application for controlling the special-effect physical devices of 4D content using PWM (Pulse Width Modulation). User motions recognized by an infrared sensor are interpreted as commands for 3D content control, some of which drive the devices that generate special effects, delivering physical stimuli to the user. Because the content is controlled through an NUI (Natural User Interface) technique, the user is drawn directly into an immersive experience, which raises interest and attention. To measure the efficiency of the proposed method, we implemented a PWM-based real-time linear control system that manages the parameters of the motion-recognition and animation controllers via the infrared sensor and transmits the corresponding events.
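The core of PWM-based effect control is mapping a recognized gesture's intensity to a pulse width. A minimal sketch (hypothetical parameter values; the paper's actual mapping is not given in the abstract):

```python
def gesture_to_pwm(intensity, period_us=20_000, min_duty=0.1, max_duty=0.9):
    """Map a normalized gesture intensity (0..1) to a PWM pulse width.

    Clamps the intensity and linearly interpolates the duty cycle between
    min_duty and max_duty, returning the high-time in microseconds.
    """
    intensity = max(0.0, min(1.0, intensity))
    duty = min_duty + (max_duty - min_duty) * intensity
    return duty * period_us

# e.g. a half-strength swipe drives a fan effect at a mid-range duty cycle
print(gesture_to_pwm(0.5))  # 10000.0 us high-time (50% duty)
```

A real device would feed this value to a hardware PWM peripheral; the linear interpolation is only one plausible transfer curve.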

Multimodal Interface Based on Novel HMI UI/UX for In-Vehicle Infotainment System

  • Kim, Jinwoo;Ryu, Jae Hong;Han, Tae Man
    • ETRI Journal
    • /
    • v.37 no.4
    • /
    • pp.793-803
    • /
    • 2015
  • We propose a novel HMI UI/UX for an in-vehicle infotainment system. Our proposed HMI UI comprises multimodal interfaces that allow a driver to safely and intuitively manipulate an infotainment system while driving. Our analysis of a touchscreen interface-based HMI UI/UX reveals that a driver's use of such an interface while driving can cause the driver to be seriously distracted. Our proposed HMI UI/UX is a novel manipulation mechanism for a vehicle infotainment service. It consists of several interfaces that incorporate a variety of modalities, such as speech recognition, a manipulating device, and hand gesture recognition. In addition, we provide an HMI UI framework designed to be manipulated using a simple method based on four directions and one selection motion. Extensive quantitative and qualitative in-vehicle experiments demonstrate that the proposed HMI UI/UX is an efficient mechanism through which to manipulate an infotainment system while driving.
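The four-directions-plus-one-selection scheme described above can be sketched as a small menu state machine (hypothetical menu layout and wrap-around behavior; the paper's actual UI framework is not specified in the abstract):

```python
class DirectionalMenu:
    """Minimal sketch of a 4-direction + 1-select HMI menu (hypothetical layout)."""

    def __init__(self, items):
        self.items = items   # flat list of menu entries, 3 per visual row
        self.index = 0

    def move(self, direction):
        # LEFT/RIGHT step through items; UP/DOWN jump between rows of 3
        step = {"LEFT": -1, "RIGHT": 1, "UP": -3, "DOWN": 3}[direction]
        self.index = (self.index + step) % len(self.items)

    def select(self):
        return self.items[self.index]

menu = DirectionalMenu(["Radio", "Media", "Nav", "Phone", "Climate", "Settings"])
menu.move("RIGHT")
menu.move("DOWN")
print(menu.select())  # "Climate" (index 0 -> 1 -> 4)
```

Restricting input to five events is what lets the same UI be driven by speech, a jog dial, or hand gestures interchangeably.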

Human-Computer Natural User Interface Based on Hand Motion Detection and Tracking

  • Xu, Wenkai;Lee, Eung-Joo
    • Journal of Korea Multimedia Society
    • /
    • v.15 no.4
    • /
    • pp.501-507
    • /
    • 2012
  • Human body motion is a non-verbal channel for interaction that can bridge the real world and the virtual world. In this paper, we present a study on a natural user interface (NUI) for human hand motion recognition using the RGB color and depth information provided by Microsoft's Kinect camera. So that hand tracking and gesture recognition have no major dependency on the work environment, lighting, or the user's skin color, we used libraries designed for natural interaction together with the Kinect device, which supplies RGB images of the environment and a depth map of the scene. An improved Camshift algorithm is used to track hand motion; the experimental results show that it outperforms the standard Camshift algorithm, with higher stability and accuracy.
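Camshift-style trackers repeatedly shift a search window toward the centroid of the skin/depth probability mass inside it. A bare mean-shift iteration over a toy 2D weight map (not the paper's improved algorithm) might look like:

```python
def mean_shift_window(weights, x, y, w, h, iters=10):
    """Move a w x h window until it is centered on the centroid of
    `weights` (a 2D list of per-pixel probabilities) inside it."""
    rows, cols = len(weights), len(weights[0])
    for _ in range(iters):
        m = mx = my = 0.0
        for j in range(y, min(y + h, rows)):
            for i in range(x, min(x + w, cols)):
                p = weights[j][i]
                m += p
                mx += p * i
                my += p * j
        if m == 0:                      # no mass in window: give up
            break
        nx = int(mx / m - w / 2 + 0.5)  # re-center window on centroid
        ny = int(my / m - h / 2 + 0.5)
        if (nx, ny) == (x, y):          # converged
            break
        x, y = max(0, nx), max(0, ny)
    return x, y

# Toy probability map with a 3x3 "hand" blob at rows/cols 4..6
W = [[0.0] * 20 for _ in range(20)]
for j in range(4, 7):
    for i in range(4, 7):
        W[j][i] = 1.0
print(mean_shift_window(W, 0, 0, 8, 8))  # (1, 1): window centered on the blob
```

Camshift additionally adapts the window size and orientation each frame; OpenCV ships both variants as `cv2.meanShift` and `cv2.CamShift`.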

A new study on hand gesture recognition algorithm using leap motion system (Leap Motion 시스템을 이용한 손동작 인식기반 제어 인터페이스 기술 연구)

  • Nam, Jae-Hyun;Yang, Seung-Hun;Hu, Woong;Kim, Byung-Gyu
    • Journal of Korea Multimedia Society
    • /
    • v.17 no.11
    • /
    • pp.1263-1269
    • /
    • 2014
  • With the rapid development of new hardware control interface technology, new interaction concepts continue to be proposed. In this paper, a new approach based on the Leap Motion system is proposed. Using position information from the sensor, hand gestures are recognized against pre-defined patterns. To do this, we design a recognition algorithm based on hand-gesture and finger patterns. As a representative application, we apply the proposed scheme to a software tool for controlling and editing a 3-dimensional avatar to create animation in cyberspace. The proposed algorithm can also be used to control computer systems in medical treatment, games, education, and various other areas.
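Pattern-based finger-gesture classification of this kind reduces, at its simplest, to matching which fingers are extended against a lookup table. A sketch with hypothetical patterns standing in for the paper's pre-defined gesture set (the Leap Motion API itself reports per-finger extension flags, faked here as booleans):

```python
# Extended-finger patterns (thumb..pinky) mapped to commands -- hypothetical
# patterns, not the gesture set actually defined in the paper.
GESTURES = {
    (False, True, False, False, False):  "select",     # index only
    (False, True, True, False, False):   "rotate",     # index + middle
    (True, True, True, True, True):      "open_menu",  # open palm
    (False, False, False, False, False): "grab",       # fist
}

def classify(extended_fingers):
    """Return the command for a 5-flag finger pattern, or 'unknown'."""
    return GESTURES.get(tuple(extended_fingers), "unknown")

print(classify([False, True, True, False, False]))  # "rotate"
```

A production recognizer would also use fingertip trajectories over time, but a static pattern table is a common first layer.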

Vision-based Human-Robot Motion Transfer in Tangible Meeting Space (실감만남 공간에서의 비전 센서 기반의 사람-로봇간 운동 정보 전달에 관한 연구)

  • Choi, Yu-Kyung;Ra, Syun-Kwon;Kim, Soo-Whan;Kim, Chang-Hwan;Park, Sung-Kee
    • The Journal of Korea Robotics Society
    • /
    • v.2 no.2
    • /
    • pp.143-151
    • /
    • 2007
  • This paper deals with a tangible interface system that introduces a robot as a remote avatar, focusing on a new method that makes the robot imitate human arm motions captured in a remote space. Our method is functionally divided into two parts: capturing human motion and adapting it to the robot. In the capturing part, we propose a modified metaball potential function for real-time performance and high accuracy. In the adapting part, we suggest a geometric scaling method that resolves the structural differences between a human and a robot. Using this method, we implemented a tangible interface and evaluated its speed and accuracy.
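The geometric-scaling step, adapting captured human arm joint positions to a robot with different link lengths, can be sketched as per-segment rescaling that preserves each segment's direction (a simplified reading; the paper's exact formulation is not reproduced here):

```python
def scale_arm(joints, human_links, robot_links):
    """Rescale a chain of 3D joint positions so each segment's length matches
    the robot's link length while keeping the human's segment directions.
    `joints` is [shoulder, elbow, wrist, ...] as (x, y, z) tuples."""
    out = [joints[0]]
    for k in range(1, len(joints)):
        dx, dy, dz = (joints[k][i] - joints[k - 1][i] for i in range(3))
        s = robot_links[k - 1] / human_links[k - 1]   # per-link scale factor
        px, py, pz = out[-1]
        out.append((px + s * dx, py + s * dy, pz + s * dz))
    return out

# Human upper arm 0.30 m / forearm 0.25 m mapped onto 0.20 m / 0.15 m links
arm = scale_arm([(0, 0, 0), (0.3, 0, 0), (0.3, 0.25, 0)],
                [0.3, 0.25], [0.2, 0.15])
```

Direction-preserving scaling keeps the pose recognizable even though the robot's reach differs from the human's.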


Development of Chip-based Precision Motion Controller

  • Cho, Jung-Uk;Jeon, Jae-Wook
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2003.10a
    • /
    • pp.1022-1027
    • /
    • 2003
  • Motion controllers provide the sophisticated performance and enhanced capabilities seen in the movements of robotic systems. Several types of motion controllers are available, depending on the overall control system in use. PLC (Programmable Logic Controller)-based motion controllers still predominate, and many users rely on MCU (Micro Controller Unit)-based board-level motion controllers and will continue to do so in the near term. These motion controllers drive a variety of motor systems, such as robotic systems, and generally consist of large and complex circuits. A PLC-based motion controller consists of a high-performance PLC, a development tool, and application-specific software; this can lead to several problems, including large size and space requirements, extensive cabling, and high additional cost. An MCU-based motion controller consists of memories such as ROM and RAM, I/O interface ports, and a decoder to operate the MCU. It additionally needs DPRAM to communicate with the host PC, a counter to obtain the motor position from the encoder signal, extra circuits for servo control, and application-specific software to generate velocity profiles. This causes several problems: overall system complexity, large size and space, extensive cabling, large power consumption, and high additional cost. Moreover, because velocity profiles are generated in software, they take much time to compute and cannot take arbitrary shapes, so it is hard to generate the various expected profiles. Furthermore, embedding a real-time OS (Operating System) must be considered for more reliable motion control. In this paper, the structure of a chip-based precision motion controller is proposed to solve the above-mentioned problems.
This proposed motion controller is designed on an FPGA (Field Programmable Gate Array) using VHDL (Very High Speed Integrated Circuit Hardware Description Language) and Handel-C, a programming language for hardware design. It consists of a Velocity Profile Generator (VPG) to generate the various expected velocity profiles, a PCI Interface to communicate with the host PC, a Feedback Counter to obtain position information from the encoder signal, a Clock Generator to produce the required clock signals, a Controller to regulate the motor position using the generated velocity profile and position information, and a Data Converter to convert and transmit data compatible with the D/A converter.
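A Velocity Profile Generator of the kind listed above classically produces trapezoidal profiles: accelerate, cruise at peak velocity, decelerate. A software sketch of that shape (the chip implements this in FPGA logic, not Python; parameter values are illustrative):

```python
def trapezoid_velocity(t, v_max, accel, distance):
    """Velocity at time t for a trapezoidal profile: ramp up at `accel`
    to `v_max`, cruise, then ramp down, covering `distance` in total."""
    t_acc = v_max / accel
    d_acc = 0.5 * accel * t_acc ** 2
    if 2 * d_acc > distance:            # triangular case: v_max never reached
        t_acc = (distance / accel) ** 0.5
        v_max = accel * t_acc
        d_acc = distance / 2
    t_cruise = (distance - 2 * d_acc) / v_max
    total = 2 * t_acc + t_cruise
    if t < 0 or t > total:
        return 0.0
    if t < t_acc:                       # acceleration phase
        return accel * t
    if t < t_acc + t_cruise:            # cruise phase
        return v_max
    return accel * (total - t)          # deceleration phase

# 100 mm move at 20 mm/s peak with 10 mm/s^2: 2 s ramps around a 3 s cruise
print(trapezoid_velocity(3.0, v_max=20, accel=10, distance=100))  # 20.0
```

In hardware, the same three phases become counters and comparators clocked at a fixed step rate, which is what makes the FPGA version fast and deterministic.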


Intelligent User Interface for Teleoperated Microassembly

  • Song, Eun-Ha;Kim, Deok-Ho;Kim, Kyunghwan;Lee, Jaehoon
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2001.10a
    • /
    • pp.168.2-168
    • /
    • 2001
  • Operators generally have great difficulty manipulating micro- and nano-sized objects without the assistance of a human interface, owing to scaling effects in the micro/nano world. A micromanipulation system based on teleoperation techniques therefore enables operators to manipulate micro objects easily by transferring both human motion and manipulation skills to the micromanipulator. In teleoperated microassembly, a useful user interface is needed to facilitate the microassembly tasks executed by human operators. In this paper, an Intelligent User Interface (UI) for teleoperated microassembly is proposed. The proposed intelligent UI improves the task performance of teleoperated micromanipulation and guides operators to succeed in desirable ...


A Review of Motion Capture Systems: Focusing on Clinical Applications and Kinematic Variables (모션 캡처 시스템에 대한 고찰: 임상적 활용 및 운동형상학적 변인 측정 중심으로)

  • Lim, Wootaek
    • Physical Therapy Korea
    • /
    • v.29 no.2
    • /
    • pp.87-93
    • /
    • 2022
  • To solve pathological problems of the musculoskeletal system based on evidence, a sophisticated analysis of human motion is required. Traditional optical motion capture systems with high validity and reliability have been used in clinical practice for a long time. However, because constructing an optical motion capture system requires expensive equipment and professional technicians, such systems see only limited use in clinical settings despite their advantages. The development of information technology has overcome this limit and paved the way for motion capture systems that can be operated at low cost. Recently, with the development of computer-vision-based technology and optical markerless tracking, webcam-based 3D human motion analysis has become possible, and its intuitive interface makes it friendly to non-specialists. In addition, unlike conventional optical motion capture, this approach can analyze the motions of multiple people simultaneously. In non-optical motion capture systems, an inertial measurement unit is typically used, which does not differ significantly from a conventional optical motion capture system in validity and reliability. With the development of markerless technology and the advent of non-optical motion capture systems, it is a great advantage that human motion analysis is no longer limited to laboratories.
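Whatever the capture technology, the kinematic variables these systems report ultimately come from tracked point positions; a basic example is a joint angle computed from three markers (2D for brevity, and not specific to any system in the review):

```python
import math

def joint_angle(a, b, c):
    """Angle at point b (e.g. the knee) formed by markers a-b-c, in degrees."""
    v1 = (a[0] - b[0], a[1] - b[1])   # vector b -> a (e.g. toward the hip)
    v2 = (c[0] - b[0], c[1] - b[1])   # vector b -> c (e.g. toward the ankle)
    dot = v1[0] * v2[0] + v1[1] * v2[1]
    n1, n2 = math.hypot(*v1), math.hypot(*v2)
    return math.degrees(math.acos(dot / (n1 * n2)))

# Hip, knee, and ankle markers in a straight line -> full knee extension
print(round(joint_angle((0, 2), (0, 1), (0, 0)), 1))  # 180.0
```

Clinical pipelines extend this to 3D with segment coordinate frames, but the arccosine-of-normalized-dot-product step is the same.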

Real-time Multi-device Control System Implementation for Natural User Interactive Platform

  • Kim, Myoung-Jin;Hwang, Tae-min;Chae, Sung-Hun;Kim, Min-Joon;Moon, Yeon-Kug;Kim, SeungJun
    • Journal of Internet Computing and Services
    • /
    • v.23 no.1
    • /
    • pp.19-29
    • /
    • 2022
  • A natural user interface (NUI) enables natural, motion-based interaction without specific devices or tools such as a mouse, keyboard, or pen. Recently, as non-contact sensor-based interaction technologies for recognizing human motion, gestures, voice, and gaze have been actively studied, an environment has emerged that can provide more diverse content through a greater variety of interaction methods than existing approaches. However, as the number of sensor devices rapidly increases, a system using many sensors can suffer from a lack of computational resources. To address this problem, we propose a real-time multi-device control system for a natural interactive platform. In the proposed system, we classify devices into two types: HC devices, such as high-end commercial sensors, and LC devices, such as traditional low-cost monitoring sensors, and we adopt a dedicated manager for each type for efficient control. We demonstrate that the proposed system works properly with user behaviors such as gestures, motions, gazes, and voices.
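The split into separate managers for high-end (HC) and low-cost (LC) devices can be sketched as a simple registration dispatcher (hypothetical class and policy; the abstract does not detail the managers' internals):

```python
class DeviceManager:
    """Hypothetical per-class manager: HC devices get a short polling period,
    LC devices a longer one, so heavyweight sensors don't starve the system."""

    def __init__(self, poll_ms):
        self.poll_ms = poll_ms
        self.devices = []

    def register(self, name):
        self.devices.append(name)

managers = {"HC": DeviceManager(poll_ms=10), "LC": DeviceManager(poll_ms=100)}

def register_device(name, high_end):
    """Route a device to the HC or LC manager and return its class."""
    cls = "HC" if high_end else "LC"
    managers[cls].register(name)
    return cls

register_device("depth_camera", high_end=True)
register_device("push_button", high_end=False)
```

The point of the two-tier design is that resource budgets are set per class rather than per device, keeping scheduling decisions cheap as the device count grows.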

Interactive sound experience interface based on virtual concert hall (가상 콘서트홀 기반의 인터랙티브 음향 체험 인터페이스)

  • Cho, Hye-Seung;Kim, Hyoung-Gook
    • The Journal of the Acoustical Society of Korea
    • /
    • v.36 no.2
    • /
    • pp.130-135
    • /
    • 2017
  • In this paper, we propose an interface for an interactive sound experience in a virtual concert hall. The proposed interface consists of two systems, 'virtual acoustic position' and 'virtual active listening'. To realize these systems, we applied an artificial reverberation algorithm, multi-channel source separation, and head-related transfer functions. The interface was implemented in Unity and presents the virtual concert hall to the user through the Oculus Rift, a virtual reality headset. Leap Motion is used as the control device, allowing the user to operate the system with free hands, and the sound is delivered through headphones.
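At its simplest, placing a listener relative to a virtual stage reduces to position-dependent per-channel gains; a constant-power stereo panning sketch (a crude stand-in for the paper's HRTF-based rendering, with the azimuth convention chosen here for illustration):

```python
import math

def pan_gains(azimuth_deg):
    """Constant-power stereo gains for a source at azimuth in [-90, 90] deg
    (negative = left). A simplification of full HRTF spatialization."""
    theta = math.radians((azimuth_deg + 90) / 2)   # map azimuth to [0, 90] deg
    return math.cos(theta), math.sin(theta)        # (left gain, right gain)

left, right = pan_gains(0.0)   # source dead ahead: equal gains,
                               # and left^2 + right^2 == 1 (constant power)
```

HRTF rendering replaces these scalar gains with direction-dependent filters per ear, adding the timing and spectral cues that panning alone cannot provide.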