• Title/Abstract/Keywords: Motion based interface

Search results: 309 items (processing time: 0.049 s)

실감형 콘텐츠 작동을 위한 모션 기반 4D 특수효과 장치 제어 (Motion-based Controlling 4D Special Effect Devices to Activate Immersive Contents)

  • 김광진;이칠우
    • 스마트미디어저널 / Vol. 8, No. 1 / pp.51-58 / 2019
  • This paper describes a gesture-based method for controlling the physical devices that produce special effects in 4D content, using pulse width modulation (PWM). User motions recognized through an infrared sensor are interpreted as commands for controlling 3D content, and the special-effect devices are driven to deliver physical stimuli to the user. Controlling content through such a natural user interface (NUI) heightens the user's sense of direct immersion in the content and can therefore provide a high level of interest and engagement. To measure the efficiency of the proposed method, we implemented a PWM-based real-time linear control system that performs motion recognition with the infrared sensor and delivers events by manipulating the parameters of an animation controller.
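The paper's PWM control scheme is not detailed in the abstract, but the core idea of mapping a recognized gesture magnitude to a PWM duty cycle can be sketched as follows (the function names and the 20 ms period are illustrative assumptions, not from the paper):

```python
def gesture_to_duty(gesture_value, v_min=0.0, v_max=1.0):
    """Linearly map a normalized gesture magnitude to a PWM duty cycle (0-100%)."""
    clamped = max(v_min, min(v_max, gesture_value))
    return 100.0 * (clamped - v_min) / (v_max - v_min)

def pwm_on_time(duty_percent, period_ms=20.0):
    """On-time of one PWM period for a given duty cycle."""
    return period_ms * duty_percent / 100.0
```

A half-strength gesture would thus hold the effect device on for half of each PWM period, giving the linear control the abstract describes.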

Multimodal Interface Based on Novel HMI UI/UX for In-Vehicle Infotainment System

  • Kim, Jinwoo;Ryu, Jae Hong;Han, Tae Man
    • ETRI Journal / Vol. 37, No. 4 / pp.793-803 / 2015
  • We propose a novel HMI UI/UX for an in-vehicle infotainment system. Our proposed HMI UI comprises multimodal interfaces that allow a driver to safely and intuitively manipulate an infotainment system while driving. Our analysis of a touchscreen interface-based HMI UI/UX reveals that a driver's use of such an interface while driving can cause the driver to be seriously distracted. Our proposed HMI UI/UX is a novel manipulation mechanism for a vehicle infotainment service. It consists of several interfaces that incorporate a variety of modalities, such as speech recognition, a manipulating device, and hand gesture recognition. In addition, we provide an HMI UI framework designed to be manipulated using a simple method based on four directions and one selection motion. Extensive quantitative and qualitative in-vehicle experiments demonstrate that the proposed HMI UI/UX is an efficient mechanism through which to manipulate an infotainment system while driving.
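As a rough illustration of the four-directions-plus-one-selection manipulation model described above, a dispatch table might look like this (the action names are hypothetical, not taken from the paper):

```python
# Hypothetical mapping of the four directional motions plus one
# selection motion onto menu-navigation actions of the HMI UI framework.
ACTIONS = {"up": "focus_prev", "down": "focus_next",
           "left": "page_back", "right": "page_forward",
           "select": "activate"}

def handle_motion(motion):
    """Return the UI action for a recognized motion, or None if unrecognized."""
    return ACTIONS.get(motion)
```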

Human-Computer Natural User Interface Based on Hand Motion Detection and Tracking

  • Xu, Wenkai;Lee, Eung-Joo
    • 한국멀티미디어학회논문지 / Vol. 15, No. 4 / pp.501-507 / 2012
  • Human body motion is a form of non-verbal interaction that can link the real world with the virtual world. In this paper, we present a study on a natural user interface (NUI) for human hand motion recognition using the RGB color and depth information provided by Microsoft's Kinect camera. In this setup, hand tracking and gesture recognition have no major dependency on the work environment, lighting, or the user's skin color; libraries designed for natural interaction with the Kinect device supply both RGB images of the environment and a depth map of the scene. An improved CamShift algorithm is used to track hand motion, and experimental results show that it outperforms the original CamShift algorithm in both stability and accuracy.
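The paper's improved CamShift variant is not specified in the abstract; the sketch below shows only the core mean-shift step that CamShift iterates, in plain Python over a 2D back-projection of hand-color probability (a simplified stand-in, not the authors' implementation):

```python
def mean_shift_step(prob, cx, cy, half_w, half_h):
    """One mean-shift iteration: move the window center to the centroid
    of the probability mass inside the current window.
    prob: 2D list of pixel weights (e.g. a skin-color back-projection)."""
    m00 = m10 = m01 = 0.0
    for y in range(max(0, cy - half_h), min(len(prob), cy + half_h + 1)):
        for x in range(max(0, cx - half_w), min(len(prob[0]), cx + half_w + 1)):
            w = prob[y][x]
            m00 += w          # zeroth moment: total mass
            m10 += w * x      # first moment in x
            m01 += w * y      # first moment in y
    if m00 == 0:
        return cx, cy         # no mass in window: stay put
    return round(m10 / m00), round(m01 / m00)
```

CamShift repeats this step until the center converges, then additionally adapts the window size and orientation from the second-order moments.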

Leap Motion 시스템을 이용한 손동작 인식기반 제어 인터페이스 기술 연구 (A new study on hand gesture recognition algorithm using leap motion system)

  • 남재현;양승훈;허웅;김병규
    • 한국멀티미디어학회논문지 / Vol. 17, No. 11 / pp.1263-1269 / 2014
  • With the rapid development of hardware control interface technology, new interaction concepts continue to be proposed. In this paper, a new approach based on the Leap Motion system is presented. Using position information from the sensor, hand gestures are recognized against a set of pre-defined patterns. To this end, we design a recognition algorithm over hand gesture and finger patterns. As a representative application, we apply the proposed scheme to a software tool for controlling and editing a 3D avatar to create animation in cyberspace. The proposed algorithm can also be used to control computer systems in medical treatment, games, education, and various other areas.
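A pattern-based recognizer of the kind described, matching sensor-derived features against pre-defined gesture patterns, might be sketched as follows (the feature pair and the gesture names are illustrative assumptions, not the paper's actual pattern set):

```python
# Hypothetical pattern table: (extended-finger count, palm direction) -> gesture.
PATTERNS = {(5, "up"): "open_hand",
            (0, "up"): "fist",
            (1, "up"): "point",
            (2, "up"): "victory"}

def recognize(finger_count, palm_dir):
    """Look up a pre-defined gesture pattern; fall back to 'unknown'."""
    return PATTERNS.get((finger_count, palm_dir), "unknown")
```

In an avatar-editing tool, each recognized gesture would then be bound to an editing command, mirroring the dispatch the abstract describes.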

실감만남 공간에서의 비전 센서 기반의 사람-로봇간 운동 정보 전달에 관한 연구 (Vision-based Human-Robot Motion Transfer in Tangible Meeting Space)

  • 최유경;나성권;김수환;김창환;박성기
    • 로봇학회논문지 / Vol. 2, No. 2 / pp.143-151 / 2007
  • This paper deals with a tangible interface system that introduces a robot as a remote avatar. It focuses on a new method that makes a robot imitate human arm motions captured in a remote space. Our method is functionally divided into two parts: capturing human motion and adapting it to the robot. For the capturing part, we propose a modified potential function of metaballs for real-time performance and high accuracy. For the adapting part, we suggest a geometric scaling method that resolves the structural difference between a human and a robot. Using this method, we implemented a tangible interface and evaluated its speed and accuracy.
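The geometric scaling idea for the adapting part can be sketched as follows, assuming joint positions are expressed relative to the shoulder and scaled by the ratio of robot to human arm length (a simplification for illustration, not the authors' exact formulation):

```python
def scale_to_robot(human_joint, human_arm_len, robot_arm_len, shoulder):
    """Geometrically scale a captured human joint position onto a robot arm.

    human_joint, shoulder: 3D points (tuples); the joint is rescaled about
    the shoulder by the ratio of the two arm lengths, preserving direction."""
    ratio = robot_arm_len / human_arm_len
    return tuple(s + (p - s) * ratio for p, s in zip(human_joint, shoulder))
```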


Development of Chip-based Precision Motion Controller

  • Cho, Jung-Uk;Jeon, Jae-Wook
    • 제어로봇시스템학회 학술대회논문집 / ICCAS 2003 / pp.1022-1027 / 2003
  • Motion controllers provide the sophisticated performance and enhanced capabilities seen in the movements of robotic systems. Several types of motion controllers are available, differing in the kind of overall control system they use. PLC (programmable logic controller)-based motion controllers still predominate, and MCU (micro controller unit)-based board-level motion controllers remain widely used and will continue to be so in the near term. These controllers drive a variety of motor systems, such as robots, but generally consist of large and complex circuits. A PLC-based motion controller comprises a high-performance PLC, a development tool, and application-specific software, which leads to large size, extensive cabling, and high cost. An MCU-based motion controller requires memories such as ROM and RAM, I/O interface ports, and a decoder to operate the MCU, plus DPRAM to communicate with the host PC, a counter to obtain the motor position from the encoder signal, additional circuits for servo control, and application-specific software to generate various velocity profiles. This results in overall system complexity, large size, extensive cabling, high power consumption, and high cost. Moreover, because velocity profiles are generated in software, their computation takes considerable time and arbitrary profiles are difficult to produce. Embedding a real-time operating system (OS) is also desirable for more reliable motion control. In this paper, the structure of a chip-based precision motion controller is proposed to solve these problems.
The proposed motion controller is designed on an FPGA (field-programmable gate array) using VHDL (VHSIC hardware description language) and Handel-C, a programming language for hardware design. It consists of a Velocity Profile Generator (VPG) that produces various velocity profiles, a PCI Interface for communication with the host PC, a Feedback Counter that obtains position information from the encoder signal, a Clock Generator that produces the required clock signals, a Controller that regulates the motor position using the generated velocity profile and the position feedback, and a Data Converter that formats and transmits data to a D/A converter.
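Although the VPG is realized in FPGA hardware, the trapezoidal velocity profile such a generator typically produces can be sketched in software for illustration (the parameters and sampling scheme are assumptions, not from the paper):

```python
def trapezoidal_profile(v_max, accel, total_time, dt):
    """Sample a symmetric trapezoidal velocity profile: accelerate at
    `accel` up to `v_max`, cruise, then decelerate back to zero."""
    t_acc = v_max / accel          # duration of the acceleration ramp
    samples = []
    t = 0.0
    while t <= total_time:
        if t < t_acc:                       # ramp up
            v = accel * t
        elif t > total_time - t_acc:        # ramp down
            v = accel * (total_time - t)
        else:                               # cruise at v_max
            v = v_max
        samples.append(v)
        t += dt
    return samples
```

A hardware VPG would emit such samples (or the corresponding step pulses) every clock tick instead of computing them in a loop, which is what removes the software-timing bottleneck the paper criticizes.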


Intelligent User Interface for Teleoperated Microassembly

  • Song, Eun-Ha;Kim, Deok-Ho;Kim, Kyunghwan;Lee, Jaehoon
    • 제어로봇시스템학회 학술대회논문집 / ICCAS 2001 / pp.168.2-168 / 2001
  • Operators generally have great difficulty manipulating micro/nano-sized objects without the assistance of a human interface, owing to the scaling effects of the micro/nano world. A micromanipulation system based on teleoperation techniques therefore enables operators to manipulate micro objects easily by transferring both human motion and manipulation skill to the micromanipulator. In teleoperated microassembly, it is necessary to provide a useful user interface that facilitates the microassembly tasks executed by human operators. In this paper, an intelligent user interface (UI) for teleoperated microassembly is proposed. The proposed intelligent UI improves the performance of teleoperated micromanipulation tasks and guides operators toward succeeding in desirable ...


모션 캡처 시스템에 대한 고찰: 임상적 활용 및 운동형상학적 변인 측정 중심으로 (A Review of Motion Capture Systems: Focusing on Clinical Applications and Kinematic Variables)

  • 임우택
    • 한국전문물리치료학회지 / Vol. 29, No. 2 / pp.87-93 / 2022
  • To solve pathological problems of the musculoskeletal system based on evidence, a sophisticated analysis of human motion is required. Traditional optical motion capture systems, with their high validity and reliability, have long been utilized in clinical practice. However, because expensive equipment and professional technicians are required to build optical motion capture systems, they are used only to a limited extent in clinical settings despite their advantages. The development of information technology has overcome this limitation and paved the way for motion capture systems that can be operated at low cost. Recently, with the development of computer-vision-based technology and optical markerless tracking, webcam-based 3D human motion analysis has become possible, and its intuitive interface makes it accessible to non-specialists. In addition, unlike conventional optical motion capture, this approach can analyze the motions of multiple people simultaneously. Non-optical motion capture systems typically use an inertial measurement unit and do not differ significantly from conventional optical systems in validity and reliability. With the development of markerless technology and the advent of non-optical motion capture systems, human motion analysis is no longer confined to the laboratory, which is a great advantage.

Real-time Multi-device Control System Implementation for Natural User Interactive Platform

  • 김명진;황태민;채승훈;김민준;문연국;김승준
    • 인터넷정보학회논문지 / Vol. 23, No. 1 / pp.19-29 / 2022
  • A natural user interface (NUI) allows natural motion-based interaction without a specific device or tool such as a mouse, keyboard, or pen. Recently, as non-contact sensor-based interaction technologies for recognizing human motion, gestures, voice, and gaze have been actively studied, an environment has emerged that can provide far more diverse content, based on a variety of interaction methods, than existing approaches. However, as the number of sensor devices increases rapidly, a system using many sensors can suffer from a lack of computational resources. To address this problem, we propose a real-time multi-device control system for a natural interactive platform. In the proposed system, devices are classified into two types: HC devices, such as high-end commercial sensors, and LC devices, such as traditional low-cost monitoring sensors. A dedicated manager is adopted for each device class to control it efficiently. We demonstrate that the proposed system works properly with user behaviors such as gestures, motions, gazes, and voice.
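The HC/LC device split described above might be sketched as follows (the class names, polling intervals, and registration API are hypothetical, introduced only to illustrate the per-class manager idea):

```python
class DeviceManager:
    """Hypothetical per-class manager: HC (high-end commercial) sensors get a
    short polling interval, LC (low-cost monitoring) sensors a longer one,
    so scarce computational resources go where they matter."""
    INTERVALS_MS = {"HC": 10, "LC": 100}

    def __init__(self):
        self.devices = {"HC": [], "LC": []}

    def register(self, name, dev_class):
        if dev_class not in self.devices:
            raise ValueError("unknown device class: " + dev_class)
        self.devices[dev_class].append(name)

    def poll_interval(self, dev_class):
        return self.INTERVALS_MS[dev_class]
```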

가상 콘서트홀 기반의 인터랙티브 음향 체험 인터페이스 (Interactive sound experience interface based on virtual concert hall)

  • 조혜승;김형국
    • 한국음향학회지 / Vol. 36, No. 2 / pp.130-135 / 2017
  • This paper proposes an interface through which users can have an interactive sound experience in a virtual concert hall. The proposed interface consists of a seat-position-based sound experience system and an instrument control and sound experience system, both built on the virtual concert hall. To implement each system, an artificial reverberation algorithm, multichannel source separation, and head-related transfer functions (HRTFs) were applied. The interface was implemented in Unity; users can experience the virtual concert hall through an Oculus Rift VR headset, control the systems with hand gestures alone via a Leap Motion sensor without any separate input device, and hear the sound the systems produce through headphones.
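Of the audio components mentioned, the artificial reverberation algorithm is the easiest to illustrate; a feedback comb filter, the classic building block of Schroeder-style reverberators, can be sketched as follows (a generic textbook form, not necessarily the paper's exact algorithm):

```python
def comb_filter(signal, delay, feedback):
    """Feedback comb filter, a building block of artificial reverberation:
    y[n] = x[n] + feedback * y[n - delay].
    Each input sample keeps echoing every `delay` samples, decaying by
    `feedback` per repetition."""
    out = list(signal)
    for n in range(delay, len(out)):
        out[n] += feedback * out[n - delay]
    return out
```

A full reverberator would run several such combs with different delays in parallel and follow them with all-pass stages; seat-dependent delay and feedback values would then model the listener's position in the hall.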