• Title/Summary/Keyword: Motion Recognition Platform

Search results: 21

Android Platform based Gesture Recognition using Smart Phone Sensor Data (안드로이드 플랫폼기반 스마트폰 센서 정보를 활용한 모션 제스처 인식)

  • Lee, Yong Cheol;Lee, Chil Woo
    • Smart Media Journal
    • /
    • v.1 no.4
    • /
    • pp.18-26
    • /
    • 2012
  • The growing number of smartphone applications has underscored the importance of new user interfaces and raised interest in research on the fusion of multiple sensors. In this paper, we propose a method that fuses accelerometer, magnetometer, and gyroscope data to recognize gestures from the motion of a user's smartphone. The proposed method first obtains the 3D orientation of the smartphone and then recognizes hand-motion gestures using a Hidden Markov Model (HMM). The 3D orientation is represented in spherical coordinates and quantized so that the representation is more sensitive to the axis of rotation. Experimental results show that the success rate of our method is 93%.

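
The abstract above describes two steps: estimating the phone's 3D orientation and quantizing that orientation in spherical coordinates into symbols for an HMM. A minimal sketch of the quantization idea follows; the function names, bin counts, and gravity-only orientation estimate are my own assumptions, not details from the paper (which also fuses magnetometer and gyro data):

```python
import math

def accel_to_orientation(ax, ay, az):
    """Derive spherical angles from an accelerometer gravity vector
    (a common way to estimate static orientation; sensor fusion with
    magnetometer/gyro, as in the paper, is omitted here)."""
    norm = math.sqrt(ax * ax + ay * ay + az * az)
    theta = math.acos(az / norm)   # polar angle: tilt from the z-axis
    phi = math.atan2(ay, ax)       # azimuth in the x-y plane
    return theta, phi

def orientation_to_symbol(theta, phi, n_theta=4, n_phi=8):
    """Quantize an orientation (theta in [0, pi], phi in [-pi, pi]) into
    one of n_theta * n_phi discrete observation symbols that an HMM
    could consume as its observation sequence."""
    t = min(int(theta / math.pi * n_theta), n_theta - 1)
    p = min(int((phi + math.pi) / (2 * math.pi) * n_phi), n_phi - 1)
    return t * n_phi + p
```

A gesture would then be a sequence of such symbols over time, scored against per-gesture HMMs.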

Healthcare and Emergency Response Service Platform Based on Android Smartphone

  • Choi, Hoan-Suk;Rhee, Woo-Seop
    • International Journal of Contents
    • /
    • v.16 no.1
    • /
    • pp.75-86
    • /
    • 2020
  • As the population ages, the elderly are experiencing many problems: social security costs for the elderly are increasing, and social disconnection is emerging. The social infrastructure and welfare system established during the past period of economic growth are thus in danger of no longer functioning properly. Elderly people who are socially isolated or have chronic diseases are exposed to various accidents, so an active healthcare management service is imperative. Additionally, in the event of a dangerous situation, the system must be able to notify guardians (family or medical personnel) so they can take appropriate action. Thus, in this paper, we propose a smartphone-based healthcare and emergency response service platform. The proposed platform aggregates movement-related data in real time using a smartphone. Based on the aggregated data, it continuously recognizes the user's movements and current state using a human motion recognition mechanism. The platform thus provides real-time status monitoring, activity reports, a health calendar, location-based hospital information, emergency situation detection, and efficient notification, via a cloud messaging server, to subscribers such as family, guardians, and medical personnel. Through this service, users or guardians can raise the level of care for the elderly through the reports. Also, if an emergency situation is detected, the system immediately informs guardians so as to minimize risk through an immediate response.
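
The abstract does not detail how the emergency-detection step works. As a purely illustrative sketch (the heuristic, thresholds, and function name are my own assumptions), a common smartphone-based approach flags a fall when a near-free-fall dip in accelerometer magnitude is followed shortly by an impact spike:

```python
def detect_fall(magnitudes, free_fall_g=0.4, impact_g=2.5, window=10):
    """Hypothetical fall heuristic over accelerometer magnitudes (in g):
    a near-free-fall dip followed within `window` samples by a large
    impact spike triggers an emergency flag."""
    dip = None
    for i, a in enumerate(magnitudes):
        if a < free_fall_g:
            dip = i                      # remember the last dip
        elif dip is not None and a > impact_g and i - dip <= window:
            return True                  # dip then impact: likely fall
    return False
```

A real service would combine such signals with posture and inactivity checks before notifying guardians.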

Development of Motion Recognition and Real-time Positioning Technology for Radiotherapy Patients Using Depth Camera and YOLOAddSeg Algorithm (뎁스카메라와 YOLOAddSeg 알고리즘을 이용한 방사선치료환자 미세동작인식 및 실시간 위치보정기술 개발)

  • Ki Yong Park;Gyu Ha Ryu
    • Journal of Biomedical Engineering Research
    • /
    • v.44 no.2
    • /
    • pp.125-138
    • /
    • 2023
  • The development of AI systems for radiation therapy is important for improving the accuracy, effectiveness, and safety of cancer treatment. Current systems monitor patients using CCTV, which can allow errors and mistakes in the treatment process and lead to misalignment of the radiation beam. We developed the PMRP system, an AI automation system that uses depth cameras to measure a patient's fine movements, segments the patient's body into parts, aligns the measured Z values of the depth camera, and transmits the measured feedback to positioning devices in real time, monitoring errors during treatment. The need for such a system arose because a CCTV-based visual monitoring system cannot detect fine movements, Z-direction movements, or movements of individual body parts, hindering improvement of radiation therapy performance and increasing the risk of side effects in normal tissue. This study can contribute to the development of radiotherapy, a field that lags in many parts of the world, and carries economic and social importance as an independent platform for radiotherapy devices. The study verified the system's effectiveness and efficiency with data from phantom experiments; future studies aim to improve treatment performance by refining the posture correction mechanism and correcting left-right and up-down movements in real time.
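
The feedback loop described above (compare a segmented body part's depth against a reference and send a correction to the positioning device) can be sketched in outline. This is a minimal illustration only; the function name, the use of a median, and the tolerance value are my own assumptions, not the paper's implementation:

```python
import statistics

def z_correction(depth_roi_mm, reference_z_mm, tolerance_mm=2.0):
    """Compare the median depth (Z) of a segmented body-part region
    against its planned reference value and return the correction (mm)
    a positioning device would apply, or 0.0 if within tolerance."""
    current = statistics.median(depth_roi_mm)   # robust to depth noise
    delta = current - reference_z_mm
    return delta if abs(delta) > tolerance_mm else 0.0
```

In a real system this value would be streamed continuously per body part, with safety interlocks around any motion command.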

Intelligent interface using hand gestures recognition based on artificial intelligence (인공지능 기반 손 체스처 인식 정보를 활용한 지능형 인터페이스)

  • Hangjun Cho;Junwoo Yoo;Eun Soo Kim;Young Jae Lee
    • Journal of Platform Technology
    • /
    • v.11 no.1
    • /
    • pp.38-51
    • /
    • 2023
  • We propose an intelligent interface algorithm that uses hand gesture recognition based on artificial intelligence. Functionally, the method is an interface that recognizes various motions quickly and intelligently by using MediaPipe together with artificial intelligence techniques such as KNN, LSTM, and CNN to track and recognize the user's hand gestures. To evaluate the performance of the proposed algorithm, it was applied to a self-made 2D top-view racing game and to robot control. When applied to the game, it was possible to control the various movements of virtual objects in detail and robustly. When applied to robot control in the real world, it was possible to control movement, stop, left turn, and right turn. In addition, by controlling the main character of the game and the real-world robot at the same time, the optimized motion was implemented as an intelligent interface for controlling a coexistence space of the virtual and real worlds. The proposed algorithm enables sophisticated control in a natural and intuitive way using the body and fine finger movements, and has the advantage that it can be mastered in a short period of time, so it can serve as a basis for developing intelligent user interfaces.

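
One of the classifiers the abstract names is KNN over tracked hand-gesture features. As a minimal sketch (the feature vectors stand in for, e.g., joint angles derived from MediaPipe hand landmarks; the landmark-extraction step and all names here are my own assumptions), a plain k-nearest-neighbour vote looks like:

```python
import math
from collections import Counter

def knn_classify(train, query, k=3):
    """Classify a feature vector by majority vote among its k nearest
    training samples. `train` is a list of (vector, label) pairs, e.g.
    joint-angle features extracted from hand landmarks."""
    neighbours = sorted(train, key=lambda s: math.dist(s[0], query))[:k]
    votes = Counter(label for _, label in neighbours)
    return votes.most_common(1)[0][0]
```

LSTM/CNN variants, as mentioned in the abstract, would instead consume sequences or images of the same tracked landmarks.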

Localization using Ego Motion based on Fisheye Warping Image (어안 워핑 이미지 기반의 Ego motion을 이용한 위치 인식 알고리즘)

  • Choi, Yun Won;Choi, Kyung Sik;Choi, Jeong Won;Lee, Suk Gyu
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.20 no.1
    • /
    • pp.70-77
    • /
    • 2014
  • This paper proposes a novel localization algorithm based on ego-motion, which uses Lucas-Kanade optical flow and warping images obtained through fish-eye lenses mounted on the robot. An omnidirectional image sensor is desirable for real-time view-based recognition because all information around the robot can be obtained simultaneously. Preprocessing (distortion correction, image merging, etc.) of the omnidirectional image, whether obtained by a camera using a reflecting mirror or by combining multiple camera images, is essential because it is difficult to obtain information from the original image. The core of the proposed algorithm can be summarized as follows. First, we capture instantaneous 360° panoramic images around the robot through fish-eye lenses mounted facing downward. Second, we extract motion vectors using Lucas-Kanade optical flow on the preprocessed image. Third, we estimate the robot's position and angle using an ego-motion method based on the directions of the vectors and the vanishing point obtained by RANSAC. We confirmed the reliability of the proposed localization algorithm by comparing the position and angle obtained by the proposed algorithm with those measured by a Global Vision Localization System.
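
The second step above, extracting motion vectors with Lucas-Kanade optical flow, reduces to a small least-squares problem per window. The sketch below (single window, dense gradients; function name mine, and real pipelines would use a pyramidal implementation such as OpenCV's) solves the classic system built from spatial gradients Ix, Iy and the temporal difference It:

```python
import numpy as np

def lucas_kanade_window(prev, curr):
    """Estimate one (u, v) motion vector for an image window by solving
    the Lucas-Kanade least-squares system: stack spatial gradients as
    rows of A, temporal differences as b, and solve A d = -b."""
    prev = prev.astype(float)
    Ix = np.gradient(prev, axis=1)            # horizontal gradient
    Iy = np.gradient(prev, axis=0)            # vertical gradient
    It = curr.astype(float) - prev            # temporal difference
    A = np.stack([Ix.ravel(), Iy.ravel()], axis=1)
    d, *_ = np.linalg.lstsq(A, -It.ravel(), rcond=None)
    return d                                  # (u, v) in pixels
```

Applied over many windows of the warped panoramic image, these vectors feed the RANSAC vanishing-point and ego-motion stage.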

A Study of Hand Gesture Recognition for Human Computer Interface (컴퓨터 인터페이스를 위한 Hand Gesture 인식에 관한 연구)

  • Chang, Ho-Jung;Baek, Han-Wook;Chung, Chin-Hyun
    • Proceedings of the KIEE Conference
    • /
    • 2000.07d
    • /
    • pp.3041-3043
    • /
    • 2000
  • The GUI (graphical user interface) has been the dominant platform for HCI (human computer interaction). The GUI-based style of interaction has made computers simpler and easier to use. However, the GUI will not easily support the range of interaction necessary to meet users' needs in ways that are natural, intuitive, and adaptive. In this paper we study an approach to tracking a hand in an image sequence and recognizing it in each video frame, in order to replace the mouse as a pointing device for virtual reality. An algorithm for real-time processing is proposed that estimates the position of the hand and segments it, considering the orientation of motion and the color distribution of the hand region.

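
Once the hand region is segmented as described above, the pointer position can be read off as the centroid of the segmented region. A minimal illustration (the function name and mask representation are my own assumptions, not the paper's):

```python
def region_centroid(mask):
    """Estimate the hand position as the centroid of a binary
    segmentation mask (iterable of rows of 0/1 values); returns
    (x, y) in pixel coordinates, or None for an empty mask."""
    xs = ys = n = 0
    for y, row in enumerate(mask):
        for x, v in enumerate(row):
            if v:
                xs += x
                ys += y
                n += 1
    return (xs / n, ys / n) if n else None
```

Tracking the centroid frame to frame gives the mouse-replacement pointer trajectory.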

Design and implement of the Educational Humanoid Robot D2 for Emotional Interaction System (감성 상호작용을 갖는 교육용 휴머노이드 로봇 D2 개발)

  • Kim, Do-Woo;Chung, Ki-Chull;Park, Won-Sung
    • Proceedings of the KIEE Conference
    • /
    • 2007.07a
    • /
    • pp.1777-1778
    • /
    • 2007
  • In this paper, we design and implement a humanoid robot for educational purposes that can collaborate and communicate with humans. We present an affective human-robot communication system for the humanoid robot D2, which we designed to communicate with a human through dialogue. D2 communicates with humans by understanding and expressing emotion using facial expressions, voice, gestures, and posture. Interaction between a human and the robot is made possible through our affective communication framework, which enables the robot to recognize the emotional state of the user and respond appropriately; as a result, the robot can engage in natural dialogue with a human. To interact with humans through voice, gestures, and posture, the developed educational humanoid robot consists of an upper body, two arms, a wheeled mobile platform, and control hardware providing vision and speech capability, including various control boards such as motion control boards and a signal processing board handling several types of sensors. Using the educational humanoid robot D2, we presented successful demonstrations consisting of manipulation tasks with two arms, object tracking using the vision system, and communication with humans through the emotional interface, synthesized speech, and recognition of speech commands.


Development of ROS2-on-Yocto-based Thin Client Robot for Cloud Robotics (클라우드 연동을 위한 ROS2 on Yocto 기반의 Thin Client 로봇 개발)

  • Kim, Yunsung;Lee, Dongoen;Jeong, Seonghoon;Moon, Hyeongil;Yu, Changseung;Lee, Kangyoung;Choi, Juneyoul;Kim, Youngjae
    • The Journal of Korea Robotics Society
    • /
    • v.16 no.4
    • /
    • pp.327-335
    • /
    • 2021
  • In this paper, we propose an embedded robot system based on "ROS2 on Yocto" that can support various robots. We developed a lightweight OS based on the Yocto Project as a next-generation robot platform targeting cloud robotics. The Yocto Project was adopted for portability and scalability in both software and hardware, and ROS2 was adopted and optimized for a low-specification embedded hardware system. We developed SLAM, navigation, path planning, and motion for validation of the proposed robot system. To verify the software packages, we applied them to a home cleaning robot and an indoor delivery robot already commercialized by LG Electronics and confirmed that they can perform autonomous driving, obstacle recognition, and avoidance driving. Memory usage and network I/O were improved by applying a binary launch method based on shell scripts and mmap, as opposed to the conventional Python-based launch method. Finally, we verified the possibility of mass production and commercialization of the proposed system through performance evaluation from the CPU and memory perspectives.
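
The abstract credits part of the memory improvement to using mmap instead of copying data into each process. As a generic illustration of that idea (the file name and sizes are invented; the paper's actual launch mechanism is shell/binary based, not Python), memory-mapping lets large read-only data be demand-paged and shared rather than duplicated:

```python
import mmap
import os
import tempfile

# Create a stand-in for a large read-only asset (e.g. a map or model).
path = os.path.join(tempfile.mkdtemp(), "map.bin")
with open(path, "wb") as f:
    f.write(b"\x01" * 4096)

# Map it read-only: pages are loaded on demand and can be shared
# between processes mapping the same file, instead of being copied.
with open(path, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as m:
        first = m[0]       # touches only the first page
        size = len(m)      # size of the mapping, no read-ahead copy
```

The same principle applies regardless of the launching language; the saving comes from the OS page cache, not from Python.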

The Modified Block Matching Algorithm for a Hand Tracking of an HCI system (HCI 시스템의 손 추적을 위한 수정 블록 정합 알고리즘)

  • Kim Jin-Ok
    • Journal of Internet Computing and Services
    • /
    • v.4 no.4
    • /
    • pp.9-14
    • /
    • 2003
  • A GUI (graphical user interface) has been the dominant platform for HCI (human computer interaction). GUI-based interaction has made computers simpler and easier to use; however, it does not easily support the range of interaction necessary to meet users' needs in ways that are natural, intuitive, and adaptive. In this paper, a modified BMA (block matching algorithm) is proposed to track a hand in an image sequence and to recognize it in each video frame, in order to replace the mouse as a pointing device for virtual reality. An HCI system running at 30 frames per second is realized. The modified BMA estimates the position of the hand and performs segmentation using the orientation of motion and the color distribution of the hand region for real-time processing. The experimental results show that the modified BMA with the YCbCr (luminance Y, chrominance blue Cb, chrominance red Cr) color coordinate guarantees real-time processing and a good recognition rate. Hand tracking by the modified BMA can be applied to virtual reality, games, or HCI systems for the disabled.

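
The core of a block matching algorithm, as referenced above, is finding the displacement that minimizes a block-difference cost between consecutive frames. A minimal unmodified BMA sketch using the common SAD (sum of absolute differences) cost (the paper's specific modifications, and all names here, are my own assumptions):

```python
import numpy as np

def best_match(prev, curr, top, left, bsize=4, search=3):
    """Find the displacement (dy, dx) within +/-search pixels that
    minimizes the SAD between a bsize x bsize block of `prev` anchored
    at (top, left) and candidate blocks in `curr`."""
    block = prev[top:top + bsize, left:left + bsize].astype(int)
    best, best_sad = (0, 0), None
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            y, x = top + dy, left + dx
            if y < 0 or x < 0 or y + bsize > curr.shape[0] or x + bsize > curr.shape[1]:
                continue  # candidate block would fall outside the frame
            sad = np.abs(curr[y:y + bsize, x:x + bsize].astype(int) - block).sum()
            if best_sad is None or sad < best_sad:
                best, best_sad = (dy, dx), sad
    return best
```

Applied to blocks inside the skin-colored (YCbCr) hand region, the winning displacements give the hand's motion between frames.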

Developing Experiential Exhibitions Based on Conservation Science Content of Bronze Mirror

  • Jo, Young Hoon;Kim, Jikio;Yun, Yong Hyun;Cho, Nam Chul;Lee, Chan Hee
    • Journal of Conservation Science
    • /
    • v.37 no.4
    • /
    • pp.362-369
    • /
    • 2021
  • In museums, exhibition content focuses mostly on the historical values and functions of cultural heritage, but doing so tends to limit visitors' interest and immersion. To counter this limitation, this study developed an experiential media art exhibition fusing the traditional production technology of bronze mirrors with modern conservation science. First, for the exhibition system, scientific cultural heritage content was projected on a three-dimensional (3D) printed bronze mirror through interaction between a motion recognition digital information display (DID) and the projector. Then, a scenario of 17 missions in four stages (production process, corrosion mechanism, scientific analysis and diagnosis, and conservation treatment and restoration) was prepared along a temporal spectrum. Additionally, various media art effects and interaction technologies were developed so that visitors could understand and become immersed in the scientific content of bronze mirrors. A user test was conducted through a living lab and reflected generally high levels of satisfaction (90.2 points). Qualitative evaluation was generally positive, with comments such as "easy to understand and useful, as the esoteric science exhibition was combined with media art" (16.7%), "wonderful and interesting" (11.7%), and "the firsthand experience was good" (9.2%). By combining an esoteric science exhibition centered on principles and theories with visual media art, and by developing an immersive directing method providing high-level exhibition technology, the exhibition induced visitors' active participation. This exhibition's content can become an important platform for expanding universal museum exhibitions on archaeology, history, and art into conservation science.