• Title/Summary/Keyword: IMU(Inertial Measurement Unit)


Design and Implementation of Unmanned Surface Vehicle JEROS for Jellyfish Removal (해파리 퇴치용 자율 수상 로봇의 설계 및 구현)

  • Kim, Donghoon;Shin, Jae-Uk;Kim, Hyongjin;Kim, Hanguen;Lee, Donghwa;Lee, Seung-Mok;Myung, Hyun
    • The Journal of Korea Robotics Society / v.8 no.1 / pp.51-57 / 2013
  • Recently, the number of jellyfish has grown rapidly because of global warming, the increase in marine structures, and pollution. The increased jellyfish population threatens the marine ecosystem and causes significant damage to fisheries, seaside power plants, and beach tourism. To address this problem, researchers have developed a manual jellyfish dissecting device and a pump system for jellyfish removal. However, these systems require many human operators and their cost-effectiveness is poor. Thus, this paper presents the design, implementation, and experiments of an autonomous jellyfish removal robot system named JEROS. JEROS consists of an unmanned surface vehicle (USV), a jellyfish removal device, an electrical control system, an autonomous navigation system, and a vision-based jellyfish detection system. The USV was designed as a twin-hull ship, and the jellyfish removal device consists of a net for gathering jellyfish and a blade-equipped propeller for dissecting them. The autonomous navigation system starts by generating an efficient path for jellyfish removal when the location of jellyfish is received from a remote server or recognized by the vision system. The location of JEROS is estimated by an IMU (Inertial Measurement Unit) and GPS, and jellyfish are eliminated while the robot tracks the path. The performance of vision-based jellyfish recognition, navigation, and jellyfish removal was demonstrated through field tests in Masan and Jindong harbors on the southern coast of Korea.
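As a rough illustration of the path-tracking step described in this abstract (a generic sketch, not the authors' implementation; all names are hypothetical), waypoint guidance from a GPS/IMU position estimate reduces to computing a bearing to the next waypoint and a wrapped heading error:

```python
import math

def heading_to_waypoint(pos, waypoint):
    """Bearing (rad) from the current position to the target waypoint
    in a local x-east, y-north frame."""
    dx = waypoint[0] - pos[0]
    dy = waypoint[1] - pos[1]
    return math.atan2(dy, dx)

def heading_error(desired, current):
    """Smallest signed angle between desired and current heading, wrapped to [-pi, pi]."""
    err = desired - current
    return math.atan2(math.sin(err), math.cos(err))

# Example: vehicle at the origin heading east, waypoint to the north-east
desired = heading_to_waypoint((0.0, 0.0), (10.0, 10.0))
err = heading_error(desired, 0.0)  # steering correction fed to the controller
```

A heading controller would then turn `err` into differential thrust commands; the wrap step matters so the vehicle never turns the long way around.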

Gimbal System Control for Drone for 3D Image (입체영상 촬영을 위한 드론용 짐벌시스템 제어)

  • Kim, Min;Byun, Gi-Sig;Kim, Gwan-Hyung
    • Journal of the Korea Institute of Information and Communication Engineering / v.20 no.11 / pp.2107-2112 / 2016
  • This paper develops a gimbal control stabilizer for drones to obtain clean 3D images despite the shaking and vibration of the drone platform. The stabilizer consists of a mount that holds the camera modules at precise angles using an IMU (Inertial Measurement Unit) sensor module, isolating the camera modules from external vibrations. It is difficult for camera modules to capture clean images because of the irregular movements and various vibrations produced by a flying drone. Moreover, a conventional PID controller applied to the roll, pitch, and yaw axes often requires its parameters to be retuned to cope with vibrations at various frequencies. Therefore, this paper applies an intelligent PID controller and designs the gimbal control stabilizer to obtain clean images and to mitigate the irregular movement and vibration problems described above.
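The conventional PID loop the paper starts from can be sketched as a generic discrete controller for one gimbal axis (gains here are illustrative assumptions; the paper's intelligent PID additionally adapts these gains online):

```python
class PID:
    """Discrete PID controller for one gimbal axis (roll, pitch, or yaw)."""

    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def update(self, setpoint, measured, dt):
        """One control step: returns the corrective command for this axis."""
        error = setpoint - measured
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

# Example: hold the pitch axis level (setpoint 0 deg) given an IMU reading of 5 deg
pid = PID(kp=2.0, ki=0.1, kd=0.05)
u = pid.update(setpoint=0.0, measured=5.0, dt=0.01)  # negative command, tilting back toward 0
```

Retuning `kp`, `ki`, `kd` by hand for each new vibration regime is exactly the burden the intelligent PID variant is meant to remove.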

Method to Improve Localization and Mapping Accuracy on the Urban Road Using GPS, Monocular Camera and HD Map (GPS와 단안카메라, HD Map을 이용한 도심 도로상에서의 위치측정 및 맵핑 정확도 향상 방안)

  • Kim, Young-Hun;Kim, Jae-Myeong;Kim, Gi-Chang;Choi, Yun-Soo
    • Korean Journal of Remote Sensing / v.37 no.5_1 / pp.1095-1109 / 2021
  • The technology used to recognize the location and surroundings of an autonomous vehicle is called SLAM. SLAM stands for Simultaneous Localization and Mapping and has recently been actively applied to autonomous vehicle research, building on earlier robotics work. Expensive GPS, INS, LiDAR, RADAR, and wheel odometry allow precise self-positioning and mapping at the centimeter level. However, if similar accuracy can be achieved with cheaper cameras and GPS data, it will help advance the era of autonomous driving. In this paper, we present a method that fuses a monocular camera with RTK-enabled GPS data to perform localization and mapping on urban roads with an RMSE of 33.7 cm.
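The reported 33.7 cm figure is a root-mean-square error over estimated versus reference positions; a minimal sketch of that metric (a hypothetical helper, not code from the paper):

```python
import math

def rmse(estimated, reference):
    """Root-mean-square 2D position error between an estimated trajectory
    and reference (e.g. surveyed ground-truth) positions of equal length."""
    assert len(estimated) == len(reference)
    sq = [(ex - rx) ** 2 + (ey - ry) ** 2
          for (ex, ey), (rx, ry) in zip(estimated, reference)]
    return math.sqrt(sum(sq) / len(sq))

# Example: two estimated points, each 1 m from its reference
error = rmse([(0.0, 0.0), (1.0, 1.0)], [(0.0, 1.0), (1.0, 0.0)])
```

Evaluating against RTK-surveyed reference points in this way is the standard means of quoting localization accuracy in metres.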

Analysis and Training Contents of Body Balance Ability using Range of Motion of Lumbar Spine and Center of Body Pressure (요추 관절가동범위와 신체압력중심을 이용한 신체균형능력 분석 및 훈련 콘텐츠)

  • Goo, Sejin;Kim, Dong-Yeon;Shin, Sung-Wook;Chung, Sung-Taek
    • The Journal of the Institute of Internet, Broadcasting and Communication / v.19 no.1 / pp.279-287 / 2019
  • In this paper, we analyze the balance ability of the body by measuring changes in body motion and plantar pressure distribution. We developed a program that measures and analyzes the range of motion and the center of body pressure using an inertial measurement unit (IMU) and FSR (Force Sensing Resistor) sensors, and we also produced training content that helps improve balance ability. The quantitative values of range of motion and center of pressure measured by this program are visualized in real time so that the user can easily interpret the results. In addition, the content was designed so that its difficulty adjusts according to the measured balance information, guiding the user toward improved balance. Concentration and willingness to participate are increased through visual feedback, in which the user watches objects move in response to his or her own motion.
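The center of body pressure from an array of FSR cells is conventionally computed as the force-weighted mean of the cell positions; a minimal sketch under that standard assumption (the paper's exact sensor layout is not given):

```python
def center_of_pressure(readings):
    """Center of pressure as the force-weighted mean of sensor positions.

    readings: list of ((x, y), force) pairs, one per FSR cell under the feet.
    Returns (0.0, 0.0) when no force is registered."""
    total = sum(f for _, f in readings)
    if total == 0:
        return (0.0, 0.0)
    x = sum(px * f for (px, py), f in readings) / total
    y = sum(py * f for (px, py), f in readings) / total
    return (x, y)

# Example: equal load on two cells 2 units apart -> midpoint
cop = center_of_pressure([((0.0, 0.0), 1.0), ((2.0, 0.0), 1.0)])
```

Tracking how this point drifts over time is what distinguishes stable from unstable balance in such systems.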

Physical Offset of UAVs Calibration Method for Multi-sensor Fusion (다중 센서 융합을 위한 무인항공기 물리 오프셋 검보정 방법)

  • Kim, Cheolwook;Lim, Pyeong-chae;Chi, Junhwa;Kim, Taejung;Rhee, Sooahm
    • Korean Journal of Remote Sensing / v.38 no.6_1 / pp.1125-1139 / 2022
  • In an unmanned aerial vehicle (UAV) system, a physical offset can exist between the global positioning system/inertial measurement unit (GPS/IMU) sensor and an observation sensor such as a hyperspectral or lidar sensor. As a result of this physical offset, a misalignment between images can occur along the flight direction. In particular, in a multi-sensor system, observation sensors must be swapped regularly, and a high cost must then be paid to re-acquire the calibration parameters. In this study, we establish a precise sensor model equation that applies to multiple sensors in common and propose an independent physical offset estimation method. The proposed method consists of three steps. First, we define an appropriate rotation matrix for our system and an initial sensor model equation for direct georeferencing. Next, an observation equation for physical offset estimation is established by extracting correspondences between ground control points and the observed sensor data. Finally, the physical offset is estimated from the observed data, and the precise sensor model equation is established by applying the estimated parameters to the initial sensor model equation. Datasets from four regions with different latitudes and longitudes (Jeon-ju, Incheon, Alaska, Norway) were compared to analyze the effect of the calibration parameters. We confirmed that the misalignment between images was corrected after applying the physical offset in the sensor model equation. Absolute position accuracy was analyzed in the Incheon dataset against ground control points: for the hyperspectral image, the root mean square error (RMSE) in the X and Y directions was 0.12 m, and for the point cloud, the RMSE was 0.03 m. Furthermore, the relative position accuracy for specific points between the adjusted point cloud and the hyperspectral images was 0.07 m, confirming that precise data mapping is possible without ground control points through the proposed estimation method and demonstrating the feasibility of multi-sensor fusion. From this study, we expect that a flexible multi-sensor platform can be operated economically through the independent parameter estimation method.
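A direct-georeferencing sensor model of the kind described above generally has the form X_ground = X_GPS/IMU + R(attitude) * (lever-arm offset + scale * viewing ray). The sketch below models only yaw for brevity (the paper's full rotation-matrix convention and parameters are not reproduced here; all names are illustrative):

```python
import math

def rot_yaw(psi):
    """3x3 rotation matrix about the vertical axis by yaw angle psi (rad)."""
    c, s = math.cos(psi), math.sin(psi)
    return [[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]]

def matvec(M, v):
    return [sum(M[i][j] * v[j] for j in range(3)) for i in range(3)]

def georeference(gps_pos, yaw, lever_arm, ray, scale):
    """Ground point = GPS/IMU position + R(attitude) * (lever_arm + scale * ray).

    gps_pos: platform position from GPS/IMU; lever_arm: physical offset to the
    observation sensor (the quantity the paper estimates); ray: unit viewing
    direction in the body frame; scale: range along the ray."""
    body = [lever_arm[i] + scale * ray[i] for i in range(3)]
    world = matvec(rot_yaw(yaw), body)
    return [gps_pos[i] + world[i] for i in range(3)]

# Example: platform at (100, 200, 50) yawed 90 deg, looking straight down 50 m,
# with a 0.5 m forward lever arm between GPS/IMU and the observation sensor
p = georeference([100.0, 200.0, 50.0], math.pi / 2, [0.5, 0.0, 0.0], [0.0, 0.0, -1.0], 50.0)
```

An unestimated `lever_arm` shifts every ground point by a rotated constant, which is precisely the along-track misalignment the calibration removes.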

Development and Performance Evaluation of Multi-sensor Module for Use in Disaster Sites of Mobile Robot (조사로봇의 재난현장 활용을 위한 다중센서모듈 개발 및 성능평가에 관한 연구)

  • Jung, Yonghan;Hong, Junwooh;Han, Soohee;Shin, Dongyoon;Lim, Eontaek;Kim, Seongsam
    • Korean Journal of Remote Sensing / v.38 no.6_3 / pp.1827-1836 / 2022
  • Disasters occur unexpectedly and are difficult to predict. In addition, their scale and damage are increasing compared to the past, and one disaster can sometimes develop into another. Among the four stages of disaster management, search and rescue are carried out in the response stage when an emergency occurs, so personnel such as firefighters deployed to the scene face considerable risk. In this respect, robots are a technology with high potential to reduce loss of human life and property during the initial response at a disaster site. In addition, Light Detection and Ranging (LiDAR) can acquire 3D information over a relatively wide range using a laser; its high accuracy and precision make it a very useful sensor given the characteristics of a disaster site. Therefore, in this study, development and experiments were conducted so that a robot could perform real-time monitoring at a disaster site. A multi-sensor module was developed by combining LiDAR, an Inertial Measurement Unit (IMU) sensor, and a computing board. This module was mounted on the robot, and a customized Simultaneous Localization and Mapping (SLAM) algorithm was developed. A method for stably mounting the multi-sensor module on the robot to maintain optimal accuracy at disaster sites was studied. To check the performance of the module, SLAM was tested inside a disaster building, and various SLAM algorithms and distance comparisons were evaluated. As a result, PackSLAM, developed in this study, showed lower error than the other algorithms, showing its potential for application at disaster sites. In the future, various experiments will be conducted in rough terrain environments with many obstacles to further enhance usability at disaster sites.
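In a LiDAR/IMU SLAM pipeline such as the one described, the IMU typically supplies a motion prediction between scans. A planar dead-reckoning step illustrates that front-end role (a generic sketch, not PackSLAM itself; names are hypothetical):

```python
import math

def integrate_pose(pose, v, yaw_rate, dt):
    """Propagate a planar pose (x, y, theta) one step, using forward speed v
    from odometry and yaw rate from the IMU gyro (simple Euler integration).
    This predicted pose seeds the scan-matching step of a SLAM front end."""
    x, y, th = pose
    x += v * math.cos(th) * dt
    y += v * math.sin(th) * dt
    th += yaw_rate * dt
    return (x, y, th)

# Example: drive 1 m east, then 1 m while turning 90 degrees
pose = integrate_pose((0.0, 0.0, 0.0), v=1.0, yaw_rate=0.0, dt=1.0)
pose = integrate_pose(pose, v=1.0, yaw_rate=math.pi / 2, dt=1.0)
```

Scan matching then corrects the drift that such open-loop integration accumulates, which is where LiDAR accuracy pays off.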

Extraction of Sea Surface Temperature in Coastal Area Using Ground-Based Thermal Infrared Sensor On-Boarded to Aircraft (지상용 열적외선 센서의 항공기 탑재를 통한 연안 해수표층온도 추출)

  • Kang, Ki-Mook;Kim, Duk-Jin;Kim, Seung Hee;Cho, Yang-Ki;Lee, Sang-Ho
    • Korean Journal of Remote Sensing / v.30 no.6 / pp.797-807 / 2014
  • Sea Surface Temperature (SST) is one of the most important oceanic environmental factors in determining changes in marine environments and ecological activity. Satellite thermal infrared images are effective for understanding global trends in sea surface temperature because of their wide coverage, but their low spatial resolution is limiting in areas with complicated, finely structured coastlines and many islands, as in the Korean Peninsula. The coastal ocean is also very important because human activities interact with environmental change there and most aquaculture is located in coastal waters. Thus, low-cost airborne thermal infrared remote sensing with high-resolution capability was considered to verify its ability to extract SST and to monitor changes in the coastal environment. In this study, an airborne thermal infrared system was implemented using a low-cost, ground-based thermal infrared camera (FLIR), and more than eight airborne acquisitions were carried out over the western coast of the Korean Peninsula between May 23, 2012 and December 7, 2013. The acquired thermal infrared images were radiometrically calibrated using an atmospheric radiative transfer model supported by a temperature-humidity sensor, and geometrically calibrated using GPS and IMU sensors. In particular, the airborne sea surface temperature acquired on June 25, 2013 was compared with and verified against satellite SST as well as ship-borne thermal infrared and in-situ SST data. As a result, the airborne thermal infrared sensor extracted SST with an accuracy of 1°C.
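Radiometric calibration of the kind mentioned above inverts an atmospheric radiative transfer model to recover surface-leaving radiance from at-sensor radiance. A single-layer sketch (simplified coefficients for illustration, not the model actually used in the paper):

```python
def surface_radiance(l_sensor, tau, emissivity, l_up, l_down):
    """Invert a single-layer thermal radiative transfer model:

        L_sensor = tau * eps * L_surf + tau * (1 - eps) * L_down + L_up

    where tau is atmospheric transmittance, eps the surface emissivity,
    L_up the upwelling path radiance, and L_down the downwelling sky radiance.
    Returns the surface-leaving radiance L_surf, from which a brightness
    temperature can then be derived."""
    return (l_sensor - l_up - tau * (1.0 - emissivity) * l_down) / (tau * emissivity)

# Example: forward-model an at-sensor radiance, then recover the surface term
l_at_sensor = 0.9 * 0.98 * 10.0 + 0.9 * 0.02 * 1.0 + 0.5
l_surf = surface_radiance(l_at_sensor, tau=0.9, emissivity=0.98, l_up=0.5, l_down=1.0)
```

The temperature-humidity sensor mentioned in the abstract serves to constrain `tau`, `l_up`, and `l_down` for the actual atmosphere at flight time.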

Development of Robot Platform for Autonomous Underwater Intervention (수중 자율작업용 로봇 플랫폼 개발)

  • Yeu, Taekyeong;Choi, Hyun Taek;Lee, Yoongeon;Chae, Junbo;Lee, Yeongjun;Kim, Seong Soon;Park, Sanghyun;Lee, Tae Hee
    • Journal of Ocean Engineering and Technology / v.33 no.2 / pp.168-177 / 2019
  • KRISO (Korea Research Institute of Ship & Ocean Engineering) started a project in 2017 to develop the core algorithms for autonomous intervention using an underwater robot. This paper introduces the development of the robot platform for these core algorithms, an ROV (Remotely Operated Vehicle) with one 7-function manipulator. Before the detailed design of the robot platform, the 7E-MINI arm of the ECA Group was selected as the manipulator. It is an electric manipulator with a weight of 51 kg in air (30 kg in water) and a full reach of 1.4 m. To design a platform small and light enough to fit in a water tank, the medium-size manipulator was placed at the center of the platform, and a structural analysis of the body frame was conducted with ABAQUS. The robot has an IMU (Inertial Measurement Unit), a DVL (Doppler Velocity Log), and a depth sensor for measuring its underwater position and attitude. To control the robot's motion, eight thrusters were installed: four for vertical and four for horizontal motion. The operation system is composed of an on-board control station and operation software. The former includes devices such as a 300 VDC power supply, a fiber-optic (F/O) to Ethernet communication converter, and a main control PC. The latter was developed using ROS (Robot Operating System) on Linux. The basic performance of the manufactured robot platform was verified through a water tank test, in which the robot was manually operated with a joystick and the robot's motion and attitude variation caused by manipulator movement were closely observed.
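Distributing desired surge, sway, and yaw commands over the four horizontal thrusters is a standard allocation problem. The sketch below assumes a symmetric 45-degree X layout with illustrative lever arms (the platform's actual thruster geometry is not stated in the abstract):

```python
import math

def allocate_horizontal(fx, fy, mz, l=0.3, w=0.2):
    """Minimum-norm allocation of surge force fx, sway force fy, and yaw
    moment mz to four horizontal thrusters in a symmetric 45-degree X
    configuration (assumed geometry: longitudinal arm l, lateral arm w).
    Thruster order: front-left, front-right, rear-left, rear-right."""
    s = math.cos(math.pi / 4)  # axial force component per unit thrust
    k = s * (l + w)            # yaw moment per unit thrust
    return [
        fx / (4 * s) - fy / (4 * s) - mz / (4 * k),
        fx / (4 * s) + fy / (4 * s) + mz / (4 * k),
        fx / (4 * s) + fy / (4 * s) - mz / (4 * k),
        fx / (4 * s) - fy / (4 * s) + mz / (4 * k),
    ]

# Example: a combined surge/sway/yaw command split across the four thrusters
thrusts = allocate_horizontal(fx=3.0, fy=1.0, mz=0.5)
```

Because the three command directions are decoupled in this geometry, the closed form above matches the pseudo-inverse solution without needing a linear-algebra library.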