• Title/Summary/Keyword: IMU sensor


Alignment and Navigation of Inertial Navigation and Guidance Unit using Inertial Explorer Software (Inertial Explorer 소프트웨어를 이용한 관성항법유도장치 정렬 및 항법계산)

  • Kim, Jeong-Yong;Oh, Jun-Seok;Roh, Woong-Rae
    • Aerospace Engineering and Technology
    • /
    • v.9 no.1
    • /
    • pp.50-59
    • /
    • 2010
  • In this paper, the alignment and navigation results produced by the INGU (Inertial Navigation and Guidance Unit) onboard software are compared with those of Inertial Explorer, a post-processing software package specialized for IMU (Inertial Measurement Unit) data, in order to identify the inertial sensor error models and estimate the alignment and navigation errors of the KSLV-I INGU. To verify the IMU errors estimated by the Kalman filter of Inertial Explorer, the covariance parameters of the inertial sensor error model states are identified using stochastic sensor error models estimated by Allan variance, and both an alignment and navigation test under static conditions and a land navigation test under dynamic conditions are carried out. The validity of the inertial sensor model for the KSLV-I INGU is verified by comparing the alignment and navigation results of the INGU onboard software and Inertial Explorer.
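The Allan-variance step mentioned in this abstract can be sketched in a few lines. Below is a minimal, self-contained illustration on synthetic white gyro noise; the noise level and cluster sizes are illustrative assumptions, not the KSLV-I sensor parameters.

```python
import random

def allan_variance(samples, m):
    """Non-overlapping Allan variance for cluster size m (in samples)."""
    # Average the raw rate samples over non-overlapping clusters of size m.
    n = len(samples) // m
    means = [sum(samples[i * m:(i + 1) * m]) / m for i in range(n)]
    # Allan variance: half the mean squared difference of adjacent cluster means.
    diffs = [(means[i + 1] - means[i]) ** 2 for i in range(n - 1)]
    return sum(diffs) / (2 * (n - 1))

random.seed(0)
# Synthetic white gyro noise (angle random walk), deg/s at 1 sample per tick.
rate = [random.gauss(0.0, 0.05) for _ in range(20000)]
av = {m: allan_variance(rate, m) for m in (10, 100, 1000)}
```

For pure white noise the Allan variance falls off as 1/m, which is how the slope of the Allan-variance curve is used to separate noise terms (angle random walk, bias instability, rate random walk) when identifying a stochastic sensor error model.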

The Development of Sensor System and 3D World Modeling for Autonomous Vehicle (무인 차량을 위한 센서 시스템 개발 및 3차원 월드 모델링)

  • Kim, Si-Jong;Kang, Jung-Won;Choe, Yun-Geun;Park, Sang-Un;Shim, In-Wook;Ahn, Seung-Uk;Chung, Myung-Jin
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.17 no.6
    • /
    • pp.531-538
    • /
    • 2011
  • This paper describes a novel sensor system for 3D world modeling of an autonomous vehicle in large-scale outdoor environments. When an autonomous vehicle performs path planning and path following, a well-constructed 3D world model of the target environment is essential for analyzing the environment and tracking the determined path. To generate such a model, we developed a novel sensor system consisting of two 2D laser scanners, two single cameras, a DGPS (Differential Global Positioning System) receiver, and an IMU (Inertial Measurement Unit). We verify the effectiveness of the proposed sensor system through experiments in a large-scale outdoor environment.

An Analysis of Inertial Sensor Error Model (관성센서의 오차 모델 분석)

  • Kim, Dae-Young;Hong, Suk-Kyo;Go, Young-Gil
    • Proceedings of the KIEE Conference
    • /
    • 1997.07b
    • /
    • pp.571-574
    • /
    • 1997
  • Accelerometers and gyroscopes, the core components of a navigation system, accumulate linear position errors and orientation errors as time passes, because the error components contained in their output data are integrated during linear position estimation and orientation estimation. Accordingly, the purpose of this paper is to design a low-cost IMU (Inertial Measurement Unit) for precise navigation and to find an accurate error model through prior analysis of the error components.
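The error-accumulation effect described above can be made concrete with a short numerical sketch: a constant accelerometer bias, integrated twice, grows into a position error of roughly 0.5·b·t². The bias value and time spans below are illustrative assumptions.

```python
def position_error(bias, dt, steps):
    """Position error from integrating a constant accelerometer bias twice."""
    v_err = 0.0
    p_err = 0.0
    for _ in range(steps):
        v_err += bias * dt   # first integration: velocity error grows linearly
        p_err += v_err * dt  # second integration: position error grows quadratically
    return p_err

b = 0.01  # hypothetical accelerometer bias, m/s^2
err_10s = position_error(b, 0.01, 1000)    # after 10 s
err_100s = position_error(b, 0.01, 10000)  # after 100 s
```

A 10x longer run yields roughly a 100x larger position error, which is why even a small uncompensated bias must be captured in the error model.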

Design of a Compact GPS/MEMS IMU Integrated Navigation Receiver Module for High Dynamic Environment (고기동 환경에 적용 가능한 소형 GPS/MEMS IMU 통합항법 수신모듈 설계)

  • Jeong, Koo-yong;Park, Dae-young;Kim, Seong-min;Lee, Jong-hyuk
    • Journal of Advanced Navigation Technology
    • /
    • v.25 no.1
    • /
    • pp.68-77
    • /
    • 2021
  • In this paper, a GPS/MEMS IMU integrated navigation receiver module capable of operating in a high dynamic environment is designed and fabricated, and the results are confirmed. The designed module is composed of an RF receiver unit, an inertial measurement unit, a signal processing unit, a correlator, and navigation S/W. The RF receiver performs low-noise amplification, frequency conversion, filtering, and automatic gain control. The inertial measurement unit collects measurement data from a MEMS-class IMU with a 3-axis gyroscope, accelerometer, and geomagnetic sensor, and provides an interface to transmit the data to the navigation S/W. The signal processing unit and the correlator are implemented in FPGA logic to perform filtering and correlation value calculation. The navigation S/W is implemented on the internal CPU of the FPGA. The size of the manufactured module is 95.0×85.0×12.5 mm, its weight is 110 g, and navigation accuracy within the specification is confirmed in an environment of 1,200 m/s velocity and 10 g acceleration.

MULTI-SENSOR DATA FUSION FOR FUTURE TELEMATICS APPLICATION

  • Kim, Seong-Baek;Lee, Seung-Yong;Choi, Ji-Hoon;Choi, Kyung-Ho;Jang, Byung-Tae
    • Journal of Astronomy and Space Sciences
    • /
    • v.20 no.4
    • /
    • pp.359-364
    • /
    • 2003
  • In this paper, we present multi-sensor data fusion for a telematics application. Successful telematics can be realized through the integration of navigation and spatial information, and well-determined acquisition of the vehicle's position plays a vital role in the application service. GPS is used to provide the navigation data, but its performance is limited in areas where satellite visibility is poor. Hence, multi-sensor fusion of an IMU (Inertial Measurement Unit), GPS (Global Positioning System), and DMI (Distance Measurement Indicator) is required to provide the vehicle's position to the service provider and the driver behind the wheel. The multi-sensor fusion is implemented via an algorithm based on the Kalman filtering technique, which enhances navigation accuracy. For verification of the fusion approach, a land vehicle test was performed and the results were discussed. The results showed that horizontal position errors were suppressed to around 1 m accuracy under a simulated non-GPS-availability environment. Under a normal GPS environment, horizontal position errors were under 40 cm on a curved trajectory and 27 cm on a linear trajectory, which definitely depend on the vehicle dynamics.
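The Kalman-filter fusion idea above can be sketched as a toy one-dimensional filter: the dead-reckoned displacement (from IMU/DMI) drives the prediction, and a GPS position fix, when available, drives the update. The motion model and noise values are illustrative assumptions, not the paper's vehicle model.

```python
def kalman_step(x, P, u, z, Q=0.1, R=1.0):
    """One predict/update cycle of a scalar Kalman filter.

    x, P : position estimate and its variance
    u    : dead-reckoned displacement from IMU/DMI
    z    : GPS position measurement, or None during an outage
    Q, R : process and measurement noise variances (assumed values)
    """
    # Predict with the dead-reckoned displacement.
    x_pred = x + u
    P_pred = P + Q
    if z is None:                  # GPS outage: keep the prediction
        return x_pred, P_pred
    K = P_pred / (P_pred + R)      # Kalman gain
    x_new = x_pred + K * (z - x_pred)
    P_new = (1 - K) * P_pred
    return x_new, P_new

x, P = 0.0, 1.0
x, P = kalman_step(x, P, u=1.0, z=2.0)       # fix available
x2, P2 = kalman_step(1.0, 0.5, u=2.0, z=None)  # simulated outage
```

During an outage the variance only grows, which mirrors the paper's observation that errors stay bounded near 1 m only as long as the dead-reckoning sensors remain well calibrated.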

A Time Synchronization Scheme for Vision/IMU/OBD by GPS (GPS를 활용한 Vision/IMU/OBD 시각동기화 기법)

  • Lim, JoonHoo;Choi, Kwang Ho;Yoo, Won Jae;Kim, La Woo;Lee, Yu Dam;Lee, Hyung Keun
    • Journal of Advanced Navigation Technology
    • /
    • v.21 no.3
    • /
    • pp.251-257
    • /
    • 2017
  • Recently, hybrid positioning systems combining GPS, vision sensors, and inertial sensors have drawn much attention as a way to estimate accurate vehicle positions. Since accurate multi-sensor fusion requires efficient time synchronization, this paper proposes an efficient method to obtain time-synchronized measurements of a vision sensor, an inertial sensor, and an OBD device based on GPS time information. In the proposed method, the time and position information is obtained by the GPS receiver, the attitude information by the inertial sensor, and the speed information by the OBD device. The obtained time, position, speed, and attitude information is converted into color information, which is inserted into several corner pixels of the corresponding image frame. An experiment was performed with real measurements to evaluate the feasibility of the proposed method.
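The idea of packing timing information into corner-pixel colors can be sketched as follows. The byte layout (a 32-bit millisecond-of-week count split across four 8-bit color channels) is an illustrative assumption, not the paper's exact encoding.

```python
def encode_time_to_pixels(ms_of_week):
    """Split a 32-bit millisecond count into four 8-bit channel values."""
    return [(ms_of_week >> shift) & 0xFF for shift in (24, 16, 8, 0)]

def decode_time_from_pixels(channels):
    """Reassemble the millisecond count from the stored channel values."""
    value = 0
    for c in channels:
        value = (value << 8) | c
    return value

stamp = 345_678_901  # hypothetical ms into the GPS week
pixels = encode_time_to_pixels(stamp)
recovered = decode_time_from_pixels(pixels)
```

Because the tag travels inside the image frame itself, the timestamp survives any buffering or reordering between the camera and the fusion stage, which is the point of the scheme.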

Design of Solar Tracking CanSat (태양위치추적 캔위성의 개발)

  • Jung, In-Jee;Moon, Ji-Hwan;Kim, Min-Soo;Lim, Byoung-Duk
    • Journal of the Korean Society for Aeronautical & Space Sciences
    • /
    • v.41 no.4
    • /
    • pp.327-334
    • /
    • 2013
  • In August 2012, the first CanSat competition was hosted by the Satellite Research Center of KAIST under the auspices of the Ministry of Education, Science and Technology, and the present authors' team won first prize in the university session. In this paper, the overall procedure of the CanSat project is presented, from the conceptual design stage to the final launch test. As its compulsory mission, the CanSat had to send GPS data and attitude information to the ground station, which in practice was performed via a Bluetooth channel. In addition, our CanSat was designed to track the sun so that the solar panels could supply electric power to the satellite: an IMU and servo motors are used for attitude control so that the solar sensor of the CanSat always points toward the sun. Launching of the CanSat was simulated by dropping it via parachute from a balloon at a height of around 150 m. The launch test results showed that the attitude control of the CanSat and its solar-sensing function were successful.

A Hybrid Navigation System for Underwater Unmanned Vehicles, Using a Range Sonar (초음파 거리계를 이용한 무인잠수정의 수중 복합 항법시스템)

  • Lee, Pan-Mook;Jeon, Bong-Hwan;Kim, Sea-Moon;Lee, Chong-Moo;Lim, Yong-Kon;Yang, Seung-Il
    • Journal of Ocean Engineering and Technology
    • /
    • v.18 no.4 s.59
    • /
    • pp.33-39
    • /
    • 2004
  • This paper presents a hybrid underwater navigation system for unmanned underwater vehicles that uses an additional range sonar, where the navigation system is based on inertial and Doppler velocity sensors. Conventional underwater navigation systems are generally based on an inertial measurement unit (IMU) and a Doppler velocity log (DVL), accompanied by a magnetic compass and a depth sensor. Although conventional navigation systems update the bias errors of the inertial sensors and the scale effects of the DVL, the estimated position slowly drifts as time passes. This paper proposes a measurement model that uses the range sonar to improve the performance of the IMU-DVL navigation system for extended operation of underwater vehicles. The proposed navigation model includes the bias errors of the IMU, the scale effects of the DVL, and the bias error of the range sonar. An extended Kalman filter was adopted to propagate the error covariance, to update the measurement errors, and to correct the state equation when external measurements are available. To illustrate the effectiveness of the hybrid navigation system, simulations were conducted with the 6-d.o.f. equations of motion of an AUV in lawn-mowing survey mode.
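The range-sonar measurement update described above can be sketched as a single EKF correction step: the predicted range to a fixed transponder is linearized and used to pull the drifted position estimate back. The 2-D state, beacon position, and covariances below are illustrative assumptions, far simpler than the paper's full IMU-DVL error model.

```python
import math

def range_update(x, y, P, beacon, z, R=0.25):
    """EKF update of a 2-D position (x, y) from a range measurement z.

    P is the (assumed diagonal) position covariance [Pxx, Pyy];
    R is the range-sonar measurement noise variance (assumed value).
    """
    dx, dy = x - beacon[0], y - beacon[1]
    r_pred = math.hypot(dx, dy)           # predicted range h(x)
    H = [dx / r_pred, dy / r_pred]        # Jacobian of h w.r.t. (x, y)
    S = H[0] * H[0] * P[0] + H[1] * H[1] * P[1] + R  # innovation covariance
    K = [P[0] * H[0] / S, P[1] * H[1] / S]           # Kalman gain
    nu = z - r_pred                        # innovation
    x_new, y_new = x + K[0] * nu, y + K[1] * nu
    P_new = [(1 - K[0] * H[0]) * P[0], (1 - K[1] * H[1]) * P[1]]
    return x_new, y_new, P_new

# Drifted estimate at (10, 0) corrected by the measured range from a beacon at the origin.
x, y, P = range_update(10.0, 0.0, [4.0, 4.0], beacon=(0.0, 0.0), z=9.0)
```

The update both moves the estimate toward the measured range and shrinks the covariance along the beacon direction, which is how the extra sonar bounds the slow IMU-DVL drift.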

Threshold-based Pre-impact Fall Detection and its Validation Using the Real-world Elderly Dataset (임계값 기반 충격 전 낙상검출 및 실제 노인 데이터셋을 사용한 검증)

  • Dongkwon Kim;Seunghee Lee;Bummo Koo;Sumin Yang;Youngho Kim
    • Journal of Biomedical Engineering Research
    • /
    • v.44 no.6
    • /
    • pp.384-391
    • /
    • 2023
  • Among the elderly, falls account for a significant share of fatal injuries and deaths. Therefore, a pre-impact fall detection system is necessary for injury prevention. In this study, a robust threshold-based algorithm was proposed for pre-impact fall detection, reducing false positives in highly dynamic daily-living movements. The algorithm was validated using public datasets (KFall and FARSEEING) that include real-world falls of the elderly. A 6-axis IMU sensor (Movella Dot, Movella, Netherlands) was attached to S2 of 20 healthy adults (aged 22.0±1.9 years, height 164.9±5.9 cm, weight 61.4±17.1 kg) to measure 14 activities of daily living and 11 fall movements at a sampling frequency of 60 Hz. A 5 Hz low-pass filter was applied to the IMU data to remove high-frequency noise. The sum vector magnitude of acceleration and angular velocity, roll, pitch, and vertical velocity were extracted as the feature vector. The proposed algorithm showed an accuracy of 98.3%, a sensitivity of 100%, a specificity of 97.0%, and an average lead time of 311±99 ms on our experimental data. When evaluated on the KFall public dataset, the accuracy on adult data improved to 99.5% compared with recent studies, and a specificity of 100% was achieved on the elderly data. When evaluated on the FARSEEING real-world elderly fall data without separate segmentation, the algorithm showed a sensitivity of 71.4% (5/7).
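The threshold logic described above can be sketched with two of the listed features: the acceleration sum vector magnitude and the vertical velocity. The threshold values below are illustrative assumptions, not the paper's tuned parameters.

```python
import math

def sum_vector_magnitude(ax, ay, az):
    """Sum vector magnitude of a 3-axis acceleration sample, m/s^2."""
    return math.sqrt(ax * ax + ay * ay + az * az)

def is_pre_impact_fall(acc, v_vertical, acc_thresh=0.6, vel_thresh=-1.3):
    """Flag the early free-fall phase: low acceleration magnitude (in g)
    combined with downward vertical velocity (m/s). Thresholds are
    hypothetical values chosen for illustration."""
    svm_g = sum_vector_magnitude(*acc) / 9.81
    return svm_g < acc_thresh and v_vertical < vel_thresh

# Quiet standing: about 1 g with no vertical velocity -> no alarm.
standing = is_pre_impact_fall((0.0, 0.0, 9.81), 0.0)
# Early fall phase: near free fall with downward velocity -> alarm.
falling = is_pre_impact_fall((0.5, 0.3, 2.0), -1.8)
```

Requiring both conditions at once is what suppresses false positives from brisk daily-living movements, where acceleration may dip briefly without a matching downward velocity.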

Test of Vision Stabilizer for Unmanned Vehicle Using Virtual Environment and 6 Axis Motion Simulator (가상 환경 및 6축 모션 시뮬레이터를 이용한 무인차량 영상 안정화 장치 시험)

  • Kim, Sunwoo;Ki, Sun-Ock;Kim, Sung-Soo
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.39 no.2
    • /
    • pp.227-233
    • /
    • 2015
  • In this study, an indoor test environment was developed for studying the vision stabilizer of an unmanned vehicle, using a virtual environment and a 6-axis motion simulator. The real driving environment was replaced by a virtual environment based on the Aberdeen Proving Ground bump test course for military tank testing. The vehicle motion was reproduced by a 6-axis motion simulator. Virtual reality driving courses were displayed in front of the vision stabilizer, which was located on the top of the motion simulator. The performance of the stabilizer was investigated by checking the image of the camera, and the pitch and roll angles of the stabilizer captured by the IMU sensor of the camera.