• Title/Summary/Keyword: Sensor fusion

Design and Performance Evaluation of a Complementary Filter for Inverted Pendulum Control with Inertial Sensors (관성센서를 이용한 도립진자의 제어를 위한 상보필터 설계 및 성능평가)

  • Nakashima, Toshitaka;Chang, Mun-Che;Hong, Suk-Kyo
    • Proceedings of the KIEE Conference / 2004.11c / pp.544-546 / 2004
  • This paper designs and evaluates a complementary filter for the fusion of inertial sensor signals. Specifically, the designed filter is applied to inverted pendulum control, where the pendulum's angle is obtained from low-cost tilt and gyroscope sensors instead of an optical encoder. The complementary filter under consideration is a conventional one consisting of low- and high-pass filters. However, to improve the filter's performance on the gyroscope, we use an integrator in the filter's outer loop. Frequency responses are obtained with both the tilt and gyroscope sensors, and based on these results we determine appropriate parameter values for the filter. The performance of the designed complementary filter is evaluated by applying it to inverted pendulum control. Experiments show that the performance of the designed filter is comparable to that of an optical encoder and that low-cost inertial sensors can be used for inverted pendulum control with the help of sensor fusion.
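The blend of a low-passed tilt angle and a high-passed (integrated) gyro rate described in this abstract can be sketched as follows. This is a generic complementary filter, not the authors' exact design (their outer-loop integrator and tuned parameters are omitted); the sample time `dt` and weight `alpha` are illustrative assumptions.

```python
import numpy as np

def complementary_filter(tilt_deg, gyro_dps, dt=0.01, alpha=0.98):
    """Fuse a noisy tilt (accelerometer-style) angle with a drifting gyro rate.

    alpha close to 1 trusts the integrated gyro (high-pass path);
    (1 - alpha) lets the tilt sensor (low-pass path) correct the drift.
    """
    angle = tilt_deg[0]
    out = []
    for tilt, rate in zip(tilt_deg, gyro_dps):
        # high-pass: integrate the gyro rate; low-pass: blend in the tilt angle
        angle = alpha * (angle + rate * dt) + (1 - alpha) * tilt
        out.append(angle)
    return np.array(out)

# Synthetic check (assumed scenario): constant 10-degree angle,
# a gyro with a 0.5 deg/s bias, and a noisy but unbiased tilt sensor.
n = 2000
rng = np.random.default_rng(0)
tilt = 10.0 + rng.normal(0.0, 2.0, n)   # noisy tilt measurements
gyro = np.full(n, 0.5)                  # biased rate: pure integration would drift
est = complementary_filter(tilt, gyro)
```

Integrating the biased gyro alone would drift by 10 degrees over this 20 s run; the tilt path pins the estimate near the true angle while the gyro path suppresses the tilt noise.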
Development of System For Cell Fusion Detection (세포 전기 융합 감지 장치에 관한 연구)

  • Kwon, Ki-Jin;Kim, Min-Soo;Park, Se-Kwang
    • Proceedings of the KIEE Conference / 1994.07b / pp.1336-1338 / 1994
  • A cell fusion device is an apparatus that electrically fuses two types of cells, fed from their respective micropumps into a fusion chamber, by applying electric pulses. A sensor that detects cells flowing along the passage is required to control the timing of the pulses applied to the cells and the injection of cells fed from the inlet to the micropump. There are two methods for detecting flowing cells: an optical method and an impedance method. The output difference of the optical sensor is about 426 mV at 805 nm wavelength and about 37 mV at 665 nm wavelength. With the impedance method, the sensor output is 132.33 mV at the middle point and 117.10 mV at the edge point of the channel. Experimental results show that the optimal frequency range of the sensor output is from 50 Hz to 400 Hz.
Fuzzy Neural Network Based Sensor Fusion and Its Application to Mobile Robot in Intelligent Robotic Space

  • Jin, Tae-Seok;Lee, Min-Jung;Hashimoto, Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems / v.6 no.4 / pp.293-298 / 2006
  • In this paper, a sensor fusion based robot navigation method for the autonomous control of a miniature human interaction robot is presented. The method blends the optimality of the Fuzzy Neural Network (FNN) based control algorithm with the knowledge-representation and learning capabilities of the networked Intelligent Robotic Space (IRS). States of the robot and the IR space, for example the distance between the mobile robot and obstacles and the velocity of the mobile robot, are used as inputs to the fuzzy logic controller. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a sensor fusion technique is introduced in which the sensory data of ultrasonic sensors and a vision sensor are fused in the identification process. Preliminary experiments and results demonstrate the merit of the introduced navigation control algorithm.
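The combination of goal-approach and obstacle-avoidance rules can be illustrated with a minimal, hypothetical blending scheme: a fuzzy membership on obstacle distance weights an avoidance command against a goal command. This is only a sketch of the rule-combination idea, not the paper's FNN controller; the membership bounds and function names are invented for illustration.

```python
def near_membership(d, d_min=0.2, d_max=1.5):
    """Degree (in [0, 1]) to which an obstacle at distance d is 'near'.

    Linear ramp: fully 'near' inside d_min, not 'near' beyond d_max.
    """
    if d <= d_min:
        return 1.0
    if d >= d_max:
        return 0.0
    return (d_max - d) / (d_max - d_min)

def blended_steering(goal_cmd, avoid_cmd, obstacle_dist):
    """Weighted combination of goal-approach and obstacle-avoidance rules:
    the nearer the obstacle, the more the avoidance rule dominates."""
    w = near_membership(obstacle_dist)
    return w * avoid_cmd + (1.0 - w) * goal_cmd
```

With an obstacle very close, the output follows the avoidance command; far away, it follows the goal command; in between, the two are smoothly mixed, which is the defuzzified effect of firing both rule groups at once.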

The Improvement of Target Motion Analysis(TMA) for Submarine with Data Fusion (정보융합 기법을 활용한 잠수함 표적기동분석 성능향상 연구)

  • Lim, Young-Taek;Ko, Soon-Ju;Song, Taek-Lyul
    • Journal of the Korea Institute of Military Science and Technology / v.12 no.6 / pp.697-703 / 2009
  • Target Motion Analysis (TMA) estimates a target's position, velocity, and course using a passive sonar system with bearing-only measurements. In this paper, we apply TMA algorithms with Multi-Sensor Data Fusion (MSDF) to a submarine and determine the best TMA algorithm for a submarine through a series of computer simulation runs.
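Bearings-only TMA as summarized above admits a classic pseudo-linear least-squares formulation for a constant-velocity target. The paper compares several TMA algorithms; the sketch below is merely one textbook variant, with a synthetic maneuvering-observer scenario (an observer maneuver is required for observability) whose numbers are assumptions.

```python
import numpy as np

def tma_pseudolinear(times, bearings, obs_xy):
    """Pseudo-linear least-squares bearings-only TMA.

    Solves for (x0, y0, vx, vy) of a constant-velocity target from
    bearings (radians, measured from the x-axis) taken by a maneuvering
    observer at positions obs_xy[k].
    """
    A, b = [], []
    for t, th, (ox, oy) in zip(times, bearings, obs_xy):
        s, c = np.sin(th), np.cos(th)
        # Constraint: s*(x0 + vx*t - ox) - c*(y0 + vy*t - oy) = 0,
        # i.e. the line of sight at time t passes through the target.
        A.append([s, -c, s * t, -c * t])
        b.append(s * ox - c * oy)
    x, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
    return x  # x0, y0, vx, vy

# Synthetic scenario: weaving observer, constant-velocity target.
ts = np.arange(0.0, 60.0, 1.0)
obs = np.column_stack([2.0 * ts, 10.0 * np.sin(0.1 * ts)])
truth = np.array([500.0, 300.0, -3.0, 1.0])              # x0, y0, vx, vy
tgt = np.column_stack([truth[0] + truth[2] * ts, truth[1] + truth[3] * ts])
brg = np.arctan2(tgt[:, 1] - obs[:, 1], tgt[:, 0] - obs[:, 0])
est = tma_pseudolinear(ts, brg, obs)
```

With noise-free bearings and a maneuvering observer, the least-squares solution recovers the true initial state; with noisy bearings the pseudo-linear estimator is known to be biased, which is one reason papers like this one compare alternative TMA algorithms.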

Performance enhancement of launch vehicle tracking using GPS-based multiple radar bias estimation and sensor fusion (GPS 기반 추적레이더 실시간 바이어스 추정 및 비동기 정보융합을 통한 발사체 추적 성능 개선)

  • Song, Ha-Ryong
    • Journal of Korea Society of Industrial Information Systems / v.20 no.6 / pp.47-56 / 2015
  • In a multi-sensor system, sensor registration errors such as sensor biases must be corrected so that the individual sensor data are expressed in a common reference frame. If the registration process is not properly executed, large tracking errors or the formation of multiple tracks on the same target can occur. For a launch vehicle tracking system in particular, once every observation lies in the same reference frame, the fused trajectory can serve as the best track for slaving data. Hence, this paper describes on-line bias estimation/correction and asynchronous sensor fusion for launch vehicle tracking. The bias estimation architecture is designed around a pseudo bias measurement derived from the error observed between GPS and radar measurements. Asynchronous sensor fusion is then applied to enhance tracking performance.
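The pseudo-bias-measurement idea, in which the difference between a radar measurement and the GPS-derived reference of the same quantity directly observes the radar bias, can be sketched with a scalar recursive (Kalman-style) estimator. This is a simplified illustration, not the paper's architecture; the range-only setting and the variances `q` and `r` are assumptions.

```python
import numpy as np

def estimate_bias(radar_meas, gps_ref, q=1e-6, r=0.25):
    """Recursive scalar-Kalman estimate of a (nearly) constant radar bias.

    The pseudo bias measurement is the difference between the radar
    measurement and the GPS-derived reference of the same quantity.
    """
    b, p = 0.0, 1.0                    # bias estimate and its variance
    for z_radar, z_gps in zip(radar_meas, gps_ref):
        pseudo = z_radar - z_gps       # pseudo bias measurement
        p += q                         # bias modeled as almost constant
        k = p / (p + r)                # Kalman gain
        b += k * (pseudo - b)
        p *= (1.0 - k)
    return b

# Synthetic check: radar ranges = GPS-derived ranges + bias + noise.
rng = np.random.default_rng(1)
true_bias = 0.8
gps = rng.uniform(1000.0, 2000.0, 500)
radar = gps + true_bias + rng.normal(0.0, 0.5, 500)
b_hat = estimate_bias(radar, gps)
```

Once `b_hat` has converged, subtracting it from subsequent radar measurements registers them in the GPS frame before fusion, which is the correction step the abstract refers to.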

Pose Tracking of Moving Sensor using Monocular Camera and IMU Sensor

  • Jung, Sukwoo;Park, Seho;Lee, KyungTaek
    • KSII Transactions on Internet and Information Systems (TIIS) / v.15 no.8 / pp.3011-3024 / 2021
  • Pose estimation of a sensor is an important issue in many applications such as robotics, navigation, tracking, and Augmented Reality. This paper proposes a visual-inertial integration system appropriate for dynamically moving sensors. The orientation estimated from an Inertial Measurement Unit (IMU) is used to calculate the essential matrix based on the intrinsic parameters of the camera. Using epipolar geometry, outliers in the feature point matching are eliminated from the image sequences; the IMU helps eliminate erroneous point matches in images of dynamic scenes at an early stage. After the outliers are removed, the selected feature point matches are used to calculate a precise fundamental matrix, and finally the pose of the sensor is estimated from the matching relations. The proposed procedure was implemented, tested, and compared with existing methods. Experimental results show the effectiveness of the proposed technique.
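The screening step, forming an essential matrix from an externally supplied rotation (here, the IMU orientation) and rejecting matches that violate the epipolar constraint, can be sketched on synthetic data. The scene, pose, and threshold below are assumptions; in the actual system the rotation comes from the IMU and the translation is unknown, so this only illustrates the constraint itself.

```python
import numpy as np

def skew(t):
    """Cross-product matrix [t]_x such that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def epipolar_residuals(x1, x2, R, t):
    """|x2^T E x1| for normalized image points, with E = [t]_x R."""
    E = skew(t) @ R
    return np.abs(np.einsum('ij,jk,ik->i', x2, E, x1))

# Synthetic two-view geometry: camera 1 at origin; camera 2 related by (R, t).
th = 0.1
R = np.array([[np.cos(th), 0.0, np.sin(th)],
              [0.0, 1.0, 0.0],
              [-np.sin(th), 0.0, np.cos(th)]])
t = np.array([1.0, 0.0, 0.2])
rng = np.random.default_rng(2)
X1 = np.column_stack([rng.uniform(-2, 2, 50),
                      rng.uniform(-2, 2, 50),
                      rng.uniform(4, 8, 50)])          # 3D points, camera-1 frame
X2 = X1 @ R.T + t                                      # same points, camera-2 frame
x1 = np.hstack([X1[:, :2] / X1[:, 2:3], np.ones((50, 1))])  # normalized coords
x2 = np.hstack([X2[:, :2] / X2[:, 2:3], np.ones((50, 1))])
x2[0] = [0.9, -0.7, 1.0]   # corrupt one correspondence to simulate an outlier
res = epipolar_residuals(x1, x2, R, t)
```

Correct matches satisfy the constraint to numerical precision, while the corrupted match stands out by orders of magnitude, so a simple threshold on the residual separates inliers from outliers before the precise fundamental-matrix estimation.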

Short Range Target Tracking Based on Data Fusion Method Using Asynchronous Dissimilar Sensors (비동기 이종 센서를 이용한 데이터 융합기반 근거리 표적 추적기법)

  • Lee, Eui-Hyuk
    • Journal of the Institute of Electronics and Information Engineers / v.49 no.9 / pp.335-343 / 2012
  • This paper presents a target tracking algorithm for the fusion of radar and infrared (IR) sensor measurement data. Fusion methods based on the Kalman filter generally assume that the data obtained by the radar and IR sensors are synchronized, which greatly limits their application to real systems. The key point of the proposed algorithm is that the two asynchronous, dissimilar data streams are fused by compensating for the time difference between the measurements using the radar's ranges and track state vectors. The proposed fusion algorithm is evaluated via computer simulation against existing track fusion and measurement fusion methods.
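The time-difference compensation at the heart of such schemes, predicting the radar track state to the IR measurement time so that the asynchronous measurements refer to a common instant before fusion, can be sketched with a constant-velocity propagation. The state layout and timestamps below are illustrative assumptions, not the paper's exact model.

```python
import numpy as np

def propagate_cv(x, dt):
    """Propagate a constant-velocity state [px, py, vx, vy] forward by dt."""
    F = np.array([[1.0, 0.0, dt, 0.0],
                  [0.0, 1.0, 0.0, dt],
                  [0.0, 0.0, 1.0, 0.0],
                  [0.0, 0.0, 0.0, 1.0]])
    return F @ x

def align_ir_with_radar(ir_time, radar_state, radar_time):
    """Predict the radar track state to the IR measurement time so the
    asynchronous IR bearing can be fused in a common time frame."""
    return propagate_cv(radar_state, ir_time - radar_time)

# Radar track state at t = 10.0 s; IR bearing arrives 0.3 s later.
x_radar = np.array([100.0, 50.0, -2.0, 1.0])
x_at_ir = align_ir_with_radar(10.3, x_radar, 10.0)
```

In a full filter the covariance would be propagated with the same transition matrix, and the radar range would convert the IR bearing into a comparable position measurement before the update.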

Scaling Attack Method for Misalignment Error of Camera-LiDAR Calibration Model (카메라-라이다 융합 모델의 오류 유발을 위한 스케일링 공격 방법)

  • Yi-ji Im;Dae-seon Choi
    • Journal of the Korea Institute of Information Security & Cryptology / v.33 no.6 / pp.1099-1110 / 2023
  • The recognition systems of autonomous driving and robot navigation perform vision tasks such as object recognition, tracking, and lane detection after multi-sensor fusion to improve performance. Research on deep learning models based on the fusion of a camera and a LiDAR sensor is currently being actively conducted. However, deep learning models are vulnerable to adversarial attacks that modulate the input data. Existing attacks on multi-sensor-based autonomous driving recognition systems focus on suppressing obstacle detection by lowering the confidence score of the object recognition model, but they are limited in that an attack is possible only on the target model. For attacks on the sensor fusion stage, errors can cascade into the vision tasks performed after fusion, and this risk needs to be considered. In addition, an attack on LiDAR point cloud data, which is difficult to judge visually, makes it hard to determine whether an attack has occurred. In this study, we propose an image-scaling-based attack method that reduces the accuracy of LCCNet, a camera-LiDAR calibration model. The proposed method performs a scaling attack on the input LiDAR points. Attack performance experiments with the scaling algorithm at various sizes induced fusion errors of more than 77% on average.