• Title/Summary/Keyword: Sensor Fusion System


Navigation System of UUV Using Multi-Sensor Fusion-Based EKF (융합된 다중 센서와 EKF 기반의 무인잠수정의 항법시스템 설계)

  • Park, Young-Sik; Choi, Won-Seok; Han, Seong-Ik; Lee, Jang-Myung
    • Journal of Institute of Control, Robotics and Systems / v.22 no.7 / pp.562-569 / 2016
  • This paper proposes a navigation system with a robust localization method for an unmanned underwater vehicle. For robust localization with an IMU (Inertial Measurement Unit), a DVL (Doppler Velocity Log), and depth sensors, an EKF (Extended Kalman Filter) is used to fuse the multiple nonlinear sensor measurements. Note that GPS (Global Positioning System), which provides the absolute coordinates of the vehicle, cannot be used underwater. The DVL is used to measure the relative velocity of the underwater vehicle: it exploits the Doppler effect, in which the observed sound frequency shifts with the relative velocity between a sound source and an observer. While the vehicle is moving, the trajectory toward a target position can be recorded by the sensors attached to the vehicle. The performance of the proposed navigation system has been verified through real experiments in which the vehicle reached a target position using the IMU as the primary sensor and the DVL as the secondary sensor.
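
As a rough illustration of the EKF fusion loop this abstract describes, the sketch below assumes a simple position-plus-yaw state [x, y, z, yaw], a DVL body-frame velocity, an IMU yaw rate, and a scalar depth measurement; the paper's actual state vector, motion model, and noise covariances are not given here, so all of those choices are assumptions.

```python
import numpy as np

# Minimal EKF sketch for IMU + DVL + depth fusion (illustrative only).
# State: [x, y, z, yaw]; the paper's actual state and noise models are not given.

def predict(x, P, v_body, yaw_rate, dt, Q):
    """Propagate the state with DVL body-frame velocity and IMU yaw rate."""
    px, py, pz, yaw = x
    vx, vy, vz = v_body
    x_new = np.array([
        px + (vx * np.cos(yaw) - vy * np.sin(yaw)) * dt,
        py + (vx * np.sin(yaw) + vy * np.cos(yaw)) * dt,
        pz + vz * dt,
        yaw + yaw_rate * dt,
    ])
    # Jacobian of the motion model with respect to the state.
    F = np.eye(4)
    F[0, 3] = (-vx * np.sin(yaw) - vy * np.cos(yaw)) * dt
    F[1, 3] = ( vx * np.cos(yaw) - vy * np.sin(yaw)) * dt
    return x_new, F @ P @ F.T + Q

def update_depth(x, P, depth_meas, r_depth):
    """Correct the depth component with a pressure/depth sensor reading."""
    H = np.array([[0.0, 0.0, 1.0, 0.0]])   # measurement picks out z
    y = depth_meas - H @ x                  # innovation
    S = H @ P @ H.T + r_depth               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)          # Kalman gain
    return x + (K @ y).ravel(), (np.eye(4) - K @ H) @ P
```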

A Study on Odometry Error Compensation using Multisensor fusion for Mobile Robot Navigation (멀티센서 융합을 이용한 자율이동로봇의 주행기록계 에러 보상에 관한 연구)

  • Song, Sin-Woo; Park, Mun-Soo; Hong, Suk-Kyo
    • Proceedings of the KIEE Conference / 2001.11c / pp.288-291 / 2001
  • This paper presents effective odometry error compensation using multisensor fusion for accurate positioning of a mobile robot during navigation. During obstacle avoidance and wall following, position estimates obtained by odometry alone become unrealistic and useless because of accumulated errors. To measure the position and heading of the mobile robot accurately, an odometry sensor, a gyroscope, and an azimuth sensor are mounted on the robot, and a complementary filter is designed and implemented to compensate for the complementary drawbacks of each sensor and fuse their information. The experimental results show that the multisensor fusion system estimates the position and heading of the mobile robot more accurately than odometry alone.
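
The complementary-filter idea in this abstract can be sketched as below, assuming the gyroscope supplies a yaw rate and the azimuth sensor supplies an absolute heading; the blending constant alpha is an assumed value, not one reported in the paper.

```python
import numpy as np

def fuse_heading(prev_heading, gyro_rate, azimuth_meas, dt, alpha=0.98):
    """Complementary filter for heading: trust the integrated gyro for short-term
    changes and the azimuth sensor for long-term drift correction.
    alpha=0.98 is an illustrative assumption, not the paper's value."""
    gyro_heading = prev_heading + gyro_rate * dt            # fast but drift-prone
    fused = alpha * gyro_heading + (1.0 - alpha) * azimuth_meas
    return np.arctan2(np.sin(fused), np.cos(fused))          # wrap to [-pi, pi]
```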


Localization and Control of an Outdoor Mobile Robot Based on an Estimator with Sensor Fusion (센서 융합기반의 추측항법을 통한 야지 주행 이동로봇의 위치 추정 및 제어)

  • Jeon, Sang Woon; Jeong, Seul
    • IEMEK Journal of Embedded Systems and Applications / v.4 no.2 / pp.69-78 / 2009
  • Localization is a very important technique for a mobile robot navigating in an outdoor environment. In this paper, the development of a sensor fusion algorithm for controlling mobile robots in outdoor environments is presented. A multi-sensor dead-reckoning subsystem is established through optimal filtering: heading readings from a magnetic compass, a rate gyro, and two encoders mounted on the robot wheels are first fused, and the dead-reckoned location is computed from them. These data and the position data provided by a global sensing system are then fused by means of an extended Kalman filter. The proposed algorithm is validated in simulation studies in which the mobile robot is driven by a backstepping controller and by a cascaded controller, and the performance of the two controllers is compared.
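
A minimal sketch of the dead-reckoning step described above, assuming differential-drive kinematics with encoder wheel speeds and an already-fused heading; the simple convex blend with the global fix only stands in for the paper's extended Kalman filter and is purely illustrative.

```python
import numpy as np

def dead_reckon(pose, v_left, v_right, heading, dt):
    """Advance the (x, y, theta) pose from encoder wheel speeds and a fused heading.
    Differential-drive kinematics are assumed; the paper's exact model is not given."""
    x, y, _ = pose
    v = 0.5 * (v_left + v_right)              # forward speed from the two encoders
    x += v * np.cos(heading) * dt
    y += v * np.sin(heading) * dt
    return np.array([x, y, heading])

def blend_with_global_fix(pose_dr, pose_global, w=0.3):
    """Convex blend of the dead-reckoned pose with a global-sensing fix.
    The paper fuses these with an EKF; the fixed weight w is a simplification."""
    return (1.0 - w) * pose_dr + w * pose_global
```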


Development of Multi-purpose Smart Sensor Using Presence Sensor (재실 감지 센서를 이용한 다용도 스마트 센서 개발)

  • Cha, Joo-Heon; Yong, Heong
    • Journal of the Korean Society of Manufacturing Technology Engineers / v.24 no.1 / pp.103-109 / 2015
  • This paper introduces a multi-purpose smart fusion sensor. Normally, this type of sensor can contribute to energy savings specifically related to lighting and heating/air conditioning systems by detecting individuals in an office building. If a fire occurs, the sensor can provide information regarding the presence and location of residents in the building to a management center. The system consists of four sensors: a thermopile sensor for detecting heat energy, an ultrasonic sensor for measuring the distance of objects from the sensor, a fire detection sensor, and a passive infrared sensor for detecting temperature change. The system has a wireless communication module to provide the management center with control information for lighting and heating/air conditioning systems. We have also demonstrated the usefulness of the proposed system by applying it to a real environment.
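
The paper does not spell out its decision rules, but a hypothetical combination of the four sensors into an occupancy/fire state might look like the sketch below; all thresholds, field names, and the rule structure are assumptions.

```python
from dataclasses import dataclass

@dataclass
class Readings:
    thermopile_temp_c: float   # radiated heat in the monitored zone
    ultrasonic_range_m: float  # distance to the nearest object
    pir_triggered: bool        # passive-infrared motion / temperature-change flag
    fire_detected: bool        # output of the fire detection sensor

def classify(r: Readings, empty_range_m: float = 3.0, body_temp_c: float = 30.0) -> str:
    """Combine the four sensors into a room state. Thresholds are illustrative
    assumptions; the result would drive lighting/HVAC control or a fire alert."""
    if r.fire_detected:
        return "fire_alert"    # forward presence/location info to the management center
    occupied = r.pir_triggered or (
        r.ultrasonic_range_m < empty_range_m and r.thermopile_temp_c > body_temp_c)
    return "occupied" if occupied else "vacant"
```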

Design of attitude estimation for RC Helicopter by sensor fusion (센서융합에 의한 모형헬리콥터의 자세 추정기 설계)

  • Jung, Won-Jae; Park, Moon-Soo; Lee, Kwang-Won
    • Proceedings of the KIEE Conference / 2001.07d / pp.2317-2319 / 2001
  • This paper presents a sensor fusion algorithm for an RC helicopter that uses a complementary filter. To measure the attitude angles of the helicopter, three rate gyroscopes and a 3-axis accelerometer are mounted on the helicopter, and their signals are passed through a complementary filter to produce the attitude estimates. Experiments show that the designed system is effective for attitude estimation.
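
A minimal complementary-filter step for roll and pitch, of the kind this abstract describes, assuming body-rate gyros and a 3-axis accelerometer; the blending constant is an assumption.

```python
import numpy as np

def attitude_step(roll, pitch, gyro, accel, dt, alpha=0.98):
    """One complementary-filter step. gyro = (p, q, r) body rates [rad/s],
    accel = (ax, ay, az) specific force [m/s^2]. alpha=0.98 is illustrative."""
    p, q, _ = gyro
    ax, ay, az = accel
    # Gyro integration: responsive but drifts over time.
    roll_g  = roll  + p * dt
    pitch_g = pitch + q * dt
    # Accelerometer tilt: noisy but drift-free (valid when acceleration ~ gravity).
    roll_a  = np.arctan2(ay, az)
    pitch_a = np.arctan2(-ax, np.sqrt(ay**2 + az**2))
    return (alpha * roll_g  + (1 - alpha) * roll_a,
            alpha * pitch_g + (1 - alpha) * pitch_a)
```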


Intelligent Hexapod Mobile Robot using Image Processing and Sensor Fusion (영상처리와 센서융합을 활용한 지능형 6족 이동 로봇)

  • Lee, Sang-Mu; Kim, Sang-Hoon
    • Journal of Institute of Control, Robotics and Systems / v.15 no.4 / pp.365-371 / 2009
  • An intelligent hexapod mobile robot with various types of sensors and a wireless camera is introduced. We show that this mobile robot can detect objects well by combining the results of active sensors and an image processing algorithm. First, to detect objects, active sensors such as infrared sensors and ultrasonic sensors are employed together, and the distance between the object and the robot is calculated in real time from the sensor outputs; the difference between the measured and calculated values is less than 5%. This paper also suggests an effective visual detection system for moving objects with specified color and motion information. The proposed method includes an object extraction and definition process that uses color transformation and AWUPC computation to decide the existence of a moving object. Weighting values are applied to the results from the sensors and the camera, and the results are combined into a single value that represents the probability of an object within the limited distance. The sensor fusion technique improves the detection rate by at least 7% compared with using an individual sensor.
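
An illustrative weighted combination of the kind described above, mapping IR and ultrasonic ranges plus a vision detection score to a single object probability; the weights, maximum range, and functional form are assumptions rather than the paper's values.

```python
def object_probability(ir_range_m, sonar_range_m, vision_score,
                       max_range_m=1.0, w_ir=0.3, w_sonar=0.3, w_vision=0.4):
    """Map each cue to [0, 1] (closer range or stronger vision score means an object
    within the limited distance is more likely) and combine with fixed weights.
    All weights and the 1 m limit are assumed values for illustration."""
    ir_score    = max(0.0, 1.0 - ir_range_m / max_range_m)
    sonar_score = max(0.0, 1.0 - sonar_range_m / max_range_m)
    return w_ir * ir_score + w_sonar * sonar_score + w_vision * vision_score
```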

Characterization of Magnetic Abrasive Finishing Using Sensor Fusion (센서 융합을 이용한 MAF 공정 특성 분석)

  • Kim, Seol-Bim; Ahn, Byoung-Woon; Lee, Seoung-Hwan
    • Transactions of the Korean Society of Mechanical Engineers A / v.33 no.5 / pp.514-520 / 2009
  • In configuring an automated polishing system, a monitoring scheme to estimate the surface roughness is necessary. In this study, a precision polishing process, magnetic abrasive finishing (MAF), along with an in-process monitoring setup was investigated. A magnetic tool attached to a CNC machining center polishes the surfaces of Stavax (S136) die steel workpieces. During the finishing experiments, both AE signals and force signals were sampled and analyzed. The finishing results show that MAF has nano-scale finishing capability (up to 8 nm in surface roughness) and that the sensor signals correlate strongly with parameters such as the gap between the tool and the workpiece, the feed rate, and the abrasive size. In addition, the signals were used as input parameters of artificial neural networks to predict the generated surface roughness. Among the three networks constructed (AE rms input, force input, and AE+force input), the ANN with sensor fusion (AE+force) produced the most stable results. These results show that the proposed sensor fusion scheme is appropriate for monitoring and predicting this nano-scale precision finishing process.
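
A minimal stand-in for the sensor-fusion network described above, using scikit-learn's MLPRegressor on AE-rms and force features; the layer size, library choice, and all numerical values are assumptions and placeholders, not data from the paper.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor  # stand-in for the paper's ANN

# Each row: [AE_rms, mean_force]; target: measured surface roughness in nm.
# The values below are made-up placeholders for illustration only.
X = np.array([[0.12, 1.8], [0.09, 1.5], [0.20, 2.4], [0.07, 1.2]])
y = np.array([22.0, 15.0, 35.0, 9.0])

model = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
model.fit(X, y)                       # train on fused AE + force features
print(model.predict([[0.10, 1.6]]))   # predicted roughness for a new AE/force reading
```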

Global Map Building and Navigation of Mobile Robot Based on Ultrasonic Sensor Data Fusion

  • Kang, Shin-Chul; Jin, Tae-Seok
    • International Journal of Fuzzy Logic and Intelligent Systems / v.7 no.3 / pp.198-204 / 2007
  • In mobile robotics, ultrasonic sensors have become standard devices for collision avoidance, and their applicability to map building and navigation has been exploited in recent years. This work is a preliminary step toward developing a multi-purpose autonomous carrier mobile robot to transport trolleys or heavy goods and to serve as a robotic nursing assistant in hospital wards. The aim of this paper is to present the use of multi-sensor data fusion, such as ultrasonic and IR sensors, for mobile robot navigation, and to present an experimental mobile robot designed to operate autonomously within both indoor and outdoor environments. Global map building based on multi-sensor data fusion is applied to recognize an obstacle-free path from a starting position to a known goal region while simultaneously building a map of straight-line-segment geometric primitives, obtained by applying the Hough transform to the actual, noisy sonar data. We describe the robot system architecture designed and implemented in this study and give a short review of the Hough transform. Experimental results with a real Pioneer DX2 mobile robot demonstrate the effectiveness of the discussed methods.
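
A compact sketch of a Hough transform over 2-D sonar hit points, as used above for extracting straight-line primitives; the bin resolutions and returning only the strongest line are simplifications of what a full map builder would do.

```python
import numpy as np

def hough_lines(points, rho_res=0.05, theta_bins=180):
    """Vote in (rho, theta) space for 2-D sonar hit points (x, y) in metres and
    return the strongest line. Bin resolutions are illustrative assumptions."""
    pts = np.asarray(points, dtype=float)
    thetas = np.linspace(0.0, np.pi, theta_bins, endpoint=False)
    rho_max = np.hypot(*np.abs(pts).max(axis=0)) + rho_res
    n_rho = int(2 * rho_max / rho_res) + 1
    acc = np.zeros((n_rho, theta_bins), dtype=int)
    for x, y in pts:
        rhos = x * np.cos(thetas) + y * np.sin(thetas)           # rho for every theta
        idx = np.round((rhos + rho_max) / rho_res).astype(int)   # rho bin indices
        acc[idx, np.arange(theta_bins)] += 1                     # cast one vote per theta
    r_i, t_i = np.unravel_index(acc.argmax(), acc.shape)
    return r_i * rho_res - rho_max, thetas[t_i]                  # best (rho, theta)
```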

A Fusion Algorithm considering Error Characteristics of the Multi-Sensor (다중센서 오차특성을 고려한 융합 알고리즘)

  • Hyun, Dae-Hwan; Yoon, Hee-Byung
    • Journal of KIISE: Computer Systems and Theory / v.36 no.4 / pp.274-282 / 2009
  • Various location tracking sensors, such as GPS, INS, radar, and optical equipment, are used for tracking moving targets. To track moving targets effectively, an effective fusion method for these heterogeneous devices is needed. Previous studies treated the estimates of each sensor as different models and fused them while considering the different error characteristics of the sensors, in order to improve tracking performance with a heterogeneous multi-sensor system. However, when one sensor's error increases sharply, the errors of the estimates fused from the other sensors also grow, and approaches that replace each sensor's estimate according to the Sensor Probability could not be applied in real time. In this study, the Sensor Probability is obtained by comparing the RMSE (Root Mean Square Error) of the difference between the updated and measured values of each sensor's Kalman filter, and the step of substituting the combined values back into each sensor's Kalman filter inputs is removed. This improves both the real-time applicability of the estimated sensor values and the tracking performance in regions where a sensor's performance drops rapidly. The proposed algorithm incorporates each sensor's error characteristic as a conditional probability value and achieves greater accuracy by performing track fusion weighted toward the most reliable sensors. The trajectory of a UAV is generated in an experiment, and a performance analysis is conducted against other fusion algorithms.
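
An illustrative track-fusion step along the lines of this abstract: each sensor's weight (its "Sensor Probability") is derived from the RMSE of its recent Kalman-filter innovations (updated minus measured values), so the most reliable sensor dominates the fused track; the inverse-RMSE weighting form and window handling are assumptions, not the paper's exact formulation.

```python
import numpy as np

def fuse_tracks(estimates, innovations, eps=1e-9):
    """Fuse per-sensor state estimates with weights derived from innovation RMSE.
    estimates:   list of state vectors, one per sensor (e.g. [x, y, z]).
    innovations: list of arrays of recent innovations (updated - measured) per sensor.
    A smaller innovation RMSE is treated as a more reliable sensor; normalizing the
    inverse RMSEs gives a probability-like weight (assumed form for illustration)."""
    rmse = np.array([np.sqrt(np.mean(np.square(v))) for v in innovations])
    prob = 1.0 / (rmse + eps)
    prob /= prob.sum()                                 # per-sensor weights sum to 1
    fused = np.average(np.asarray(estimates, dtype=float), axis=0, weights=prob)
    return fused, prob
```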