• Title/Summary/Keyword: Sonar based SLAM


EKF-based SLAM Using Sonar Salient Feature and Line Feature for Mobile Robots (이동로봇을 위한 Sonar Salient 형상과 선 형상을 이용한 EKF 기반의 SLAM)

  • Heo, Young-Jin;Lim, Jong-Hwan;Lee, Se-Jin
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.28 no.10
    • /
    • pp.1174-1180
    • /
    • 2011
  • Not all line or point features that sonar sensors can extract in cluttered home environments are useful for simultaneous localization and mapping (SLAM), because their ambiguity makes it difficult to determine their correspondence with previously registered features. Confusing line and point features in cluttered environments leads to poor SLAM performance. We introduce a sonar feature structure suitable for cluttered environments and an extended Kalman filter (EKF)-based SLAM scheme. Each reliable line feature is expressed by its end points and incorporated into EKF SLAM to overcome geometric limits and maintain map consistency. Experimental results demonstrate the validity and robustness of the proposed method.
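
The end-point representation described in this abstract lends itself to a standard EKF measurement update. Below is a minimal sketch, not the authors' implementation, assuming a single line landmark stored as its two end points in the state vector and observed as two range/bearing pairs; the measurement model, noise values, and numerical Jacobian are illustrative choices.

```python
import numpy as np

# State: [x, y, theta, x1, y1, x2, y2] = robot pose plus one line's end points.
# Measurement: range and bearing to each end point of the line.

def wrap(a):
    return (a + np.pi) % (2 * np.pi) - np.pi

def h(state):
    """Predicted [r1, b1, r2, b2] from the robot pose to the two end points."""
    x, y, th = state[:3]
    z = []
    for (px, py) in (state[3:5], state[5:7]):
        dx, dy = px - x, py - y
        z += [np.hypot(dx, dy), wrap(np.arctan2(dy, dx) - th)]
    return np.array(z)

def numerical_jacobian(f, x, eps=1e-6):
    fx = f(x)
    J = np.zeros((fx.size, x.size))
    for i in range(x.size):
        xp = x.copy(); xp[i] += eps
        J[:, i] = (f(xp) - fx) / eps
    return J

def ekf_update(mu, P, z, R):
    H = numerical_jacobian(h, mu)
    y = z - h(mu)
    y[1] = wrap(y[1]); y[3] = wrap(y[3])          # wrap bearing residuals
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    mu = mu + K @ y
    mu[2] = wrap(mu[2])
    P = (np.eye(len(mu)) - K @ H) @ P
    return mu, P

mu = np.array([0.0, 0.0, 0.0, 2.0, 1.0, 2.0, -1.0])   # guessed pose + line end points
P = np.eye(7) * 0.1
z = np.array([2.2, 0.45, 2.2, -0.47])                 # simulated sonar observation
mu, P = ekf_update(mu, P, z, np.diag([0.05, 0.02, 0.05, 0.02]))
print(mu)
```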

A Practical Solution toward SLAM in Indoor environment Based on Visual Objects and Robust Sonar Features (가정환경을 위한 실용적인 SLAM 기법 개발 : 비전 센서와 초음파 센서의 통합)

  • Ahn, Sung-Hwan;Choi, Jin-Woo;Choi, Min-Yong;Chung, Wan-Kyun
    • The Journal of Korea Robotics Society
    • /
    • v.1 no.1
    • /
    • pp.25-35
    • /
    • 2006
  • Improving the practicality of SLAM requires various sensors to be fused effectively in order to cope with the uncertainty induced by both the environment and the sensors. Combining sonar and vision sensors offers economical efficiency and complementary cooperation: it can remedy the false data association and divergence problems of sonar sensors, and overcome the low-frequency SLAM updates caused by the computational burden and sensitivity to illumination changes of vision sensors. In this paper, we propose a SLAM method that joins sonar sensors and a stereo camera. It consists of two schemes: extracting robust point and line features from sonar data, and recognizing planar visual objects using a multi-scale Harris corner detector and its SIFT descriptor against a pre-constructed object database. Fusing the sonar features and visual objects through EKF-SLAM provides correct data association via object recognition and high-frequency updates via sonar features. As a result, it increases the robustness and accuracy of SLAM in indoor environments. The performance of the proposed algorithm was verified by experiments in a home-like environment.
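
A minimal sketch of the two data-association paths the abstract describes, under assumed interfaces: anonymous sonar point features are matched by a Mahalanobis-gated nearest-neighbour test, while recognized visual objects carry an identity and map directly to a landmark index. The gate value, landmark list, and object IDs are made up for illustration and are not from the paper.

```python
import numpy as np

def associate_sonar(z, z_hats, S_list, gate=9.21):   # ~99% gate for a 2-DOF chi-square
    """Return the index of the best matching landmark, or None if nothing passes the gate."""
    best, best_d2 = None, gate
    for i, (z_hat, S) in enumerate(zip(z_hats, S_list)):
        v = z - z_hat
        d2 = v @ np.linalg.inv(S) @ v                # squared Mahalanobis distance
        if d2 < best_d2:
            best, best_d2 = i, d2
    return best

def associate_object(object_id, id_to_index):
    """Recognized planar objects are matched by identity, avoiding false association."""
    return id_to_index.get(object_id)

# Example usage with made-up numbers.
z_hats = [np.array([1.0, 0.2]), np.array([3.0, -0.5])]   # predicted (range, bearing)
S_list = [np.diag([0.05, 0.01])] * 2                      # innovation covariances
print(associate_sonar(np.array([1.1, 0.25]), z_hats, S_list))   # -> 0
print(associate_object("door_poster", {"door_poster": 7}))       # -> 7
```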


Side Scan Sonar based Pose-graph SLAM (사이드 스캔 소나 기반 Pose-graph SLAM)

  • Gwon, Dae-Hyeon;Kim, Joowan;Kim, Moon Hwan;Park, Ho Gyu;Kim, Tae Yeong;Kim, Ayoung
    • The Journal of Korea Robotics Society
    • /
    • v.12 no.4
    • /
    • pp.385-394
    • /
    • 2017
  • Side scan sonar (SSS) provides valuable information for robot navigation; however, the use of side scan sonar images in navigation has not been fully studied. In this paper, we use range data and side scan sonar images from the UnderWater Simulator (UWSim) and propose measurement models in a feature-based simultaneous localization and mapping (SLAM) framework. The range data are obtained from an echosounder, and the side scan sonar images from the side scan sonar module of UWSim. We use A-KAZE features for SSS image matching and adjust the relative robot pose by SSS bundle adjustment (BA) with the Ceres solver. The BA result serves as the loop-closure constraint of the pose-graph SLAM, and the graph is optimized with Incremental Smoothing and Mapping (iSAM). The optimized trajectory is compared against dead reckoning (DR).
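
The loop-closure idea can be illustrated with a toy pose graph. The sketch below is not the authors' pipeline: it optimizes x-y poses only and uses SciPy's generic least-squares solver in place of iSAM, and the dead-reckoning steps and the SSS-derived loop-closure offset are invented example values.

```python
import numpy as np
from scipy.optimize import least_squares

odom = [np.array([1.0, 0.0])] * 4                 # four forward steps from dead reckoning
loop = (4, 0, np.array([0.1, -0.2]))              # SSS matching says pose 4 is near pose 0

def residuals(flat):
    poses = flat.reshape(-1, 2)
    res = [poses[0]]                              # prior anchoring the first pose at the origin
    for i, d in enumerate(odom):                  # odometry (dead-reckoning) constraints
        res.append(poses[i + 1] - poses[i] - d)
    i, j, d = loop                                # loop-closure constraint from image matching
    res.append(poses[i] - poses[j] - d)
    return np.concatenate(res)

# Initialize with the dead-reckoned trajectory, then optimize the graph.
init = np.cumsum([[0, 0]] + [list(d) for d in odom], axis=0).ravel()
sol = least_squares(residuals, init)
print(sol.x.reshape(-1, 2))                       # corrected trajectory
```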

Experimental result of Real-time Sonar-based SLAM for underwater robot (소나 기반 수중 로봇의 실시간 위치 추정 및 지도 작성에 대한 실험적 검증)

  • Lee, Yeongjun;Choi, Jinwoo;Ko, Nak Yong;Kim, Taejin;Choi, Hyun-Taek
    • Journal of the Institute of Electronics and Information Engineers
    • /
    • v.54 no.3
    • /
    • pp.108-118
    • /
    • 2017
  • This paper presents experimental results of real-time sonar-based SLAM (simultaneous localization and mapping) using probability-based landmark recognition. The sonar-based SLAM is used for the navigation of an underwater robot. Inertial sensors, namely an IMU (Inertial Measurement Unit) and a DVL (Doppler Velocity Log), are fused with external information from sonar image processing by an extended Kalman filter (EKF) to obtain the navigation information. The vehicle location is estimated from the inertial sensor data and corrected by sonar data, which provide the relative position between the vehicle and a landmark on the bottom of the basin. For the verification of the proposed method, experiments were performed in a basin environment using an underwater robot, yShark.
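
A minimal sketch of the predict/correct loop the abstract outlines, assuming a planar position state: DVL body-frame velocity and IMU heading drive the prediction, and a sonar-derived position relative to a known bottom landmark corrects it. The landmark location, noise levels, and time step are assumptions, not values from the paper.

```python
import numpy as np

landmark = np.array([5.0, 3.0])          # assumed known landmark on the basin floor

def predict(mu, P, v_body, heading, dt, q=0.01):
    # Dead reckoning: rotate DVL body-frame velocity by the IMU heading and integrate.
    c, s = np.cos(heading), np.sin(heading)
    mu = mu + dt * np.array([c * v_body[0] - s * v_body[1],
                             s * v_body[0] + c * v_body[1]])
    P = P + q * dt * np.eye(2)           # simple additive process noise
    return mu, P

def correct(mu, P, z_rel, r=0.05):
    # Sonar image processing yields the vehicle-to-landmark offset, so the
    # measurement model is linear: z = landmark - position.
    H = -np.eye(2)
    y = z_rel - (landmark - mu)
    S = H @ P @ H.T + r * np.eye(2)
    K = P @ H.T @ np.linalg.inv(S)
    return mu + K @ y, (np.eye(2) - K @ H) @ P

mu, P = np.zeros(2), np.eye(2)
mu, P = predict(mu, P, v_body=[0.5, 0.0], heading=0.1, dt=1.0)
mu, P = correct(mu, P, z_rel=np.array([4.4, 2.9]))
print(mu)
```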

Underwater Robot Localization by Probability-based Object Recognition Framework Using Sonar Image (소나 영상을 이용한 확률적 물체 인식 구조 기반 수중로봇의 위치추정)

  • Lee, Yeongjun;Choi, Jinwoo;Choi, Hyun-Teak
    • The Journal of Korea Robotics Society
    • /
    • v.9 no.4
    • /
    • pp.232-241
    • /
    • 2014
  • This paper proposes an underwater localization algorithm using probabilistic object recognition. It is organized as follows: 1) recognizing artificial objects using an imaging sonar, and 2) localizing the recognized objects and the vehicle using EKF (Extended Kalman Filter)-based SLAM. For this purpose, we develop artificial landmarks that can be recognized even in the unstable sonar images induced by noise, and we propose a probabilistic recognition framework. In this way, the distance and bearing of the recognized artificial landmarks are acquired to perform the localization of the underwater vehicle. Using the recognized objects, EKF-based SLAM is carried out, yielding the path of the underwater vehicle and the locations of the landmarks. The proposed localization algorithm is verified by experiments in a basin.
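
One way to read "probabilistic recognition framework" is a discrete Bayes filter over candidate landmark identities that accumulates evidence across noisy sonar frames; the sketch below shows that generic idea under assumed class names, likelihood values, and a confidence threshold, and is not the authors' specific framework.

```python
import numpy as np

classes = ["cross", "circle", "triangle", "clutter"]          # assumed landmark classes
belief = np.full(len(classes), 1.0 / len(classes))            # uniform prior

def update(belief, frame_likelihoods):
    # Bayes update with one frame's classifier likelihoods, then renormalize.
    posterior = belief * frame_likelihoods
    return posterior / posterior.sum()

# Three noisy frames that mostly favour "cross".
for like in ([0.5, 0.2, 0.2, 0.1], [0.6, 0.1, 0.2, 0.1], [0.4, 0.3, 0.2, 0.1]):
    belief = update(belief, np.array(like))

if belief.max() > 0.8:                                        # declare only when confident
    print("recognized:", classes[int(belief.argmax())], round(belief.max(), 3))
else:
    print("not confident yet:", dict(zip(classes, belief.round(3))))
```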

Experiments of Unmanned Underwater Vehicle's 3 Degrees of Freedom Motion Applied the SLAM based on the Unscented Kalman Filter (무인 잠수정 3자유도 운동 실험에 대한 무향 칼만 필터 기반 SLAM기법 적용)

  • Hwang, A-Rom;Seong, Woo-Jae;Jun, Bong-Huan;Lee, Pan-Mook
    • Journal of Ocean Engineering and Technology
    • /
    • v.23 no.2
    • /
    • pp.58-68
    • /
    • 2009
  • The increased use of unmanned underwater vehicles (UUVs) has led to the development of alternative navigation methods that do not employ acoustic beacons or dead-reckoning sensors. This paper describes a simultaneous localization and mapping (SLAM) scheme that uses range sonars mounted on a small UUV. SLAM is an alternative navigation method that measures the environment through which the vehicle is passing and provides the relative position of the UUV. A SLAM algorithm that uses several ranging sonars is presented; it utilizes an unscented Kalman filter to estimate the locations of the UUV and the surrounding objects. To work efficiently, the nearest-neighbor standard filter is introduced as the data association algorithm, associating the sonar returns with the stored targets at each time step. The proposed SLAM algorithm was tested by experiments under various three-degrees-of-freedom motion conditions. The results showed that the algorithm is capable of estimating the positions of the UUV and the surrounding objects and suggest that it will perform well in various environments.
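
The distinguishing piece here is the unscented Kalman filter, which propagates sigma points through the nonlinear range/bearing model instead of linearizing it with a Jacobian as the EKF does. A minimal sketch of that unscented transform step follows; the vehicle state, covariance, and sonar target are made-up example values.

```python
import numpy as np

def sigma_points(mu, P, kappa=1.0):
    """Generate 2n+1 sigma points and weights for the unscented transform."""
    n = len(mu)
    S = np.linalg.cholesky((n + kappa) * P)
    pts = [mu] + [mu + S[:, i] for i in range(n)] + [mu - S[:, i] for i in range(n)]
    w = np.full(2 * n + 1, 0.5 / (n + kappa)); w[0] = kappa / (n + kappa)
    return np.array(pts), w

def h(x, target=np.array([3.0, 1.0])):
    # Nonlinear range/bearing measurement of a sonar target from pose (x, y, heading).
    d = target - x[:2]
    return np.array([np.hypot(*d), np.arctan2(d[1], d[0]) - x[2]])

mu = np.array([0.0, 0.0, 0.1])                       # vehicle x, y, heading
P = np.diag([0.2, 0.2, 0.05])
pts, w = sigma_points(mu, P)
Z = np.array([h(p) for p in pts])                    # push sigma points through h
z_mean = w @ Z                                       # predicted range/bearing
S = sum(wi * np.outer(zi - z_mean, zi - z_mean) for wi, zi in zip(w, Z))
print(z_mean, S)
```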

Experimental Result on Map Expansion of Underwater Robot Using Acoustic Range Sonar (수중 초음파 거리 센서를 이용한 수중 로봇의 2차원 지도 확장 실험)

  • Lee, Yeongjun;Choi, Jinwoo;Lee, Yoongeon;Choi, Hyun-Taek
    • The Journal of Korea Robotics Society
    • /
    • v.13 no.2
    • /
    • pp.79-85
    • /
    • 2018
  • This study focuses on autonomous exploration based on map expansion for an underwater robot equipped with acoustic sonars. Map expansion is applicable to large-area mapping, but it may degrade localization accuracy. Thus, as the key contribution of this paper, we propose a method for underwater autonomous exploration in which the robot weighs the trade-off between map expansion ratio and position accuracy, selects whichever has higher priority, and then proceeds to the corresponding mission step. An occupancy grid map is synthesized from the measurements of an acoustic range sonar, which determine the probability of occupancy; this information is then used to determine a path to the frontier, which becomes the new search point. During area searching and map building, the robot revisits artificial landmarks to improve its position accuracy, using imaging sonar-based recognition and EKF-SLAM, whenever the position accuracy falls below a predetermined threshold. Real-time experiments were conducted using an underwater robot, yShark, to validate the proposed method, and the analysis of the results is discussed herein.
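
A rough sketch of the decision loop the abstract describes: a log-odds occupancy grid updated from sonar returns, frontier cells (free cells bordering unexplored space) as candidate goals, and a fallback to landmark revisiting when pose uncertainty grows too large. Grid size, log-odds increments, and the uncertainty threshold are illustrative assumptions, not values from the paper.

```python
import numpy as np

grid = np.zeros((20, 20))                    # log-odds occupancy, 0 = unknown

def mark(cells, occupied, l_occ=0.85, l_free=-0.4):
    for r, c in cells:
        grid[r, c] += l_occ if occupied else l_free

def frontiers():
    """Free cells that border at least one still-unknown (zero log-odds) cell."""
    free = grid < -0.2
    out = []
    for r, c in zip(*np.nonzero(free)):
        nbrs = grid[max(r - 1, 0):r + 2, max(c - 1, 0):c + 2]
        if np.any(np.abs(nbrs) < 1e-6):
            out.append((r, c))
    return out

def next_action(P, sigma_max=0.5):
    # Trade-off: keep expanding the map only while the pose is well localized;
    # otherwise revisit a landmark to correct the position with EKF-SLAM.
    if np.sqrt(np.trace(P[:2, :2])) > sigma_max:
        return "revisit_landmark"
    f = frontiers()
    return ("explore", f[0]) if f else "done"

mark([(5, c) for c in range(3, 8)], occupied=False)   # a swept free corridor
mark([(5, 8)], occupied=True)                         # wall return at its end
print(next_action(np.diag([0.1, 0.1, 0.05])))
```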

Two Feature Points Based Laser Scanner for Mobile Robot Navigation (레이저 센서에서 두 개의 특징점을 이용한 이동로봇의 항법)

  • Kim, Joo-Wan;Shim, Duk-Sun
    • Journal of Advanced Navigation Technology
    • /
    • v.18 no.2
    • /
    • pp.134-141
    • /
    • 2014
  • Mobile robots use various sensors for navigation, such as wheel encoders, vision sensors, sonar, and laser sensors. Dead reckoning with a wheel encoder accumulates positioning errors, so the wheel encoder cannot be used alone. Vision sensors produce too much information, which increases the number of features and the complexity of the perception scheme, and sonar sensors are not suitable for positioning because of their poor accuracy. A laser sensor, on the other hand, provides relatively accurate distance information. In this paper, we propose to extract angular information from the distance measurements of a laser range finder and to use a Kalman filter that matches the heading and distance of the laser range finder with those of the wheel encoder. With a single feature point, the error may grow large when the feature point varies or the tracker jumps to a new feature point. To solve this problem, we propose to use two feature points and show that the positioning error can be reduced significantly.
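
A minimal sketch of the fusion idea, under stated assumptions: the apparent rotation of the segment joining the two tracked feature points between consecutive scans measures the robot's heading change, and a one-dimensional Kalman (inverse-variance) update fuses it with the encoder-derived increment. All numbers are illustrative, and this is not the authors' implementation.

```python
import numpy as np

def segment_bearing(p1, p2):
    """Bearing of the segment joining two feature points, in the robot frame."""
    d = np.asarray(p2) - np.asarray(p1)
    return np.arctan2(d[1], d[0])

def laser_heading_change(prev_pts, curr_pts):
    # If the robot turns by +dtheta, static features appear rotated by -dtheta
    # in the robot frame, hence the sign flip.
    return -(segment_bearing(*curr_pts) - segment_bearing(*prev_pts))

def fuse_increment(d_enc, var_enc, d_laser, var_laser):
    # One-dimensional Kalman (inverse-variance weighted) fusion of the two heading increments.
    k = var_enc / (var_enc + var_laser)
    return d_enc + k * (d_laser - d_enc), (1 - k) * var_enc

prev = ([1.2, 0.3], [2.3, 1.2])          # the two feature points in the previous scan
curr = ([1.0, 0.5], [2.15, 1.3])         # the same points re-observed in the current scan
z = laser_heading_change(prev, curr)
d_theta, d_var = fuse_increment(d_enc=0.10, var_enc=0.02, d_laser=z, var_laser=0.005)

theta, theta_var = 0.0, 0.01             # running heading estimate from earlier steps
theta, theta_var = theta + d_theta, theta_var + d_var
print(theta, theta_var)
```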