• Title/Summary/Keyword: Mobile Robot Navigation


Design of a Web-based Autonomous Under-water Mobile Robot Controller Using Neuro-Fuzzy in the Dynamic Environment (동적 환경에서 뉴로-퍼지를 이용한 웹 기반 자율 잠수 이동로봇 제어기 설계)

  • 최규종;신상운;안두성
    • Journal of the Korean Society of Fisheries and Ocean Technology
    • /
    • v.39 no.1
    • /
    • pp.77-83
    • /
    • 2003
  • Autonomous mobile robots based on the Web have already been used in public places such as museums, but many problems remain to be solved because of the limitations of the Web and the dynamically changing environment. We present a methodology for an intelligent mobile robot that demonstrates a certain degree of autonomy in navigation applications. In this paper, we focus on a mobile robot navigator equipped with a neuro-fuzzy controller which perceives the environment, makes decisions, and takes actions. The neuro-fuzzy controller, equipped with a collision-avoidance behavior and a target-tracing behavior, enables the mobile robot to navigate in a dynamic environment from the start location to the goal location. Most telerobotic systems workable on the Web have used standard Internet techniques such as HTTP, CGI, and scripting languages; for mobile robot navigation, however, these tools have significant limitations. In our study, C# and ASP.NET are used for both the client- and server-side programs because of their interactivity and quick responsiveness. Two kinds of computer simulations, collision avoidance and target tracing, are performed to verify the proposed method.
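The behavior blending such a neuro-fuzzy navigator performs can be sketched in miniature. The toy controller below mixes a collision-avoidance rule and a target-tracing rule through triangular membership functions; all names, set boundaries, and the 60-degree evasive angle are illustrative assumptions, not details from the paper (which uses a trained neuro-fuzzy network rather than hand-written rules).

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steer(obstacle_dist, target_angle):
    """Blend two behaviors: avoid when an obstacle is near, trace the target otherwise."""
    near = tri(obstacle_dist, -1.0, 0.0, 1.5)    # obstacle closer than ~1.5 m (hypothetical set)
    far  = tri(obstacle_dist, 0.5, 2.0, 100.0)   # wide "far" set (hypothetical)
    # Rule 1 (collision avoidance): near obstacle -> hard evasive turn (+60 deg)
    # Rule 2 (target tracing): open space -> steer toward the target bearing
    num = near * 60.0 + far * target_angle
    den = near + far
    return num / den if den > 0 else target_angle  # weighted-average defuzzification
```

With an obstacle 0.1 m ahead the avoidance rule dominates and the output saturates at the evasive angle; with open space ahead the output collapses to the target bearing, which is the start-to-goal behavior the abstract describes.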

Simultaneous Localization and Mobile Robot Navigation using a Sensor Network

  • Jin Tae-Seok;Hashimoto Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.6 no.2
    • /
    • pp.161-166
    • /
    • 2006
  • Localization of a mobile agent within a sensor network is a fundamental requirement for many applications using networked navigation systems such as a sonar-sensing system or a visual-sensing system. To fully utilize the strengths of both the sonar and visual sensing systems, this paper describes a networked sensor-based navigation method in an indoor environment for an autonomous mobile robot which can navigate and avoid obstacles. In this method, the self-localization of the robot is done with a model-based vision system using networked sensors, and nonstop navigation is realized by a Kalman filter-based STSF (Space and Time Sensor Fusion) method. Stationary and moving obstacles are avoided with networked sensor data such as a CCD camera and a sonar ring. We report on experiments in a hallway using the Pioneer-DX robot. In addition, the localization has inevitable uncertainties in the features and in the robot position estimate, so a Kalman filter scheme is used for the estimation of the mobile robot's localization. Extensive experiments with a robot and a sensor network confirm the validity of the approach.
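The Kalman-filter estimation step mentioned above can be illustrated with a minimal scalar example. This is only a sketch of the predict/update cycle; the paper's STSF method fuses multi-sensor data over space and time, and the noise values q and r below are arbitrary assumptions.

```python
def kalman_1d(z_measurements, x0=0.0, p0=1.0, q=0.01, r=0.5):
    """Scalar Kalman filter: constant-position model with process noise q,
    fusing a stream of noisy range measurements with variance r."""
    x, p = x0, p0
    for z in z_measurements:
        p = p + q                # predict: uncertainty grows between measurements
        k = p / (p + r)          # Kalman gain: trust measurement vs. prediction
        x = x + k * (z - x)      # update state with the measurement residual
        p = (1 - k) * p          # posterior uncertainty shrinks after the update
    return x, p
```

Feeding it repeated measurements of the same position drives the estimate toward that position while the variance settles at a small steady-state value, which is the basic mechanism behind the robot position estimation described in the abstract.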

Mobile Robot Navigation Using Vision Information (시각 정보를 이용한 이동 로보트의 항법)

  • Cho, Dong-Kwon;Kwon, Ho-Yeol;Suh, Il-Hong;Bien, Zeung-Nam
    • Proceedings of the KIEE Conference
    • /
    • 1989.07a
    • /
    • pp.689-692
    • /
    • 1989
  • In this paper, the navigation problem for a mobile robot is investigated. Specifically, it is proposed that simple guide-marks be introduced and that the navigation scheme be generated in conjunction with the guide-marks sensed through camera vision. For autonomous navigation, it is shown that a triple guide-mark system is more effective than a single guide-mark in estimating the position of the vehicle itself. The navigation system is tested via a mobile robot 'Hero' equipped with a single camera.


Maze Navigation System Using Image Recognition for Autonomous Mobile Robot (자율이동로봇의 영상인식 미로탐색시스템)

  • Lee Jeong Hun;Kang Seong-Ho;Eom Ki Hwan
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.11 no.5
    • /
    • pp.429-434
    • /
    • 2005
  • In this paper, a maze navigation system using image recognition for an autonomous mobile robot is proposed. The proposed system searches for the target by an image recognition method based on an ADALINE neural network. An infrared sensor system must travel every block to find the target because it can recognize only one block of information at a time, whereas the proposed maze navigation system reduces the number of traveled blocks thanks to its ability to sense several blocks at once. In particular, due to the simplicity of the algorithm, the proposed method can easily be implemented on a system with a low-capacity processor.
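An ADALINE is a single linear unit trained with the Widrow-Hoff (LMS) rule, which is what keeps the recognition step cheap enough for a low-capacity processor. The sketch below trains one on a toy linearly separable problem; the training data, learning rate, and epoch count are invented for illustration and are unrelated to the paper's image features.

```python
def train_adaline(samples, labels, lr=0.1, epochs=50):
    """ADALINE: a linear unit trained with the Widrow-Hoff (LMS) rule.
    Unlike a perceptron, the error is taken on the raw linear output."""
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, t in zip(samples, labels):
            y = sum(wi * xi for wi, xi in zip(w, x)) + b  # linear output, no threshold
            err = t - y
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

def classify(w, b, x):
    """Threshold the trained linear output to get a +/-1 class label."""
    return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b >= 0 else -1
```

On an AND-like pattern the weights converge near the least-squares solution, after which thresholding separates the classes; only this cheap weighted sum and threshold run at classification time.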

GA-Fuzzy based Navigation of Multiple Mobile Robots in Unknown Dynamic Environments (미지 동적 환경에서 다중 이동로봇의 GA-Fuzzy 기반 자율항법)

  • Zhao, Ran;Lee, Hong-Kyu
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.66 no.1
    • /
    • pp.114-120
    • /
    • 2017
  • The work presented in this paper deals with a navigation problem for multiple mobile robots in unknown indoor environments. The environments are completely unknown to the robots; thus, proximity sensors installed on the robots' bodies must be used to detect information about the surroundings. The environments simulated in this work are dynamic, containing not only static but also moving obstacles. In order to guide the robots along collision-free paths to their goals, this paper presents a navigation method based on a fuzzy approach; genetic algorithms are then applied to optimize the membership functions and rules of the fuzzy controller. The simulation results verify that the proposed method effectively addresses the mobile robot navigation problem.
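Stripped to its core, the GA-tuning idea looks like the sketch below: a real-coded genetic algorithm searches for a single membership-function parameter that maximizes a stand-in fitness. The fitness function, operators, and every parameter are illustrative assumptions; the paper optimizes full membership-function and rule sets rather than one scalar.

```python
import random

def ga_optimize(fitness, bounds, pop_size=30, gens=60, mut=0.2, seed=1):
    """Tiny real-coded GA: binary-tournament selection, blend crossover,
    Gaussian mutation, clamped to the search bounds."""
    random.seed(seed)
    lo, hi = bounds
    pop = [random.uniform(lo, hi) for _ in range(pop_size)]
    for _ in range(gens):
        nxt = []
        for _ in range(pop_size):
            a, b = random.sample(pop, 2)                 # tournament of two
            p1 = a if fitness(a) > fitness(b) else b
            c, d = random.sample(pop, 2)
            p2 = c if fitness(c) > fitness(d) else d
            child = 0.5 * (p1 + p2)                      # blend crossover
            if random.random() < mut:
                child += random.gauss(0.0, 0.3)          # Gaussian mutation
            nxt.append(min(hi, max(lo, child)))          # keep within bounds
        pop = nxt
    return max(pop, key=fitness)
```

For a fuzzy controller, the fitness would score a whole simulated run (path length, collisions); here a quadratic stand-in with a known optimum shows the search converging.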

Onboard dynamic RGB-D simultaneous localization and mapping for mobile robot navigation

  • Canovas, Bruce;Negre, Amaury;Rombaut, Michele
    • ETRI Journal
    • /
    • v.43 no.4
    • /
    • pp.617-629
    • /
    • 2021
  • Although current visual simultaneous localization and mapping (SLAM) algorithms provide highly accurate tracking and mapping, most are too heavy to run live on embedded devices, and the maps they produce are often unsuitable for path planning. To mitigate these issues, we propose a completely closed-loop online dense RGB-D SLAM algorithm targeting autonomous indoor mobile robot navigation tasks. The proposed algorithm runs live on an NVIDIA Jetson board embedded on a two-wheel differential-drive robot. It exhibits lightweight three-dimensional mapping, room-scale consistency, accurate pose tracking, and robustness to moving objects. Further, we introduce a navigation strategy based on the proposed algorithm. Experimental results demonstrate the robustness of the proposed SLAM algorithm, its computational efficiency, and its benefits for on-the-fly navigation while mapping.

Mobile Robot Navigation in an Indoor Environment

  • Choi, Sung-Yug;Lee, Jang-Myung
    • Proceedings of the ICROS Conference
    • /
    • 2005.06a
    • /
    • pp.1456-1459
    • /
    • 2005
  • To compensate for these drawbacks, a new localization method is proposed that estimates the global position of the mobile robot using a camera mounted on the ceiling of a corridor. This scheme is not a relative localization scheme, which decreases the position error through algorithms applied to noisy sensor data. The effectiveness of the proposed localization scheme is demonstrated by experiments.


Navigation and Localization of Mobile Robot Based on Vision and Sensor Network Using Fuzzy Rules (퍼지 규칙을 이용한 비전 및 무선 센서 네트워크 기반의 이동로봇의 자율 주행 및 위치 인식)

  • Heo, Jun-Young;Kang, Geun-Tack;Lee, Won-Chang
    • Proceedings of the IEEK Conference
    • /
    • 2008.06a
    • /
    • pp.673-674
    • /
    • 2008
  • This paper presents a new navigation algorithm for an autonomous mobile robot with vision and IR sensors and a ZigBee sensor network, using fuzzy rules. We also show that the developed mobile robot navigates very well in complex unknown environments with the proposed algorithm.


Location Estimation and Navigation of Mobile Robots using Wireless Sensor Network and Ultrasonic Sensors (무선 센서 네트워크와 초음파 센서를 이용한 이동로봇의 위치 인식과 주행)

  • Chun, Chang-Hee;Park, Jong-Jin
    • The Transactions of The Korean Institute of Electrical Engineers
    • /
    • v.59 no.9
    • /
    • pp.1692-1698
    • /
    • 2010
  • In this paper, we use a wireless sensor network and ultrasonic sensors to estimate the local position of a mobile robot and to navigate it. Ultrasonic sensors are simple and accurate, so they are well suited to local position estimation and navigation of mobile robots. However, to obtain an accurate distance between two sensors, the sensors need to face each other as directly as possible. To solve this problem, we rotate the ultrasonic sensor attached to the robot through 360 degrees to obtain an accurate distance. We can then estimate the precise position of the mobile robot by triangulation using the obtained distance information. The mobile robot navigates using an embedded encoder and compensates its coordinates with the ultrasonic sensors. Experimental results show that the proposed method obtains accurate distances between sensors and accurate coordinates of the robot's position, and that the mobile robot can follow a designated path well.
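Position estimation by triangulation from beacon distances, as described above, can be sketched as a linearized least-squares trilateration. The beacon layout and function structure below are illustrative assumptions, not the paper's implementation; the idea is only that subtracting one beacon's circle equation from the others turns the problem into a small linear system.

```python
def trilaterate(beacons, dists):
    """Least-squares 2-D trilateration from distances to 3+ fixed beacons.
    Subtracting the first beacon's circle equation linearizes the system."""
    (x0, y0), d0 = beacons[0], dists[0]
    A, bvec = [], []
    for (xi, yi), di in zip(beacons[1:], dists[1:]):
        A.append((2 * (xi - x0), 2 * (yi - y0)))
        bvec.append(d0**2 - di**2 + xi**2 - x0**2 + yi**2 - y0**2)
    # Solve the 2x2 normal equations (A^T A) p = A^T b by hand
    s11 = sum(a[0] * a[0] for a in A)
    s12 = sum(a[0] * a[1] for a in A)
    s22 = sum(a[1] * a[1] for a in A)
    t1 = sum(a[0] * bi for a, bi in zip(A, bvec))
    t2 = sum(a[1] * bi for a, bi in zip(A, bvec))
    det = s11 * s22 - s12 * s12
    return ((s22 * t1 - s12 * t2) / det, (s11 * t2 - s12 * t1) / det)
```

Given exact distances to three non-collinear sensor nodes, this recovers the robot's position; in practice the rotating-sensor distances are noisy, which is why the abstract pairs triangulation with encoder odometry.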

Implementation of a sensor fusion system for autonomous guided robot navigation in outdoor environments (실외 자율 로봇 주행을 위한 센서 퓨전 시스템 구현)

  • Lee, Seung-H.;Lee, Heon-C.;Lee, Beom-H.
    • Journal of Sensor Science and Technology
    • /
    • v.19 no.3
    • /
    • pp.246-257
    • /
    • 2010
  • Autonomous guided robot navigation, which consists of following unknown paths and avoiding unknown obstacles, is a fundamental technique for unmanned robots in outdoor environments. Following an unknown path requires techniques such as path recognition, path planning, and robot pose estimation. In this paper, we propose a novel sensor fusion system for autonomous guided robot navigation in outdoor environments. The proposed system consists of three monocular cameras and an array of nine infrared range sensors. The two cameras mounted on the robot's right and left sides are used to recognize unknown paths and estimate the relative robot pose on these paths through a Bayesian sensor fusion method, while the camera mounted at the front of the robot is used to recognize abrupt curves and unknown obstacles. The infrared range sensor array is used to improve the robustness of obstacle avoidance; the forward camera and the infrared range sensor array are fused through a rule-based method for obstacle avoidance. Experiments in outdoor environments show that the mobile robot with the proposed sensor fusion system successfully performed real-time autonomous guided navigation.
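The rule-based fusion of the forward camera with the infrared array might be sketched as follows. The thresholds, function name, and decision labels are invented for illustration; the point is the structure the abstract describes, where the IR array either overrides the camera outright at close range or confirms a camera detection at mid range.

```python
def fuse_obstacle(camera_sees_obstacle, ir_ranges, stop_dist=0.5):
    """Rule-based fusion of a forward-camera detection flag and an IR range
    array (metres): IR vetoes or confirms the camera to cut false alarms."""
    ir_min = min(ir_ranges)
    if ir_min < stop_dist:
        return "avoid"           # rule 1: any IR reading very close -> always avoid
    if camera_sees_obstacle and ir_min < 2 * stop_dist:
        return "avoid"           # rule 2: camera detection confirmed by mid-range IR
    return "follow_path"         # rule 3: otherwise keep following the path
```

A camera detection with no IR support is ignored (likely a false alarm such as a shadow on the path), which is one common motivation for fusing a range array with vision.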