• Title/Summary/Keyword: Mobile robot recognition


Real-Time Implementation of Wireless Remote Control of Mobile Robot Based-on Speech Recognition Command (음성명령에 의한 모바일로봇의 실시간 무선원격 제어 실현)

  • Shim, Byoung-Kyun;Han, Sung-Hyun
    • Journal of the Korean Society of Manufacturing Technology Engineers
    • /
    • v.20 no.2
    • /
    • pp.207-213
    • /
    • 2011
  • In this paper, we present a study on the real-time implementation of a mobile robot to which an interactive voice recognition technique is applied. The speech commands are uttered as sententially connected words and issued through a wireless remote control system. We implement an automatic distant speech command recognition system for interactive voice-enabled services. We first construct a baseline automatic speech command recognition system, in which acoustic models are trained from speech utterances recorded with a microphone. To improve the performance of this baseline system, the acoustic models are adapted to compensate for the spectral differences between microphones and for the environmental mismatch between close-talking and distant speech. We illustrate the performance of the developed speech recognition system by experiments. As a result, the average recognition rate of the proposed system is shown to be above approximately 95%.
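
The channel mismatch this abstract describes (different microphones, close-talking versus distant speech) is commonly reduced with cepstral mean and variance normalization; the sketch below shows that standard technique only as an illustration, not the acoustic-model adaptation actually used by the authors, and the feature shapes are assumed.

```python
import numpy as np

def cmvn(features: np.ndarray) -> np.ndarray:
    """Cepstral mean and variance normalization over one utterance.

    features: (num_frames, num_coeffs) array of MFCCs from one microphone.
    Normalizing per utterance removes the constant channel (microphone)
    component of the cepstrum, one simple way to reduce the mismatch
    between close-talking and distant recordings.
    """
    mean = features.mean(axis=0, keepdims=True)
    std = features.std(axis=0, keepdims=True) + 1e-8  # avoid divide-by-zero
    return (features - mean) / std

# Hypothetical usage: 300 frames of 13 MFCC coefficients.
utterance = np.random.randn(300, 13)
print(cmvn(utterance).mean(axis=0))  # per-coefficient means are ~0 after CMVN
```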

Recognition of 3-Dimensional Environment for a Mobile Robot Using Structured Light (Structured Light을 이용한 이동 로보트의 3차원 환경인식)

  • Lee, Seok-Jun;Chung, Myung-Jin
    • Journal of the Korean Institute of Telematics and Electronics
    • /
    • v.26 no.7
    • /
    • pp.30-41
    • /
    • 1989
  • In this paper, a robust and simple structured light sensing system is studied to give mobile robots the ability to navigate in the real world. A mobile robot with this sensor can be used in two ways: first, for real-time navigation in a 3-dimensional world; second, for modeling and recognition of the environment. Range data obtained with this sensor are fairly accurate, and the data acquisition speed is satisfactory. Experiments in diverse situations show the effectiveness of the structured light sensor for the mobile robot.
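
The paper does not reproduce its sensor geometry here, but a minimal triangulation model for a structured light (laser-stripe) range sensor looks like the sketch below; the baseline, projection angle and focal length are assumed example values, not the parameters of the sensor in the paper.

```python
import math

def stripe_range(pixel_x: float, focal_px: float,
                 baseline_m: float, laser_angle_rad: float) -> float:
    """Depth (m) of a laser-stripe point seen by a camera.

    Camera at the origin looking along +Z; the laser emitter sits at
    (baseline_m, 0, 0) and projects a plane tilted by laser_angle_rad
    toward the camera, i.e. the plane X = baseline_m - Z * tan(angle).
    The camera ray through the pixel is X = (pixel_x / focal_px) * Z,
    so intersecting ray and plane gives the depth.
    """
    u = pixel_x / focal_px                      # normalized image coordinate
    return baseline_m / (u + math.tan(laser_angle_rad))

# Hypothetical numbers: 10 cm baseline, 600 px focal length, 30 degree tilt.
print(stripe_range(pixel_x=50.0, focal_px=600.0,
                   baseline_m=0.10, laser_angle_rad=math.radians(30.0)))
```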


Fuzzy Neural Network Based Sensor Fusion and It's Application to Mobile Robot in Intelligent Robotic Space

  • Jin, Tae-Seok;Lee, Min-Jung;Hashimoto, Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.6 no.4
    • /
    • pp.293-298
    • /
    • 2006
  • In this paper, a sensor fusion based navigation method for the autonomous control of a miniature human-interaction robot is presented. The navigation method blends the optimality of a Fuzzy Neural Network (FNN) based control algorithm with the knowledge-representation and learning capabilities of the networked Intelligent Robotic Space (IRS). States of the robot and the IRS, for example the distance between the mobile robot and obstacles and the velocity of the mobile robot, are used as the inputs of the fuzzy logic controller. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a sensor fusion technique is introduced in which the data of ultrasonic sensors and a vision sensor are fused into the identification process. Preliminary experiments and results demonstrate the merit of the introduced navigation control algorithm.
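
As a rough illustration of how rules for goal approach and obstacle avoidance can be blended, the sketch below uses triangular memberships and weighted-average defuzzification; the membership breakpoints and rule outputs are invented for illustration and are not the paper's tuned rule base or its FNN learning stage.

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def steer(obstacle_dist_m: float, goal_bearing_rad: float) -> float:
    """Blend obstacle-avoidance and goal-approach rules into one steering rate (rad/s)."""
    near = tri(obstacle_dist_m, -0.5, 0.0, 1.0)   # obstacle is close
    far = tri(obstacle_dist_m, 0.5, 2.0, 10.0)    # obstacle is far or absent
    # Rule 1: IF obstacle near THEN turn away hard.
    # Rule 2: IF obstacle far THEN steer toward the goal bearing.
    w1, out1 = near, 1.0
    w2, out2 = far, 0.8 * goal_bearing_rad
    total = w1 + w2
    return (w1 * out1 + w2 * out2) / total if total > 0 else 0.0

print(steer(obstacle_dist_m=0.4, goal_bearing_rad=-0.3))
```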

A Ubiquitous Robot System (유비쿼터스 로봇 시스템)

  • 김종환;유지환;이강희;유범상
    • Journal of the Korean Society for Precision Engineering
    • /
    • v.21 no.7
    • /
    • pp.7-14
    • /
    • 2004
  • In the upcoming ubiquitous era, humankind will live in a ubiquitous space where everything is connected through communication networks. In this space, a ubiquitous robot, which can be used by anyone for any service through any device and any network, at any time and anywhere in a u-space, is expected to be required to provide seamless, context-aware services to humankind. In this paper, we introduce the ubiquitous robot and define its three components. The first is the "SoBot", which can connect through the network from anywhere and has environment recognition and human communication capabilities. The second is the "EmBot", which is embedded into environments and mobile robots and provides localization and certification functions through sensor fusion. The last is the "Mobile Robot", which provides the overall physical services. This paper also introduces the KAIST ITRC-Intelligent Robot Research Center, which pursues the implementation of the ubiquitous robot.

Path control of a mobile robot 'KMR-2' using odometer system (거리계를 이용한 이동로보트 'KMR-2'의 경로주행제어에 관한 연구)

  • 조형석;이대업;이종원
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1988.10a
    • /
    • pp.142-147
    • /
    • 1988
  • A free-path-type guidance system does not need a hardwired path in the environment, so it gives a mobile robot a flexible path. In this study, to achieve such a guidance system for a mobile robot steered by the differential steering of its two front drive wheels, position recognition systems are constructed using odometry as an internal position sensor. Two odometer systems, an auxiliary-wheel odometer and a two-encoder odometer, are constructed, and path following algorithms using these odometer systems are designed and tested experimentally. A PID control scheme is adopted in the path following algorithms.
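
A two-encoder odometer of the kind described here reduces to the standard differential-drive dead-reckoning update; a minimal sketch follows, with the wheel radius, track width and encoder resolution chosen as assumed example values rather than KMR-2's actual parameters.

```python
import math

WHEEL_RADIUS_M = 0.05      # assumed
TRACK_WIDTH_M = 0.30       # assumed distance between the two drive wheels
TICKS_PER_REV = 1024       # assumed encoder resolution

def update_pose(x, y, theta, dticks_left, dticks_right):
    """Dead-reckoning pose update from encoder tick increments."""
    dist_per_tick = 2.0 * math.pi * WHEEL_RADIUS_M / TICKS_PER_REV
    ds_l = dticks_left * dist_per_tick
    ds_r = dticks_right * dist_per_tick
    ds = (ds_r + ds_l) / 2.0                 # distance traveled by the robot center
    dtheta = (ds_r - ds_l) / TRACK_WIDTH_M   # heading change
    # Integrate along the chord, using the mid-point heading.
    x += ds * math.cos(theta + dtheta / 2.0)
    y += ds * math.sin(theta + dtheta / 2.0)
    theta += dtheta
    return x, y, theta

pose = (0.0, 0.0, 0.0)
pose = update_pose(*pose, dticks_left=100, dticks_right=120)
print(pose)
```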


Fusion of Sonar and Laser Sensor for Mobile Robot Environment Recognition

  • Kim, Kyung-Hoon;Cho, Hyung-Suck
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 2001.10a
    • /
    • pp.91.3-91
    • /
    • 2001
  • A sensor fusion scheme for mobile robot environment recognition that incorporates range data and contour data is proposed. An ultrasonic sensor provides only a coarse spatial description, but it guarantees, with relatively high belief, that the space within its sonic cone is free of obstacles. A laser structured light system provides a detailed contour description of the environment, but it is prone to light noise and is easily affected by surface reflectivity. The overall fusion process is composed of two stages: noise elimination and belief updating. Dempster-Shafer evidential reasoning is applied at each stage. Open space estimation from the sonar range measurements eliminates noisy lines from the laser sensor. Comparing actual sonar data to the simulated sonar data enables ...
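
The belief-update stage relies on Dempster-Shafer combination of evidence; a minimal sketch over the frame {occupied, empty} (with mass left on the whole frame for "unknown") is shown below, where the example mass values are assumed rather than taken from the paper.

```python
def dempster_combine(m1, m2):
    """Combine two mass functions over the frame {'occ', 'emp'}.

    Each mass dict assigns belief mass to 'occ', 'emp' and 'unk' (the whole
    frame) and sums to 1. Returns the combined masses after renormalizing
    away the conflicting mass, per Dempster's rule of combination.
    """
    conflict = m1['occ'] * m2['emp'] + m1['emp'] * m2['occ']
    k = 1.0 - conflict
    occ = (m1['occ'] * m2['occ'] + m1['occ'] * m2['unk'] + m1['unk'] * m2['occ']) / k
    emp = (m1['emp'] * m2['emp'] + m1['emp'] * m2['unk'] + m1['unk'] * m2['emp']) / k
    unk = (m1['unk'] * m2['unk']) / k
    return {'occ': occ, 'emp': emp, 'unk': unk}

# Assumed example: sonar says a cell is probably empty, laser weakly says occupied.
sonar = {'occ': 0.1, 'emp': 0.7, 'unk': 0.2}
laser = {'occ': 0.4, 'emp': 0.2, 'unk': 0.4}
print(dempster_combine(sonar, laser))
```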


Design of Ultrasonic Sensor Based Obstacle Recognition Mobile Robot (초음파 센서 기반 장애물 인지 이동 로봇 설계)

  • Moon, Inseok;Hong, Won-Kee;Ryu, Juang-Tak
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.6 no.5
    • /
    • pp.327-333
    • /
    • 2011
  • Intelligent robots are widely needed in various areas of industry, from extremely dangerous environments to service tasks. For an autonomous mobile robot, it is important to move safely to a destination point while recognizing its surroundings. Advances in sensor technology and its applications are needed in order to develop such an intelligent robot. In this paper, a mobile robot with a path-finding algorithm is presented. The path-finding algorithm not only finds a path to the designated destination but also recognizes obstacles on the way and calculates their distances. Ten ultrasonic sensors are mounted on the front and rear of the mobile robot to determine its position. Specular reflection and the wide viewing angle, which are inherent characteristics of ultrasonic waves, cause errors in measuring distance.
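
The distance calculation behind each ultrasonic reading is a simple time-of-flight conversion; the sketch below also takes the median of several pings, one common way to suppress the specular-reflection outliers the abstract mentions (the number of pings and the speed of sound are assumed values, not the paper's design).

```python
import statistics

SPEED_OF_SOUND_M_S = 343.0  # at roughly 20 degrees C

def echo_to_distance(echo_time_s: float) -> float:
    """Convert a round-trip ultrasonic echo time to a one-way distance (m)."""
    return SPEED_OF_SOUND_M_S * echo_time_s / 2.0

def filtered_distance(echo_times_s):
    """Median over several pings to suppress specular-reflection outliers."""
    return statistics.median(echo_to_distance(t) for t in echo_times_s)

# Hypothetical echo times from five pings, one of them a specular outlier.
print(filtered_distance([0.0058, 0.0059, 0.0240, 0.0057, 0.0060]))
```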

GPU based Fast Recognition of Artificial Landmark for Mobile Robot (주행로봇을 위한 GPU 기반의 고속 인공표식 인식)

  • Kwon, Oh-Sung;Kim, Young-Kyun;Cho, Young-Wan;Seo, Ki-Sung
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.20 no.5
    • /
    • pp.688-693
    • /
    • 2010
  • Vision based object recognition in mobile robots faces many image analysis problems involving neighboring elements in dynamic environments. SURF (Speeded Up Robust Features) is a local feature extraction method whose performance remains stable even under disturbances such as lighting changes, scale change and rotation. However, real-time processing is difficult because of its high-dimensional descriptor vectors. To solve this problem, execution of SURF on a GPU (Graphics Processing Unit) is proposed and implemented using NVIDIA's CUDA. Recognition rates and processing times for SURF on the CPU and GPU are compared experimentally while varying the robot velocity and image size.
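
A CPU-side baseline for the kind of timing comparison this abstract reports can be sketched as below, assuming an opencv-contrib build with the nonfree xfeatures2d module; the GPU counterpart in OpenCV is the C++ cv::cuda::SURF_CUDA class. This is not the authors' implementation, and the image sizes are assumed.

```python
import time
import numpy as np
import cv2  # assumes an opencv-contrib build with the nonfree xfeatures2d module

def time_surf(image: np.ndarray, hessian_threshold: float = 400.0) -> float:
    """Return the seconds SURF takes to detect and describe one image on the CPU."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=hessian_threshold)
    start = time.perf_counter()
    surf.detectAndCompute(image, None)
    return time.perf_counter() - start

# Synthetic grayscale frames at two assumed sizes, standing in for robot camera images.
for size in [(240, 320), (480, 640)]:
    frame = np.random.randint(0, 256, size, dtype=np.uint8)
    print(size, f"{time_surf(frame):.3f} s")
```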

Moving Path Following of Autonomous Mobile Robot using Neural Network (신경망을 이용한 자율이동로봇의 이동 경로 추종)

  • 주기세
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.4 no.3
    • /
    • pp.585-594
    • /
    • 2000
  • Exact path following by an autonomous mobile robot in a factory or other unreliable environment has many disadvantages when a classical control algorithm is used. In this paper, a neural network control approach based on the error back-propagation algorithm is proposed for controlling a mobile robot so that it follows a line installed on the road. Since the input patterns include not only the three readings from the three sensors attached to the mobile robot but also ten detailed values for the non-recognition area between them, the mobile robot follows the installed line smoothly even across regions the sensors do not directly perceive. The mobile robot minimizes the following error within a short time on the way to its destination. To test the effectiveness of the proposed controller, the changes in the two motor velocities caused by changes in the robot's heading angle are simulated on a computer.
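
A minimal error back-propagation network of the kind described here, mapping a few line-sensor inputs to two wheel-velocity outputs, could look like the sketch below; the layer sizes, training pairs and learning rate are assumed for illustration and are not the paper's network.

```python
import numpy as np

rng = np.random.default_rng(0)
# Assumed sizes: 3 line-sensor inputs, 8 hidden units, 2 wheel-velocity outputs.
W1, b1 = rng.normal(0, 0.5, (3, 8)), np.zeros(8)
W2, b2 = rng.normal(0, 0.5, (8, 2)), np.zeros(2)

def forward(x):
    h = np.tanh(x @ W1 + b1)
    return h, h @ W2 + b2

# Hypothetical training pairs: sensor pattern -> (left, right) wheel velocity.
X = np.array([[1, 0, 0], [0, 1, 0], [0, 0, 1]], dtype=float)
Y = np.array([[0.2, 0.8], [0.5, 0.5], [0.8, 0.2]])

lr = 0.05
for _ in range(5000):                      # plain batch gradient descent
    h, out = forward(X)
    err = out - Y                          # dLoss/dout for squared error
    dW2, db2 = h.T @ err, err.sum(0)
    dh = (err @ W2.T) * (1 - h ** 2)       # back-propagate through tanh
    dW1, db1 = X.T @ dh, dh.sum(0)
    W1 -= lr * dW1; b1 -= lr * db1
    W2 -= lr * dW2; b2 -= lr * db2

print(forward(np.array([1.0, 0.0, 0.0]))[1])   # should approach [0.2, 0.8]
```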


Map-Building and Position Estimation based on Multi-Sensor Fusion for Mobile Robot Navigation in an Unknown Environment (이동로봇의 자율주행을 위한 다중센서융합기반의 지도작성 및 위치추정)

  • Jin, Tae-Seok;Lee, Min-Jung;Lee, Jang-Myung
    • Journal of Institute of Control, Robotics and Systems
    • /
    • v.13 no.5
    • /
    • pp.434-443
    • /
    • 2007
  • Presently, the exploration of unknown environments is an important task for the new generation of mobile service robots, and mobile robots are navigated by a number of methods using sensing systems such as sonar or vision. To fully utilize the strengths of both the sonar and visual sensing systems, this paper presents a technique for localizing a mobile robot using fused data from multiple ultrasonic sensors and a vision system. The mobile robot is designed to operate in a well-structured environment that can be represented by planes, edges, corners and cylinders as structural features. For ultrasonic sensors, these features yield range information in the form of circular arcs, generally called RCDs (Regions of Constant Depth). Localization is the continual provision of knowledge of position, deduced from the robot's a priori position estimate. The environment of the robot is modeled as a two-dimensional grid map. We define a vision-based environment recognition scheme and a physically-based sonar sensor model, and employ an extended Kalman filter to estimate the position of the robot. The performance and simplicity of the approach are demonstrated with the results of sets of experiments using a mobile robot.
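
The extended Kalman filter step mentioned in the abstract can be sketched as a predict/update cycle on the robot pose (x, y, theta); in the sketch below, the motion noise, measurement noise and known landmark position are assumed example values rather than quantities from the paper, and the measurement is a single range to that landmark.

```python
import numpy as np

def ekf_step(mu, Sigma, u, z, landmark, R, Q):
    """One EKF predict/update for pose mu = [x, y, theta].

    u = (ds, dtheta): odometry increment; z: measured range to a known
    landmark; R: motion noise covariance; Q: range measurement variance.
    """
    x, y, th = mu
    ds, dth = u
    # Predict with the differential-drive motion model.
    mu_bar = np.array([x + ds * np.cos(th), y + ds * np.sin(th), th + dth])
    G = np.array([[1, 0, -ds * np.sin(th)],
                  [0, 1,  ds * np.cos(th)],
                  [0, 0,  1]])
    Sigma_bar = G @ Sigma @ G.T + R
    # Update with a range measurement to the known landmark.
    dx, dy = landmark[0] - mu_bar[0], landmark[1] - mu_bar[1]
    z_hat = np.sqrt(dx**2 + dy**2)
    H = np.array([[-dx / z_hat, -dy / z_hat, 0.0]])
    S = H @ Sigma_bar @ H.T + Q
    K = Sigma_bar @ H.T @ np.linalg.inv(S)
    mu_new = mu_bar + (K @ np.array([z - z_hat])).ravel()
    Sigma_new = (np.eye(3) - K @ H) @ Sigma_bar
    return mu_new, Sigma_new

# Assumed example values for noise, odometry increment and landmark position.
mu, Sigma = np.zeros(3), np.eye(3) * 0.01
R, Q = np.diag([0.01, 0.01, 0.005]), np.array([[0.04]])
mu, Sigma = ekf_step(mu, Sigma, u=(0.5, 0.1), z=2.3, landmark=(2.0, 1.0), R=R, Q=Q)
print(mu)
```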