• Title/Summary/Keyword: sensor-fusion technique

On the multi-sensor systems for the second generation robots (제2세대 로보트를 위한 다중센서 시스템에 관하여)

  • 도용태
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings (제어로봇시스템학회 학술대회논문집)
    • /
    • 1992.10a
    • /
    • pp.792-796
    • /
    • 1992
  • Readings from robotic sensors are inherently somewhat uncertain. This uncertainty makes it difficult to employ sensor-feedback-controlled robots widely in real industrial sites. In this paper, redundant sensor fusion techniques are discussed to effectively overcome sensor uncertainty, and a weighted averaging technique is proposed for both static and dynamic sensing environments. The proposed technique is tested in experiments on stereoscopic 3D position measurement.
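The abstract does not specify the weighting rule, but a common form of redundant-sensor weighted averaging is inverse-variance weighting, in which less noisy sensors receive larger weights. A minimal sketch (the readings and variances below are made-up illustrative values, not the paper's data):

```python
import numpy as np

def weighted_average_fusion(readings, variances):
    """Fuse redundant sensor readings by inverse-variance weighting.

    Sensors with smaller variance (less noise) receive larger weights.
    Returns the fused estimate and the variance of that estimate.
    """
    readings = np.asarray(readings, dtype=float)
    weights = 1.0 / np.asarray(variances, dtype=float)
    fused = np.sum(weights * readings) / np.sum(weights)
    fused_var = 1.0 / np.sum(weights)   # always <= the best single sensor
    return fused, fused_var

# Three redundant distance sensors measuring the same target (mm)
est, var = weighted_average_fusion([101.0, 99.5, 100.2], [4.0, 1.0, 2.0])
```

Note that the fused variance is smaller than any individual sensor's variance, which is the basic payoff of redundant fusion.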

Fuzzy Neural Network Based Sensor Fusion and Its Application to Mobile Robot in Intelligent Robotic Space

  • Jin, Tae-Seok;Lee, Min-Jung;Hashimoto, Hideki
    • International Journal of Fuzzy Logic and Intelligent Systems
    • /
    • v.6 no.4
    • /
    • pp.293-298
    • /
    • 2006
  • In this paper, a sensor fusion based robot navigation method for the autonomous control of a miniature human-interaction robot is presented. The navigation method blends the optimality of the Fuzzy Neural Network (FNN) based control algorithm with the knowledge-representation and learning capabilities of the networked Intelligent Robotic Space (IRS). States of the robot and the IR space, for example the distance between the mobile robot and obstacles and the velocity of the mobile robot, are used as inputs to the fuzzy logic controller. The navigation strategy is based on a combination of fuzzy rules tuned for both goal approach and obstacle avoidance. To identify the environment, a sensor fusion technique is introduced in which the sensory data of ultrasonic sensors and a vision sensor are fused in the identification process. Preliminary experiments and results are shown to demonstrate the merit of the introduced navigation control algorithm.
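As an illustration of the kind of rule blending described, and not the paper's actual FNN controller, a minimal sketch of fuzzy-weighted blending of a goal-approach heading and an obstacle-avoidance heading (the membership shape and ranges are assumptions):

```python
def fuzzy_blend_heading(goal_heading, avoid_heading, obstacle_dist,
                        near_range=1.5):
    """Blend goal-approach and obstacle-avoidance headings (radians).

    A single fuzzy rule pair:
      IF obstacle is NEAR THEN steer toward avoid_heading
      IF obstacle is FAR  THEN steer toward goal_heading
    The 'near' membership decays linearly with distance (meters); the
    output is the weighted (defuzzified) heading.
    """
    near = max(0.0, 1.0 - obstacle_dist / near_range)
    far = 1.0 - near
    return (near * avoid_heading + far * goal_heading) / (near + far)

# Obstacle far away: the commanded heading follows the goal direction
h = fuzzy_blend_heading(goal_heading=0.3, avoid_heading=-0.8,
                        obstacle_dist=5.0)
```

With a close obstacle the same rule pair shifts the output toward the avoidance heading, which is the qualitative behaviour the tuned rule base provides.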

Pose Tracking of Moving Sensor using Monocular Camera and IMU Sensor

  • Jung, Sukwoo;Park, Seho;Lee, KyungTaek
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.15 no.8
    • /
    • pp.3011-3024
    • /
    • 2021
  • Pose estimation of a sensor is an important issue in many applications such as robotics, navigation, tracking, and Augmented Reality. This paper proposes a visual-inertial integration system appropriate for dynamically moving sensors. The orientation estimated from an Inertial Measurement Unit (IMU) is used to calculate the essential matrix based on the intrinsic parameters of the camera. Using epipolar geometry, outliers among the feature point matches are eliminated in the image sequences. The IMU orientation helps to eliminate erroneous point matches in images of dynamic scenes at an early stage. After the outliers are removed, the selected feature point matches are used to calculate a precise fundamental matrix, from which the pose of the sensor is finally estimated. The proposed procedure was implemented and tested in comparison with existing methods, and the experimental results show the effectiveness of the proposed technique.
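The IMU-aided outlier rejection can be illustrated with the epipolar constraint x2' E x1 = 0, where E = [t]x R is built from the IMU-estimated rotation R and a translation direction t. A minimal sketch (assuming normalized image coordinates and a known translation direction; the threshold is illustrative):

```python
import numpy as np

def skew(t):
    """Skew-symmetric matrix so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def epipolar_inliers(pts1, pts2, R, t, thresh=1e-2):
    """Flag matches consistent with the epipolar constraint x2' E x1 = 0.

    E = [t]x R uses the IMU-estimated rotation R and translation
    direction t; pts1/pts2 are Nx2 normalized image coordinates.
    """
    E = skew(t) @ R
    x1 = np.hstack([pts1, np.ones((len(pts1), 1))])  # homogeneous coords
    x2 = np.hstack([pts2, np.ones((len(pts2), 1))])
    err = np.abs(np.einsum('ij,jk,ik->i', x2, E, x1))  # |x2_i' E x1_i|
    return err < thresh

# Pure x-translation, identity rotation: inlier matches keep the same y
mask = epipolar_inliers(np.array([[0.1, 0.2], [0.3, -0.1]]),
                        np.array([[0.2, 0.2], [0.4, 0.5]]),
                        np.eye(3), np.array([1.0, 0.0, 0.0]))
```

Matches that violate the constraint (here, the second one, whose y coordinate jumps) are flagged as outliers before the fundamental matrix is refined.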

Robust Data, Event, and Privacy Services in Real-Time Embedded Sensor Network Systems (실시간 임베디드 센서 네트워크 시스템에서 강건한 데이터, 이벤트 및 프라이버시 서비스 기술)

  • Jung, Kang-Soo;Kapitanova, Krasimira;Son, Sang-H.;Park, Seog
    • Journal of KIISE: Databases
    • /
    • v.37 no.6
    • /
    • pp.324-332
    • /
    • 2010
  • The majority of event detection in real-time embedded sensor network systems is based on data fusion that uses noisy sensor data collected from complicated real-world environments. Current research has produced several excellent low-level mechanisms to collect sensor data and perform aggregation. However, solutions that enable these systems to process readings from heterogeneous sensors and detect complex events of interest in real time require further research. We are developing real-time event detection approaches that allow lightweight data fusion and do not require significant computing resources. Underlying the event detection framework is a collection of real-time monitoring and fusion mechanisms that are invoked upon the arrival of sensor data. The combination of these mechanisms and the framework has the potential to significantly improve the timeliness and reduce the resource requirements of embedded sensor networks. In addition, we discuss privacy, a foundational requirement for trusted embedded sensor network systems, and explain an anonymization technique for ensuring it.

Localization and Control of an Outdoor Mobile Robot Based on an Estimator with Sensor Fusion (센서 융합기반의 추측항법을 통한 야지 주행 이동로봇의 위치 추정 및 제어)

  • Jeon, Sang Woon;Jeong, Seul
    • IEMEK Journal of Embedded Systems and Applications
    • /
    • v.4 no.2
    • /
    • pp.69-78
    • /
    • 2009
  • Localization is a very important technique for a mobile robot navigating in outdoor environments. In this paper, the development of a sensor fusion algorithm for controlling mobile robots in outdoor environments is presented. A multi-sensor dead-reckoning subsystem is established based on optimal filtering: heading-angle readings from a magnetic compass, a rate gyro, and two encoders mounted on the robot wheels are first fused to compute the dead-reckoned location. These data and the position data provided by a global sensing system are then fused by means of an extended Kalman filter. The proposed algorithm is validated by simulation studies in which the mobile robot is driven by a backstepping controller and a cascaded controller, and the performance of each controller is compared.
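The dead-reckoning step described above (a fused heading plus wheel-encoder distances) can be sketched for a differential-drive robot. This is an illustrative odometry update under a midpoint-integration assumption, not the paper's full EKF:

```python
import math

def dead_reckon(x, y, theta, d_left, d_right, wheel_base):
    """Propagate a differential-drive pose from encoder increments.

    d_left/d_right are wheel travel distances (m) since the last step;
    the heading change could instead come from the fused compass/gyro
    heading estimate. Returns the updated pose (x, y, theta).
    """
    d = 0.5 * (d_left + d_right)                 # distance of the midpoint
    dtheta = (d_right - d_left) / wheel_base     # heading change
    theta_mid = theta + 0.5 * dtheta             # midpoint integration
    x += d * math.cos(theta_mid)
    y += d * math.sin(theta_mid)
    theta += dtheta
    return x, y, theta

# Both wheels advance 10 cm: the robot drives straight along its heading
x, y, th = dead_reckon(0.0, 0.0, 0.0, d_left=0.10, d_right=0.10,
                       wheel_base=0.5)
```

An EKF then corrects the accumulated drift of this dead-reckoned pose whenever the global sensing system supplies an absolute position.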

Polarity Index Dependence of M13 Bacteriophage-based Nanostructure for Structural Color-based Sensing

  • Lee, Yujin;Moon, Jong-Sik;Kim, Kyujung;Oh, Jin-Woo
    • Current Optics and Photonics
    • /
    • v.1 no.1
    • /
    • pp.12-16
    • /
    • 2017
  • Color sensor systems based on the M13 bacteriophage are being actively researched. Although many studies on M13 bacteriophage-based chemical sensing of TNT, endocrine-disrupting chemicals, and antibiotics have been undertaken, the fundamental physical and chemical properties of M13 bacteriophage-based nanostructures require further research. A simple M13 bacteriophage-based colorimetric sensor was fabricated by a simple pulling technique, and the M13 bacteriophage was genetically engineered using a phage display technique to exhibit a negatively charged surface. Arrays of structurally and genetically modified M13 bacteriophage were found to determine the polarity indices of various alcohols. In this research, an M13 bacteriophage-based color sensor was used to detect various types of alcohols, including methanol, ethanol, and methanol/butanol mixtures, in order to investigate the polarity-related properties of the sensor. Studies of the fundamental chemical sensing properties of M13 bacteriophage-based nanostructures should lead to wider application of M13 bacteriophage-based colorimetric sensors.

Estimation of Train Position Using Sensor Fusion Technique (센서융합에 의한 열차위치 추정방법)

  • Yoon, H.S.;Park, T.H.;Yoon, Y.K.;Hwang, J.K.;Lee, J.H.
    • Proceedings of the KSR Conference
    • /
    • 2004.10a
    • /
    • pp.1205-1211
    • /
    • 2004
  • We propose a train position estimation method for an automatic train control system. The accurate train position should be continuously fed back to the control system for safe and efficient operation of trains on the railway. In this paper, we propose a sensor fusion method integrating the tachometer, the transponder, and the Doppler sensor to estimate the train position. The external sensors (transponder, Doppler sensor) are used to compensate for the error of the internal sensor (tachometer). A Kalman filter is also applied to reduce the measurement error of the sensors. Simulation results are then presented to verify the usefulness of the proposed method.
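The tachometer/transponder fusion can be sketched as a scalar Kalman filter: predict the position by integrating the tachometer speed, then correct with an absolute fix when the train passes a transponder. All numbers below are illustrative, not the paper's parameters:

```python
def train_position_step(p, P, speed, dt, q, fix=None, r=None):
    """One predict/update cycle of a scalar Kalman filter on position (m).

    p, P  : position estimate and its variance
    speed : tachometer speed (m/s), integrated over dt (process noise q)
    fix   : optional absolute position from a transponder passage
            (measurement noise r)
    """
    # Predict: dead-reckon from the internal sensor (tachometer)
    p = p + speed * dt
    P = P + q * dt
    # Update: an external sensor supplies an absolute fix
    if fix is not None:
        K = P / (P + r)          # Kalman gain
        p = p + K * (fix - p)
        P = (1.0 - K) * P
    return p, P

p, P = 1000.0, 4.0
p, P = train_position_step(p, P, speed=20.0, dt=1.0, q=0.5,
                           fix=1022.0, r=1.0)
```

Between transponders the variance P grows with the tachometer drift; each fix pulls the estimate toward the absolute position and shrinks P again.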

Efficient Kinect Sensor-Based Reactive Path Planning Method for Autonomous Mobile Robots in Dynamic Environments (키넥트 센서를 이용한 동적 환경에서의 효율적인 이동로봇 반응경로계획 기법)

  • Tuvshinjargal, Doopalam;Lee, Deok Jin
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.39 no.6
    • /
    • pp.549-559
    • /
    • 2015
  • In this paper, an efficient dynamic reactive motion planning method for an autonomous vehicle in a dynamic environment is proposed. The purpose of the proposed method is to improve the robustness of autonomous robot motion planning within dynamic, uncertain environments by integrating a virtual plane-based reactive motion planning technique with a sensor fusion-based obstacle detection approach. The dynamic reactive motion planning method assumes a local observer in the virtual plane, which allows the effective transformation of complex dynamic planning problems into simple stationary ones by providing the relative speed and orientation information between the robot and the obstacles. In addition, the sensor fusion-based obstacle detection technique allows the pose estimation of moving obstacles using a Kinect sensor and sonar sensors, thus improving the accuracy and robustness of the reactive motion planning approach. The performance of the proposed method was demonstrated through not only simulation studies but also field experiments with multiple moving obstacles in hostile dynamic environments.
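The idea of turning a moving obstacle into a stationary one can be illustrated by working in relative coordinates: in the obstacle's frame, the robot moves with the relative velocity and the obstacle is fixed. A minimal closest-approach check in that relative frame (an illustrative sketch, not the paper's virtual-plane planner):

```python
import math

def time_of_closest_approach(robot_pos, robot_vel, obs_pos, obs_vel):
    """Closest approach between robot and obstacle in relative coordinates.

    The relative position r = obs - robot and relative velocity
    v = robot_vel - obs_vel reduce the moving obstacle to a stationary
    point; straight-line geometry then gives the time and distance of
    closest approach.
    """
    rx, ry = obs_pos[0] - robot_pos[0], obs_pos[1] - robot_pos[1]
    vx, vy = robot_vel[0] - obs_vel[0], robot_vel[1] - obs_vel[1]
    v2 = vx * vx + vy * vy
    if v2 == 0.0:                              # no relative motion
        return float('inf'), math.hypot(rx, ry)
    t = max(0.0, (rx * vx + ry * vy) / v2)     # time of closest approach
    dmin = math.hypot(rx - vx * t, ry - vy * t)
    return t, dmin

# Robot heading +x at 1 m/s; obstacle 10 m ahead approaching at 1 m/s
t, dmin = time_of_closest_approach((0.0, 0.0), (1.0, 0.0),
                                   (10.0, 0.0), (-1.0, 0.0))
```

A reactive planner would treat a small closest-approach distance within its planning horizon as a collision threat and steer away in the relative frame.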

Real Time Motion Processing for Autonomous Navigation

  • Kolodko, J.;Vlacic, L.
    • International Journal of Control, Automation, and Systems
    • /
    • v.1 no.1
    • /
    • pp.156-161
    • /
    • 2003
  • An overview of our approach to autonomous navigation is presented, showing how motion information can be integrated into existing navigation schemes. Particular attention is given to our short-range motion estimation scheme, which utilises a number of unique assumptions regarding the nature of the visual environment, allowing a direct fusion of visual and range information. Graduated non-convexity is used to solve the resulting non-convex minimisation problem. Experimental results show the advantages of our fusion technique.