• Title/Abstract/Keyword: Sensor fusion

Search results: 421 items (processing time: 0.04 s)

이동 물체를 추적하기 위한 감각 운동 융합 시스템 설계 (The Sensory-Motor Fusion System for Object Tracking)

  • 이상희;위재우;이종호
    • 대한전기학회논문지:시스템및제어부문D / Vol. 52, No. 3 / pp. 181-187 / 2003
  • For moving objects observed with environmental sensors, such as an object-tracking mobile robot with audio and video sensors, the environmental information acquired from the sensors keeps changing as the objects move. In such cases, conventional control schemes show limited control performance due to their lack of adaptability and their system complexity, so sensory-motor systems, which can respond intuitively to various types of environmental information, are desirable. To improve robustness, it is also desirable to fuse two or more types of sensory information simultaneously. In this paper, based on Braitenberg's model, we propose a sensory-motor fusion system that can track moving objects adaptively under environmental changes. Because of its directly connected structure, the sensory-motor fusion system can control each motor simultaneously, and neural networks are used to fuse information from various types of sensors. Even if the system receives noisy information from one sensor, it still works robustly, because information from the other sensors compensates for the noisy information through sensor fusion. To examine its performance, the sensory-motor fusion model is applied to an object-tracking four-legged robot equipped with audio and video sensors. The experimental results show that the sensory-motor fusion system can track moving objects robustly with a simpler control mechanism than model-based control approaches.
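
As a rough illustration of the Braitenberg-style sensory-motor coupling described above, the sketch below maps fused audio and video bearing estimates directly to two wheel commands. The linear weighted fusion standing in for the paper's neural network, the weights, and the function names are assumptions for illustration only.

```python
import numpy as np

def fuse_and_drive(audio_bearing, video_bearing, w_audio=0.4, w_video=0.6):
    """Fuse two bearing estimates (radians, 0 = straight ahead, negative = left)
    and map the result directly to left/right wheel commands, Braitenberg style.
    The fusion weights are illustrative assumptions."""
    # Weighted sum stands in for the paper's neural-network fusion stage.
    bearing = w_audio * audio_bearing + w_video * video_bearing
    base_speed = 0.5                       # constant forward drive component
    turn = np.clip(bearing, -1.0, 1.0)     # bounded steering term
    left_motor = base_speed + 0.5 * turn   # target to the left -> left wheel slows
    right_motor = base_speed - 0.5 * turn  # target to the left -> right wheel speeds up
    return left_motor, right_motor

# Example: both sensors place the target slightly to the left.
print(fuse_and_drive(audio_bearing=-0.2, video_bearing=-0.3))
```

With a negative (leftward) fused bearing, the right wheel runs faster than the left, turning the robot toward the target without any explicit world model.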

Sensor Data Fusion for Navigation of Mobile Robot With Collision Avoidance and Trap Recovery

  • Jeon, Young-Su;Ahn, Byeong-Kyu;Kuc, Tae-Yong
    • 제어로봇시스템학회:학술대회논문집 / ICCAS 2003 / pp. 2461-2466 / 2003
  • This paper presents a simple sensor fusion algorithm using a neural network for navigation of mobile robots with obstacle avoidance and trap recovery. The multiple sensors feed their data to the input layer of the neural network, activating the input nodes. The sensors used include optical encoders, ultrasonic sensors, infrared sensors, a magnetic compass sensor, and GPS sensors. The proposed sensor fusion algorithm is combined with the VFH (Vector Field Histogram) algorithm for obstacle avoidance and the AGPM (Adaptive Goal Perturbation Method), which sets adaptive virtual goals to escape trap situations. The experimental results show that the proposed low-level fusion algorithm is effective for real-time navigation of a mobile robot.
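
The following sketch shows, under stated assumptions, how readings from heterogeneous sensors might be normalized and fed to the input layer of a small feed-forward network as a low-level fusion step. The layer sizes, normalization constants, and untrained random weights are illustrative, and the VFH/AGPM stages are not reproduced here.

```python
import numpy as np

def low_level_fusion(ultrasonic, infrared, compass_deg, goal_offset, W1, W2):
    """Normalize heterogeneous sensor readings, activate the input layer of a
    small feed-forward network, and return a steering bias in [-1, 1].
    Layer sizes and scaling constants are illustrative assumptions."""
    x = np.concatenate([
        np.asarray(ultrasonic) / 4.0,        # six ranges, normalized by a 4 m maximum
        np.asarray(infrared) / 0.8,          # five IR ranges, 0.8 m maximum
        [np.sin(np.radians(compass_deg)),    # heading encoded as sin/cos
         np.cos(np.radians(compass_deg))],
        np.asarray(goal_offset) / 10.0,      # (dx, dy) to goal from GPS/encoders
    ])
    hidden = np.tanh(W1 @ x)                 # hidden-layer activation
    return np.tanh(W2 @ hidden)[0]           # fused steering bias

# Untrained random weights, used only to show the data flow.
rng = np.random.default_rng(0)
W1 = rng.normal(size=(8, 15))   # 6 + 5 + 2 + 2 inputs -> 8 hidden units
W2 = rng.normal(size=(1, 8))
print(low_level_fusion([2.0] * 6, [0.5] * 5, 45.0, [3.0, -1.0], W1, W2))
```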

다중센서 기반 차선정보 시공간 융합기법 (Lane Information Fusion Scheme using Multiple Lane Sensors)

  • 이수목;박기광;서승우
    • 전자공학회논문지 / Vol. 52, No. 12 / pp. 142-149 / 2015
  • A lane detection system based on a single camera sensor is vulnerable to abrupt illumination changes and adverse weather conditions. Sensor fusion is one way to stabilize performance and overcome the limitations of such a single-sensor system. However, most existing sensor fusion studies are restricted to fusion models for objects and vehicles, which makes them hard to adopt for lanes, and most do not account for the differing signal periods and recognition ranges of lane sensors. This study therefore proposes a scheme that optimally fuses lane information while taking the heterogeneity of multiple sensors into account. The proposed fusion framework considers each sensor's variable signal-processing period and reliable recognition range, so precise fusion is possible for various combinations of lane sensors. In addition, a new lane prediction model is proposed that predicts fine-grained lane information from intermittently arriving lane data, synchronizing signals with different periods. Experiments under poor illumination and a quantitative evaluation verify that the proposed fusion system improves recognition performance compared with a single sensor.
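
A minimal sketch of the multi-rate synchronization idea: each lane sensor's latest measurement is propagated to a common fusion time with a simple motion model and then combined by inverse-variance weighting. The motion model, variance inflation, and sensor rates are assumptions, not the paper's prediction model.

```python
from dataclasses import dataclass

@dataclass
class LaneMeasurement:
    t: float          # timestamp [s]
    offset: float     # lateral offset to lane center [m]
    heading: float    # heading relative to the lane [rad]
    var: float        # measurement variance (sensor confidence)

def predict(meas: LaneMeasurement, t_now: float, speed: float, yaw_rate: float) -> LaneMeasurement:
    """Propagate an intermittent lane measurement to the common fusion time
    with a constant-motion model (an assumption, not the paper's model)."""
    dt = t_now - meas.t
    heading = meas.heading - yaw_rate * dt       # lane heading relative to the vehicle
    offset = meas.offset + speed * meas.heading * dt
    var = meas.var + 0.05 * dt                   # inflate uncertainty with age
    return LaneMeasurement(t_now, offset, heading, var)

def fuse(measurements, t_now, speed, yaw_rate):
    """Inverse-variance weighted fusion of lane estimates from sensors running
    at different rates, after synchronizing them to t_now."""
    preds = [predict(m, t_now, speed, yaw_rate) for m in measurements]
    w = [1.0 / p.var for p in preds]
    offset = sum(wi * p.offset for wi, p in zip(w, preds)) / sum(w)
    heading = sum(wi * p.heading for wi, p in zip(w, preds)) / sum(w)
    return offset, heading

cam = LaneMeasurement(t=0.95, offset=0.30, heading=0.01, var=0.02)    # e.g. 20 Hz camera
other = LaneMeasurement(t=0.80, offset=0.25, heading=0.02, var=0.01)  # e.g. 5 Hz sensor
print(fuse([cam, other], t_now=1.00, speed=15.0, yaw_rate=0.0))
```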

AVM 카메라와 융합을 위한 다중 상용 레이더 데이터 획득 플랫폼 개발 (Development of Data Logging Platform of Multiple Commercial Radars for Sensor Fusion With AVM Cameras)

  • 진영석;전형철;신영남;현유진
    • 대한임베디드공학회논문지 / Vol. 13, No. 4 / pp. 169-178 / 2018
  • Currently, various sensors are used for advanced driver assistance systems. To overcome the limitations of individual sensors, sensor fusion has recently attracted attention in the field of intelligent vehicles, and vision-radar sensor fusion has become a popular approach. A typical fusion method uses a vision sensor to recognize targets within ROIs (Regions Of Interest) generated by radar sensors. Because AVM (Around View Monitor) cameras, with their wide-angle lenses, have limited detection performance at close range and near the edges of the field of view, accurate ROI extraction from the radar sensors is essential for high-performance fusion of AVM cameras and radar sensors. To address this, we propose a sensor fusion scheme based on commercial radar modules from Delphi. First, we configured a data logging system for multiple radars together with AVM cameras. We also designed radar post-processing algorithms to extract accurate ROIs. Finally, using the developed hardware and software platforms, we verified the post-processing algorithm in indoor and outdoor environments.
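
To show what ROI extraction for an AVM image could look like, the sketch below maps a radar detection (range, azimuth) to a rectangular ROI in a top-down around-view image. The pixel scale, image size, and the assumption that the radar and AVM origins coincide are illustrative, not the platform's actual calibration.

```python
import math

def radar_target_to_avm_roi(range_m, azimuth_deg,
                            px_per_m=20.0, img_w=800, img_h=800,
                            roi_half_m=0.5):
    """Map a radar detection (range, azimuth in the radar frame) to a rectangular
    ROI in a top-down AVM image (vehicle at the center, forward = up).
    Scale, image size, and frame alignment are illustrative assumptions."""
    x = range_m * math.cos(math.radians(azimuth_deg))   # forward distance [m]
    y = range_m * math.sin(math.radians(azimuth_deg))   # lateral distance [m]
    u = img_w / 2 + y * px_per_m                        # image column
    v = img_h / 2 - x * px_per_m                        # image row (up = forward)
    half = roi_half_m * px_per_m
    return (int(u - half), int(v - half), int(u + half), int(v + half))

print(radar_target_to_avm_roi(range_m=5.0, azimuth_deg=10.0))
```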

센서융합 검증을 위한 실시간 모니터링 및 검증 도구 개발 (Development of a Monitoring and Verification Tool for Sensor Fusion)

  • 김현우;신승환;배상진
    • 한국자동차공학회논문집 / Vol. 22, No. 3 / pp. 123-129 / 2014
  • SCC (Smart Cruise Control) and AEBS (Autonomous Emergency Braking System) use various types of sensor data, so the reliability of that data must be considered. In this paper, data from radar and vision sensors are fused by applying a Bayesian sensor fusion technique to improve reliability. The paper then presents a sensor fusion verification tool developed to monitor the acquired sensor data and to verify the sensor fusion results efficiently. A parallel computing method was applied to reduce the verification time, and a series of simulation results is discussed in detail.
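
A minimal sketch of Bayesian fusion of two noisy measurements of the same quantity as a product of Gaussians (inverse-variance weighting); the measurement variances are assumed values, and the paper's actual formulation may differ.

```python
def bayesian_fuse(z_radar, var_radar, z_vision, var_vision):
    """Fuse two noisy measurements of the same quantity (e.g., longitudinal
    distance to the lead vehicle) as a product of Gaussians."""
    w_r = 1.0 / var_radar
    w_v = 1.0 / var_vision
    fused = (w_r * z_radar + w_v * z_vision) / (w_r + w_v)
    fused_var = 1.0 / (w_r + w_v)          # always smaller than either input variance
    return fused, fused_var

# Radar is assumed accurate in range, vision less so.
print(bayesian_fuse(z_radar=25.3, var_radar=0.1, z_vision=26.0, var_vision=1.0))
```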

센서융합을 이용한 부정지형 적응형 이동로봇의 장애물 회피 (Sensor Fusion based Obstacle Avoidance for Terrain-Adaptive Mobile Robot)

  • 육경환;양현석;박노철;이상원
    • 제어로봇시스템학회논문지 / Vol. 13, No. 2 / pp. 93-100 / 2007
  • Mobile robots for rescuing lives in disaster areas and for planetary exploration demand high mobility as well as recognition of the environment. To avoid unknown obstacles accurately in an unknown environment, accurate sensing is required. This paper proposes a sensor fusion scheme that recognizes unknown obstacles accurately using low-cost sensors. Ultrasonic sensors and infrared sensors are used to avoid obstacles. If only one of these sensor types is used alone, it is not sufficient for the mobile robot to complete its tasks in the real world, because the surrounding environment is complex and composed of many kinds of materials: an infrared sensor may not recognize transparent or reflective obstacles, and an ultrasonic sensor may not recognize narrow obstacles, for example, columns of small diameter. Therefore, six ultrasonic sensors and five infrared sensors are selected to detect obstacles, and the ultrasonic sensors are fused with the infrared sensors so that the advantages of each sensor compensate for the disadvantages of the other. In fusing the sensors, a fuzzy algorithm is used to cope with the uncertainties of each sensor. TAMRY, a terrain-adaptive mobile robot, is used for the experiments.
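
As a hedged illustration of confidence-weighted fusion of ultrasonic and infrared range readings, the sketch below uses crude confidence functions in place of full fuzzy membership functions and rules; all breakpoints and weights are assumptions.

```python
def confidence_ir(d):
    """Confidence of an IR range reading: assumed reliable roughly 0.1-0.8 m,
    poor for transparent/reflective surfaces and outside that band."""
    return 1.0 if 0.1 <= d <= 0.8 else 0.1

def confidence_us(d):
    """Confidence of an ultrasonic reading: assumed unreliable very close
    (transducer ringing) and for narrow objects."""
    return 0.4 if d < 0.3 else 0.9

def fuzzy_fuse_range(d_us, d_ir):
    """Weight each sensor's reading by its confidence and normalize, a simple
    stand-in for the fuzzy fusion rules used in the paper."""
    w_us, w_ir = confidence_us(d_us), confidence_ir(d_ir)
    return (w_us * d_us + w_ir * d_ir) / (w_us + w_ir)

print(fuzzy_fuse_range(d_us=0.6, d_ir=0.55))
```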

Collaborative Wireless Sensor Networks for Target Detection Based on the Generalized Approach to Signal Processing

  • Kim, Jai-Hoon;Tuzlukov, Vyacheslav;Yoon, Won-Sik;Kim, Yong-Deak
    • 제어로봇시스템학회:학술대회논문집 / ICCAS 2005 / pp. 1999-2005 / 2005
  • Collaboration in wireless sensor networks must be fault-tolerant due to the harsh environmental conditions in which such networks can be deployed. This paper focuses on finding signal processing algorithms for collaborative target detection, based on the generalized approach to signal processing in the presence of noise, that are efficient in terms of communication cost, precision, accuracy, and the number of faulty sensors tolerable in the wireless sensor network. Two algorithms, value fusion and decision fusion, constructed according to the generalized approach to signal processing in the presence of noise, are identified first. Comparing their performance and communication overhead, decision fusion is found to become superior to value fusion as the ratio of faulty to fault-free sensors increases. Using the generalized approach to signal processing in the presence of noise when designing value and decision fusion algorithms in wireless sensor networks allows the same performance to be obtained, but at lower signal energy, than with the signal processing algorithms universally adopted in practice.
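
The contrast between value fusion and decision fusion can be sketched as follows; the readings, thresholds, and k-out-of-n rule are illustrative and do not reproduce the generalized signal processing approach itself.

```python
import statistics

def value_fusion(readings, threshold):
    """Value fusion: sensors report raw energy values; the fusion center
    averages them and applies a single detection threshold."""
    return statistics.mean(readings) > threshold

def decision_fusion(readings, local_threshold, k):
    """Decision fusion: each sensor makes a local binary decision; the fusion
    center declares a target if at least k sensors agree (k-out-of-n rule)."""
    votes = sum(1 for r in readings if r > local_threshold)
    return votes >= k

# No target is present, but one sensor is faulty (stuck at a high value).
readings = [0.2, 0.1, 0.3, 0.2, 5.0]
print(value_fusion(readings, threshold=1.0))                  # True: the faulty value skews the mean
print(decision_fusion(readings, local_threshold=1.0, k=3))    # False: one bad vote is tolerated
```

The example mirrors the abstract's observation that decision fusion degrades more gracefully as the fraction of faulty sensors grows.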

스마트팩토리 실현을 위한 다중센서기반 모바일로봇의 위치 및 자세제어에 관한 연구 (A Study on Orientation and Position Control of Mobile Robot Based on Multi-Sensor Fusion for Implementation of Smart FA)

  • 동근한;김희진;배호영;김상현;백영태;한성현
    • 한국산업융합학회 논문집 / Vol. 22, No. 2 / pp. 209-218 / 2019
  • This study proposes a new approach to orientation and position control of a mobile robot system for the implementation of smart factory automation, based on obstacle avoidance through multi-sensor fusion and autonomous travelling control. The focus is on controlling the mobile robot through a multiple-sensor module for autonomous travelling and obstacle avoidance; the module consists of sonar sensors, PSD sensors, color recognition sensors, and position recognition sensors. Two points are addressed for the real-time implementation of autonomous travelling control in constrained manufacturing environments. One is the development of a travelling trajectory control algorithm that is accurate and fast while taking constraints such as uncertain nonlinear dynamic effects into account. The other is the real-time implementation of obstacle avoidance and autonomous travelling control based on the multiple sensors. The reliability of this study is illustrated by computer simulations and experiments on autonomous travelling control and obstacle avoidance.

Suction Detection in Left Ventricular Assist System: Data Fusion Approach

  • Park, Seongjin
    • International Journal of Control, Automation, and Systems / Vol. 1, No. 3 / pp. 368-375 / 2003
  • A data fusion approach is investigated to avoid suction in a left ventricular assist system (LVAS) using a nonpulsatile pump. An LVAS requires careful control of pump speed to support the heart while preventing suction in the left ventricle and providing proper cardiac output at adequate perfusion pressure to the body. Since implanted sensors are usually unreliable for long-term use, a sensorless approach is adopted to detect suction. A pump model is developed to provide the load coefficient as the signal needed by the data fusion system without implanted sensors. The load coefficient of the pump mimics the pulsatility of the actual pump flow and provides more comparable information than the pump flow after suction occurs. Four signals are generated from the load coefficient as inputs to the data fusion system for suction detection, and a neural fuzzy method is implemented to construct the data fusion system. The data fusion approach classifies suction status well and can also be used to design a controller for the LVAS.
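
The abstract does not list the four signals derived from the load coefficient, so the sketch below extracts four stand-in features (mean, pulse amplitude, minimum, steepest drop) from a window of a load-coefficient signal, as inputs a suction classifier might use; both the feature choice and the synthetic signal are assumptions.

```python
import numpy as np

def suction_features(load_coeff, fs=100.0):
    """Derive four illustrative features from a window of the pump load
    coefficient sampled at fs Hz; these are stand-ins for the paper's signals."""
    x = np.asarray(load_coeff, dtype=float)
    dx = np.diff(x) * fs                     # time derivative of the load coefficient
    return {
        "mean": float(np.mean(x)),
        "amplitude": float(np.max(x) - np.min(x)),   # pulsatility of the window
        "minimum": float(np.min(x)),
        "steepest_drop": float(np.min(dx)),
    }

# A synthetic pulsatile window whose pulsatility is collapsing.
t = np.linspace(0, 1, 100)
window = 1.0 + 0.05 * np.sin(2 * np.pi * 1.2 * t) * np.exp(-2 * t)
print(suction_features(window))
```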

Control of the Mobile Robot Navigation Using a New Time Sensor Fusion

  • Tack, Han-Ho;Kim, Chang-Geun;Kim, Myeong-Kyu
    • International Journal of Fuzzy Logic and Intelligent Systems / Vol. 4, No. 1 / pp. 23-28 / 2004
  • This paper proposes a sensor fusion technique in which data sets from previous moments are properly transformed and fused into the current data sets to enable accurate measurements, such as the distance to an obstacle and the location of the service robot itself. In conventional fusion schemes, the measurement depends only on the current data sets; as a result, more sensors are required to measure a given physical parameter or to improve measurement accuracy. In this approach, instead of adding more sensors to the system, the temporal sequence of data sets is stored and utilized to improve the measurement. The theoretical basis is illustrated by examples, and the effectiveness is demonstrated through simulations. Finally, the new space and time sensor fusion (STSF) scheme is applied to the control of a mobile robot in both unstructured and structured environments.
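
The core step in reusing past data is transforming an old observation into the current robot frame with the odometry increment; a minimal sketch, assuming a planar rigid-body motion model, is given below.

```python
import math

def to_current_frame(point_prev, dx, dy, dtheta):
    """Transform a point observed in the robot frame at time t-1 into the
    current robot frame, given the odometry increment (dx, dy, dtheta) expressed
    in the previous frame. This lets old range measurements be fused with the
    current ones; the planar motion model is an illustrative assumption."""
    px, py = point_prev
    # Subtract the translation, then rotate by -dtheta into the current frame.
    x = math.cos(dtheta) * (px - dx) + math.sin(dtheta) * (py - dy)
    y = -math.sin(dtheta) * (px - dx) + math.cos(dtheta) * (py - dy)
    return x, y

# An obstacle seen 0.3 m straight ahead at the previous step; the robot then
# moved 0.1 m forward and turned 5 degrees to the left.
print(to_current_frame((0.3, 0.0), dx=0.1, dy=0.0, dtheta=math.radians(5)))
```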