• Title/Abstract/Keyword: Sensor fusion

815 results found (processing time: 0.03 s)

농업기계 내비게이션을 위한 INS/GPS 통합 연구 (Study on INS/GPS Sensor Fusion for Agricultural Vehicle Navigation System)

  • 노광모;박준걸;장영창
    • Journal of Biosystems Engineering / Vol. 33 No. 6 / pp.423-429 / 2008
  • This study investigated the effects of inertial navigation system (INS) / global positioning system (GPS) sensor fusion for agricultural vehicle navigation. An extended Kalman filter algorithm was adopted for INS/GPS sensor fusion in an integrated mode, and a vehicle dynamic model was used instead of a navigation state error model. The INS/GPS system consisted of a low-cost gyroscope, an odometer, and a GPS receiver, and its performance was tested through computer simulations. When the GPS receiver measurement noise was 10, 1.0, 0.5, and 0.2 m ($1{\sigma}$), the RMS position and heading errors of the INS/GPS system on a 5 m/s straight path were markedly reduced to 10%, 35%, 40%, and 60% of those obtained from the GPS receiver alone, respectively. The reduction in position and heading errors achieved by using INS/GPS rather than stand-alone GPS enables more stable steering of agricultural equipment. Therefore, a low-cost INS/GPS system using the extended Kalman filter algorithm may enable autonomous navigation that meets the required performance, such as stable steering and smaller position errors, even in slow-speed operation.
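As an editorial aid, here is a minimal sketch of the kind of extended Kalman filter loop the abstract describes: dead reckoning from a gyroscope/odometer, corrected by GPS position fixes. The state layout, kinematic model, and all noise values are illustrative assumptions, not the paper's vehicle dynamic model.

```python
import numpy as np

# Minimal EKF sketch: fuse odometer/gyro dead reckoning with GPS position fixes.
# State: [x, y, heading]; all noise values are illustrative, not from the paper.

def predict(x, P, v, w, dt, Q):
    """Propagate the state with a simple planar kinematic vehicle model."""
    theta = x[2]
    x_pred = x + np.array([v * np.cos(theta) * dt,
                           v * np.sin(theta) * dt,
                           w * dt])
    F = np.array([[1, 0, -v * np.sin(theta) * dt],
                  [0, 1,  v * np.cos(theta) * dt],
                  [0, 0, 1]])
    return x_pred, F @ P @ F.T + Q

def update_gps(x, P, z, R):
    """Correct the predicted state with a GPS (x, y) fix."""
    H = np.array([[1.0, 0, 0], [0, 1.0, 0]])
    y = z - H @ x                      # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)     # Kalman gain
    return x + K @ y, (np.eye(3) - K @ H) @ P

# Example: straight path at 5 m/s, GPS noise 1.0 m (1-sigma)
x, P = np.zeros(3), np.eye(3)
Q = np.diag([0.05, 0.05, 0.01])        # process noise (hypothetical)
R = np.diag([1.0**2, 1.0**2])          # GPS measurement noise
for k in range(100):
    x, P = predict(x, P, v=5.0, w=0.0, dt=0.1, Q=Q)
    z = np.array([5.0 * 0.1 * (k + 1), 0.0]) + np.random.randn(2) * 1.0
    x, P = update_gps(x, P, z, R)
print("fused position estimate:", x[:2])
```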

CCD카메라와 적외선 카메라의 융합을 통한 효과적인 객체 추적 시스템 (Efficient Object Tracking System Using the Fusion of a CCD Camera and an Infrared Camera)

  • 김승훈;정일균;박창우;황정훈
    • 제어로봇시스템학회논문지 / Vol. 17 No. 3 / pp.229-235 / 2011
  • To build a robust object tracking and identification system for an intelligent robot and/or home system, heterogeneous sensor fusion between a visible-ray system and an infrared-ray system is proposed. The proposed system separates the object by combining the ROIs (Regions of Interest) estimated from two different images obtained from a heterogeneous sensor that combines an ordinary CCD camera and an IR (infrared) camera. The human body and face are detected in both images using different algorithms, such as histogram, optical flow, skin-color model, and Haar model. The pose of the human body is also estimated from the body detection result in the IR image using a PCA algorithm together with an AdaBoost algorithm. The results from each detection algorithm are then fused to extract the best detection result. To verify the heterogeneous sensor fusion system, several experiments were performed in various environments. The experimental results indicate that the system has good tracking and identification performance regardless of environmental changes. The application area of the proposed system is not limited to robots or home systems; it also extends to surveillance and military systems.
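A small sketch of one way the ROI fusion described above could be realized: detections from the CCD and IR images, assumed to be registered into a common image frame, are merged by confidence-weighted averaging when they overlap. The overlap threshold and weighting rule are hypothetical, not the paper's algorithm.

```python
from dataclasses import dataclass

@dataclass
class ROI:
    x: float; y: float; w: float; h: float; score: float  # detection confidence

def iou(a: ROI, b: ROI) -> float:
    """Intersection-over-union of two boxes in a common image frame."""
    x1, y1 = max(a.x, b.x), max(a.y, b.y)
    x2, y2 = min(a.x + a.w, b.x + b.w), min(a.y + a.h, b.y + b.h)
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    union = a.w * a.h + b.w * b.h - inter
    return inter / union if union > 0 else 0.0

def fuse_rois(ccd: ROI, ir: ROI, iou_thresh: float = 0.3) -> ROI:
    """If the two detections overlap, average them weighted by confidence;
    otherwise keep the more confident one."""
    if iou(ccd, ir) < iou_thresh:
        return ccd if ccd.score >= ir.score else ir
    s = ccd.score + ir.score
    wc, wi = ccd.score / s, ir.score / s
    return ROI(wc * ccd.x + wi * ir.x, wc * ccd.y + wi * ir.y,
               wc * ccd.w + wi * ir.w, wc * ccd.h + wi * ir.h,
               max(ccd.score, ir.score))

# Example: a face ROI from the CCD image and a body-heat ROI from the IR image
print(fuse_rois(ROI(100, 80, 40, 60, 0.7), ROI(105, 78, 42, 64, 0.9)))
```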

퍼지 논리 융합과 반복적 Relaxation Labeling을 이용한 다중 센서 원격탐사 화상 분류 (Classification of Multi-sensor Remote Sensing Images Using Fuzzy Logic Fusion and Iterative Relaxation Labeling)

  • 박노욱;지광훈;권병두
    • 대한원격탐사학회지 / Vol. 20 No. 4 / pp.275-288 / 2004
  • This paper proposes a relaxation labeling method combined with fuzzy logic fusion for the classification of multi-sensor remote sensing images. Fuzzy logic was applied to fuse the multi-sensor remote sensing images, and an iterative relaxation labeling method was applied to fuse the spectral and spatial information. In particular, the iterative relaxation labeling method has the advantage of revealing how pixel classifications change as spatial information is incorporated. When the proposed method was applied to optical and multi-frequency/multi-polarization SAR images for supervised land-cover classification, improved classification accuracy was obtained by using the multi-sensor data together with the spatial information.
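For illustration, a minimal relaxation-labeling iteration on a per-pixel class probability map: each pixel's probabilities are reinforced by those of its 4-neighbourhood and renormalized. The compatibility model (plain neighbourhood averaging) and the update constant are assumptions, not the paper's formulation; the initial probabilities stand in for the output of the fuzzy fusion step.

```python
import numpy as np

def relaxation_labeling(prob, n_iter=10, beta=0.5):
    """prob: (H, W, C) initial class probabilities (e.g. from fuzzy fusion)."""
    p = prob.copy()
    for _ in range(n_iter):
        # Support from the 4-neighbourhood (wrap-around at the borders).
        neigh = (np.roll(p, 1, 0) + np.roll(p, -1, 0) +
                 np.roll(p, 1, 1) + np.roll(p, -1, 1)) / 4.0
        p = p * (1.0 - beta + beta * neigh)   # reinforce compatible labels
        p /= p.sum(axis=2, keepdims=True)     # renormalize per pixel
    return p

# Example: a 32x32 image with 3 candidate classes, random initial probabilities
rng = np.random.default_rng(0)
p0 = rng.dirichlet(np.ones(3), size=(32, 32))
labels = relaxation_labeling(p0).argmax(axis=2)
print(labels.shape)
```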

SPAD과 CNN의 특성을 반영한 ToF 센서와 스테레오 카메라 융합 시스템 (Fusion System of Time-of-Flight Sensor and Stereo Cameras Considering Single Photon Avalanche Diode and Convolutional Neural Network)

  • 김동엽;이재민;전세웅
    • 로봇학회논문지 / Vol. 13 No. 4 / pp.230-236 / 2018
  • 3D depth perception plays an important role in robotics, and many sensing methods have been proposed for it. The single photon avalanche diode (SPAD) is suggested as a photodetector for 3D sensing because of its sensitivity and accuracy. We have investigated applying a SPAD chip in our fusion system of a time-of-flight (ToF) sensor and a stereo camera. Our goal is to upsample the SPAD resolution using an RGB stereo camera. Our current SPAD ToF sensor has a resolution of 64 x 32, whereas higher-resolution depth sensors such as the Kinect V2 and Cube-Eye are available. This may be a weak point of our system, but we exploit this gap with a change of approach. A convolutional neural network (CNN) is designed to upsample our low-resolution depth map, using data from the higher-resolution depth sensors as label data. The CNN-upsampled depth data and the stereo camera depth data are then fused using the semi-global matching (SGM) algorithm. We propose a simplified fusion method designed for the embedded system.
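A rough sketch of the upsampling idea, assuming PyTorch: a small CNN maps the 64 x 32 ToF depth map to a 4x-larger grid and is trained against a higher-resolution depth map used as label data. The architecture, scale factor, and loss are illustrative placeholders, not the network described in the paper, and the subsequent SGM-based fusion is omitted.

```python
import torch
import torch.nn as nn

class DepthUpsampler(nn.Module):
    """Toy CNN that upsamples a 1-channel low-resolution depth map by `scale`."""
    def __init__(self, scale: int = 4):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, 32, 3, padding=1), nn.ReLU(),
            nn.Upsample(scale_factor=scale, mode="bilinear", align_corners=False),
            nn.Conv2d(32, 1, 3, padding=1),
        )

    def forward(self, low_res_depth):
        return self.net(low_res_depth)

model = DepthUpsampler()
low = torch.rand(1, 1, 32, 64)            # stand-in for a 64 x 32 SPAD depth map
high_label = torch.rand(1, 1, 128, 256)   # higher-resolution depth used as label
loss = nn.functional.l1_loss(model(low), high_label)
loss.backward()                           # one illustrative training step
print(model(low).shape)                   # torch.Size([1, 1, 128, 256])
```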

지능형 이족보행로봇을 위한 센서시스템 연구 (Sensor System Study for Intelligence Biped Walking Robot)

  • 김유신;황규득;최형식;이창만
    • 제어로봇시스템학회논문지 / Vol. 11 No. 1 / pp.67-76 / 2005
  • In this paper, an analysis of the intelligence system for a biped walking robot (BWR) was carried out and its results were applied to the BWR. Various sensors were applied to the developed BWR for autonomous and intelligent walking in unknown environments. An ultrasonic sensor and an infrared sensor were used to measure the distance between an object and the BWR, a vision system was used to identify the surrounding environment, a gyro sensor was used to control the posture of the BWR, and a piezoelectric sensor was used to measure the pressure of the foot landing on the surface. The sensors applied to the robot have measurement errors caused by noise or the walking environment. To improve their performance, the influence of noise and sensing errors was minimized using a sensor fusion scheme. A gait test using the sensor fusion system was performed, and its results are presented.
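The paper does not spell out its fusion equations in the abstract; as one common scheme for reducing the sensing noise mentioned above, the sketch below combines the ultrasonic and infrared distance readings by inverse-variance weighting. The noise variances are hypothetical calibration values.

```python
def fuse_distance(d_ultrasonic: float, var_ultrasonic: float,
                  d_infrared: float, var_infrared: float) -> float:
    """Minimum-variance combination of two noisy distance readings."""
    w_us = 1.0 / var_ultrasonic
    w_ir = 1.0 / var_infrared
    return (w_us * d_ultrasonic + w_ir * d_infrared) / (w_us + w_ir)

# Example: ultrasonic reads 1.02 m (sigma 5 cm), infrared reads 0.95 m (sigma 2 cm)
print(fuse_distance(1.02, 0.05**2, 0.95, 0.02**2))
```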

선삭공정시 공구파손의 실시간 검출에 관한 연구 (A Study on Real-time Monitoring of Tool Fracture in Turning)

  • 최덕기;주종남;이장무
    • 한국정밀공학회지 / Vol. 12 No. 3 / pp.130-143 / 1995
  • This paper presents a new methodology for on-line tool breakage detection by sensor fusion of an acoustic emission (AE) sensor and a built-in force sensor. A built-in piezoelectric force sensor, instead of a tool dynamometer, was used to measure the cutting force without altering the machine tool dynamics. The sensor was inserted in the tool turret housing of an NC lathe, and FEM analysis was carried out to locate the most sensitive position for it. A burst of the AE signal was used as a trigger to inspect the cutting force, and a significant drop in cutting force was used to detect tool breakage. The algorithm was implemented on a DSP board for in-process tool breakage detection. Experimental results showed that the proposed system has an excellent tool breakage monitoring capability.
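A compact sketch of the two-stage logic the abstract describes: an AE burst triggers an inspection of the cutting force, and a large relative drop in force is declared a breakage. The thresholds, window length, and synthetic signals are illustrative only.

```python
import numpy as np

def detect_breakage(ae, force, ae_thresh=3.0, drop_ratio=0.4, window=5):
    """ae, force: equally sampled signals. Returns the sample index at which a
    breakage is detected, or None."""
    ae = np.asarray(ae, dtype=float)
    force = np.asarray(force, dtype=float)
    for i in range(window, len(ae) - window):
        if ae[i] > ae_thresh:                          # AE burst: trigger
            before = force[i - window:i].mean()
            after = force[i:i + window].mean()
            if before > 0 and (before - after) / before > drop_ratio:
                return i                               # significant force drop
    return None

# Synthetic example: the force collapses at sample 60, coinciding with an AE spike
force = np.r_[np.full(60, 100.0), np.full(40, 10.0)] + np.random.randn(100)
ae = np.zeros(100); ae[60] = 5.0
print(detect_breakage(ae, force))
```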


다중 센서를 이용한 CNC 선반에서의 실시간 공구파손 감시에 관한 연구 (A Study on Real-time Tool Breakage Monitoring on CNC Lathe using Fusion Sensor)

  • 안영진;김재열
    • Tribology and Lubricants / Vol. 28 No. 3 / pp.130-135 / 2012
  • This study presents a new methodology for real-time tool breakage detection based on the sensor fusion of two Hall sensors and an acoustic emission (AE) sensor. The spindle induction motor torque of a CNC lathe during machining is estimated using the two Hall sensors. The estimated motor torque, instead of a tool dynamometer, was used to measure the cutting torque and detect tool breakage. A burst of the AE signal was used as a trigger to inspect the cutting torque, and a significant drop in cutting torque was used to detect tool breakage. The algorithm was implemented on an NI DAQ (data acquisition) board for in-process tool breakage detection. Experimental results showed that the proposed system has an excellent tool breakage monitoring capability. The system also supports tool breakage monitoring over the Internet and provides the user with the current cutting torque of the induction motor.
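The abstract does not detail how the torque is derived from the two Hall-sensor currents; one common simplification is sketched below: reconstruct the third phase current of a balanced machine, apply the Clarke transform, and use the resulting current magnitude as a torque proxy. The calibration constant is hypothetical, and the paper's actual estimator may differ.

```python
import math

def current_magnitude(i_a: float, i_b: float) -> float:
    """Stator current space-vector magnitude from two phase currents."""
    # Balanced three-phase assumption: i_c = -i_a - i_b
    i_alpha = i_a
    i_beta = (i_a + 2.0 * i_b) / math.sqrt(3.0)   # Clarke transform
    return math.hypot(i_alpha, i_beta)

def torque_proxy(i_a: float, i_b: float, k_t: float = 1.2) -> float:
    """Illustrative torque proxy; k_t is a hypothetical calibration constant."""
    return k_t * current_magnitude(i_a, i_b)

print(torque_proxy(3.0, -1.5))
```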

레이더와 비전 센서를 이용하여 선행차량의 횡방향 운동상태를 보정하기 위한 IMM-PDAF 기반 센서융합 기법 연구 (A Study on IMM-PDAF based Sensor Fusion Method for Compensating Lateral Errors of Detected Vehicles Using Radar and Vision Sensors)

  • 장성우;강연식
    • 제어로봇시스템학회논문지 / Vol. 22 No. 8 / pp.633-642 / 2016
  • It is important for advanced active safety systems and autonomous driving cars to obtain accurate estimates of nearby vehicles in order to increase safety and performance. This paper proposes a sensor fusion method for radar and vision sensors to accurately estimate the state of preceding vehicles. In particular, we studied how to compensate for the lateral state error of automotive radar sensors by using a vision sensor. The proposed method is based on the Interactive Multiple Model (IMM) algorithm, which stochastically integrates multiple Kalman filters with multiple models corresponding to a lateral-compensation mode and a radar-only mode. In addition, a Probabilistic Data Association Filter (PDAF) is utilized for data association to improve the reliability of the estimates in a cluttered radar environment. A two-step correction method is used in the Kalman filter, which efficiently associates both the radar and vision measurements into single state estimates. Finally, the proposed method is validated through off-line simulations using measurements obtained from a field test in an actual road environment.
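To illustrate the IMM idea in the abstract, the sketch below shows only the model-probability update and probability-weighted combination for two modes (e.g. a radar-only mode and a vision-compensated mode). The per-mode estimates, likelihoods, and transition matrix are placeholders, and the IMM mixing step and the PDAF data association are omitted for brevity.

```python
import numpy as np

def imm_combine(mu, trans, likelihoods, estimates):
    """mu: prior model probabilities (M,); trans: (M, M) Markov transition
    matrix; likelihoods: measurement likelihood of each mode;
    estimates: (M, n) per-mode state estimates.
    Returns updated model probabilities and the fused estimate."""
    mu_pred = trans.T @ mu                 # predicted model probabilities
    mu_post = mu_pred * likelihoods
    mu_post /= mu_post.sum()
    fused = mu_post @ estimates            # probability-weighted combination
    return mu_post, fused

mu = np.array([0.5, 0.5])
trans = np.array([[0.95, 0.05], [0.05, 0.95]])
likelihoods = np.array([0.2, 0.8])                 # compensated mode fits better
estimates = np.array([[10.0, 1.4], [10.0, 0.6]])   # [longitudinal, lateral] per mode
mu_new, fused = imm_combine(mu, trans, likelihoods, estimates)
print(mu_new, fused)
```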

명령융합과 퍼지기반의 지능형 시스템-이동로봇주행적용 (Intelligent System based on Command Fusion and Fuzzy Logic Approaches - Application to mobile robot navigation)

  • 진태석;김현덕
    • 한국정보통신학회논문지 / Vol. 18 No. 5 / pp.1034-1041 / 2014
  • This paper presents a fuzzy inference method for obstacle avoidance of a mobile robot equipped with an active camera. Using the vision sensor, command fusion based on situational judgment is applied so that the robot can intelligently navigate to its destination in an unknown environment. To validate this work, instead of attempting physical sensor fusion for path generation based on an environment model and sensor data, command fusion was applied to control the robot's driving behavior according to the environment. As a navigation strategy, goal seeking and obstacle avoidance were decided through a combination of fuzzy rules. Experimental results of successful navigation using image data are presented to verify the proposed method.
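A minimal command-fusion sketch in the spirit of the abstract: two behaviours (goal seeking and obstacle avoidance) each propose a steering command, and a simple fuzzy membership on the obstacle distance decides how the commands are blended. The membership shape, distance limits, and example values are hypothetical, not the paper's rule base.

```python
def mu_near(d: float, d0: float = 0.5, d1: float = 1.5) -> float:
    """Fuzzy membership 'obstacle is near': 1 below d0 metres, 0 above d1."""
    if d <= d0:
        return 1.0
    if d >= d1:
        return 0.0
    return (d1 - d) / (d1 - d0)

def fuse_commands(steer_goal: float, steer_avoid: float, obstacle_dist: float) -> float:
    """Blend the two steering commands by the 'near' membership."""
    w = mu_near(obstacle_dist)
    return w * steer_avoid + (1.0 - w) * steer_goal

# Example: goal lies to the left (-0.2 rad), avoidance suggests turning right (+0.6 rad)
print(fuse_commands(-0.2, 0.6, obstacle_dist=0.8))
```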

A Cyber-Physical Information System for Smart Buildings with Collaborative Information Fusion

  • Liu, Qing;Li, Lanlan
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 16 No. 5 / pp.1516-1539 / 2022
  • This article presents a cyber-physical information fusion IoT system that we designed for smart buildings. In essence, it is a computer system that combines physical quantities in buildings with quantitative analysis and control. On the Internet of Things side, it is controlled by a monitoring system based on sensor networks and computer-based algorithms. Following an agent-oriented design, we realized both human-machine interaction (HMI) and machine-machine interaction (MMI): HMI through the human-machine interface, and MMI through embedded computing, sensors, controllers, actuation devices, and a wireless communication network. This article mainly focuses on the role of wireless sensor networks and MMI in environmental monitoring, a function that is fundamental to building security, environmental control, HVAC, and other smart building control systems. The article not only discusses various network applications and their agent-based implementation but also demonstrates our collaborative information fusion strategy. When the physical measurements of the sensor system are unstable, this strategy provides stable inputs to the system through collaborative information fusion, thereby preventing the system jitter and unstable responses caused by uncertain disturbances and environmental factors. The article also reports system test results, which show that, through the CPS interaction of HMI and MMI, the smart building IoT system can achieve comprehensive monitoring, thereby providing support for, and extension of, advanced automation management.
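As a rough illustration of the jitter-suppression idea in the abstract, the sketch below fuses redundant readings from collaborating sensors with a median (robust to one unstable sensor) and applies a deadband so that small fluctuations do not disturb the control input. The rule and the values are assumptions, not the paper's fusion strategy.

```python
import statistics

def collaborative_fuse(readings: list[float], last_output: float,
                       deadband: float = 0.3) -> float:
    """Fuse redundant sensor readings; ignore small changes to avoid jitter."""
    fused = statistics.median(readings)       # robust to a single unstable sensor
    if abs(fused - last_output) < deadband:   # deadband suppresses chatter
        return last_output
    return fused

# Example: three temperature sensors in one zone, one of them glitching
print(collaborative_fuse([22.1, 22.3, 30.0], last_output=22.2))
```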