• Title/Abstract/Keyword: Sensor fusion

Search results: 820 (processing time: 0.03 s)

센서 융합을 이용한 MAF 공정 특성 분석 (Characterization of Magnetic Abrasive Finishing Using Sensor Fusion)

  • 김설빔;안병운;이성환
    • Transactions of the Korean Society of Mechanical Engineers A / Vol. 33, No. 5 / pp. 514-520 / 2009
  • In configuring an automated polishing system, a monitoring scheme that estimates the surface roughness is necessary. In this study, a precision polishing process, magnetic abrasive finishing (MAF), was investigated along with an in-process monitoring setup. A magnetic tool was attached to a CNC machining center to polish the surface of Stavax (S136) die steel workpieces. During the finishing experiments, both AE signals and force signals were sampled and analysed. The finishing results show that MAF has nanoscale finishing capability (down to 8 nm surface roughness) and that the sensor signals correlate strongly with process parameters such as the gap between the tool and the workpiece, the feed rate, and the abrasive size. In addition, the signals were used as input parameters of artificial neural networks to predict the generated surface roughness. Among the three networks constructed (AE rms input, force input, and AE+force input), the ANN with sensor fusion (AE+force) produced the most stable results. These results show that the proposed sensor fusion scheme is appropriate for monitoring and predicting this nanoscale precision finishing process.
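
As a rough illustration of the scheme described above, the sketch below trains a small regression network on fused AE-rms and force features to predict surface roughness; the feature set, synthetic data, and network size are assumptions for illustration, not the authors' configuration.

```python
# Minimal sketch (not the authors' code): a small regression network that maps
# fused AE-rms and force features to surface roughness, as the abstract describes.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical training set: each row = [AE_rms, force_mean, force_std],
# target = measured surface roughness Ra in nanometres (synthetic).
X = rng.uniform([0.1, 5.0, 0.5], [1.0, 30.0, 3.0], size=(200, 3))
y = 8.0 + 40.0 * X[:, 0] + 1.5 * X[:, 2] + rng.normal(0, 1.0, 200)

scaler = StandardScaler().fit(X)
ann = MLPRegressor(hidden_layer_sizes=(8,), max_iter=5000, random_state=0)
ann.fit(scaler.transform(X), y)

# Predict roughness for a new fused sensor reading (AE + force).
sample = np.array([[0.4, 12.0, 1.2]])
print("predicted Ra [nm]:", ann.predict(scaler.transform(sample))[0])
```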

Global Map Building and Navigation of Mobile Robot Based on Ultrasonic Sensor Data Fusion

  • Kang, Shin-Chul;Jin, Tae-Seok
    • International Journal of Fuzzy Logic and Intelligent Systems / Vol. 7, No. 3 / pp. 198-204 / 2007
  • In mobile robotics, ultrasonic sensors have become standard devices for collision avoidance, and their applicability to map building and navigation has been exploited in recent years. This paper is a preliminary step toward developing a multi-purpose autonomous carrier mobile robot that transports trolleys or heavy goods and serves as a robotic nursing assistant in hospital wards. Its aim is to present the use of multi-sensor data fusion, combining ultrasonic and IR sensors, for mobile robot navigation, and to describe an experimental mobile robot designed to operate autonomously in both indoor and outdoor environments. Global map building based on multi-sensor data fusion is applied to recognize an obstacle-free path from a starting position to a known goal region while simultaneously building a map of straight-line-segment geometric primitives by applying the Hough transform to the actual, noisy sonar data. We explain the robot system architecture designed and implemented in this study and give a short review of the existing technique, the Hough transform, for which several thorough books and review papers already exist. Experimental results with a real Pioneer DX2 mobile robot demonstrate the effectiveness of the discussed methods.
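
The line-extraction step can be pictured with a plain Hough transform over sonar hit points, as in the sketch below; the resolution settings and synthetic wall data are assumptions, not the paper's implementation.

```python
# Minimal sketch: vote noisy 2-D sonar hit points into (rho, theta) bins to
# extract straight-line primitives for the global map.
import numpy as np

def hough_lines(points, rho_res=0.05, n_theta=180):
    """points: (N, 2) array of sonar hits in metres; returns accumulator and axes."""
    thetas = np.linspace(0.0, np.pi, n_theta, endpoint=False)
    max_rho = np.max(np.hypot(points[:, 0], points[:, 1])) + rho_res
    rhos = np.arange(-max_rho, max_rho, rho_res)
    acc = np.zeros((len(rhos), n_theta), dtype=int)
    for x, y in points:
        rho = x * np.cos(thetas) + y * np.sin(thetas)  # line offset for each theta
        idx = np.digitize(rho, rhos) - 1
        acc[idx, np.arange(n_theta)] += 1
    return acc, rhos, thetas

# Noisy hits along a wall segment y = 1.0, x in [0, 2].
pts = np.column_stack([np.linspace(0, 2, 50),
                       1.0 + np.random.default_rng(1).normal(0, 0.02, 50)])
acc, rhos, thetas = hough_lines(pts)
i, j = np.unravel_index(np.argmax(acc), acc.shape)
print(f"dominant line: rho={rhos[i]:.2f} m, theta={np.degrees(thetas[j]):.1f} deg")
```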

좁은 환경에서 초음파 및 적외선 센서를 융합한 강인한 지도작성 (Robust Map Building in Narrow Environments based on Combination of Sonar and IR Sensors)

  • 한혜민;송재복
    • The Journal of Korea Robotics Society / Vol. 6, No. 1 / pp. 42-48 / 2011
  • It is very important for a mobile robot to recognize and model its environment for navigation. However, a grid map constructed with sonar sensors cannot accurately represent the environment, especially a narrow environment, due to the angular uncertainty of sonar data. Therefore, we propose a map building scheme that combines sonar sensors and IR sensors. The maps built by the sonar sensors and the IR sensors are combined with different weights, which are determined by the degree of translational and rotational motion of the robot. To increase the effectiveness of sensor fusion, we also propose an optimal sensor arrangement derived from various experiments. The experimental results show that the proposed method can represent environments such as narrow corridors and open doors more accurately than conventional sonar-based map building methods.
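
A possible reading of the weighted-combination step is sketched below; the weighting rule based on translational and rotational motion is an illustrative guess, not the authors' exact formula.

```python
# Minimal sketch under stated assumptions: the sonar and IR occupancy grids are
# blended cell-by-cell with weights driven by how much the robot translated vs.
# rotated since the last update (rotation degrades sonar angular accuracy more).
import numpy as np

def fuse_grids(sonar_grid, ir_grid, d_trans, d_rot):
    """Occupancy grids in [0, 1]; d_trans in metres, d_rot in radians."""
    # More rotation -> trust the IR map more; more translation -> trust sonar more.
    w_sonar = d_trans / (d_trans + d_rot + 1e-6)
    w_ir = 1.0 - w_sonar
    return w_sonar * sonar_grid + w_ir * ir_grid

sonar = np.random.default_rng(0).uniform(0, 1, (5, 5))
ir = np.random.default_rng(1).uniform(0, 1, (5, 5))
fused = fuse_grids(sonar, ir, d_trans=0.10, d_rot=0.50)
print(fused.round(2))
```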

센서융합을 이용한 모바일로봇 실내 위치인식 기법 (An Indoor Localization of Mobile Robot through Sensor Data Fusion)

  • 김윤구;이기동
    • The Journal of Korea Robotics Society / Vol. 4, No. 4 / pp. 312-319 / 2009
  • This paper proposes a low-complexity indoor localization method for a mobile robot in a dynamic environment that fuses landmark image information from an ordinary camera with distance information from the sensor nodes of an indoor sensor network. Basically, the sensor network provides an effective means for the mobile robot to adapt to environmental changes and guides it across the geographical network area. To enhance the localization performance, we used an ordinary CCD camera and artificial landmarks devised for self-localization. Experimental results show that real-time localization of the mobile robot can be achieved robustly and accurately with the proposed method.
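
One way such camera/sensor-network fusion can be posed is as a joint least-squares problem, sketched below with hypothetical node positions, ranges, and weights rather than the authors' algorithm.

```python
# Minimal sketch (illustrative assumptions): estimate the robot position by
# jointly minimising range residuals to known sensor nodes and the residual to
# a position fix inferred from an artificial landmark seen by the camera.
import numpy as np
from scipy.optimize import least_squares

nodes = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 4.0]])   # known node positions [m]
ranges = np.array([2.9, 2.3, 2.1])                        # measured node distances [m]
cam_fix = np.array([2.0, 1.9])                            # position from landmark detection
w_cam = 2.0                                               # assumed camera weight

def residuals(p):
    range_res = np.hypot(*(nodes - p).T) - ranges
    cam_res = w_cam * (p - cam_fix)
    return np.concatenate([range_res, cam_res])

sol = least_squares(residuals, x0=np.array([1.0, 1.0]))
print("fused position estimate:", sol.x.round(3))
```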


화재 특성 고찰을 통한 농연 극복 센서 모듈 (A Sensor Module Overcoming Thick Smoke through Investigation of Fire Characteristics)

  • 조민영;신동인;전세웅
    • The Journal of Korea Robotics Society / Vol. 13, No. 4 / pp. 237-247 / 2018
  • In this paper, we describe a sensor module that monitors a fire environment based on an analysis of fire characteristics. We analyzed the smoke characteristics of indoor fires. Six different environments were defined according to the type of smoke and flame, and the sensors available for each environment were combined. Based on this analysis, the sensors were selected from the perspective of a firefighter. The sensor module consists of an RGB camera, an infrared camera, and a radar. It is designed with minimum weight so that it fits on the robot, and the sensor enclosure is designed to protect against the radiant heat of the fire scene. We propose a single-camera mode, a thermal stereo mode, a data fusion mode, and a radar mode that can be used depending on the fire scene. The thermal stereo output was effectively refined using an image segmentation algorithm, SLIC (Simple Linear Iterative Clustering). In order to reproduce the fire scene, three fire test environments were built and each sensor was verified.
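
The SLIC-based refinement idea can be illustrated as below, where a noisy disparity map is averaged within superpixels computed on the thermal image; the data and parameters are stand-ins, not the paper's pipeline.

```python
# Minimal sketch of SLIC-based refinement with assumed data: a noisy
# thermal-stereo disparity map is smoothed by averaging disparity inside each
# SLIC superpixel computed on the thermal image.
import numpy as np
from skimage.segmentation import slic

rng = np.random.default_rng(0)
thermal = rng.uniform(0, 1, (120, 160))            # stand-in thermal image
disparity = rng.uniform(0, 64, (120, 160))         # stand-in noisy disparity map

# channel_axis=None tells SLIC the image is single-channel (scikit-image >= 0.19).
labels = slic(thermal, n_segments=200, compactness=0.1, channel_axis=None)

refined = np.zeros_like(disparity)
for lab in np.unique(labels):
    mask = labels == lab
    refined[mask] = disparity[mask].mean()         # one disparity value per superpixel

print("refined disparity range:", refined.min().round(1), refined.max().round(1))
```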

지형 정보를 사용한 다중 지상 표적 추적 알고리즘의 연구 (Study on Multiple Ground Target Tracking Algorithm Using Geographic Information)

  • 김인택;이응기
    • Journal of Institute of Control, Robotics and Systems / Vol. 6, No. 2 / pp. 173-180 / 2000
  • During the last decade, many researchers have worked on the multiple-target tracking problem in radar applications. Various approaches have been proposed to solve the tracking problem, and the concept of sensor fusion was established as part of that effort. In this paper, the use of geographic information for ground target tracking is investigated, and its performance is compared with the results of applying sensor fusion. Geographic information is used in three aspects: association, masking of target measurements, and restriction on removing true targets. Simulation results indicate that using two sensors gives better tracking performance, but a single sensor with geographic information is superior in reducing the number of false tracks.
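
A simple way geographic information can mask measurements before association is sketched below; the terrain grid and gating rule are assumptions for illustration, not the paper's method.

```python
# Minimal sketch (an assumption about how terrain data can gate measurements):
# radar detections that fall on untraversable terrain cells are masked out
# before track association.
import numpy as np

terrain = np.zeros((10, 10), dtype=bool)   # True = untraversable (e.g., lake)
terrain[3:6, 3:6] = True
cell_size = 100.0                          # metres per grid cell

def mask_measurements(meas):
    """meas: (N, 2) detections in metres; keep only those on traversable ground."""
    idx = (meas // cell_size).astype(int)
    keep = ~terrain[idx[:, 1], idx[:, 0]]  # row = y cell, column = x cell
    return meas[keep]

detections = np.array([[120.0, 150.0], [420.0, 450.0], [810.0, 230.0]])
print(mask_measurements(detections))       # the detection inside the lake is dropped
```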


멀티센서 융합을 이용한 자율이동로봇의 주행기록계 에러 보상에 관한 연구 (A Study on Odometry Error Compensation using Multisensor fusion for Mobile Robot Navigation)

  • 송신우;박문수;홍석교
    • Proceedings of the KIEE Conference / 2001 KIEE Joint Autumn Conference, Information and Control Section / pp. 288-291 / 2001
  • This paper presents an effective odometry error compensation method using multisensor fusion for the accurate positioning of a mobile robot during navigation. During obstacle avoidance and wall following, position estimates obtained by odometry become unrealistic and useless because of accumulated errors. To measure the position and heading of the mobile robot accurately, an odometry sensor, a gyroscope, and an azimuth sensor are mounted on the robot, and a complementary filter is designed and implemented to compensate for the complementary drawbacks of each sensor and to fuse their information. The experimental results show that the multisensor fusion system estimates the position and heading of the mobile robot more accurately than odometry alone.
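
The complementary-filter idea can be illustrated as below, with the gyro trusted at high frequency and the azimuth sensor at low frequency; the gain, sampling rate, and signals are illustrative assumptions, not the authors' design.

```python
# Minimal sketch of a complementary filter for heading: integrate the gyro for
# short-term accuracy and pull toward the absolute azimuth (compass) reading to
# cancel long-term drift.
import numpy as np

def complementary_heading(gyro_rate, azimuth, dt=0.01, alpha=0.98):
    """gyro_rate [rad/s] and azimuth [rad] are sampled at the same rate."""
    theta = azimuth[0]
    out = []
    for w, az in zip(gyro_rate, azimuth):
        theta = alpha * (theta + w * dt) + (1.0 - alpha) * az
        out.append(theta)
    return np.array(out)

t = np.arange(0, 5, 0.01)
true_heading = 0.2 * t                                                       # slow turn
gyro = 0.2 + 0.05 + np.random.default_rng(0).normal(0, 0.01, t.size)         # biased gyro
compass = true_heading + np.random.default_rng(1).normal(0, 0.05, t.size)    # noisy compass
est = complementary_heading(gyro, compass)
print("final heading error [rad]:", abs(est[-1] - true_heading[-1]).round(3))
```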


센서 융합기반의 추측항법을 통한 야지 주행 이동로봇의 위치 추정 및 제어 (Localization and Control of an Outdoor Mobile Robot Based on an Estimator with Sensor Fusion)

  • 전상운;정슬
    • Journal of the Institute of Embedded Engineering of Korea (IEMEK) / Vol. 4, No. 2 / pp. 69-78 / 2009
  • Localization is a very important technique for a mobile robot navigating in outdoor environments. In this paper, the development of a sensor fusion algorithm for controlling mobile robots in outdoor environments is presented. A multi-sensor dead-reckoning subsystem is established based on optimal filtering: heading-angle readings from a magnetic compass, a rate gyro, and two encoders mounted on the robot wheels are first fused to compute the dead-reckoned location. These data and the position data provided by a global sensing system are then fused by means of an extended Kalman filter. The proposed algorithm is verified through simulation studies of a mobile robot driven by a backstepping controller and by a cascaded controller, and the performance of each controller is compared.
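
The EKF structure described above can be sketched as below, using a unicycle motion model and assumed noise covariances rather than the authors' exact filter.

```python
# Minimal sketch of an EKF step: dead-reckoned motion drives the prediction,
# and a global position fix drives the update.
import numpy as np

def ekf_step(x, P, v, w, z, dt, Q, R):
    """x=[px, py, theta]; v, w from dead reckoning; z=[px, py] global fix."""
    px, py, th = x
    # Predict with the unicycle model.
    x_pred = np.array([px + v * dt * np.cos(th),
                       py + v * dt * np.sin(th),
                       th + w * dt])
    F = np.array([[1, 0, -v * dt * np.sin(th)],
                  [0, 1,  v * dt * np.cos(th)],
                  [0, 0,  1]])
    P_pred = F @ P @ F.T + Q
    # Update with the global position measurement z = H x + noise.
    H = np.array([[1.0, 0.0, 0.0],
                  [0.0, 1.0, 0.0]])
    S = H @ P_pred @ H.T + R
    K = P_pred @ H.T @ np.linalg.inv(S)
    x_new = x_pred + K @ (z - H @ x_pred)
    P_new = (np.eye(3) - K @ H) @ P_pred
    return x_new, P_new

x, P = np.zeros(3), np.eye(3) * 0.1
Q, R = np.diag([0.01, 0.01, 0.005]), np.diag([0.05, 0.05])
x, P = ekf_step(x, P, v=0.5, w=0.1, z=np.array([0.06, 0.01]), dt=0.1, Q=Q, R=R)
print("fused state:", x.round(3))
```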


GPS/INS/기압고도계의 웨이블릿 센서융합 기법 (Sensor Fusion of GPS/INS/Baroaltimeter Using Wavelet Analysis)

  • 김성필;김응태;성기정
    • Journal of Institute of Control, Robotics and Systems / Vol. 14, No. 12 / pp. 1232-1237 / 2008
  • This paper introduces an application of wavelet analysis to the sensor fusion of GPS/INS/baroaltimeter. Using wavelet analysis, the baro-inertial altitude is decomposed into low-frequency and high-frequency content. The high-frequency components, the 'details', represent the short-term altitude changes that deviate from the long-term trend. The GPS altitude is also broken down by a wavelet decomposition, and the low-frequency components of the decomposed signal, the 'approximations', represent the long-term trend of the altitude. It is proposed that the final altitude be determined as the sum of the details of the baro-inertial altitude and the approximations of the GPS altitude, so that the final altitude excludes long-term baro-inertial errors and short-term GPS errors. Finally, the test results show that the proposed method successfully produces a continuous and sensitive altitude estimate.
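
The proposed wavelet fusion can be illustrated with PyWavelets as below; the wavelet, decomposition level, and synthetic signals are arbitrary choices for illustration, not the paper's tuning.

```python
# Minimal sketch of the described fusion: keep the GPS 'approximations'
# (long-term trend) and the baro-inertial 'details' (short-term changes),
# then reconstruct the fused altitude.
import numpy as np
import pywt

t = np.arange(0, 64, 0.25)
true_alt = 100.0 + 2.0 * np.sin(0.2 * t)
baro_alt = true_alt + 0.05 * t                                          # slow baro-inertial drift
gps_alt = true_alt + np.random.default_rng(0).normal(0, 1.5, t.size)    # noisy GPS altitude

level = 4
baro_c = pywt.wavedec(baro_alt, 'db4', level=level)
gps_c = pywt.wavedec(gps_alt, 'db4', level=level)

# Fuse: GPS approximation coefficients + baro-inertial detail coefficients.
fused_c = [gps_c[0]] + baro_c[1:]
fused_alt = pywt.waverec(fused_c, 'db4')[:t.size]

print("baro drift error :", np.abs(baro_alt - true_alt).mean().round(2), "m")
print("fused error      :", np.abs(fused_alt - true_alt).mean().round(2), "m")
```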

다중 센서 융합 알고리즘을 이용한 감정인식 및 표현기법 (Emotion Recognition and Expression Method using Bi-Modal Sensor Fusion Algorithm)

  • 주종태;장인훈;양현창;심귀보
    • Journal of Institute of Control, Robotics and Systems / Vol. 13, No. 8 / pp. 754-759 / 2007
  • In this paper, we propose a bi-modal sensor fusion algorithm, an emotion recognition method that can classify four emotions (happy, sad, angry, surprise) by using a facial image and a speech signal together. We extract feature vectors from the speech signal using acoustic features without language features and classify the emotional pattern using a neural network. From the facial image, we also select features of the mouth, eyes, and eyebrows, and the extracted feature vectors are reduced to low-dimensional feature vectors by Principal Component Analysis (PCA). The proposed method then fuses the recognition results obtained from the facial image and the speech into a single emotion decision.
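
The bi-modal fusion can be pictured as below, with PCA-reduced facial features and acoustic features each feeding a neural network and the class probabilities averaged; the synthetic features and the averaging rule are assumptions, and the paper's exact fusion may differ.

```python
# Minimal sketch of the bi-modal idea: PCA-reduced facial features and acoustic
# speech features each feed a neural network, and the two outputs are combined
# into one emotion decision by averaging class probabilities.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(0)
emotions = ["happy", "sad", "angry", "surprise"]
y = rng.integers(0, 4, 300)

face_raw = rng.normal(0, 1, (300, 40)) + y[:, None] * 0.5   # mouth/eye/eyebrow features (synthetic)
speech = rng.normal(0, 1, (300, 12)) + y[:, None] * 0.5     # acoustic features (synthetic)

face = PCA(n_components=8).fit_transform(face_raw)           # low-dimensional facial vectors

clf_face = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(face, y)
clf_speech = MLPClassifier(hidden_layer_sizes=(16,), max_iter=2000, random_state=0).fit(speech, y)

# Decision-level fusion: average the two class-probability vectors.
proba = 0.5 * clf_face.predict_proba(face) + 0.5 * clf_speech.predict_proba(speech)
pred = proba.argmax(axis=1)
print("fused training accuracy:", (pred == y).mean().round(2))
print("example decision:", emotions[pred[0]])
```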