• Title/Abstract/Keyword: Laser distance sensor

140 search results (processing time: 0.031 s)

용접선 자동추적을 위한 이중 전자기센서의 개발에 관한 연구 (A Study of a Dual-Electromagnetic Sensor for Automatic Weld Seam Tracking)

  • 신준호;김재응
    • Journal of Welding and Joining / Vol.18 No.4 / pp.70-75 / 2000
  • Weld seam tracking systems for arc welding use various kinds of sensors, such as arc sensors, vision sensors, and laser displacement sensors. Among the variety of sensors available, the electromagnetic sensor is one of the most useful, especially in sheet-metal butt-joint arc welding, primarily because it is hardly affected by the intense arc light and fume generated during welding, or by the surface condition of the weldments. In this study, a dual-electromagnetic sensor, which utilizes the induced current variation in the sensing coil caused by the eddy current variation of the metal near the sensor, was developed for arc welding of sheet-metal butt joints. The dual-electromagnetic sensor thus detects the offset displacement of the weld line from the center of the sensor head even when there is no clearance in the joint. A set of design variables of the sensor was determined for maximum sensing capability through repeated experiments. Seam tracking is performed by correcting the position of the sensor by the amount of the offset displacement every sampling period. Experimental results showed that the developed sensor detects the weld seam reliably when the sensor-to-workpiece distance is less than about 5 mm, and that the system has excellent seam tracking ability for butt joints of sheet metal.

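
The tracking scheme described above — correcting the sensor position by the measured offset every sampling period — can be sketched as follows. The weld-line shape, correction gain, and sensor model are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of offset-correction seam tracking (illustrative values only).
def track_seam(weld_line, gain=1.0, steps=None):
    """Follow weld_line (a list of lateral weld positions, one per sampling
    period) by correcting the head position with the measured offset."""
    head = 0.0
    path = []
    for y in weld_line[:steps]:
        offset = y - head      # what the dual-electromagnetic sensor would measure
        head += gain * offset  # correct the position by the offset displacement
        path.append(head)
    return path

# A drifting weld line: with gain=1 the head locks on one sample behind.
line = [0.5 * i for i in range(10)]
print(track_seam(line)[:3])  # → [0.0, 0.5, 1.0]
```

A gain below 1 would smooth the correction at the cost of a larger steady-state lag on a drifting seam.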

카메라-레이저스캐너 상호보완 추적기를 이용한 이동 로봇의 사람 추종 (Person-following of a Mobile Robot using a Complementary Tracker with a Camera-laser Scanner)

  • 김형래;최학남;이재홍;이승준;김학일
    • 제어로봇시스템학회논문지 (Journal of Institute of Control, Robotics and Systems) / Vol.20 No.1 / pp.78-86 / 2014
  • This paper proposes a method of tracking an object for a person-following mobile robot by combining a monocular camera and a laser scanner, where each sensor can compensate for the weaknesses of the other. For human-robot interaction, a mobile robot needs to maintain a distance between a moving person and itself. Maintaining this distance consists of two parts: object tracking and person-following. Object tracking consists of particle filtering and online learning using shape features extracted from the image. A monocular camera easily fails to track a person due to its narrow field of view and its sensitivity to illumination changes, and it has therefore been combined with a laser scanner. After constructing the geometric relation between the differently oriented sensors, the proposed method demonstrates its robustness in tracking and following a person, with a success rate of 94.7% in indoor environments with varying lighting conditions, even when a moving object passes between the robot and the person.
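
The particle-filter tracker mentioned in the abstract reduces to a predict/weight/resample loop. This 1-D sketch with a Gaussian measurement likelihood is a generic illustration of that loop, not the paper's shape-feature implementation; all noise levels are assumed.

```python
import math
import random

def particle_filter_step(particles, measurement, motion_std=0.5, meas_std=1.0):
    """One predict-weight-resample cycle of a 1-D particle filter."""
    # Predict: diffuse each particle with a random-walk motion model.
    particles = [p + random.gauss(0.0, motion_std) for p in particles]
    # Weight: Gaussian likelihood of the measurement given each particle.
    weights = [math.exp(-((p - measurement) ** 2) / (2 * meas_std ** 2))
               for p in particles]
    total = sum(weights)
    weights = [w / total for w in weights]
    # Resample: draw a new particle set proportionally to the weights.
    return random.choices(particles, weights=weights, k=len(particles))

random.seed(0)
particles = [random.uniform(-5.0, 5.0) for _ in range(200)]
for z in (1.0, 1.2, 1.1):            # successive measured person positions
    particles = particle_filter_step(particles, z)
estimate = sum(particles) / len(particles)
```

In the paper's setting the measurement would come from the shape-feature matcher or the laser scanner, whichever is currently reliable.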

IR 레이저 센서 및 관성 센서를 이용한 다목적 골프 퍼팅 거리 측정기에 대한 연구 (A Study on the Multipurpose Golf Putting Range Finder using IR Laser Sensor and Inertial Sensor)

  • 신민승;강대웅;김기덕;김지환;이철선;고윤석
    • 한국전자통신학회논문지 (The Journal of the Korea Institute of Electronic Communication Sciences) / Vol.18 No.4 / pp.669-676 / 2023
  • In this paper, a multipurpose golf putting range finder using an IR laser sensor and an inertial sensor was designed and built. The range finder was designed to measure distance, slope, and temperature (which affects putting) over an outdoor range of 50 m. It also supports indoor work by measuring length and levelness within a 30 m indoor range, and when an abnormal indoor temperature is detected, it sends an alert to the user's mobile phone through a linked web server, so that safety against indoor or vehicle fires can be secured. To evaluate the accuracy of the proposed method and its interworking with a smartphone, a prototype was built and a web server was constructed; repeated experiments showed an error rate within the acceptable 5%, confirming the device's usefulness.

Experimental Study of Spacecraft Pose Estimation Algorithm Using Vision-based Sensor

  • Hyun, Jeonghoon;Eun, Youngho;Park, Sang-Young
    • Journal of Astronomy and Space Sciences / Vol.35 No.4 / pp.263-277 / 2018
  • This paper presents a vision-based relative pose estimation algorithm and its validation through both numerical and hardware experiments. The algorithm and the hardware system were designed simultaneously, considering the actual experimental conditions. Two estimation techniques were utilized to estimate the relative pose: a nonlinear least squares method for the initial estimate, and an extended Kalman filter for subsequent on-line estimation. A measurement model of the vision sensor and equations of motion including nonlinear perturbations were utilized in the estimation process. Numerical simulations were performed and analyzed for both autonomous docking and formation flying scenarios. A configuration of LED-based beacons was designed to avoid measurement singularity, and its structural information was implemented in the estimation algorithm. The proposed algorithm was verified again in the experimental environment by using the Autonomous Spacecraft Test Environment for Rendezvous In proXimity (ASTERIX) facility. Additionally, a laser distance meter was added to the estimation algorithm to improve the relative position estimation accuracy. Throughout this study, the performance required for autonomous docking was established by confirming how the estimation accuracy changes with the level of measurement error. In addition, hardware experiments confirmed the effectiveness of the suggested algorithm and its applicability to actual tasks in the real world.
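
The two-stage scheme in the abstract — batch nonlinear least squares for the initial state, then an extended Kalman filter on-line — can be illustrated on a toy one-dimensional range model. The measurement function, beacon offset, and noise levels below are assumptions for illustration only, not the paper's vision-sensor model.

```python
import math

H_BEACON = 2.0  # assumed beacon offset; the range measurement is z = sqrt(x^2 + H^2)

def h(x):
    return math.sqrt(x * x + H_BEACON ** 2)

def h_jac(x):
    return x / h(x)

def nls_init(zs, x0=1.0, iters=20):
    """Gauss-Newton nonlinear least squares over a batch of range measurements."""
    x = x0
    for _ in range(iters):
        J = [h_jac(x) for _ in zs]            # Jacobian of each residual
        r = [z - h(x) for z in zs]            # residuals
        den = sum(j * j for j in J) or 1e-9
        x += sum(j * ri for j, ri in zip(J, r)) / den
    return x

def ekf_update(x, P, z, R=0.01):
    """One scalar EKF measurement update (static state, no prediction step)."""
    Hj = h_jac(x)
    S = Hj * P * Hj + R
    K = P * Hj / S
    x = x + K * (z - h(x))
    P = (1.0 - K * Hj) * P
    return x, P

true_x = 3.0
zs = [h(true_x)] * 5        # noise-free ranges keep the sketch deterministic
x = nls_init(zs)            # batch NLS gives the initial estimate
P = 1.0
for z in zs:
    x, P = ekf_update(x, P, z)  # on-line refinement
```

The same split applies in higher dimensions: the batch solver removes the need for a good initial guess, and the EKF then tracks cheaply per frame.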

센서데이터 융합을 이용한 원주형 물체인식 (Cylindrical Object Recognition using Sensor Data Fusion)

  • 김동기;윤광익;윤지섭;강이석
    • 제어로봇시스템학회논문지 (Journal of Institute of Control, Robotics and Systems) / Vol.7 No.8 / pp.656-663 / 2001
  • This paper presents a sensor fusion method for recognizing a cylindrical object using a CCD camera, a laser slit beam, and ultrasonic sensors on a pan/tilt device. For object recognition with the vision sensor, an active light source projects a stripe pattern of light onto the object surface. The 2D image data are transformed into 3D data using the geometry between the camera and the laser slit beam. The ultrasonic sensor uses a transducer array mounted horizontally on the pan/tilt device. The time of flight is estimated by finding the maximum correlation between the received ultrasonic pulse and a set of stored templates, also called a matched filter. The distance of flight is calculated by simply multiplying the time of flight by the speed of sound, and the maximum amplitude of the filtered signal is used to determine the face angle to the object. To determine the position and radius of cylindrical objects, a statistical sensor fusion is used. Experimental results show that the fused data increase the reliability of the object recognition.

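
The matched-filter range estimate described above — correlate the echo with a stored template, take the lag of maximum correlation as the time of flight, and multiply by the speed of sound — can be sketched as follows. The pulse shape and sampling rate are assumed, and round-trip handling is omitted to match the abstract's simple multiplication.

```python
# Matched-filter time-of-flight sketch (assumed pulse shape and sampling rate).
FS = 100_000.0   # sampling rate [Hz]
C = 343.0        # speed of sound in air [m/s]

template = [0.0, 0.5, 1.0, 0.5, 0.0, -0.5, -1.0, -0.5]  # stored pulse template

def tof_distance(received, template, fs=FS, c=C):
    """Slide the template over the echo, pick the lag of maximum correlation,
    and convert that lag to a distance of flight."""
    best_lag, best_corr = 0, float("-inf")
    for lag in range(len(received) - len(template) + 1):
        corr = sum(t * received[lag + i] for i, t in enumerate(template))
        if corr > best_corr:
            best_lag, best_corr = lag, corr
    tof = best_lag / fs              # time of flight [s]
    return tof * c                   # distance of flight [m]

# Synthetic echo: the pulse delayed by 100 samples in a zero baseline.
echo = [0.0] * 100 + template + [0.0] * 50
print(tof_distance(echo, template))  # 100 samples at 100 kHz → 1 ms → 0.343 m
```

The peak amplitude of the same correlation (here 3.0 at perfect alignment) is what the paper uses to estimate the face angle to the object.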

자율주행을 위한 센서 데이터 융합 기반의 맵 생성 (Map Building Based on Sensor Fusion for Autonomous Vehicle)

  • 강민성;허수정;박익현;박용완
    • 한국자동차공학회논문집 (Transactions of the Korean Society of Automotive Engineers) / Vol.22 No.6 / pp.14-22 / 2014
  • An autonomous vehicle requires the technology of generating maps by recognizing its surrounding environment. The recognition of the vehicle's environment can be achieved by using distance information from a 2D laser scanner and color information from a camera. Such sensor information is used to generate 2D or 3D maps. A 2D map is used mostly for generating routes, because it contains information about only a single section. In contrast, a 3D map also involves height values, and therefore can be used not only for generating routes but also for finding the space accessible to the vehicle. Nevertheless, an autonomous vehicle using 3D maps has difficulty in recognizing its environment in real time. Accordingly, this paper proposes a technology for generating 2D maps that guarantees real-time recognition. The proposed technology uses only the color information obtained by removing the height values from 3D maps generated from the fusion of 2D laser scanner and camera data.
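
The proposed reduction — drop the height channel of the fused 3D map and keep only color in a 2D grid — can be sketched with plain dictionaries; the cell structure and color encoding here are assumptions, not the paper's data format.

```python
# Sketch: flatten a 3D map (cells with x, y, height, color) into a 2D color map.
def flatten_to_2d(map3d):
    """map3d: list of dicts with keys 'x', 'y', 'z', 'color'.
    Returns {(x, y): color}, discarding the height values so the resulting
    2D map is small enough to search in real time."""
    map2d = {}
    for cell in map3d:
        map2d[(cell["x"], cell["y"])] = cell["color"]  # height 'z' is dropped
    return map2d

cells = [
    {"x": 0, "y": 0, "z": 1.2, "color": (120, 90, 60)},   # e.g. a wall cell
    {"x": 0, "y": 1, "z": 0.0, "color": (30, 200, 30)},   # e.g. a ground cell
]
print(flatten_to_2d(cells)[(0, 1)])  # → (30, 200, 30)
```

A route planner can then run on the 2D keys alone, while the color values preserve the camera information that distinguished the cells in the 3D map.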

2D 레이저 스캐너 흔듦을 이용한 패턴인식 (Pattern Recognition Using 2D Laser Scanner Shaking)

  • 권성경;조해준;윤진영;이호승;이재천;곽성우;최해운
    • 한국자동차공학회논문집 (Transactions of the Korean Society of Automotive Engineers) / Vol.22 No.4 / pp.138-144 / 2014
  • Autonomous unmanned vehicles have become an issue in next-generation technology. A 2D laser scanner is used as the distance measurement sensor; it detects out to a distance of 80 m over a measurement angle of -5 to 185 degrees. The laser scanner detects only a single plane, so it is swung with a motor; as a result, traffic signs can be detected and their patterns analyzed. When driving at low speed, the shapes of the patterns detected from different traffic signs are very similar, but by shaking the laser scanner, traffic signs can be clearly distinguished from other obstacles.

어안 렌즈와 레이저 스캐너를 이용한 3차원 전방향 영상 SLAM (3D Omni-directional Vision SLAM using a Fisheye Lens Laser Scanner)

  • 최윤원;최정원;이석규
    • 제어로봇시스템학회논문지 (Journal of Institute of Control, Robotics and Systems) / Vol.21 No.7 / pp.634-640 / 2015
  • This paper proposes a novel three-dimensional mapping algorithm for omni-directional vision SLAM based on a fisheye image and laser scanner data. The performance of SLAM has been improved by various estimation methods, sensors with multiple functions, or sensor fusion. Conventional 3D SLAM approaches, which mainly employ RGB-D cameras to obtain depth information, are not suitable for mobile robot applications, because an RGB-D camera system with multiple cameras is bulky and slow in calculating the depth information for omni-directional images. In this paper, we used a fisheye camera installed facing downwards and a two-dimensional laser scanner separated from the camera by a constant distance. We calculated fusion points from the plane coordinates of obstacles obtained from the two-dimensional laser scanner and the outlines of obstacles obtained from the omni-directional image sensor, which can acquire a surround view at the same time. The effectiveness of the proposed method is confirmed through comparison between maps obtained using the proposed algorithm and real maps.
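
Fusing a 2D laser hit with the bearing at which the same obstacle's outline appears in the omni-directional image amounts to expressing the laser point in the camera frame through a fixed extrinsic transform. The mounting offset below is an assumed value standing in for the constant camera-scanner separation the abstract mentions.

```python
import math

# Assumed fixed mounting: laser scanner offset from the downward fisheye camera.
LASER_OFFSET = (0.10, 0.0)  # metres, in the camera's x-y plane

def laser_to_camera_frame(r, theta, offset=LASER_OFFSET):
    """Convert a laser range/bearing hit to camera-frame x-y coordinates."""
    x = r * math.cos(theta) + offset[0]
    y = r * math.sin(theta) + offset[1]
    return x, y

def fuse(r, theta):
    """Pair the laser point's plane coordinates with the bearing at which the
    obstacle's outline should appear in the omni-directional image."""
    x, y = laser_to_camera_frame(r, theta)
    bearing_in_image = math.atan2(y, x)  # angle of the outline in the fisheye view
    return (x, y, bearing_in_image)

print(fuse(2.0, 0.0))
```

Matching the predicted bearing against the outline actually extracted from the fisheye image is what turns each laser point into a fusion point for the 3D map.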

센서기반 지능형 아크 용접 로봇 시스템의 동향 (Trends of Sensor-based Intelligent Arc Welding Robot System)

  • 정지훈;신현호;송영훈;김수종
    • 제어로봇시스템학회논문지 (Journal of Institute of Control, Robotics and Systems) / Vol.20 No.10 / pp.1051-1056 / 2014
  • In this paper, we introduce intelligent robotic arc welding systems that exploit sensors such as an LVS (Laser Vision Sensor), a Hall effect sensor, and a voltmeter. The use of industrial robots has reached a limit, and one of the major limitations is that an industrial robot cannot recognize its environment. Lately, research on sensor-based environmental awareness for industrial robots has been actively performed to overcome this limitation; it can expand the field of application and improve productivity. We classify sensor-based intelligent arc welding robot systems by their goal and their sensing data. The goals can be categorized into detection of a welding start point, tracking of a welding line, and correction of torch deformation. The sensing data can be categorized into welding data (i.e., current, voltage, and short-circuit detection) and displacement data (i.e., distance, position). This paper covers not only an explanation of each category but also its advantages and limitations.

분할법과 평균거리 개념에 의한 용접 결함 표현 방법 (The Weld Defects Expression Method by the Concept of Segment Splitting Method and Mean Distance)

  • 이정익;고병갑
    • 한국공작기계학회논문집 (Transactions of the Korean Society of Machine Tool Engineers) / Vol.16 No.2 / pp.37-43 / 2007
  • In this paper, a laser vision sensor is used in hardware to detect defects in CO2-welded specimens. In software, the concepts of a segment splitting method and mean distance are introduced as the best way to express the defects of a welded specimen. The developed GUI software is used to decide in real time whether a welded specimen has the proper shape or contains defects. The criteria are based upon ISO 5817, which specifies the limits for imperfections in metallic fusion welds.
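
The segment-splitting/mean-distance idea — split the measured weld profile into segments and flag any segment whose mean deviation from the nominal profile exceeds a limit — can be sketched as follows. The segment length and threshold are illustrative values, not ISO 5817 acceptance limits.

```python
def flag_defect_segments(profile, nominal, seg_len=4, limit=0.5):
    """Split the profile into fixed-length segments and return the indices of
    segments whose mean distance from the nominal height exceeds the limit."""
    flagged = []
    for s in range(0, len(profile), seg_len):
        seg = profile[s:s + seg_len]
        mean_dist = sum(abs(p - nominal) for p in seg) / len(seg)
        if mean_dist > limit:
            flagged.append(s // seg_len)
    return flagged

# Nominal bead height 1.0; segment 1 (indices 4-7) carries a deep undercut.
scan = [1.0, 1.1, 0.9, 1.0, 0.2, 0.1, 0.3, 0.2, 1.0, 1.0, 1.0, 1.0]
print(flag_defect_segments(scan, nominal=1.0))  # → [1]
```

Averaging within a segment is what makes the expression robust to single noisy samples from the laser vision sensor, while still localizing the defect to a segment.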