• Title/Summary/Keyword: Camera Angle


Development of a System for Glass Thickness Measurement (비접촉 유리 두께 측정 장치 개발)

  • Park, Jae-Beom;Lee, Eung-Suk;Lee, Min-Ki;Lee, Jong-Gun
    • Transactions of the Korean Society of Mechanical Engineers A / v.33 no.5 / pp.529-535 / 2009
  • This paper describes a device for measuring glass thickness in real time using machine vision and image-processing techniques. Machine vision now enables faster and more accurate inspection than the human eye. The presented system offers continuous measurement, flexibility, and good accuracy. It consists of a laser diode and a CCD camera connected to a PC. The camera, located on the opposite side of the incident beam, measures the distance between the two laser beams reflected from the top and bottom surfaces of the glass. A binary algorithm converts the camera image for analysis on the PC, and a border-tracing algorithm locates the laser spot coordinates to find the center of the beam circle. The measured results, compared against a micrometer, showed an accuracy of 0.002 mm. Finally, we discuss how to minimize errors caused by the glass wedge angle and the angular error of the moving stage.
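
The separation-to-thickness relation the abstract relies on can be sketched with standard parallel-plate geometry. The centroid-based spot finder, the 45° incidence angle, and the refractive index n = 1.52 below are illustrative assumptions, not the authors' implementation:

```python
import math

def spot_center(img, thresh):
    """Centroid of pixels above a binary threshold: a crude estimate of the
    beam-circle center. img is a 2-D list of grey levels."""
    xs, ys, n = 0.0, 0.0, 0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            if v > thresh:
                xs += x
                ys += y
                n += 1
    return xs / n, ys / n

def thickness_from_separation(s, theta_i_deg, n=1.52):
    """Glass thickness from the separation s between the two reflected beams.

    Parallel-plate geometry: s = 2 t tan(theta_r) cos(theta_i), with Snell's
    law sin(theta_i) = n sin(theta_r)."""
    theta_i = math.radians(theta_i_deg)
    theta_r = math.asin(math.sin(theta_i) / n)
    return s / (2.0 * math.tan(theta_r) * math.cos(theta_i))
```

With the two spot centers found in pixel coordinates, `s` is their distance converted to millimeters through the camera's scale factor.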

Fuzzy Control of a Mobile Robot with Camera

  • Cho, Jung-Tae;Lee, Seok-Won;Nam, Boo-Hee
    • Institute of Control, Robotics and Systems: Conference Proceedings / 2000.10a / pp.381-381 / 2000
  • This paper describes a path-planning method in an unknown environment for an autonomous mobile robot equipped with a CCD (charge-coupled device) camera. The robot moves along a guideline, and the camera detects the presence of the guideline. The wavelet transform is used to find the edges of the guideline, making the image processing easier and faster. Fuzzy control rules are built from the image data to determine the position and navigation of the robot: the value indicating the center of the guideline is the input to the fuzzy logic controller, and the steering angle of the robot is the fuzzy output. Experiments with the fuzzy-controlled mobile robot show that it moves effectively to the target position.
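
The center-offset-in, steering-angle-out rule structure can be sketched with a minimal fuzzy controller. The three triangular membership functions, their pixel ranges, and the ±20° singleton outputs are hypothetical choices, not the paper's tuned rule base:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzy_steering(offset):
    """Map the guideline-center offset (pixels, + = line right of image center)
    to a steering angle (degrees) via three rules and centroid defuzzification."""
    # Rule strengths: line-left, line-centered, line-right (hypothetical ranges).
    mu = [tri(offset, -200, -100, 0),
          tri(offset, -100, 0, 100),
          tri(offset, 0, 100, 200)]
    out = [-20.0, 0.0, 20.0]  # singleton steering consequents in degrees
    den = sum(mu)
    return sum(m * o for m, o in zip(mu, out)) / den if den else 0.0
```

An offset of zero yields zero steering; intermediate offsets blend the adjacent rules smoothly, which is the practical appeal of fuzzy control here.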


An Experimental Study on the Optimal Number of Cameras used for Vision Control System (비젼 제어시스템에 사용된 카메라의 최적개수에 대한 실험적 연구)

  • 장완식;김경석;김기영;안힘찬
    • Transactions of the Korean Society of Machine Tool Engineers / v.13 no.2 / pp.94-103 / 2004
  • The vision system model used in this study involves six parameters and permits a kind of adaptability, in that the relationship between the camera-space locations of manipulable visual cues and the vector of robot joint coordinates is estimated in real time. This vision control method requires a number of cameras to map the 3-D physical space onto 2-D camera planes, and it can be used irrespective of the camera locations, provided the visual cues appear in the camera planes. This study therefore investigates the optimal number of cameras for the developed vision control system as the number of cameras is varied. The study proceeds in two ways: (a) the effectiveness of the vision system model and (b) the optimal number of cameras. The results demonstrate the adaptability of the developed vision control method using the optimal number of cameras.

Use of Optical Flow Information with three Cameras for Robot Navigation (로봇 주행을 위한 세개의 카메라를 사용한 광류 정보 활용)

  • Lee, Soo-Yong
    • Journal of Institute of Control, Robotics and Systems / v.18 no.2 / pp.110-117 / 2012
  • This paper describes a new design of an optical flow estimation system with three cameras. Optical flow provides useful information about camera movement; however, a unique solution is usually unavailable for the unknowns, which include the depth information. One forward-looking camera and two tilted cameras are used to provide different viewing angles and directions of movement relative to the camera axis. A geometric analysis is performed for several cases of independent movement, and ideas for exploiting the extra information for robot navigation are discussed along with experimental results.
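
The depth ambiguity the abstract mentions is visible in the classic flow equations for a translating pinhole camera: flow depends only on the ratio T/Z, so speed and depth cannot be separated from one view. A minimal sketch (not the paper's three-camera geometry):

```python
def translational_flow(x, y, T, Z, f=1.0):
    """Image-plane optical flow (u, v) at image point (x, y) for a camera
    translating with velocity T = (Tx, Ty, Tz), scene depth Z, focal length f.

    Classic pinhole result: u = (x*Tz - f*Tx)/Z, v = (y*Tz - f*Ty)/Z.
    Scaling T and Z together leaves the flow unchanged (depth/speed ambiguity)."""
    Tx, Ty, Tz = T
    u = (x * Tz - f * Tx) / Z
    v = (y * Tz - f * Ty) / Z
    return u, v
```

Tilting a second and third camera changes (x, y) and the direction of T relative to each camera axis, giving independent flow measurements of the same motion, which is the extra information the paper exploits.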

Design and Performance Verification of a LWIR Zoom Camera for Drones

  • Kwang-Woo Park;Jonghwa Choi;Jian Kang
    • Current Optics and Photonics / v.7 no.4 / pp.354-361 / 2023
  • We present the optical design and experimental verification of the resolving performance of a 3× long-wavelength infrared (LWIR) zoom camera for drones. The effective focal length of the system varies from 24.5 mm at the wide-angle position to 75.1 mm at the telephoto position. The design specifications were derived from the ground resolved distance (GRD) needed to recognize a 3 m × 6 m target at a distance of 1 km at the telephoto position. To satisfy this requirement, the aperture of the system is set to F/1.6 and the final modulation transfer function (MTF) must be higher than 0.1 (10%). The MTF measured in the laboratory was 0.127 (12.7%), which exceeds the requirement. Outdoor targets were used to verify the comprehensive performance of the system: it resolved 4-bar targets corresponding to the spatial resolution at distances of 1 km, 1.4 km, and 2 km.
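
The similar-triangles arithmetic behind such resolution requirements is easy to sketch. The 12 µm detector pitch below is a typical LWIR value assumed for illustration; the paper does not state it here:

```python
def ground_sample_distance(pixel_pitch_m, focal_length_m, range_m):
    """Ground sample distance by similar triangles: GSD = p * R / f."""
    return pixel_pitch_m * range_m / focal_length_m

# At the telephoto position (f = 75.1 mm from the abstract), a hypothetical
# 12 um pixel imaging a target at 1 km gives roughly a 0.16 m ground sample.
gsd = ground_sample_distance(12e-6, 75.1e-3, 1000.0)
```

Recognizing a 3 m × 6 m target then comes down to how many such samples (and how much MTF) span the target at range.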

Development of a Novel System for Measuring Sizing Degree Based on Contact Angle(I) - Development of a Novel Principle for Automatic Measurement of Contact Angle - (접촉각 측정 원리를 이용한 새로운 사이즈도 측정기 (제1보) -자동 접촉각 측정 원리의 개발 -)

  • 이찬용;김철환;최경민;박종열;권오철
    • Journal of Korea Technical Association of The Pulp and Paper Industry / v.35 no.3 / pp.43-52 / 2003
  • A new principle for measuring sizing degree from the contact angle was developed, based on automatic determination of the three end-point coordinates of a water droplet on a sheet; this diminishes operator bias during measurement. A constant amount of water is first placed on a sample sheet by a water dispenser, and an image of the droplet is captured by a digital camera and transmitted to a computer. The contact-angle program extracts the liquid contour using a Gaussian function combined with an 8-direction chain code. The Euclidean distance equation is applied to the binary image of the contour to measure its diameter, and the contact angle is then calculated from the diameter and the apex coordinates. In addition, the surface free energy of the sample sheet and the elapsed time until complete absorption into the sheet are measured simultaneously with the contact angle.
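
Computing a contact angle from a base diameter and an apex coordinate is commonly done with the spherical-cap assumption; the formula below is that standard geometric relation, offered as a sketch rather than the authors' exact calculation:

```python
import math

def contact_angle(height, diameter):
    """Contact angle (degrees) of a sessile drop from its apex height above the
    sheet and its base diameter, assuming a spherical-cap profile:
    theta = 2 * atan(2h / d)."""
    return math.degrees(2.0 * math.atan2(2.0 * height, diameter))
```

A hemispherical drop (height equal to the base radius) gives exactly 90°, and a nearly flat, well-absorbed drop gives an angle near 0°, matching the intuition that lower angles mean poorer sizing.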

Preflight Calibration Results of Wide-Angle Polarimetric Camera (PolCam) onboard Korean Lunar Orbiter, Danuri

  • Minsup Jeong;Young-Jun Choi;Kyung-In Kang;Bongkon Moon;Bonju Gu;Sungsoo S. Kim;Chae Kyung Sim;Dukhang Lee;Yuriy G. Shkuratov;Gorden Videen;Vadym Kaydash
    • Journal of The Korean Astronomical Society / v.56 no.2 / pp.293-299 / 2023
  • The Wide-Angle Polarimetric Camera (PolCam) is installed on Korea's lunar orbiter, Danuri, which launched on August 5, 2022. The mission objectives of PolCam are to construct photometric maps at a wavelength of 336 nm and polarization maps at 461 and 748 nm, over a phase-angle range of 0°-135° and with a spatial resolution of less than 100 m. PolCam is a push-broom imager with two cameras, Cam 1 and Cam 2, viewing 45° to the right and left of the spacecraft's direction of orbit. We conducted performance tests in a laboratory setting before installing PolCam's flight model on the spacecraft, analyzing the CCD's dark current, flat-field frame, spot size, and light flux. The dark current was obtained during thermal/vacuum tests at various temperatures, and the flat-field frames were obtained with an integrating sphere and a tungsten light bulb. We describe the calibration method and results in this study.
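
Dark-current and flat-field measurements like these feed the standard CCD radiometric correction. The function below is that generic pipeline step as a hedged sketch, not PolCam's actual calibration software:

```python
import numpy as np

def calibrate_frame(raw, dark, flat):
    """Standard CCD radiometric calibration: subtract the dark frame, then
    divide by the dark-subtracted flat field normalized to its mean, so the
    correction removes pixel-to-pixel sensitivity variation without changing
    the overall signal level."""
    flat_norm = (flat - dark) / np.mean(flat - dark)
    return (raw - dark) / flat_norm
```

In practice the dark frame would be selected (or interpolated) for the detector temperature at acquisition, which is why the dark current is characterized across the thermal/vacuum range.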

Coordinated Intra-Limb Relationships and Control in Gait Development Via the Angle-Angle Diagram (보행 시 연령에 따른 하지 관절 내 운동학적 협응과 제어)

  • Lee, Kyung-Ok
    • Korean Journal of Applied Biomechanics / v.14 no.3 / pp.17-35 / 2004
  • The purpose of this study is to explain the developmental process of gait via the angle-angle diagram, in order to understand how coordinated relationships and control change with age. Twenty-four female children, from one to five years of age, were the test subjects, and their results were compared to a control group of twenty-one adult females. A Vicon 370 CCD camera, VCR, video timer, monitor, and audio-visual mixer were used to graph the gait cycle for all subjects. Coordinated intra-limb relationships, as well as range of motion and timing by quadrant, were explained through the angle-angle diagram. Movement in the sagittal plane showed coordinated relationships and control earlier than movement in the coronal or transverse plane. In the sagittal plane, hip-knee coordination developed first (from one year of age), followed by knee-ankle and hip-ankle coordination, respectively. Both hip-ankle and knee-ankle development were inhibited by the children's inability to completely perform plantar flexion during the swing and initial double-limb support phases. Children appeared to compensate by extending the hip joint more than adults during the third phase, final double-limb support. In many cases the children's angle-angle diagrams had shapes similar to the adults', showing that children can coordinate their movements at an early age. However, the magnitudes and timing of the children's diagrams still varied greatly from the adults', even at five years of age, indicating that children at this age still do not possess full control of their movements.

Verification of Camera-Image-Based Target-Tracking Algorithm for Mobile Surveillance Robot Using Virtual Simulation (가상 시뮬레이션을 이용한 기동형 경계 로봇의 영상 기반 목표추적 알고리즘 검증)

  • Lee, Dong-Youm;Seo, Bong-Cheol;Kim, Sung-Soo;Park, Sung-Ho
    • Transactions of the Korean Society of Mechanical Engineers A / v.36 no.11 / pp.1463-1471 / 2012
  • In this study, a 3-axis camera system design is proposed for application to an existing 2-axis surveillance robot, together with a camera-image-based target-tracking algorithm for this robot. In the algorithm, the heading direction vector of the camera system is obtained from the position error between the center of the view finder and the center of the object in the camera image. From this heading direction vector, the desired pan and tilt angles for target tracking and the desired roll angle for stabilizing the camera image are obtained through inverse kinematics. The algorithm was validated with a virtual simulation model based on MATLAB and ADAMS by checking the robot's movement in response to the target motion and the virtual image error of the view finder.
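
The pixel-error-to-angle step can be sketched with a pinhole model: the angular correction is the arctangent of the pixel error over the focal length expressed in pixels. This is a generic illustration, not the paper's full 3-axis inverse kinematics (which also resolves the roll angle):

```python
import math

def pan_tilt_from_pixel_error(ex_px, ey_px, focal_px):
    """Desired pan/tilt increments (radians) from the pixel error between the
    view-finder center and the target center, assuming a pinhole camera with
    focal length focal_px in pixels."""
    pan = math.atan2(ex_px, focal_px)   # horizontal error -> pan correction
    tilt = math.atan2(ey_px, focal_px)  # vertical error -> tilt correction
    return pan, tilt
```

Driving the gimbal by these increments each frame moves the target toward the view-finder center, which is exactly the closed loop the simulation checks.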

Using Contour Matching for Omnidirectional Camera Calibration (투영곡선의 자동정합을 이용한 전방향 카메라 보정)

  • Hwang, Yong-Ho;Hong, Hyun-Ki
    • Journal of the Institute of Electronics Engineers of Korea SP / v.45 no.6 / pp.125-132 / 2008
  • Omnidirectional camera systems with a wide view angle are widely used in surveillance and robotics. Most previous studies on estimating a projection model and the extrinsic parameters from omnidirectional images assume that corresponding points among views have been established in advance. This paper presents a novel omnidirectional camera calibration based on automatic contour matching. First, we estimate the initial parameters, including translations and rotations, by applying the epipolar constraint to the matched feature points. After choosing the interest points adjacent to two or more contours, we establish a precise correspondence among the connected contours by using the initial parameters and active matching windows. The extrinsic parameters of the omnidirectional camera are estimated by minimizing the angular errors of the epipolar planes of the endpoints and the inversely projected 3-D vectors. Experimental results on synthetic and real images demonstrate that the proposed algorithm obtains more precise camera parameters than the previous method.
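
The epipolar constraint used for the initial parameter estimate can be sketched in its algebraic form, x2ᵀ E x1 = 0 for an essential matrix E; the paper minimizes angular errors of epipolar planes, but the scalar residual below is the common stand-in for checking a candidate (R, t):

```python
import numpy as np

def epipolar_residual(E, x1, x2):
    """Residual of the epipolar constraint x2^T E x1 = 0 for normalized
    homogeneous image points; zero for a perfectly consistent match."""
    return float(x2 @ E @ x1)

# Example essential matrix for a pure x-translation t = (1, 0, 0) with no
# rotation: E = [t]_x R = [t]_x, the skew-symmetric cross-product matrix.
E = np.array([[0.0, 0.0, 0.0],
              [0.0, 0.0, -1.0],
              [0.0, 1.0, 0.0]])
```

Under this motion the epipolar lines are horizontal, so a correct match keeps the same y coordinate in both views and yields a near-zero residual; mismatched contours produce large residuals and are rejected.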