• Title/Abstract/Keyword: 2D position measurement


점 대응 기법을 이용한 카메라의 교정 파라미터 추정에 관한 연구 (A Study on the Estimation of Camera Calibration Parameters using Corresponding Points Method)

  • 최성구; 고현민; 노도환 / 대한전기학회논문지: 시스템및제어부문D / Vol. 50, No. 4 / pp. 161-167 / 2001
  • Camera calibration is a very important problem in 3D measurement using a vision system. This paper proposes a simple method for camera calibration that uses the principle of vanishing points and the concept of corresponding points extracted from parallel line pairs. Conventional methods require four reference points in one frame, whereas the proposed method needs only two reference points to estimate the vanishing points, from which the camera parameters, i.e. focal length, camera attitude, and position, are calculated. Our experiments show the validity and usability of the method: the absolute errors of attitude and position are on the order of $10^{-2}$.

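The calibration above hinges on estimating vanishing points from the image projections of parallel scene lines. Below is a minimal sketch of that single geometric step in Python, using homogeneous coordinates; the line endpoints are invented for illustration, and the paper's full recovery of focal length, attitude, and position is not reproduced here.

```python
import numpy as np

def line_through(p1, p2):
    """Homogeneous line through two image points (x, y)."""
    return np.cross([p1[0], p1[1], 1.0], [p2[0], p2[1], 1.0])

def vanishing_point(line_a, line_b):
    """Intersection of two homogeneous lines; the image of the point at
    infinity shared by a pair of parallel scene lines."""
    v = np.cross(line_a, line_b)
    if abs(v[2]) < 1e-12:
        return None  # the lines are also parallel in the image
    return v[:2] / v[2]

# Made-up image segments of two scene lines that are parallel in 3D.
l1 = line_through((100, 400), (300, 300))
l2 = line_through((120, 500), (340, 380))
print("Estimated vanishing point (pixels):", vanishing_point(l1, l2))
```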

UAV와 BIM 정보를 활용한 시설물 외관 손상의 위치 측정 방법 (Structural Damage Localization for Visual Inspection Using Unmanned Aerial Vehicle with Building Information Modeling Information)

  • 이용주; 박만우 / 한국BIM학회 논문집 / Vol. 13, No. 4 / pp. 64-73 / 2023
  • This study introduces a method for estimating the 3D coordinates of structural damage from visual-inspection detection results given in 2D image coordinates, using UAV sensing data and the 3D shape information of the BIM model. Because the estimation takes place in a virtual space that contains the BIM model, the structural member to which an estimated location belongs can be identified immediately. Unlike conventional structural damage localization methods that require 3D scanning or additional sensor attachment, the proposed method can be applied locally and rapidly. Measurement accuracy was evaluated as the distance between positions measured by a TLS (Terrestrial Laser Scanner) and positions estimated by the proposed method, which indicates the applicability of this study and the direction of future research.
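The localization above projects a 2D detection into a virtual space containing the BIM model to obtain 3D coordinates. The following is a minimal sketch of the underlying geometry, assuming a pinhole camera with known intrinsics and pose (all values hypothetical) and a BIM face approximated by a single plane; the paper's actual UAV sensing and BIM integration is considerably richer.

```python
import numpy as np

def pixel_to_ray(u, v, K, R, t):
    """Back-project pixel (u, v) to a world-space ray.
    K: 3x3 intrinsics; R, t: world-to-camera rotation and translation."""
    d_cam = np.linalg.inv(K) @ np.array([u, v, 1.0])
    d_world = R.T @ d_cam                  # ray direction in the world frame
    origin = -R.T @ t                      # camera center in the world frame
    return origin, d_world / np.linalg.norm(d_world)

def intersect_plane(origin, direction, plane_point, plane_normal):
    """Intersection of the ray with a plane (e.g., one BIM face)."""
    denom = direction @ plane_normal
    if abs(denom) < 1e-9:
        return None                        # ray parallel to the plane
    s = (plane_point - origin) @ plane_normal / denom
    return origin + s * direction if s > 0 else None

# Hypothetical setup: camera at the origin looking along world +X,
# with the damaged wall face modeled as the plane x = 5 m.
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
R = np.array([[0.0, -1, 0], [0, 0, -1], [1, 0, 0]])   # world +X -> camera +Z
t = np.zeros(3)
origin, d = pixel_to_ray(350, 230, K, R, t)
print("Estimated 3D damage location:",
      intersect_plane(origin, d, np.array([5.0, 0, 0]), np.array([1.0, 0, 0])))
```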

Precise Edge Detection Method Using Sigmoid Function in Blurry and Noisy Image for TFT-LCD 2D Critical Dimension Measurement

  • Lee, Seung Woo; Lee, Sin Yong; Pahk, Heui Jae / Current Optics and Photonics / Vol. 2, No. 1 / pp. 69-78 / 2018
  • This paper presents a precise edge detection algorithm for critical dimension (CD) measurement of Thin-Film Transistor Liquid-Crystal Display (TFT-LCD) patterns. A sigmoid surface function is proposed to model the blurred step edge; this model simultaneously recovers the position and geometry of the edge with high precision. The nonlinear least-squares fitting method (Levenberg-Marquardt) is used to fit the image intensity distribution to the proposed sigmoid blurred-edge model. The algorithm is verified by comparing CD measurement repeatability on highly magnified, blurry, and noisy TFT-LCD images against a previous Laplacian of Gaussian (LoG) based sub-pixel edge detection algorithm and an error-function fitting method. The proposed fitting-based edge detection produces more precise results than the previous methods and can be applied to in-line precision CD measurement for high-resolution display devices.
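The core of the method above is fitting a sigmoid model of the blurred step edge to the measured intensity distribution with the Levenberg-Marquardt method and reading the sub-pixel edge position off the fitted parameters. Below is a minimal one-dimensional sketch using scipy.optimize.curve_fit (Levenberg-Marquardt for unbounded fits); the paper fits a 2D sigmoid surface, and the synthetic scan line and parameter names here are illustrative assumptions.

```python
import numpy as np
from scipy.optimize import curve_fit

def sigmoid_edge(x, base, amplitude, center, width):
    """Blurred step edge modeled as a logistic (sigmoid) function."""
    return base + amplitude / (1.0 + np.exp(-(x - center) / width))

# Synthetic blurry, noisy scan line with a true edge at x = 42.3 px.
x = np.arange(0, 80, 1.0)
rng = np.random.default_rng(0)
y = sigmoid_edge(x, 20.0, 180.0, 42.3, 3.0) + rng.normal(0, 4.0, x.size)

p0 = [y.min(), y.max() - y.min(), x.mean(), 2.0]   # rough initial guess
popt, _ = curve_fit(sigmoid_edge, x, y, p0=p0)     # Levenberg-Marquardt fit
print(f"Sub-pixel edge position: {popt[2]:.3f} px")
```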

천장지향 2D-LiDAR 회전 모듈을 이용한 실내 주행 로봇의 천장 기반 위치 추정 (Ceiling-Based Localization of Indoor Robots Using Ceiling-Looking 2D-LiDAR Rotation Module)

  • 안재원; 고윤호 / 한국멀티미디어학회논문지 / Vol. 22, No. 7 / pp. 780-789 / 2019
  • In this paper, we propose a new indoor localization method for mobile robots using LiDAR. Indoor mobile robots operating in limited areas usually require high-precision localization to provide high-level services. The performance of widely used localization methods based on radio waves or computer vision is highly dependent on the usage environment, so their reproducibility is insufficient for such services. To overcome this problem, we propose a localization method based on comparing ceiling shape information obtained from LiDAR measurements with the building blueprint. Specifically, the method includes a reliable segmentation step that classifies point clouds into connected planes and an effective comparison step that estimates position by matching the 3D point clouds to the 2D blueprint information. Since the ceiling shape rarely changes, the proposed method is robust to its usage environment. Simulation results show that the position error of the proposed method is less than 10 cm.
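A central step of the method above is comparing ceiling shape measured by the rotating 2D-LiDAR against the blueprint to estimate the robot's pose. The sketch below is a heavily simplified, hypothetical version of that matching idea: candidate 2D poses are scored by how closely measured ceiling feature points, transformed by the candidate pose, lie to points sampled from the blueprint. The paper's plane segmentation and full 3D-to-blueprint comparison are not reproduced, and all geometry is made up.

```python
import numpy as np
from scipy.spatial import cKDTree

def transform(points, x, y, theta):
    """Apply a 2D pose (x, y, theta) to Nx2 points given in the robot frame."""
    c, s = np.cos(theta), np.sin(theta)
    R = np.array([[c, -s], [s, c]])
    return points @ R.T + np.array([x, y])

def pose_score(measured, blueprint_tree, pose):
    """Mean distance from the transformed measurements to the blueprint."""
    d, _ = blueprint_tree.query(transform(measured, *pose))
    return d.mean()

# Made-up blueprint: points sampled along two parallel ceiling edges.
bp = np.array([[x, 0.0] for x in np.linspace(0, 4, 80)] +
              [[x, 3.0] for x in np.linspace(0, 4, 80)])
tree = cKDTree(bp)

# Made-up measurement: the same edges observed from the true pose (1.0, 0.5, 0.1 rad).
true_pose = (1.0, 0.5, 0.1)
c, s = np.cos(true_pose[2]), np.sin(true_pose[2])
R = np.array([[c, -s], [s, c]])
measured = (bp[::4] - np.array(true_pose[:2])) @ R     # world -> robot frame

# Coarse grid search over candidate poses; the best score gives the estimate.
candidates = [(x, y, th)
              for x in np.linspace(0.0, 2.0, 21)
              for y in np.linspace(0.0, 1.0, 11)
              for th in np.linspace(-0.3, 0.3, 13)]
best = min(candidates, key=lambda p: pose_score(measured, tree, p))
print("Estimated pose (x, y, theta):", best)
```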

An Analysis of 2D Positional Accuracy of Human Bodies Detection Using the Movement of Mono-UWB Radar

  • Kiasari, Mohammad Ahangar; Na, Seung You; Kim, Jin Young / 센서학회지 / Vol. 23, No. 3 / pp. 149-157 / 2014
  • This paper considers the ability to count and position multiple targets using a mobile UWB radar device. After a background subtraction process that distinguishes clutter from human body signals, the positions of targets are computed using weighted Gaussian mixture methods. While computer vision offers many advantages, its performance is limited in poor visibility conditions (e.g., at night or in haze, fog, or smoke). UWB radar provides a complementary technology for detecting and tracking humans, particularly in poor visibility or through-wall conditions. For 2D measurement, one approach is to use at least two receiver antennas; another is to use a single mobile radar receiver. This paper investigates position detection of a stationary human body using the movement of one UWB radar module.
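The abstract above computes target positions with weighted Gaussian mixture methods after background subtraction. The sketch below illustrates only that final clustering step, assuming candidate 2D reflection points have already been recovered from the moving radar's range profiles; the two-target scene, the synthetic data, and the use of scikit-learn's GaussianMixture are assumptions for illustration.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)

# Synthetic candidate reflection points for two stationary people, as if
# already converted from background-subtracted range profiles of the radar.
person_a = rng.normal([2.0, 3.0], 0.15, size=(60, 2))
person_b = rng.normal([4.5, 1.5], 0.15, size=(40, 2))
clutter  = rng.uniform([0, 0], [6, 5], size=(20, 2))
points = np.vstack([person_a, person_b, clutter])

# Fit a Gaussian mixture; component means act as target position estimates
# and the mixture weights indicate how strongly each target is supported.
gmm = GaussianMixture(n_components=2, covariance_type="full",
                      random_state=0).fit(points)
for mean, weight in zip(gmm.means_, gmm.weights_):
    print(f"target at ({mean[0]:.2f}, {mean[1]:.2f}), weight {weight:.2f}")
```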

Validity of Three-dimensional Facial Scan Taken with Facial Scanner and Digital Photo Wrapping on the Cone-beam Computed Tomography: Comparison of Soft Tissue Parameters

  • Aljawad, Hussein; Lee, Kyungmin Clara / Journal of Korean Dental Science / Vol. 15, No. 1 / pp. 19-30 / 2022
  • Purpose: The purpose of this study was to assess the validity of three-dimensional (3D) facial scans taken with a facial scanner and of digital photo wrapping on cone-beam computed tomography (CBCT). Materials and Methods: Twenty-five patients had a CBCT scan, two-dimensional (2D) standardized frontal photographs, and a 3D facial scan obtained on the same day. The facial scans were taken with a facial scanner in an upright position. The 2D standardized frontal photographs were taken at a fixed distance from the patients using a camera fixed to a cephalometric apparatus. The 2D integrated facial models were created by digital photo wrapping of the frontal photographs onto the corresponding CBCT images, and the 3D integrated facial models by integrating the 3D facial scans with the CBCT images. On the integrated facial models, sixteen soft tissue landmarks were identified, and the vertical, horizontal, oblique, and angular distances between landmarks were compared among the 2D facial models, the 3D facial models, and the CBCT images. Result: There were no significant differences in linear and angular measurements among the CBCT images and the 2D and 3D facial models, except for the Se-Sn vertical linear measurement, which differed significantly for the 3D facial models. The Bland-Altman plots showed that all measurements were within the limits of agreement. For the 3D facial models, systematic bias was less than 2.0 mm and 2.0° for all measurements except the Se-Sn vertical linear measurement. For the 2D facial models, 6 of the 11 angular measurements showed a systematic bias of more than 2.0°. Conclusion: The facial scan taken with a facial scanner showed clinically acceptable performance, whereas digital 2D photo wrapping has limitations in clinical use compared with 3D facial scans.
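Agreement in the study above is judged with Bland-Altman statistics, i.e. the systematic bias (mean difference) and 95% limits of agreement between paired measurements. A minimal sketch of that computation for one hypothetical landmark distance follows, with the 2.0 mm acceptance threshold taken from the abstract; the paired values themselves are made up.

```python
import numpy as np

def bland_altman(a, b):
    """Bland-Altman statistics for paired measurements a and b:
    systematic bias (mean difference) and 95% limits of agreement."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Made-up paired distances (mm): CBCT vs. 3D facial scan for one landmark pair.
cbct = [52.1, 48.7, 50.3, 49.9, 51.4, 50.8]
scan = [52.4, 48.2, 50.6, 49.5, 51.9, 50.5]
bias, (lo, hi) = bland_altman(cbct, scan)
print(f"bias = {bias:.2f} mm, limits of agreement = [{lo:.2f}, {hi:.2f}] mm")
print("clinically acceptable (|bias| < 2.0 mm):", abs(bias) < 2.0)
```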

2 차원 탐색 레이다를 위한 국부 항법 좌표계에서의 운동보상을 포함한 추적필터 (A Tracking Filter with Motion Compensation in Local Navigation Frame for Ship-borne 2D Surveillance Radar)

  • 김병두; 이자성 / 제어로봇시스템학회논문지 / Vol. 13, No. 5 / pp. 507-512 / 2007
  • This paper presents a tracking filter with own-ship motion compensation for a ship-borne radar tracking system. The ship's maneuver is described by displacement and rotational motion in the ship-centered east-north frame. A first-order Taylor series approximation of the measurement error covariance of the converted measurement is derived in this frame. The ship's maneuver is compensated by incorporating the converted-measurement error covariance and the displacement of the position state into the tracking filter. Simulation results over 500 Monte Carlo runs show that the proposed method follows the target successfully and provides consistent tracking performance during ship maneuvers, whereas a conventional tracking filter without ship motion compensation fails to track during such periods.
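The filter above converts range-bearing measurements to Cartesian coordinates in the ship-centered east-north frame and propagates their error covariance through a first-order Taylor (Jacobian) approximation. Below is a minimal sketch of that standard converted-measurement step plus a naive displacement compensation; the numeric values and the exact compensation scheme are illustrative assumptions rather than the paper's formulation.

```python
import numpy as np

def converted_measurement(r, b, sigma_r, sigma_b):
    """Convert a (range, bearing) measurement to Cartesian east-north
    coordinates, with its error covariance from a first-order Taylor
    (Jacobian) approximation. Bearing is measured clockwise from north."""
    east, north = r * np.sin(b), r * np.cos(b)
    J = np.array([[np.sin(b),  r * np.cos(b)],
                  [np.cos(b), -r * np.sin(b)]])
    R_polar = np.diag([sigma_r**2, sigma_b**2])
    R_cart = J @ R_polar @ J.T
    return np.array([east, north]), R_cart

# Hypothetical radar measurement and own-ship displacement since the last scan.
z, R = converted_measurement(r=8000.0, b=np.deg2rad(30.0),
                             sigma_r=25.0, sigma_b=np.deg2rad(0.5))
ship_displacement = np.array([12.0, 4.0])   # east-north motion of the own ship
z_compensated = z + ship_displacement       # express the target in a fixed local frame
print(z_compensated, "\n", R)
```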

Measurement of position based on correlative function in self-movement

  • Amano, Naoki; Hashimoto, Hiroshi; Higashiguchi, Minoru / 제어로봇시스템학회: 학술대회논문집 / Proceedings of the Korea Automatic Control Conference, 9th (KACC), Taejeon, Korea, 17-20 Oct. 1994 / pp. 601-604 / 1994
  • This paper describes an effective method for estimating the position of an autonomous vehicle equipped with a single CCD camera along indoor passageways. Using the sequential image data from the vehicle's own movement, the position is estimated by integrating approximated motion parameters. Detecting the yaw angle, one of the motion parameters, is difficult in general because of slip and noise-induced errors; therefore, a different detection scheme is presented that, without shaft encoders, is based on a projection function of the 2D image data and a cross-correlation function so as to be robust to noise. An approximated geometric function is used to estimate the position with reduced computational effort. To verify the effectiveness of the method, analysis and computational results from simulations are shown, together with experimental results from a test vehicle in a real indoor passageway.

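The method above estimates motion without shaft encoders from a projection function of the 2D image data and a cross-correlation function. A minimal sketch of that idea: the horizontal shift between consecutive frames is found from the cross-correlation of their column-sum projections, and the yaw angle would then follow from the shift and the focal length. The synthetic frames and the 7-pixel shift are assumptions for illustration.

```python
import numpy as np

def column_projection(image):
    """1D projection of a 2D image: sum of intensities in each column."""
    return image.sum(axis=0).astype(float)

def horizontal_shift(prev_image, curr_image):
    """Estimate the horizontal image shift between two frames from the
    cross-correlation of their (mean-removed) column projections."""
    a = column_projection(prev_image)
    b = column_projection(curr_image)
    a, b = a - a.mean(), b - b.mean()
    corr = np.correlate(b, a, mode="full")
    return int(np.argmax(corr)) - (len(a) - 1)

# Synthetic corridor frames: the second is the first shifted right by 7 px plus noise.
rng = np.random.default_rng(2)
frame1 = rng.integers(0, 255, size=(120, 160)).astype(float)
frame2 = np.roll(frame1, 7, axis=1) + rng.normal(0, 2.0, size=frame1.shape)

shift_px = horizontal_shift(frame1, frame2)
print("Estimated shift:", shift_px, "px")   # ~7 px; yaw ~ shift / focal length
```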

방사선수술을 위한 3차원 정위 시스템 및 방사선량 측정 시스템 개발 (Development of 3-D Stereotactic Localization System and Radiation Measurement for Stereotactic Radiosurgery)

  • 서태석; 서덕영; 박승훈; 장홍석; 최보영; 윤세철; 신경섭; 박용휘; 김일환; 강위생; 하성환; 박찬일 / Journal of Radiation Protection and Research / Vol. 20, No. 1 / pp. 25-36 / 1995
  • The purpose of this research is to develop a stereotactic localization and radiation measurement system for efficient and precise radiosurgery. An algorithm to obtain the 3-D stereotactic coordinates of the target was developed using a Fisher CT or angio localizer. The stereotactic localization procedure was programmed on a PC and consists of three steps: (1) transferring patient images to the PC; (2) marking the positions of the target and the reference points of the localizer on the patient image; (3) computing the stereotactic 3-D coordinates of the target from the position information of the localizer. The coordinate transformation is performed quickly, in real time. The difference between coordinates computed by the angio and CT localization methods was within 2 mm, which is generally acceptable and supports the reliability of the developed localization system. We measured dose distributions in small fields of an NEC 6 MV X-ray linear accelerator using various detectors: ion chamber, film, and diode. The measured quantities include output factor, percent depth dose (PDD), tissue maximum ratio (TMR), and off-axis ratio (OAR). There was only a small variation in the measured data among the different detectors, and the overall trends were similar enough to rely on our measurements. Measurements for a standard arc set-up were performed with a hand-made spherical water phantom and film, and the obtained dose distribution was as expected. In conclusion, a PC-based 3-D stereotactic localization system was developed to determine the stereotactic coordinates of the target, and a convenient technique for small-field measurement was demonstrated. These methods will be very helpful for stereotactic radiosurgery.

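Step (3) of the localization procedure above computes the stereotactic 3-D coordinates of the target from the localizer's reference points. The sketch below shows one generic way such a transformation can be set up, a least-squares affine map fitted from fiducial correspondences and applied to the marked target; the fiducial coordinates are invented and this is not the specific Fisher CT/angio algorithm of the paper.

```python
import numpy as np

def fit_affine(src, dst):
    """Least-squares affine transform mapping Nx3 src points onto Nx3 dst points."""
    A = np.hstack([src, np.ones((len(src), 1))])   # rows [x y z 1]
    M, *_ = np.linalg.lstsq(A, dst, rcond=None)    # 4x3 affine matrix
    return M

def apply_affine(M, point):
    """Map a single 3-D point through the fitted affine transform."""
    return np.append(point, 1.0) @ M

# Made-up localizer fiducials: marker coordinates in the image (pixels/slice)
# and the same markers' known coordinates in the stereotactic frame (mm).
image_fiducials = np.array([[102.0,  98.5, 40.0], [412.3, 101.2, 40.0],
                            [408.9, 399.8, 40.0], [105.4, 402.1, 40.0],
                            [257.0, 250.0, 80.0]])
frame_fiducials = np.array([[-70.0, -70.0,  0.0], [ 70.0, -70.0,  0.0],
                            [ 70.0,  70.0,  0.0], [-70.0,  70.0,  0.0],
                            [  0.0,   0.0, 18.0]])

M = fit_affine(image_fiducials, frame_fiducials)
target_in_image = np.array([260.0, 180.0, 60.0])   # target marked on the image
print("Stereotactic target coordinates (mm):", apply_affine(M, target_in_image))
```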

화살 탄착점 측정을 위한 레이저 스캔 카메라 파라미터 보정 (Parameter Calibration of Laser Scan Camera for Measuring the Impact Point of Arrow)

  • 백경동; 천성표; 이인성; 김성신 / 한국생산제조학회지 / Vol. 21, No. 1 / pp. 76-84 / 2012
  • This paper presents a measurement system for an arrow's point of impact using a laser scan camera and describes the image calibration method. Calibration of the distorted image is commonly divided into explicit and implicit methods. The explicit method works directly with the optical properties of the physical camera and adjusts its parameters, while the implicit method relies on a calibration plate that provides assumed relations between image pixels and target positions. To find the relations between image and target positions in the implicit method, we propose a polynomial model selected by performance criteria, which overcomes some limitations of conventional image calibration models such as over-fitting. The proposed method is verified with 2D arrow positions captured by a SICK Ranger-D50 laser scan camera.
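The implicit calibration described above learns a relation between image pixels and target positions from a calibration plate, with a polynomial model whose complexity must be controlled to avoid over-fitting. Below is a minimal sketch of such a polynomial pixel-to-target fit via least squares; the degree-2 model, the synthetic correspondences, and the mapping itself are illustrative assumptions, not the paper's performance-criteria-based selection.

```python
import numpy as np

def design_matrix(u, v, degree):
    """Monomial terms u^i * v^j with i + j <= degree, one row per pixel."""
    cols = [u**i * v**j for i in range(degree + 1)
                        for j in range(degree + 1 - i)]
    return np.column_stack(cols)

def fit_pixel_to_target(pixels, targets, degree=2):
    """Least-squares polynomial map from image pixels to target-plane positions,
    as in an implicit (calibration-plate) method; degree controls over-fitting."""
    A = design_matrix(pixels[:, 0], pixels[:, 1], degree)
    coef, *_ = np.linalg.lstsq(A, targets, rcond=None)
    return coef

# Made-up calibration correspondences: detected pixel coords -> plate coords (mm).
rng = np.random.default_rng(3)
pixels = rng.uniform(0, 640, size=(40, 2))
u, v = pixels[:, 0], pixels[:, 1]
targets = np.column_stack([0.5 * u + 0.002 * u * v,
                           0.5 * v - 0.001 * v**2]) + rng.normal(0, 0.1, (40, 2))

coef = fit_pixel_to_target(pixels, targets, degree=2)
u_q, v_q = np.array([320.0]), np.array([240.0])     # a detected arrow impact pixel
estimate = design_matrix(u_q, v_q, 2) @ coef
print("Estimated impact position on the target plane (mm):", estimate[0])
```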