• Title/Abstract/Keyword: Mobile Camera

959 results found (processing time: 0.024 s)

이동 로봇의 상대적 위치 추정을 위한 직사각형 기반의 기하학적 방법 (Geometric Formulation of Rectangle Based Relative Localization of Mobile Robot)

  • 이주행;이재연;이아현;김재홍
    • The Journal of Korea Robotics Society
    • /
    • Vol. 11, No. 1
    • /
    • pp.9-18
    • /
    • 2016
  • A rectangle-based relative localization method is proposed for a mobile robot, based on a novel geometric formulation. In the artificial environments where mobile robots navigate, rectangular shapes are ubiquitous. When a scene rectangle is captured by a camera attached to a mobile robot, localization can be performed and described in the relative coordinates of the scene rectangle. In particular, our method works with a single image of a scene rectangle whose aspect ratio is not known. Moreover, camera calibration is unnecessary under the assumption of a pinhole camera model. The proposed method is largely based on the theory of coupled line cameras (CLC), which provides a basis for efficient computation with analytic solutions and an intuitive geometric interpretation. We introduce the fundamentals of CLC and describe the proposed method with experimental results in a simulation environment.
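
The pinhole assumption the abstract relies on can be sketched in a few lines. The snippet below is a minimal illustration, not the CLC formulation itself: it projects the four corners of a hypothetical scene rectangle, posed in the camera frame, onto the image plane, producing the single-image quadrilateral that the method takes as input. All names and numeric values here are our own assumptions.

```python
def project(point, f=1.0):
    """Pinhole projection of a 3-D point (X, Y, Z), Z > 0, in camera coordinates."""
    X, Y, Z = point
    return (f * X / Z, f * Y / Z)

# Four corners of a hypothetical scene rectangle after some relative pose
# (values chosen for illustration; its aspect ratio is unknown to the method).
corners = [(-1.0, -0.5, 4.0), (1.0, -0.5, 4.0), (1.0, 0.5, 5.0), (-1.0, 0.5, 5.0)]

# The projected quadrilateral is the only image input the single-image method needs.
image_quad = [project(c) for c in corners]
```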

단일 카메라를 이용한 이동 로봇의 실시간 위치 추정 및 지도 작성에 관한 연구 (A Study on Real-Time Localization and Map Building of Mobile Robot using Monocular Camera)

  • 정대섭;최종훈;장철웅;장문석;공정식;이응혁;심재홍
    • Proceedings of the Korean Institute of Electrical Engineers (KIEE) Conference
    • /
    • Proceedings of the 2006 KIEE Conference, Information and Control Section
    • /
    • pp.536-538
    • /
    • 2006
  • The most important tasks for a mobile robot are to build a map of its surrounding environment and to estimate its own localization. This paper proposes a real-time localization and map-building method based on 3-D reconstruction using scale-invariant features from a monocular camera. A mobile robot with a wall-facing monocular camera extracts scale-invariant features from each image using SIFT (Scale Invariant Feature Transform) as it follows the wall. The extracted features are matched, and a feature map, transformed into absolute coordinates by 3-D point reconstruction and geometric analysis of the surrounding environment, is built and stored in a map database. After the feature map is built, the robot finds points that match the previous feature map and estimates its pose from affine parameters in real time. The maximum position error of the proposed method was 8 cm, and the angle error was within 10°.
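
The final pose step, recovering pose from affine parameters, can be illustrated with a toy sketch. Assuming (hypothetically) that exactly three matched feature points are available, a 2-D affine transform can be solved in closed form with Cramer's rule; the abstract's actual pipeline uses many SIFT matches and 3-D reconstruction, which this sketch omits.

```python
def affine_from_points(src, dst):
    """Solve the 2-D affine map (a, b, tx, c, d, ty) sending three src points
    exactly onto three dst points, via Cramer's rule on a 3x3 linear system."""
    (x0, y0), (x1, y1), (x2, y2) = src
    det = x0 * (y1 - y2) - y0 * (x1 - x2) + (x1 * y2 - x2 * y1)

    def solve(v0, v1, v2):
        # Coefficients for one output coordinate: a*x + b*y + t = v.
        a = (v0 * (y1 - y2) - y0 * (v1 - v2) + (v1 * y2 - v2 * y1)) / det
        b = (x0 * (v1 - v2) - v0 * (x1 - x2) + (x1 * v2 - x2 * v1)) / det
        t = (x0 * (y1 * v2 - y2 * v1) - y0 * (x1 * v2 - x2 * v1)
             + v0 * (x1 * y2 - x2 * y1)) / det
        return a, b, t

    a, b, tx = solve(*(p[0] for p in dst))
    c, d, ty = solve(*(p[1] for p in dst))
    return a, b, tx, c, d, ty
```

For a pure translation by (2, 3) the solver returns a = d = 1, b = c = 0, tx = 2, ty = 3; with more than three noisy matches, a least-squares fit would be used instead.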


카메라-레이저스캐너 상호보완 추적기를 이용한 이동 로봇의 사람 추종 (Person-following of a Mobile Robot using a Complementary Tracker with a Camera-laser Scanner)

  • 김형래;최학남;이재홍;이승준;김학일
    • Journal of Institute of Control, Robotics and Systems
    • /
    • Vol. 20, No. 1
    • /
    • pp.78-86
    • /
    • 2014
  • This paper proposes a method of tracking an object for a person-following mobile robot by combining a monocular camera and a laser scanner, where each sensor compensates for the weaknesses of the other. For human-robot interaction, a mobile robot needs to maintain a distance between a moving person and itself. Maintaining this distance consists of two parts: object tracking and person-following. Object tracking consists of particle filtering and online learning using shape features extracted from an image. A monocular camera easily fails to track a person due to its narrow field of view and the influence of illumination changes, and has therefore been used together with a laser scanner. After constructing the geometric relation between the differently oriented sensors, the proposed method demonstrates its robustness in tracking and following a person, with a success rate of 94.7% in indoor environments with varying lighting conditions, even when a moving object is located between the robot and the person.

Compact Zoom Lens Design for a 5x Mobile Camera Using Prism

  • Park, Sung-Chan;Lee, Sang-Hun;Kim, Jong-Gyu
    • Journal of the Optical Society of Korea
    • /
    • Vol. 13, No. 2
    • /
    • pp.206-212
    • /
    • 2009
  • This study presents a compact zoom lens with a 5x zoom ratio for a mobile camera, using a prism. Lens modules and aberrations are applied to the initial design of a four-group inner-focus zoom system. An initial design with a focal length range of 4.4 to 22.0 mm is derived by assigning the first-order quantities and third-order aberrations to each module, along with the constraints required for optimum solutions. We separately designed a real lens for each group and then combined them to establish an actual zoom system. The combination of the separately designed groups results in a system that satisfies the basic properties of the zoom system consisting of the original lens modules. To obtain a slim system, we inserted a right-angle prism directly in front of the first group. This configuration resulted in a more compact zoom system with a depth of 8 mm. The final zoom lens design has an f-number of 3.5 to 4.5 and is expected to fulfill the requirements of a slim mobile zoom camera with a high zoom ratio of 5x.
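
The first-order numbers quoted in the abstract can be cross-checked directly. The sketch below (illustrative only; the variable names are our own) verifies the 5x ratio from the focal-length range and derives the entrance-pupil diameters D = f/N implied by the stated f-numbers.

```python
f_wide, f_tele = 4.4, 22.0   # focal-length range from the abstract, in mm
N_wide, N_tele = 3.5, 4.5    # f-numbers at the wide and tele ends

zoom_ratio = f_tele / f_wide   # 22.0 / 4.4 = 5.0x, matching the stated zoom ratio
D_wide = f_wide / N_wide       # entrance-pupil diameter at the wide end, mm
D_tele = f_tele / N_tele       # entrance-pupil diameter at the tele end, mm
```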

옴니 카메라의 전방향 영상을 이용한 이동 로봇의 위치 인식 시스템 (Omni Camera Vision-Based Localization for Mobile Robots Navigation Using Omni-Directional Images)

  • 김종록;임미섭;임준홍
    • Journal of Institute of Control, Robotics and Systems
    • /
    • Vol. 17, No. 3
    • /
    • pp.206-210
    • /
    • 2011
  • Vision-based robot localization is challenging due to the vast amount of visual information available, which requires extensive storage and processing time. To deal with these challenges, we propose using features extracted from omni-directional panoramic images, and we present a localization method for a mobile robot equipped with an omni-directional camera. The core of the proposed scheme may be summarized as follows. First, we utilize an omni-directional camera that can capture instantaneous 360° panoramic images around the robot. Second, nodes around the robot are extracted by the correlation coefficients of the circular horizontal line between the landmark and the currently captured image. Third, the robot position is determined from these locations by the proposed correlation-based landmark image matching. To accelerate computation, node candidates are assigned using color information, and the correlation values are calculated using fast Fourier transforms. Experiments show that the proposed method is effective for the global localization of mobile robots and robust to lighting variations.
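
The second step, matching circular horizontal-line signatures by correlation, amounts to a search over circular shifts. The naive O(n²) version below is for illustration only: the paper accelerates the same computation with FFTs and uses correlation coefficients rather than the raw dot products shown here.

```python
def best_rotation(a, b):
    """Circular shift s maximizing the circular cross-correlation
    sum_i a[i] * b[(i - s) % n] between two equal-length signatures."""
    n = len(a)
    scores = [sum(a[i] * b[(i - s) % n] for i in range(n)) for s in range(n)]
    return max(range(n), key=scores.__getitem__)

# A toy 8-sample signature and the same signature rotated left by 3 samples,
# standing in for the stored landmark line and the currently captured one.
line = [0, 1, 2, 3, 2, 1, 0, 0]
rotated = line[3:] + line[:3]
```

best_rotation(line, rotated) recovers the shift of 3, which would correspond to the robot's heading change relative to the stored landmark node.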

차량측량시스템을 위한 영상취득 프로그램 개발 (Development of Image Capture Program for a Mobile Mapping System along CCD Camera Characteristics)

  • 정동훈;김병국
    • Journal of Korea Spatial Information System Society
    • /
    • Vol. 4, No. 2
    • /
    • pp.35-40
    • /
    • 2002
  • In this study, we developed a program to capture CCD (Charge Coupled Device) camera images for use in a mobile mapping system currently under development. In particular, the program can capture and store two high-resolution color images simultaneously, which makes digital photogrammetric techniques applicable. Combined with GPS-IMU equipment, the developed program can be used efficiently in a mobile mapping system for determining the three-dimensional positions of various facilities and assessing their condition. Whereas the programs developed and used to date handle only gray-scale images, the images used in this study are high-resolution color images, which improve the interpretability of objects and are therefore expected to broaden the application areas of mobile mapping systems.


혼합 비주얼 서보잉을 통한 모바일 로봇의 물체 추종 (Objects Tracking of the Mobile Robot Using the Hybrid Visual Servoing)

  • 박강일;우창준;이장명
    • Journal of Institute of Control, Robotics and Systems
    • /
    • Vol. 21, No. 8
    • /
    • pp.781-787
    • /
    • 2015
  • This paper proposes a hybrid visual servoing algorithm for object tracking by a mobile robot with a stereo camera. The mobile robot performs object recognition and object tracking using the SIFT and CAMSHIFT algorithms for the hybrid visual servoing. The CAMSHIFT algorithm, applied to stereo camera images, is used to obtain the three-dimensional position and orientation of the mobile robot. With the hybrid visual servoing, stable balance control is realized by a control system that calculates a desired angle of the center of gravity, whose location depends on the variations of the link rotation angles of the manipulator. A PID control algorithm was adopted for the control of the manipulator, since it is simple to design and does not require unnecessarily complex dynamics. To demonstrate the control performance of the hybrid visual servoing, real experiments were performed using the mobile manipulator system developed for this research.
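
The PID controller mentioned for the manipulator can be sketched in a few lines. This is a generic discrete PID, not the authors' tuned controller; the gains, time step, and error signal (e.g. the deviation of the center-of-gravity angle from its desired value) are all placeholders.

```python
class PID:
    """Discrete PID controller: u = Kp*e + Ki*integral(e) + Kd*de/dt."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0      # running integral of the error
        self.prev_error = None   # previous error, for the derivative term

    def update(self, error):
        self.integral += error * self.dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative
```

Each control cycle would feed the current tracking error into update() and apply the returned command to the manipulator joint.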

Novel Telecentric Collimator Design for Mobile Optical Inspection Instruments

  • Hojong Choi;Seongil Cho;Jaemyung Ryu
    • Current Optics and Photonics
    • /
    • Vol. 7, No. 3
    • /
    • pp.263-272
    • /
    • 2023
  • A collimator is an optical system that images a collimated beam at a desired point: a resolution target located at a near distance can be converted into a virtual image located at a long distance. To test the resolution of a mobile camera, a large target is normally placed at a long distance; if a collimator system is used, the target can instead be placed at a near distance, so the space required for a resolution inspection can be drastically reduced. However, to inspect a mobile camera, the exit pupil of the collimator system must match the entrance pupil of the mobile camera, and the stop of the collimator system must be located on the last surface. Because such a collimator system cannot be symmetrical with respect to the stop, the distortion becomes extremely large; this can be corrected by combining the collimator symmetrically with respect to the object plane. A novel system was designed to inspect an optical lens on a mobile phone. After arranging the refractive power, lenses were added using the equivalent lens design method. The distortion was reduced to less than 1%. The optical system achieves a half-field angle of 45° and optical performance sufficient for inspection.

Modified Particle Filtering for Unstable Handheld Camera-Based Object Tracking

  • Lee, Seungwon;Hayes, Monson H.;Paik, Joonki
    • IEIE Transactions on Smart Processing and Computing
    • /
    • Vol. 1, No. 2
    • /
    • pp.78-87
    • /
    • 2012
  • In this paper, we address the tracking problems caused by camera motion and the rolling-shutter effects associated with the CMOS sensors in consumer handheld cameras, such as mobile cameras, digital cameras, and digital camcorders. A modified particle filtering method is proposed for simultaneously tracking objects and compensating for the effects of camera motion. The proposed method uses an elastic registration (ER) algorithm that considers global affine motion as well as the brightness and contrast between images, under the assumption that camera motion results in an affine transform of the image between two successive frames. Because the camera motion is modeled globally by an affine transform, only the global affine model, rather than a local model, is considered. Only the brightness parameter is used to model intensity variation; the contrast parameters of the original ER algorithm are ignored, because the change in illumination between temporally adjacent frames is small. The proposed particle filtering consists of four steps: (i) prediction, (ii) compensation of the prediction state error based on camera motion estimation, (iii) update, and (iv) re-sampling. Without this compensation, a larger number of particles would be needed, because camera motion generates a prediction state error for the object at the prediction step. The proposed method robustly tracks the object of interest by compensating for the prediction state error using the affine motion model estimated by ER. Experimental results show that the proposed method outperforms the conventional particle filter and can track moving objects robustly in consumer handheld imaging devices.
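
The four-step loop can be sketched on a toy 1-D state. The functions below are illustrative assumptions, not the paper's implementation: the "camera motion" is reduced to a pure translation standing in for the global affine model estimated by ER, and the resampling is the standard systematic scheme.

```python
import math
import random

def predict(particles, sigma, rng):
    # (i) Prediction: diffuse each particle with Gaussian process noise.
    return [p + rng.gauss(0.0, sigma) for p in particles]

def compensate(particles, camera_shift):
    # (ii) Subtract the estimated global camera motion (here a pure translation
    # standing in for the ER affine model) so that camera shake does not
    # accumulate as a prediction state error in the particle states.
    return [p - camera_shift for p in particles]

def update_weights(particles, measurement, sigma):
    # (iii) Update: Gaussian likelihood of the measurement, then normalize.
    w = [math.exp(-0.5 * ((p - measurement) / sigma) ** 2) for p in particles]
    total = sum(w)
    return [x / total for x in w]

def systematic_resample(particles, weights, u0=0.5):
    # (iv) Re-sampling: systematic scheme with a fixed offset u0 in [0, 1).
    n = len(particles)
    cdf, total = [], 0.0
    for w in weights:
        total += w
        cdf.append(total)
    out, j = [], 0
    for i in range(n):
        u = (i + u0) / n
        while cdf[j] < u:
            j += 1
        out.append(particles[j])
    return out
```

One tracking iteration would chain these as predict → compensate → update_weights → systematic_resample.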


Landmark를 이용한 localization 문제 접근에 관한 연구 (A study on approach of localization problem using landmarks)

  • 김태우;이쾌희
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • Proceedings of the 1997 Korea Automatic Control Conference; KEPCO Seoul Training Institute; 17-18 Oct. 1997
    • /
    • pp.44-47
    • /
    • 1997
  • Building a reliable mobile robot, one that can navigate without failures for long periods of time, requires that the uncertainty resulting from control and sensing be bounded. This paper proposes a new mobile robot localization method using artificial landmarks. For mobile robot localization, the proposed method uses camera calibration (extrinsic parameters only). We used a FANUC Arc Mate to estimate the posture error, and the results show that the position error is less than 1 cm and the orientation error is less than 1 degree.
