• Title/Abstract/Keywords: Real-time position tracking

Search results: 281 (processing time: 0.02 s)

Robust Control of AM1 Robot Using PSD Sensor and Back Propagation Algorithm

  • 정동연;한성현
    • 한국산업융합학회 논문집
    • /
    • Vol. 7 No. 2
    • /
    • pp.167-172
    • /
    • 2004
  • Neural networks are used in the framework of sensor-based tracking control of robot manipulators. Through practice movements, they learn the relationship between PSD (analog Position Sensitive Detector) sensor readings for target positions and the joint commands needed to reach them. With this configuration, the system can track or follow a moving or stationary object in real time. Furthermore, an efficient neural network architecture has been developed for real-time learning. This network uses multiple sets of simple back-propagation networks, one of which is selected according to which division of the data space (corresponding to a cluster of the self-organizing feature map) the current input data belongs to. This lends itself to the very fast training and processing implementation required for real-time control.

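The divided-network idea above can be sketched in a few lines: pick the self-organizing-map cluster the input falls into, then run only that cluster's small network. This is an illustrative reconstruction, not the paper's code; the centroids, weights, and single sigmoid layer are assumptions.

```python
import math

def nearest_cluster(x, centroids):
    """Index of the SOM cluster centre closest to input x."""
    return min(range(len(centroids)),
               key=lambda i: sum((a - b) ** 2 for a, b in zip(x, centroids[i])))

def forward(x, weights):
    """One-layer network: weighted sum through a sigmoid, per output unit."""
    return [1.0 / (1.0 + math.exp(-sum(w * xi for w, xi in zip(row, x))))
            for row in weights]

def gated_predict(x, centroids, experts):
    """Select the expert network for x's cluster and run only that network."""
    return forward(x, experts[nearest_cluster(x, centroids)])

centroids = [[0.0, 0.0], [1.0, 1.0]]       # two SOM divisions (illustrative)
experts = [[[1.0, -1.0]], [[-1.0, 1.0]]]   # one tiny expert network per division
print(nearest_cluster([0.9, 1.1], centroids))  # → 1
```

Only one small network is trained and evaluated per sample, which is what makes the scheme fast enough for real-time control.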

A Non-invasive Real-time Respiratory Organ Motion Tracking System for Image Guided Radio-Therapy

  • 김윤종;윤의중
    • 대한의용생체공학회:의공학회지
    • /
    • Vol. 28 No. 5
    • /
    • pp.676-683
    • /
    • 2007
  • A non-invasive respiratory-gated radiotherapy system based on external anatomic motion is more comfortable for patients during treatment than an invasive system. However, a higher correlation between the external and internal anatomic motion is required to make non-invasive respiratory gating effective. Both invasive and non-invasive methods need to track the internal anatomy with high precision and rapid response; the non-invasive method in particular has more difficulty tracking the target position continuously because it relies on image processing alone. We therefore developed a motion-tracking system for non-invasive respiratory gating that accurately finds the dynamic position of internal structures such as the diaphragm and tumor. The respiratory organ motion tracking apparatus consists of an image capture board, a fluoroscopy system, and a processing computer. After the image board grabs the motion of the internal anatomy through the fluoroscopy system, the computer acquires the organ motion tracking data by image processing without any additional physical markers. During the experiment, patients breathed freely without forced breath control or coaching. The developed pattern-recognition software extracted the target motion signal in real time from the acquired fluoroscopic images. The range of mean deviations between the real and acquired target positions was measured for sample structures in an anatomical model phantom. With standardized movement using a moving stage and the anatomical model phantom, the mean and maximum deviations between the real and acquired positions were less than 1 mm and 2 mm, respectively. In the real human body, the mean and maximum peak-to-trough distances of diaphragm motion for 13 patients were 23.5 mm and 55.1 mm, respectively. The acquired respiration profile showed that the human expiration period is longer than the inspiration period. These results can be applied to respiratory-gated radiotherapy.
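The marker-less tracking step can be illustrated with a toy template match over grey-level frames: the target position in each fluoroscopic image is the offset where a small template fits best. This is only a sketch of the idea (sum of absolute differences over plain 2-D lists); the paper's actual pattern-recognition software is not reproduced here, and the frame and template are illustrative.

```python
def match_template(frame, tmpl):
    """Return the (row, col) offset minimising the sum of absolute differences."""
    fh, fw = len(frame), len(frame[0])
    th, tw = len(tmpl), len(tmpl[0])
    best, best_pos = float("inf"), (0, 0)
    for r in range(fh - th + 1):
        for c in range(fw - tw + 1):
            sad = sum(abs(frame[r + i][c + j] - tmpl[i][j])
                      for i in range(th) for j in range(tw))
            if sad < best:
                best, best_pos = sad, (r, c)
    return best_pos

frame = [[0] * 6 for _ in range(6)]
frame[3][2] = frame[3][3] = 9      # bright structure (e.g. diaphragm edge) at row 3
tmpl = [[9, 9]]                    # template of the structure's appearance
print(match_template(frame, tmpl))  # → (3, 2)
```

Running this match on each grabbed frame yields the target motion signal over time without physical markers.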

Real-time monitoring of net setting and hauling process in fishing operations of Danish seine vessel using ECDIS

  • 이대재;변덕수
    • 수산해양기술연구
    • /
    • Vol. 43 No. 4
    • /
    • pp.347-354
    • /
    • 2007
  • This paper describes the real-time monitoring of the net setting and hauling process in fishing operations of Danish seine vessels in the southern waters of Korea, as an application of a PC-based ECDIS system. Tracking of the fishing process was performed for a large-scale Danish seine vessel of G/T 90 and 350 PS class, using fishing gear in which the lengths of the net, ground rope, head rope, and sweep line (including the warps on both sides) were 86 m, 104 m, 118 m, and 3,200 m, respectively. Tracking information for the net setting and hauling process was continuously recorded for 23 fishing operations performed in November and December 2003. All measured data, such as trawl position, heading, towing course, and past track, individually time-stamped during acquisition, were processed in real time on the ECDIS and displayed simultaneously on the ENC chart. The results indicated that, after the separation of the marker buoy from the Danish seiner, the averaged running speed of the vessel and the averaged setting period while shooting the seine on a diamond-shaped course to surround the fish school were 8.3 knots and 13.1 minutes, respectively, over the 23 fishing operations. With the marker buoy taken back on board, the averaged running speed and the averaged towing period while closing the seine on a straight route were 1.0 knots and 47.0 minutes, respectively. After the closing stage, the hand rope was towed at an averaged speed of 2.2 knots for 13.0 minutes. The average area of the diamond-shaped route swept by the sweep lines of the seine over the 23 fishing grounds was $709,951.6m^2$. Further investigation is also planned to provide more quantitative tracking information and to achieve more effective surveillance and control of Danish seine vessels in EEZ fishing grounds.
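The swept-area figure reported above is the kind of quantity that follows directly from the recorded track via the shoelace formula. A minimal sketch, using an illustrative diamond-shaped route in local metres rather than real ECDIS fixes:

```python
def polygon_area(points):
    """Shoelace formula: area (m^2) enclosed by a closed track of (x, y) fixes."""
    n = len(points)
    s = sum(points[i][0] * points[(i + 1) % n][1]
            - points[(i + 1) % n][0] * points[i][1] for i in range(n))
    return abs(s) / 2.0

# Illustrative diamond-shaped setting route, vertices in metres from the buoy.
diamond = [(0, 500), (500, 0), (0, -500), (-500, 0)]
print(polygon_area(diamond))  # → 500000.0
```

Applying this to each operation's recorded diamond route and averaging over the 23 operations gives the reported mean swept area.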

Irises Position Tracking Improvement Using S-FPDP on Moving Irises Image

  • 류광렬;채덕현
    • 한국정보통신학회논문지
    • /
    • Vol. 10 No. 1
    • /
    • pp.123-126
    • /
    • 2006
  • This paper studies improving the position-tracking capability for images of an iris moving within the eye. The S-FPDP technique was applied as the improvement method. Experimental results show that the tracking range increased compared with the conventional FPDP technique, and the moving iris could be tracked.

A Study on the Real-Time Vision Control Method for Manipulator's Position Control in the Uncertain Circumstance

  • 정완식;김경석;신광수;주철;김재확;윤현권
    • 한국정밀공학회지
    • /
    • Vol. 16 No. 12
    • /
    • pp.87-98
    • /
    • 1999
  • This study concentrates on the development of a real-time estimation model and vision control method, together with experimental tests. The proposed method permits a kind of adaptability not otherwise available, in that the relationship between the camera-space location of manipulable visual cues and the vector of manipulator joint coordinates is estimated in real time. This is done based on an estimation model that generalizes known manipulator kinematics to accommodate unknown relative camera position and orientation as well as manipulator uncertainty. This vision control method is robust and reliable, overcoming the difficulties of conventional approaches such as precise calibration of the vision sensor, exact kinematic modeling of the manipulator, and correct knowledge of the position and orientation of the CCD camera with respect to the manipulator base. Finally, evidence of the ability of the real-time vision control method to control the manipulator's position is provided by performing thin-rod placement in space with a two-cue test model, completed without prior knowledge of camera or manipulator positions. This feature opens the door to a range of manipulation applications, including a mobile manipulator with stationary cameras tracking it and providing information for its control.

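The core idea, estimating the camera-space/joint-coordinate relationship online instead of calibrating it offline, can be caricatured in one dimension: fit a linear map from a joint angle to a cue's image coordinate by least squares, and re-fit as new samples arrive. The single-joint linear model and the sample data are illustrative assumptions; the paper's estimation model generalizes full manipulator kinematics.

```python
def fit_line(qs, ys):
    """Least-squares slope a and intercept b for y ≈ a*q + b."""
    n = len(qs)
    mq, my = sum(qs) / n, sum(ys) / n
    a = sum((q - mq) * (y - my) for q, y in zip(qs, ys)) / \
        sum((q - mq) ** 2 for q in qs)
    return a, my - a * mq

qs = [0.0, 0.5, 1.0, 1.5]        # joint angles (rad), illustrative samples
ys = [10.0, 20.0, 30.0, 40.0]    # cue position in camera space (px)
a, b = fit_line(qs, ys)
print(round(a, 3), round(b, 3))  # → 20.0 10.0
```

In use, each new (joint, camera-space) observation would be appended and the map re-estimated, so the controller adapts to unknown camera placement without explicit calibration.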

Development of Acquisition and Analysis System of Radar Information for Small Inshore and Coastal Fishing Vessels - Position Tracking and Real-Time Monitoring -

  • 이대재;김광식;신형일;변덕수
    • 수산해양기술연구
    • /
    • Vol. 39 No. 4
    • /
    • pp.337-346
    • /
    • 2003
  • As an attempt at recording and quantitatively analysing radar information, we built a two-vessel range measurement system that monitors another vessel's position on own-ship's electronic chart and radar display to measure the distance between the two vessels, and a system for real-time monitoring, on own-ship's electronic chart, of targets tracked by an ARPA radar; a radar information acquisition and analysis system that can record and replay this information on demand was then developed and tested in practical use. The results are summarized as follows. (1) Radar signals were recorded in waters near Mokpo, and other-vessel positions generated by a position simulator were transmitted through an RS232C interface and superimposed on the radar image. While tracking the other vessel, the range between the two vessels was computed in real time; real-time measurement of inter-vessel range and bearing proved possible, demonstrating that the system can be used as a two-vessel range meter. (2) After receiving the other vessel's radar signal, the center position of its echo was tracked in real time using an $\alpha$-$\beta$ tracker, and its course, speed, bearing, and range were estimated, yielding very stable smoothed estimates. (3) Target tracking information generated with an ARPA simulator as TTM sentences was transmitted to and superimposed on the electronic chart; real-time monitoring of the tracked target's position, speed, course, bearing, and range was realized stably, suggesting the development of a multi-monitoring device for sharing ARPA information at several locations on board.
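The $\alpha$-$\beta$ tracker mentioned in result (2) is a fixed-gain predictor-corrector. A minimal sketch with illustrative gains and range samples (not the system's actual parameters):

```python
def alpha_beta_track(measurements, dt=1.0, alpha=0.85, beta=0.005):
    """Smooth noisy position fixes and estimate velocity with fixed gains."""
    x, v = measurements[0], 0.0
    out = []
    for z in measurements[1:]:
        xp = x + v * dt            # predict position forward one interval
        r = z - xp                 # measurement residual
        x = xp + alpha * r         # correct position estimate
        v = v + (beta / dt) * r    # correct velocity estimate
        out.append((x, v))
    return out

fixes = [0.0, 1.2, 1.9, 3.1, 4.0, 5.1]   # noisy range samples (illustrative)
est = alpha_beta_track(fixes)
print(round(est[-1][0], 2))               # smoothed final range estimate
```

The same recursion applied to the echo centroid in both coordinates yields the smoothed course, speed, bearing, and range estimates the abstract reports.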

Head tracking system using image processing

  • 박경수;임창주;반영환;장필식
    • 대한인간공학회지
    • /
    • Vol. 16 No. 3
    • /
    • pp.1-10
    • /
    • 1997
  • This paper is concerned with the development and evaluation of a camera calibration method for a real-time head tracking system. Tracking of head movements is important in the design of eye-controlled human/computer interfaces and in virtual environments. We propose a video-based head tracking system. A camera was mounted on the subject's head and captured the front view containing eight 3-dimensional reference points (passive retro-reflecting markers) fixed at known positions on the computer monitor. The reference points were captured by an image processing board and used to calculate the 3-dimensional position and orientation of the camera. A camera calibration method for providing accurate extrinsic camera parameters is proposed. The method has three steps. In the first step, the image center is calibrated using the method of varying focal length. In the second step, the focal length and the scale factor are calibrated from the Direct Linear Transformation (DLT) matrix obtained from the known position and orientation of the camera. In the third step, the position and orientation of the camera are calculated from the DLT matrix using the calibrated intrinsic camera parameters. Experimental results showed that the average error of the 3-dimensional camera positions is about 0.53 cm, the angular errors of camera orientation are less than $0.55^{\circ}$, and the data acquisition rate is about 10 Hz. The results of this study can be applied to the tracking of head movements for eye-controlled human/computer interfaces and virtual environments.

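In the third calibration step, the camera position can be read off the DLT projection matrix $P = [M \mid p_4]$ as $C = -M^{-1}p_4$. A sketch of that extraction; the matrix below is illustrative, whereas a real DLT matrix would come from the calibration fit:

```python
def inverse_3x3(m):
    """Inverse of a 3x3 matrix via the adjugate."""
    a, b, c = m[0]; d, e, f = m[1]; g, h, i = m[2]
    det = a * (e * i - f * h) - b * (d * i - f * g) + c * (d * h - e * g)
    adj = [[e * i - f * h, c * h - b * i, b * f - c * e],
           [f * g - d * i, a * i - c * g, c * d - a * f],
           [d * h - e * g, b * g - a * h, a * e - b * d]]
    return [[x / det for x in row] for row in adj]

def camera_centre(M, p4):
    """C = -M^{-1} p4 for projection matrix [M | p4]."""
    Minv = inverse_3x3(M)
    return [-sum(Minv[r][k] * p4[k] for k in range(3)) for r in range(3)]

M = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
p4 = [-1.0, -2.0, -3.0]
print(camera_centre(M, p4))  # → [1.0, 2.0, 3.0]
```

Orientation follows similarly from the rotational part of M once the calibrated intrinsics have been factored out.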

Vision and Lidar Sensor Fusion for VRU Classification and Tracking in the Urban Environment

  • 김유진;이호준;이경수
    • 자동차안전학회지
    • /
    • Vol. 13 No. 4
    • /
    • pp.7-13
    • /
    • 2021
  • This paper presents a vulnerable road user (VRU) classification and tracking algorithm using a vision and LiDAR sensor fusion method for urban autonomous driving. Classification and tracking of vulnerable road users such as pedestrians, bicycles, and motorcycles are essential for autonomous driving in complex urban environments. In this paper, a real-time object image detection algorithm, YOLO, and an object tracking algorithm based on the LiDAR point cloud are fused at a high level. The proposed algorithm consists of four parts. First, the object bounding boxes in pixel coordinates, obtained from YOLO, are transformed into the local coordinates of the subject vehicle using a homography matrix. Second, the LiDAR point cloud is clustered based on Euclidean distance, and the clusters are associated using GNN. In addition, the states of the clusters, including position, heading angle, velocity, and acceleration, are estimated in real time using a geometric model-free approach (GMFA). Finally, each LiDAR track is matched with a vision track using the angle information of the transformed vision track and assigned a classification ID. The proposed fusion algorithm is evaluated via real vehicle tests in an urban environment.
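The first step, moving a YOLO box from pixel coordinates into the vehicle frame, is a planar homography applied to the box's ground-contact point. A sketch of that transform; the 3×3 matrix below is illustrative, not a calibrated one:

```python
def apply_homography(H, u, v):
    """Map pixel (u, v) to local (x, y): w = H [u v 1]^T, then dehomogenise."""
    w = [sum(H[r][k] * p for k, p in enumerate((u, v, 1.0))) for r in range(3)]
    return w[0] / w[2], w[1] / w[2]

# Illustrative homography from image plane to the vehicle's ground plane.
H = [[0.01, 0.0, 0.0],
     [0.0, 0.02, -1.0],
     [0.0, 0.0, 1.0]]
print(apply_homography(H, 100.0, 200.0))  # → (1.0, 3.0)
```

The transformed point's bearing relative to the vehicle is what the final step compares against each LiDAR track's angle when assigning classification IDs.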

Kinematic Method of Camera System for Tracking of a Moving Object

  • Jin, Tae-Seok
    • Journal of information and communication convergence engineering
    • /
    • Vol. 8 No. 2
    • /
    • pp.145-149
    • /
    • 2010
  • In this paper, we propose a kinematic approach to estimating the position of a moving object in real time. A new scheme is proposed for a mobile robot to track and capture a moving object using camera images. The moving object is assumed to be a point object and is projected onto the image plane to form a geometrical constraint equation that provides position data of the object based on the kinematics of the active camera. Uncertainties in the position estimate caused by the point-object assumption are compensated using a Kalman filter. To generate the shortest-time path to capture the moving object, its linear and angular velocities are estimated and utilized. Experimental results of tracking and capturing the target object with the mobile robot are presented.
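Using the estimated target velocity to generate a shortest-time capture path can be framed as a constant-speed intercept problem: find the time $t$ at which a robot moving at speed $s$ from the origin meets a target at position $p$ with velocity $v$, i.e. $|p + vt| = st$. A minimal sketch under that assumption (the paper's own path generation is not reproduced here):

```python
import math

def intercept_time(p, v, s):
    """Smallest positive root of |p + v t|^2 = (s t)^2, or None if no intercept."""
    a = v[0] ** 2 + v[1] ** 2 - s ** 2
    b = 2.0 * (p[0] * v[0] + p[1] * v[1])
    c = p[0] ** 2 + p[1] ** 2
    if abs(a) < 1e-12:                      # robot and target equally fast
        return -c / b if b < 0 else None
    disc = b * b - 4.0 * a * c
    if disc < 0:
        return None
    roots = [(-b - math.sqrt(disc)) / (2 * a), (-b + math.sqrt(disc)) / (2 * a)]
    pos = [t for t in roots if t > 0]
    return min(pos) if pos else None

# Target 10 m ahead, receding at 1 m/s; robot speed 2 m/s.
t = intercept_time((10.0, 0.0), (1.0, 0.0), 2.0)
print(t)  # → 10.0
```

Heading straight for the intercept point $p + vt$ then gives the shortest-time capture path for a constant-velocity target.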

Drone-based Power-line Tracking System

  • 정종민;김재승;윤태성;박진배
    • 전기학회논문지
    • /
    • Vol. 67 No. 6
    • /
    • pp.773-781
    • /
    • 2018
  • In recent years, studies of power-line inspection using unmanned aerial vehicles (UAVs) have been actively conducted. However, previous studies performed power-line inspection with a manually controlled UAV and developed only power-line detection algorithms for aerial images. To overcome these limitations, we propose a drone-based power-line tracking system. The main contributions of this paper are operating the developed system in a configured environment and developing a real-time power-line detection algorithm. The developed system is composed of power-line detection and image-based tracking control. To detect the power line in real time, a region of interest (ROI) image is extracted. Furthermore, a clustering algorithm is used to discriminate the power line from the background. Finally, the power line is detected using the Hough transform, and its center position and tilt angle are estimated using a Kalman filter so that the drone is controlled smoothly. For image-based tracking control, we design a position controller and an attitude controller, both based on the proportional-derivative (PD) control method. The interaction between the position controller and the attitude controller makes the drone track the power line. Several experiments were carried out in environments similar to actual ones, which demonstrate the superiority of the developed system.
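Both tracking controllers are PD. A minimal sketch of the image-based position loop driving the lateral offset of the detected power line to zero; the gains, time step, and first-order toy plant are illustrative assumptions, not the paper's tuned values.

```python
def pd_step(error, prev_error, kp, kd, dt):
    """PD control law: u = kp*e + kd*de/dt."""
    return kp * error + kd * (error - prev_error) / dt

def simulate(offset, kp=1.2, kd=0.4, dt=0.1, steps=60):
    """Toy plant: the drone's lateral pixel offset decays under PD commands."""
    prev = offset
    for _ in range(steps):
        u = pd_step(offset, prev, kp, kd, dt)
        prev = offset
        offset -= u * dt   # command reduces the measured offset
        # (the attitude PD loop would run in cascade here)
    return offset

print(abs(simulate(50.0)) < 1.0)  # offset driven near zero → True
```

In the real system, the Kalman-filtered center position feeds the position loop and the estimated tilt angle feeds the attitude loop, with the two loops interacting in cascade.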