• Title/Summary/Keyword: 영상보간법 (image interpolation)


Correlation interpretation for surface-geophysical exploration data-Chojeong Area, Chungbuk (지표물리탐사 자료의 상관해석-충북 초정지역)

  • Gwon, Il Ryong; Kim, Ji Su; Kim, Gyeong Ho
    • Journal of the Korean Geophysical Society / v.2 no.1 / pp.75-88 / 1999
  • A recent major subject of geophysical exploration is research into 3-D subsurface imaging using composite information from various geophysical data. In an attempt to interpret Schlumberger sounding data for the study area in 2-D and 3-D views, resistivity imaging was first performed, and a pseudo-3-D resistivity volume was then reconstructed by interpolating several 1-D resistivity plots. Electrical resistivity discontinuities such as fracture zones were clearly delineated in the pseudo-3-D resistivity volume. The low-resistivity zone, mainly associated with a fracture zone, appears to extend down to the granitic basement in the central part of the study area. The seismic velocity near the lineament is estimated to be as low as approximately 3,000 m/s, and the weathering layer in the southeastern part is interpreted to be deeper than in the northwestern part. Geophysical attributes such as electrical resistivity, seismic velocity, and radioactivity for the Chojeong Area were analysed using the GIS software Arc/Info. The major fault boundaries and fracture zones were resolved through image enhancement of a composite section (electrical resistivity and seismic refraction data) and were interpreted to develop in the southeastern part of the area, characterized by low electrical resistivity and low seismic velocity. However, the radioactivity attribute was found to be less sensitive to geological discontinuities than the resistivity and seismic velocity attributes.
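
The abstract does not specify the interpolation scheme used to build the pseudo-3-D volume. A minimal sketch of one plausible approach, laterally gridding 1-D sounding curves into a resistivity volume with SciPy's griddata (the station coordinates, depth axis, and resistivity values below are hypothetical placeholders, not data from the paper), might look like this:

```python
import numpy as np
from scipy.interpolate import griddata

# Hypothetical 1-D sounding results: (x, y) station coordinates and a
# resistivity value (ohm-m) per depth sample, e.g. from Schlumberger inversion.
stations = np.array([[0, 0], [100, 0], [0, 100], [100, 100], [50, 50]], dtype=float)
depths = np.linspace(0, 60, 13)                                      # depth axis (m)
rho_1d = np.random.uniform(50, 500, (len(stations), len(depths)))    # placeholder curves

# Target lateral grid for the pseudo-3-D volume.
xi, yi = np.meshgrid(np.linspace(0, 100, 21), np.linspace(0, 100, 21))

# Interpolate each depth slice between the sounding locations and stack
# the slices into a (nz, ny, nx) resistivity volume.
volume = np.stack([
    griddata(stations, rho_1d[:, k], (xi, yi), method="linear")
    for k in range(len(depths))
])
print(volume.shape)  # (13, 21, 21)
```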

The Construction of GIS-based Flood Risk Area Layer Considering River Bight (하천 만곡부를 고려한 GIS 기반 침수지역 레이어 구축)

  • Lee, Geun-Sang; Yu, Byeong-Hyeok; Park, Jin-Hyeog; Lee, Eul-Rae
    • Journal of the Korean Association of Geographic Information Studies / v.12 no.1 / pp.1-11 / 2009
  • Rapid visualization of downstream flood areas caused by dam discharge during the flood season is very important in dam management. Overlapping zones at river bends should be removed so that the flood area can be represented efficiently from the flood stage modeled along the river channels. This study applied a drainage enforcement algorithm to visualize the flood area considering river bends, coupling the Coordinate Operation System for Flood control In Multi-reservoir (COSFIM) with the Flood Wave routing model (FLDWAV). The drainage enforcement algorithm is a kind of interpolation that benefits hydrological process studies by removing spurious sinks from the terrain used in automatic drainage analysis. This study presents a mapping technique for a flood area layer that accounts for river bends downstream of Namgang Dam, and a system based on ArcObjects components was developed to execute this process automatically. The automatic flood-area-layer extraction system can substantially reduce the time required for flood inundation visualization work driven by large volumes of data. In addition, coupling the flood area layer with IKONOS satellite imagery provided realistic information for flood disaster management.
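
The drainage enforcement algorithm itself is not detailed in the abstract. A minimal sketch of the underlying idea, removing a spurious single-cell sink by raising it to the lowest neighbouring elevation so that flow can continue downslope (a simplified stand-in for a full drainage enforcement procedure; the toy DEM values are hypothetical), could look like this:

```python
import numpy as np

def fill_single_cell_sinks(dem: np.ndarray) -> np.ndarray:
    """Raise interior cells that are lower than all 8 neighbours to the
    minimum neighbouring elevation (simplified spurious-sink removal)."""
    filled = dem.copy()
    rows, cols = dem.shape
    for i in range(1, rows - 1):
        for j in range(1, cols - 1):
            window = dem[i - 1:i + 2, j - 1:j + 2].copy()
            window[1, 1] = np.inf                 # exclude the centre cell
            lowest_neighbour = window.min()
            if dem[i, j] < lowest_neighbour:      # spurious pit
                filled[i, j] = lowest_neighbour
    return filled

# Toy DEM (metres) with one artificial pit at the centre.
dem = np.array([[10.0, 10.0, 10.0],
                [10.0,  2.0,  9.0],
                [10.0,  9.0,  9.0]])
print(fill_single_cell_sinks(dem))
```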

A Study on Transport Robot for Autonomous Driving to a Destination Based on QR Code in an Indoor Environment (실내 환경에서 QR 코드 기반 목적지 자율주행을 위한 운반 로봇에 관한 연구)

  • Se-Jun Park
    • Journal of Platform Technology / v.11 no.2 / pp.26-38 / 2023
  • This paper is a study on a transport robot capable of autonomously driving to a destination using QR codes in an indoor environment. The transport robot was designed and built with a camera for recognizing QR codes and a lidar sensor so that it can maintain a constant distance from the left and right walls while moving. To obtain the location of the transport robot, the QR code image was enlarged with Lanczos resampling interpolation, binarized with Otsu's algorithm, and then detected and decoded using the ZBar library. The QR code recognition experiment was performed while varying the QR code size and the traveling speed of the transport robot, with the camera position of the robot and the height of the QR code fixed at 192 cm. When the QR code size was 9 cm × 9 cm, the recognition rate was 99.7%, and it was nearly 100% when the traveling speed of the transport robot was below about 0.5 m/s. Based on the QR code recognition rate, autonomous driving to the destination was tested, in the absence of obstacles, for destinations reachable by going straight only and for destinations requiring both straight driving and turning. When the destination required only straight driving, it was reached quickly because little position correction was needed. However, when the route included a turn, arrival at the destination was relatively delayed because of the need for position correction. The experiments showed that the transport robot arrived at the destination fairly accurately, although slight positional errors occurred while driving, confirming the applicability of a QR code-based autonomous transport robot.
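
The preprocessing pipeline described above, Lanczos upscaling, Otsu binarization, and ZBar decoding, can be sketched with OpenCV and pyzbar. The scale factor and image path below are assumptions for illustration, not values from the paper:

```python
import cv2
from pyzbar import pyzbar  # Python binding to the ZBar library

# Load a camera frame containing the QR code (path is hypothetical).
frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)

# 1) Enlarge the image with Lanczos resampling interpolation.
enlarged = cv2.resize(frame, None, fx=2.0, fy=2.0,
                      interpolation=cv2.INTER_LANCZOS4)

# 2) Binarize with Otsu's algorithm (threshold chosen automatically).
_, binary = cv2.threshold(enlarged, 0, 255,
                          cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# 3) Detect and decode the QR code with ZBar.
for code in pyzbar.decode(binary):
    print(code.type, code.data.decode("utf-8"), code.rect)
```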

Development of Preliminary Quality Assurance Software for GafChromic® EBT2 Film Dosimetry (GafChromic® EBT2 Film Dosimetry를 위한 품질 관리용 초기 프로그램 개발)

  • Park, Ji-Yeon; Lee, Jeong-Woo; Choi, Kyoung-Sik; Hong, Semie; Park, Byung-Moon; Bae, Yong-Ki; Jung, Won-Gyun; Suh, Tae-Suk
    • Progress in Medical Physics / v.21 no.1 / pp.113-119 / 2010
  • Software for GafChromic EBT2 film dosimetry was developed in this study. The software provides film calibration functions based on color channels, categorized into red, green, blue, and gray. Corrections for light scattering of a flat-bed scanner and for thickness differences of the active layer can be evaluated. Dosimetric results from EBT2 films can be compared with those from the treatment planning system ECLIPSE or the two-dimensional ionization chamber array MatriXX. Dose verification using EBT2 films is implemented through the following procedures: file import, noise filtering, background correction and active layer correction, dose calculation, and evaluation. Relative and absolute background corrections can be applied selectively. The calibration results and the fitting equation for the sensitometric curve are exported to files. After the two dose matrices are aligned through interpolation of the spatial pixel spacing, interactive translation, and rotation, their profiles and isodose curves are compared. In addition, the gamma index and gamma histogram are analyzed according to the chosen criteria of distance-to-agreement and dose difference. Performance was evaluated by dose verification in a 60°-enhanced dynamic wedged field and in intensity-modulated (IM) beams for prostate cancer. All pass ratios for the two types of tests exceeded 99% with gamma criteria of 3 mm and 3%. The software was developed for use in routine periodic quality assurance and complex IM beam verification, and it can also serve as a dedicated radiochromic film software tool for analyzing dose distributions.
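
The gamma evaluation mentioned above follows the standard distance-to-agreement/dose-difference formulation. A minimal, brute-force sketch of a global gamma index for two already-aligned 2-D dose matrices (the grid spacing, criteria, and placeholder dose values below are assumptions chosen to mirror the 3 mm/3% setting, not the software's actual implementation) might look like this:

```python
import numpy as np

def gamma_index(reference, evaluated, spacing_mm=1.0, dta_mm=3.0, dd_percent=3.0):
    """Brute-force global gamma index between two aligned 2-D dose matrices."""
    dd = dd_percent / 100.0 * reference.max()      # global dose-difference criterion
    ny, nx = reference.shape
    ys, xs = np.mgrid[0:ny, 0:nx]
    gamma = np.full_like(reference, np.inf, dtype=float)
    for i in range(ny):
        for j in range(nx):
            dist2 = ((ys - i) ** 2 + (xs - j) ** 2) * spacing_mm ** 2
            dose2 = (evaluated - reference[i, j]) ** 2
            gamma[i, j] = np.sqrt((dist2 / dta_mm ** 2 + dose2 / dd ** 2).min())
    return gamma

ref = np.random.uniform(0.5, 2.0, (32, 32))        # placeholder dose matrices (Gy)
ev = ref * (1 + np.random.normal(0, 0.01, ref.shape))
g = gamma_index(ref, ev)
print(f"pass ratio: {(g <= 1).mean():.1%}")        # fraction of points with gamma <= 1
```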

3D Facial Animation with Head Motion Estimation and Facial Expression Cloning (얼굴 모션 추정과 표정 복제에 의한 3차원 얼굴 애니메이션)

  • Kwon, Oh-Ryun; Chun, Jun-Chul
    • The KIPS Transactions: Part B / v.14B no.4 / pp.311-320 / 2007
  • This paper presents a vision-based 3D facial expression animation technique and system that provide robust 3D head pose estimation and real-time facial expression control. Much research on 3D face animation has addressed facial expression control itself rather than 3D head motion tracking. However, head motion tracking is one of the critical issues to be solved for developing realistic facial animation. In this research, we developed an integrated animation system that includes 3D head motion tracking and facial expression control at the same time. The proposed system consists of three major phases: face detection, 3D head motion tracking, and facial expression control. For face detection, the facial region is detected efficiently from each video frame with a non-parametric HT skin color model and template matching. For 3D head motion tracking, we exploit a cylindrical head model that is projected onto the initial head motion template. Given an initial reference template of the face image and the corresponding head motion, the cylindrical head model is created and the full head motion is traced with the optical flow method. For facial expression cloning we utilize a feature-based method: the major facial feature points are detected from the geometric information of the face with template matching and traced by optical flow. Since the locations of the varying feature points combine head motion and facial expression information, the animation parameters that describe the variation of the facial features are acquired from the geometrically transformed frontal head pose image. Finally, facial expression cloning is done by a two-step fitting process: the control points of the 3D model are displaced by applying the animation parameters to the face model, and the non-feature points around the control points are deformed using Radial Basis Functions (RBF). The experiments show that the developed vision-based animation system can create realistic facial animation with robust head pose estimation and facial variation from input video images.
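
The RBF step described above, propagating the displacements of a few control points to the surrounding non-feature vertices, can be sketched with SciPy's RBFInterpolator. The 2-D vertex coordinates, displacements, and kernel choice below are hypothetical illustrations, not the paper's actual face model:

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical control points on a face mesh and their animation displacements.
control_points = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
displacements = np.array([[0.00, 0.02], [0.01, 0.00],
                          [0.00, -0.01], [-0.02, 0.00]])

# Radial basis function interpolant mapping position -> displacement.
rbf = RBFInterpolator(control_points, displacements, kernel="thin_plate_spline")

# Non-feature vertices around the control points are deformed by the same field.
vertices = np.array([[0.5, 0.5], [0.25, 0.75], [0.8, 0.2]])
deformed = vertices + rbf(vertices)
print(deformed)
```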