• Title/Summary/Keyword: 반복적 보정 (iterative correction)


Mobile Watermarking Based on the Distortion Analysis of Display-Capture Image in a Smart Phone (스마트폰에서 디스플레이-캡쳐 영상의 왜곡분석에 기반한 모바일 워터마킹)

  • Bae, Jong-Wook;Jung, Sung-Hwan
    • Journal of Korea Multimedia Society
    • /
    • v.15 no.7
    • /
    • pp.847-858
    • /
    • 2012
  • In this paper, we propose a mobile watermarking scheme based on a distortion analysis of display-captured images on a smart phone. We compose a random sequence by utilizing the frequency-band properties of the wavelet domain. We then calculate the CCS (Coefficients Comparative Sum) from the block wavelet coefficients of selected subbands after the wavelet transform and the random sequence, and repeatedly embed the watermark using an insertion threshold for robustness (a sketch of this style of embedding is given below). To correct the distortion caused by the display-capture process, we add a frame around the outside of the watermarked image, so that watermark synchronization can be achieved by detecting the frame. The frame detection ratio is further improved by using an iteratively adaptive threshold. The proposed scheme embedded 206 bits of information into standard digital images with an average PSNR of about 41.42 dB. In watermark extraction experiments, the scheme recognized the frame accurately in more than 97% of the captured images. The BER (Bit Error Ratio) for captured images was about 3.73%, an improvement of more than 70% over Pramila's method.
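
A minimal sketch of the kind of repeated wavelet-domain embedding the abstract describes. The correlation-style CCS statistic, subband choice, block size, and threshold below are illustrative assumptions, not the authors' exact formulation; pywt is used for the wavelet transform.

```python
import numpy as np
import pywt

def embed_bit(subband, bit, seq, threshold=2.0, block=8):
    """Repeatedly embed one watermark bit into 8x8 blocks of a subband."""
    out = subband.copy()
    h, w = out.shape
    norm = np.sum(seq ** 2)
    for i in range(0, h - block + 1, block):
        for j in range(0, w - block + 1, block):
            blk = out[i:i + block, j:j + block]
            # CCS-like statistic: correlation of the block's coefficients
            # with a key-driven random sequence (assumed form).
            ccs = float(np.sum(blk.ravel() * seq))
            target = threshold if bit else -threshold
            if (bit and ccs < threshold) or (not bit and ccs > -threshold):
                # Nudge coefficients along the sequence until the
                # statistic clears the insertion threshold.
                blk += (target - ccs) * seq.reshape(block, block) / norm
    return out

rng = np.random.default_rng(42)                 # key-dependent sequence
seq = rng.choice([-1.0, 1.0], size=64)

image = np.random.rand(256, 256)                # stand-in for a real image
coeffs = pywt.wavedec2(image, 'haar', level=2)
cH, cV, cD = coeffs[1]                          # a mid-frequency subband set
coeffs[1] = (embed_bit(cH, 1, seq), cV, cD)
watermarked = pywt.waverec2(coeffs, 'haar')
```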

Effects of Cyclic Thermal Load on the Signal Characteristics of FBG Sensors Packaged with Epoxy Adhesives (주기적인 반복 열하중이 패키징된 FBG 센서 신호 특성에 미치는 영향)

  • Kim, Heonyoung;Kang, Donghoon
    • Transactions of the Korean Society of Mechanical Engineers A
    • /
    • v.41 no.4
    • /
    • pp.313-319
    • /
    • 2017
  • Fiber-optic sensors, which have mainly been applied in aerospace, are now finding applications in other areas such as transportation, including railways. Among these sensors, use of fiber Bragg grating (FBG) sensors has increased steeply owing to their capability for absolute measurement and multiplexing. FBG sensors are generally bonded to structures and sensing modules with adhesives such as epoxy. However, the measurement errors that arise when FBG sensors are used in long-term applications under environmental thermal loads require calibration. For this reason, the thermal curing of the adhesives needs to be investigated to enhance the reliability of the FBG sensor system. In this study, this was done through cyclic thermal load tests around room temperature using four types of specimens. The test results confirm that residual compressive strain occurs in the FBG sensors under the initial cyclic thermal load. In conclusion, FBG sensor signals need to be stabilized before the sensors are applied to long-term SHM (structural health monitoring).
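
For context on the calibration issue raised above, the standard FBG relation separates mechanical strain from thermal response. A minimal sketch, using typical textbook coefficients rather than the values measured in this paper:

```python
LAMBDA_B = 1550.0e-9   # Bragg wavelength [m] (assumed)
P_E      = 0.22        # photo-elastic coefficient, typical for silica fiber
K_T      = 6.7e-6      # combined thermo-optic/expansion coeff. [1/degC], typical

def strain_from_shift(d_lambda, d_temp):
    """Temperature-compensated strain from a Bragg wavelength shift,
    using the standard FBG relation:
        d_lambda / LAMBDA_B = (1 - P_E) * eps + K_T * d_temp
    """
    return (d_lambda / LAMBDA_B - K_T * d_temp) / (1.0 - P_E)

# Example: a 100 pm shift observed together with a 5 degC temperature rise
eps = strain_from_shift(100e-12, 5.0)
print(f"strain = {eps * 1e6:.1f} microstrain")
```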

Improved characterization method for mobile phone camera and LCD display (모바일 폰 카메라와 LCD의 향상된 특성화 방법)

  • Jang, In-Su;Son, Chang-Hwan;Lee, Cheol-Hee;Song, Kun-Woen;Ha, Yeong-Ho
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.45 no.2
    • /
    • pp.65-73
    • /
    • 2008
  • Characterization for accurate color reproduction on a camera-equipped mobile phone with an LCD is widely used. In addition to camera and LCD characterization, a gamut-mapping process is necessary to map the camera's input color stimulus (CIEXYZ value) to the LCD's output color stimulus. Each characterization estimates the relation between input and output signals. For the LCD, an output device, the output color stimulus for an arbitrary input signal can be measured with a spectroradiometer. For the camera, an input device, characterization is inaccurate and requires manual work to obtain the output signal, because the input stimulus cannot be generated directly. Moreover, after gamut mapping, noise increases because the camera's gamma tone curve, optimized for noise, is distorted by the characterization. This paper therefore proposes a system for obtaining the camera's output signal and a gamma-correction method for the noise. The camera's output signal is obtained from the RGB values of the patches in a captured color-chart image. However, besides illumination variation, errors in the chart's location in the viewfinder arise when many camera modules capture the chart. A position-correction method is proposed to remove this error without manual work: the camera position is estimated from the captured image, and this estimation and camera movement are repeated until the optimal position is obtained (see the sketch below). In addition, the lightness curve of the camera output is partially corrected to reduce the noise from the characterization process.
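
A minimal sketch of the iterative position-correction loop described above: estimate the chart's offset in the captured frame, move the camera, and repeat until the offset is negligible. capture, find_chart_center, and move_stage are hypothetical stand-ins for the rig's actual camera and stage interfaces.

```python
import numpy as np

def align_camera(capture, find_chart_center, move_stage,
                 target=(320, 240), tol=1.0, max_iter=10, gain=0.8):
    """Iteratively center the color chart in the viewfinder."""
    for _ in range(max_iter):
        frame = capture()                            # grab one image
        cx, cy = find_chart_center(frame)            # chart center [px]
        err = np.array(target) - np.array([cx, cy])  # pixel offset
        if np.linalg.norm(err) < tol:
            return True                              # aligned
        move_stage(gain * err)                       # damped correction step
    return False                                     # did not converge
```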

Patient Position Verification and Corrective Evaluation Using Cone Beam Computed Tomography (CBCT) in Intensity-Modulated Radiation Therapy (세기조절방사선치료 시 콘빔CT (CBCT)를 이용한 환자자세 검증 및 보정평가)

  • Do, Gyeong-Min;Jeong, Deok-Yang;Kim, Young-Bum
    • The Journal of Korean Society for Radiation Therapy
    • /
    • v.21 no.2
    • /
    • pp.83-88
    • /
    • 2009
  • Purpose: Cone-beam computed tomography (CBCT) using an on-board imager (OBI) can check movement and setup error in patient position and target volume in real time during treatment by comparison with the simulation CT image. This study aimed to check changes and movement of patient position and target volume using CBCT in IMRT, calculate the difference from the treatment plan, correct the position using an automated match system, test the accuracy of the position correction using an electronic portal imaging device (EPID), and thereby examine the usefulness of CBCT in IMRT and the accuracy of the automatic match system. Materials and Methods: The subjects were three head-and-neck patients and one pelvis patient sampled from IMRT patients treated in our hospital. To investigate movement of the treatment position and the resulting displacement of the irradiated volume, CBCT images were taken with the OBI mounted on the linear accelerator. Before each IMRT session, CBCT was performed and the difference from the treatment plan was checked coordinate by coordinate against the simulation CT image. Corrections were then made through the automatic 3D/3D match system to match the treatment plan, and were verified and evaluated using the EPID. Results: When pre-treatment CBCT was compared with the simulation CT image, the average differences for the head and neck were 0.99 mm vertically, 1.14 mm longitudinally, 4.91 mm laterally, and 1.07° rotationally, showing somewhat variable differences by site. In verification after correction, comparison of the EPID image with the DRR image showed that the correction had been made accurately, with errors of less than 0.5 mm. Conclusion: By comparing the pre-treatment CBCT image, reconstructed as a 3D volume rather than a 2D image, against the plan for the patient's setup error and changes in the positions of the organs and target, we could measure and correct changes of position and target volume, treat more accurately, and calculate and compare the errors. The results show that CBCT was useful for delivering accurate treatment according to the plan and for increasing the reproducibility of repeated treatments. The accuracy gained through CBCT is especially needed in IMRT, where the target volume shape is complex and the dose distribution changes sharply. Further research is required on match criteria by treatment site and treatment purpose.
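
The OBI's automatic 3D/3D match is proprietary, but a rigid registration in the same spirit can be sketched with SimpleITK. The file names, metric, and optimizer settings below are illustrative assumptions, not the clinical system's configuration.

```python
import SimpleITK as sitk

# Hypothetical file names; in practice these come from the planning CT
# export and the OBI's CBCT acquisition.
plan_ct = sitk.ReadImage("planning_ct.nii.gz", sitk.sitkFloat32)
cbct    = sitk.ReadImage("pretreatment_cbct.nii.gz", sitk.sitkFloat32)

reg = sitk.ImageRegistrationMethod()
reg.SetMetricAsMattesMutualInformation(numberOfHistogramBins=50)
reg.SetOptimizerAsRegularStepGradientDescent(
    learningRate=1.0, minStep=1e-4, numberOfIterations=200)
reg.SetInterpolator(sitk.sitkLinear)
initial = sitk.CenteredTransformInitializer(
    plan_ct, cbct, sitk.Euler3DTransform(),
    sitk.CenteredTransformInitializerFilter.GEOMETRY)
reg.SetInitialTransform(initial, inPlace=False)

rigid = reg.Execute(plan_ct, cbct)
# The translation components approximate the couch shifts
# (vertical/longitudinal/lateral) and the Euler angles the rotation.
print(rigid.GetParameters())
```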


An Iterative, Interactive and Unified Seismic Velocity Analysis (반복적 대화식 통합 탄성파 속도분석)

  • Suh Sayng-Yong;Chung Bu-Heung;Jang Seong-Hyung
    • Geophysics and Geophysical Exploration
    • /
    • v.2 no.1
    • /
    • pp.26-32
    • /
    • 1999
  • Among the various seismic data processing steps, velocity analysis is the most time-consuming and labor-intensive. For production seismic data processing, a good velocity analysis tool as well as a high-performance computer is required; the tool must give fast and accurate velocity analysis. There are two different approaches to velocity analysis: batch and interactive. In batch processing, a velocity plot is made at every analysis point, generally consisting of a semblance contour, a super gather, and a stack panel. The interpreter chooses the velocity function by analyzing the velocity plot; the technique depends heavily on the interpreter's skill and requires substantial human effort. As high-speed graphic workstations have become more popular, various interactive velocity analysis programs have been developed. Although these programs enable faster picking of velocity nodes with a mouse, their main improvement is simply the replacement of the paper plot by the graphic screen. The velocity spectrum is highly sensitive to noise, especially the coherent noise often found in the shallow region of marine seismic data. For accurate velocity analysis, this noise must be removed before the spectrum is computed. The analysis must also be carried out by carefully choosing the location of the analysis point and accurately computing the spectrum (see the semblance sketch below). The analyzed velocity function must be verified by mute and stack, and the sequence must usually be repeated many times. Therefore an iterative, interactive, and unified velocity analysis tool is highly desirable. Such an interactive velocity analysis program, xva (X-Window based Velocity Analysis), was developed. The program handles all processes required in velocity analysis, such as composing the super gather, computing the velocity spectrum, NMO correction, mute, and stack. Most parameter changes produce the final stack within a few mouse clicks, enabling iterative and interactive processing. A simple trace indexing scheme is introduced, and a program to make the index of the Geobit seismic disk file was developed; the index is used to reference the original input (i.e., the CDP sort) directly. A transformation technique for the mute function between the T-X domain and the NMOC domain is introduced and adopted in the program. The result of the transform is similar to the remove-NMO technique in suppressing shallow noise such as direct and refracted waves, but it has two improvements: no interpolation error and very fast computation. With this technique, mute times can easily be designed in the NMOC domain and applied to the super gather in the T-X domain, producing a more accurate velocity spectrum interactively. The xva program consists of 28 files, 12,029 lines, 34,990 words, and 304,073 characters. It references the Geobit utility libraries and can be installed in a Geobit-preinstalled environment. It runs under X-Window/Motif, with a menu designed according to the Motif style guide. A brief usage of the program is discussed. The program allows fast and accurate seismic velocity analysis, which is necessary for computing AVO (Amplitude Versus Offset) based DHI (Direct Hydrocarbon Indicator) and for producing high-quality seismic sections.
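
A minimal sketch of the semblance velocity spectrum at a single (t0, v) node, the quantity a tool like xva recomputes after each mute or parameter change. The gather layout, window length, and trial velocities are illustrative assumptions.

```python
import numpy as np

def semblance(gather, offsets, dt, t0, v, win=5):
    """Semblance for one (t0, v) node along the hyperbolic NMO curve
       t(x) = sqrt(t0**2 + (x / v)**2)."""
    nsamp, ntr = gather.shape
    num = den = 0.0
    for k in range(-win, win + 1):            # short vertical window
        s = ss = 0.0
        for j, x in enumerate(offsets):
            t = np.sqrt(t0 ** 2 + (x / v) ** 2)
            i = int(round(t / dt)) + k
            if 0 <= i < nsamp:
                a = gather[i, j]
                s += a
                ss += a * a
        num += s * s
        den += ss
    return num / (ntr * den) if den > 0.0 else 0.0

# Illustrative use: scan trial velocities at one analysis time
gather = np.random.randn(1000, 48)            # stand-in CDP gather
offsets = np.linspace(100.0, 4800.0, 48)      # offsets [m]
spectrum = [semblance(gather, offsets, dt=0.004, t0=1.2, v=v)
            for v in np.arange(1400.0, 3200.0, 50.0)]
```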


Different Impacts of Independent Recurrent and Non-Recurrent Congestion on Freeway Segments (고속도로상의 독립적인 반복 및 비반복정체의 영향비교)

  • Gang, Gyeong-Pyo;Jang, Myeong-Sun
    • Journal of Korean Society of Transportation
    • /
    • v.25 no.6
    • /
    • pp.99-109
    • /
    • 2007
  • There have been few studies on the impacts of independent recurrent and non-recurrent congestion on freeway networks, partly because of the lack of traffic data collected during recurrent and non-recurrent congestion periods and partly because of the difficulty of using simulation tools effectively. This study suggests a methodology for analyzing the independent impacts of recurrent and non-recurrent congestion on target freeway segments. The proposed methodology is based on an elaborately calibrated simulation analysis using real traffic data obtained during recurrent and non-recurrent congestion periods. The paper also summarizes evaluation results from field tests of two ITS technologies developed to provide drivers with real-time traffic information under congestion; their accuracy may not be guaranteed during transition periods such as non-recurrent congestion. In summary, this study focuses on the importance of non-recurrent congestion relative to recurrent congestion, and the proposed methodology is expected to provide a foundation for prioritizing limited government investments to improve freeway network performance degraded by recurrent or non-recurrent congestion.

Super-Resolution Image Reconstruction Using Multi-View Cameras (다시점 카메라를 이용한 초고해상도 영상 복원)

  • Ahn, Jae-Kyun;Lee, Jun-Tae;Kim, Chang-Su
    • Journal of Broadcast Engineering
    • /
    • v.18 no.3
    • /
    • pp.463-473
    • /
    • 2013
  • In this paper, we propose a super-resolution (SR) image reconstruction algorithm using multi-view images. We acquire 25 images from multi-view cameras, which consist of a 5×5 array of cameras, and then reconstruct an SR image of the center view from one low-resolution (LR) input image and the other 24 LR reference images. First, we estimate disparity maps from the input image to each of the 24 reference images. Then, we interpolate an SR image by employing the LR image and matching points in the reference images. Finally, we refine the SR image using an iterative regularization scheme (sketched below). Experimental results demonstrate that the proposed algorithm provides higher-quality SR images than conventional algorithms.
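
A minimal sketch of the final refinement stage as iterative regularized back-projection. The degradation model (Gaussian blur plus 2x decimation) and the Tikhonov-style smoothness term are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np
from scipy.ndimage import gaussian_filter, laplace

def refine_sr(sr_init, lr_obs, scale=2, lam=0.05, step=0.5, n_iter=50):
    """Gradient-descent refinement of an interpolated SR estimate."""
    sr = sr_init.copy()
    for _ in range(n_iter):
        # Forward model: blur, then decimate to the LR grid.
        sim_lr = gaussian_filter(sr, sigma=1.0)[::scale, ::scale]
        resid = np.zeros_like(sr)
        resid[::scale, ::scale] = lr_obs - sim_lr   # upsample the residual
        grad = gaussian_filter(resid, sigma=1.0)    # back-project
        sr += step * (grad + lam * laplace(sr))     # data term + smoothness
    return sr

lr = np.random.rand(64, 64)                         # stand-in LR center image
sr0 = np.kron(lr, np.ones((2, 2)))                  # naive initial upsampling
sr = refine_sr(sr0, lr)
```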

Particle filter for Correction of GPS location data of a mobile robot (이동로봇의 GPS위치 정보 보정을 위한 파티클 필터 방법)

  • Noh, Sung-Woo;Kim, Tae-Gyun;Ko, Nak-Yong;Bae, Young-Chul
    • The Journal of the Korea institute of electronic communication sciences
    • /
    • v.7 no.2
    • /
    • pp.381-389
    • /
    • 2012
  • This paper proposes a method for correcting GPS location data for outdoor mobile robot navigation. The method uses a Bayesian filter approach called the particle filter (PF), which iterates two procedures: prediction and correction. The prediction procedure calculates the robot location from the translational and rotational velocities given by the robot command, incorporating uncertainty by adding noise to the velocity commands. Using the sensor characteristics of the GPS, the belief that a particle represents the true location of the robot is calculated, and resampling of the particles based on this belief constitutes the correction procedure (see the sketch below). Since raw GPS data include abrupt, random noise, robot motion commands based directly on GPS data suffer sudden, unexpected changes, resulting in jerky motion. The PF reduces the corruption of the GPS data and prevents unexpected location errors. The proposed method was used to navigate a mobile robot in the 2011 Robot Outdoor Navigation Competition, held at Gwangju on 16 August 2011, where it kept the robot's location error below 0.5 m over the 300 m course.
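
A minimal sketch of the predict/correct/resample loop described above. The noise levels and the Gaussian GPS likelihood are illustrative assumptions, not the paper's calibrated sensor model.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 500
particles = np.zeros((N, 3))        # x [m], y [m], heading [rad]
weights = np.full(N, 1.0 / N)

def predict(particles, v, w, dt, sv=0.1, sw=0.05):
    """Propagate particles with noisy velocity commands."""
    v_n = v + rng.normal(0.0, sv, len(particles))
    w_n = w + rng.normal(0.0, sw, len(particles))
    particles[:, 2] += w_n * dt
    particles[:, 0] += v_n * dt * np.cos(particles[:, 2])
    particles[:, 1] += v_n * dt * np.sin(particles[:, 2])

def correct(particles, weights, gps_xy, sigma=3.0):
    """Weight particles by a Gaussian GPS likelihood, then resample."""
    d2 = np.sum((particles[:, :2] - gps_xy) ** 2, axis=1)
    weights[:] = np.exp(-0.5 * d2 / sigma ** 2) + 1e-300
    weights /= weights.sum()
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    particles[:] = particles[idx]
    weights.fill(1.0 / len(particles))

predict(particles, v=1.0, w=0.1, dt=0.1)
correct(particles, weights, gps_xy=np.array([0.1, 0.05]))
estimate = particles[:, :2].mean(axis=0)   # corrected location estimate
```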

Measurement of ECF for CaSO₄:Dy Thermoluminescent Dosimeters (CaSO₄:Dy 열형광선량계의 소자보정인자(ECF) 산출)

  • Lim, Kil-Sung;Kim, Jang-Lyul
    • Journal of Radiation Protection and Research
    • /
    • v.30 no.4
    • /
    • pp.231-236
    • /
    • 2005
  • Dosimeters are manufactured by the same process at the manufacturer, but deviations in TL raw counts exist among individual dosimeters. TL raw counts also gradually degrade due to multiple readings and physical abuse. The ECF (Element Correction Factor) corrects this degradation and deviation of TL raw counts to the average TL raw counts of the reference dosimeters. Procedures for producing the ECF of thermoluminescent dosimeters are described in detail. The ECFs of 319 reference, control, and field dosimeters were measured three times, and the average of the three ECF values was calculated. The % CV (coefficient of variation) of the three ECF values was also calculated to verify the ECF, and the ECF and % CV distributions for the field and control dosimeters are presented (a bookkeeping sketch follows below). The TL raw counts of field dosimeters, used about six times over the past three years, were almost unchanged, but those of control dosimeters, used more frequently, were degraded by about 4.7%.
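
A minimal sketch of the ECF bookkeeping described above: each element's factor scales its raw TL count to the reference average, and the % CV of repeated ECF measurements verifies stability. The readings below are illustrative, not measured data.

```python
import numpy as np

raw = np.array([[98.2, 101.5, 95.7],     # element 1, three readings
                [88.4,  87.9, 89.1]])    # element 2 (degraded)
ref_avg = 100.0                          # average TL count of references

ecf = ref_avg / raw                      # per-reading correction factors
ecf_mean = ecf.mean(axis=1)              # averaged ECF over three runs
cv = 100.0 * ecf.std(axis=1, ddof=1) / ecf_mean   # % CV for verification

for i, (f, c) in enumerate(zip(ecf_mean, cv), start=1):
    print(f"element {i}: ECF = {f:.3f}, %CV = {c:.2f}")
```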

Data Processing using Anisotropic Analysis for the Long-offset Marine Seismic Data of the East Sea, Korea (동해 해역 원거리 해양탄성파 탐사자료의 이방성 분석을 이용한 전산처리)

  • Joo, Yonghwan;Kim, Byoung-yeop
    • Geophysics and Geophysical Exploration
    • /
    • v.23 no.1
    • /
    • pp.13-21
    • /
    • 2020
  • The acquisition and processing of long-offset data are essential for imaging deep geological structures in marine seismic surveys. Deriving an accurate subsurface image by applying conventional processing to long-offset data is challenging owing to normal moveout (NMO) stretch and the non-hyperbolic moveout induced by seismic anisotropy. In 2017, the Korea Institute of Geoscience and Mineral Resources conducted a simultaneous two-dimensional multichannel streamer and ocean-bottom seismic survey, using a 5.7-km streamer and an ocean-bottom seismometer, to identify the deep geological structure of the Ulleung Basin. Here, the geological subsurface structure was obtained via sequential, iterative updating of the velocity and anisotropy parameters of the long-offset streamer data, and anisotropic prestack Kirchhoff migration was performed using the updated velocity and anisotropy parameters as inputs (the moveout relation is sketched below). As a result, the reflection energy in the long-offset traces, which showed non-hyperbolic moveout owing to seismic anisotropy, was well aligned horizontally, and NMO stretch was also reduced. Thus, a more precise and accurate migrated image was obtained, minimizing the distortion of reflectors and mispositioned reflection energy.
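
A minimal sketch contrasting hyperbolic NMO with the Alkhalifah-Tsvankin non-hyperbolic moveout widely used for long-offset VTI processing, where the anellipticity parameter eta is updated iteratively along with the velocity. The values below are illustrative, not the survey's.

```python
import numpy as np

def t_hyperbolic(x, t0, v):
    """Conventional hyperbolic NMO traveltime."""
    return np.sqrt(t0 ** 2 + (x / v) ** 2)

def t_anisotropic(x, t0, v, eta):
    """Alkhalifah-Tsvankin non-hyperbolic moveout for VTI media."""
    x2 = x ** 2
    return np.sqrt(t0 ** 2 + x2 / v ** 2
                   - 2.0 * eta * x2 ** 2
                     / (v ** 2 * (t0 ** 2 * v ** 2 + (1.0 + 2.0 * eta) * x2)))

x = np.linspace(0.0, 5700.0, 8)          # streamer offsets [m] (5.7 km)
t0, v, eta = 2.0, 2000.0, 0.1            # [s], NMO velocity [m/s], anellipticity
# At far offsets the hyperbolic assumption over-predicts traveltime,
# which is what produces the residual moveout corrected here.
print(t_anisotropic(x, t0, v, eta) - t_hyperbolic(x, t0, v))
```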