• Title/Summary/Keyword: noise analysis


Study on Standardization of the Environmental Impact Evaluation Method of Extremely Low Frequency Magnetic Fields near High Voltage Overhead Transmission Lines (고압 가공송전선로의 극저주파자기장 환경영향평가 방법 표준화에 관한 연구)

  • Park, Sung-Ae;Jung, Joonsig;Choi, Taebong;Jeong, Minjoo;Kim, Bu-Kyung;Lee, Jongchun
    • Journal of Environmental Impact Assessment / v.27 no.6 / pp.658-673 / 2018
  • Social conflicts over extremely low frequency magnetic field (ELF-MF) exposure are expected to intensify due to the continued increase in electric power demand and the construction of high-voltage transmission lines (HVTL). However, the current environmental impact assessment (EIA) act does not include concrete guidelines for the EIA of ELF-MF. Therefore, this study standardized the EIA method through case analysis, field measurement, and expert consultation for ELF-MF near HVTL, which is the main source of exposure. The status of the EIA of ELF-MF and the problems to be improved were derived, and an EIA method that can address them is suggested. The main finding is that the physical characteristics of ELF-MF, which are affected by distance and power load, should be considered at all stages of the EIA (survey of the current situation - prediction of impacts - preparation of a mitigation plan - post-EIA planning). Based on this study, we also proposed the 'Measurement method for extremely low frequency magnetic field on transmission line' and the 'Table for extremely low frequency magnetic field measurement record on transmission line'. The results of this study can be applied to an EIA that minimizes the damage and conflict caused by the construction of transmission lines and derives rational measures at the present time, when the human hazard of long-term ELF-MF exposure remains unclear.
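
A minimal sketch, not part of the paper, illustrating the distance and power-load dependence the abstract refers to: the flux density around a single long straight conductor falls off as B = μ0·I/(2πr). Real three-phase HVTL geometry requires superposing the phase conductors, and the current value below is an assumed placeholder.

```python
# Illustration only: magnetic flux density from a single long line current,
# B = mu0 * I / (2 * pi * r). Real three-phase transmission lines are more
# complex; the assumed load current is a placeholder.
import math

MU0 = 4 * math.pi * 1e-7  # vacuum permeability, H/m

def b_field_uT(current_a: float, distance_m: float) -> float:
    """Flux density in microtesla at distance_m from a line current."""
    return MU0 * current_a / (2 * math.pi * distance_m) * 1e6

if __name__ == "__main__":
    load_a = 500.0  # assumed line current (power load), A
    for r in (10, 20, 40, 80):  # lateral distances from the conductor, m
        print(f"{r:3d} m : {b_field_uT(load_a, r):6.2f} uT")
```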

Reduction of Artifacts in Magnetic Resonance Imaging with Diamagnetic Substance (반자성 물질을 이용한 자기공명영상검사에서의 인공물 감소)

  • Choi, Woo Jeon;Kim, Dong Hyun
    • Journal of the Korean Society of Radiology / v.13 no.4 / pp.581-588 / 2019
  • MRI provides superior soft-tissue contrast and resolution, but artifacts can affect diagnosis and render images unreadable. Metal inserted into teeth often interferes with imaging by causing geometric distortion, owing to the large susceptibility difference of ferromagnetic or paramagnetic materials, so reducing these artifacts is necessary. The purpose of this study was to analyze metal artifact reduction using diamagnetic materials. Stainless steel orthodontic brackets and wire were used as the magnetic material, and copper, zinc, and bismuth were used as the diamagnetic materials. Measurements were performed on 1.5 T and 3 T systems using SE, TSE, GE, and EPI sequences. A self-produced phantom of 10% agarose gel was used to obtain a uniform signal; the artifact-causing stainless steel was placed at the center of the phantom, covered with a cube of each diamagnetic material with 10 mm sides, and examined. After the measurements, the artifact area was calculated in ImageJ using the Wand tool with a low threshold value of 10 on the image obtained by subtracting the image containing the magnetic material from the pure phantom image. Among the metal artifacts produced by the stainless steel, the reduction was greatest in the images with bismuth; copper and zinc slightly reduced the artifacts, but the difference was not large. This is thought to be because bismuth, which has the lowest diamagnetic susceptibility, offset most of the ferromagnetic susceptibility. Bismuth produced the fewest artifacts at both 1.5 T and 3 T. By sequence, artifacts were most reduced with TSE at 1.5 T and with SE at 3 T. The signal-to-noise ratio was lowest with the implant alone, and at 1.5 T the Implant + Bi, Cu, and Zn cases showed similar results to one another. Therefore, regarding artifact variation with diamagnetic material, bismuth, which has the lowest magnetic susceptibility (χ), showed the greatest reduction relative to the implant-only reference image, and the lower the susceptibility of the material, the more the metal artifacts decreased. Based on this study, diamagnetic materials are considered a way to reduce metal artifacts not only from dental prostheses but also from future orthodontic materials, addressing a drawback pointed out in conventional metal artifact reduction approaches.
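
A rough NumPy analogue, not the study's exact ImageJ workflow, of the artifact-area measurement described above: subtract the pure-phantom image from the image containing the metal, threshold the difference, and count pixels. The threshold and pixel spacing below are illustrative assumptions.

```python
# Sketch of the described subtraction-and-threshold measurement; threshold and
# pixel spacing are assumed values, and the arrays below are synthetic.
import numpy as np

def artifact_area_mm2(with_metal: np.ndarray,
                      phantom_only: np.ndarray,
                      threshold: float = 10.0,
                      pixel_mm: float = 1.0) -> float:
    """Area (mm^2) where the absolute signal difference exceeds the threshold."""
    diff = np.abs(with_metal.astype(float) - phantom_only.astype(float))
    mask = diff > threshold
    return float(mask.sum()) * pixel_mm ** 2

# Synthetic example: a 256x256 phantom with a simulated signal void.
rng = np.random.default_rng(0)
phantom = rng.normal(100, 2, (256, 256))
metal = phantom.copy()
metal[120:136, 120:136] -= 80  # simulated susceptibility artifact
print(artifact_area_mm2(metal, phantom, threshold=10.0, pixel_mm=0.9))
```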

Extraction of Water Body Area using Micro Satellite SAR: A Case Study of the Daecheong Dam of South Korea (초소형 SAR 위성을 활용한 수체면적 추출: 대청댐 유역 대상)

  • PARK, Jongsoo;KANG, Ki-Mook;HWANG, Eui-Ho
    • Journal of the Korean Association of Geographic Information Studies / v.24 no.4 / pp.41-54 / 2021
  • Estimating water body area using remote sensing is essential for water resource management and for analyzing and predicting water-related disaster damage. Water body detection from satellites has mainly been performed with large satellites equipped with optical and SAR sensors. However, their long revisit cycles limit timely use in the event of a disaster. The recent active development of micro satellites provides an opportunity to overcome the temporal resolution limitations of existing large satellites. The micro satellites currently in active operation are Finland's ICEYE and the United States' Capella satellites, which are operated as constellations for Earth observation. Owing to constellation operation, they offer short revisit cycles and high resolution, and their SAR sensors can observe regardless of weather or time of day. In this study, the operational status and characteristics of micro satellites are described, and a water area estimation technique optimized for micro SAR satellite images was applied to the Daecheong Dam basin on the Korean Peninsula. Accuracy was verified against reference water extents generated from the optical Sentinel-2 satellite. The Capella satellite showed the smallest difference in area, and all three images showed high correlation with the reference. These results confirm that water area can be estimated despite the low NESZ of micro satellites, and suggest that the limitations of water resource and water disaster monitoring with existing large SAR satellites can be overcome.
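
The paper's exact water-extraction algorithm is not given in the abstract; a common baseline, sketched below under that assumption, is thresholding the SAR backscatter in dB, since calm water appears dark. The scene and pixel spacing are synthetic placeholders.

```python
# Baseline sketch (not necessarily the paper's method): Otsu thresholding of
# sigma0 (dB) to classify water and convert the pixel count to an area.
import numpy as np
from skimage.filters import threshold_otsu

def water_area_km2(sigma0_db: np.ndarray, pixel_m: float) -> float:
    """Classify water as pixels below an Otsu threshold on sigma0 (dB)."""
    thresh = threshold_otsu(sigma0_db)
    water_mask = sigma0_db < thresh
    return water_mask.sum() * (pixel_m ** 2) / 1e6

if __name__ == "__main__":
    # Synthetic scene: dark water patch inside brighter land clutter.
    rng = np.random.default_rng(1)
    scene = rng.normal(-8.0, 1.5, (1000, 1000))                    # land, dB
    scene[300:700, 300:700] = rng.normal(-20.0, 1.0, (400, 400))   # water, dB
    print(f"estimated water area: {water_area_km2(scene, pixel_m=1.0):.2f} km^2")
```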

Benchmark Test Study of Localized Digital Streamer System (국산화 디지털 스트리머 시스템의 벤치마크 테스트 연구)

  • Jungkyun Shin;Jiho Ha;Gabseok Seo;Young-Jun Kim;Nyeonkeon Kang;Jounggyu Choi;Dongwoo Cho;Hanhui Lee;Seong-Pil Kim
    • Geophysics and Geophysical Exploration / v.26 no.2 / pp.52-61 / 2023
  • The use of ultra-high-resolution (UHR) seismic surveys to precisely characterize coastal and shallow structures has increased recently. UHR surveys achieve a spatial resolution of 3.125 m using a high-frequency source (80 Hz to 1 kHz). A digital streamer system is an essential module for acquiring high-quality UHR seismic data. Localization studies have focused on reducing purchase costs and shortening maintenance periods. Basic performance verification and application tests of the developed streamer have been carried out successfully; however, a comparative analysis with the existing benchmark model had not been conducted. In this study, we characterized data acquired simultaneously with the developed streamer and a benchmark model. The Tamhae 2 and auxiliary equipment of the Korea Institute of Geoscience and Mineral Resources were used to acquire 2D seismic data, which were analyzed from different perspectives. The data obtained using the developed streamer differed in sensitivity from those of the benchmark model by frequency band. Both types of data showed a very high level of similarity in the range corresponding to the central frequency band of the seismic source. However, in the low-frequency band below 60 Hz, data obtained using the developed streamer showed a lower signal-to-noise ratio than that obtained using the benchmark model. This lower ratio can degrade data quality when acquiring data with low-frequency sources such as clustered air guns. Three causes of this difference were identified, and future streamer development will attempt to reflect these improvements.
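
A hedged sketch, not the study's processing flow, of how the below-60 Hz comparison between two streamers could be quantified: integrate Welch power spectral densities over the low-frequency band and form a band-limited SNR. The separation into "signal" and "noise" records is an assumption for illustration.

```python
# Band-limited SNR comparison sketch using SciPy's Welch PSD estimate.
import numpy as np
from scipy.signal import welch

def band_power(trace: np.ndarray, fs: float, fmin: float, fmax: float) -> float:
    """Integrated PSD of a trace between fmin and fmax (Hz)."""
    f, pxx = welch(trace, fs=fs, nperseg=1024)
    band = (f >= fmin) & (f <= fmax)
    df = f[1] - f[0]
    return float(pxx[band].sum() * df)

def band_snr_db(signal_trace: np.ndarray, noise_trace: np.ndarray,
                fs: float, fmin: float = 0.0, fmax: float = 60.0) -> float:
    """SNR (dB) in the low-frequency band from separate signal/noise records."""
    return 10.0 * np.log10(band_power(signal_trace, fs, fmin, fmax) /
                           band_power(noise_trace, fs, fmin, fmax))

# Usage idea: compute band_snr_db for the developed streamer and the benchmark
# model on the same shot gather and compare the two values below 60 Hz.
```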

A Study on the Digital Drawing of Archaeological Relics Using Open-Source Software (오픈소스 소프트웨어를 활용한 고고 유물의 디지털 실측 연구)

  • LEE Hosun;AHN Hyoungki
    • Korean Journal of Heritage: History & Science / v.57 no.1 / pp.82-108 / 2024
  • With the transition of archaeological recording methods from analog to digital, 3D scanning technology has been actively adopted in the field. Research on digital archaeological data gathered from 3D scanning and photogrammetry is being conducted continuously. However, due to cost and manpower issues, most buried cultural heritage organizations hesitate to adopt such digital technology. This paper presents a digital recording method for relics that utilizes open-source software and photogrammetry, which is believed to be the most efficient of the 3D scanning methods. The digital recording process consists of three stages: acquiring a 3D model, creating a joining map with the edited 3D model, and creating a digital drawing. To enhance accessibility, the method uses only open-source software throughout the entire process. The results confirm that, in the quantitative evaluation, the deviation between measurements of the actual artifact and the 3D model was minimal. In addition, the quantitative quality analyses from the open-source software and the commercial software showed high similarity. However, data processing was overwhelmingly faster with the commercial software, which is attributed to the higher computational speed of its improved algorithms. In the qualitative evaluation, some differences in mesh and texture quality occurred. The 3D models generated by open-source software showed noise on the mesh surface, rough mesh surfaces, and difficulty in confirming the production marks and patterns of the relics. Nevertheless, some of the open-source software produced quality comparable to that of the commercial software in both the quantitative and qualitative evaluations. The open-source 3D model editing software was able not only to post-process, match, and merge the 3D models, but also to adjust scale, produce joining surfaces, and render the images needed for the actual measurement of relics. The final drawing was traced in a CAD program, which is also open-source software. In archaeological research, photogrammetry is applicable to various processes, including excavation, report writing, and research on numerical data from 3D models. With breakthrough developments in computer vision, the types of open-source software have diversified and their performance has improved significantly. With such highly accessible digital technology, 3D model data acquired in archaeology will serve as basic data for the preservation and active study of cultural heritage.
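
A minimal sketch of the kind of quantitative check described above: compare caliper measurements of the real artifact with the same dimensions measured on the 3D model and report the deviations. The dimension names and values below are placeholders, not the paper's data.

```python
# Illustrative deviation check between real-artifact and 3D-model measurements.
def measurement_deviation(real_mm: dict, model_mm: dict) -> dict:
    """Absolute (mm) and percentage deviation per measured dimension."""
    out = {}
    for key, real_value in real_mm.items():
        diff = model_mm[key] - real_value
        out[key] = (diff, 100.0 * diff / real_value)
    return out

# Hypothetical usage with placeholder values:
real = {"rim_diameter": 152.0, "height": 98.5, "wall_thickness": 6.2}
model = {"rim_diameter": 151.7, "height": 98.8, "wall_thickness": 6.1}
for dim, (mm, pct) in measurement_deviation(real, model).items():
    print(f"{dim}: {mm:+.2f} mm ({pct:+.2f}%)")
```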

Performance Evaluation of Siemens CTI ECAT EXACT 47 Scanner Using NEMA NU2-2001 (NEMA NU2-2001을 이용한 Siemens CTI ECAT EXACT 47 스캐너의 표준 성능 평가)

  • Kim, Jin-Su;Lee, Jae-Sung;Lee, Dong-Soo;Chung, June-Key;Lee, Myung-Chul
    • The Korean Journal of Nuclear Medicine / v.38 no.3 / pp.259-267 / 2004
  • Purpose: NEMA NU2-2001 was proposed as a new standard for the performance evaluation of whole-body PET scanners. In this study, the system performance of the Siemens CTI ECAT EXACT 47 PET scanner, including spatial resolution, sensitivity, scatter fraction, and count rate performance in 2D and 3D mode, was evaluated using this new standard method. Methods: The ECAT EXACT 47 is a BGO crystal based PET scanner with an axial field of view (FOV) of 16.2 cm. Retractable septa allow 2D and 3D data acquisition. All PET data were acquired according to the NEMA NU2-2001 protocols (coincidence window: 12 ns, energy window: 250~650 keV). For the spatial resolution measurement, an F-18 point source was placed at the center of the axial FOV ((a) x=0, y=1; (b) x=0, y=10; (c) x=10, y=0 cm) and at a position one fourth of the axial FOV from the center ((a) x=0, y=1; (b) x=0, y=10; (c) x=10, y=0 cm), where x and y are the transaxial horizontal and vertical directions and z is the scanner's axial direction. Images were reconstructed using FBP with a ramp filter without any post-processing. To measure the system sensitivity, the NEMA sensitivity phantom filled with F-18 solution and surrounded by 1~5 aluminum sleeves was scanned at the center of the transaxial FOV and at 10 cm offset from the center. Attenuation-free sensitivity values were estimated by extrapolating the data to zero wall thickness. The NEMA scatter phantom, 70 cm in length, was filled with F-18 or C-11 solution (2D: 2,900 MBq, 3D: 407 MBq), and coincidence count rates were measured for 7 half-lives to obtain the noise equivalent count rate (NECR) and scatter fraction. We confirmed that the dead time loss of the last frame was below 1%. The scatter fraction was estimated by averaging the true-to-background (scatter+random) ratios of the last 3 frames, in which the random rate fractions are negligibly small. Results: Axial and transverse resolutions at 1 cm offset from the center were 0.62 and 0.66 cm (FBP in 2D and 3D), and 0.67 and 0.69 cm (FBP in 2D and 3D). Axial, transverse radial, and transverse tangential resolutions at 10 cm offset from the center were 0.72 and 0.68 cm (FBP in 2D and 3D), 0.63 and 0.66 cm (FBP in 2D and 3D), and 0.72 and 0.66 cm (FBP in 2D and 3D). Sensitivity values were 708.6 (2D) and 2931.3 (3D) counts/sec/MBq at the center, and 728.7 (2D) and 3398.2 (3D) counts/sec/MBq at 10 cm offset from the center. Scatter fractions were 0.19 (2D) and 0.49 (3D). Peak true count rate and NECR were 64.0 kcps at 40.1 kBq/mL and 49.6 kcps at 40.1 kBq/mL in 2D, and 53.7 kcps at 4.76 kBq/mL and 26.4 kcps at 4.47 kBq/mL in 3D. Conclusion: The performance information on the CTI ECAT EXACT 47 PET scanner reported in this study will be useful for quantitative data analysis and for determining optimal image acquisition protocols for this widely used clinical and research scanner.
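
For reference, the NEMA NU2 count-rate quantities used above follow the standard definitions: scatter fraction SF = S/(T+S) and noise equivalent count rate NECR = T^2/(T+S+R), with T, S, R the true, scattered, and random coincidence rates. A small sketch with placeholder rates:

```python
# Standard NEMA NU2 count-rate quantities; the example rates are placeholders,
# not values from the paper.
def scatter_fraction(trues: float, scatters: float) -> float:
    """SF = S / (T + S)."""
    return scatters / (trues + scatters)

def necr(trues: float, scatters: float, randoms: float) -> float:
    """NECR = T^2 / (T + S + R), in the same rate units as the inputs."""
    return trues ** 2 / (trues + scatters + randoms)

# Hypothetical example: T=50, S=30, R=20 kcps
print(scatter_fraction(50.0, 30.0))  # 0.375
print(necr(50.0, 30.0, 20.0))        # 25.0 kcps
```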

Estimation of GARCH Models and Performance Analysis of Volatility Trading System using Support Vector Regression (Support Vector Regression을 이용한 GARCH 모형의 추정과 투자전략의 성과분석)

  • Kim, Sun Woong;Choi, Heung Sik
    • Journal of Intelligence and Information Systems / v.23 no.2 / pp.107-122 / 2017
  • Volatility in stock market returns is a measure of investment risk. It plays a central role in portfolio optimization, asset pricing, and risk management, as well as in most theoretical financial models. Engle (1982) presented a pioneering paper on stock market volatility that explains the time-varying characteristics embedded in stock market return volatility. His model, Autoregressive Conditional Heteroscedasticity (ARCH), was generalized by Bollerslev (1986) into the GARCH models. Empirical studies have shown that GARCH models describe well the fat-tailed return distributions and the volatility clustering observed in stock prices. The parameters of GARCH models are generally estimated by maximum likelihood estimation (MLE) based on the standard normal density. However, since the 1987 Black Monday crash, stock market prices have become very complex and contain many noisy terms. Recent studies have started to apply artificial intelligence approaches to estimating the GARCH parameters as a substitute for MLE. This paper presents an SVR-based GARCH process and compares it with the MLE-based GARCH process for estimating the parameters of GARCH models, which are known to forecast stock market volatility well. The kernel functions used in the SVR estimation process are linear, polynomial, and radial. We analyzed the suggested models with the KOSPI 200 Index, which comprises 200 blue-chip stocks listed on the Korea Exchange. We sampled KOSPI 200 daily closing values from 2010 to 2015, giving 1,487 observations. We used 1,187 days to train the suggested GARCH models, and the remaining 300 days were used as testing data. First, symmetric and asymmetric GARCH models were estimated by MLE. We forecasted KOSPI 200 Index return volatility, and the MSE metric shows better results for the asymmetric GARCH models such as E-GARCH and GJR-GARCH. This is consistent with the documented non-normal return distribution characteristics of fat tails and leptokurtosis. Compared with the MLE estimation process, SVR-based GARCH models outperform the MLE methodology in forecasting KOSPI 200 Index return volatility. The polynomial kernel function shows exceptionally lower forecasting accuracy. We suggest an Intelligent Volatility Trading System (IVTS) that utilizes the forecasted volatility. The IVTS entry rules are as follows: if tomorrow's forecasted volatility increases, buy volatility today; if it decreases, sell volatility today; if the forecasted volatility direction does not change, hold the existing buy or sell position. IVTS is assumed to buy and sell historical volatility values. This is somewhat unrealistic because historical volatility values themselves cannot be traded, but the simulation results are meaningful since the Korea Exchange introduced a volatility futures contract in November 2014 that traders can use. The trading systems with SVR-based GARCH models show higher returns than MLE-based GARCH in the testing period. The profitable-trade percentages of MLE-based GARCH IVTS models range from 47.5% to 50.0%, while those of SVR-based GARCH IVTS models range from 51.8% to 59.7%. MLE-based symmetric S-GARCH shows a +150.2% return and SVR-based symmetric S-GARCH shows +526.4%. MLE-based asymmetric E-GARCH shows -72% and SVR-based asymmetric E-GARCH shows +245.6%. MLE-based asymmetric GJR-GARCH shows -98.7% and SVR-based asymmetric GJR-GARCH shows +126.3%. The linear kernel function shows higher trading returns than the radial kernel function. The best performance of SVR-based IVTS is +526.4%, and that of MLE-based IVTS is +150.2%. SVR-based GARCH IVTS shows higher trading frequency. This study has some limitations. Our models are based solely on SVR; other artificial intelligence models should be examined for better performance. We do not consider trading costs, including brokerage commissions and slippage. IVTS trading performance is also unrealistic since we use historical volatility values as trading objects. Exact forecasting of stock market volatility is essential for real trading as well as for asset pricing models. Further studies on other machine-learning-based GARCH models can give better information to stock market investors.
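
A hedged sketch, not the authors' exact estimation scheme: the GARCH(1,1) variance recursion sigma2_t = omega + alpha*r_{t-1}^2 + beta*sigma2_{t-1}, plus a crude SVR stand-in that regresses squared returns on their lags with scikit-learn's linear, polynomial, or radial kernels (as mentioned in the abstract). Lag order and the trading-rule comment are illustrative assumptions.

```python
# GARCH(1,1) variance path and an SVR-based squared-return forecast sketch.
import numpy as np
from sklearn.svm import SVR

def garch11_variance(returns: np.ndarray, omega: float, alpha: float,
                     beta: float) -> np.ndarray:
    """Conditional variance path for given GARCH(1,1) parameters."""
    sigma2 = np.empty_like(returns, dtype=float)
    sigma2[0] = returns.var()
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1] ** 2 + beta * sigma2[t - 1]
    return sigma2

def svr_volatility_forecast(returns: np.ndarray, kernel: str = "rbf") -> float:
    """One-step-ahead squared-return forecast from two lagged squared returns."""
    r2 = returns ** 2
    X = np.column_stack([r2[:-2], r2[1:-1]])  # lags t-2 and t-1 as features
    y = r2[2:]
    model = SVR(kernel=kernel).fit(X, y)
    return float(model.predict(r2[-2:].reshape(1, -1))[0])

# Hypothetical IVTS-style rule: buy volatility if the forecast exceeds today's
# squared return, sell if it is lower, otherwise hold the current position.
```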