• Title/Summary/Keyword: performance risk

Search Result 1,192, Processing Time 0.03 seconds

Estimation of GARCH Models and Performance Analysis of Volatility Trading System using Support Vector Regression (Support Vector Regression을 이용한 GARCH 모형의 추정과 투자전략의 성과분석)

  • Kim, Sun Woong;Choi, Heung Sik
    • Journal of Intelligence and Information Systems
    • /
    • v.23 no.2
    • /
    • pp.107-122
    • /
    • 2017
  • Volatility in stock market returns is a measure of investment risk. It plays a central role in portfolio optimization, asset pricing, and risk management, as well as in most theoretical financial models. Engle (1982) presented a pioneering paper on stock market volatility that explains the time-varying characteristics embedded in stock market return volatility. His model, Autoregressive Conditional Heteroscedasticity (ARCH), was generalized by Bollerslev (1986) as the GARCH models. Empirical studies have shown that GARCH models describe well the fat-tailed return distributions and volatility clustering phenomena appearing in stock prices. The parameters of GARCH models are generally estimated by maximum likelihood estimation (MLE) based on the standard normal density. However, since the 1987 Black Monday crash, stock market prices have become very complex and contain a great deal of noise. Recent studies have begun to apply artificial intelligence approaches to estimating the GARCH parameters as a substitute for MLE. This paper presents an SVR-based GARCH estimation process and compares it with the MLE-based process for estimating the parameters of GARCH models, which are known to forecast stock market volatility well. The kernel functions used in the SVR estimation process are linear, polynomial, and radial. We analyzed the suggested models with the KOSPI 200 Index, which is composed of 200 blue-chip stocks listed on the Korea Exchange. We sampled KOSPI 200 daily closing values from 2010 to 2015, for 1,487 observations in total; 1,187 days were used to train the suggested GARCH models and the remaining 300 days were used as test data. First, symmetric and asymmetric GARCH models were estimated by MLE. We forecasted KOSPI 200 Index return volatility, and the MSE metric shows better results for the asymmetric GARCH models such as E-GARCH and GJR-GARCH. This is consistent with the documented non-normal return distribution characteristics of fat tails and leptokurtosis.
Compared with the MLE estimation process, SVR-based GARCH models outperform the MLE methodology in forecasting KOSPI 200 Index return volatility, although the polynomial kernel function shows exceptionally low forecasting accuracy. We suggest an Intelligent Volatility Trading System (IVTS) that utilizes the forecasted volatility. The IVTS entry rules are as follows: if tomorrow's forecasted volatility is higher, buy volatility today; if it is lower, sell volatility today; if the forecasted direction does not change, hold the existing buy or sell position. IVTS is assumed to buy and sell historical volatility values. This is somewhat unrealistic, because historical volatility values themselves cannot be traded, but our simulation results are meaningful since the Korea Exchange introduced a tradable volatility futures contract in November 2014. The trading systems with SVR-based GARCH models show higher returns than the MLE-based GARCH systems in the test period. The profitable-trade percentages of the MLE-based GARCH IVTS models range from 47.5% to 50.0%, while those of the SVR-based GARCH IVTS models range from 51.8% to 59.7%. The MLE-based symmetric S-GARCH shows a +150.2% return versus +526.4% for the SVR-based symmetric S-GARCH; the MLE-based asymmetric E-GARCH shows -72% versus +245.6% for the SVR-based E-GARCH; and the MLE-based asymmetric GJR-GARCH shows -98.7% versus +126.3% for the SVR-based GJR-GARCH. The linear kernel function shows higher trading returns than the radial kernel function. The best SVR-based IVTS performance is +526.4%, against +150.2% for the best MLE-based IVTS. The SVR-based GARCH IVTS also shows higher trading frequency. This study has some limitations. Our models are based solely on SVR; other artificial intelligence models should be explored for better performance. We do not consider costs incurred in the trading process, including brokerage commissions and slippage.
The IVTS trading performance is hypothetical, since we use historical volatility values as trading objects. Exact forecasting of stock market volatility is essential in real trading as well as in asset pricing models. Further studies on other machine learning-based GARCH models can give better information to stock market investors.
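The entry rules quoted in the abstract can be sketched as a small signal generator. This is an illustrative reading of the stated rules only, not the authors' implementation; the function name and input format are assumptions.

```python
# Hypothetical sketch of the IVTS entry rules described above: buy volatility
# when tomorrow's forecast exceeds today's, sell when it is lower, and hold the
# existing position when the forecast direction is unchanged.

def ivts_positions(forecast):
    """Return +1 (long volatility) / -1 (short volatility) for each day,
    given a sequence of forecasted volatility values."""
    positions = []
    current = 0  # flat before the first directional signal
    for today, tomorrow in zip(forecast[:-1], forecast[1:]):
        if tomorrow > today:
            current = 1      # forecasted rise -> buy volatility today
        elif tomorrow < today:
            current = -1     # forecasted fall -> sell volatility today
        # unchanged forecast -> keep the existing position
        positions.append(current)
    return positions

print(ivts_positions([0.12, 0.15, 0.15, 0.10]))  # rise, unchanged, fall
```

The unchanged middle forecast keeps the long position opened on the first day, matching the "hold existing positions" rule.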

Animal Infectious Diseases Prevention through Big Data and Deep Learning (빅데이터와 딥러닝을 활용한 동물 감염병 확산 차단)

  • Kim, Sung Hyun;Choi, Joon Ki;Kim, Jae Seok;Jang, Ah Reum;Lee, Jae Ho;Cha, Kyung Jin;Lee, Sang Won
    • Journal of Intelligence and Information Systems
    • /
    • v.24 no.4
    • /
    • pp.137-154
    • /
    • 2018
  • Animal infectious diseases, such as avian influenza and foot-and-mouth disease, occur almost every year and cause huge economic and social damage to the country. To prevent this, the quarantine authorities have made various human and material efforts, but the infectious diseases have continued to occur. Avian influenza was first identified in 1878 and rose to a national issue due to its high lethality. Foot-and-mouth disease is considered the most critical animal infectious disease internationally. In nations where the disease has not spread, foot-and-mouth disease is recognized as an economic or political disease, because it restricts international trade by complicating the import of processed and non-processed livestock, and because quarantine is costly. In a society where the whole nation is connected as a single zone of life, there is no way to fully prevent the spread of infectious disease. Hence, there is a need to be aware of an outbreak and to take action before the disease spreads. For both human and animal infectious diseases, an epidemiological investigation of confirmed cases is implemented as soon as the diagnosis is confirmed, and measures are taken to prevent the spread of disease according to the investigation results. The foundation of an epidemiological investigation is figuring out where a subject has been and whom he or she has met. From a data perspective, this can be defined as an action taken to predict the cause of a disease outbreak, the outbreak location, and future infections by collecting and analyzing geographic and relational data. Recently, attempts have been made to develop infectious disease prediction models using Big Data and deep learning technology, but there is little active research in the form of model-building studies and case reports.
KT and the Ministry of Science and ICT have been carrying out big data projects since 2014, as part of national R&D projects, to analyze and predict the routes of livestock-related vehicles. To prevent animal infectious diseases, the researchers first developed a prediction model based on regression analysis using vehicle movement data. After that, more accurate prediction models were constructed using machine learning algorithms such as Logistic Regression, Lasso, Support Vector Machine, and Random Forest. In particular, the 2017 prediction model added the diffusion risk of facilities, and its performance was improved by tuning the model's hyper-parameters in various ways. The Confusion Matrix and ROC Curve show that the model constructed in 2017 is superior to the earlier machine learning model. The difference between the 2016 and 2017 models is that the later model also used visit information on facilities such as feed factories and slaughterhouses, and expanded the poultry data, previously limited to chickens and ducks, to include geese and quail. In addition, an explanation of the results was added in 2017 to help the authorities make decisions and to establish a basis for persuading stakeholders. This study reports an animal infectious disease prevention system constructed on the basis of Big Data on hazardous vehicle movements, farms, and the environment. The significance of this study is that it describes the evolution of a prediction model using Big Data in the field; the model is expected to be more complete if the form of the viruses is taken into consideration. This will contribute to data utilization and analysis model development in related fields. We also expect that the system constructed in this study will provide more preventive and effective protection.
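The model-comparison step the abstract describes can be sketched as follows, assuming scikit-learn is available. The vehicle-movement features are not public, so synthetic data from `make_classification` stands in for them; everything here is illustrative, not the project's pipeline.

```python
# Minimal sketch of comparing several classifiers on the same data by ROC AUC,
# in the spirit of the abstract's Logistic Regression / Random Forest comparison.
# The synthetic dataset is a stand-in for the (non-public) vehicle movement data.
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score

X, y = make_classification(n_samples=500, n_features=10, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

models = {
    "logistic": LogisticRegression(max_iter=1000),
    "random_forest": RandomForestClassifier(n_estimators=100, random_state=0),
}
for name, model in models.items():
    model.fit(X_tr, y_tr)
    auc = roc_auc_score(y_te, model.predict_proba(X_te)[:, 1])
    print(f"{name}: AUC = {auc:.3f}")
```

ROC AUC summarizes the ROC Curve the authors used; a confusion matrix at a chosen threshold would give the complementary view.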

Efficacy of a Protective Glass Shield in Reduction of Radiation Exposure Dose During Interventional Radiology (방사선학적 중재적 시술시 납유리의 방사선 방어효과에 관한 연구)

  • Jang, Young-Ill;Song, Jong-Nam;Kim, Young-Jae
    • Journal of the Korean Society of Radiology
    • /
    • v.5 no.5
    • /
    • pp.303-308
    • /
    • 2011
  • Background/Aims: The increasing use of diagnostic and therapeutic interventional radiology calls for greater consideration of the radiation exposure risk to radiologists and radiological technicians, and emphasizes the need for a proper system of radiation protection. This study was designed to assess the effect of a protective glass shield. Methods: The protective glass shield had the following specifications: depth 0.8 cm; width 100 cm; length 100 cm; lead equivalent 1.6 mmPb. The shield was located between the patient and the radiologist. Thirty patients (13 male and 17 female) undergoing interventional radiology between September 2010 and December 2010 were selected for this study. The radiation exposure dose was recorded with and without the protective glass shield at the level of the head, chest, and pelvis. The measurements were made at 50 cm and 150 cm from the radiation source. Results: The mean patient age was 69 years. The mean patient height and weight were 159.7 ± 6.7 cm and 60.3 ± 5.9 kg, respectively. The mean body mass index (BMI) was 20.5 ± 3.0 kg/m². Radiologists received 1530.2 ± 550.0 mR/hr without the protective shield. At the same distance, radiation exposure was significantly reduced to 50.3 ± 85.2 mR/hr with the protective shield (p < 0.0001). The radiation exposure of radiologists and radiological technicians was significantly reduced by the use of the protective shield (p < 0.0001). The amount of radiation exposure during interventional radiology was related to the patient's BMI (r = 0.749, p = 0.001). Conclusions: The protective glass shield is effective in protecting radiologists and radiological technicians from radiation exposure.
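The BMI-exposure association the study reports is a Pearson correlation (r = 0.749). The calculation itself can be shown in a few lines; the data below are invented purely to demonstrate the formula, not the study's measurements.

```python
# Illustrative Pearson correlation computation (the statistic behind the
# reported r = 0.749 between patient BMI and radiation exposure).
from math import sqrt

def pearson_r(xs, ys):
    """Sample Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

bmi = [18.2, 19.5, 20.1, 21.8, 23.4]       # hypothetical BMI values (kg/m²)
exposure = [1100, 1250, 1400, 1500, 1700]  # hypothetical exposure (mR/hr)
print(round(pearson_r(bmi, exposure), 3))
```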

Mapping Precise Two-dimensional Surface Deformation on Kilauea Volcano, Hawaii using ALOS-2 PALSAR-2 Spotlight SAR Interferometry (ALOS-2 PALSAR-2 Spotlight 영상의 위성레이더 간섭기법을 활용한 킬라우에아 화산의 정밀 2차원 지표변위 매핑)

  • Hong, Seong-Jae;Baek, Won-Kyung;Jung, Hyung-Sup
    • Korean Journal of Remote Sensing
    • /
    • v.35 no.6_3
    • /
    • pp.1235-1249
    • /
    • 2019
  • Kilauea Volcano is one of the most active volcanoes in the world. In this study, we used ALOS-2 PALSAR-2 satellite imagery to measure the surface deformation occurring near the summit of Kilauea Volcano from 2015 to 2017. To measure two-dimensional surface deformation, interferometric synthetic aperture radar (InSAR) and multiple aperture SAR interferometry (MAI) methods were applied to two interferometric pairs. To improve the precision of the 2D measurement, we compared the root-mean-squared deviation (RMSD) of the difference between measurements as we changed the effective antenna length and the normalized squint value, the factors that can affect the measurement performance of the MAI method. Through this comparison, the factor values that measure deformation most precisely were selected. After selecting the optimal values, the RMSD of the difference between the MAI measurements decreased from 4.07 cm to 2.05 cm. In the two interferograms, the maximum line-of-sight deformations are -28.6 cm and -27.3 cm, respectively; the maximum along-track deformations are 20.2 cm and 20.8 cm, and in the opposite direction -24.9 cm and -24.3 cm, respectively. After stacking the two interferograms, two-dimensional surface deformation mapping was performed, and a maximum surface deformation of approximately 30.4 cm was measured in the northwest direction. In addition, large deformations of more than 20 cm were measured in all directions. The measurement results show that the risk of eruptive activity was increasing at Kilauea Volcano. The surface deformation measurements of Kilauea Volcano from 2015 to 2017 are expected to be helpful for future studies of its eruptive activity.
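The precision metric used above, the RMSD of the difference between two measurements of the same area, is straightforward to state in code. The arrays below are illustrative placeholders, not ALOS-2 data.

```python
# Sketch of the RMSD criterion used to tune the MAI parameters: the smaller the
# RMSD between two independent displacement measurements, the more precise the
# chosen effective antenna length / normalized squint value.
from math import sqrt

def rmsd(a, b):
    """Root-mean-squared deviation between two equal-length measurement series."""
    return sqrt(sum((x - y) ** 2 for x, y in zip(a, b)) / len(a))

pair1 = [10.0, -5.0, 3.0]  # hypothetical MAI displacements, parameter set 1 (cm)
pair2 = [12.0, -4.0, 2.0]  # hypothetical MAI displacements, parameter set 2 (cm)
print(round(rmsd(pair1, pair2), 3))
```

In the paper this value drops from 4.07 cm to 2.05 cm once the optimal factor values are selected.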

Anisotropic radar crosshole tomography and its applications (이방성 레이다 시추공 토모그래피와 그 응용)

  • Kim Jung-Ho;Cho Seong-Jun;Yi Myeong-Jong
    • Korean Society of Earth and Exploration Geophysicists: Conference Proceedings
    • /
    • 2005.09a
    • /
    • pp.21-36
    • /
    • 2005
  • Although the main geology of Korea consists of granite and gneiss, it is not uncommon to encounter anisotropy phenomena in crosshole radar tomography, even when the basement is crystalline rock. To solve the anisotropy problem, we have developed and continuously upgraded an anisotropic inversion algorithm that assumes heterogeneous elliptic anisotropy to reconstruct three kinds of tomograms: tomograms of maximum and minimum velocities, and of the direction of the symmetry axis. In this paper, we discuss the developed algorithm and introduce some case histories of the application of anisotropic radar tomography in Korea. The first two case histories were conducted for the construction of infrastructure, and their main objective was to locate cavities in limestone. The last two were performed in granite and gneiss areas. The anisotropy in the granite area was caused by fine fissures aligned in the same direction, while that in the gneiss and limestone area was caused by the alignment of the constituent minerals. Through these case histories, we show that the anisotropic characteristics themselves give additional important information for understanding the internal state of the basement rock. In particular, the anisotropy ratio, defined as the normalized difference between the maximum and minimum velocities, as well as the direction of maximum velocity, are helpful in interpreting borehole radar tomograms.
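The anisotropy ratio mentioned in the last sentence can be sketched as below. The abstract says only "normalized difference between maximum and minimum velocities"; normalizing by the mean velocity is one plausible reading, and the paper may normalize differently.

```python
# Hedged sketch of the anisotropy ratio: the difference between the maximum and
# minimum velocities from the anisotropic inversion, normalized here by their
# mean (an assumption -- the exact normalization is not given in the abstract).

def anisotropy_ratio(v_max, v_min):
    """Normalized max-min velocity difference; 0 means isotropic."""
    return (v_max - v_min) / ((v_max + v_min) / 2.0)

# hypothetical velocities (m/us) from maximum- and minimum-velocity tomograms
print(round(anisotropy_ratio(120.0, 100.0), 3))
```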


Effect of Attenuation Correction, Scatter Correction and Resolution Recovery on Diagnostic Performance of Quantitative Myocardial SPECT for Coronary Artery Disease (감쇠보정, 산란보정 및 해상도복원이 정량적 심근 SPECT의 관상동맥질환 진단성능에 미치는 효과)

  • Hwang, Kyung-Hoon;Lee, Dong-Soo;Paeng, Jin-Chul;Lee, Myoung-Mook;Chung, June-Key;Lee, Myung-Chul
    • The Korean Journal of Nuclear Medicine
    • /
    • v.36 no.5
    • /
    • pp.288-297
    • /
    • 2002
  • Purpose: Soft-tissue attenuation and scattering are major methodological limitations of myocardial perfusion SPECT. To overcome these limitations, algorithms for attenuation correction, scatter correction, and resolution recovery (ASCRR) are being developed, while quantitative myocardial SPECT has also become available. In this study, we investigated the efficacy of an ASCRR-corrected quantitative myocardial SPECT method for the diagnosis of coronary artery disease (CAD). Materials and Methods: Seventy-five patients (M:F = 51:24, 61.0 ± 8.9 years old) suspected of CAD who underwent coronary angiography (CAG) within 7 ± 12 days of SPECT (Group I), and 20 subjects (M:F = 10:10, age 40.6 ± 9.4) with a low likelihood of CAD (Group II), were enrolled. Tl-201 rest / dipyridamole-stress Tc-99m-MIBI gated myocardial SPECT was performed. ASCRR correction was performed using a Gd-153 line source and automatic software (Vantage-Pro; ADAC Labs, USA). Using a 20-segment model, segmental perfusion was automatically quantified on both the ASCRR-corrected and uncorrected images with automatic quantification software (AutoQUANT; ADAC Labs). Using these quantified values, CAD was diagnosed in each of the three coronary arterial territories, and the diagnostic performance of ASCRR-corrected SPECT was compared with that of non-corrected SPECT. Results: Among the 75 patients of Group I, 9 had normal CAG, while the remaining 66 had 155 arterial lesions: 61 left anterior descending (LAD), 48 left circumflex (LCX), and 46 right coronary artery (RCA) lesions. For the LAD and LCX lesions, there was no significant difference in diagnostic performance. In the Group II subjects, the overall normalcy rate improved, but the improvement was not statistically significant (p = 0.07). For RCA lesions, however, specificity improved significantly while sensitivity worsened significantly with ASCRR correction (both p < 0.05); overall accuracy was unchanged.
Conclusion: The ASCRR correction did not significantly improve overall diagnostic performance, although the diagnostic specificity for RCA lesions improved on quantitative myocardial SPECT. Clinical application of the ASCRR correction requires more discretion regarding cost and efficacy.
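The sensitivity/specificity trade-off reported for RCA lesions is defined from confusion-matrix counts, which can be made explicit. The counts below are invented for illustration; they are not the study's data.

```python
# Sensitivity and specificity from confusion-matrix counts, the per-territory
# metrics the study compares between ASCRR-corrected and uncorrected SPECT.

def sens_spec(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from true/false positive/negative counts."""
    sensitivity = tp / (tp + fn)  # fraction of diseased territories detected
    specificity = tn / (tn + fp)  # fraction of healthy territories cleared
    return sensitivity, specificity

# hypothetical counts for one coronary territory
sens, spec = sens_spec(tp=40, fn=6, tn=20, fp=9)
print(f"sensitivity = {sens:.3f}, specificity = {spec:.3f}")
```

A correction that raises specificity while lowering sensitivity, as ASCRR did for RCA lesions, shifts errors from false positives to false negatives without necessarily changing overall accuracy.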

A Study on the Selection of the Recommended Safety Distance Between Marine Structures and Ships Based on AIS Data (AIS 기반 해양시설물과 선박간 권고 안전이격거리 선정에 관한 연구)

  • Son, Woo-ju;Lee, Jeong-seok;Lee, Bo-kyeong;Cho, Ik-soon
    • Journal of Navigation and Port Research
    • /
    • v.43 no.6
    • /
    • pp.420-428
    • /
    • 2019
  • Although marine structures are a risk factor interfering with the passage of ships, regulations and laws provide no clear guidelines on the required safety distance between ships and marine structures. In this study, the width of the shipping route was set based on AIS data to analyze the separation distance between marine structures and ships, and the ships were classified by length. By analyzing the traffic distribution around marine structures, this study confirmed that the ships' passing distances follow a normal distribution. To statistically relate the observed traffic distribution to the fitted normal distribution, a traffic pattern analysis around the marine structures was performed. The traffic pattern differed by ship length, and a recommended safety distance for each length class is presented accordingly. Referring to the IMO (International Maritime Organization) standard turning circle and to the safety separation distances between ships and offshore wind turbines recommended by CESMA (Confederation of European Shipmasters' Associations) and PIANC (World Association for Waterborne Transport Infrastructure), the AIS data were screened for ships that did not maintain a candidate safety distance of 5 to 7 times the length overall (LOA). As a result, a recommended safety distance of 5.5 LOA was selected as appropriate, and on this basis two cases of recommended ship safety distances were proposed.
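The screening step described above can be sketched as flagging AIS tracks whose passing distance from a structure falls below a multiple of the ship's length overall (the study settles on 5.5 × LOA). Function name and data are invented for illustration.

```python
# Hypothetical sketch of screening AIS tracks against a safety distance
# expressed as a multiple of length overall (LOA); the study recommends 5.5.

def violates_safety_distance(passing_distance_m, loa_m, multiple=5.5):
    """True if a ship passed closer to the structure than multiple * LOA."""
    return passing_distance_m < multiple * loa_m

# hypothetical (passing distance, LOA) pairs in metres
ships = [(1000.0, 150.0), (600.0, 150.0), (900.0, 200.0)]
flags = [violates_safety_distance(d, loa) for d, loa in ships]
print(flags)
```

Expressing the threshold in LOA multiples rather than fixed metres is what lets the recommendation scale across the length classes the study distinguishes.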

ANC Caching Technique for Replacement of Execution Code on Active Network Environment (액티브 네트워크 환경에서 실행 코드 교체를 위한 ANC 캐싱 기법)

  • Jang Chang-bok;Lee Moo-Hun;Cho Sung-Hoon;Choi Eui-In
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.30 no.9B
    • /
    • pp.610-618
    • /
    • 2005
  • As the Internet and computing capabilities have developed, many users obtain information through the network, so user requirements on the network have rapidly increased and diversified. However, it takes a long time to accommodate new user requirements on the current network, and technologies such as active networks have been studied to solve this problem. An active node on an active network has the capability to store and process execution code, in addition to the packet-forwarding capability of a conventional network node. When a packet arrives at an active node, the execution code required to process it is needed; if that code is not present on the node, it must be requested from a previous active node or from a code server. But fetching the execution code from a previous active node or a code server introduces a time delay for transporting the code and increases network traffic and execution time. By storing execution code in a cache on the active node, execution time can be reduced and the number of requests decreased. This paper therefore proposes the ANC caching technique, which reduces the number of execution code requests and the execution time by efficiently storing execution code on active nodes. The ANC caching technique can decrease network traffic and code execution time by reducing requests for execution code from previous active nodes.
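The caching idea described above can be sketched as a bounded code cache on an active node. The ANC replacement policy itself is not specified in the abstract, so plain LRU is used here as a stand-in; class and method names are illustrative.

```python
# Sketch of an active node's execution-code cache: hits avoid re-fetching code
# from a previous active node or the code server; when full, the least recently
# used entry is evicted (LRU stands in for the ANC replacement policy).
from collections import OrderedDict

class ExecutionCodeCache:
    def __init__(self, capacity):
        self.capacity = capacity
        self._cache = OrderedDict()
        self.remote_fetches = 0  # requests sent to a previous node / code server

    def _fetch_remote(self, code_id):
        self.remote_fetches += 1
        return f"<code for {code_id}>"  # placeholder for the transferred code

    def get(self, code_id):
        if code_id in self._cache:
            self._cache.move_to_end(code_id)  # mark as recently used
            return self._cache[code_id]
        code = self._fetch_remote(code_id)
        if len(self._cache) >= self.capacity:
            self._cache.popitem(last=False)   # evict least recently used entry
        self._cache[code_id] = code
        return code

cache = ExecutionCodeCache(capacity=2)
for code_id in ["compress", "encrypt", "compress", "route"]:
    cache.get(code_id)
print(cache.remote_fetches)  # only the misses trigger remote fetches
```

Here the second "compress" request is served locally, so four packet arrivals cost only three remote fetches; this saving is exactly the traffic and delay reduction the ANC technique targets.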

Debris flow characteristics and sabo dam function in urban steep slopes (도심지 급경사지에서 토석류 범람 특성 및 사방댐 기능)

  • Kim, Yeonjoong;Kim, Taewoo;Kim, Dongkyum;Yoon, Jongsung
    • Journal of Korea Water Resources Association
    • /
    • v.53 no.8
    • /
    • pp.627-636
    • /
    • 2020
  • Debris flow disasters primarily occur in mountainous terrain far from cities. As such, they have been underestimated as causing relatively little damage compared with other natural disasters. However, owing to urbanization, many residential areas and major facilities have been built in mountainous regions, and the frequency of debris flow disasters is steadily increasing with the increase in rainfall accompanying environmental and climate change. Thus, the risk of debris flow is on the rise. Nevertheless, only a few studies have explored the flooding characteristics of, and reduction measures for, debris flows in areas designated as steep slopes. It is therefore necessary to conduct research on securing independent disaster prevention technology suitable for the environment of South Korea and reflecting its topographical characteristics, and to update and improve disaster prevention information. Accordingly, this study aimed to calculate the amount of debris flow according to the disaster prevention performance targets for regions designated as steep slopes in South Korea, and to develop an independent model that not only evaluates the impact of debris flow but also identifies debris barriers (sabo dams) that are optimal for mitigating damage. To validate the reliability of the two-dimensional debris flow model developed for the evaluation of debris barriers, the model's results were compared with those of a hydraulic model. Furthermore, the 2-D debris flow model was constructed in consideration of the regional characteristics around the steep slopes, to analyze the flow characteristics of debris that directly reaches the damaged area. The flow characteristics of the debris delivered downstream were further analyzed according to the specifications (height) and installation locations of the debris barriers employed to reduce the damage.
The experimental results showed that the reliability of the developed model is satisfactory; furthermore, this study confirmed significant performance degradation of debris barriers installed in areas with a slope of 20° or more, the slope range at which debris flows occur.

A Recidivism Prediction Model Based on XGBoost Considering Asymmetric Error Costs (비대칭 오류 비용을 고려한 XGBoost 기반 재범 예측 모델)

  • Won, Ha-Ram;Shim, Jae-Seung;Ahn, Hyunchul
    • Journal of Intelligence and Information Systems
    • /
    • v.25 no.1
    • /
    • pp.127-137
    • /
    • 2019
  • Recidivism prediction has been a subject of constant research by experts since the early 1970s, but it has become more important as crimes committed by recidivists steadily increase. In particular, after the US and Canada adopted the 'Recidivism Risk Assessment Report' as a decisive criterion in trials and parole screening in the 1990s, research on recidivism prediction became more active, and in the same period empirical studies on recidivism factors began in Korea as well. Although most recidivism prediction studies have so far focused on the factors of recidivism or on prediction accuracy, it is important to minimize the misclassification cost, because recidivism prediction has an asymmetric error cost structure. In general, the cost of misclassifying a person who will not reoffend as likely to reoffend is lower than the cost of misclassifying a person who will reoffend as unlikely to: the former incurs only additional monitoring costs, while the latter incurs substantial social and economic costs. Therefore, in this paper we propose an XGBoost (eXtreme Gradient Boosting; XGB) based recidivism prediction model that considers asymmetric error costs. In the first step of the model, XGB, recognized as a high-performance ensemble method in the field of data mining, was applied, and its results were compared with various prediction models such as LOGIT (logistic regression), DT (decision trees), ANN (artificial neural networks), and SVM (support vector machines). In the next step, the classification threshold was optimized to minimize the total misclassification cost, the weighted average of the FNE (False Negative Error) and FPE (False Positive Error). To verify the usefulness of the model, it was applied to a real recidivism prediction dataset.
As a result, the XGB model not only showed better prediction accuracy than the other prediction models but also reduced the misclassification cost most effectively.
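The threshold-optimization step can be sketched as a grid search over classification thresholds, minimizing a weighted misclassification cost in which a false negative (a missed recidivist) costs more than a false positive. The cost weights, probabilities, and function names below are illustrative only; the paper's actual weights are not stated in the abstract.

```python
# Sketch of choosing a classification threshold under asymmetric error costs:
# a false negative (predicting no recidivism for someone who reoffends) is
# weighted more heavily than a false positive. Weights and data are invented.

def total_cost(y_true, probs, threshold, fn_cost=5.0, fp_cost=1.0):
    """Weighted misclassification cost of thresholding predicted probabilities."""
    cost = 0.0
    for y, p in zip(y_true, probs):
        pred = 1 if p >= threshold else 0
        if y == 1 and pred == 0:
            cost += fn_cost  # missed recidivist: high social/economic cost
        elif y == 0 and pred == 1:
            cost += fp_cost  # extra monitoring of a non-recidivist: low cost
    return cost

def best_threshold(y_true, probs):
    grid = [i / 100 for i in range(1, 100)]
    return min(grid, key=lambda t: total_cost(y_true, probs, t))

y_true = [1, 1, 0, 0, 1, 0]                  # 1 = reoffended
probs  = [0.9, 0.4, 0.35, 0.2, 0.7, 0.6]     # model-predicted probabilities
print(best_threshold(y_true, probs))
```

Because false negatives are penalized five-fold here, the optimal threshold lands below the default 0.5, trading extra false positives for fewer missed recidivists, which is the core of the paper's asymmetric-cost design.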