• Title/Summary/Keyword: 통계적 일치 (statistical concordance)


The Inference of Early Goguryo Kings' Reign Period (고구려 초기 왕의 재위 기간 추정)

  • Lee, Geun-U;Rhee, Keun-Moo
    • Annual Conference of KIPS / 2013.05a / pp.1049-1052 / 2013
  • Since the late 1980s, the authors have attempted to analyze historiographical controversies using statistical and mathematical methods. Recently they undertook a series of studies to assess the authenticity of the Hwarang Segi, one of the disputed issues in historiography, and the present study was planned as part of that line of work. Referring to methods used to estimate ancient Japanese regnal chronology, this paper attempts to resolve the early chronology problem of the Samguk Sagi by estimating both the average length of a generation and the average length of a reign. Previous work did not use the concept of a generational average and relied only on the per-king average reign length, which varies widely, so the estimated reign periods carried large errors; moreover, many of the inferences rested on subjective judgment and thus failed to win broad support. We therefore make explicit that the generational average and the reign average do not necessarily coincide, and we develop an estimation method that combines the two.

A Comparison of the Independent Verification Methods for the Results of Leksell GammaPlan for Gamma Knife Predecessor with the Hemispherical Collimators (반구형 시준기를 가진 감마나이프에 대한 렉셀감마플랜 결과물의 독립적인 검증방법들의 비교)

  • Hur, Beong Ik
    • Journal of the Korean Society of Radiology / v.10 no.7 / pp.521-529 / 2016
  • Since Gamma Knife radiosurgery (GKRS) is based on a single-fraction high-dose treatment strategy, independent verification of the results of Leksell GammaPlan (LGP) is an important procedure for assuring patient safety and minimizing the risk of treatment errors. Several verification methods have been developed and reported previously. In this work those methods were implemented as previously proposed algorithms (PPA) and tested statistically on Leksell Gamma Knife (LGK) target treatments. The purpose of this study was to apply the PPA to LGK target treatments and evaluate their accuracy. The study included 10 patients with intracranial lesions treated by GKRS. We compared the data from the PPA and LGP in terms of maximum dose, arbitrary point dose, and treatment time at the isocenter locations. All data were analyzed with the paired t-test, a statistical method used to compare two different measurement techniques. No statistically significant difference in maximum dose between the PPA and LGP was observed across the 10 cases; differences in average maximum dose ranged from -0.53 Gy to 3.71 Gy. The arbitrary point doses calculated by the PPA and LGP were not statistically different either. However, for treatment time at the isocenter locations we found a statistically significant difference (p=0.021) between the TMR method and LGP. The PPA can be incorporated into a routine quality assurance (QA) procedure to minimize the chance of a wrong overdose. The statistical analyses demonstrated that the PPA were in excellent agreement with LGP for the maximum dose and the arbitrary point dose of the best GKRS plan. Given their easy applicability, we hope the PPA will be widely used.
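The paired t-test used above to compare two planning systems on the same patients can be sketched in a few lines. The function below is illustrative (not the authors' code): it returns the t statistic and degrees of freedom for two matched measurement series.

```python
import math

def paired_t_test(a, b):
    """Paired t-test on matched samples, e.g. doses computed by two
    planning systems for the same patients. Returns (t, df); compare
    |t| against a t-table to judge significance."""
    assert len(a) == len(b) and len(a) > 1
    d = [x - y for x, y in zip(a, b)]          # per-patient differences
    n = len(d)
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    t = mean_d / math.sqrt(var_d / n)          # mean difference / std. error
    return t, n - 1
```

In practice one would use `scipy.stats.ttest_rel`, which also returns the p-value.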

A Concordance Study of the Preprocessing Orders in Microarray Data (마이크로어레이 자료의 사전 처리 순서에 따른 검색의 일치도 분석)

  • Kim, Sang-Cheol;Lee, Jae-Hwi;Kim, Byung-Soo
    • The Korean Journal of Applied Statistics / v.22 no.3 / pp.585-594 / 2009
  • Researchers preprocess microarray images to convert raw scanned data into a form suitable for statistical analysis. Preprocessing of microarray data comprises image filtering, imputation, and normalization. Several normalization and imputation methods have been studied, but the order in which the two steps are applied, imputation first or normalization first, has not. This study examines the identification of differentially expressed genes (DEG) under different orderings of the preprocessing steps, using two-dye cDNA microarray data from colon cancer and gastric cancer. That is, we compare which combinations of imputation and normalization steps detect the DEG. We used two imputation methods (K-nearest neighbor and Bayesian principal component analysis) and three normalization methods (global, within-print-tip group, and variance stabilization), applied in either order, giving 12 preprocessing sequences in total. We measured the concordance of the DEG identified from the datasets produced by the 12 different preprocessing orders. When variance-stabilizing normalization was applied, the detected DEG varied somewhat with the preprocessing order.
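The abstract does not state which concordance measure was used; as a hypothetical illustration, the Jaccard index is one common way to score agreement between two DEG lists:

```python
def concordance(deg_a, deg_b):
    """Jaccard concordance between two gene lists detected as
    differentially expressed under two preprocessing orders:
    |intersection| / |union|, in [0, 1]."""
    a, b = set(deg_a), set(deg_b)
    if not (a | b):                 # both lists empty: perfect agreement
        return 1.0
    return len(a & b) / len(a | b)
```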

A Statistical Testing of the Consistency Index in Analytic Hierarchy Process (계층적 의사결정론에서 일관성 지수에 대한 통계적 검정)

  • Lee, Jong Chan;Jhun, Myoungshic;Jeong, Hyeong Chul
    • The Korean Journal of Applied Statistics / v.27 no.1 / pp.103-114 / 2014
  • Significant research has been devoted to the consistency index of the Analytic Hierarchy Process (AHP) from several perspectives. Critics of the consistency index note that its critical value depends on the average random index obtained from simulation studies using 9-point-scale comparison matrices. In our own simulation study with 9-point-scale comparison matrices, we found that the distribution of the consistency index is skewed, with a shape that depends on the dimension of the comparison matrix. From the simulation study, we provide a quantile table of the consistency index to assist decision making in AHP; in addition, under limited assumptions, the distribution of the consistency index can be approximated by a gamma distribution.
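The consistency index itself is standard AHP: CI = (λ_max − n)/(n − 1) for an n×n pairwise comparison matrix. A minimal sketch, with λ_max approximated via row-geometric-mean priority weights (an assumption for illustration; the paper's simulations are not reproduced here):

```python
import math

def consistency_index(A):
    """AHP consistency index CI = (lambda_max - n) / (n - 1) of a
    pairwise comparison matrix A, with lambda_max estimated as the
    mean of (A w)_i / w_i using row-geometric-mean weights w."""
    n = len(A)
    gm = [math.prod(row) ** (1.0 / n) for row in A]   # row geometric means
    s = sum(gm)
    w = [g / s for g in gm]                           # normalized priorities
    Aw = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    lam = sum(Aw[i] / w[i] for i in range(n)) / n     # lambda_max estimate
    return (lam - n) / (n - 1)
```

For a perfectly consistent matrix the estimate gives λ_max = n, hence CI = 0.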

Statistical Voice Activity Detection Using Probabilistic Non-Negative Matrix Factorization (확률적 비음수 행렬 인수분해를 사용한 통계적 음성검출기법)

  • Kim, Dong Kook;Shin, Jong Won;Kwon, Kisoo;Kim, Nam Soo
    • The Journal of Korean Institute of Communications and Information Sciences / v.41 no.8 / pp.851-858 / 2016
  • This paper presents a new statistical voice activity detection (VAD) method based on a probabilistic interpretation of nonnegative matrix factorization (NMF). The objective function of NMF with the Kullback-Leibler divergence coincides with the negative log-likelihood of the data when the distribution of the data given the basis and encoding matrices is modeled as Poisson. Based on this probabilistic NMF, the VAD is constructed as a likelihood ratio test assuming that speech and noise follow Poisson distributions. Experimental results show that the proposed approach outperformed the conventional Gaussian-model-based and NMF-based methods under 0-15 dB signal-to-noise-ratio simulation conditions.
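The likelihood ratio test under Poisson models can be sketched as follows; the NMF step that would supply the per-bin rate parameters is taken as given, and all names are illustrative rather than the authors' implementation:

```python
import math

def poisson_llr(x, lam_speech, lam_noise):
    """Log likelihood ratio of non-negative spectral magnitudes x under
    independent Poisson models for speech-present vs noise-only frames,
    summed over frequency bins. The log(x!) term of the Poisson
    log-likelihood cancels in the ratio, so it is omitted."""
    return sum(x_k * (math.log(ls) - math.log(ln)) - (ls - ln)
               for x_k, ls, ln in zip(x, lam_speech, lam_noise))

def vad_decision(x, lam_speech, lam_noise, threshold=0.0):
    """Declare the frame 'speech' when the LLR exceeds the threshold."""
    return poisson_llr(x, lam_speech, lam_noise) > threshold
```

A frame whose magnitudes match the speech rates yields a large positive LLR; a low-energy frame matching the noise rates yields a negative one.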

Study on Characteristics Analysis and Countermeasures of Traffic Accidents at At-Grade Intersections (평면교차점(平面交叉點)의 교통사고특성분석(交通事故特性分析)과 그 대책(對策))

  • Kim, Dae Eung
    • KSCE Journal of Civil and Environmental Engineering Research / v.4 no.2 / pp.1-11 / 1984
  • The aim of this study is to analyze the correlation between traffic accidents and traffic characteristic variables at at-grade intersections in urban areas, to build an accident forecasting model, and to propose a method for evaluating hazardous at-grade intersections. The accident forecasting model is formulated using residual indexes selected by principal component analysis, and its statistical significance is tested by stepwise regression analysis. Because the model's validity was examined and found to agree with real-world situations, effective safety countermeasures can be established on the basis of the high-accident intersections it identifies.


On the Efficacy of Fiscal Policy in Korea during 1979~2000 (우리나라 재정정책의 유효성에 관한 연구)

  • Hur, Seok-Kyun
    • KDI Journal of Economic Policy / v.29 no.2 / pp.1-40 / 2007
  • This paper estimates the trajectory of GDP induced by variations in fiscal expenditure and taxation policy using three-variable structural VAR (SVAR) models. By assigning different combinations of identifying restrictions on the disturbances and measuring the corresponding fiscal multipliers, we compare how robust the estimated multipliers are to the choice of restrictions. Then, considering the dependency of the Korean economy on the foreign sector, we extend the three-variable SVARs to four-variable ones by adding a variable reflecting external shocks. Empirical analyses of Korean quarterly data (1979-2000) with the three-variable SVARs reveal that the estimated fiscal multipliers in Korea are small and of low significance, or decay very fast. Results from the four-variable SVARs confirm this, although the estimated effectiveness of fiscal policy is more significant in some cases.


Development of Tomographic Scan Method for Industrial Plants (산업공정반응기의 감마선 전산 단층촬영기술 개발)

  • Kim, Jong-Bum;Jung, Sung-Hee;Moon, Jin-Ho;Kwon, Taek-Yong;Cho, Gyu-Seong
    • Journal of the Korean Society for Nondestructive Testing / v.30 no.1 / pp.20-30 / 2010
  • In this paper, a new tomographic scan method with fixed detectors and a source rotated by a gamma projector is presented to diagnose industrial plants that cannot be examined by conventional tomographic systems. A weight-matrix calculation method suited to volumetric detectors and a statistical iterative reconstruction method were applied to reconstruct the simulation and experimental data. Monte Carlo simulations were performed for two kinds of phantoms, and a lab-scale experiment was carried out under the same conditions as one of the phantoms. The simulation results showed that reconstruction from photopeak counting gave better results than from gross counting, although the photopeak counts carried larger statistical errors. The experimental data showed results similar to the Monte Carlo simulation. These results appear promising for industrial tomographic applications, especially in petrochemical industries.
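The abstract does not name the exact statistical iterative reconstruction method, so as an assumption the sketch below uses MLEM (maximum-likelihood expectation-maximization), a common choice for emission tomography with a precomputed weight matrix:

```python
def mlem(A, y, n_iter=50):
    """MLEM reconstruction: A[i][j] is the weight of voxel j in
    detector measurement i, y the measured counts. Each iteration
    forward-projects the image, compares with y, and applies a
    multiplicative back-projected correction."""
    m, n = len(A), len(A[0])
    x = [1.0] * n                                          # uniform start
    col_sum = [sum(A[i][j] for i in range(m)) for j in range(n)]
    for _ in range(n_iter):
        proj = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        for j in range(n):
            if col_sum[j] == 0:
                continue                                   # voxel unseen
            back = sum(A[i][j] * y[i] / proj[i]
                       for i in range(m) if proj[i] > 0)
            x[j] *= back / col_sum[j]                      # multiplicative update
    return x
```

The multiplicative update keeps the image non-negative, which suits count data.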

Focus Adjustment Method with Statistical Analysis for an Interchangeable Zoom Lens with Symmetric Error Factors (대칭성 공차를 갖는 교환렌즈용 줌 렌즈의 핀트 조정법과 통계적 해석)

  • Ryu, J.M.;Jo, J.H.;Kang, G.M.;Lee, H.J.;Yoneyama, Suji
    • Korean Journal of Optics and Photonics / v.22 no.5 / pp.230-238 / 2011
  • Interchangeable zoom lenses for digital single-lens reflex cameras and compact digital still camera systems come in many types in order to meet various specifications such as field angle. Consequently, cases sometimes arise in which the focal-point correction (that is, the focus adjustment) at both the wide and tele zoom positions cannot be achieved with the auto-focus group alone. To make the back focal length (BFL) at the wide and tele zoom positions coincide with the designed BFL, focus adjustment must be performed at at least these two points in the zoom lens system. In this paper, we propose a focus adjustment method based on the concept of focus sensitivity, and we calculate a limit on the focus adjustment distance by means of statistical analysis.

The Statistical Analysis of Differential Probability Using GPGPU Technology (GPGPU 기술을 활용한 차분 확률의 통계적 분석)

  • Jo, Eunji;Kim, Seong-Gyeom;Hong, Deukjo;Sung, Jaechul;Hong, Seokhie
    • Journal of the Korea Institute of Information Security & Cryptology / v.29 no.3 / pp.477-489 / 2019
  • In this paper, we experimentally examine the expected differential probability under the Markov cipher assumption and the distribution of the differential probability. First, we validate the expected differential probability of 6-round PRESENT, a lightweight block cipher, under the Markov cipher assumption by analyzing the empirical differential probability. Second, we demonstrate on 4-round GIFT that even when the expected differential probability under the Markov cipher assumption appears valid, the empirical distribution does not follow the well-known distribution of the differential probability. Finally, to analyze whether the key schedule affects this mismatch, we collect results while changing the XOR positions of the round keys in GIFT. The results show that the key schedule is not the only factor affecting the mismatch. Leveraging GPGPU technology, the data collection process ran about 157 times faster than with the CPU alone.
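Estimating an empirical differential probability amounts to counting how often a chosen output difference appears over random plaintext pairs with a fixed input difference. A toy sketch (a 16-bit XOR "cipher" for testing, not PRESENT or GIFT, and no GPU; all names are illustrative):

```python
import random

def empirical_differential_probability(cipher, key, in_diff, out_diff,
                                       samples, seed=0):
    """Estimate Pr[ E_k(x) ^ E_k(x ^ in_diff) == out_diff ] over random
    16-bit plaintexts x. `cipher` is any callable E(key, block)."""
    rng = random.Random(seed)       # fixed seed for reproducibility
    hits = 0
    for _ in range(samples):
        x = rng.getrandbits(16)
        if cipher(key, x) ^ cipher(key, x ^ in_diff) == out_diff:
            hits += 1
    return hits / samples
```

For a key-XOR "cipher" E_k(x) = x ^ k, the output difference always equals the input difference, so the estimate is exactly 1 for matching differences and 0 otherwise; a real block cipher would give small fractional probabilities requiring very many samples, which is where the GPGPU speedup matters.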