• Title/Summary/Keyword: null hypothesis

Search Results: 197

Effect of Dimension in Optimal Dimension Reduction Estimation for Conditional Mean Multivariate Regression (다변량회귀 조건부 평균모형에 대한 최적 차원축소 방법에서 차원수가 결과에 미치는 영향)

  • Seo, Eun-Kyoung;Park, Chong-Sun
    • Communications for Statistical Applications and Methods
    • /
    • v.19 no.1
    • /
    • pp.107-115
    • /
    • 2012
  • Yoo and Cook (2007) developed an optimal sufficient dimension reduction methodology for the conditional mean in multivariate regression and it is known that their method is asymptotically optimal and its test statistic has a chi-squared distribution asymptotically under the null hypothesis. To check the effect of dimension used in estimation on regression coefficients and the explanatory power of the conditional mean model in multivariate regression, we applied their method to several simulated data sets with various dimensions. A small simulation study showed that it is quite helpful to search for an appropriate dimension for a given data set if we use the asymptotic test for the dimension as well as results from the estimation with several dimensions simultaneously.

Impact of Work stopped on Site Productivity and Productivity Achievement Ratio (작업중단이 현장 생산성과 생산성달성율에 미치는 영향)

  • Kim Tae-Wan;Yu Jung-Ho;Lee Hyun-Soo
    • Proceedings of the Korean Institute Of Construction Engineering and Management
    • /
    • autumn
    • /
    • pp.311-314
    • /
    • 2003
  • This study takes a statistical approach to quantify the impact of a work stoppage on the following day's site productivity. It is therefore aimed at testing the null hypothesis: 'A work stoppage in a construction field does not affect the following day's productivity and productivity achievement ratio.' For this purpose, work stoppage is treated as a dummy variable affecting site productivity in a multiple linear regression analysis. Moreover, the quantified impact of a work stoppage on the productivity achievement ratio is identified in this study. The results show that construction managers should persevere in their efforts to secure work continuity in order to improve site productivity and the productivity achievement ratio.

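The dummy-variable approach in the abstract above can be sketched in pure Python: a 0/1 stoppage indicator enters an ordinary least-squares fit alongside a covariate. The data, the crew-size covariate, and the `solve`/`ols` helpers are illustrative assumptions, not the paper's model or data.

```python
# Hypothetical sketch: work stoppage on the previous day as a 0/1 dummy in a
# multiple linear regression on daily site productivity (invented numbers).

def solve(A, b):
    """Gaussian elimination with partial pivoting for a small linear system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def ols(X, y):
    """OLS coefficients via the normal equations: beta = (X'X)^-1 X'y."""
    n, k = len(X), len(X[0])
    XtX = [[sum(X[i][a] * X[i][b] for i in range(n)) for b in range(k)] for a in range(k)]
    Xty = [sum(X[i][a] * y[i] for i in range(n)) for a in range(k)]
    return solve(XtX, Xty)

# columns: intercept, crew size, stoppage dummy (1 = work stopped the day before)
X = [[1, 10, 0], [1, 12, 0], [1, 11, 1], [1, 9, 1], [1, 13, 0], [1, 10, 1]]
y = [102.0, 118.0, 95.0, 80.0, 125.0, 98.0]
beta = ols(X, y)
# a negative beta[2] quantifies the productivity loss after a stoppage
```

The dummy's coefficient is the adjusted productivity difference between stoppage and non-stoppage days, which is exactly the quantity the null hypothesis above sets to zero.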

Negative Exponential Disparity Based Deviance and Goodness-of-fit Tests for Continuous Models: Distributions, Efficiency and Robustness

  • Jeong, Dong-Bin;Sahadeb Sarkar
    • Journal of the Korean Statistical Society
    • /
    • v.30 no.1
    • /
    • pp.41-61
    • /
    • 2001
  • The minimum negative exponential disparity estimator (MNEDE), introduced by Lindsay (1994), is an excellent competitor to the minimum Hellinger distance estimator (Beran 1977) as a robust and yet efficient alternative to the maximum likelihood estimator in parametric models. In this paper we define the negative exponential deviance test (NEDT) as an analog of the likelihood ratio test (LRT), and show that the NEDT is asymptotically equivalent to the LRT at the model and under a sequence of contiguous alternatives. We establish that the asymptotic strong breakdown point for a class of minimum disparity estimators, containing the MNEDE, is at least 1/2 in continuous models. This result leads us to anticipate robustness of the NEDT under data contamination, and we demonstrate it empirically. In fact, in the simulation settings considered here the empirical level of the NEDT shows more stability than that of the Hellinger deviance test (Simpson 1989). The NEDT is illustrated through an example data set. We also define a goodness-of-fit statistic to assess the adequacy of a specified parametric model, and establish its asymptotic normality under the null hypothesis.


Traffic Accident Model of Urban Rotary and Roundabout by Type of Collision based on Land Use (토지이용에 따른 충돌 유형별 도시부 로터리 및 회전교차로 사고모형)

  • Lee, Min Yeong;Kim, Tae Yang;Park, Byung Ho
    • Journal of the Korean Society of Safety
    • /
    • v.32 no.4
    • /
    • pp.107-113
    • /
    • 2017
  • This paper deals with the traffic factors related to collisions at circular intersections. The purpose of this study is to develop traffic accident models by type of collision based on land use. In pursuing the above, traffic accident data from 2010 to 2014 were collected from the "Traffic Accident Analysis System (TAAS)" data set of the Road Traffic Authority. A multiple regression model was utilized in this study to develop the traffic accident models by type of collision, using 17 explanatory variables such as geometry and traffic volume factors. The main results are as follows. First, the null hypothesis that the type of land use does not affect the number of accidents by type of collision is rejected. Second, 10 accident models by type of collision based on land use are developed, all of which are statistically significant. Finally, the ADT, inscribed circle diameter, bicycle lane, area of central island, number of speed humps, circulatory roadway width, splitter island, area of circulatory roadway, mean number of entry lanes and mean width of entry lanes are analyzed to see how they affect accidents by collision type based on land use.

Studies on Sensory Evaluation -[Part IV] New Modified Triangle Test- (관능검사(官能檢査)에 관한 연구(硏究) -[제4보(第4報)] 3점비교법(點比較法)의 신변형(新變形)에 대하여-)

  • Hong, Jin
    • Applied Biological Chemistry
    • /
    • v.20 no.3
    • /
    • pp.285-291
    • /
    • 1977
  • In this paper the new statistical method called the "New Modified Triangle Test" is studied. This method tests the null hypothesis based on the result of evaluating a sample of size "t", where t ≥ 3, using the Triangle Preference Scale Test. It is confirmed that the "New Modified Scheffe's Method 1" can be used for appraising the "New Modified Triangle Test". In this report, the weight fraction of incorrect oddity choices relative to correct oddity choices is set to 1/2 in terms of chance probability.

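For context on the entry above: the classical triangle test it modifies rejects the null hypothesis of no detectable difference via an exact binomial tail, since each panelist identifies the odd sample with chance probability 1/3. This is a sketch of that classical test only, not the paper's "New Modified Triangle Test"; the panel counts are hypothetical.

```python
# Exact one-sided p-value for the classical triangle test: under the null
# hypothesis, correct identifications follow Binomial(n, 1/3).
from math import comb

def triangle_test_pvalue(correct, n, p0=1/3):
    """P(X >= correct) for X ~ Binomial(n, p0)."""
    return sum(comb(n, k) * p0**k * (1 - p0)**(n - k)
               for k in range(correct, n + 1))

# hypothetical panel: 12 of 20 panelists pick the odd sample correctly
p = triangle_test_pvalue(12, 20)
# a small p rejects the null hypothesis of no perceptible difference
```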

A Noncoherent Method for Sequential Code Acquisition with Simplified Structure Based on Approximated Bessel Function (어림 베셀함수를 바탕으로 얼개를 간단히 한 비동위상 순차 부호획득 방법)

  • Kwon, Hyoung-Moon;Lee, Ju-Mi;Yoon, Seok-Ho;Lee, Sung-Ro;Song, Iick-Ho
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.30 no.9C
    • /
    • pp.955-963
    • /
    • 2005
  • In this paper, we consider the noncoherent code acquisition problem using sequential schemes. We show that the outputs of the noncoherent receiver approximately have a central chi-square distribution under the null hypothesis through simulations. Based on this observation, simplified acquisition schemes are obtained using the approximations of the Bessel function. The performance of the simplified and original schemes are compared in additive white Gaussian noise and slowly varying fading channels. Numerical results show that the simplified schemes have essentially the same performance as the original schemes.
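The distributional observation in the abstract above can be checked with a minimal simulation (assumed noise parameters, not the paper's receiver model): with Gaussian in-phase and quadrature noise, the noncoherent output I² + Q² under the null hypothesis is central chi-square with 2 degrees of freedom, i.e. exponential with mean 2σ².

```python
# Monte Carlo check: I^2 + Q^2 with I, Q ~ N(0, sigma^2) is central
# chi-square with 2 DOF (an exponential with mean 2*sigma^2).
import math
import random

random.seed(42)
sigma = 1.0
n = 50_000
outputs = [random.gauss(0, sigma)**2 + random.gauss(0, sigma)**2
           for _ in range(n)]

mean_out = sum(outputs) / n           # should be near 2*sigma^2 = 2
median_out = sorted(outputs)[n // 2]  # exponential median = 2*sigma^2*ln 2
```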

Adaptive Changes in the Grain-size of Word Recognition (단어재인에 있어서 처리단위의 적응적 변화)

  • Lee, Chang H.
    • Proceedings of the Korean Society for Cognitive Science Conference
    • /
    • 2002.05a
    • /
    • pp.111-116
    • /
    • 2002
  • The regularity effect for printed word recognition and naming depends on ambiguities between single letters (small grain-size) and their phonemic values. As a given word is repeated and becomes more familiar, letter-aggregate size (grain-size) is predicted to increase, thereby decreasing the ambiguity between spelling pattern and phonological representation and, therefore, decreasing the regularity effect. Lexical decision and naming tasks examined the effect of repetition on the regularity effect for words. The familiarity of a word form was manipulated by presenting low and high frequency words as well as by presenting half the stimuli in mixed upper- and lowercase letters (an unfamiliar form) and half in uniform case. In lexical decision, the regularity effect was initially strong for low frequency words but became null after two presentations; in naming it was also initially strong but was merely reduced (although still substantial) after three repetitions. Mixed case words were recognized and named more slowly and tended to show stronger regularity effects. The results were consistent with the primary hypothesis that familiar word forms are read faster because they are processed at a larger grain-size, which requires fewer operations to achieve lexical selection. Results are discussed in terms of a neurobiological model of word recognition based on brain imaging studies.


Controlling Linkage Disequilibrium in Association Tests: Revisiting APOE Association in Alzheimer's Disease

  • Park, Lee-Young
    • Genomics & Informatics
    • /
    • v.5 no.2
    • /
    • pp.61-67
    • /
    • 2007
  • The allele frequencies of markers as well as linkage disequilibrium (LD) can be changed in cases due to the LD between markers and the disease allele, exhibiting spurious associations of markers. To identify the true association, classical statistical tests for dealing with confounders have been applied to draw a conclusion as to whether the association of variants comes from LD with the known disease allele. However, a more direct test considering LD using estimated haplotype frequencies may be more efficient. The null hypothesis is that the different allele frequencies of a variant between cases and controls come solely from the increased disease allele frequency and the LD relationship with the disease allele. The haplotype frequencies of controls are estimated using the expectation maximization (EM) algorithm from the genotype data. The estimated frequencies are applied to calculate the expected haplotype frequencies in cases corresponding to the increase or decrease of the causative or protective alleles. The suggested method was applied to previously published data, and several APOE variants showed association with Alzheimer's disease independent of the APOE ε4 variant, rs429358, regardless of LD, with significant simulated p-values. The test results support the possibility that there may be more than one common disease variant in a locus.
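The EM step the abstract relies on, estimating haplotype frequencies from unphased genotypes, can be sketched for the simplest two-SNP case. Only double heterozygotes are phase-ambiguous (AB/ab vs Ab/aB), and EM splits them in proportion to the current frequency estimates. The genotype counts below are hypothetical, and this is a textbook two-locus sketch, not the paper's multi-variant procedure.

```python
# Two-locus haplotype-frequency EM from unphased genotypes.
# A genotype is (g1, g2): minor-allele counts (0/1/2) at each SNP.

def em_haplotypes(genotypes, iters=200):
    """Return frequencies of the four haplotypes (a1, a2), alleles coded 0/1."""
    p = {(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}
    for _ in range(iters):
        counts = {h: 0.0 for h in p}
        for g1, g2 in genotypes:
            if g1 == 1 and g2 == 1:
                # ambiguous double heterozygote: weight the two phasings
                cis = p[(0, 0)] * p[(1, 1)]
                trans = p[(0, 1)] * p[(1, 0)]
                w = cis / (cis + trans)
                counts[(0, 0)] += w
                counts[(1, 1)] += w
                counts[(0, 1)] += 1 - w
                counts[(1, 0)] += 1 - w
            else:
                # phase is determined when at most one locus is heterozygous
                a = [0] * (2 - g1) + [1] * g1  # locus-1 alleles, sorted
                b = [0] * (2 - g2) + [1] * g2  # locus-2 alleles, sorted
                counts[(a[0], b[0])] += 1.0
                counts[(a[1], b[1])] += 1.0
        total = sum(counts.values())
        p = {h: c / total for h, c in counts.items()}
    return p

# hypothetical sample in perfect LD: haplotypes 00 and 11 at 50% each
genotypes = [(0, 0)] * 25 + [(1, 1)] * 50 + [(2, 2)] * 25
freqs = em_haplotypes(genotypes)
```

Under this construction the EM drives the frequencies of the 01 and 10 haplotypes toward zero, recovering the LD structure that the abstract's null hypothesis conditions on.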

Effect of Walking-Environment Factor on Pedestrian Safety (보행환경요인이 보행안전에 미치는 영향분석)

  • Lee, Su-Min;Hwang, Gi-Yeon
    • Journal of Korean Society of Transportation
    • /
    • v.27 no.1
    • /
    • pp.107-114
    • /
    • 2009
  • Walking is an essential and important mode of transportation, and pedestrian safety has become important because accidents often happen while walking. This research shows that walking-environment factors affect pedestrian safety. First, 15 factors were extracted and surveyed in the preceding research; 4 important factors were then extracted through factor analysis. The multiple regression analysis was significant (F-value 9.211, p-value 0.000), so the null hypothesis of no influence is rejected, leading to the conclusion that walking-environment factors influence pedestrian safety. The 4 important factors, in order of influence, are pedestrian-road characteristics, landscape characteristics, commercial characteristics, and walking characteristics. In particular, landscape characteristics and pedestrian-road characteristics are vital factors.

A Proposal of Unit Hydrograph Using Statistical Analysis in Oedo Stream, Jeju (통계적 기법을 적용한 외도천의 단위유량도 제안)

  • Lee, Jun-Ho;Yang, Sung-Kee;Jung, Woo-Yul
    • Journal of Environmental Science International
    • /
    • v.24 no.4
    • /
    • pp.393-401
    • /
    • 2015
  • A rainfall-runoff model of Jeju Oedo Stream was used to compute the optimal unit hydrograph with the HEC-HMS model, reflecting watershed characteristics. For each rainfall event, unit hydrographs derived with the Clark, Snyder, and SCS synthetic methods were comparatively analyzed against actual flow measurements. The null hypothesis was retained, as the p-values for the peak flow and peak time of each unit hydrograph from one-way ANOVA (analysis of variance) were larger than the significance level of 0.05; there was no significant difference in peak flow and peak time between the unit hydrograph methods. Comparing error rates against the actual flow measurement data, the Clark synthetic unit hydrograph reflected Oedo Stream best among the methods: the error rate of the Clark unit hydrograph was 0.02~1.93% and its error rate at peak time was 0~2.74%.
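The one-way ANOVA used in the entry above reduces to comparing between-group and within-group mean squares. A minimal pure-Python sketch follows; the peak-flow samples for the three methods are invented for illustration, not the paper's data.

```python
# One-way ANOVA F statistic: between-group mean square over
# within-group mean square for k groups.

def anova_f(groups):
    n = sum(len(g) for g in groups)
    k = len(groups)
    grand = sum(sum(g) for g in groups) / n
    ss_between = sum(len(g) * (sum(g) / len(g) - grand) ** 2 for g in groups)
    ss_within = sum(sum((x - sum(g) / len(g)) ** 2 for x in g) for g in groups)
    return (ss_between / (k - 1)) / (ss_within / (n - k))

# hypothetical peak flows for the Clark, Snyder, and SCS unit hydrographs
clark = [101.0, 98.0, 105.0, 99.0]
snyder = [103.0, 97.0, 104.0, 100.0]
scs = [100.0, 102.0, 96.0, 106.0]
f = anova_f([clark, snyder, scs])
# a small F gives a p-value above 0.05, so the null hypothesis of equal
# mean peak flows is retained, as in the paper
```

The F value would then be compared against the F(k-1, n-k) distribution at the 0.05 level to obtain the p-value reported in the study.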