• Title/Summary/Keyword: value estimation

A Study on Driver's Perception over the Change of the Headlamp's Illuminance : 4. Development of the Standard and the Algorithm for Limiting Brightness Change (전조등 조도변동에 대한 운전자의 인식연구 : 4. 조도변동 기준 및 평가 알고리즘 개발)

  • Kim, Gi-Hoon;Kim, Huyn-Ji;An, Ok-Hee;Lim, Tae-Young;Min, Jae-Woong;Lim, Jun-Chae;Kang, Byung-Do;Kim, Hoon
    • Journal of the Korean Institute of Illuminating and Electrical Installation Engineers
    • /
    • v.21 no.10
    • /
    • pp.13-21
    • /
    • 2007
  • Based on the measurement results, a limit value for changes in headlamp brightness and an evaluation algorithm accounting for driver safety were developed. Limit values for discomfort, flicker, and brightness change were derived from a subjective psychological evaluation. A safety evaluation algorithm and a limit value for stopping safely, without obstacles posing a threat to the vehicle, were also derived from the perception measurements.

An OFDM Frequency Offset Estimation Scheme Robust to Timing Error (시간 오차에 강인한 OFDM 주파수 옵셋 추정 기법)

  • Kim Sang-Hun;Yoon Seok-Ho
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.31 no.6C
    • /
    • pp.623-628
    • /
    • 2006
  • This paper addresses frequency offset estimation in the presence of timing error in OFDM systems. When a timing error exists, the correlation value used for frequency offset estimation can be reduced significantly, resulting in considerable degradation of estimation performance. Using the coherence phase bandwidth (CPB) and a threshold, a novel frequency offset estimation scheme is proposed, and based on it, an efficient timing error estimation scheme is also proposed for re-estimating the frequency offset. Performance comparisons show that the proposed frequency offset estimation scheme is not only more robust to timing error but also has lower computational complexity than conventional schemes. Simulation also demonstrates that the proposed timing error estimation scheme gives a reliable estimate of the timing error.
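The correlation-based estimator this abstract builds on can be sketched as follows. This is the standard repeated-preamble correlation estimator (Moose/Schmidl-Cox style), not the paper's CPB-and-threshold scheme, and all names and parameter values here are illustrative:

```python
import cmath
import math
import random

def estimate_cfo(r, rep_len, fft_len):
    """Correlation-based CFO estimate: with a preamble whose second half
    repeats the first, the two halves differ only by a phase rotation
    proportional to the normalized frequency offset."""
    p = sum(r[k].conjugate() * r[k + rep_len] for k in range(rep_len))
    return cmath.phase(p) * fft_len / (2 * math.pi * rep_len)

# toy check: QPSK preamble of period rep_len, known offset of 0.25 subcarriers
random.seed(0)
fft_len, rep_len, eps = 64, 32, 0.25
x = [cmath.exp(1j * math.pi / 4 * random.choice([1, 3, 5, 7]))
     for _ in range(rep_len)]
r = [x[k % rep_len] * cmath.exp(2j * math.pi * eps * k / fft_len)
     for k in range(fft_len)]
print(round(estimate_cfo(r, rep_len, fft_len), 3))  # 0.25
```

A timing error shifts the correlation window and shrinks |p|, which is precisely the degradation the paper's scheme is designed to mitigate.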

A Study on the Non Destructive Test by P Type Schmidt Hammer for Early Quality Control of Concrete (콘크리트의 초기강도품질관리를 위한 P형 슈미트햄머법 비파괴시험에 관한 연구)

  • 김기정;신병호;이용성;윤기원;한천구
    • Proceedings of the Korea Concrete Institute Conference
    • /
    • 2002.05a
    • /
    • pp.157-162
    • /
    • 2002
  • This study presents reference data for effective quality control of concrete by comparing the rebound value of the P type schmidt hammer with compressive strength under varying mix proportions and curing conditions. According to the results, the air-cured specimens generally show higher rebound values than the standard specimens, except in the high-strength case. The vertical stroke also shows a higher rebound value than the horizontal stroke for the standard specimens; however, the two do not differ noticeably for the air-cured specimens. The compressive strength estimation equation derived from this experiment estimates higher strengths than the equation in the P type schmidt hammer manual. Therefore, a new estimation equation that fits domestic conditions should be presented.
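The strength estimation equation described here is a regression of compressive strength on rebound value. A minimal least-squares sketch, with entirely hypothetical (rebound, strength) pairs standing in for the paper's experimental data:

```python
# hypothetical (rebound value, compressive strength in MPa) pairs;
# the real calibration data come from the paper's specimens
data = [(18, 10.5), (22, 14.0), (26, 17.8), (30, 21.5), (34, 25.6)]

# ordinary least squares for strength = slope * rebound + intercept
n = len(data)
sx = sum(r for r, _ in data)
sy = sum(f for _, f in data)
sxx = sum(r * r for r, _ in data)
sxy = sum(r * f for r, f in data)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n

def estimate_strength(rebound):
    """Predict compressive strength (MPa) from a P type hammer rebound value."""
    return slope * rebound + intercept

print(round(estimate_strength(28), 1))
```

The abstract's observation that the derived equation over-estimates relative to the manual's equation amounts to comparing two such (slope, intercept) pairs.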

Advanced Block Matching Algorithm for Motion Estimation and Motion Compensation

  • Cho, Hyo-Moon;Cho, Sang-Bock
    • Proceedings of the KIEE Conference
    • /
    • 2007.04a
    • /
    • pp.23-25
    • /
    • 2007
  • The partial distortion elimination (PDE) scheme is used to reduce the computational complexity of the sum of absolute differences (SAD), since the SAD calculation takes a large portion of video compression time. In motion estimation (ME) based on PDE, it is ideal for the partial SAD to grow large early in the summation. Traditional scan order methods require long operation times and have high computational complexity because they rely on division or multiplication. In this paper, we introduce a new scan order and search order that use only adders. We define an average measure called the rough average value (RAVR), which reduces computational complexity, increases operating speed, and thereby improves SAD performance. The RAVR is also used to decide the search order, since a candidate block whose RAVR differs little from that of the current block has a high probability of being a suitable candidate. Our proposed algorithm combines these two concepts and offers improved SAD performance and easy hardware implementation.
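The PDE early-termination idea can be sketched as follows; the block data and the plain raster scan order are illustrative, and the paper's RAVR-based scan/search ordering is not reproduced here:

```python
def sad_pde(cur, cand, best_so_far):
    """Row-wise SAD with partial distortion elimination: stop summing as soon
    as the partial sum already exceeds the best SAD found so far."""
    partial = 0
    for row_c, row_k in zip(cur, cand):
        partial += sum(abs(a - b) for a, b in zip(row_c, row_k))
        if partial >= best_so_far:
            return best_so_far  # candidate rejected early, rows skipped
    return partial

# toy 4x4 blocks: the second candidate is rejected without a full SAD
cur = [[10, 12, 11, 9]] * 4
good = [[10, 12, 11, 10]] * 4   # SAD = 4
bad = [[90, 90, 90, 90]] * 4    # partial sum exceeds 4 after one row
best = sad_pde(cur, good, float("inf"))
print(best, sad_pde(cur, bad, best))  # 4 4
```

Ordering rows (or pixels) so the largest distortions are summed first, which is what a good scan order achieves, makes the partial sum cross the threshold sooner and rejects bad candidates earlier.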

Improvement on the estimation of workable-quantity per unit time for boring machine (기초공사 천공기계 시간당작업량 산정 개선방안)

  • Ahn, Bang-Ryul
    • Proceedings of the Korean Institute of Building Construction Conference
    • /
    • 2015.05a
    • /
    • pp.138-139
    • /
    • 2015
  • The Equipment ownership cost and expenses section of the Poom-Same, which is used for construction cost estimation in the Korean public sector, provides the labor productivity of the boring machine for piles but not its hourly workable quantity (Q-value), which leads to less realistic and more subjective estimates for such works. An optimized Q-value for the machine is proposed based on a thorough investigation of many of its operations.

Implementation of Estimation and Inference on the Web

  • Kang, Heemo;Sim, Songyong
    • Communications for Statistical Applications and Methods
    • /
    • v.7 no.3
    • /
    • pp.913-926
    • /
    • 2000
  • An electronic statistics text on the web is implemented. The text provides interactive instruction on statistical estimation and inference. As a by-product, we also provide calculation of quantiles and p-values for the t-distribution and the standard normal distribution. The program was written in the JAVA programming language.
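Such quantile and p-value calculations can be illustrated with the standard normal case (the Python standard library has no t-distribution, so this sketch covers only the normal half of what the text computes):

```python
from statistics import NormalDist

std_normal = NormalDist()  # mean 0, standard deviation 1

def two_sided_p(stat):
    """Two-sided p-value for a standard-normal test statistic."""
    return 2.0 * (1.0 - std_normal.cdf(abs(stat)))

print(round(std_normal.inv_cdf(0.975), 2))  # 1.96  (upper 2.5% quantile)
print(round(two_sided_p(1.959964), 3))      # 0.05
```

For the t-distribution the same interface exists in `scipy.stats.t` (`cdf`, `ppf`), which converges to the normal case as the degrees of freedom grow.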

Value-at-Risk Estimation of the KOSPI Returns by Employing Long-Memory Volatility Models (장기기억 변동성 모형을 이용한 KOSPI 수익률의 Value-at-Risk의 추정)

  • Oh, Jeongjun;Kim, Sunggon
    • The Korean Journal of Applied Statistics
    • /
    • v.26 no.1
    • /
    • pp.163-185
    • /
    • 2013
  • In this paper, we investigate the need to employ long-memory volatility models for Value-at-Risk (VaR) estimation. We estimate the VaR of KOSPI returns using long-memory volatility models such as FIGARCH and FIEGARCH and, via back-testing, compare the performance of the obtained VaR with that of short-memory processes such as GARCH and EGARCH. The back-testing indicates that a long-memory property exists in the volatility process of KOSPI returns and that employing long-memory volatility models is essential for correct VaR estimation.
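The back-testing referred to here counts VaR violations. A minimal sketch with made-up returns, in which the paper's FIGARCH/FIEGARCH VaR forecasts are replaced by a fixed illustrative VaR level:

```python
def violation_rate(returns, var_forecasts):
    """Fraction of days on which the realized loss exceeded the forecast VaR
    (VaR reported as a positive loss quantile)."""
    hits = sum(1 for r, v in zip(returns, var_forecasts) if r < -v)
    return hits / len(returns)

# toy daily returns and a constant 2% VaR forecast; a correct 95% VaR
# should be violated on roughly 5% of days
returns = [0.001, -0.012, 0.004, -0.025, 0.010, -0.003, -0.021, 0.007,
           0.002, -0.001]
var95 = [0.02] * len(returns)
print(violation_rate(returns, var95))  # 0.2
```

A formal back-test such as Kupiec's proportion-of-failures test then asks whether the observed violation rate differs significantly from the nominal 5%.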

An Analysis on Characteristics and the Development of Estimation Model of Internal Heat Gain from Appliances in Apartment Units (공동주택 단위세대의 기기발열 특성 분석 및 추정모델 개발)

  • Lee, Soo-Jin;Jin, Hye-Sun;Kim, Sung-Im;Lim, Han-Young;Lim, Jae-Han;Song, Seung-Yeong
    • Journal of the Architectural Institute of Korea Structure & Construction
    • /
    • v.34 no.10
    • /
    • pp.19-26
    • /
    • 2018
  • The purpose of this study was to analyze the characteristics of, and to develop an estimation model for, the internal heat gain (IHG) from appliances in domestic apartment units. To this end, the sources of appliance IHG and the calculation method were defined through a case study of international and domestic codes. Appliance-related data such as possession and usage were collected through field surveys in apartment units, and the appliances' electricity consumption was measured separately from overall electricity consumption. Annual electricity consumption was calculated from the field survey and measurement data, and the IHG value was then calculated by applying the PHPP v9 method. A correlation analysis was conducted between the IHG value, the exclusive-use area, and the number of occupants, and the appliance IHG estimation model was then derived by regression analysis. Finally, the level of the domestic code (the Building Energy Efficiency Rating System) was analyzed by comparing it with the value from the estimation model and with various international codes (HERS, Building America, SAP).

Parametric nonparametric methods for estimating extreme value distribution (극단값 분포 추정을 위한 모수적 비모수적 방법)

  • Woo, Seunghyun;Kang, Kee-Hoon
    • The Journal of the Convergence on Culture Technology
    • /
    • v.8 no.1
    • /
    • pp.531-536
    • /
    • 2022
  • This paper compares the performance of parametric and nonparametric methods for estimating the tail of a heavy-tailed distribution. For the parametric method, the generalized extreme value distribution and the generalized Pareto distribution were used; for the nonparametric method, the kernel density estimation method was applied. To compare the two approaches, results of function estimation obtained by applying the block maxima model and the threshold excess model to daily fine-dust public data for each observatory in Seoul from 2014 to 2018 are shown together. In addition, areas where high concentrations of fine dust will occur were predicted through the return level.
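The two data-preparation steps behind the paper's models can be sketched as follows (the daily values are made up, and the GEV/GPD fitting itself is omitted):

```python
def block_maxima(series, block_size):
    """Non-overlapping block maxima: the input to fitting a GEV distribution."""
    return [max(series[i:i + block_size])
            for i in range(0, len(series) - block_size + 1, block_size)]

def threshold_excesses(series, u):
    """Excesses over a high threshold u: the input to fitting a GPD."""
    return [x - u for x in series if x > u]

pm10 = [31, 55, 42, 80, 20, 95, 61, 33, 47, 120, 15, 52]  # made-up daily values
print(block_maxima(pm10, 4))         # [80, 95, 120]
print(threshold_excesses(pm10, 90))  # [5, 30]
```

The kernel density alternative the paper compares against estimates the same tail directly from the raw observations, without either reduction step.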

A data-adaptive maximum penalized likelihood estimation for the generalized extreme value distribution

  • Lee, Youngsaeng;Shin, Yonggwan;Park, Jeong-Soo
    • Communications for Statistical Applications and Methods
    • /
    • v.24 no.5
    • /
    • pp.493-505
    • /
    • 2017
  • Maximum likelihood estimation (MLE) of the generalized extreme value distribution (GEVD) is known to sometimes over-estimate the positive value of the shape parameter for small sample sizes. Maximum penalized likelihood estimation (MPLE) with a Beta penalty function has been proposed to overcome this problem, but the determination of the hyperparameters (HP) of the Beta penalty function remains an issue. This paper presents data-adaptive methods for selecting the HP of the Beta penalty function in the MPLE framework; the idea is to let the data tell us what HP to use. For given data, the optimal HP is obtained from the minimum distance between the MLE and the MPLE. A bootstrap-based method is also proposed. These methods are compared with existing approaches. Performance evaluation experiments for the GEVD by Monte Carlo simulation show that the proposed methods work well in terms of bias and mean squared error. The methods are applied to Blackstone river data and Korean heavy rainfall data, showing better performance than the MLE, the method of L-moments estimator, and existing MPLEs.
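The MPLE objective can be sketched as the GEV negative log-likelihood plus a Beta penalty on the shape parameter. In this simplified sketch the location and scale are held fixed and only the shape is profiled on a grid, and the hyperparameter values p, q are arbitrary placeholders (choosing them adaptively is exactly the paper's contribution):

```python
import math
import random

def gev_nll(xs, mu, sigma, xi):
    """GEV negative log-likelihood (xi != 0 branch)."""
    total = 0.0
    for x in xs:
        t = 1.0 + xi * (x - mu) / sigma
        if t <= 0.0:
            return float("inf")  # observation outside the GEV support
        total += math.log(sigma) + (1.0 / xi + 1.0) * math.log(t) + t ** (-1.0 / xi)
    return total

def beta_penalty(xi, p=6.0, q=9.0):
    """-log of a Beta(p, q) density of xi mapped from (-0.5, 0.5) onto (0, 1);
    p and q are the hyperparameters the paper selects data-adaptively."""
    u = xi + 0.5
    if not 0.0 < u < 1.0:
        return float("inf")
    return -((p - 1.0) * math.log(u) + (q - 1.0) * math.log(1.0 - u))

def mple_shape(xs, mu, sigma):
    """Penalized-likelihood estimate of the shape, on a coarse grid."""
    grid = [i / 100.0 for i in range(-45, 46) if i != 0]
    return min(grid, key=lambda xi: gev_nll(xs, mu, sigma, xi) + beta_penalty(xi))

# GEV(mu=0, sigma=1, xi=0.2) samples by inverse-transform sampling
random.seed(1)
xi_true = 0.2
sample = [((-math.log(random.random())) ** -xi_true - 1.0) / xi_true
          for _ in range(300)]
print(mple_shape(sample, 0.0, 1.0))
```

A full MPLE would optimize location, scale, and shape jointly with a numerical optimizer; the grid over the shape alone is only meant to show how the penalty reshapes the likelihood surface.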