• Title/Summary/Keyword: EM Algorithm

375 search results

Measurements and Statistical Modeling of Man-made Noise (인위적인 전자파 잡음의 측정 및 통계적 모형)

  • 김종호;우종우;백락준;윤현보
    • The Journal of Korean Institute of Electromagnetic Engineering and Science / v.9 no.2 / pp.159-170 / 1998
  • Man-made noise in the PCS frequency range is measured using a three-axis antenna with a common port, and the measured data are treated statistically for modeling. The optimal parameters of the measured APD curves can be calculated quickly by using the Composite Approximation Algorithm. The calculated APD parameters can reduce the EM-environment database memory and can also be applied to determine the output and sensitivity margins of the transmitter and the receiver.


Analysis of Marginal Count Failure Data by using Covariates

  • Karim, Md.Rezaul;Suzuki, Kazuyuki
    • International Journal of Reliability and Applications / v.4 no.2 / pp.79-95 / 2003
  • Manufacturers collect and analyze field reliability data to enhance the quality and reliability of their products and to improve customer satisfaction. To reduce the data collecting and maintenance costs, the amount of data maintained for evaluating product quality and reliability should be minimized. With this in mind, some industrial companies assemble warranty databases by gathering data from different sources for a particular time period. This “marginal count failure data” does not provide (i) the number of failures by when the product entered service, (ii) the number of failures by product age, or (iii) information about the effects of the operating season or environment. This article describes a method for estimating age-based claim rates from marginal count failure data. It uses covariates to identify variations in claims relative to variables such as manufacturing characteristics, time of manufacture, operating season or environment. A Poisson model is presented, and the method is illustrated using warranty claims data for two electrical products.


Estimating the Mixture of Proportional Hazards Model with the Constant Baseline Hazards Function

  • Kim Jong-woon;Eo Seong-phil
    • Proceedings of the Korean Reliability Society Conference / 2005.06a / pp.265-269 / 2005
  • Cox's proportional hazards model (PHM) has been widely applied in the analysis of lifetime data. It is characterized by the baseline hazard function and by covariates influencing a system's lifetime, where the covariates describe operating environments (e.g., temperature, pressure, humidity). In this article, we consider a constant baseline hazard function and a discrete random covariate. The estimation procedure is developed in a parametric framework for data that include both complete and incomplete observations. The Expectation-Maximization (EM) algorithm is employed to handle the incomplete-data problem. Simulation results are presented to illustrate the accuracy and some properties of the estimates.

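With a constant baseline hazard, each level of a discrete covariate yields an exponential lifetime distribution, so the incomplete-data case (covariate level unobserved) reduces to a mixture of exponentials. The sketch below is a minimal EM for a two-component exponential mixture, illustrating the machinery the abstract describes; the function name and initialization are illustrative, not the paper's exact procedure.

```python
import numpy as np

def em_exp_mixture(t, n_iter=200):
    """EM for a two-component mixture of exponential lifetimes.

    With a constant baseline hazard, each level of a two-level covariate
    gives an exponential lifetime; when the level is unobserved it acts
    as the latent component label.
    """
    t = np.asarray(t, dtype=float)
    # illustrative initialization: split the overall rate below and above 1/mean
    pi, lam1, lam2 = 0.5, 0.5 / t.mean(), 2.0 / t.mean()
    for _ in range(n_iter):
        # E-step: posterior probability that each lifetime came from component 1
        p1 = pi * lam1 * np.exp(-lam1 * t)
        p2 = (1.0 - pi) * lam2 * np.exp(-lam2 * t)
        r = p1 / (p1 + p2)
        # M-step: weighted maximum-likelihood updates
        pi = r.mean()
        lam1 = r.sum() / (r * t).sum()
        lam2 = (1.0 - r).sum() / ((1.0 - r) * t).sum()
    return pi, lam1, lam2
```

On simulated data with well-separated rates, the two estimated rates and the mixing weight recover the generating values; with censoring or closer rates the paper's parametric framework would be needed.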

A New Material Sensitivity Analysis for Electromagnetic Inverse Problems

  • Byun, Jin-Kyu;Lee, Hyang-Beom;Kim, Hyeong-Seok;Kim, Dong-Hun
    • Journal of Magnetics / v.16 no.1 / pp.77-82 / 2011
  • This paper presents a new self-adjoint material sensitivity formulation for optimal design and inverse problems in the high-frequency domain. The proposed method is based on the continuum approach using the augmented Lagrangian method. With the self-adjoint formulation, there is no need to solve an additional adjoint system when the goal function depends on the S-parameter. In addition, the algorithm is more general than most previous approaches because it is independent of specific analysis methods or gridding techniques, enabling the use of commercial EM simulators and various custom solvers. For verification, the method was applied to several numerical examples of dielectric material reconstruction problems in the high-frequency domain, and the results were compared with those calculated using the conventional method.

Threshold-asymmetric volatility models for integer-valued time series

  • Kim, Deok Ryun;Yoon, Jae Eun;Hwang, Sun Young
    • Communications for Statistical Applications and Methods / v.26 no.3 / pp.295-304 / 2019
  • This article deals with threshold-asymmetric volatility models for over-dispersed and zero-inflated time series of count data. We introduce various threshold integer-valued autoregressive conditional heteroscedasticity (ARCH) models that incorporate over-dispersion and zero-inflation via conditional Poisson and negative binomial distributions. The EM algorithm is used to estimate the parameters. Cholera data from Kolkata, India, covering 2006 to 2011 are analyzed as a real application. To construct the threshold variable, both a time-varying local constant mean and the grand mean are adopted. The data application shows that the threshold model, as an asymmetric version, is useful in modelling count time series volatility.
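As a simplified illustration of the EM machinery for zero-inflated count data (not the paper's threshold ARCH specification), the sketch below fits a plain zero-inflated Poisson: the E-step attributes each observed zero to a latent structural-zero indicator, and the M-step updates the mixing weight and Poisson mean.

```python
import numpy as np

def em_zip(x, n_iter=100):
    """EM for a zero-inflated Poisson: with probability w a count is a
    structural zero, otherwise it is drawn from Poisson(lam)."""
    x = np.asarray(x, dtype=float)
    # illustrative initialization from the observed zeros and positive counts
    w, lam = 0.5 * (x == 0).mean(), x[x > 0].mean()
    for _ in range(n_iter):
        # E-step: posterior probability that each observed zero is structural
        z = np.where(x == 0, w / (w + (1.0 - w) * np.exp(-lam)), 0.0)
        # M-step: mixing weight, and Poisson mean over non-structural counts
        w = z.mean()
        lam = x.sum() / (1.0 - z).sum()
    return w, lam
```

On simulated data the estimated zero-inflation weight and Poisson mean recover the generating values; conditional negative binomial and threshold dynamics, as in the paper, would add parameters to the same E/M loop.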

Exploring COVID-19 in mainland China during the lockdown of Wuhan via functional data analysis

  • Li, Xing;Zhang, Panpan;Feng, Qunqiang
    • Communications for Statistical Applications and Methods / v.29 no.1 / pp.103-125 / 2022
  • In this paper, we analyze time series data on the case and death counts of COVID-19, which broke out in China in December 2019. The study period covers the lockdown of Wuhan. We exploit functional data analysis methods to analyze the collected time series. The analysis is divided into three parts. First, functional principal component analysis is conducted to investigate the modes of variation. Second, we carry out a functional canonical correlation analysis to explore the relationship between confirmed and death cases. Finally, we utilize a clustering method based on the Expectation-Maximization (EM) algorithm to cluster the counts of confirmed cases, where the number of clusters is determined via a cross-validation approach. In addition, we compare the clustering results with publicly available migration data.
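The clustering step described above — EM-based mixture fitting with the number of clusters chosen on held-out data — can be sketched in one dimension as follows. This is an illustrative Gaussian-mixture EM with a simple train/validation split, not the authors' exact procedure.

```python
import numpy as np

def em_gmm_1d(x, k, n_iter=200):
    """EM for a one-dimensional Gaussian mixture with k components."""
    x = np.asarray(x, dtype=float)
    w = np.full(k, 1.0 / k)
    mu = np.quantile(x, (np.arange(k) + 0.5) / k)  # spread initial means
    var = np.full(k, x.var())
    for _ in range(n_iter):
        # E-step: responsibilities r[i, j] = P(component j | x[i])
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        r = w * dens
        r /= r.sum(axis=1, keepdims=True)
        # M-step: weighted updates of weights, means, and variances
        n_j = r.sum(axis=0)
        w = n_j / len(x)
        mu = (r * x[:, None]).sum(axis=0) / n_j
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / n_j
    return w, mu, var

def heldout_loglik(x, w, mu, var):
    """Log-likelihood of held-out data under a fitted mixture."""
    x = np.asarray(x, dtype=float)
    dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
    return float(np.log((w * dens).sum(axis=1)).sum())
```

Fitting several values of k on a training split and keeping the one with the highest held-out log-likelihood is one concrete form of the cross-validation choice the abstract mentions.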

Automatic Detection of Forgery in Video Frames using Analysis of Imaging Device Profile based Pattern Trace (영상기기의 프로파일 분석 기반 패턴추적에 의한 비디오 프레임의 위변조탐지)

  • Shim, Jae-Youen;Chon, In-Hyuk;Kim, Seong-Whan
    • Proceedings of the Korea Information Processing Society Conference / 2011.04a / pp.1024-1027 / 2011
  • This paper proposes a method for detecting forgery in video frames through profile analysis of four classes of imaging devices: HD (high-definition) video, SD (standard-definition) video, low-quality video, and handset video. We analyze each of the four video classes, identify the distinctive features of each, and inspect the profile of each class; using the EM algorithm, forgery in the video frames is detected and the reliability of the video is improved.

Estimation of Variance Component and Environment Effects on Somatic Cell Scores by Parity in Dairy Cattle (젖소집단의 산차에 따른 체세포점수의 환경효과 및 분산성분 추정)

  • 조광현;나승환;서강석;김시동;박병호;이영창;박종대;손삼규;최재관
    • Journal of Animal Science and Technology / v.48 no.1 / pp.39-48 / 2006
  • This study utilized test-day somatic cell score data of dairy cattle from 2000 to 2004. The numbers of records used were 124,635 for first parity, 134,308 for second parity, 77,862 for third parity, 41,787 for fourth parity, and 37,412 for fifth parity. The data were analyzed by the least-squares mean method using GLM to estimate the effects of calving year, age, lactation stage, parity, and season on somatic cell score. Variance components under a test-day model were estimated by expectation-maximization restricted maximum likelihood (EM-REML). In each parity, somatic cell score was low for younger groups and relatively high in older groups. For lactation stage, the score was low in early lactation and high in late lactation in the first and second parities, whereas in the third, fourth, and fifth parities high somatic cell scores were observed in mid-lactation. Generally, the score was high at the peak, although in the fourth and fifth parities the score was low in late lactation. Regarding the environmental effect of season, somatic cell score was generally low from September to November for all parities and high between June and August, when milk production is usually low. The heritabilities were 0.05, 0.09, 0.10, 0.05, and 0.05 for parities 1 to 5, respectively. Genetic variance was estimated to be high in early lactation for the second, third, and fifth parities and low for the first and fourth parities.

Study on Imputation Methods of Missing Real-Time Traffic Data (실시간 누락 교통자료의 대체기법에 관한 연구)

  • Jang Jin-hwan;Ryu Seung-ki;Moon Hak-yong;Byun Sang-cheal
    • The Journal of The Korea Institute of Intelligent Transport Systems / v.3 no.1 s.4 / pp.45-52 / 2004
  • Many cities are installing ITS (Intelligent Transportation Systems) and running TMCs (Traffic Management Centers) to improve the mobility and safety of roadway transportation by providing roadway information to drivers. ITS includes many devices that collect real-time traffic data, from which many valuable traffic data can be obtained. However, missing traffic data are unavoidable for many reasons, such as roadway conditions, adverse weather, communication shutdowns, and problems with the devices themselves. Missing data prevent secondary processing such as travel-time forecasting and other transportation-related research. If such traffic data are used to produce AADT and DHV, essential inputs to roadway planning and design, the results may be skewed and cause large losses. Therefore, this study explored imputation techniques for missing traffic volume data, including heuristic methods, a regression model, the EM algorithm, and time-series analysis, evaluated using indices such as MAPE, RMSE, and the inequality coefficient. The best results came from the time-series model, with a MAPE of 5.0%, an inequality coefficient of 0.03, and an RMSE of 110. The other techniques produced slightly different but still encouraging results.

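Two of the evaluation indices named above have standard definitions. A minimal sketch, assuming element-wise comparison of ground-truth and imputed traffic volumes (function names are illustrative):

```python
import numpy as np

def mape(actual, imputed):
    """Mean absolute percentage error, in percent (assumes actual != 0)."""
    actual = np.asarray(actual, dtype=float)
    imputed = np.asarray(imputed, dtype=float)
    return float(100.0 * np.mean(np.abs(actual - imputed) / np.abs(actual)))

def rmse(actual, imputed):
    """Root mean squared error."""
    actual = np.asarray(actual, dtype=float)
    imputed = np.asarray(imputed, dtype=float)
    return float(np.sqrt(np.mean((actual - imputed) ** 2)))
```

Each candidate imputation method (heuristic, regression, EM, time series) can then be scored by deleting known values, imputing them, and comparing the imputed values to the deleted originals with these indices.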

Congestion Control Scheme for Wide Area and High-Speed Networks (초고속-장거리 네트워크에서 혼잡 제어 방안)

  • Yang Eun Ho;Ham Sung Il;Cho Seongho;Kim Chongkwon
    • The KIPS Transactions:PartC / v.12C no.4 s.100 / pp.571-580 / 2005
  • In fast long-distance networks, TCP's congestion control algorithm has difficulty utilizing bandwidth effectively. Several window-based congestion control protocols for high-speed, large-delay networks have been proposed to solve this problem. These protocols mainly consider three properties: scalability, TCP-friendliness, and RTT-fairness. However, they cannot satisfy all three properties at the same time because of the trade-offs among them. This paper presents a new window-based congestion control algorithm, called EIMD (Exponential Increase/Multiplicative Decrease), that simultaneously supports all four properties, including fast convergence, another important constraint for fast long-distance networks. It supports scalability by increasing the congestion window exponentially in proportion to the time elapsed since a packet loss; it supports RTT-fairness and TCP-friendliness by considering RTT in its response function; and it supports fast fair-share convergence by increasing the congestion window inversely proportional to the congestion window just before the packet loss. We evaluate the performance of EIMD and other algorithms by extensive computer simulations.
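The window dynamics described above can be caricatured in a few lines. The update form and the constants below are illustrative assumptions, not the paper's actual response function (which also accounts for RTT and the pre-loss window):

```python
def eimd_cwnd(cwnd, t_since_loss, loss, alpha=0.05, beta=0.125):
    """Toy EIMD congestion-window update (illustrative constants).

    On a loss event the window is cut multiplicatively; otherwise the
    per-update increment grows exponentially with the time elapsed
    since the last loss, so utilization recovers quickly on long,
    fat pipes.
    """
    if loss:
        return cwnd * (1.0 - beta)
    return cwnd + alpha * (2.0 ** t_since_loss)
```

The key contrast with standard TCP is that the additive increment here doubles with every unit of loss-free time rather than staying constant, which is what gives the scheme its scalability on high bandwidth-delay-product paths.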