• Title/Abstract/Keyword: Choice Probability

Search results: 172 items (processing time: 0.024 s)

Determining the Decision Limit of CUSUM Chart for A Fixed Sample Size

  • Kang, Chang Wook; Hawkins, Douglas M.
    • 품질경영학회지 / Vol. 20, No. 1 / pp.1-10 / 1992
  • When we compare different control charting schemes, the average run length (ARL) of each control chart is usually used. The use of the ARL implies that the number of samples or observations is unbounded. Regression recursive residuals, however, have been applied to the cumulative sum (CUSUM) chart to detect whether the mean or variance changes. To implement the choice of decision interval, we calculate the probability that a fixed number of control statistics stay in the in-control state. This probability can be used as the significance level of a test for detecting a change in the residual mean or variance when the data contain a finite number of observations.
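The in-control probability described in this abstract can be estimated by straightforward simulation. The sketch below assumes standard-normal in-control residuals and illustrative values of the reference value `k` and decision interval `h`; none of these settings are taken from the paper.

```python
import random

def in_control_probability(h, k=0.5, n_obs=20, trials=20000, seed=1):
    """Monte Carlo estimate of the probability that a one-sided CUSUM of
    n_obs in-control N(0,1) residuals never crosses the decision interval h.
    (Illustrative sketch; k, h, and n_obs are assumed values.)"""
    rng = random.Random(seed)
    stayed = 0
    for _ in range(trials):
        s = 0.0
        ok = True
        for _ in range(n_obs):
            s = max(0.0, s + rng.gauss(0.0, 1.0) - k)  # one-sided CUSUM update
            if s >= h:
                ok = False
                break
        if ok:
            stayed += 1
    return stayed / trials
```

One minus this probability plays the role of the significance level of the change-detection test mentioned in the abstract.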


A Study of IT Environment Scenario through the Application of Cross Impact Analysis

  • 김진한; 김성홍
    • 경영과학 / Vol. 21, No. 3 / pp.129-147 / 2004
  • Scenario analysis for strategic planning, unlike most forecasting methods, provides a qualitative, contextual description of how the present will evolve into the future. It normally tries to identify a set of possible futures, each of whose occurrence is plausible but not assured. In this paper, we propose the use of the Cross Impact Analysis (CIA) approach for generating scenarios about the future of the Korean IT environment. In this analysis, we classified the IT environment into technical, social, legislative, and economic factors, and defined various variables and events within each factor. From a survey of IT-related experts, we acquired the probability of occurrence and compatibility estimates for every possible pair of events as input. A two-phase analysis was then used to select events with a high probability of occurrence and to generate scenarios. Finally, after CIA using Monte Carlo simulation, a detailed scenario for 2010 was developed. The scenarios drawn from the CIA approach reflect the cross impacts of the various events.
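The Monte Carlo step of a cross-impact analysis can be illustrated with a toy model. In the sketch below, `p[i]` is an initial occurrence probability and `impact[i][j]` is a hypothetical multiplier applied to event `j` once event `i` has occurred; the paper's two-phase selection procedure and survey-derived inputs are not reproduced.

```python
import random

def cia_monte_carlo(p, impact, trials=10000, seed=7):
    """Toy cross-impact Monte Carlo: p[i] is the initial probability of event i,
    impact[i][j] multiplies event j's probability once event i has occurred.
    Returns the estimated occurrence probability of each event.
    (Illustrative sketch only; inputs and update rule are assumptions.)"""
    rng = random.Random(seed)
    n = len(p)
    counts = [0] * n
    for _ in range(trials):
        prob = list(p)
        order = list(range(n))
        rng.shuffle(order)              # resolve events in random order
        for i in order:
            if rng.random() < min(1.0, prob[i]):
                counts[i] += 1
                for j in range(n):      # apply cross impacts of event i
                    prob[j] = min(1.0, prob[j] * impact[i][j])
    return [c / trials for c in counts]
```

Scenarios can then be read off as the joint outcomes of each trial, with the most frequent combinations forming candidate futures.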

Multimedia Watermark Detection Algorithm Based on Bayes Decision Theory

  • Kwon, Seong-Geun; Lee, Suk-Hwan; Kwon, Kee-Koo; Kwon, Ki-Ryong; Lee, Kuhn-Il
    • 대한전자공학회:학술대회논문집 / 대한전자공학회 2002년도 ITC-CSCC -2 / pp.1272-1275 / 2002
  • Watermark detection plays a crucial role in multimedia copyright protection and has traditionally been tackled using correlation-based algorithms. However, correlation-based detection is not actually the best choice, as it does not exploit the distributional characteristics of the marked image. Accordingly, an efficient watermark detection scheme for DWT coefficients is proposed as optimal for non-additive embedding schemes. Based on statistical decision theory, the proposed method is derived from Bayes' decision theory, the Neyman-Pearson criterion, and the distribution of the DWT coefficients, thereby minimizing the missed-detection probability subject to a given false-alarm probability. The proposed method was tested for robustness, and the results confirmed the superiority of the proposed technique over conventional correlation-based detection methods.


Unsaturated Throughput Analysis of IEEE 802.11 DCF under Imperfect Channel Sensing

  • Shin, Soo-Young
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 6, No. 4 / pp.989-1005 / 2012
  • In this paper, the throughput of IEEE 802.11 carrier-sense multiple access with collision avoidance (CSMA/CA) protocols under non-saturated traffic conditions is presented, taking into account the impact of imperfect channel sensing. Imperfect channel sensing includes both missed detection and false alarm, and their impact on the utilization of IEEE 802.11 is analyzed and expressed in closed form. To include imperfect channel sensing at the physical layer, we modified the state transition probabilities of the well-known two-state Markov process model. Simulation results closely match the theoretical expressions, confirming the effectiveness of the proposed model. Based on both theoretical and simulated results, choosing the best detection probability while keeping the false-alarm probability below 0.5 is a key factor for maximizing the utilization of IEEE 802.11.
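The effect of sensing errors on the perceived channel state can be sketched with a two-state idle/busy Markov model. The function below is an illustrative fragment, not the paper's full DCF throughput model; `p01`, `p10`, `p_fa`, and `p_md` are assumed parameter names.

```python
def sensed_idle_probability(p01, p10, p_fa, p_md):
    """Two-state (idle=0 / busy=1) Markov channel with transition
    probabilities p01 (idle->busy) and p10 (busy->idle).  Under imperfect
    sensing, an idle slot is declared busy with false-alarm probability
    p_fa, and a busy slot is declared idle with missed-detection
    probability p_md.  Returns the stationary probability that a slot is
    *sensed* idle.  (Sketch of the modified state probabilities only.)"""
    pi_idle = p10 / (p01 + p10)      # stationary P(channel idle)
    pi_busy = 1.0 - pi_idle
    return pi_idle * (1.0 - p_fa) + pi_busy * p_md
```

With perfect sensing (`p_fa = p_md = 0`) this reduces to the ordinary stationary idle probability; false alarms waste idle slots, while missed detections admit transmissions onto busy slots.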

MCE Training Algorithm for a Speech Recognizer Detecting Mispronunciation of a Foreign Language

  • 배민영; 정용주; 권철홍
    • 음성과학 / Vol. 11, No. 4 / pp.43-52 / 2004
  • Model parameters in HMM-based speech recognition systems are normally estimated using Maximum Likelihood Estimation (MLE). The MLE method is based mainly on the principle of statistical data fitting in terms of increasing the HMM likelihood. The optimality of this training criterion is conditioned on the availability of an infinite amount of training data and the correct choice of model; in practice, neither condition is satisfied. In this paper, we propose a training algorithm based on the Minimum Classification Error (MCE) criterion to improve the performance of a speech recognizer detecting mispronunciation of a foreign language. During conventional MLE training, the model parameters are adjusted to increase the likelihood of the word strings corresponding to the training utterances without taking into account the probability of other possible word strings. In contrast, the MCE training scheme takes account of possible competing word hypotheses and tries to reduce the probability of incorrect hypotheses. The discriminative training method using MCE shows better recognition results than the MLE method.


CLASSIFICATION ON ARITHMETIC FUNCTIONS AND CORRESPONDING FREE-MOMENT L-FUNCTIONS

  • Cho, Ilwoo
    • 대한수학회보 / Vol. 52, No. 3 / pp.717-734 / 2015
  • In this paper, we provide a classification of arithmetic functions in terms of identically-free-distributedness, determined by a fixed prime. We then show that such classifications are independent of the choice of prime. In particular, we obtain that the algebra $A_p$ of equivalence classes, under the quotient on $A$ by identically-free-distributedness, is isomorphic to the algebra $\mathbb{C}^2$ equipped with the multiplication $(\bullet)$ given by $(t_1,t_2)\bullet(s_1,s_2)=(t_1s_1,\,t_1s_2+t_2s_1)$.
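The multiplication on $\mathbb{C}^2$ quoted in the abstract can be checked directly; it is the familiar dual-number product, in which the second coordinate behaves like a nilpotent part:

```python
def mult(a, b):
    """Multiplication on C^2 from the abstract:
    (t1, t2) . (s1, s2) = (t1*s1, t1*s2 + t2*s1)."""
    t1, t2 = a
    s1, s2 = b
    return (t1 * s1, t1 * s2 + t2 * s1)

# Commutative and associative, with unit (1, 0);
# elements of the form (0, t) square to zero, as in the dual numbers.
```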

The Role of Risk Perception for the Definition of Acceptable Risk

  • 노삼규
    • 한국화재소방학회논문지 / Vol. 9, No. 2 / pp.3-9 / 1995
  • Acceptable-risk problems are decision problems: they require a choice among options with different estimated technological risks, and each alternative includes a threat to life among its consequences. However, conventional definitions have tended to ignore the public's perceived risk, which should be reflected in what is identified as acceptable risk. This study examines the role of risk perception in defining acceptable risk across different situations of estimated consequence and probability of risk. The cost-benefit principle for risk reduction is applied to find possible solutions within the decision-making process.


Routing Decision with Link-weight Calculating Function in WDM Switching Networks

  • Charoenphetkul, Pongnatee; Thipchaksurat, Sakchai; Varakulsiripunth, Ruttikorn
    • 제어로봇시스템학회:학술대회논문집 / 제어로봇시스템학회 2004년도 ICCAS / pp.1346-1349 / 2004
  • In this paper, we propose a new link-weight calculating function for routing decisions in WDM networks. The proposed function includes the following factors: available wavelengths per link, distance loss, total wavelengths, and limited wavelength conversion. The calculated link weight is applied in the routing-decision algorithm to determine an available lightpath that satisfies user requests. The objective is to improve the performance of wavelength assignment by quickly determining a suitable lightpath using the proposed link-weight calculating function. An analytical model of WDM switching networks is introduced for numerical analysis. The performance of the proposed algorithm is presented through numerical results in terms of the blocking probability, i.e., the probability that connection requests from users are rejected because no lightpath is available to assign to them. It is also shown that the blocking probability varies depending on the number of available wavelengths and the degree of wavelength conversion. The numerical results further show that the proposed link-weight calculating function is a more cost-effective choice for routing decisions in WDM switching networks.
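A routing decision of this kind can be sketched as a shortest-path search over computed link weights. The weight formula below is hypothetical (a penalty on distance loss plus wavelength scarcity), not the paper's function, and the wavelength-conversion term is omitted.

```python
import heapq

def link_weight(avail_wl, total_wl, dist_loss, alpha=1.0, beta=1.0):
    """Hypothetical link-weight function: penalize links with few free
    wavelengths and high distance loss.  (Not the paper's exact formula.)"""
    return alpha * dist_loss + beta * (1.0 - avail_wl / total_wl)

def best_lightpath(graph, src, dst):
    """Dijkstra over precomputed link weights; graph[u] = [(v, w), ...].
    Returns the minimum-weight path, or None if the request is blocked."""
    dist = {src: 0.0}
    prev = {}
    pq = [(0.0, src)]
    seen = set()
    while pq:
        d, u = heapq.heappop(pq)
        if u in seen:
            continue
        seen.add(u)
        if u == dst:
            break
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(pq, (nd, v))
    if dst not in dist:
        return None                     # blocked: no available lightpath
    path, node = [dst], dst
    while node != src:
        node = prev[node]
        path.append(node)
    return list(reversed(path))
```

Blocking probability can then be estimated by offering a stream of connection requests and counting how often `best_lightpath` returns `None`.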


Performance Analysis of M-ary Optical Communication over Log-Normal Fading Channels for CubeSat Platforms

  • Lim, Hyung-Chul; Yu, Sung-Yeol; Sung, Ki-Pyoung; Park, Jong Uk; Choi, Chul-Sung; Choi, Mansoo
    • Journal of Astronomy and Space Sciences / Vol. 37, No. 4 / pp.219-228 / 2020
  • The CubeSat platform has become a popular choice due to inexpensive commercial off-the-shelf (COTS) components and low launch cost. However, it requires more power-efficient, higher-data-rate downlink capability for space applications related to remote sensing. In addition, the platform is limited by size, weight, and power (SWaP) constraints as well as the regulatory issue of licensing the radio frequency (RF) spectrum. These requirements and limitations have made optical communication a promising alternative to RF communication for CubeSat platforms, owing to its power efficiency and high data rate as well as its license-free spectrum. In this study, we analyzed the performance of optical downlink communications compatible with CubeSat platforms in terms of data rate, bit error rate (BER), and outage probability. Mathematical models of BER and outage probability were derived based on the log-normal model of atmospheric turbulence as well as a transmitter with a finite extinction ratio. Given a fixed slot width, the optimal guard time and modulation orders were chosen to achieve the target data rate, and the two performance metrics, BER and outage probability, were analyzed and discussed with respect to beam divergence angle, scintillation index, and zenith angle.
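The outage probability under log-normal fading can be estimated by Monte Carlo sampling. The sketch below assumes unit-mean log-normal irradiance parameterized by the scintillation index and a simple link-margin outage criterion; the paper's finite-extinction-ratio transmitter model is not included.

```python
import math
import random

def outage_probability(scintillation_index, margin_db, trials=50000, seed=3):
    """Monte Carlo outage probability for a log-normal fading channel:
    irradiance I is log-normal with E[I] = 1 and scintillation index
    sigma_I^2 = exp(sigma^2) - 1; an outage occurs when the fade exceeds
    the link margin.  (Illustrative sketch; margin_db is an assumed
    link-budget parameter, not a quantity from the paper.)"""
    sigma2 = math.log(1.0 + scintillation_index)  # log-irradiance variance
    rng = random.Random(seed)
    thresh = 10.0 ** (-margin_db / 10.0)          # outage if I < thresh
    outages = 0
    for _ in range(trials):
        # E[I] = 1 requires the mean of ln(I) to be -sigma2 / 2
        i = math.exp(rng.gauss(-sigma2 / 2.0, math.sqrt(sigma2)))
        if i < thresh:
            outages += 1
    return outages / trials
```

As expected, a larger scintillation index (stronger turbulence) yields a higher outage probability for the same link margin.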

Probabilistic analysis of gust factors and turbulence intensities of measured tropical cyclones

  • Tianyou Tao; Zao Jin; Hao Wang
    • Wind and Structures / Vol. 38, No. 4 / pp.309-323 / 2024
  • The gust factor and turbulence intensity are two crucial parameters that characterize the properties of turbulence. In tropical cyclones (TCs), these parameters exhibit significant variability, yet there is a lack of established formulas that account for their probabilistic characteristics while considering their inherent connection. Accordingly, a probabilistic analysis of the gust factors and turbulence intensities of TCs is conducted based on fourteen sets of wind data collected at the Sutong Cable-stayed Bridge site. Initially, the turbulence intensities and gust factors of the recorded data are computed, followed by an analysis of their probability densities across different ranges categorized by mean wind speed. The Gaussian, lognormal, and generalized extreme value (GEV) distributions are employed to fit the measured probability densities, with subsequent evaluation of their effectiveness. The Gumbel distribution, a specific instance of the GEV distribution, is identified as an optimal choice for the probabilistic characterization of turbulence intensity and gust factor in TCs. The corresponding empirical models are then established through curve fitting. Using the Gumbel distribution as a template, the nexus between the probability density functions of turbulence intensity and gust factor is built, leading to a generalized probabilistic model that statistically describes turbulence intensity and gust factor in TCs. Finally, these empirical models are validated using measured data and compared with the recommendations of design specifications.
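A Gumbel fit of the kind the paper identifies as optimal can be sketched with a method-of-moments estimator (the paper itself fits curves to measured probability densities; the moment-based fit below is a simplification):

```python
import math

EULER_GAMMA = 0.5772156649015329  # Euler-Mascheroni constant

def fit_gumbel(samples):
    """Method-of-moments fit of a Gumbel (GEV type I) distribution,
    the form identified as optimal for turbulence intensity and gust
    factor.  Returns (location mu, scale beta)."""
    n = len(samples)
    mean = sum(samples) / n
    var = sum((x - mean) ** 2 for x in samples) / n
    beta = math.sqrt(6.0 * var) / math.pi   # scale from the variance
    mu = mean - EULER_GAMMA * beta          # location from the mean
    return mu, beta

def gumbel_cdf(x, mu, beta):
    """F(x) = exp(-exp(-(x - mu) / beta))."""
    return math.exp(-math.exp(-(x - mu) / beta))
```

The fitted `(mu, beta)` pair can then be regressed against mean wind speed to build empirical models like those described in the abstract.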