• Title/Summary/Keyword: weighted distribution


Pattern Analysis of Sea Surface Temperature Distribution in the Southeast Sea of Korea Using a Weighted Mean Center (가중공간중심을 활용한 한국 남동해역의 표층수온 분포 패턴 분석)

  • KIM, Bum-Kyu;YOON, Hong-Joo;KIM, Tae-Hoon;CHOI, Hyun-Woo
    • Journal of the Korean Association of Geographic Information Studies / v.23 no.3 / pp.263-274 / 2020
  • In the Southeast Sea of Korea, a cold water mass forms intensively every summer, causing frequent abnormal sea conditions. To analyze the spatial changes in the sea surface temperature distribution of this area, ocean field buoy data observed at Gori and Jeongja and reanalyzed sea surface temperature (SST) data from GHRSST Level 4 were used from June to September 2018. The buoy data were used to analyze the time-series water temperature changes at the two stations, and the GHRSST data were used to calculate the daily SST variance and weighted mean center (WMC) across the study area. When the buoy water temperature dropped, the variance of SST in the study area tended to increase, but this did not hold consistently over the entire period, because GHRSST is reanalysis data that does not reflect fine-scale changes in coastal water temperature. There is therefore a limit to capturing local small-scale coastal temperature changes, or to detecting the location and extent of the cold water zone, using only the variance of SST over the whole study area. When WMC was used to quantify the spatial location of the cold water mass, the WMC was located northwest of the mean center (MC) of the study area whenever the cold water zone occurred. This means that the location and extent of the cold surface water distribution can be identified quantitatively through the WMC of SST, suggesting its potential use for detecting the scale of cold water zones and the extent of their regional spread in the future.
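A weighted mean center over a gridded field is simply the weight-averaged coordinate pair. A minimal sketch in Python, assuming hypothetical lon/lat/SST arrays and a cold-emphasizing weight; the paper's exact weighting scheme is not reproduced here:

    # Minimal sketch of a weighted mean center (WMC) on a gridded SST field.
    # Array names (sst, lon2d, lat2d) are illustrative placeholders.
    import numpy as np

    def weighted_mean_center(lon, lat, weights):
        """Return the weight-averaged (lon, lat) of a grid."""
        w = np.asarray(weights, dtype=float)
        return (np.sum(w * lon) / np.sum(w), np.sum(w * lat) / np.sum(w))

    # Synthetic field; weighting by (max SST - SST) pulls the WMC toward cold water.
    lon2d, lat2d = np.meshgrid(np.linspace(129.0, 130.5, 16), np.linspace(35.0, 36.5, 16))
    sst = 24.0 + 2.0 * (lon2d - 129.0) + np.random.default_rng(0).random((16, 16))
    wmc = weighted_mean_center(lon2d, lat2d, sst.max() - sst)
    mc = (lon2d.mean(), lat2d.mean())          # unweighted mean center for comparison
    print("MC:", mc, "WMC:", wmc)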

A study on the connected-digit recognition using MLP-VQ and Weighted DHMM (MLP-VQ와 가중 DHMM을 이용한 연결 숫자음 인식에 관한 연구)

  • Chung, Kwang-Woo;Hong, Kwang-Seok
    • Journal of the Korean Institute of Telematics and Electronics S / v.35S no.8 / pp.96-105 / 1998
  • The aim of this paper is to propose a weighted DHMM (WDHMM) using MLP-VQ to improve speaker-independent connected-digit recognition. The output distribution of an MLP neural network represents the degree of similarity between each input pattern and the learned class patterns through a non-linear mapping. MLP-VQ, proposed in this paper, generates codewords using the index of the output node with the highest value in the MLP output distribution. Unlike conventional VQ, MLP-VQ allows the degree of similarity between the current input pattern and each learned class pattern to be reflected in the recognition model. WDHMM is also proposed: it uses the MLP output distribution to weight the symbol generation probabilities of DHMMs. This method shortens HMM parameter estimation and recognition time because, unlike the conventional SCHMM, the symbol generation probability does not need to be modeled as a multi-dimensional normal distribution. It also improved recognition accuracy by 14.7% over the DHMM with only a small increase in computation, because phone-class relations are reflected in the recognition model. Experiments show that speaker-independent connected-digit recognition using MLP-VQ and WDHMM reaches 84.22%.
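A minimal sketch of the weighting idea, under the assumption that the MLP posterior over VQ classes is used to mix the DHMM symbol probabilities; names and shapes are illustrative, not the paper's implementation:

    import numpy as np

    def weighted_emission(b, mlp_posterior):
        """Frame likelihood for one HMM state.
        b: (K,) discrete emission probabilities over K VQ symbols.
        mlp_posterior: (K,) MLP output distribution over the same K classes.
        A plain DHMM would use b[argmax(mlp_posterior)]; the weighted DHMM idea
        is to mix all symbols by the MLP posterior instead."""
        return float(np.dot(b, mlp_posterior))

    b = np.array([0.1, 0.6, 0.3])
    post = np.array([0.05, 0.80, 0.15])        # soft MLP output, sums to 1
    print(weighted_emission(b, post))          # weighted (soft) score
    print(b[np.argmax(post)])                  # hard-VQ DHMM score for comparison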


Resource Weighted Load Distribution Policy for Effective Transcoding Load Distribution (효과적인 트랜스코딩 부하 분산을 위한 자원 가중치 부하분산 정책)

  • Seo, Dong-Mahn;Lee, Joa-Hyoung;Choi, Myun-Uk;Kim, Yoon;Jung, In-Bum
    • Journal of KIISE:Computing Practices and Letters / v.11 no.5 / pp.401-415 / 2005
  • Owing to improved wireless communication technologies, it is possible to provide multimedia streaming services to PDAs and mobile phones in addition to desktop PCs. Since mobile client devices have low computing power and low network bandwidth over wireless networks, transcoding technology that adapts media to the characteristics of mobile client devices is necessary. Transcoding servers transcode the source media to the target media within the corresponding grades and provide QoS in real time. In particular, an effective load balancing policy for transcoding servers is essential to support QoS for large numbers of mobile users. In this paper, a resource weighted load distribution policy is proposed for fair load balance and more scalable performance in cluster-based transcoding servers. The proposed policy is based on a resource weighted table and the maximum number of supported users, which are pre-computed for each pre-defined grade. We implement the proposed policy on cluster-based transcoding servers and evaluate its fairness of load distribution and its scalability with the number of transcoding servers.
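A minimal sketch of such a resource-weighted dispatching rule, assuming hypothetical per-server weights and per-grade capacities; the paper's actual weighted-table construction is not reproduced:

    from dataclasses import dataclass, field

    @dataclass
    class Server:
        name: str
        weight: float                                   # relative resource weight (pre-computed)
        max_users: dict                                 # grade -> max supported users
        active: dict = field(default_factory=dict)      # grade -> current users

        def remaining(self, grade):
            used = self.active.get(grade, 0)
            return self.weight * (self.max_users.get(grade, 0) - used)

    def dispatch(servers, grade):
        # Send the new transcoding request to the server with the most
        # remaining weighted capacity for that grade.
        best = max(servers, key=lambda s: s.remaining(grade))
        best.active[grade] = best.active.get(grade, 0) + 1
        return best.name

    servers = [Server("s1", 1.0, {"low": 40, "high": 10}),
               Server("s2", 1.5, {"low": 60, "high": 15})]
    print([dispatch(servers, "high") for _ in range(3)])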

Optimum Yaw Moment Distribution with Electronic Stability Control and Active Rear Steering (자세 제어 장치와 능동 후륜 조향을 이용한 최적 요 모멘트 분배)

  • Yim, Seongjin
    • Journal of Institute of Control, Robotics and Systems / v.20 no.12 / pp.1246-1251 / 2014
  • This article presents an optimum yaw moment distribution scheme for a vehicle with electronic stability control (ESC) and active rear steering (ARS). After the control yaw moment is computed in the yaw moment controller, it should be distributed into the tire forces generated by ESC and ARS. In this paper, yaw moment distribution is formulated as an optimization problem. A new objective function is proposed to tune the relative magnitudes of the tire forces. Weighted pseudo-inverse control allocation (WPCA) is adopted to solve the problem. To check the effectiveness of the proposed scheme, simulation is performed on the vehicle simulation package CarSim. The simulation shows that the proposed optimum yaw moment distribution scheme is effective for vehicle stability control.
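A minimal sketch of weighted pseudo-inverse control allocation; the effectiveness matrix and weights below are placeholders rather than CarSim or paper values:

    # WPCA distributes a desired control yaw moment v over actuator inputs u by
    # solving  min u' W u  subject to  B u = v, whose closed form is
    #   u = W^{-1} B' (B W^{-1} B')^{-1} v.
    import numpy as np

    def wpca(B, W, v):
        Winv = np.linalg.inv(W)
        return Winv @ B.T @ np.linalg.inv(B @ Winv @ B.T) @ v

    B = np.array([[0.8, -0.8, 1.2]])       # 1 x 3: yaw-moment effectiveness of 3 inputs
    W = np.diag([1.0, 1.0, 4.0])           # heavier weight penalizes the third actuator
    v = np.array([500.0])                  # desired control yaw moment [N*m]
    u = wpca(B, W, v)
    print(u, B @ u)                        # allocated inputs; B @ u reproduces v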

A new generalization of exponentiated Frechet distribution

  • Diab, L.S.;Elbatal, I.
    • International Journal of Reliability and Applications / v.17 no.1 / pp.65-84 / 2016
  • Motivated by the recent work of Cordeiro and Castro (2011), we study the Kumaraswamy exponentiated Frechet distribution (KEF). We derive some mathematical properties of the KEF, including the moment generating function, moments, quantile function, and incomplete moments. We provide explicit expressions for the density function of the order statistics and their moments. In addition, the maximum likelihood, least squares, and weighted least squares methods are discussed for estimating the model parameters. A real data set is used to illustrate the importance and flexibility of the new distribution.
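As an illustration of the (weighted) least squares idea for distribution fitting, a minimal sketch that fits a model CDF to the plotting positions i/(n+1) of the ordered sample, with the usual reciprocal-variance weights; a plain Frechet CDF from SciPy is used as a stand-in since the KEF CDF is not reproduced here:

    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import invweibull          # Frechet (inverted Weibull) in SciPy

    def wls_objective(theta, x_sorted, weighted=True):
        c, scale = np.exp(theta)                # keep parameters positive
        n = len(x_sorted)
        i = np.arange(1, n + 1)
        p = i / (n + 1.0)
        # Reciprocal of Var[F(X_(i))] = i(n-i+1) / ((n+1)^2 (n+2))
        w = ((n + 1) ** 2 * (n + 2)) / (i * (n - i + 1)) if weighted else 1.0
        F = invweibull.cdf(x_sorted, c, scale=scale)
        return np.sum(w * (F - p) ** 2)

    x = np.sort(invweibull.rvs(2.0, scale=1.5, size=200, random_state=1))
    res = minimize(wls_objective, x0=np.log([1.0, 1.0]), args=(x,), method="Nelder-Mead")
    print(np.exp(res.x))                        # estimated (shape, scale)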

New approach for analysis of progressive Type-II censored data from the Pareto distribution

  • Seo, Jung-In;Kang, Suk-Bok;Kim, Ho-Yong
    • Communications for Statistical Applications and Methods / v.25 no.5 / pp.569-575 / 2018
  • The Pareto distribution is important for analyzing data in actuarial science, reliability, finance, and climatology. In general, the unknown parameters of the Pareto distribution are estimated by the maximum likelihood method, which may yield inadequate inference results for small sample sizes and heavily censored data. In this paper, a new approach based on a regression framework is proposed to estimate the unknown parameters of the Pareto distribution under the progressive Type-II censoring scheme. The proposed method provides a regression-type estimator that employs the spacings of exponential progressive Type-II censored samples. The provided estimator is consistent and outperforms the maximum likelihood estimators in terms of mean squared error and bias. The validity of the proposed method is assessed through Monte Carlo simulations and real data analysis.
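As background on why exponential spacings are useful here, a minimal sketch for complete (uncensored) synthetic Pareto data; it illustrates the log-transform to an exponential sample and its normalized spacings, not the paper's progressive Type-II regression estimator:

    # If X ~ Pareto(shape b, scale s), then log(X/s) is exponential with rate b,
    # and the normalized spacings (n - i + 1)(Y_(i) - Y_(i-1)) of the exponential
    # order statistics are i.i.d. exponential.
    import numpy as np
    from scipy.stats import pareto

    rng = np.random.default_rng(0)
    b_true, s = 3.0, 1.0
    x = np.sort(pareto.rvs(b_true, scale=s, size=500, random_state=rng))
    y = np.log(x / s)                                    # exponential sample, rate b_true
    n = len(y)
    i = np.arange(1, n + 1)
    spacings = (n - i + 1) * np.diff(np.concatenate(([0.0], y)))
    print(1.0 / spacings.mean())                         # crude estimate of the shape b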

Different estimation methods for the unit inverse exponentiated Weibull distribution

  • Amal S Hassan;Reem S Alharbi
    • Communications for Statistical Applications and Methods / v.30 no.2 / pp.191-213 / 2023
  • Unit distributions are frequently used in probability theory and statistics to describe meaningful variables taking values between zero and one. Using a convenient transformation, the unit inverse exponentiated Weibull (UIEW) distribution, which is equally useful for modelling data on the unit interval, is proposed in this study. Quantile function, moments, incomplete moments, uncertainty measures, stochastic ordering, and stress-strength reliability are among the statistical properties provided for this distribution. To estimate the parameters of the recommended distribution, well-known estimation techniques including maximum likelihood, maximum product of spacings, least squares, weighted least squares, Cramer-von Mises, Anderson-Darling, and Bayesian estimation are utilised. Using simulated data, we compare how well the various estimators perform. According to the simulation results, the maximum product of spacings estimates have lower values of the accuracy measures than alternative estimates in the majority of situations. For two real datasets, the proposed model outperforms the beta, Kumaraswamy, unit Gompertz, unit Lomax and complementary unit Weibull distributions based on various comparative indicators.
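A minimal sketch of maximum product of spacings (MPS) estimation on the unit interval, using the beta distribution as a stand-in model since the UIEW CDF is not reproduced here:

    # MPS maximizes the sum of log CDF spacings D_i = F(x_(i)) - F(x_(i-1)),
    # with F(x_(0)) = 0 and F(x_(n+1)) = 1.
    import numpy as np
    from scipy.optimize import minimize
    from scipy.stats import beta

    def neg_log_spacings(theta, x_sorted):
        a, b = np.exp(theta)                              # keep parameters positive
        F = np.concatenate(([0.0], beta.cdf(x_sorted, a, b), [1.0]))
        d = np.clip(np.diff(F), 1e-12, None)              # guard against zero spacings
        return -np.sum(np.log(d))

    x = np.sort(beta.rvs(2.0, 5.0, size=300, random_state=42))
    res = minimize(neg_log_spacings, x0=np.log([1.0, 1.0]), args=(x,), method="Nelder-Mead")
    print(np.exp(res.x))                                  # MPS estimates of (a, b)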

Classification of Parkinson's Disease Using Defuzzification-Based Instance Selection (역퍼지화 기반의 인스턴스 선택을 이용한 파킨슨병 분류)

  • Lee, Sang-Hong
    • Journal of Internet Computing and Services / v.15 no.3 / pp.109-116 / 2014
  • This study proposed a new instance selection method using a neural network with weighted fuzzy membership functions (NEWFM) based on the Takagi-Sugeno (T-S) fuzzy model to improve classification performance. The proposed instance selection adopts the weighted average defuzzification of the T-S fuzzy model and an interval selection analogous to the confidence interval of a normal distribution in statistics. To evaluate the classification performance of the proposed instance selection, results with and without instance selection were compared in a case study. The classification performances without and with instance selection were 77.33% and 78.19%, respectively. To test whether this difference was significant, the McNemar test was used; the results showed that instance selection was superior to no instance selection at the 0.05 significance level.
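A minimal sketch of the interval-style selection idea, assuming placeholder scores in place of the T-S weighted average defuzzification output, which is not reproduced here:

    # Keep only the instances whose (defuzzified) score lies inside a
    # confidence-style interval around the mean.
    import numpy as np

    def select_instances(X, scores, k=1.96):
        mu, sigma = scores.mean(), scores.std()
        keep = (scores >= mu - k * sigma) & (scores <= mu + k * sigma)
        return X[keep]

    rng = np.random.default_rng(0)
    X = rng.normal(size=(200, 22))             # synthetic stand-in feature matrix
    scores = rng.normal(size=200)              # stand-in defuzzified outputs
    print(select_instances(X, scores).shape)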

Detection of Traumatic Cerebral Microbleeds by Susceptibility-Weighted Image of MRI

  • Park, Jong-Hwa;Park, Seung-Won;Kang, Suk-Hyung;Nam, Taek-Kyun;Min, Byung-Kook;Hwang, Sung-Nam
    • Journal of Korean Neurosurgical Society / v.46 no.4 / pp.365-369 / 2009
  • Objective : Susceptibility-weighted imaging (SWI) is a sensitive magnetic resonance imaging (MRI) technique for detecting cerebral microbleeds (MBLs) that would not be detected by conventional MRI. We performed SWI to detect MBLs and investigated its usefulness in the evaluation of mild traumatic brain injury (MTBI) patients. Methods : From December 2006 to June 2007, twenty-one MTBI patients without any parenchymal hemorrhage on conventional MRI were selected. Forty-two patients without trauma were selected as a control group. According to the presence of MBLs, we divided the MTBI group into MBL-positive [SWI (+)] and MBL-negative [SWI (-)] groups. The regional distribution of MBLs and clinical factors were compared between groups. Results : Fifty-one MBLs appeared in 16 patients of the SWI (+) group and 16 MBLs in 10 patients of the control group [control (+)], respectively. In the SWI (+) group, MBLs were located more frequently in the white matter than in the deep nuclei, unlike the control (+) group (p<0.05). Nine patients (56.3%) of the SWI (+) group had various neurological deficits (disorientation in 4, visual field defect in 2, hearing difficulty in 2, and Parkinson syndrome in 1). Initial Glasgow Coma Scale (GCS) / mean Glasgow Outcome Scale (GOS) scores were 13.9±1.5 / 4.7±0.8 and 15.0±0.0 / 5.0±0.0 in the SWI (+) and SWI (-) groups, respectively (p<0.05). Conclusion : Traumatic cerebral MBLs showed a characteristic regional distribution and appeared to influence the initial neurological status and prognosis. SWI is useful for detecting traumatic cerebral MBLs and can provide etiologic evidence for some post-traumatic neurologic deficits that are unexplainable with conventional MRI.

Parameter Estimation and Prediction methods for Hyper-Geometric Distribution software Reliability Growth Model (초기하분포 소프트웨어 신뢰성 성장 모델에서의 모수 추정과 예측 방법)

  • Park, Joong-Yang;Yoo, Chang-Yeul;Lee, Bu-Kwon
    • The Transactions of the Korea Information Processing Society / v.5 no.9 / pp.2345-2352 / 1998
  • The hyper-geometric distribution software reliability growth model was recently developed and successfully applied. Due to the mathematical difficulty of the maximum likelihood method, the least squares method has been suggested for parameter estimation in previous studies. We first summarize and compare the minimization criteria adopted by the previous studies. It is then shown that the weighted least squares method is more appropriate because of the nonhomogeneous variability of the number of newly detected faults. The adequacy of the weighted least squares method is illustrated by two numerical examples. Finally, we propose a new method for predicting the number of faults newly discovered by the next test instances. The new prediction method can be used for determining when to stop testing.
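A minimal sketch of weighted least squares under nonhomogeneous variance, weighting each observation by the reciprocal of its assumed variance; a generic growth curve stands in for the hyper-geometric model's mean function, which is not reproduced here:

    import numpy as np
    from scipy.optimize import curve_fit

    def growth(t, a, b):
        return a * (1.0 - np.exp(-b * t))       # generic fault-detection growth curve

    t = np.arange(1, 21, dtype=float)
    faults = growth(t, 100.0, 0.2) + np.random.default_rng(1).normal(0, np.sqrt(t))
    sigma = np.sqrt(t)                           # assumed per-point standard deviations
    theta, _ = curve_fit(growth, t, faults, p0=[50.0, 0.1], sigma=sigma, absolute_sigma=True)
    print(theta)                                 # weighted least squares estimates of (a, b)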
