• Title/Summary/Keyword: Variance Reduction Techniques

An Intelligence Support System Research on KTX Rolling Stock Failure Using Case-based Reasoning and Text Mining (사례기반추론과 텍스트마이닝 기법을 활용한 KTX 차량고장 지능형 조치지원시스템 연구)

  • Lee, Hyung Il;Kim, Jong Woo
    • Journal of Intelligence and Information Systems, v.26 no.1, pp.47-73, 2020
  • KTX rolling stock is a system consisting of numerous machines, electrical devices, and components, and its maintenance requires considerable expertise and experience. When a failure occurs, the knowledge and experience of the maintainer determine how quickly and how well the problem is resolved, and the resulting availability of the vehicle varies accordingly. Although problem solving is generally based on fault manuals, experienced and skilled professionals can diagnose and act quickly by applying personal know-how. Because this knowledge exists in tacit form, it is difficult to transfer completely to successors, and previous studies have developed case-based rolling stock expert systems to make it data-driven. Nonetheless, research on the KTX rolling stock most commonly operated on main lines, and on systems that extract the meaning of free-text failure descriptions to retrieve similar cases, is still lacking. This study therefore proposes an intelligent support system that provides action guides for newly reported failures by treating the accumulated know-how of maintenance experts as problem-solving cases. A case base was constructed from rolling stock failure data collected from 2015 to 2017, and an integrated dictionary containing essential terminology and failure codes was built from the case base to reflect the specialized vocabulary of the railway rolling stock domain. Given a new failure, the system retrieves the three most similar past cases from the case base and proposes the actions actually taken in those cases as a diagnostic guide. To compensate for the limitations of keyword-matching retrieval used in previous case-based rolling stock expert system studies, various dimensionality reduction techniques were applied so that similarity reflects the semantic relationships among failure descriptions, and their usefulness was verified through experiments. Three algorithms, Non-negative Matrix Factorization (NMF), Latent Semantic Analysis (LSA), and Doc2Vec, were used to extract features of each failure, and similar cases were retrieved by measuring the cosine distance between the resulting vectors. Precision, recall, and F-measure were used to assess the proposed actions. Together with two baselines, an algorithm that randomly selects past failures with the same failure code and an algorithm that applies cosine similarity directly to word vectors, the five algorithms were compared, and analysis of variance confirmed that their performance differences were statistically significant. Differences in performance according to the number of reduced dimensions were also examined to derive settings suitable for practical application. The analysis showed that direct word-based cosine similarity outperformed the NMF- and LSA-based methods, while the Doc2Vec-based algorithm performed best. In terms of dimensionality reduction, performance improved as the number of dimensions increased up to an appropriate level. This study confirms the usefulness of effective feature extraction and conversion of unstructured data when applying case-based reasoning in the specialized KTX rolling stock domain, where most attributes are text. Text mining is being studied for use in many areas, but such studies remain scarce in environments like the one addressed here, with many specialized terms and limited access to data. In this regard, it is significant that this study is the first to present an intelligent diagnostic system that complements keyword-based case retrieval by applying text mining techniques to extract failure characteristics. It is expected to serve as a basic study for developing diagnostic systems that can be used immediately in the field.
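
As a rough illustration of the retrieval step described above (not the authors' implementation; the toy corpus, library choices, and parameter values are assumptions), the following sketch reduces TF-IDF vectors of past failure descriptions with LSA and NMF and ranks them against a new failure report by cosine similarity:

```python
# Minimal sketch: retrieve the top-3 most similar past failure cases by reducing
# TF-IDF vectors with LSA or NMF and ranking by cosine similarity.
# The corpus and parameter values below are illustrative only.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD, NMF
from sklearn.metrics.pairwise import cosine_similarity

past_cases = [
    "traction motor overheat alarm during acceleration",
    "pantograph contact loss at high speed",
    "brake cylinder pressure drop in trailer car",
    "door closing fault caused by limit switch",
]
new_failure = ["overheat warning from traction motor"]

tfidf = TfidfVectorizer()
X = tfidf.fit_transform(past_cases)           # case base in keyword space
q = tfidf.transform(new_failure)              # new failure in the same space

for name, reducer in [("LSA", TruncatedSVD(n_components=3, random_state=0)),
                      ("NMF", NMF(n_components=3, random_state=0, max_iter=500))]:
    X_red = reducer.fit_transform(X)          # reduced case vectors
    q_red = reducer.transform(q)              # reduced query vector
    sims = cosine_similarity(q_red, X_red)[0]
    top3 = sims.argsort()[::-1][:3]           # indices of the 3 most similar cases
    print(name, [(i, round(sims[i], 3)) for i in top3])
```

Doc2Vec document vectors (for example, trained with gensim) could be substituted for the reduced TF-IDF vectors in the same ranking step.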

Factor Analysis for Exploratory Research in the Distribution Science Field (유통과학분야에서 탐색적 연구를 위한 요인분석)

  • Yim, Myung-Seong
    • Journal of Distribution Science, v.13 no.9, pp.103-112, 2015
  • Purpose - This paper aims to provide a step-by-step approach to factor analytic procedures, such as principal component analysis (PCA) and exploratory factor analysis (EFA), and to offer a guideline for factor analysis. Some authors have argued that the results of PCA and EFA are substantially similar, and further assert that PCA is the more appropriate technique because it produces easily interpreted results that are likely to be the basis of better decisions. For these reasons, many researchers have used PCA instead of EFA. However, the two techniques are clearly different: PCA should be used for data reduction, whereas EFA is designed to identify an underlying factor structure, that is, the latent factors that cause the measured variables to covary. A guideline and procedures for factor analysis are therefore needed; to date, the two techniques have been indiscriminately misused. Research design, data, and methodology - This research conducted a literature review, summarizing the meaningful and consistent arguments and drawing up guidelines and suggested procedures for rigorous EFA. Results - PCA can be used instead of common factor analysis when all measured variables have high communality; otherwise, common factor analysis is recommended for EFA. First, researchers should evaluate the sample size and check sampling adequacy before conducting factor analysis; if these conditions are not satisfied, the subsequent steps cannot proceed. The sample size should be at least 100, with communalities above 0.5, a subject-to-item ratio of at least 5:1, and at least five items in the EFA. Next, Bartlett's sphericity test and the Kaiser-Meyer-Olkin (KMO) measure should be assessed for sampling adequacy: the chi-square value for Bartlett's test should be significant, and a KMO above 0.8 is recommended. The next step is the factor analysis itself, which comprises three stages. The first stage determines the extraction technique; generally, maximum likelihood (ML) or principal axis factoring (PAF) gives the best results, and the choice between the two hinges on data normality, since ML requires normally distributed data whereas PAF does not. The second stage determines the number of factors to retain, best done by combining three methods: eigenvalues greater than 1.0, the scree plot test, and the variance extracted. The last stage selects one of two rotation methods, orthogonal or oblique. If theory suggests that the factors are correlated with each other, the oblique method should be selected, because it assumes correlated factors; if not, orthogonal rotation may be used. Conclusions - Recommendations are offered for best factor analytic practice in empirical research.
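
The procedure summarized above can be sketched in code. The following is a minimal illustration only, assuming the third-party factor_analyzer package and a synthetic data set; it is not taken from the paper:

```python
# Minimal EFA workflow sketch: sampling adequacy, extraction, number of factors,
# and oblique rotation. Data set, package choice, and parameters are assumptions.
import numpy as np
import pandas as pd
from factor_analyzer import FactorAnalyzer
from factor_analyzer.factor_analyzer import calculate_bartlett_sphericity, calculate_kmo

rng = np.random.default_rng(0)
latent = rng.normal(size=(150, 3))                        # 3 latent factors, n=150 subjects
load = np.zeros((3, 9))
for f in range(3):
    load[f, 3 * f:3 * f + 3] = rng.uniform(0.6, 0.9, 3)   # simple structure: 3 items per factor
df = pd.DataFrame(latent @ load + 0.4 * rng.normal(size=(150, 9)),
                  columns=[f"item{i}" for i in range(1, 10)])

# Step 1: sampling adequacy -- Bartlett's test should be significant, KMO > 0.8.
chi_sq, p_value = calculate_bartlett_sphericity(df)
kmo_per_item, kmo_total = calculate_kmo(df)
print(f"Bartlett p={p_value:.3f}, KMO={kmo_total:.2f}")

# Step 2: extraction (ML here; PAF if normality is doubtful) and the number of
# factors via eigenvalues > 1.0, the scree plot, and the variance extracted.
fa = FactorAnalyzer(n_factors=3, method="ml", rotation="oblimin")  # oblique rotation
fa.fit(df)
eigenvalues, _ = fa.get_eigenvalues()
print("eigenvalues > 1:", int((eigenvalues > 1.0).sum()))
print("cumulative variance:", fa.get_factor_variance()[2])

# Step 3: inspect the rotated loadings to interpret the factors.
print(pd.DataFrame(fa.loadings_, index=df.columns).round(2))
```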

Effects of Relaxation Techniques on Flexibility and Balance of the Lower Limb in Adults with Hamstring Shortening (넙다리뒤근 단축이 있는 성인에게 이완 기법의 적용이 하지의 유연성과 균형에 미치는 영향)

  • Jung-Woo Lee;Seong-Min Jeon;Ha-Yeong Kim;Jong-Yeon Bae;Song-Chan Son;Eun-Jin Song;Sang-Eun Sim;Hyeong-Uk Lee;Hye-Kyeong Lee;Baek-Gwang Jo;Sung-Bin Jo;Jin-Hee Joo;Ha-Yeon Jin;Jeong-Hyeon Hwang;Min-Hee Kim
    • PNF and Movement, v.22 no.1, pp.55-70, 2024
  • Purpose: The purpose of this study was to investigate the effects of three relaxation techniques, namely Static Stretching Exercise (SSE), Eccentric Contraction Exercise (ECE), and Suboccipital Muscle Release (SMR), on the flexibility and balance of the lower limb in adults with hamstring shortening. Methods: The participants were 45 adults in their 20s with hamstring shortening, who performed the three exercises (SSE, ECE, and SMR) for two weeks. Flexibility, muscle tone and stiffness, proprioception, and balance were measured before and after the intervention with each relaxation technique. Data were analyzed using two-way repeated-measures analysis of variance (ANOVA), with the significance level set at α=0.05. Results: Flexibility increased in the SSE, ECE, and SMR groups, with the SSE group showing the greatest improvement. Muscle tone and stiffness decreased in all groups, with the ECE group exhibiting the largest reduction. Proprioception increased in all three groups, with SSE demonstrating the greatest enhancement. Balance also improved in all groups, with the ECE group showing the most pronounced improvement. Conclusion: All three relaxation techniques improved flexibility, muscle tone and stiffness, proprioception, and balance of the lower limb in adults with hamstring shortening. The findings underscore the importance of selecting an appropriate technique according to the purpose of treatment and the condition of the patient.

Study on Dimensionality Reduction for Sea-level Variations by Using Altimetry Data around the East Asia Coasts

  • Hwang, Do-Hyun;Bak, Suho;Jeong, Min-Ji;Kim, Na-Kyeong;Park, Mi-So;Kim, Bo-Ram;Yoon, Hong-Joo
    • Korean Journal of Remote Sensing, v.37 no.1, pp.85-95, 2021
  • Recently, as data mining and artificial neural network techniques have developed, dimensionality reduction has been proposed for analyzing large amounts of data. The empirical orthogonal function (EOF) is generally used to reduce the dimensionality of ocean data, and more recently the self-organizing map (SOM) algorithm has been investigated for application to the ocean field. In this study, both algorithms were applied to monthly sea level anomaly (SLA) data from 1993 to 2018 around the East Asian coasts, a region dominated by the influence of the Kuroshio Extension and eddy kinetic energy. The EOF analysis identified the modes explaining the maximum amount of variance, and the SOM algorithm summarized the characteristic spatial distributions and periods of EOF modes 1 and 2. The SOM was useful for tracking changes in the SLA field through the movement among nodes: nodes 1 and 5 appeared in the early 2000s and the early 2010s, when sea level was high, whereas nodes 2 and 6 appeared in the late 1990s and the late 2000s, when sea level was relatively low. Therefore, the SOM algorithm is considered to distinguish SLA patterns around the East Asian coasts well. In addition, since the SOM results here were derived from SLA data, other climate data can also be applied to explain SLA variation mechanisms more clearly.
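
A minimal sketch of the two methods named in this abstract, using a synthetic SLA field and the minisom package (both assumptions, not the authors' processing chain), could look as follows:

```python
# Illustrative sketch: EOF modes of a gridded SLA anomaly field via SVD, then a
# small self-organizing map to group the monthly maps. Field values are synthetic.
import numpy as np
from minisom import MiniSom

rng = np.random.default_rng(0)
n_months, n_grid = 312, 400            # 1993-2018 monthly maps on a 20x20 grid
sla = rng.normal(size=(n_months, n_grid))

# EOF analysis: remove the time mean, then SVD; rows of vt are spatial EOFs and
# the squared singular values give the variance explained by each mode.
anom = sla - sla.mean(axis=0)
u, s, vt = np.linalg.svd(anom, full_matrices=False)
explained = s**2 / np.sum(s**2)
print("variance explained by modes 1-2:", explained[:2])
pcs = u[:, :2] * s[:2]                 # principal component time series

# SOM: map each month onto a 2x3 node grid to group similar SLA patterns.
som = MiniSom(2, 3, n_grid, sigma=1.0, learning_rate=0.5, random_seed=0)
som.train(anom, 1000)
nodes = [som.winner(m) for m in anom]  # winning node per month
print("node of first month:", nodes[0])
```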

Micro-CT evaluation of the removal of root fillings using rotary and reciprocating systems supplemented by XP-Endo Finisher, the Self-Adjusting File, or Er,Cr:YSGG laser

  • Gulsen Kiraz;Bulem Ureyen Kaya;Mert Ocak;Muhammet Bora Uzuner;Hakan Hamdi Celik
    • Restorative Dentistry and Endodontics, v.48 no.4, pp.36.1-36.15, 2023
  • Objectives: This study aimed to compare the effectiveness of a single-file reciprocating system (WaveOne Gold, WOG) and a multi-file rotary system (ProTaper Universal Retreatment, PTUR) in removing canal filling from severely curved canals and to evaluate the possible adjunctive effects of XP-Endo Finisher (XPF), the Self-Adjusting File (SAF), and an erbium, chromium: yttrium, scandium, gallium garnet (Er,Cr:YSGG) laser using microcomputed tomography (µCT). Materials and Methods: Sixty-six curved mandibular molars were divided into 2 groups based on the retreatment technique and then into 3 based on the supplementary method. The residual filling volumes and root canals were evaluated with µCT before and after retreatment, and after the supplementary steps. The data were statistically analyzed with the t-test, Mann-Whitney U test, analysis of covariance, and factorial analysis of variance (p < 0.05). Results: PTUR and WOG showed no significant difference in removing filling materials (p > 0.05). The supplementary techniques were significantly more effective than reciprocating or rotary systems only (p < 0.01). The supplementary steps showed no significant differences in canal filling removal effectiveness (p > 0.05), but XPF showed less dentin reduction than the SAF and Er,Cr:YSGG laser (p < 0.01). Conclusions: The supplementary methods significantly decreased the volume of residual filling materials. XPF caused minimal changes in root canal volume and might be preferred for retreatment in curved root canals. Supplementary approaches after retreatment procedures may improve root canal cleanliness.

Gradient Estimation for Progressive Photon Mapping (점진적 광자 매핑을 위한 기울기 계산 기법)

  • Donghee Jeon;Jeongmin Gu;Bochang Moon
    • Journal of the Korea Computer Graphics Society, v.30 no.3, pp.141-147, 2024
  • Progressive photon mapping is a widely adopted rendering technique that performs kernel density estimation on photons progressively generated from light sources. Its hyperparameter, which controls the reduction rate of the density-estimation kernel, strongly affects the quality of the rendered image because of the bias-variance tradeoff of pixel estimates in photon-mapped results. The errors of rendered pixel estimates in progressive photon mapping can be minimized by estimating the optimal parameters with gradient-based optimization techniques. To this end, we derived the gradients of the pixel estimates with respect to the parameters during progressive photon mapping and verified the estimated gradients by comparing them with finite differences. In future work, the estimated gradients can be applied in an online learning algorithm that performs progressive photon mapping and parameter optimization simultaneously.
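
To illustrate the finite-difference verification mentioned above, the following toy sketch (our assumption, not the paper's derivation) differentiates a simple bias-plus-variance error model of progressive photon mapping with respect to the radius-reduction parameter alpha and checks the analytic gradient against central differences:

```python
# Toy sketch: analytic d(error)/d(alpha) for a simple PPM error model, verified
# with central finite differences. Constants and the error model are assumptions.
import numpy as np

C_BIAS, C_VAR, R0_SQ, N_ITER = 1.0, 0.5, 1.0, 200

def error_and_grad(alpha: float):
    """Accumulate a toy error model over PPM iterations and its d/d(alpha)."""
    r_sq, g = R0_SQ, 0.0          # squared radius and d(r_sq)/d(alpha)
    err, derr = 0.0, 0.0
    for i in range(1, N_ITER + 1):
        err += C_BIAS * r_sq + C_VAR / (i * r_sq)          # bias ~ r^2, variance ~ 1/(i r^2)
        derr += C_BIAS * g - C_VAR * g / (i * r_sq**2)
        # standard PPM radius shrinkage: r_{i+1}^2 = r_i^2 * (i + alpha) / (i + 1)
        g = g * (i + alpha) / (i + 1) + r_sq / (i + 1)
        r_sq = r_sq * (i + alpha) / (i + 1)
    return err, derr

alpha, h = 0.7, 1e-5
_, analytic = error_and_grad(alpha)
fd = (error_and_grad(alpha + h)[0] - error_and_grad(alpha - h)[0]) / (2 * h)
print(f"analytic={analytic:.6f}  finite-difference={fd:.6f}")
```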

Gaussian Noise Reduction Algorithm using Self-similarity (자기 유사성을 이용한 가우시안 노이즈 제거 알고리즘)

  • Jeon, Yougn-Eun;Eom, Min-Young;Choe, Yoon-Sik
    • Journal of the Institute of Electronics Engineers of Korea SP, v.44 no.5, pp.1-10, 2007
  • Most natural images have a special property, called self-similarity, which is the basis of fractal image coding. Even though an image is locally stationary in homogeneous regions, it is in general a non-stationary signal, especially in edge regions, which is the main reason linear techniques give poor results. To overcome this difficulty, we propose a non-linear technique that uses the self-similarity within the image. In our work, each region of an image is classified as stationary or non-stationary according to its sample variance. In a stationary region, denoising is performed by simply averaging the neighboring pixels. In a non-stationary region, the signal is first stationarized by collecting the center pixels of blocks found by similarity matching with respect to the block mean square error (bMSE); denoising is then performed by Gaussian-weighted averaging of these center pixels, since the set of center pixels of similar blocks can be regarded as nearly stationary. The true image value is estimated as a weighted average of the elements of this set. Experimental results show that, as an estimator, our method achieves better performance and smaller variance than other methods.
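
A simplified sketch of the described scheme (our illustration, not the authors' implementation; block size, thresholds, and weighting constants are assumptions) could look like this:

```python
# Sketch: classify each block by sample variance; average the neighborhood in
# stationary regions, and in non-stationary regions average the center pixels of
# similar blocks, weighted by a Gaussian of their block mean square error (bMSE).
import numpy as np

def denoise(img, block=5, search=5, var_thresh=25.0, h=10.0):
    pad = block // 2
    padded = np.pad(img.astype(float), pad, mode="reflect")
    out = np.zeros_like(img, dtype=float)
    for y in range(img.shape[0]):
        for x in range(img.shape[1]):
            ref = padded[y:y + block, x:x + block]
            if ref.var() < var_thresh:                     # stationary: plain averaging
                out[y, x] = ref.mean()
                continue
            centers, weights = [], []                      # non-stationary: block matching
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy < img.shape[0] and 0 <= xx < img.shape[1]:
                        cand = padded[yy:yy + block, xx:xx + block]
                        bmse = np.mean((ref - cand) ** 2)  # block mean square error
                        centers.append(float(img[yy, xx]))
                        weights.append(np.exp(-bmse / (h * h)))
            out[y, x] = np.average(centers, weights=weights)
    return out

noisy = np.clip(128.0 + np.random.default_rng(0).normal(0, 15, (64, 64)), 0, 255)
print(denoise(noisy).std())
```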

The Understanding and Application of Noise Reduction Software in Static Images (정적 영상에서 Noise Reduction Software의 이해와 적용)

  • Lee, Hyung-Jin;Song, Ho-Jun;Seung, Jong-Min;Choi, Jin-Wook;Kim, Jin-Eui;Kim, Hyun-Joo
    • The Korean Journal of Nuclear Medicine Technology, v.14 no.1, pp.54-60, 2010
  • Purpose: Nuclear medicine manufacturers provide various software packages, such as UltraSPECT, ASTONISH, Flash3D, Evolution, and nSPEED, that shorten imaging time using their own image processing techniques. Seoul National University Hospital has introduced the packages from Siemens and Philips, but the difference between their algorithms remained hard to understand. The purpose of this study was therefore to examine how the two packages differ in planar images and to investigate whether they can be applied to images produced with high-energy isotopes. Materials and Methods: First, a phantom study was performed to understand how the packages differ in static studies. Images with various count levels were acquired and analyzed quantitatively after applying PIXON (Siemens) and ASTONISH (Philips), respectively. The packages were then applied to applicable static studies to identify their merits and demerits, and also to images produced with high-energy isotopes. Finally, a blind test, excluding the phantom images, was conducted by nuclear medicine physicians. Results: In the FWHM test with a capillary source, there was almost no difference between the images before and after PIXON processing, whereas ASTONISH improved the FWHM. However, both the standard deviation (SD) and the variance decreased with PIXON, while they increased markedly with ASTONISH. In the background variability comparison using the IEC phantom, PIXON decreased variability overall, whereas ASTONISH increased it somewhat. The contrast ratio in each sphere increased with both methods. Regarding image scale, the window width increased about four to five times after PIXON processing, whereas ASTONISH showed almost no change. From the phantom analysis, ASTONISH appears applicable to studies that require quantitative analysis or high contrast, and PIXON to studies with insufficient counts or long acquisition times. Conclusion: The quantitative values used in routine analysis generally improved after applying the two packages; however, it seems difficult to maintain consistency across all nuclear medicine studies, because the resulting images differ owing to the characteristics of the algorithms rather than to differences between the gamma cameras. It is also hard to expect high image quality from time-shortening methods such as whole-body scans. Nevertheless, the packages can be applied to static studies in consideration of their algorithm characteristics, and a change in image quality can be expected when they are applied to high-energy isotope images.

Density Estimation Technique for Effective Representation of Light In-scattering (빛의 내부산란의 효과적인 표현을 위한 밀도 추정기법)

  • Min, Seung-Ki;Ihm, In-Sung
    • Journal of the Korea Computer Graphics Society, v.16 no.1, pp.9-20, 2010
  • To visualize participating media in 3D space, renderers usually compute the incoming radiance by subdividing the ray path into small subintervals and accumulating the light energy contributed along each of them by direct illumination, scattering, absorption, and emission. Among these phenomena, scattering behaves in a very complicated manner in 3D space and often requires a great deal of simulation effort. Several approximation techniques have been proposed to simulate the light scattering effect efficiently. Volume photon mapping takes a simple approach in which the light scattering phenomenon is represented in a volume photon map through a stochastic simulation, and the stored information is then exploited in the rendering stage. While effective, this method has the problem that the number of photons required grows very quickly when higher variance reduction is needed. To address this problem, we propose a different approach to rendering particle-based volume data in which kernel smoothing, one of several density estimation methods, is used to represent and reconstruct the light in-scattering effect. The effectiveness of the presented technique is demonstrated with several examples of volume data.
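
As a rough illustration of the kernel-smoothing idea (not the paper's renderer; the synthetic photons, kernel choice, and bandwidth are assumptions), the following sketch reconstructs an in-scattering estimate from stored photons and accumulates it along a ray:

```python
# Sketch: kernel-smoothed density of photon power at query points, using an
# Epanechnikov kernel, accumulated along small subintervals of a ray.
import numpy as np

rng = np.random.default_rng(0)
photon_pos = rng.uniform(0.0, 1.0, size=(5000, 3))   # stored photon positions (synthetic)
photon_pow = rng.uniform(0.5, 1.0, size=5000)        # stored photon powers (synthetic)

def in_scatter_estimate(x, h=0.08):
    """Kernel-smoothed density of photon power around point x (bandwidth h)."""
    d2 = np.sum((photon_pos - x) ** 2, axis=1) / (h * h)
    w = np.where(d2 < 1.0, 1.0 - d2, 0.0)             # Epanechnikov kernel
    norm = 15.0 / (8.0 * np.pi * h**3)                # 3D normalization constant
    return norm * np.sum(w * photon_pow) / len(photon_pos)

# Accumulate estimates along small subintervals of a ray through the volume.
origin, direction = np.array([0.0, 0.5, 0.5]), np.array([1.0, 0.0, 0.0])
samples = [in_scatter_estimate(origin + t * direction) for t in np.linspace(0, 1, 32)]
print("mean in-scattering along the ray:", sum(samples) / len(samples))
```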