• Title/Summary/Keyword: Software process improvement


Online Signature Verification by Visualization of Dynamic Characteristics using New Pattern Transform Technique (동적 특성의 시각화를 수행하는 새로운 패턴변환 기법에 의한 온라인 서명인식 기술)

  • Chi Suyoung;Lee Jaeyeon;Oh Weongeun;Kim Changhun
    • Journal of KIISE:Software and Applications
    • /
    • v.32 no.7
    • /
    • pp.663-673
    • /
    • 2005
  • An analysis model for the dynamics information of two-dimensional time-series patterns is described. In the proposed model, two novel transforms that visualize the dynamic characteristics are proposed. The first transform, referred to as speed equalization, reproduces a time-series pattern assuming a constant linear velocity to effectively model the temporal characteristics of the signing process. The second transform, referred to as velocity transform, maps the signal onto a horizontal vs. vertical velocity plane, where the variation of the velocities over time is represented as a visible shape. With these transforms, the dynamic characteristics of the original signing process are reflected in the shape of the transformed patterns, so an analysis of these shapes naturally yields an effective analysis of the dynamic characteristics. The proposed transform technique is applied to an online signature verification problem for evaluation. In experiments on a large signature database, the performance measured in EER (Equal Error Rate) improved to 1.17%, compared to 1.93% for a traditional signature verification algorithm in which no transformed patterns are utilized. In skilled-forgery experiments the improvement was even more pronounced; the parameter set extracted from the transformed patterns was demonstrated to be more discriminative in rejecting forgeries.
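As a rough illustration of the two transforms described in the abstract (a sketch only, not the authors' implementation: the function names and the arc-length resampling used for speed equalization are assumptions):

```python
import numpy as np

def speed_equalization(x, y, n_points=256):
    """Resample a pen trajectory so consecutive samples are equidistant,
    i.e. redraw the pattern at a constant linear velocity."""
    dx, dy = np.diff(x), np.diff(y)
    s = np.concatenate([[0.0], np.cumsum(np.hypot(dx, dy))])  # cumulative arc length
    t = np.linspace(0.0, s[-1], n_points)                     # equal arc-length grid
    return np.interp(t, s, x), np.interp(t, s, y)

def velocity_transform(x, y, dt=1.0):
    """Map the signal onto the horizontal-vs-vertical velocity plane:
    each sample becomes a point (vx, vy), so velocity variation over
    time appears as a visible 2-D shape."""
    vx = np.gradient(x, dt)
    vy = np.gradient(y, dt)
    return np.stack([vx, vy], axis=1)
```

After speed equalization, shape-based features computed on the trajectory (or on its velocity-plane image) implicitly carry the original timing information.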

Estimate Customer Churn Rate with the Review-Feedback Process: Empirical Study with Text Mining, Econometrics, and Quasi-Experiment Methodologies (리뷰-피드백 프로세스를 통한 고객 이탈률 추정: 텍스트 마이닝, 계량경제학, 준실험설계 방법론을 활용한 실증적 연구)

  • Choi Kim;Jaemin Kim;Gahyung Jeong;Jaehong Park
    • Information Systems Review
    • /
    • v.23 no.3
    • /
    • pp.159-176
    • /
    • 2021
  • Obviating user churn is a prominent strategy for capitalizing on online games, as it avoids the initial investment required to develop another game. Extant literature has examined factors that may induce user churn, mainly from the perspectives of motives to play and the game as a virtual society. However, such works largely dismiss the service aspects of online games. Dissatisfaction of user needs is a crucial driver of user churn, especially for online services where users expect continuous improvement in service quality via software updates. Hence, we examine the relationship between a game's quality management and its user base. With text mining and survival analysis, we identify complaint factors that act as key predictors of user churn. Additionally, we find that enjoyment-related factors are greater threats to the user base than usability-related ones. Furthermore, a subsequent quasi-experiment shows that improvements in the complaint factors (i.e., via game patches) curb churn and foster user retention. Our results shed light on the responsive role of developers in retaining the user base of online games. Moreover, we provide practical insights for game operators, i.e., to identify and prioritize the more perilous complaint factors when planning successive game patches.
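The study's pipeline combines text mining, econometrics, and a quasi-experiment; as a minimal sketch of the survival-analysis ingredient alone (function name and data are hypothetical, not from the paper), a Kaplan-Meier retention curve over user lifetimes can be estimated as:

```python
import numpy as np

def kaplan_meier(durations, observed):
    """Kaplan-Meier survival estimate: at each event time t_i,
    S(t_i) = S(t_{i-1}) * (1 - d_i / n_i), where d_i users churn
    out of n_i still at risk. `observed` marks churn (True) vs.
    right-censored users who were still active at study end."""
    durations = np.asarray(durations, float)
    observed = np.asarray(observed, bool)
    surv, s = [], 1.0
    for t in np.unique(durations[observed]):
        n_at_risk = (durations >= t).sum()
        d = ((durations == t) & observed).sum()
        s *= 1.0 - d / n_at_risk
        surv.append((t, s))
    return surv
```

Complaint-factor covariates extracted from review text would then enter a regression-style survival model; the curve above is only the non-parametric baseline.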

Optimal Selection of Classifier Ensemble Using Genetic Algorithms (유전자 알고리즘을 이용한 분류자 앙상블의 최적 선택)

  • Kim, Myung-Jong
    • Journal of Intelligence and Information Systems
    • /
    • v.16 no.4
    • /
    • pp.99-112
    • /
    • 2010
  • Ensemble learning is a method for improving the performance of classification and prediction algorithms: it finds a highly accurate classifier on the training set by constructing and combining an ensemble of weak classifiers, each of which needs only to be moderately accurate on the training set. Ensemble learning has received considerable attention in machine learning and artificial intelligence because of its remarkable performance improvements and its flexible integration with traditional learning algorithms such as decision trees (DT), neural networks (NN), and SVM. Among these studies, DT ensembles have consistently demonstrated impressive improvements in the generalization behavior of DT, while NN and SVM ensembles have not shown comparably remarkable performance. Recently, several works have reported that ensemble performance can degrade when the classifiers of an ensemble are highly correlated with one another, producing a multicollinearity problem that degrades the ensemble's performance; these works have also proposed differentiated learning strategies to cope with it. Hansen and Salamon (1990) argued that it is necessary and sufficient for the performance enhancement of an ensemble that the ensemble contain diverse classifiers. Breiman (1996) showed that ensemble learning can increase the performance of unstable learning algorithms, but does not yield remarkable improvement for stable learning algorithms. Unstable learning algorithms such as decision tree learners are sensitive to changes in the training data, so small changes in the training data can yield large changes in the generated classifiers. Therefore, ensembles of unstable learners can guarantee some diversity among the classifiers.
In contrast, stable learning algorithms such as NN and SVM generate similar classifiers in spite of small changes in the training data, so the correlation among the resulting classifiers is very high. This high correlation produces a multicollinearity problem, which degrades the performance of the ensemble. Kim's work (2009) compared bankruptcy prediction on Korean firms using traditional prediction algorithms such as NN, DT, and SVM. It reports that the stable NN and SVM have higher predictability than the unstable DT; with respect to ensemble learning, however, the DT ensemble shows greater performance improvement than the NN and SVM ensembles. Further analysis with the variance inflation factor (VIF) empirically shows that the performance degradation of the ensemble is due to multicollinearity, and it proposes that ensemble optimization is needed to cope with the problem. This paper proposes a hybrid system for coverage optimization of NN ensembles (CO-NN) in order to improve NN ensemble performance. Coverage optimization is a technique of choosing a sub-ensemble from an original ensemble so as to guarantee the diversity of the classifiers. CO-NN uses a GA, which has been widely applied to various optimization problems, to deal with the coverage optimization problem. The GA chromosomes for coverage optimization are encoded as binary strings, each bit of which indicates an individual classifier. The fitness function is defined as the maximization of error reduction, and a constraint on the variance inflation factor (VIF), one of the commonly used measures of multicollinearity, is added to ensure the diversity of the classifiers by removing high correlation among them. We use Microsoft Excel and the GA software package Evolver.
Experiments on company failure prediction have shown that CO-NN achieves stable performance enhancement of NN ensembles by choosing classifiers with the correlations within the ensemble taken into account. Classifiers with potential multicollinearity problems are removed by the coverage optimization process, and CO-NN thereby showed higher performance than a single NN classifier and the NN ensemble at the 1% significance level, and than the DT ensemble at the 5% significance level. However, further research issues remain. First, a decision optimization process to find the optimal combination function should be considered. Second, various learning strategies to deal with data noise should be introduced in more advanced future research.
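As a sketch of the chromosome-evaluation step described above (names are hypothetical, and the paper itself uses the commercial Evolver GA package with Excel rather than code like this), a fitness function that scores a binary sub-ensemble selection and rejects selections whose VIF exceeds a limit might look like:

```python
import numpy as np

def vif(outputs):
    """Variance inflation factor of each column of `outputs`
    (n_samples x n_classifiers): VIF_j = 1 / (1 - R_j^2), where R_j^2
    comes from regressing classifier j's outputs on the others."""
    n, k = outputs.shape
    vifs = []
    for j in range(k):
        X = np.column_stack([np.ones(n), np.delete(outputs, j, axis=1)])
        y = outputs[:, j].astype(float)
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        r2 = 1.0 - (resid @ resid) / ((y - y.mean()) ** 2).sum()
        vifs.append(1.0 / max(1.0 - r2, 1e-12))
    return np.array(vifs)

def fitness(chromosome, outputs, labels, vif_limit=10.0):
    """Majority-vote accuracy of the sub-ensemble selected by the binary
    chromosome; selections violating the VIF constraint score zero."""
    idx = np.flatnonzero(chromosome)
    if idx.size < 2:
        return 0.0                      # not a usable ensemble
    sub = outputs[:, idx]
    if vif(sub).max() > vif_limit:
        return 0.0                      # multicollinearity constraint violated
    vote = (sub.mean(axis=1) >= 0.5).astype(int)
    return float((vote == labels).mean())
```

A GA would then evolve the binary chromosomes under this fitness, so highly correlated classifiers are pruned from the ensemble.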

Analysis on the Trends of Studies Related to the National Competency Standard in Korea throughout the Semantic Network Analysis (언어네트워크 분석을 적용한 국가직무능력표준(NCS) 연구 동향 분석)

  • Lim, Yun-Jin;Son, Da-Mi
    • 대한공업교육학회지
    • /
    • v.41 no.2
    • /
    • pp.48-68
    • /
    • 2016
  • This study was conducted to identify NCS-related research trends, keywords, keyword networks, and keyword extensions using semantic network analysis, and to suggest development plans for NCS. To this end, the study searched master's theses, doctoral dissertations, and scholarly journal articles provided by RISS with 'National Competency Standards' or 'NCS' as the keyword, and selected a total of 345 papers. An annual frequency analysis of the selected papers was carried out, and semantic network analysis was performed on 68 key words that can be regarded as the key terms of the topics. The analysis tools were the KrKwic software, UCINET 6.0, and NetDraw. The study results were as follows: First, NCS-related research started in 2002, increased gradually, and has grown significantly since 2014. Second, the keyword network analysis showed that 'NCS, development, curriculum, analysis, application, job, university, education,' etc. appeared as the principal key words. Third, the sub-cluster analysis of NCS-related research yielded four clusters: research on specific strategies for realizing NCS's purpose; exploratory research on improving core competencies and on college students' employment possibilities using NCS; operational research on junior-college-centered curricula and the reorganization of specialized subjects; and analysis of demand for and perception of high-school-level vocational education curricula.
Fourth, the connection forming process among key words of domestic study results about NCS was expanding in the form of 'job → job ability → NCS → education → process, curriculum → development, university → analysis, utilization → qualification, application, improvement → plan, operation, industry → design → evaluation.'
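As a toy illustration of the keyword co-occurrence counting that underlies such a semantic network (the keyword lists below are invented, not the study's data; the study itself used KrKwic, UCINET 6.0, and NetDraw):

```python
from collections import Counter
from itertools import combinations

# Hypothetical keyword lists, one per paper abstract.
docs = [
    ["NCS", "curriculum", "development"],
    ["NCS", "job", "analysis"],
    ["curriculum", "university", "NCS"],
]

# Count each unordered keyword pair once per document.
cooc = Counter()
for kws in docs:
    for a, b in combinations(sorted(set(kws)), 2):
        cooc[(a, b)] += 1

# Weighted degree: sum of a keyword's co-occurrence edge weights.
degree = Counter()
for (a, b), w in cooc.items():
    degree[a] += w
    degree[b] += w
```

Ranking `degree.most_common()` reproduces, in miniature, how priority key words such as 'NCS' and 'curriculum' surface from the network.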

The Numerical Study on the Flow Control of Ammonia Injection According to the Inlet NOx Distribution in the DeNOx Facilities (탈질설비 내에서 입구유동 NOx 분포에 따른 AIG유동제어의 전산해석적 연구)

  • Seo, Deok-Cheol;Kim, Min-Kyu;Chung, Hee-Taeg
    • Clean Technology
    • /
    • v.25 no.4
    • /
    • pp.324-330
    • /
    • 2019
  • The selective catalytic reduction system is a highly effective technique for the denitrification of flue gases emitted from industrial facilities. The distribution of the mixing ratio between ammonia and nitrogen oxide at the inlet of the catalyst layers is important to the efficiency of the de-NOx process. In this study, computational analysis tools were applied to improve the uniformity of the NH3/NO molar ratio by controlling the flow rate of the ammonia injection nozzles according to the distribution pattern of the nitrogen oxide in the inlet flue gas. The root mean square of the NH3/NO molar ratio was chosen as the optimization parameter, while design of experiments was used as the basis of the optimization algorithm. For the inlet conditions, four types of flow pattern were simulated: uniform, parabolic, upper-skewed, and random. The flow rate of the eight nozzles installed in the ammonia injection grid was adjusted according to the inlet conditions. To solve the two-dimensional, steady, incompressible, viscous flow fields, the commercial software ANSYS-FLUENT was used with the k-ε turbulence model. The results showed that the improvement in uniformity ranged between 9.58% and 80.0% depending on the inlet flow pattern of the flue gas.
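The optimization parameter named above, the root mean square of the NH3/NO molar ratio, can be written as a short helper (a sketch; the paper's exact metric definition may normalize differently):

```python
import numpy as np

def molar_ratio_rms(nh3, no):
    """RMS deviation of the NH3/NO molar ratio from its mean over the
    sampled plane; smaller values mean a more uniform distribution."""
    ratio = np.asarray(nh3, float) / np.asarray(no, float)
    return float(np.sqrt(np.mean((ratio - ratio.mean()) ** 2)))
```

The optimizer then searches the eight nozzle flow rates that minimize this value at the catalyst inlet for each simulated NOx pattern.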

Evaluation of marginal and internal gaps of Ni-Cr and Co-Cr alloy copings manufactured by microstereolithography

  • Kim, Dong-Yeon;Kim, Chong-Myeong;Kim, Ji-Hwan;Kim, Hae-Young;Kim, Woong-Chul
    • The Journal of Advanced Prosthodontics
    • /
    • v.9 no.3
    • /
    • pp.176-181
    • /
    • 2017
  • PURPOSE. The purpose of this study was to evaluate the marginal and internal gaps of Ni-Cr and Co-Cr copings fabricated using the dental μ-SLA system. MATERIALS AND METHODS. Ten study dies were made using a two-step silicone impression with a dental stone (type IV) from the master die of a tooth. Ni-Cr (NC group) and Co-Cr (CC group) alloy copings were designed using a dental scanner, CAD software, resin copings, and a casting process. In addition, 10 Ni-Cr alloy copings were manufactured using the lost-wax technique (LW group). The marginal and internal gaps in the 3 groups were measured using a digital microscope (160×) with the silicone replica technique, and the obtained data were analyzed using the non-parametric Kruskal-Wallis H test. Post-hoc comparisons were performed using Bonferroni-corrected Mann-Whitney U tests (α = .05). RESULTS. The mean (± standard deviation) values of the marginal, chamfer, axial wall, and occlusal gaps in the 3 groups were as follows: 81.5 ± 73.8, 98.1 ± 76.1, 87.1 ± 44.8, and 146.8 ± 78.7 μm in the LW group; 76.8 ± 48.0, 141.7 ± 57.1, 80.7 ± 47.5, and 194.69 ± 63.8 μm in the NC group; and 124.2 ± 52.0, 199.5 ± 71.0, 67.1 ± 37.6, and 244.5 ± 58.9 μm in the CC group. CONCLUSION. The marginal gaps in the LW and NC groups were clinically acceptable. Further improvement is needed before the CC group can be used in clinical practice.

Development of a Verification System for Enhancing BIM Design Based on Usability (활용성을 고려한 BIM 설계 오류 검증시스템 개발)

  • Yang, Dong-Suk
    • Land and Housing Review
    • /
    • v.8 no.1
    • /
    • pp.23-29
    • /
    • 2017
  • The use of BIM design is expected to expand in the domestic and overseas construction industries because of its effect on construction productivity and quality improvement. However, since the Public Procurement Service made BIM design mandatory, deliverables often contain design errors, and the utilization of 3D design is limited because many projects adopt a simple 2D-to-3D remodelling approach whose models cannot be used in the construction and maintenance phases. The results of BIM design reviews were largely underutilized, and error verification was often not performed at all. To resolve this, a check system must be developed that secures the quality of BIM designs and ensures the reliability of BIM deliverables. In this study, a program was developed that can automatically verify BIM design deliverables for problems such as rule violations and design flaws, and that improves the usability of BIM designs. In particular, the program was developed not only to provide checks missing from commercially available programs, but also to validate drawings in low-specification computer environments. The developed program (LH-BIM) stores attribute information extracted from Revit files (ArchiCAD and IFC files included) in an integrated database. This provides the ability to freely look up the features and properties of drawings using only the LH-BIM program, without the Revit tools. By doing so, the difficulties of using traditional commercial programs were resolved, and the program operates with ordinary PC performance. Furthermore, the results of various BIM software packages can be readily validated, which avoids the IFC conversion errors that occur in tools such as SMC. Additionally, the developed program can automatically check drawings against error and design criteria, and can calculate area estimations.
These functions allow practitioners to carry out BIM modelling tasks simply and easily. The developed system (LH-BIM) was verified by testing it on a review of a BIM design model of the Korea Land & Housing Corporation. It is hoped that the verification system will not only help achieve quality in BIM design, but also contribute to the wider adoption of BIM in future construction.

Normalization of Face Images Subject to Directional Illumination using Linear Model (선형모델을 이용한 방향성 조명하의 얼굴영상 정규화)

  • 고재필;김은주;변혜란
    • Journal of KIISE:Software and Applications
    • /
    • v.31 no.1
    • /
    • pp.54-60
    • /
    • 2004
  • Face recognition is one of the problems that can be solved by appearance-based matching techniques. However, the appearance of a face image is very sensitive to variation in illumination. One of the easiest ways to improve performance is to collect more training samples acquired under variable lighting, but this is not practical in the real world. In object recognition, it is desirable to focus on feature extraction or normalization techniques rather than on the classifier. This paper presents a simple approach to the normalization of faces subject to directional illumination, one of the significant sources of error in the face recognition process. The proposed method, ICR (Illumination Compensation based on Multiple Linear Regression), finds the plane that best fits the intensity distribution of the face image using multiple linear regression, then uses this plane to normalize the face image. The advantages of our method are its simplicity and practicality. The planar approximation of a face image is mathematically defined by a simple linear model. We provide experimental results demonstrating the performance of the proposed ICR method on public face databases and our own database. The experimental results show a significant improvement in recognition accuracy.
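A minimal numerical sketch of the ICR idea, fitting a best-fit intensity plane by multiple linear regression and removing it (the function name and the choice to restore the mean level are assumptions, not the authors' exact procedure):

```python
import numpy as np

def icr_normalize(img):
    """Fit the plane z = a + b*x + c*y to the image intensities by
    least squares, then subtract it (keeping the mean gray level) to
    compensate for a directional illumination gradient."""
    h, w = img.shape
    yy, xx = np.mgrid[0:h, 0:w]
    A = np.column_stack([np.ones(h * w), xx.ravel(), yy.ravel()])
    coef, *_ = np.linalg.lstsq(A, img.ravel().astype(float), rcond=None)
    plane = (A @ coef).reshape(h, w)
    return img - plane + img.mean()
```

On an image whose intensity is dominated by a linear lighting gradient, the output is nearly flat, which is exactly the compensation effect the paper exploits before matching.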

A Study on Perceptual Skill Training for Improving Performance - Focusing on sports cognitive aspects - (경기력 향상을 위한 지각기술훈련에 대한 고찰 - 스포츠 인지적 측면 중심으로-)

  • Song, Young-Hoon
    • Journal of the Korean Applied Science and Technology
    • /
    • v.35 no.1
    • /
    • pp.299-305
    • /
    • 2018
  • Perception refers to the process of acquiring information about the environment through various sensory organs, such as the visual, auditory, tactile, and olfactory senses, and of integrating and interpreting the information transmitted to the brain. The ability to use these perceptions efficiently is called perceptual skill, and perceptual skill is an important factor for improving performance in the field of sports. Accordingly, many researchers have developed various perceptual training programs to maximize these skills and have attempted to verify their effects. The perceptual skill training introduced in this study focuses on visual perception and is a training method applied in the United States and Europe to improve sports performance. In soccer penalty-kick situations, perceptual skill training for goalkeepers based on the kicker's important cues (the kicker's hip, and the angle of the body and foot before kicking) improved their ability to predict the direction of the ball; in tennis, training based on the server's important cues (position, ball, racket) likewise improved the accuracy of serve-direction prediction. Numerous studies on such perceptual skill training have been carried out recently, but their number remains insufficient, especially in Korea, where research on perceptual training is relatively neglected. In addition, extensive studies are needed to investigate whether improvements in perceptual skill achieved in laboratory situations transfer to actual performance situations. Therefore, in order to elevate sports performance, researchers need to examine how necessary perceptual training programs are and what direction research on their effects should take.

A Study of Improving Combustion Stability with Sonic Wave Radiation (음파를 이용한 연소 안정성 개선에 관한 연구)

  • Min, Sun-ki
    • Journal of the Korea Academia-Industrial cooperation Society
    • /
    • v.21 no.8
    • /
    • pp.401-406
    • /
    • 2020
  • NOx (nitrogen oxides) in engine exhaust gas causes severe air pollution. NOx is produced under high-temperature combustion conditions, so EGR (exhaust gas recirculation) is normally used to reduce the combustion temperature and NOx production. As the EGR ratio increases, the NOx level becomes lower. On the other hand, an excessively high EGR ratio makes combustion unstable, resulting in other air-pollution problems such as unburned hydrocarbons and higher CO levels. In this study, improving the motion of fuel droplets by radiating sonic waves was investigated as a means of stabilizing combustion, using both analytic and experimental methods. For the analytical study, the effects of sonic-wave radiation on fuel droplet velocity were studied using the Fluent software. The results showed that small-droplet velocity increased more under high-frequency sonic-wave conditions, while large-droplet velocity increased more under low-frequency conditions. For the experimental study, a combustion chamber was built to measure the combustion pressure under the sonic-wave effect, and the measured pressure was used to calculate the heat release rate in the chamber. The heat release rate data showed that the heat release rate increased during the initial combustion process under low-frequency sonic-wave conditions.