Title/Summary/Keyword: problem analysis

Search results: 16,360

The Evolution of Screening Center for COVID-19 Analyzed by TRIZ (트리즈로 분석한 코로나19 대응 선별진료소의 진화)

  • Song, Chang-Yong
    • Journal of Korean Society of Industrial and Systems Engineering / v.45 no.3 / pp.139-149 / 2022
  • Korea's COVID-19 quarantine program, known as 'K-Quarantine', is a globally recognized system that achieved two conflicting goals: protecting public health and sustaining the economy. The system, represented by the 3T approach (Test-Trace-Treat), relies not on blocking off regions but on screening and treating infected and non-infected persons. The screening center, a key element of this screening-and-treatment system, evolved to suit each phase of the pandemic and enabled a successful initial response by conducting large-scale testing quickly and safely. Analyzing the evolution of screening centers from a problem-solving point of view demonstrates their significance as a practical success case of creative problem-solving. The analysis of these verified COVID-19 response cases also confirms the usefulness of TRIZ (the Russian acronym for the Theory of Inventive Problem Solving), a creative problem-solving theory derived from analyzing regularities in invention patents and widely used not only in technical fields but also in non-technical fields such as design, management, and education. The results of this study are expected to offer useful insights and practical examples to researchers interested in system analysis and TRIZ application from a problem-solving perspective.

A Study on the Application of PBL in Library and Information Science I: Course Developing and Analysis of Self-Reflective Journal (문헌정보학에서 문제중심학습 (Problem-Based Learning) 적용 연구 I - 설계 모형 적용과 성찰일지 분석을 중심으로 -)

  • Kang, Ji Hei
    • Journal of the Korean Biblia Society for Library and Information Science / v.28 no.4 / pp.321-340 / 2017
  • The purpose of this study is to design a teaching model that applies problem-based learning (PBL) and to analyze the educational benefits students perceived. The study derived an initial PBL model from an analysis of existing studies, and the scenario was refined through expert consultation. The problem was designed according to the design-stage activities (problem analysis, judgment of PBL class suitability, content analysis, learner analysis, environment analysis, decision on the PBL operating environment, and the PBL class itself) and the strategic design (problem situation design, learning resource design, facilitation design, operational strategy design, evaluation design, and PBL operating environment design). Based on the initial scenarios, the researcher analyzed the outcomes of the problem-based learning through learners' reflective journals, confirming that critical thinking and creativity improved in the first PBL problem situation and that methods for smooth communication and cooperation were employed. The analysis of the first implementation's educational effects and students' suggestions for modification will inform the second revision of the course design. This study introduces a case of PBL course development and invites further application and research.

Local Similarity based Discriminant Analysis for Face Recognition

  • Xiang, Xinguang; Liu, Fan; Bi, Ye; Wang, Yanfang; Tang, Jinhui
    • KSII Transactions on Internet and Information Systems (TIIS) / v.9 no.11 / pp.4502-4518 / 2015
  • Fisher linear discriminant analysis (LDA) is one of the most popular projection techniques for feature extraction and has been widely applied to face recognition. However, it cannot be used in the single sample per person (SSPP) setting because the intra-class variations cannot be estimated. In this paper, we propose a novel method called local similarity based linear discriminant analysis (LS_LDA) to solve this problem. Motivated by the "divide-and-conquer" strategy, we first divide the face into local blocks, classify each local block, and then integrate all the classification results to make the final decision. To make LDA feasible for the SSPP problem, we further divide each block into overlapping patches and assume that these patches belong to the same class. To improve the robustness of LS_LDA to outliers, we also propose local similarity based median discriminant analysis (LS_MDA), which uses the class median vector to estimate the class population mean in LDA modeling. Experimental results on three popular databases show that our methods not only generalize well to the SSPP problem but also are strongly robust to variations in expression, illumination, occlusion, and time.
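
The abstract describes the mechanics concretely enough to sketch. Below is a minimal numpy illustration (not the authors' code) of the core LS_LDA/LS_MDA step: overlapping patches cut from one person's block are treated as samples of that person's class, Fisher LDA is fit across persons, and the class center can be a median (the LS_MDA variant) instead of a mean. Patch size, stride, and the regularization constant are illustrative choices.

```python
import numpy as np

def extract_patches(block, patch=8, stride=4):
    """Slide a patch x patch window over a 2-D block; each patch is one sample."""
    h, w = block.shape
    return np.array([block[i:i + patch, j:j + patch].ravel()
                     for i in range(0, h - patch + 1, stride)
                     for j in range(0, w - patch + 1, stride)])

def fisher_lda(classes, use_median=False):
    """Return projection W maximizing between- over within-class scatter.
    classes: list of (n_i, d) arrays, one per class (person)."""
    center = np.median if use_median else np.mean
    d = classes[0].shape[1]
    mu = [center(c, axis=0) for c in classes]
    mu_all = np.mean(np.vstack(classes), axis=0)
    Sw = sum((c - m).T @ (c - m) for c, m in zip(classes, mu))
    Sb = sum(len(c) * np.outer(m - mu_all, m - mu_all) for c, m in zip(classes, mu))
    # Generalized eigenproblem Sb w = lambda Sw w, via a regularized inverse.
    evals, evecs = np.linalg.eig(np.linalg.solve(Sw + 1e-6 * np.eye(d), Sb))
    order = np.argsort(evals.real)[::-1]
    return evecs.real[:, order[:len(classes) - 1]]

# SSPP setting: one gallery image per person; the patches of each person's
# block form that person's class, making within-class scatter estimable.
rng = np.random.default_rng(0)
gallery_blocks = [rng.random((32, 32)) for _ in range(3)]   # 3 persons, 1 block each
classes = [extract_patches(b) for b in gallery_blocks]
W = fisher_lda(classes, use_median=True)                    # LS_MDA-style center
print("projection shape:", W.shape)
```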

Verification of Reduced Order Modeling based Uncertainty/Sensitivity Estimator (ROMUSE)

  • Khuwaileh, Bassam; Williams, Brian; Turinsky, Paul; Hartanto, Donny
    • Nuclear Engineering and Technology / v.51 no.4 / pp.968-976 / 2019
  • This paper presents a number of verification case studies for a recently developed sensitivity/uncertainty code package. The package, ROMUSE (Reduced Order Modeling based Uncertainty/Sensitivity Estimator), is intended as an analysis tool to be used in conjunction with reactor core simulators, in particular the Virtual Environment for Reactor Applications (VERA) core simulator. ROMUSE is written in C++ and is currently capable of performing various types of parameter perturbations and the associated sensitivity analysis, uncertainty quantification, surrogate model construction, and subspace analysis. The current version 2.0 can interface with the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) code, which gives ROMUSE access to the various algorithms implemented within DAKOTA, most importantly model calibration. The verification study is performed via two basic problems and two reactor physics models. The first problem, an abstract quadratic model, verifies ROMUSE's single-physics gradient-based range-finding capability. The second is the Brusselator problem, a coupled problem representative of multi-physics applications, used to test surrogate construction via ROMUSE-DAKOTA. Finally, light water reactor pin cell and sodium-cooled fast reactor fuel assembly problems are simulated via SCALE 6.1 to test ROMUSE's uncertainty quantification and sensitivity analysis capabilities.
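
For readers unfamiliar with the Brusselator named above: it is a standard two-species reaction model often used as a coupled test problem. A minimal sketch of its usual ODE form follows; this is not ROMUSE's or DAKOTA's actual test setup, and the parameter values A=1, B=3 and the initial state are illustrative defaults under which the system settles into a limit cycle.

```python
import numpy as np
from scipy.integrate import solve_ivp

def brusselator(t, y, A=1.0, B=3.0):
    u, v = y
    du = A + u**2 * v - (B + 1.0) * u   # autocatalytic production minus decay
    dv = B * u - u**2 * v               # intermediate-species balance
    return [du, dv]

sol = solve_ivp(brusselator, t_span=(0.0, 50.0), y0=[1.0, 1.0], dense_output=True)
print("final state (u, v):", sol.y[:, -1])
```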

Development of Analysis Method of Ordered Categorical Data for Optimal Parameter Design (순차 범주형 데이타의 최적 모수 설계를 위한 분석법 개발)

  • Jeon, Tae-Jun; Park, Ho-Il; Hong, Nam-Pyo; Choe, Seong-Jo
    • Journal of Korean Institute of Industrial Engineers / v.20 no.1 / pp.27-38 / 1994
  • Accumulation analysis has difficulty handling ordered categorical data except in smaller-the-better type problems. The purpose of this paper is to develop a statistic and method that can be easily applied to general types of problems, including the nominal-the-best type. Experimental data from a contact window process are analyzed, and the new procedure is compared with accumulation analysis.
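
As background on the method being improved: accumulation analysis works on cumulative category frequencies rather than raw counts. The sketch below, with made-up counts, shows only this cumulative-frequency bookkeeping, not the paper's proposed statistic.

```python
import numpy as np

# Rows: factor levels; columns: ordered quality categories (worst -> best).
counts = np.array([[12,  6,  2],
                   [ 5, 10,  5],
                   [ 2,  6, 12]])

cum = np.cumsum(counts, axis=1)                    # cumulative counts I_j
cum_prop = cum / counts.sum(axis=1, keepdims=True) # cumulative proportions
for level, row in enumerate(cum_prop, start=1):
    print(f"level {level}: cumulative proportions {np.round(row, 2)}")
```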


A Penalized Principal Component Analysis using Simulated Annealing

  • Park, Chongsun; Moon, Jong Hoon
    • Communications for Statistical Applications and Methods / v.10 no.3 / pp.1025-1036 / 2003
  • A variable selection algorithm for principal component analysis using a penalty function is proposed. We use the fact that the usual principal component problem can be expressed as a maximization problem with appropriate constraints, and we add a penalty function to this maximization problem. A simulated annealing algorithm is used to search for optimal solutions under the penalty functions. Simulation comparisons among several well-known penalty functions reveal that the HARD penalty function is the best in several respects. Illustrations with real and simulated examples are provided.
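
The abstract's construction can be sketched directly: write the first principal component as maximization of w'Sw over the unit sphere, subtract a penalty on small nonzero loadings, and search by simulated annealing. The sketch below is an illustration under assumed choices (a HARD-style thresholding penalty, the penalty weight lam, the step size, and a geometric cooling schedule), not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.standard_normal((200, 8))
X[:, 0] += 2.0 * X[:, 1]            # make two variables carry most of the variance
S = np.cov(X, rowvar=False)

def objective(w, lam=0.1, thresh=0.2):
    # Explained variance minus a HARD-style penalty counting small nonzero loadings.
    hard_penalty = lam * np.sum((np.abs(w) > 0) & (np.abs(w) < thresh))
    return w @ S @ w - hard_penalty

w = rng.standard_normal(S.shape[0]); w /= np.linalg.norm(w)
best, best_val = w, objective(w)
T = 1.0
for step in range(5000):
    cand = w + 0.1 * rng.standard_normal(w.shape)
    cand /= np.linalg.norm(cand)                 # stay on the unit sphere
    delta = objective(cand) - objective(w)
    if delta > 0 or rng.random() < np.exp(delta / T):
        w = cand                                 # Metropolis acceptance rule
        if objective(w) > best_val:
            best, best_val = w, objective(w)
    T *= 0.999                                   # geometric cooling
print("loadings:", np.round(best, 2))
```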

Probabilistic Finite Element Analysis of Eigenvalue Problem (Buckling Reliability Analysis of Frame Structure) (고유치 문제의 확률 유한요소 해석(Frame 구조물의 좌굴 신뢰성 해석))

  • 양영순; 김지호
    • Proceedings of the Computational Structural Engineering Institute Conference / 1990.10a / pp.22-27 / 1990
  • Since the eigenvalue problem in structural analysis is recognized as an important step in assessing structural strength, eigenvalue (buckling) analysis is usually carried out when the compressive behavior of a member is dominant. In general, the variables involved in the eigenvalue problem exhibit variability, so it is natural to apply probabilistic analysis to such problems. Because the limit state equation for buckling reliability analysis is expressed only implicitly in terms of the random variables involved, the probabilistic finite element method is combined with conventional reliability methods such as MVFOSM and AFOSM to determine the probability of failure due to buckling. The accuracy of the results obtained by this method is compared with results from Monte Carlo simulation; the importance sampling method is chosen to overcome the large number of simulations otherwise needed for adequate accuracy. The case study shows that the method developed here performs well in calculating the probability of buckling failure and can be used for checking the safety of frame structures that might collapse by either yielding or buckling.
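
As a scale model of the Monte Carlo comparison described above, the sketch below estimates a buckling failure probability for a single pinned-pinned Euler column with random stiffness properties. It is illustrative only: the distributions and load are made up, there are no finite elements, and it uses crude sampling rather than the paper's importance sampling.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 200_000
E = rng.normal(2.1e11, 1.0e10, n)      # Young's modulus [Pa], made-up spread
I = rng.normal(8.0e-6, 4.0e-7, n)      # second moment of area [m^4]
L, P = 3.0, 1.6e6                      # column length [m], applied load [N]

P_cr = np.pi**2 * E * I / L**2         # Euler critical load; limit state g = P_cr - P
pf = np.mean(P_cr < P)                 # crude Monte Carlo failure probability
se = np.sqrt(pf * (1 - pf) / n)        # binomial standard error of the estimate
print(f"P_f ~ {pf:.4e} (std. err. {se:.1e})")
```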


Incremental Eigenspace Model Applied To Kernel Principal Component Analysis

  • Kim, Byung-Joo
    • Journal of the Korean Data and Information Science Society / v.14 no.2 / pp.345-354 / 2003
  • An incremental kernel principal component analysis (IKPCA) is proposed for nonlinear feature extraction from data. One problem of batch kernel principal component analysis (KPCA) is that the computation becomes prohibitive when the data set is large; another is that incorporating new data requires recomputing all of the eigenvectors. IKPCA overcomes these problems by incrementally updating the eigenspace model. IKPCA requires less memory than batch KPCA and can easily be improved by re-learning the data. Our experiments show that IKPCA is comparable in performance to batch KPCA on classification problems with nonlinear data sets.
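
To make the cost argument concrete, here is a minimal numpy sketch of the batch KPCA baseline the abstract contrasts with: an n x n kernel matrix, double-centering, and a full eigendecomposition that must be redone whenever a new sample arrives. The RBF kernel and its width gamma are illustrative assumptions; the incremental update itself is not reproduced here.

```python
import numpy as np

def batch_kpca(X, n_components=2, gamma=1.0):
    n = X.shape[0]
    sq = np.sum(X**2, axis=1)
    K = np.exp(-gamma * (sq[:, None] + sq[None, :] - 2 * X @ X.T))  # RBF kernel, O(n^2) memory
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                                  # double-centering in feature space
    evals, evecs = np.linalg.eigh(Kc)               # full O(n^3) eigendecomposition
    idx = np.argsort(evals)[::-1][:n_components]
    alphas = evecs[:, idx] / np.sqrt(evals[idx])    # scale so projections are unit-normalized
    return Kc @ alphas                              # projections of the training data

X = np.random.default_rng(3).standard_normal((300, 5))
Z = batch_kpca(X)
print(Z.shape)   # (300, 2); adding one sample forces a full recomputation
```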


Latent Mean Analysis of Health Behavior between Adolescents with a Health Problem and Those without: Using the 2009 Korean Youth Health Behavior Survey

  • Park, Jeong-Mo; Kim, Mi-Won; Cho, Yoon Hee
    • Research in Community and Public Health Nursing / v.24 no.4 / pp.488-497 / 2013
  • Purpose: The purpose of this study was to establish the construct equivalence of the general five factors of health behavior and to compare the latent means between Korean adolescents with a health problem and those without. Methods: Data from the 2009 KYRBS (Korean Youth Risk Behavior Survey) were used for the analysis. Multi-group confirmatory factor analysis was performed to test whether the scale had configural, metric, and scalar invariance across the presence or absence of health problems in adolescents. Results: Configural, metric, and factor invariance were satisfied, permitting latent mean analysis (LMA) between adolescents with health problems and those without. The two groups did not differ in the latent means of any factor. Conclusion: Health providers should pay more attention to adolescents with health problems and support a prudent school life for this group.
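
Latent mean analysis leans on scalar invariance: with loadings and intercepts held equal across groups, differences in observed indicator means map back to differences in latent means. The sketch below illustrates that relationship with a simplified least-squares recovery and made-up numbers; it is not the study's multi-group CFA pipeline.

```python
import numpy as np

L = np.array([[1.0], [0.8], [0.6]])      # invariant loadings: one factor, 3 items
tau = np.array([2.0, 1.5, 1.0])          # invariant intercepts

xbar_with    = np.array([2.95, 2.26, 1.57])   # observed item means, group 1 (made up)
xbar_without = np.array([3.00, 2.30, 1.60])   # observed item means, group 2 (made up)

def latent_mean(xbar):
    # Under scalar invariance, mean(x) = tau + L @ eta; recover eta by least squares.
    eta, *_ = np.linalg.lstsq(L, xbar - tau, rcond=None)
    return eta[0]

diff = latent_mean(xbar_with) - latent_mean(xbar_without)
print(f"latent mean difference: {diff:+.3f}")  # small -> consistent with no group difference
```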

An Exploration on the Use of Data Envelopment Analysis for Product Line Selection

  • Lin, Chun-Yu; Okudan, Gul E.
    • Industrial Engineering and Management Systems / v.8 no.1 / pp.47-53 / 2009
  • We define the product line (or mix) selection problem as selecting a subset of potential product variants that simultaneously minimizes product proliferation and maintains market coverage. Selecting the most efficient product mix is a complex problem that requires analysis of multiple criteria. This paper proposes a method based on Data Envelopment Analysis (DEA) for product line selection. DEA is a linear programming based technique commonly used to measure the relative performance of a group of decision-making units with multiple inputs and outputs. Although DEA has proven to be an effective evaluation tool in many fields, it has not previously been applied to the product line selection problem. In this study, we construct a five-step method that systematically adopts DEA for product line selection and apply it to an existing line of staplers, providing quantitative evidence that helps managers make decisions that maximize company profits while fulfilling market demand.
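
As a concrete anchor for the DEA step, the sketch below solves the standard input-oriented CCR envelopment LP with scipy for a few hypothetical product variants. The data are made up, and the paper's five-step method adds selection logic on top of efficiency scores like these.

```python
import numpy as np
from scipy.optimize import linprog

# Columns = decision-making units (product variants); made-up inputs/outputs.
X = np.array([[4.0, 6.0, 5.0, 8.0],    # input 1, e.g. unit cost
              [2.0, 3.0, 2.5, 4.0]])   # input 2, e.g. assembly time
Y = np.array([[60., 70., 80., 75.],    # output 1, e.g. demand coverage
              [3.0, 4.0, 4.5, 4.0]])   # output 2, e.g. rated quality

def ccr_efficiency(o):
    """Input-oriented CCR score of DMU o: minimize theta over [theta, lambdas]."""
    m, n = X.shape[0], X.shape[1]
    c = np.r_[1.0, np.zeros(n)]                        # objective: theta
    A_in = np.hstack([-X[:, [o]], X])                  # sum lam*x_j <= theta*x_o
    A_out = np.hstack([np.zeros((Y.shape[0], 1)), -Y]) # sum lam*y_j >= y_o
    A_ub = np.vstack([A_in, A_out])
    b_ub = np.r_[np.zeros(m), -Y[:, o]]
    res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=[(0, None)] * (n + 1))
    return res.x[0]

for o in range(X.shape[1]):
    print(f"variant {o + 1}: efficiency = {ccr_efficiency(o):.3f}")
```

A score of 1 marks a variant on the efficient frontier; scores below 1 quantify how much its inputs could shrink at the same output levels, which is the kind of evidence a selection method can rank against market-coverage requirements.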