• Title/Summary/Keyword: Analysis Technique

A Study of Personal Style Analysis and Fashion Coordination Method Applied by the Method of Morphological Analysis (개인이 소유한 의복디자인 특성 분석을 통한 착용자 중심의 패션 연출 방법 개발 - 형태 분석법을 중심으로 -)

  • Lee, Hyun-Jung;Choi, Yoon-Mi
    • The Research Journal of the Costume Culture
    • /
    • v.16 no.5
    • /
    • pp.785-794
    • /
    • 2008
  • This study systematically analyzes the specific characteristics of the clothes each person owns by applying morphological analysis, one of the creative conception techniques, and suggests ways to make better use of owned clothing through fashion coordination that responds to changes in personal style. The results of this study are as follows. First, a successful fashion coordination technique could be derived by comprehensively combining the partial solutions classified under each important variable in the morphological analysis. Second, a systematic and visual personal style analysis technique based on morphological analysis was proposed. Third, a variety of fashion coordination options could be generated in a short time by mixing partial solutions for the important variables, which proved helpful for finding ideas that emphasize personal individuality when shopping. Finally, further research is needed on developing a program that can efficiently expand fashion coordination options from a submitted morphological analysis table.
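
The abstract does not reproduce the morphological table itself, but the combination step it describes can be sketched as a Cartesian product over per-variable partial solutions. The variables and options below are hypothetical placeholders, not the paper's data:

```python
from itertools import product

# Hypothetical morphological analysis table: each key is an important design
# variable, each value lists the partial solutions observed in a wardrobe.
morph_table = {
    "silhouette":  ["fitted", "straight", "oversized"],
    "color_tone":  ["warm", "cool", "neutral"],
    "top_item":    ["shirt", "knit", "jacket"],
    "bottom_item": ["slacks", "denim", "skirt"],
}

# Every coordination idea is one combination of partial solutions,
# i.e. one cell picked from each row of the morphological table.
variables = list(morph_table)
combinations = [dict(zip(variables, combo))
                for combo in product(*morph_table.values())]

print(f"{len(combinations)} candidate coordinations")  # 3*3*3*3 = 81
print(combinations[0])
```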

  • PDF

A Comparative Quantitative Analysis of IDEAL (Iterative Decomposition of Water and Fat with Echo Asymmetry and Least Squares Estimation) and CHESS (Chemical Shift Selection Suppression) Technique in 3.0T Musculoskeletal MRI

  • Kim, Myoung-Hoon;Cho, Jae-Hwan;Shin, Seong-Gyu;Dong, Kyung-Rae;Chung, Woon-Kwan;Park, Tae-Hyun;Ahn, Jae-Ouk;Park, Cheol-Soo;Jang, Hyon-Chol;Kim, Yoon-Shin
    • Journal of Magnetics
    • /
    • v.17 no.2
    • /
    • pp.145-152
    • /
    • 2012
  • Patients who underwent hip arthroplasty were imaged with the conventional fat suppression technique (CHESS) and a newer technique (IDEAL) and compared quantitatively to assess the effectiveness and usefulness of the IDEAL technique. In 20 patients who underwent hip arthroplasty from March 2009 to December 2010, fat-suppressed T2- and T1-weighted images were obtained on a 3.0T MR scanner using both the CHESS and IDEAL techniques. The level of distortion in the area of interest, the extent of susceptibility artifacts, and the homogeneity of fat suppression were analyzed from the acquired images. Quantitative analysis revealed that the IDEAL technique produced less image distortion from metal-induced susceptibility artifacts than the CHESS technique. Qualitative analysis showed that the IDEAL technique generated fewer susceptibility artifacts than the CHESS technique in the anterior, middle, and posterior areas while maintaining homogeneous fat suppression. Fat suppression itself was not statistically different between the two techniques, and both achieved homogeneous fat suppression. In conclusion, the IDEAL technique generated fewer metal-induced susceptibility artifacts and less image distortion than the CHESS technique while providing homogeneous fat suppression, yielding high-quality images that can provide good information for diagnosis.
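
As a rough illustration of the kind of paired quantitative comparison described here (the same patients imaged with two techniques), the following sketch runs a paired t-test on made-up image-quality scores; both the numbers and the choice of test are assumptions, not taken from the study:

```python
import numpy as np
from scipy import stats

# Hypothetical per-patient image-quality scores (e.g. measured distortion area
# or graded artifact severity) for the two fat-suppression techniques.
chess_scores = np.array([3.1, 2.8, 3.5, 2.9, 3.3, 3.0, 2.7, 3.4])
ideal_scores = np.array([2.2, 2.0, 2.6, 2.1, 2.4, 2.3, 1.9, 2.5])

# Paired test because both techniques were applied to the same patients.
t_stat, p_value = stats.ttest_rel(chess_scores, ideal_scores)
print(f"paired t = {t_stat:.2f}, p = {p_value:.4f}")
```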

A Study on a Repair Technique for a Reinforced Concrete Frame Subjected to Seismic Damage Using Prestressing Cable Bracing

  • Lee, Jin Ho;El-Ganzory, Hisham
    • Architectural research
    • /
    • v.3 no.1
    • /
    • pp.53-60
    • /
    • 2001
  • The proposed building upgrading technique employs prestressing cables acting as bracing to improve seismic performance during future events. A four-story reinforced concrete moment-resisting frame damaged by an ultimate-limit-state earthquake is assessed and upgraded using the proposed technique. Both the existing and the upgraded buildings are evaluated with respect to seismic performance parameters using static lateral load-to-collapse analysis as well as nonlinear dynamic time history analysis. To obtain a realistic comparison of seismic performance between the existing and upgraded frames, each frame is subjected to its critical ground motion, whose strength demand exceeds the building's strength supply. Furthermore, the reliability of static lateral load-to-collapse analysis as a substitute for time history analysis is evaluated. The results reveal that the proposed upgrading technique brings the stiffness distribution closer to the ideal distribution that gives equal inter-story drift. As a result, the upgraded building retains more stories that contribute to energy dissipation. The overall behavior of the upgraded building beyond yield is also enhanced, owing to the gradual change of building stiffness as the lateral load increases.
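
The "equal inter-story drift" criterion mentioned above can be illustrated with a minimal drift-ratio calculation; the story heights and peak displacements below are invented for the example and do not come from the paper:

```python
# Minimal sketch: inter-story drift ratio from peak lateral floor displacements.
story_heights_m = [3.2, 3.0, 3.0, 3.0]   # ground to roof, four stories (assumed)
peak_disp_mm    = [12.0, 28.0, 41.0, 50.0]  # peak displacement at each floor (assumed)

drift_ratios = []
prev = 0.0
for h, d in zip(story_heights_m, peak_disp_mm):
    # relative displacement of the story divided by its height
    drift_ratios.append((d - prev) / (h * 1000.0))
    prev = d

for i, dr in enumerate(drift_ratios, start=1):
    print(f"story {i}: drift ratio = {dr:.4f}")
# A near-uniform set of ratios approximates the 'ideal' stiffness distribution
# that the upgraded frame is compared against.
```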


An Adequacy Based Test Data Generation Technique Using Genetic Algorithms

  • Malhotra, Ruchika;Garg, Mohit
    • Journal of Information Processing Systems
    • /
    • v.7 no.2
    • /
    • pp.363-384
    • /
    • 2011
  • As the complexity of software increases, generating effective test data has become a necessity, which has increased the demand for techniques that can generate test data efficiently. This paper proposes a test data generation technique based on an adequacy-based testing criterion. Adequacy-based testing criteria use mutation analysis to check the adequacy of test data. In general, mutation analysis is applied after the test data have been generated. In this work, however, we propose a technique that applies mutation analysis during test data generation rather than afterwards. This saves a significant amount of time in producing adequate test cases, because in the conventional approach the total time is the sum of the time to generate test data and the time to apply mutation analysis to the generated data. We also use genetic algorithms, which explore the complete input domain of the program to provide a near-global optimum solution. In this paper, we first define and explain the proposed technique and then validate it on ten real-time programs. The proposed technique is compared with a path testing technique (which uses a reliability-based testing criterion) on these ten programs. The results show that the proposed adequacy-based technique outperforms the reliability-based path testing technique, with a significant reduction in both the number of generated test cases and the time taken to generate them.
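
A minimal sketch of the idea, assuming a toy program under test and hand-written mutants: test suites are evolved with a genetic algorithm whose fitness is the mutation score computed during generation. The paper's actual subject programs, mutation operators, and GA parameters are not reproduced here:

```python
import random

# Toy program under test and a few hand-made mutants (real mutation analysis
# would generate the mutants automatically from mutation operators).
def original(x, y):
    return x + y if x > y else x - y

mutants = [
    lambda x, y: x + y if x >= y else x - y,   # relational operator mutant
    lambda x, y: x - y if x > y else x + y,    # swapped branches
    lambda x, y: x * y if x > y else x - y,    # arithmetic operator mutant
]

def mutation_score(test_suite):
    """Fraction of mutants killed by the test suite (the adequacy criterion)."""
    killed = sum(any(m(x, y) != original(x, y) for x, y in test_suite)
                 for m in mutants)
    return killed / len(mutants)

def random_suite(size=4):
    return [(random.randint(-10, 10), random.randint(-10, 10)) for _ in range(size)]

# Tiny genetic algorithm: individuals are test suites, and the fitness used
# for selection is the mutation score evaluated *while* the data is generated.
population = [random_suite() for _ in range(20)]
for _ in range(30):
    population.sort(key=mutation_score, reverse=True)
    parents = population[:10]
    children = []
    for _ in range(10):
        a, b = random.sample(parents, 2)
        child = [random.choice(pair) for pair in zip(a, b)]   # crossover
        if random.random() < 0.3:                             # mutation
            child[random.randrange(len(child))] = (random.randint(-10, 10),
                                                   random.randint(-10, 10))
        children.append(child)
    population = parents + children

best = max(population, key=mutation_score)
print("best suite:", best, "mutation score:", mutation_score(best))
```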

A study on Korean language processing using TF-IDF (TF-IDF를 활용한 한글 자연어 처리 연구)

  • Lee, Jong-Hwa;Lee, MoonBong;Kim, Jong-Weon
    • The Journal of Information Systems
    • /
    • v.28 no.3
    • /
    • pp.105-121
    • /
    • 2019
  • Purpose One of the reasons for the expansion of information systems in the enterprise is the increased efficiency of data analysis, particularly for the rapidly growing volume of complex, unstructured data such as video, voice, images, and conversations inside and outside social networks. The purpose of this study is to analyze customer needs from customer voices, i.e., text data, in the web environment. Design/methodology/approach Previous studies have shown that extracting the words that interpret a sentence works better than simple frequency analysis. In this study, we applied the TF-IDF method, which extracts the keywords that are important in actual sentences, to Korean text, instead of the TF method, which represents sentences by simple term frequency alone. We visualized the two techniques with cluster analysis and describe the differences. Findings Applying both the TF and TF-IDF techniques to Korean natural language processing, the study demonstrates the shift in value from frequency-based analysis to semantic analysis and is expected to inform the choice of technique by Korean language processing researchers.
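
A minimal sketch of the TF versus TF-IDF contrast on Korean text, using scikit-learn's default tokenization on already-space-separated toy sentences; the documents are invented, and a real pipeline would normally apply a Korean morphological analyzer first:

```python
from sklearn.feature_extraction.text import CountVectorizer, TfidfVectorizer

# Toy customer-voice documents (invented, pre-segmented by spaces for brevity).
docs = [
    "배송 빠르고 포장 좋아요",
    "배송 느리고 포장 불량",
    "배송 보통 제품 좋아요",
]

tf = CountVectorizer().fit(docs)
tfidf = TfidfVectorizer().fit(docs)

# TF weights every occurring token equally; TF-IDF down-weights '배송', which
# appears in every document, and highlights the distinguishing terms.
print(sorted(tf.vocabulary_))
print(tfidf.transform(docs).toarray().round(2))
```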

A New Method for Measuring Fiber Length and Fiber Coarseness Using Image Analysis Technique (화상분석법을 응용한 섬유장 및 섬유 조도 측정법 개발)

  • 배진한;김철환;박종열
    • Journal of Korea Technical Association of The Pulp and Paper Industry
    • /
    • v.34 no.2
    • /
    • pp.13-21
    • /
    • 2002
  • A new method for measuring fiber length and fiber coarseness was developed using an image analysis technique. The fibers to be measured were transferred to a glass slide via a filter paper placed on the wire of a laboratory paper machine. After the fibers on the slide were stained, mean fiber length and coarseness were measured with a commercial image analysis software package, KS400. The data obtained from the image analysis correlated closely with those from the FS-200 and showed reproducibility as good as that of the FS-200. Synthetic fibers more than 10 mm long could also be readily measured with the new technique. Finally, a substantial improvement in the precision of fiber length and coarseness measurement was achieved with less operator effort for a given time.
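
A minimal sketch of the image-analysis steps implied here (binarize, skeletonize, label, count skeleton pixels), using scikit-image on a synthetic image rather than KS400 or real micrographs; the pixel calibration is an assumed value:

```python
import numpy as np
from skimage.draw import line
from skimage.measure import label, regionprops
from skimage.morphology import skeletonize

# Synthetic binary image with two thin 'fibers'; a real measurement would
# start from a micrograph of the stained fibers on the glass slide.
img = np.zeros((200, 200), dtype=bool)
rr, cc = line(10, 10, 150, 120)
img[rr, cc] = True
rr, cc = line(20, 150, 180, 190)
img[rr, cc] = True

# Thin each fiber to a one-pixel-wide line, then label and measure.
skeleton = skeletonize(img)
labeled = label(skeleton)

PIXEL_SIZE_MM = 0.01  # hypothetical calibration: millimetres per pixel
for region in regionprops(labeled):
    # The skeleton pixel count approximates the traced fiber length.
    print(f"fiber {region.label}: ~{region.area * PIXEL_SIZE_MM:.2f} mm")

# Coarseness would follow as (oven-dry fiber mass) / (total fiber length)
# once the mass of the measured sample is known.
```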

Software Technique for Automation of Moving Load Analysis for Three-dimensional Frame Model of Girder Bridge (거더 교의 3차원 뼈대 모델 해석시에 이동 하중의 자동화를 위한 소프트웨어 기법)

  • 정대열
    • Proceedings of the Computational Structural Engineering Institute Conference
    • /
    • 1999.04a
    • /
    • pp.218-225
    • /
    • 1999
  • To fully automate moving load analysis for a three-dimensional frame model of a girder bridge, an efficient software technique is presented that makes use of signals exchanged among processes. When this technique is used for the automation, no separate algorithm is needed for the transverse loading analysis, and a completely automatic moving load analysis algorithm can be developed easily. A program with this fully automatic moving load analysis capability has been developed using the proposed technique and verified by comparing its results with those given in a well-known design reference.
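
The abstract does not describe the solver or its data formats, so the sketch below only illustrates the inter-process signalling idea on Unix: a driver process steps the load position along the girder and signals a placeholder "solver" process once per load case. Everything named here is an assumption for illustration:

```python
import os
import signal
import time

LOAD_POSITIONS = [x * 2.0 for x in range(5)]   # assumed load positions along the girder, m

def run_solver_step(position):
    # Placeholder for one analysis of the 3-D frame model under one load case.
    print(f"[solver pid {os.getpid()}] analysing load case at x = {position} m")

pid = os.fork()
if pid == 0:                                   # child: the 'solver' process
    step = {"i": 0}
    def on_new_case(signum, frame):
        run_solver_step(LOAD_POSITIONS[step["i"]])
        step["i"] += 1
    signal.signal(signal.SIGUSR1, on_new_case)
    while step["i"] < len(LOAD_POSITIONS):
        signal.pause()                         # wait for the next load case
    os._exit(0)
else:                                          # parent: the load-stepping driver
    time.sleep(0.2)                            # let the child install its handler
    for x in LOAD_POSITIONS:
        os.kill(pid, signal.SIGUSR1)           # announce that case x is ready
        time.sleep(0.1)
    os.waitpid(pid, 0)
```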


A performance analysis technique for guided weapons (유도무기체계의 성능분석기법)

  • 이연석;이장규;장상근
    • 제어로봇시스템학회:학술대회논문집
    • /
    • 1991.10a
    • /
    • pp.274-279
    • /
    • 1991
  • The development of a guided weapon system, such as a tactical missile, requires performance analysis of a nonlinear system. Generally, the Monte Carlo method is used for this purpose. Its main limitation, the large number of simulations required for nonlinear system performance analysis, strongly motivated the development of a more efficient analytic technique. In this paper, statistical linearization is used for performance analysis of the guided weapon system with the help of covariance analysis. Because statistical linearization cannot be applied directly to look-up-table nonlinearities such as aerodynamic coefficients, second-order polynomial representations are obtained from the tables using Lagrange interpolating polynomials and then linearized statistically. Simple simulations over initial state conditions and the random component of the guidance command demonstrate the results of this technique.
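
A minimal sketch of the two steps named here: fitting a second-order Lagrange polynomial through an aerodynamic look-up table and statistically linearizing it for a Gaussian input. The table values, trim angle, and spread are illustrative assumptions, not missile data:

```python
import numpy as np
from scipy.interpolate import lagrange

# Hypothetical aerodynamic table: normal-force coefficient vs angle of attack (deg).
alpha_deg = np.array([0.0, 5.0, 10.0])
cn_table  = np.array([0.00, 0.45, 1.05])

# Second-order Lagrange interpolating polynomial through the three table points,
# giving an analytic form that statistical linearization can work with.
cn_poly = lagrange(alpha_deg, cn_table)
print(cn_poly)

# Statistical linearization about an assumed trim angle: the equivalent gain for
# the random component is E[(alpha - mean) * cn(alpha)] / var(alpha).
mean_alpha, sigma = 4.0, 2.0                   # assumed trim angle and spread, deg
alphas = np.random.normal(mean_alpha, sigma, 100_000)
gain = np.mean((alphas - mean_alpha) * cn_poly(alphas)) / sigma**2
print(f"equivalent linear gain ≈ {gain:.3f} per deg")
```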


A Technique for the Quantitative Analysis of the Noise Jamming Effect (잡음재밍 효과에 대한 정량적 분석 기법)

  • Kim, Sung-Jin;Kang, Jong-Jin
    • Journal of the Korea Institute of Military Science and Technology
    • /
    • v.8 no.4 s.23
    • /
    • pp.91-101
    • /
    • 2005
  • In this paper, a technique for the quantitative analysis of the noise jamming effect is proposed. Based on mathematical modeling of noise jammers and the probability theory of random processes, the technique analyzes the jamming effect by modeling the relationship among the jammer, the radar variables, and the radar detection probability in a noise jamming environment. Computer simulation results show that the proposed technique not only makes quantitative analysis of the jamming effect possible but also provides a basis for quantitative analysis of the electronic warfare environment.
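
A hedged sketch of how jammer power can be tied quantitatively to detection probability, using the standard radar and self-screening noise-jammer link equations and a textbook coherent-detection formula rather than the paper's own models; all parameter values are illustrative:

```python
import numpy as np
from scipy.stats import norm

# Illustrative radar, target, and jammer parameters (not from the paper).
k, T0, B, F = 1.38e-23, 290.0, 1.0e6, 10**(3/10)      # thermal noise: k*T0*B*F
Pt, G, wavelength, rcs, R = 100e3, 10**(30/10), 0.03, 1.0, 20e3
Pj, Gj, Bj = 100.0, 10**(10/10), 20e6                  # self-screening noise jammer

S = Pt * G**2 * wavelength**2 * rcs / ((4*np.pi)**3 * R**4)          # target echo power
N = k * T0 * B * F                                                    # thermal noise power
J = Pj * Gj * G * wavelength**2 / ((4*np.pi)**2 * R**2) * (B / Bj)    # jammer noise in band

def pd_coherent(snr, pfa=1e-6):
    """Textbook detection probability for a known signal in Gaussian noise."""
    return norm.sf(norm.isf(pfa) - np.sqrt(2.0 * snr))

print(f"SNR without jamming: {10*np.log10(S/N):.1f} dB, Pd = {pd_coherent(S/N):.3f}")
print(f"SNR with jamming:    {10*np.log10(S/(N+J)):.1f} dB, Pd = {pd_coherent(S/(N+J)):.3f}")
```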

A Study on the Risk Assessment System for Human Factors (휴먼에러를 중심으로 한 위험요인 도출 방법론에 관한 연구)

  • Jung, Sang Kyo;Chang, Seong Rok
    • Journal of the Korean Society of Safety
    • /
    • v.29 no.3
    • /
    • pp.79-84
    • /
    • 2014
  • Human error is one of the major contributors to accidents. Many risk assessment techniques have been developed for accident prevention; nevertheless, most of them focus on physical factors because human errors are difficult to evaluate quantitatively. Owing to this lack of risk assessment techniques for human errors, most industrial risk assessments of human error have relied on accident analysis data. To develop effective countermeasures that reduce the risk caused by human errors, a systematic analysis is needed. In general, a risk assessment system consists of five steps: classification of work activities, identification of hazards, risk estimation, evaluation, and improvement. This study aimed to develop a risk identification technique for human errors that can be applied mainly in industrial fields. Ergo-HAZOP and the Comprehensive Human Error Analysis Technique were used to develop the proposed technique: Ergo-HAZOP was used for broad-brush risk identification, and the more critical risks were then analysed with the Comprehensive Human Error Analysis Technique. To verify its applicability, the proposed technique was applied to pile head cutting work. As a result, extensive hazards were identified and fundamental countermeasures were established. It is expected that much attention will be paid to preventing accidents caused by human error in industrial fields, since safety personnel can easily find hazards related to human factors by using the proposed risk identification technique.
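
A minimal sketch of the risk-estimation step in the five-step cycle listed above, with hypothetical hazards loosely modeled on the pile-head-cutting example; the guide words, scales, and threshold are assumptions, not the paper's:

```python
# Illustrative ordinal scales for risk estimation (assumed, not from the study).
SEVERITY   = {"minor": 1, "moderate": 2, "serious": 3, "fatal": 4}
LIKELIHOOD = {"rare": 1, "occasional": 2, "frequent": 3}

hazards = [
    # (work activity, human-error hazard, severity, likelihood) -- hypothetical
    ("pile head cutting", "omitted check of cutting line", "serious",  "occasional"),
    ("pile head cutting", "wrong tool selected",           "moderate", "rare"),
    ("pile head cutting", "worker inside swing radius",    "fatal",    "occasional"),
]

for activity, error, sev, lik in hazards:
    score = SEVERITY[sev] * LIKELIHOOD[lik]          # risk = severity x likelihood
    priority = "improve now" if score >= 6 else "monitor"
    print(f"{activity:18s} | {error:30s} | risk {score:2d} -> {priority}")
```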