• Title/Summary/Keyword: Complexity measure


A Small-area Hardware Implementation of EGML-based Moving Object Detection Processor (EGML 기반 이동객체 검출 프로세서의 저면적 하드웨어 구현)

  • Sung, Mi-ji;Shin, Kyung-wook
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.21 no.12
    • /
    • pp.2213-2220
    • /
    • 2017
  • This paper proposes an efficient approach for the hardware implementation of a moving object detection (MOD) processor using an effective Gaussian mixture learning (EGML)-based background subtraction method. The arithmetic units used in background generation were implemented with LUT-based approximation to reduce hardware complexity, and hardware resources were shared between background subtraction and Gaussian probability density calculation. The MOD processor was verified by FPGA-in-the-loop simulation using MATLAB/Simulink. MOD performance was evaluated on the six types of video defined in the IEEE CDW-2014 dataset, yielding an average recall of 0.7700, an average precision of 0.7170, and an average F-measure of 0.7293. The MOD processor was implemented with 882 slices and 146×36 kbits of block RAM on a Virtex5 FPGA, a 60% hardware reduction compared with a conventional EGML-based design. It was estimated that the MOD processor can operate with a 75 MHz clock, enabling real-time processing of 800×600 video at 39 fps.
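
A minimal software sketch of Gaussian-mixture-style background subtraction, the operation the EGML hardware above approximates with LUTs. This is not the authors' design: it keeps a single Gaussian per pixel, and the learning rate `alpha` and deviation threshold `k` are assumed values chosen only for illustration.

```python
import numpy as np

def update_background(frame, mean, var, alpha=0.01, k=2.5):
    """Simplified per-pixel Gaussian background model (illustrative only).

    frame, mean, var: 2-D float arrays of the same shape.
    alpha: learning rate (assumed), k: deviation threshold (assumed).
    Returns (foreground_mask, updated_mean, updated_var).
    """
    diff = frame - mean
    # A pixel is foreground when it deviates by more than k standard deviations.
    foreground = diff ** 2 > (k ** 2) * var
    background = ~foreground
    # Only background pixels update the running Gaussian.
    mean = np.where(background, (1 - alpha) * mean + alpha * frame, mean)
    var = np.where(background, (1 - alpha) * var + alpha * diff ** 2, var)
    return foreground, mean, np.maximum(var, 1e-6)

# Usage: initialise from the first frame, then call once per frame.
# mean = first_frame.astype(float); var = np.full_like(mean, 15.0 ** 2)
```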

Feature Analysis of Different In Vitro Antioxidant Capacity Assays and Their Application to Fruit and Vegetable Samples (In Vitro 항산화능 측정법에 대한 특징 분석과 채소.과일 시료에 대한 적용 사례 고찰)

  • Kim, Min-Jung;Park, Eun-Ju
    • Journal of the Korean Society of Food Science and Nutrition
    • /
    • v.40 no.7
    • /
    • pp.1053-1062
    • /
    • 2011
  • Reactive oxygen species (ROS), including singlet oxygen (¹O₂), the superoxide anion radical (O₂·⁻), the hydroxyl radical (HO·), the peroxyl radical (ROO·), hydrogen peroxide (H₂O₂), and hypochlorous acid (HOCl), are generated as byproducts of normal cellular metabolism. ROS induce damage to many biological molecules, such as lipids, proteins, carbohydrates, and DNA. It is widely believed that some degenerative diseases caused by ROS can be prevented by a high intake of fruits and vegetables owing to their antioxidant activities. Recently, research on natural antioxidants has become increasingly active in various fields. Several assays have been developed to measure the total antioxidant capacity of fruits and vegetables in vitro, including assays for DPPH radical scavenging activity, SOD-like activity, total polyphenol content, oxygen radical absorbance capacity, reducing power, trolox equivalent antioxidant capacity (ABTS assay), single-cell gel electrophoresis (comet assay), and cellular antioxidant activity. Because different antioxidant compounds may act through different mechanisms in vitro, no single assay can fully evaluate the total antioxidant capacity of foods. Given the complexity of food composition, it is important to be able to measure antioxidant activity using biologically relevant assays. In this review, recently used assays were selected for extended discussion, including a comparison of the advantages and disadvantages of each assay and their application to fruits and vegetables.

Development of a Tool to Measure Suffering in Patients with Cancer (암환자의 고통 측정도구 개발에 관한 연구)

  • 강경아
    • Journal of Korean Academy of Nursing
    • /
    • v.29 no.6
    • /
    • pp.1365-1378
    • /
    • 1999
  • This study is a methodological research study to develop an instrument to measure suffering in patients with cancer and to test its validity and reliability. The research procedure was as follows: 1) The first step was to develop a conceptual framework based on a comprehensive review of the literature and in-depth interviews with patients with cancer. This framework was organized into three dimensions (the intrapersonal dimension, the significant-other and context-related dimension, and the transcendental dimension), and 59 items were initially adopted. 2) These items were analyzed using the content validity index (CVI), and 53 items that met more than 80% on the CVI were selected. 3) A pretest was carried out with 87 patients with cancer. After the pretest results were analyzed by item analysis, 44 items were selected; a second test of content validity eliminated 6 further items below the 80% CVI. 4) To test reliability and validity, data were collected from January 25, 1999 to February 26, 1999. The subjects were 160 patients with cancer and 185 healthy persons, and item analysis, factor analysis, and the multitrait-multimethod approach were used to analyze validity. The findings are as follows: 1) Cronbach's alpha coefficient for internal consistency was .92 for the total 38 items and .79, .82, and .85 for the three dimensions, in that order. 2) Item analysis was based on the corrected item-total correlation coefficient (.30 or more) and on the alpha estimate if the item was dropped from the scale. 3) In the initial factor analysis using principal component analysis and varimax rotation, one item was deleted because of factor complexity (indiscriminate factor loadings). In the secondary factor analysis, 7 factors with eigenvalues greater than 1.0 were extracted, explaining 56 percent of the total variance. The seven factors were labeled 'family relationship', 'emotional condition', 'physical discomfort', 'meaning and goal of life', 'contextual stimuli', 'change of body image', and 'guilt feelings'. 4) Convergent validity against the life satisfaction scale was supported by a significant positive correlation (r = .52, p = .00), and discriminant validity against the depression scale (CES-D) by a significant negative correlation (r = -.50, p = .00). The instrument for assessing the suffering of patients with cancer developed in this study was identified as a tool with a high degree of reliability and validity, and can therefore be used effectively in assessment when caring for patients with cancer.
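
The reliability figure quoted above (Cronbach's alpha of .92) follows the standard internal-consistency formula; a minimal sketch of that computation is given below. The score matrix is a hypothetical example, not the study's data.

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents x n_items) score matrix."""
    items = np.asarray(items, dtype=float)
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the total score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

# Hypothetical example: 5 respondents answering 4 Likert-type items.
scores = np.array([[3, 4, 3, 4],
                   [2, 2, 3, 2],
                   [4, 5, 4, 5],
                   [3, 3, 2, 3],
                   [5, 4, 5, 4]])
print(round(cronbach_alpha(scores), 2))
```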


Optimal Design of Generalized Process-storage Network Applicable To Polymer Processes (고분자 공정에 적용할 수 있는 일반화된 공정-저장조 망구조 최적설계)

  • Yi, Gyeongbeom;Lee, Euy-Soo
    • Korean Chemical Engineering Research
    • /
    • v.45 no.3
    • /
    • pp.249-257
    • /
    • 2007
  • The periodic square wave (PSW) model was successfully applied to the optimal design of a batch-storage network. The network structure can cover any type of batch production, distribution and inventory system, including recycle streams. Here we extend the coverage of the PSW model to multitasking semi-continuous processes as well as pure continuous and batch processes. In previous solutions obtained using the PSW model, the feedstock composition and product yield were treated as known constants. This constraint is relaxed in the present work, which treats the feedstock composition and product yield as free variables to be optimized. This modification makes it possible to deal with the pooling problem commonly encountered in oil refinery processes. Despite the greater complexity that arises when the feedstock composition and product yield are free variables, the PSW model still gives analytic lot sizing equations. The ability of the proposed method to determine the optimal plant design is demonstrated through the example of a high density polyethylene (HDPE) plant. Based on the analytical optimality results, we propose a practical process optimality measure that can be used for any kind of process. This measure facilitates direct comparison of the performance of multiple processes, and hence is a useful tool for diagnosing the status of process systems. The result that the cost of a process is proportional to the square root of average flow rate is similar to the well-known six-tenths factor rule in plant design.
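
The closing observation of this abstract, that process cost grows with the square root of the average flow rate, parallels classic economic-lot-size reasoning. The sketch below illustrates that square-root scaling with an EOQ-style cost model; the setup and holding cost values are illustrative assumptions, not results from the paper.

```python
import math

def optimal_lot_and_cost(flow_rate, setup_cost=100.0, holding_cost=0.5):
    """EOQ-style illustration: with demand rate D, setup cost S and holding
    cost h, the optimal lot size is sqrt(2*D*S/h) and the resulting
    setup-plus-inventory cost is sqrt(2*D*S*h), i.e. proportional to sqrt(D)."""
    lot = math.sqrt(2 * flow_rate * setup_cost / holding_cost)
    cost = math.sqrt(2 * flow_rate * setup_cost * holding_cost)
    return lot, cost

for d in (100, 400, 1600):   # quadrupling the flow rate doubles the cost
    lot, cost = optimal_lot_and_cost(d)
    print(f"flow={d:5d}  lot={lot:8.1f}  cost={cost:8.1f}")
```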

Illegal Cash Accommodation Detection Modeling Using Ensemble Size Reduction (신용카드 불법현금융통 적발을 위한 축소된 앙상블 모형)

  • Lee, Hwa-Kyung;Han, Sang-Bum;Jhee, Won-Chul
    • Journal of Intelligence and Information Systems
    • /
    • v.16 no.1
    • /
    • pp.93-116
    • /
    • 2010
  • An ensemble approach is applied to the detection modeling of illegal cash accommodation (ICA), a well-known type of fraudulent credit card usage in Far East nations that has not been addressed in the academic literature. The performance of a fraud detection model (FDM) suffers from the imbalanced data problem, which can be remedied to some extent by using an ensemble of many classifiers. It is generally accepted that ensembles of classifiers produce better accuracy than a single classifier provided there is diversity in the ensemble. Furthermore, recent research reveals that it may be better to ensemble some selected classifiers rather than all of the classifiers at hand. For the effective detection of ICA, we adopt an ensemble size reduction technique that prunes the ensemble of all classifiers using accuracy and diversity measures; the diversity in an ensemble manifests itself as disagreement or ambiguity among members. Data imbalance, intrinsic to FDM, affects our approach to ICA detection in two ways. First, we suggest a training procedure with over-sampling methods to obtain diverse training data sets. Second, we use variants of the accuracy and diversity measures that focus on the fraud class. The diversity measure is also calculated dynamically within the two pruning procedures, Forward Addition and Backward Elimination. In our experiments, Neural Networks, Decision Trees, and Logit Regressions are the base models used as ensemble members, and the performance of homogeneous ensembles is compared with that of heterogeneous ensembles. The experimental results show that the reduced-size ensemble is, on average over the data sets tested, as accurate as the non-pruned version, which provides benefits in terms of application efficiency and reduced ensemble complexity.
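
A minimal sketch of the forward-addition style of ensemble pruning described above: starting from an empty ensemble, repeatedly add the classifier whose inclusion most improves a score mixing accuracy with disagreement-based diversity. The weighting `lam` and the scoring details are illustrative assumptions, not the authors' exact measures.

```python
import numpy as np

def forward_addition_prune(member_preds, y, target_size, lam=0.5):
    """Greedy forward addition over binary predictions.

    member_preds: (n_members, n_samples) array of 0/1 predictions.
    y: (n_samples,) true labels (0/1). lam weights diversity vs. accuracy.
    Returns the indices of the selected members.
    """
    n_members = member_preds.shape[0]
    selected = []
    while len(selected) < target_size:
        best_idx, best_score = None, -np.inf
        for i in range(n_members):
            if i in selected:
                continue
            cand = selected + [i]
            votes = member_preds[cand].mean(axis=0) >= 0.5       # majority vote
            acc = (votes == y).mean()
            # Disagreement-based diversity: average pairwise disagreement.
            div = np.mean([(member_preds[a] != member_preds[b]).mean()
                           for a in cand for b in cand if a < b]) if len(cand) > 1 else 0.0
            score = (1 - lam) * acc + lam * div
            if score > best_score:
                best_idx, best_score = i, score
        selected.append(best_idx)
    return selected
```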

Diagnosis by Rough Set and Information Theory in Reinforcing the Competencies of the Collegiate (러프집합과 정보이론을 이용한 대학생역량강화 진단)

  • Park, In-Kyoo
    • Journal of Digital Convergence
    • /
    • v.12 no.8
    • /
    • pp.257-264
    • /
    • 2014
  • This paper presents a core-competency diagnosis system targeting college students, intended to identify the core competencies needed to reinforce their learning and employment capabilities. Because present-day data exhibit a high degree of redundancy and dimensionality, with attendant time complexity, they are more likely to contain spurious relationships, and even the weakest relationships can appear highly significant under any statistical test. To measure the uncertainty arising from the classification of categorical data and to implement the corresponding analytic system, an uncertainty measure based on rough entropy and information entropy is defined; a similar-behavior analysis is carried out, and the clustering ability is demonstrated in comparison with a statistical approach. Because the diagnosis results identify the competencies students have acquired and those they still need, namely common core competencies and major core competencies, they facilitate not only college life and the reinforcement of employment capability but also the revitalization of employment and adjustment to college life.
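
The uncertainty measure mentioned above combines information entropy with a rough-set style treatment of categorical attributes; a minimal sketch of the information-entropy half is shown below. The attribute values are hypothetical, and the paper's rough-entropy details are not reproduced.

```python
from collections import Counter
import math

def information_entropy(values):
    """Shannon entropy (in bits) of a categorical attribute."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Hypothetical categorical attribute: self-reported competency levels.
levels = ["high", "mid", "mid", "low", "high", "mid", "low", "low"]
print(round(information_entropy(levels), 3))   # higher value = more uncertainty
```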

Crosswind effects on high-sided road vehicles with and without movement

  • Wang, Bin;Xu, You-Lin;Zhu, Le-Dong;Li, Yong-Le
    • Wind and Structures
    • /
    • v.18 no.2
    • /
    • pp.155-180
    • /
    • 2014
  • The safety of road vehicles on the ground in crosswind has been investigated for many years. One of the most important fundamentals in such safety analysis is the aerodynamic characteristics of a vehicle in crosswind. The most common way to study these characteristics is wind tunnel testing to measure the aerodynamic coefficients and/or pressure coefficients of the vehicle. Due to the complexity of wind tunnel test equipment and procedures, the features of the flow field around the vehicle are seldom explored in a wind tunnel, particularly for a vehicle moving on the ground. As a complement to wind tunnel tests, the numerical method using computational fluid dynamics (CFD) can be employed as an effective tool to explore the aerodynamic characteristics of, as well as the flow features around, the vehicle. This study explores crosswind effects on a high-sided lorry on the ground, with and without movement, through CFD simulations together with wind tunnel tests. Firstly, the aerodynamic forces on a stationary lorry model are measured in a wind tunnel, and the results are compared with previous measurements. CFD with an unsteady RANS method is then employed to simulate the wind flow around, and the wind pressures on, the stationary lorry, and the numerical aerodynamic forces are compared with the wind tunnel test results. The same CFD method is then extended to investigate the moving vehicle on the ground in crosswind. The results show that the CFD results match the wind tunnel test results and that the current practice of using aerodynamic coefficients from a stationary vehicle in crosswind is acceptable. CFD simulation can also provide more insight into the flow field and pressure distribution, which are difficult to obtain from wind tunnel tests.
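
Both the wind tunnel tests and the CFD simulations above reduce the measured forces to non-dimensional aerodynamic coefficients; a minimal sketch of that normalization is given below. The reference area, air density default, and sample force are illustrative values, not data from the paper.

```python
def force_coefficient(force_n, wind_speed_ms, ref_area_m2, air_density=1.225):
    """Non-dimensional force coefficient C_F = F / (0.5 * rho * U^2 * A)."""
    dynamic_pressure = 0.5 * air_density * wind_speed_ms ** 2
    return force_n / (dynamic_pressure * ref_area_m2)

# Illustrative numbers only: a 9 kN side force on a lorry with a 25 m^2
# side reference area in a 20 m/s crosswind.
print(round(force_coefficient(9000.0, 20.0, 25.0), 2))
```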

Multilevel Threshold Selection Method Based on Gaussian-Type Finite Mixture Distributions (가우시안형 유한 혼합 분포에 기반한 다중 임계값 결정법)

  • Seo, Suk-T.;Lee, In-K.;Jeong, Hye-C.;Kwon, Soon-H.
    • Journal of the Korean Institute of Intelligent Systems
    • /
    • v.17 no.6
    • /
    • pp.725-730
    • /
    • 2007
  • Gray-level histogram-based threshold selection methods, such as Otsu's method and Huang and Wang's method, have been widely used for threshold selection in image processing. They are simple and effective, but take too much time to determine the optimal multilevel threshold values as the number of thresholds increases. In this paper, we measure the correlation between gray levels using a Gaussian function, define a Gaussian-type finite mixture distribution that combines the Gaussian distribution function with the gray-level histogram, and propose a fast and effective threshold selection method based on it. We show the effectiveness of the proposed method through experimental results on three images, and its efficiency through a comparison of its computational complexity with that of Otsu's method.
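
One way to read the construction above is as a Gaussian smoothing of the gray-level histogram followed by threshold placement at the valleys between modes. The sketch below follows that reading; it is an interpretation for illustration, not the authors' exact formulation, and the kernel width `sigma` is an assumed parameter.

```python
import numpy as np

def smoothed_histogram_thresholds(hist, sigma=8.0):
    """Convolve a 256-bin gray-level histogram with a Gaussian kernel and
    return the local minima (valleys) as candidate multilevel thresholds."""
    levels = np.arange(256)
    kernel = np.exp(-0.5 * ((levels[:, None] - levels[None, :]) / sigma) ** 2)
    kernel /= kernel.sum(axis=1, keepdims=True)    # row-normalise the kernel
    smooth = kernel @ hist                          # mixture-like smoothing
    valleys = [g for g in range(1, 255)
               if smooth[g] < smooth[g - 1] and smooth[g] < smooth[g + 1]]
    return valleys

# Usage (hypothetical): hist, _ = np.histogram(image, bins=256, range=(0, 256))
# thresholds = smoothed_histogram_thresholds(hist)
```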

A Study on the Development of an Infertility Stress Scale (불임 스트레스 척도 개발에 관한 연구)

  • 김선행;박영주;장성옥
    • Journal of Korean Academy of Nursing
    • /
    • v.25 no.1
    • /
    • pp.141-155
    • /
    • 1995
  • The objective of this study was to develop a scale to measure stress in infertile couples and to test its reliability and validity. Prior to item generation, a basic decision was made to conceptualize stress in infertile couples as comprising two dimensions and four subdimensions: intrapersonal stress, including cognitive and affective stress, and interpersonal stress, including marital and social stress. Initially, 95 items were generated from interview data from 31 women with primary or secondary infertility and from a literature review. These items were analyzed using the content validity index (CVI), and 69 items that met 70% or more of the CVI were selected. This preliminary Infertility Stress Scale was analyzed for reliability and construct validity, with item analysis and factor analysis applied for construct validity. Forty items were selected through item analysis, based on the inter-item correlation matrix, a corrected average inter-item correlation coefficient (.30~.70), a corrected item-total correlation coefficient (.03 or more), and the alpha estimate if the item was dropped from the scale. The initial factor analysis with varimax rotation produced eight factors, and five items were deleted because of factor complexity (indiscriminate factor loadings). The secondary factor analysis with varimax rotation produced seven factors that coincided with the conceptual framework posed for the scale. The seven factors were labeled 'meaning of children', 'worthiness', 'tenacious linking', 'marital satisfaction', 'sexual satisfaction', 'familial adjustment', and 'social adjustment'. The alpha coefficient for internal consistency was .93. The results of this study suggest that the measurement derived from the Infertility Stress Scale is useful in assessing the stress of infertile couples.
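
Item analysis of the kind described here typically retains items whose corrected item-total correlation (the correlation between an item and the sum of the remaining items) exceeds a cutoff; a minimal sketch is below. The score matrix and the .30 cutoff are illustrative assumptions, not the study's data.

```python
import numpy as np

def corrected_item_total(items: np.ndarray) -> np.ndarray:
    """Correlation of each item with the total of the *other* items."""
    items = np.asarray(items, dtype=float)
    n_items = items.shape[1]
    total = items.sum(axis=1)
    out = np.empty(n_items)
    for j in range(n_items):
        rest = total - items[:, j]            # total score excluding item j
        out[j] = np.corrcoef(items[:, j], rest)[0, 1]
    return out

# Hypothetical screening of 10 Likert-type items at a .30 cutoff.
scores = np.random.default_rng(0).integers(1, 6, size=(50, 10)).astype(float)
keep = corrected_item_total(scores) >= 0.30
print(keep)
```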


Development of Progress Measurement Framework for Mega Construction Project (대규모 건설프로젝트의 진도율 측정 프레임워크 개발에 관한 연구)

  • Ko, Sungjin;Chi, Seokho;Kim, Jinwoo;Song, Junho
    • KSCE Journal of Civil and Environmental Engineering Research
    • /
    • v.37 no.2
    • /
    • pp.419-425
    • /
    • 2017
  • Managing a mega construction project is a very challenging task due to its large scale and complexity. To deal with this challenge, progress must be measured reliably and effectively. Many researchers have presented progress measurement methods, but previous studies are limited to single or a few types of facilities and are therefore difficult to apply to mega construction projects that cover multiple facilities. To overcome these limitations, this paper proposes a progress measurement framework that considers the characteristics of mega construction projects. The framework consists of four phases: development of a Work Breakdown Structure, determination of the weights of main facilities and of sub-facilities, and calculation of the integrated progress rate. To validate the proposed approach, a case study of Sejong City in Korea was performed. The results of the case study showed the applicability of the proposed framework and confirmed that it enables reliable measurement of the progress rate of mega construction projects.
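
The integrated progress rate described above is, in essence, a weight-based rollup over the facility hierarchy; a minimal sketch is shown below. The facility weights and progress values are hypothetical, not figures from the Sejong case study.

```python
def integrated_progress(facilities):
    """Weight-based rollup: each facility is (weight, [(sub_weight, sub_progress), ...]);
    weights at each level are assumed to sum to 1.0."""
    total = 0.0
    for weight, subs in facilities:
        facility_progress = sum(w * p for w, p in subs)
        total += weight * facility_progress
    return total

# Hypothetical two-facility project.
facilities = [
    (0.6, [(0.5, 0.80), (0.5, 0.40)]),   # main facility A, two sub-facilities
    (0.4, [(1.0, 0.25)]),                # main facility B, one sub-facility
]
print(f"{integrated_progress(facilities):.0%}")   # -> 46%
```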