• Title/Summary/Keyword: model complexity

Design of Low Complexity Human Anxiety Classification Model based on Machine Learning (기계학습 기반 저 복잡도 긴장 상태 분류 모델)

  • Hong, Eunjae;Park, Hyunggon
    • The Transactions of The Korean Institute of Electrical Engineers / v.66 no.9 / pp.1402-1408 / 2017
  • Recently, services for personal biometric data analysis based on real-time monitoring systems have been increasing, and many of them focus on emotion recognition. In this paper, we propose a classification model that classifies the anxiety emotion using biometric data actually collected from people. We propose to deploy a support vector machine to build the classification model. To improve classification accuracy, we propose two data pre-processing procedures: normalization and data deletion. The proposed algorithms are implemented on a Real-time Traffic Flow Measurement structure, which consists of a data collection module, a data pre-processing module, and a classification model creation module. Our experimental results show that the proposed classification model infers people's anxiety emotions with an accuracy of 65.18%. Moreover, the model combined with the proposed pre-processing techniques improves the accuracy to 78.77%. Therefore, we conclude that the proposed classification model with the pre-processing process improves classification accuracy at lower computational complexity.
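
The abstract names two pre-processing steps, normalization and data deletion, ahead of SVM training. A minimal sketch of what those two steps could look like; the outlier threshold `k` and the choice of min-max scaling are illustrative assumptions, not details from the paper:

```python
def minmax_normalize(rows):
    """Scale every feature column into [0, 1] so biometric signals with
    different ranges become comparable."""
    cols = list(zip(*rows))
    lo = [min(c) for c in cols]
    hi = [max(c) for c in cols]
    return [[(v - l) / (h - l) if h > l else 0.0
             for v, l, h in zip(row, lo, hi)]
            for row in rows]

def drop_outliers(rows, labels, k=3.0):
    """Data deletion: drop samples whose first feature lies more than k
    standard deviations from the mean (k is an assumed threshold)."""
    xs = [r[0] for r in rows]
    mean = sum(xs) / len(xs)
    std = (sum((x - mean) ** 2 for x in xs) / len(xs)) ** 0.5
    keep = [i for i, x in enumerate(xs) if std == 0 or abs(x - mean) <= k * std]
    return [rows[i] for i in keep], [labels[i] for i in keep]
```

The cleaned rows would then be fed to any off-the-shelf SVM trainer.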

Task Complexity of Movement Skills for Robots (로봇 운동솜씨의 작업 복잡도)

  • Kwon, Woo-Young;Suh, Il-Hong;Lee, Jun-Goo;You, Bum-Jae;Oh, Sang-Rok
    • The Journal of Korea Robotics Society / v.7 no.3 / pp.194-204 / 2012
  • Measuring the task complexity of a movement skill is an important factor in evaluating how difficult a task is for an autonomous robot to learn or imitate. Although many complexity measures have been proposed in research areas such as neuroscience, physics, computer science, and biology, little attention has been paid to robotic tasks. To measure the complexity of a robotic task, we propose an information-theoretic measure for the task complexity of movement skills. By modeling proprioceptive as well as exteroceptive sensor data as a multivariate Gaussian distribution, the movements of a task can be described probabilistically. Additionally, the complexity of temporal variations is modeled by sampling in time and treating the samples as individual random variables. To evaluate the proposed complexity measure, several experiments are performed on real robotic movement tasks.
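
An information-theoretic measure over Gaussian-modeled sensor data typically reduces to the differential entropy of the fitted Gaussian. A sketch for two sensor channels (the paper's actual measure may differ; the "higher entropy = more complex movement" reading is an assumption):

```python
import math

def gaussian_entropy(samples):
    """Differential entropy 0.5*ln((2*pi*e)^d * det(Sigma)) of 2-D sensor
    samples modeled as a multivariate Gaussian (d = 2 for brevity)."""
    n = len(samples)
    mx = sum(x for x, _ in samples) / n
    my = sum(y for _, y in samples) / n
    sxx = sum((x - mx) ** 2 for x, _ in samples) / n
    syy = sum((y - my) ** 2 for _, y in samples) / n
    sxy = sum((x - mx) * (y - my) for x, y in samples) / n
    det = sxx * syy - sxy * sxy  # determinant of the 2x2 covariance
    return 0.5 * math.log((2 * math.pi * math.e) ** 2 * det)
```

Movements with a wider spread of sensor readings yield a larger entropy, i.e. a higher complexity score under this sketch.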

A Study on Applying Amphibious Warfare Using EINSTein Model Based on Complexity Theory (복잡계이론 기반하 EINSTein 모형을 이용한 상륙전 적용에 관한 연구)

  • Lee, Sang-Heon
    • Journal of the military operations research society of Korea / v.32 no.2 / pp.114-130 / 2006
  • This paper applies complexity theory to describe amphibious warfare situations using the EINSTein (Enhanced ISAAC Neural Simulation Tool) simulation model. EINSTein is an agent-based artificial "laboratory" for exploring self-organized emergent behavior in land combat. Many studies have shown that the Lanchester equations used in most war simulation models do not capture the changing dynamics of combat. Future warfare will be information warfare fought with diverse weapon systems and complex combat units. We compared combat results from Lanchester models with those from the EINSTein model. Furthermore, the EINSTein model has been applied to amphibious warfare scenarios such as amphibious assault and amphibious surprise attack. The results show that the EINSTein model can describe amphibious warfare more properly than Lanchester models.
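
As background on the baseline the paper argues against, a minimal Euler-integration sketch of the classical Lanchester square law (aimed fire), with purely illustrative effectiveness coefficients; the agent-based EINSTein model replaces exactly this kind of aggregate ODE with individual agents:

```python
def lanchester_square(r, b, r_eff=0.01, b_eff=0.01, dt=0.01):
    """Integrate dR/dt = -b_eff*B, dB/dt = -r_eff*R (Lanchester square law)
    until one side is annihilated; returns (winner, survivors)."""
    while r > 0 and b > 0:
        r, b = r - b_eff * b * dt, b - r_eff * r * dt
    return ("R", max(r, 0.0)) if r > 0 else ("B", max(b, 0.0))
```

With equal effectiveness, the square law predicts the larger force wins with roughly sqrt(R0^2 - B0^2) survivors, a smooth attrition curve with none of the emergent behavior EINSTein is built to expose.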

A Study of Risk Analysis Model on Web Software (웹 소프트웨어의 위험분석 모델에 관한 연구)

  • Kim, Jee-Hyun;Oh, Sung-Kyun
    • Journal of the Korea Society of Computer and Information / v.11 no.3 / pp.281-289 / 2006
  • Even though the software development environment has been shifting to the Web very fast, there are only a few studies on quality metrics or estimation models for Web software. In this study, after analyzing the correlation between risk level and object properties using linear regression, six mid-sized industrial systems were used to build correlation models between size and Number of Classes (NOC), size and Number of Methods (NOM), complexity and NOC, and complexity and NOM. Among the six systems, five (all except S06) show a high correlation between size (LOC) and NOM, and four (all except S04 and S06) show a high correlation between complexity and NOC / NOM. Since Web software architecture has three sides (server, client, and HTML), the complexity of each side was compared: two systems (S04, S06) show large differences between the complexity values of the sides, and one system (S06) has a much higher HTML complexity. Therefore, when a system shows no large differences between the complexities of its sides, its risk level can be estimated through NOM to improve maintenance.
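
The correlations between size (LOC) and NOM, and between complexity and NOC / NOM, are ordinary Pearson coefficients. A small self-contained sketch (the sample data below is invented for illustration, not the paper's measurements):

```python
def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5
```

Values near +1 indicate the strong size-to-NOM relationship the study reports for five of the six systems.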

Investigation into Longitudinal Writing Development Using Linear Mixed Effects Model (선형 혼합 모형을 통해 살펴본 쓰기 능력의 장기적인 발전 양상 탐색)

  • Lee, Young-Ju
    • The Journal of the Convergence on Culture Technology / v.8 no.2 / pp.315-319 / 2022
  • This study investigates longitudinal writing development in terms of syntactic complexity using a linear mixed effects (LME) model. It employs essays written by four case study participants, who voluntarily wrote essays outside of the classroom and submitted first and second drafts, after reflecting on automated writing evaluation feedback (i.e., Criterion), every month over one year. A total of 48 first drafts were analyzed, and syntactic complexity features were selected from the Syntactic Complexity Analyzer. Results of the LME model showed a significant positive linear relationship between time and mean length of T-unit, and also between time and the ratio of dependent clauses to independent clauses, indicating that the case study participants wrote longer T-units and a higher proportion of dependent clauses over the year.
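
The growth model behind such an LME analysis is, for writer i at month t: y(i,t) = b0 + b1*t + u_i + noise, where u_i is a per-writer random intercept. As a rough self-contained sketch, the fixed slope b1 can be approximated by averaging per-participant OLS slopes; this is a simplification for illustration, not the paper's actual estimator, and the panel data below is invented:

```python
def ols_slope(ts, ys):
    """Least-squares slope of ys against ts for one participant."""
    n = len(ts)
    mt, my = sum(ts) / n, sum(ys) / n
    return sum((t - mt) * (y - my) for t, y in zip(ts, ys)) / \
           sum((t - mt) ** 2 for t in ts)

def mean_growth(panel):
    """panel: {participant: [(month, mean_T_unit_length), ...]};
    returns the average per-participant growth per month."""
    slopes = [ols_slope([t for t, _ in obs], [y for _, y in obs])
              for obs in panel.values()]
    return sum(slopes) / len(slopes)
```

A proper mixed-effects fit (e.g. with a statistics package) would additionally pool information across writers when estimating the shared slope.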

A Fuzzy Model for Assessing IT Governance Complexity (IT 거버넌스 복잡성 평가를 위한 퍼지 모델)

  • Lee, Sang-Hyun;Lee, Sang-Joon;Moon, Kyung-Il;Cho, Sung-Eui
    • Journal of Digital Convergence / v.7 no.4 / pp.169-180 / 2009
  • IT governance implies a system in which all stakeholders within a given organization, including the board, internal customers, and related areas such as finance, provide the necessary input to the decision-making process. However, the concepts of IT governance are broad and ambiguous, so IT governance ultimately requires multi-criteria decision making. This paper presents a hierarchical structure to better understand the relationship between the control structure and the complexity of collective behavior in IT governance, and proposes a corresponding fuzzy model for analyzing IT governance complexity based on an extensive literature review. The results of this study are expected to provide a clearer understanding of how the concerns of IT governance behave, interact, and form the collective behavior of the entire system.
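
A fuzzy model of this kind scores vague, multi-criteria inputs through membership functions and rules. A minimal sketch with one rule; the membership shapes, scales, and the rule itself are illustrative assumptions, not taken from the paper:

```python
def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def governance_complexity_high(control, diversity):
    """Firing degree of the rule 'IF control-structure complexity is high
    AND stakeholder diversity is high THEN governance complexity is high',
    using min as the fuzzy AND. Inputs are on an assumed 0-15 scale."""
    high = lambda x: tri(x, 5.0, 10.0, 15.0)
    return min(high(control), high(diversity))
```

A full model would aggregate several such rules and defuzzify the result into a single complexity score.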

A Study on the Complexity Measurement of Architecture Assets (아키텍처 자산의 복잡도 측정에 관한 연구)

  • Choi, Han-Yong
    • Journal of Convergence for Information Technology / v.7 no.5 / pp.111-116 / 2017
  • In this paper, we propose a method for measuring the complexity of assets when a software component is constructed as a basic asset, a standardized design model is acquired, and a reusable extended asset is designed based on that model. However, each asset in our proposed asset management system is a composite asset combining assets from two domains, so a simple measure cannot produce accurate results. Therefore, the complexity of the overall asset is measured by reflecting the property values of the basic assets stored under the architecture. In conclusion, it is possible to measure the composite complexity of a composed asset as a quantity inversely proportional to cohesion and proportional to the cumulative sum of the association values of each asset in the asset-related design.
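
The stated relationship (proportional to the cumulative sum of association values, inversely proportional to cohesion) can be sketched directly; the exact scaling constant is an assumption, since the abstract gives only the proportionalities:

```python
def composite_complexity(associations, cohesion):
    """Composite complexity of a composed asset: cumulative sum of the
    per-asset association values divided by cohesion (cohesion > 0)."""
    assert cohesion > 0, "cohesion must be positive"
    return sum(associations) / cohesion
```

Under this sketch, raising cohesion lowers the measured complexity while adding associated assets raises it, matching the abstract's description.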

Numerical Experiments for the Stress-Reducing Preventive Maintenance Model (수치실험을 통한 스트레스 감소 예방보수모형의 고찰)

  • Park, Jong Hun
    • Journal of Korean Society of Industrial and Systems Engineering / v.43 no.3 / pp.41-52 / 2020
  • This paper investigates the stress-reducing preventive maintenance model through numerical experiments. Preventive maintenance models are used to analyze the relationship between operating conditions and decision variables, to gain insight into the efficient operation of a system when performing preventive maintenance in real-world situations. Various preventive maintenance models have been developed over the past decades, and their complexity has increased in recent years. Increasing complexity is essential to reflect reality, but recent models can only be interpreted through numerical experiments. Stress-reducing preventive maintenance is a newly introduced concept that can only be analyzed numerically due to its complexity, and it has received little attention because the concept is unfamiliar. Therefore, for information purposes, this paper investigates the characteristics of stress-reducing preventive maintenance and the relationship between its parameters and variables through numerical experiments. In particular, this paper focuses on the economic feasibility of stress-reducing preventive maintenance by observing changes in the optimal preventive maintenance period in response to changes in environmental stress and the improvement factor. As a result, when either the environmental stress or the improvement effect of stress-reducing preventive maintenance is low, it is not necessary to carry out stress-reducing preventive maintenance at excessive cost. In addition, the age reduction model was found to be more economical than the failure rate reduction model.
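
The core of such a numerical experiment is minimizing an expected cost rate over the preventive maintenance period T. A generic sketch with a power-law failure intensity; all cost and intensity parameters here are illustrative, and this is the standard periodic-PM cost structure, not the paper's specific stress-reducing formulation:

```python
def cost_rate(T, c_pm=50.0, c_f=500.0, a=0.002, b=2.0):
    """Expected cost per unit time with PM every T time units: PM cost c_pm
    plus failure cost c_f times the expected failures a*T**b, all over T."""
    return (c_pm + c_f * a * T ** b) / T

def best_period(grid):
    """Grid search for the period minimizing the cost rate."""
    return min(grid, key=cost_rate)
```

Re-running the grid search while varying the stress and improvement parameters is exactly the kind of sensitivity experiment the paper describes: when failures are cheap or rare, the optimum drifts toward long periods, i.e. little preventive maintenance.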

An XPDL-Based Workflow Control-Structure and Data-Sequence Analyzer

  • Kim, Kwanghoon Pio
    • KSII Transactions on Internet and Information Systems (TIIS) / v.13 no.3 / pp.1702-1721 / 2019
  • A workflow (or business) process management system helps to define, execute, monitor, and manage workflow models deployed in a workflow-supported enterprise; in general, the system is compartmentalized into a modeling subsystem and an enacting subsystem. The modeling subsystem's functionality is to discover and analyze workflow models via a theoretical modeling methodology such as ICN, to define them graphically via a notation such as BPMN, and to deploy the graphically defined models onto the enacting subsystem by transforming them into textual models represented in a standardized workflow process definition language such as XPDL. Before deploying the defined workflow models, it is very important to inspect their syntactical correctness as well as their structural properness, to minimize the loss of effectiveness and efficiency in managing them. In this paper, we are particularly interested in verifying very large-scale and massively parallel workflow models, which call for a sophisticated analyzer that can automatically handle such specialized and complex workflow models. The analyzer devised in this paper analyzes not only structural complexity but also data-sequence complexity. Structural complexity is based on combinational usages of control-structure constructs, such as subprocess, exclusive-OR, parallel-AND, and iterative-LOOP primitives, with preserved matched-pairing and proper-nesting properties, whereas data-sequence complexity is based on combinational usages of the relevant data repositories, such as data definition sequences and data use sequences. Through the devised and implemented analyzer, we achieve systematic verification of syntactical correctness as well as effective validation of structural properness for such complicated, large-scale workflow models. As an experimental study, we apply the implemented analyzer to an exemplary large-scale and massively parallel workflow process model, the Large Bank Transaction Workflow Process Model, and present the structural complexity analysis results via a series of operational screens captured from the implemented analyzer.
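
The "matched pairing and proper nesting" property the analyzer checks is the classic balanced-delimiter condition over split/join constructs, verifiable with a stack. A minimal sketch over a flat token stream; the token names are illustrative, not actual XPDL element names:

```python
# Each opening construct must be closed by its own matching join,
# in properly nested (last-opened, first-closed) order.
OPEN = {"and_split": "and_join",
        "xor_split": "xor_join",
        "loop_begin": "loop_end"}

def properly_nested(tokens):
    """Return True iff every split/begin is matched by its own join/end
    in properly nested order."""
    stack = []
    for t in tokens:
        if t in OPEN:
            stack.append(OPEN[t])        # remember which join we now expect
        elif t in OPEN.values():
            if not stack or stack.pop() != t:
                return False             # mismatched or unopened join
    return not stack                     # no construct left open
```

A real XPDL analyzer walks transitions and activities rather than a token list, but the nesting discipline it enforces is the same.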

A Study on Power Plant Modeling for Control System Design

  • Kim, Tae-Shin;Kwon, Oh-Kyu
    • Institute of Control, Robotics and Systems (ICROS) Conference Proceedings / 2003.10a / pp.1449-1454 / 2003
  • For many industrial processes there are good static models used for process design and steady-state operation. Using system identification techniques, it is possible to obtain black-box models of reasonable complexity that describe the system well under specific operating conditions [1]. However, black-box models obtained by inductive modeling (IM) are not suitable for model-based control because they are only valid under those specific operating conditions. Thus we need to use deductive modeling (DM) for a wide operating range. Furthermore, deductive modeling has several merits: first, the model can be modularized; second, the model complexity can be increased or decreased as needed; finally, the model can be used for plant design. A power plant must operate well under dramatic load changes while maintaining safety and efficiency. This paper proposes a simplified nonlinear model of an industrial boiler, one component of a power plant, using the DM method, and applies optimal control to the model.
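
The abstract does not give the boiler model itself; purely as an illustration of the kind of simplified nonlinear deductive model described, here is a toy drum-pressure energy balance in which fuel adds energy and steam escapes at a rate proportional to the square root of pressure. All constants are assumptions, not identified plant parameters:

```python
import math

def boiler_step(p, u_fuel, u_valve, dt=1.0, c=100.0, k_in=2.0, k_out=0.5):
    """One Euler step of dp/dt = (k_in*u_fuel - k_out*u_valve*sqrt(p)) / c:
    fuel input raises drum pressure, the steam valve releases it."""
    dp = (k_in * u_fuel - k_out * u_valve * math.sqrt(max(p, 0.0))) / c
    return p + dt * dp
```

Because the structure is physical rather than black-box, each term can be modularized or refined independently, which is exactly the DM advantage the abstract claims.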
