• Title/Summary/Keyword: data complexity

Determining the complexity level of proceduralized tasks in a digitalized main control room using the TACOM measure

  • Inseok Jang; Jinkyun Park
    • Nuclear Engineering and Technology / v.54 no.11 / pp.4170-4180 / 2022
  • The task complexity (TACOM) measure was previously developed to quantify the complexity of proceduralized tasks conducted by nuclear power plant operators. Following its development, the appropriateness of the TACOM measure has been validated by investigating the relationship between TACOM scores and three kinds of human performance data: response times, human error probabilities, and subjective workload scores. However, the information reflected in quantified TACOM scores is still insufficient to determine the complexity levels of proceduralized tasks for human reliability analysis (HRA) applications. The objective of this study is therefore to suggest criteria for determining levels of task complexity based on a logistic regression between human error occurrences in digitalized main control rooms and TACOM scores. The analysis confirmed that the likelihood of human error occurrence can be estimated from the TACOM score. This result strongly implies that the TACOM measure can be used to identify levels of task complexity, which could be applicable to various research domains including HRA.
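
As an illustration of the regression step described above, here is a minimal sketch that fits a logistic model mapping TACOM scores to human error probability. The scores, error records, and threshold below are synthetic placeholders, not the paper's data.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical records: TACOM score of each performed task and whether
# a human error was observed (1) or not (0). Real values would come
# from digitalized MCR operation records, which are not reproduced here.
rng = np.random.default_rng(0)
tacom = rng.uniform(1.0, 7.0, size=200).reshape(-1, 1)
p_true = 1.0 / (1.0 + np.exp(-(1.2 * tacom.ravel() - 5.0)))
errors = rng.binomial(1, p_true)

model = LogisticRegression().fit(tacom, errors)

def error_probability(score: float) -> float:
    """Fitted probability of a human error at a given TACOM score."""
    return model.predict_proba([[score]])[0, 1]

# Complexity levels could then be defined by cutoffs on this curve,
# e.g. "high complexity" where the fitted error probability exceeds
# an (assumed) threshold such as 0.5.
for s in (2.0, 4.0, 6.0):
    print(f"TACOM={s:.1f} -> P(error)={error_probability(s):.2f}")
```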

Task Complexity of Movement Skills for Robots (로봇 운동솜씨의 작업 복잡도)

  • Kwon, Woo-Young; Suh, Il-Hong; Lee, Jun-Goo; You, Bum-Jae; Oh, Sang-Rok
    • The Journal of Korea Robotics Society / v.7 no.3 / pp.194-204 / 2012
  • Measuring the task complexity of a movement skill is important for evaluating how difficult that task is for an autonomous robot to learn or imitate. Although many complexity measures have been proposed in research areas such as neuroscience, physics, computer science, and biology, little attention has been paid to robotic tasks. To measure the complexity of a robotic task, we propose an information-theoretic measure for the task complexity of movement skills. By modeling proprioceptive as well as exteroceptive sensor data as multivariate Gaussian distributions, the movements of a task can be described with a probabilistic model. Additionally, the complexity of temporal variations is captured by sampling in time and modeling each sample as an individual random variable. To evaluate the proposed complexity measure, several experiments are performed on real robotic movement tasks.
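
The entropy-based construction described above can be sketched in a few lines. The following illustration fits a multivariate Gaussian to hypothetical sensor recordings at each sampled time step and sums the differential entropies; the actual measure in the paper may combine its terms differently.

```python
import numpy as np

def gaussian_entropy(samples: np.ndarray) -> float:
    """Differential entropy (nats) of a multivariate Gaussian fitted to
    the rows of `samples`: h = 0.5 * log((2*pi*e)^d * det(Sigma))."""
    d = samples.shape[1]
    cov = np.atleast_2d(np.cov(samples, rowvar=False))
    sign, logdet = np.linalg.slogdet(cov)  # safer than log(det(cov))
    return 0.5 * (d * np.log(2 * np.pi * np.e) + logdet)

# Hypothetical task: 4 joint angles (proprioceptive data) recorded over
# 30 repeated demonstrations, each with 50 time samples.
rng = np.random.default_rng(1)
demos = rng.normal(0.0, 1.0, size=(30, 50, 4))

# Treat each sampled time step as its own random variable and sum the
# per-step entropies as a crude temporal-variation complexity score.
complexity = sum(gaussian_entropy(demos[:, t, :]) for t in range(demos.shape[1]))
print(f"task complexity (summed entropy): {complexity:.1f} nats")
```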

A Study on the Effects of Programming Languages on Software Complexity: Comparison of FORTRAN IV vs. FORTRAN 77 and PASCAL vs. C (프로그래밍 언어가 소프트웨어 복잡도에 미치는 영향에 관한 연구 : FORTRAN IV와 FORTRAN 77, PASCAL과 C의 비교)

  • Yoon, Jung-Mo
    • Journal of Korean Institute of Industrial Engineers / v.19 no.3 / pp.59-70 / 1993
  • This paper presents the results of experiments comparing software complexity between programming languages, i.e., FORTRAN IV vs. FORTRAN 77 and PASCAL vs. C. Each experiment compares the complexity of programs solving the same problems, using Halstead's operator-based method and McCabe's control-flow-based method. Across 25 test programs, FORTRAN 77 proves superior to FORTRAN IV, and C to PASCAL, in terms of global software complexity.
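
For readers unfamiliar with Halstead's method, a minimal sketch follows. It computes the standard Halstead vocabulary, length, volume, difficulty, and effort from pre-extracted operator and operand token streams; the tokenizer itself, which is language-specific, is assumed.

```python
import math

def halstead(operators: list[str], operands: list[str]) -> dict:
    """Halstead metrics from token streams. Extracting the tokens is
    language-specific and assumed to have been done already."""
    n1, n2 = len(set(operators)), len(set(operands))   # distinct counts
    N1, N2 = len(operators), len(operands)             # total counts
    n, N = n1 + n2, N1 + N2                            # vocabulary, length
    volume = N * math.log2(n)
    difficulty = (n1 / 2) * (N2 / n2)
    return {"vocabulary": n, "length": N, "volume": volume,
            "difficulty": difficulty, "effort": difficulty * volume}

# Tokens of the toy statement: x = a + b * a
print(halstead(operators=["=", "+", "*"], operands=["x", "a", "b", "a"]))
```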

Improved Impossible Differential Attack on 7-round Reduced ARIA-256

  • Shen, Xuan; He, Jun
    • KSII Transactions on Internet and Information Systems (TIIS) / v.13 no.11 / pp.5773-5784 / 2019
  • ARIA is an involutory SPN block cipher with a 128-bit block size and master key sizes of 128/192/256 bits, called ARIA-128/192/256 accordingly. ARIA is a Korean standard block cipher. This paper focuses on the security of ARIA against impossible differential attack. We first construct a new 4-round impossible differential of ARIA. Based on this impossible differential, a new 7-round impossible differential attack on ARIA-256 is then proposed. The attack needs $2^{118}$ chosen plaintexts and $2^{210}$ 7-round encryptions. Compared with the previous best result, we improve both the data complexity and the time complexity. To our knowledge, this is the best impossible differential attack on ARIA-256 so far.
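
Attack costs like those above are conventionally quoted as powers of two, so improvements are compared in log2 units. A small illustration follows; the "previous" figures are placeholders, not the actual prior result the paper compares against.

```python
# Attack costs in log2 units. The figures for this paper come from the
# abstract; the "prev" figures are invented for illustration only.
new = {"data": 118.0, "time": 210.0}   # 2^118 CP, 2^210 encryptions
prev = {"data": 120.0, "time": 219.0}  # hypothetical earlier attack

for metric in ("data", "time"):
    gain = prev[metric] - new[metric]
    print(f"{metric}: 2^{new[metric]:.0f} (improved by a factor of 2^{gain:.0f})")
```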

Related-key Impossible Boomerang Cryptanalysis on LBlock-s

  • Xie, Min; Zeng, Qiya
    • KSII Transactions on Internet and Information Systems (TIIS) / v.13 no.11 / pp.5717-5730 / 2019
  • LBlock-s is the core block cipher of the authenticated encryption algorithm LAC; it uses the same structure as LBlock with an improved key schedule algorithm that has better diffusion. Using the differential properties of the key schedule algorithm and a cryptanalytic technique that combines impossible boomerang attacks with related-key attacks, a 15-round related-key impossible boomerang distinguisher is constructed for the first time. Based on this distinguisher, an attack on 22-round LBlock-s is proposed by adding 4 rounds on the top and 3 rounds at the bottom. The time complexity is only about $2^{68.76}$ 22-round encryptions and the data complexity is about $2^{58}$ chosen plaintexts. Compared with published cryptanalysis results on LBlock-s, the time complexity decreases sharply while the data complexity remains ideal.

The Cognitive Complexity of Clothing Attributes -Focused on Clothing Involvement- (의류 제품의 속성에 대한 인지적 복잡성에 관한 연구 -의복 관여를 중심으로-)

  • Park, Sung-Eun
    • Journal of the Korean Society of Clothing and Textiles / v.30 no.4 s.152 / pp.497-506 / 2006
  • The purpose of this study is to clarify the cognitive complexity of clothing attributes, which influences preference and purchase intentions. The subjects were 434 female college students, and a formal survey methodology was used to collect data. The data were analyzed with SAS using factor analysis, cluster analysis, conjoint analysis, and LISREL path analysis. The results were as follows: 1) Clothing involvement consists of an affective factor and a cognitive factor. 2) Consumers were divided into three groups according to the degree of their clothing involvement. 3) Significant differences were found in the cognitive complexity of clothing attributes among these groups.
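
The cluster-analysis step, dividing respondents into three involvement groups, can be sketched as follows. The respondent scores here are synthetic stand-ins for the survey's affective and cognitive involvement factors.

```python
import numpy as np
from sklearn.cluster import KMeans

# Hypothetical factor scores for 434 respondents on two clothing
# involvement factors (affective, cognitive); the study's actual
# survey data is not public, so these are synthetic stand-ins.
rng = np.random.default_rng(2)
scores = np.vstack([
    rng.normal([1.0, 1.0], 0.4, size=(150, 2)),    # high involvement
    rng.normal([0.0, 0.0], 0.4, size=(150, 2)),    # moderate involvement
    rng.normal([-1.0, -1.0], 0.4, size=(134, 2)),  # low involvement
])

# Partition respondents into three involvement groups, mirroring the
# paper's cluster-analysis step (k=3 per the reported result).
groups = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(scores)
print(np.bincount(groups))  # sizes of the three groups
```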

Learning Behavior Analysis of Bayesian Algorithm Under Class Imbalance Problems (클래스 불균형 문제에서 베이지안 알고리즘의 학습 행위 분석)

  • Hwang, Doo-Sung
    • Journal of the Institute of Electronics Engineers of Korea CI / v.45 no.6 / pp.179-186 / 2008
  • In this paper, we analyze the behavior of the Bayesian algorithm in learning class imbalance problems and compare performance evaluation methods. The learning performance of the Bayesian algorithm is evaluated on class imbalance problems generated by varying the prior data distribution, the imbalance rate, and the discrimination complexity. The experimental results are reported as AUC (Area Under the Curve) values of both ROC (Receiver Operating Characteristic) and PR (Precision-Recall) measures and compared according to imbalance rate and discrimination complexity. In this comparison, the Bayesian algorithm suffers from the imbalance rate, consistent with previously reported research, and the data overlap caused by discrimination complexity is another factor that hampers learning performance. As the discrimination complexity and class imbalance rate increase, the AUC of the PR measure varies much more than the AUC of the ROC measure, while the two measures behave similarly when discrimination complexity and class imbalance rate are low. The experimental results show that the AUC of the PR measure is more appropriate for evaluating learning on class imbalance problems and is furthermore beneficial in designing an optimal learning model that takes misclassification cost into account.
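
The paper's central comparison, ROC AUC versus PR AUC under class imbalance, is easy to reproduce in outline. The following sketch trains a Gaussian naive Bayes classifier on a synthetic imbalanced problem; the class_sep parameter serves here as a rough stand-in for the paper's discrimination complexity.

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import train_test_split
from sklearn.naive_bayes import GaussianNB
from sklearn.metrics import roc_auc_score, average_precision_score

# Synthetic imbalanced problem with adjustable class overlap
# (lower class_sep -> more overlap -> harder discrimination).
X, y = make_classification(n_samples=5000, n_features=10,
                           weights=[0.95, 0.05], class_sep=0.5,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

probs = GaussianNB().fit(X_tr, y_tr).predict_proba(X_te)[:, 1]

# ROC AUC tends to stay optimistic under heavy imbalance, while
# PR AUC (average precision) is more sensitive to minority-class errors.
print(f"ROC AUC: {roc_auc_score(y_te, probs):.3f}")
print(f"PR  AUC: {average_precision_score(y_te, probs):.3f}")
```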

Complexity Analysis of the Viking Labeled Release Experiments

  • Bianciardi, Giorgio; Miller, Joseph D.; Straat, Patricia Ann; Levin, Gilbert V.
    • International Journal of Aeronautical and Space Sciences / v.13 no.1 / pp.14-26 / 2012
  • The only extraterrestrial life detection experiments ever conducted were the three carried aboard the 1976 Viking Mission to Mars. Of these, only the Labeled Release (LR) experiment obtained a clearly positive response. In this experiment, $^{14}C$-radiolabeled nutrient was added to the Mars soil samples. Active soils exhibited rapid, substantial gas release. The gas was probably $CO_2$ and, possibly, other radiocarbon-containing gases. We have applied complexity analysis to the Viking LR data. Measures of mathematical complexity permit deep analysis of data structure along continua including signal vs. noise, entropy vs. negentropy, periodicity vs. aperiodicity, and order vs. disorder. We employed seven complexity variables, all derived from the LR data, to show that Viking LR active responses can be distinguished from controls via cluster analysis and other multivariate techniques. Furthermore, the Martian LR active response data cluster with known biological time series, while the control data cluster with purely physical measures. We conclude that the complexity pattern seen in the active experiments strongly suggests biology, while the different pattern in the control responses is more likely non-biological. Control responses that exhibit relatively low initial order rapidly devolve into near-random noise, while the active experiments exhibit higher initial order that decays only slowly. This suggests a robust biological response. These analyses support the interpretation that the Viking LR experiment did detect extant microbial life on Mars.
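
As a flavor of the kind of complexity variable used in such analyses, the sketch below computes a normalized spectral entropy for an ordered, slowly decaying series and for pure noise. This is only one plausible measure; the paper's seven variables are not reproduced here, and the time series are synthetic.

```python
import numpy as np

def spectral_entropy(x: np.ndarray) -> float:
    """Normalized Shannon entropy of the power spectrum: close to 1
    for noise-like signals, lower for ordered ones."""
    psd = np.abs(np.fft.rfft(x - x.mean())) ** 2
    p = psd / psd.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum() / np.log(len(psd)))

rng = np.random.default_rng(3)
t = np.linspace(0, 10, 1000)
active = 1 - np.exp(-0.5 * t) + 0.02 * rng.standard_normal(t.size)  # ordered
control = rng.standard_normal(t.size)                               # noise-like

print(f"active:  {spectral_entropy(active):.3f}")   # lower = more order
print(f"control: {spectral_entropy(control):.3f}")  # near 1 = noise-like
```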

Cost Driver Analysis in General Hospitals Using Simultaneous Equation Model and Path Model (연립방정식모형과 경로모형을 이용한 종합병원의 원가동인 분석)

  • 양동현; 이원식
    • Health Policy and Management / v.14 no.1 / pp.89-120 / 2004
  • The purpose of this empirical study is to test hypotheses that identify the cost drivers behind indirect costs in Korean general hospitals. Various case studies have suggested that overhead costs are driven by volume and complexity variables; this study examines how those variables are structurally related and how their cost impacts can be estimated in practice. A unique feature of the research is the treatment of complexity as an endogenous variable. It is hypothesized that the level of hospital complexity, in terms of the number of services provided (i.e., "breadth" complexity) and the intensity of individual overhead services (i.e., "depth" complexity), is determined simultaneously with the level of costs needed to support that complexity. The data were obtained from the databases of the Korea Health Industry Development Institute and the Health Insurance Review Agency and analyzed using a simultaneous equation model and a path model. The results show that the volume and complexity variables are all statistically significant drivers of general hospital overhead costs. Although the study documents that the level of service complexity is a significant determinant of hospital overhead costs, caution should be exercised in interpreting this as support for the cost accounting procedures associated with ABC.
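
Treating complexity as endogenous leads naturally to estimation methods such as two-stage least squares. The sketch below shows the 2SLS mechanics on a hypothetical two-equation system; the variables, instrument, and coefficients are invented for illustration and are not the paper's model.

```python
import numpy as np

def ols(X: np.ndarray, y: np.ndarray) -> np.ndarray:
    """Least-squares coefficients of y on the columns of X."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Hypothetical two-equation system in the spirit of the paper:
#   complexity = f(volume, instrument) + u     (endogenous regressor)
#   overhead   = b0 + b1*complexity + b2*volume + e
rng = np.random.default_rng(4)
n = 500
volume = rng.normal(size=n)                    # exogenous cost driver
instrument = rng.normal(size=n)                # assumed valid instrument
complexity = 0.8 * volume + 0.5 * instrument + rng.normal(scale=0.3, size=n)
overhead = 1.0 + 2.0 * complexity + 0.7 * volume + rng.normal(scale=0.3, size=n)

ones = np.ones(n)
# Stage 1: project the endogenous regressor onto all exogenous variables.
Z = np.column_stack([ones, volume, instrument])
complexity_hat = Z @ ols(Z, complexity)
# Stage 2: regress overhead on the fitted values from stage 1.
X2 = np.column_stack([ones, complexity_hat, volume])
print(ols(X2, overhead))  # recovers approximately [1.0, 2.0, 0.7]
```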

A High-Speed 2-Parallel Radix-$2^4$ FFT Processor for MB-OFDM UWB Systems (MB-OFDM UWB 통신 시스템을 위한 고속 2-Parallel Radix-$2^4$ FFT 프로세서의 설계)

  • Lee, Jee-Sung; Lee, Han-Ho
    • Proceedings of the IEEK Conference / 2006.06a / pp.533-534 / 2006
  • This paper presents the architecture design of a high-speed, low-complexity 128-point radix-$2^4$ FFT processor for ultra-wideband (UWB) systems. The proposed architecture provides a high throughput rate with low hardware complexity by using a 2-parallel datapath scheme and a single-path delay-feedback (SDF) structure. The key design ideas for achieving a high throughput rate while reducing hardware complexity are described. The proposed FFT processor has been designed and implemented in a 0.18-µm CMOS technology with a 1.8 V supply voltage, and achieves a throughput of up to 1 Gsample/s while requiring much smaller hardware complexity.
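
For orientation, the decimation idea underlying all FFT architectures can be shown with a plain recursive radix-2 transform. This is only a software illustration; the paper's processor uses a pipelined radix-$2^4$ SDF hardware datapath, which this sketch does not model.

```python
import cmath

def fft(x: list[complex]) -> list[complex]:
    """Recursive radix-2 decimation-in-time FFT (length must be a
    power of two). Each stage splits into even/odd halves and
    recombines them with twiddle factors."""
    n = len(x)
    if n == 1:
        return x
    even, odd = fft(x[0::2]), fft(x[1::2])
    twiddled = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + twiddled[k] for k in range(n // 2)] +
            [even[k] - twiddled[k] for k in range(n // 2)])

# 128-point transform, matching the processor's FFT size.
signal = [complex(i % 8, 0) for i in range(128)]
spectrum = fft(signal)
print(abs(spectrum[0]), abs(spectrum[16]))  # energy at DC and bin 16
```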
