• Title/Abstract/Keyword: Information Complexity


The Effect of Occupational Information on the Cognitive Complexity of Adolescents (직업정보제공방식의 차이에 따른 청소년의 직업인지복잡성의 증대효과)

  • Lee, Yok
    • Korean Journal of Child Studies
    • /
    • v.12 no.2
    • /
    • pp.67-77
    • /
    • 1991
  • An investigation of the effect of occupational information on vocational cognitive complexity was conducted with 331 male and female adolescents in ninth grade. There were 2 experimental groups and 1 control group. Experimental group I was given only occupational information sheets (written-form information), while group II was given occupational information through verbal instruction in addition to the occupational information sheets. A modified form of the cognitive complexity grid originally developed by Bodden (1970) was utilized to collect data on the subjects' vocational cognitive complexity. ANOVA and Scheffé tests revealed significant differences between experimental group II and the other groups in vocational cognitive complexity. The cognitive complexity level of experimental group I and the control group for the most aspired occupation was significantly lower than for the least aspired occupation. However, the cognitive complexity level of experimental group II for the most aspired occupation was higher than for the least aspired occupation. The results suggest that merely giving occupational information to adolescents may not be effective; it may be effective only when the method of presenting the information is active enough to induce adolescents' self-confirming cognitive processes.


Developing Visual Complexity Metrics for Automotive Human-Machine Interfaces

  • Kim, Ji Man;Hwangbo, Hwan;Ji, Yong Gu
    • Journal of the Ergonomics Society of Korea
    • /
    • v.34 no.3
    • /
    • pp.235-245
    • /
    • 2015
  • Objective: The purpose of this study is to develop visual complexity metrics on theoretical bases. Background: With the development of IT technologies, drivers process a large amount of information presented by automotive human-machine interfaces (HMIs), such as the cluster, the head-up display, and the center fascia. In other words, these systems are becoming more complex and dynamic than traditional driving systems, and these changes can increase visual demands. Thus, a concept and a tool are required to evaluate such complicated systems. Method: We reviewed prior studies in order to analyze visual complexity. Based on complexity studies and human perceptual characteristics, the dimensions characterizing visual complexity were determined and defined. Results: Based on a framework and the complexity dimensions, a set of metrics for quantifying visual complexity was developed. Conclusion: We suggest metrics for perceived visual complexity that can evaluate in-vehicle displays. Application: This study provides theoretical bases for evaluating complicated systems. In addition, it can quantitatively measure the visual complexity of in-vehicle information systems and inform design aimed at preventing risks such as human error and distraction.
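The abstract does not reproduce the paper's metric formulas, but its three recurring ingredients (amount of content, variety of content, and density) can be illustrated with a toy score. This is a minimal sketch under assumed equal weighting, not the paper's actual metrics:

```python
import math
from collections import Counter

def visual_complexity(elements, display_area):
    """Toy visual-complexity score for a display.

    elements: list of element-type labels (e.g. "gauge", "icon", "text")
    display_area: usable display area in arbitrary units

    Combines size (element count), variety (Shannon entropy of
    element types), and density (elements per unit area).
    """
    n = len(elements)
    if n == 0:
        return 0.0
    counts = Counter(elements)
    entropy = -sum((c / n) * math.log2(c / n) for c in counts.values())
    density = n / display_area
    # Equal weighting is arbitrary; a real metric would calibrate terms.
    return n + entropy + density

cluster = ["gauge", "gauge", "icon", "icon", "icon", "text"]
print(visual_complexity(cluster, display_area=100.0))
```

A denser display with more distinct element types scores higher, matching the intuition that such displays impose greater visual demand.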

A QUALITATIVE METHOD TO ESTIMATE HSI DISPLAY COMPLEXITY

  • Hugo, Jacques;Gertman, David
    • Nuclear Engineering and Technology
    • /
    • v.45 no.2
    • /
    • pp.141-150
    • /
    • 2013
  • There is mounting evidence that complex computer system displays in control rooms contribute to cognitive complexity and, thus, to the probability of human error. Research shows that reaction time increases and response accuracy decreases as the number of elements in the display screen increases. However, in terms of supporting the control room operator, approaches that address display complexity solely in terms of information density and its location and patterning will fall short of delivering a properly designed interface. This paper argues that information complexity and semantic complexity are mandatory components when considering display complexity, and that adding these concepts assists in understanding and resolving differences between designers and the preferences and performance of operators. This paper concludes that a number of simplified methods, when combined, can be used to estimate the impact that a particular display may have on the operator's ability to perform a function accurately and effectively. We present a mixed qualitative and quantitative approach and a method for complexity estimation.
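A qualitative estimate that combines several simplified ratings, as the abstract describes, might be organized like the following sketch. The rating scale, the three components, and the cut-off points are assumptions for illustration, not the paper's method:

```python
# Hypothetical ordinal scale; the paper's actual rating scheme is not
# reproduced in the abstract.
RATING = {"low": 1, "medium": 2, "high": 3}

def estimate_display_complexity(density, information, semantic):
    """Combine qualitative ratings of three complexity components
    (information density, information complexity, semantic complexity)
    into a single ordinal estimate."""
    total = RATING[density] + RATING[information] + RATING[semantic]
    if total <= 4:
        return "low"
    if total <= 7:
        return "medium"
    return "high"

print(estimate_display_complexity("medium", "high", "high"))  # high
```

The point of such a scheme is that no single rating dominates: a display dense in elements but semantically simple can still come out "medium" overall.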

Adaptive Frame Rate Up-Conversion Algorithms using Block Complexity Information

  • Lee, Kangjun
    • Journal of Korea Multimedia Society
    • /
    • v.21 no.8
    • /
    • pp.813-820
    • /
    • 2018
  • This paper proposes new frame rate up-conversion algorithms. Adaptive motion estimation based on block complexity information is used to obtain more accurate motion vectors. Because the block complexity information is extracted from the size of the motion estimation prediction for the original frame, no additional computational complexity is imparted. In experimental results, the proposed algorithms provide robust frame interpolation performance across all test sequences. The computational complexity of the proposed algorithm is also reduced compared to the benchmark algorithm.
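The idea of classifying blocks by the size of their prediction residual, so that motion estimation can adapt its effort, can be sketched minimally. Using the sum of absolute differences (SAD) as the residual measure and a fixed threshold are assumptions here, not details from the paper:

```python
def block_sad(current, reference):
    """Sum of absolute differences between two equal-sized pixel blocks."""
    return sum(abs(c - r) for c, r in zip(current, reference))

def classify_block(current, reference, threshold=64):
    """Label a block 'complex' or 'simple' from its prediction residual,
    so later motion estimation can adapt its search effort per block."""
    return "complex" if block_sad(current, reference) > threshold else "simple"

flat      = [100] * 16                  # flattened 4x4 block
textured  = [100, 20, 180, 60] * 4
reference = [100] * 16
print(classify_block(flat, reference))      # simple
print(classify_block(textured, reference))  # complex
```

Since the residual is already produced during motion estimation, reusing it for classification adds no extra computation, which is the property the abstract emphasizes.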

Identification and Organization of Task Complexity Factors Based on a Model Combining Task Design Aspects and Complexity Dimensions

  • Ham, Dong-Han
    • Journal of the Ergonomics Society of Korea
    • /
    • v.32 no.1
    • /
    • pp.59-68
    • /
    • 2013
  • Objective: The purpose of this paper is to introduce a task complexity model combining task design aspects and complexity dimensions, and to explain an approach to identifying and organizing task complexity factors based on the model. Background: Task complexity is a critical concept in describing and predicting human performance in complex systems such as nuclear power plants (NPPs). In order to understand the nature of task complexity, task complexity factors need to be identified and organized in a systematic manner. Although several methods have been suggested for identifying and organizing task complexity factors, it is rare to find an analytical approach based on a theoretically sound model. Method: This study regarded a task as a system to be designed. Three levels of design abstraction, namely the functional, behavioral, and structural levels of a task, characterize the design aspects of a task. The behavioral aspect is further classified into five cognitive processing activity types (information collection, information analysis, decision and action selection, action implementation, and action feedback). The complexity dimensions describe task complexity from different perspectives, namely size, variety, and order/organization. Combining the design aspects and complexity dimensions of a task, we developed a model from which meaningful task complexity factors can be identified and organized in an analytic way. Results: A model consisting of two facets, concerned respectively with design aspects and complexity dimensions, was proposed. Additionally, twenty-one task complexity factors were identified and organized based on the model. Conclusion: The model and approach introduced in this paper can be effectively used for examining human performance and human-system interface design issues in NPPs. Application: The model and approach introduced in this paper could be used for several human factors problems, including task allocation and design of information aiding, in NPPs, and could be extended to other types of complex systems such as air traffic control systems as well.
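The two-facet structure described above is essentially a cross-product: each cell formed by a design aspect and a complexity dimension is a place to look for concrete factors. A minimal sketch of that enumeration (the cell labels are from the abstract; the paper's twenty-one actual factors are not reproduced here):

```python
from itertools import product

# Facet values as named in the abstract.
design_aspects = ["functional", "behavioral", "structural"]
dimensions = ["size", "variety", "order/organization"]

# Each (aspect, dimension) cell is a slot in which concrete task
# complexity factors are identified and organized.
cells = list(product(design_aspects, dimensions))
for aspect, dim in cells:
    print(f"{aspect} x {dim}")
print(len(cells))  # 9 cells
```

For example, "structural x size" would cover factors about how many structural elements a task has, while "behavioral x variety" would cover factors about how many different cognitive activity types it demands.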

Software Complexity and Management for Real-Time Systems

  • Agarwal Ankur;Pandya A.S.;Lbo Young-Ubg
    • Journal of information and communication convergence engineering
    • /
    • v.4 no.1
    • /
    • pp.23-27
    • /
    • 2006
  • The discipline of software performance is very broad; it influences all aspects of the software development lifecycle, including architecture, design, deployment, integration, management, evolution, and servicing. Thus, the complexity of software is an important aspect of development and maintenance activities. Much research has been dedicated to defining software measures that capture what software complexity is. In most cases, the description of complexity is given to humans in the form of numbers. These quantitative measures reflect human-perceived complexity with varying levels of success. Software complexity growth has been recognized to be beyond human control. In this paper, we focus our discussion on increasing software complexity and the problems faced in managing it. This increasing complexity in turn affects software productivity, which declines as complexity grows.
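One classical example of the quantitative measures the abstract mentions is McCabe's cyclomatic complexity. As an illustration (not a technique from this paper), it can be approximated for Python source by counting branch points in the abstract syntax tree:

```python
import ast

# Node types that add a decision point; a rough approximation of
# McCabe's cyclomatic complexity for Python code.
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.BoolOp,
                ast.Try, ast.ExceptHandler)

def cyclomatic_complexity(source):
    """Return 1 + number of branch points found in the source."""
    tree = ast.parse(source)
    return 1 + sum(isinstance(node, BRANCH_NODES)
                   for node in ast.walk(tree))

code = """
def grade(score):
    if score >= 90:
        return "A"
    if score >= 80:
        return "B"
    return "C"
"""
print(cyclomatic_complexity(code))  # 3
```

Such a number is exactly the kind of "description of complexity given to humans in the form of numbers" the abstract refers to: useful as a signal, but only a partial reflection of perceived complexity.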

A Slice-based Complexity Measure (슬라이스 기반 복잡도 척도)

  • Moon, Yu-Mi;Choi, Wan-Kyoo;Lee, Sung-Joo
    • The KIPS Transactions:PartD
    • /
    • v.8D no.3
    • /
    • pp.257-264
    • /
    • 2001
  • We developed a SIFG (Slice-based Information Graph), which models the information flow in a program on the basis of the flow of data tokens on data slices. We then defined an SCM (Slice-based Complexity Measure), which measures program complexity by measuring the complexity of the information flow on the SIFG. SCM satisfies the necessary properties for complexity measures proposed by Briand et al. Unlike existing measures, SCM can measure not only the control and data flow of a program but also its physical size.

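The general shape of such a measure, scoring a program by the structure of an information-flow graph plus a size term, can be sketched as follows. The SIFG/SCM definitions themselves are in the paper; the node, edge, and weighting choices below are illustrative assumptions only:

```python
# Toy stand-in for a slice-based information graph: nodes are data
# tokens, directed edges are information flows between them.
def flow_complexity(edges, program_size):
    """Score = token count + flow count + physical size term,
    so both flow structure and program size contribute."""
    nodes = {n for edge in edges for n in edge}
    return len(nodes) + len(edges) + program_size

# Illustrative flows for a slice on variable `total` in a summing loop.
edges = [("n", "total"), ("total", "total"), ("total", "print")]
print(flow_complexity(edges, program_size=5))  # 3 + 3 + 5 = 11
```

Including the size term is what distinguishes this style of measure from pure flow-based ones, matching the abstract's claim that SCM also captures physical program size.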

Low-Complexity Triple-Error-Correcting Parallel BCH Decoder

  • Yeon, Jaewoong;Yang, Seung-Jun;Kim, Cheolho;Lee, Hanho
    • JSTS:Journal of Semiconductor Technology and Science
    • /
    • v.13 no.5
    • /
    • pp.465-472
    • /
    • 2013
  • This paper presents a low-complexity triple-error-correcting parallel Bose-Chaudhuri-Hocquenghem (BCH) decoder architecture and its efficient design techniques. A novel modified step-by-step (m-SBS) decoding algorithm, which significantly reduces computational complexity, is proposed for the parallel BCH decoder. In addition, a determinant calculator and an error locator are proposed to reduce hardware complexity. Specifically, a shared syndrome factor calculator and a self-error detection scheme are proposed. The multi-channel multi-parallel BCH decoder using the proposed m-SBS algorithm and design techniques has considerably less hardware complexity and latency than those using conventional algorithms. For a 16-channel 4-parallel (1020, 990) BCH decoder over GF($2^{12}$), the proposed design can reduce complexity by at least 23% compared to conventional architectures.
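The first stage of any step-by-step BCH decoder is syndrome computation over a finite field. As a small illustration, here is syndrome evaluation for a triple-error-correcting code over GF($2^4$); the field and code parameters are deliberately tiny, not the paper's (1020, 990) code over GF($2^{12}$):

```python
# GF(2^4) antilog table built from the primitive polynomial x^4 + x + 1.
PRIM = 0b10011
EXP = [0] * 15
x = 1
for i in range(15):
    EXP[i] = x
    x <<= 1
    if x & 0b10000:
        x ^= PRIM  # reduce modulo the primitive polynomial

def syndromes(received_bits, t=3):
    """S_j = r(alpha^j) for j = 1..2t, i.e. the received polynomial
    evaluated at consecutive powers of the primitive element alpha."""
    out = []
    for j in range(1, 2 * t + 1):
        s = 0
        for i, bit in enumerate(received_bits):
            if bit:
                s ^= EXP[(i * j) % 15]  # add alpha^(i*j) in GF(2^4)
        out.append(s)
    return out

# All-zero codeword with a single bit error at position 4:
r = [0] * 15
r[4] = 1
print(syndromes(r))  # each S_j equals alpha^(4j), locating the error
```

A single error at position p yields $S_j = \alpha^{pj}$, which is why nonzero syndromes pin down the error location; step-by-step decoding then tests candidate positions against the syndromes rather than solving a key equation directly.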

The Use of VFG for Measuring the Slice Complexity (슬라이스 복잡도 측정을 위한 VFG의 사용)

  • Moon, Yu-Mi;Choi, Wan-Kyoo;Lee, Sung-Joo
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.5 no.1
    • /
    • pp.183-191
    • /
    • 2001
  • We develop a new data slice representation, called the data value flow graph (VFG), for modeling the information flow on data slices. We then define a slice complexity measure by using an existing flow complexity measure in order to measure the complexity of the information flow on the VFG. We show the relation between the slice complexity of a slice and the slice complexity of a procedure. We also demonstrate the measurement scale factors through a set of atomic modifications and a concatenation operator on the VFG.


Low-Complexity Network Coding Algorithms for Energy Efficient Information Exchange

  • Wang, Yu;Henning, Ian D.
    • Journal of Communications and Networks
    • /
    • v.10 no.4
    • /
    • pp.396-402
    • /
    • 2008
  • The use of network coding in wireless networks has been proposed in the literature for energy-efficient broadcast. However, the decoding complexity of existing algorithms is too high for low-complexity devices. In this work we formalize the all-to-all information exchange problem and show how to optimize the transmission scheme in terms of energy efficiency. Furthermore, we prove by construction that there exist O(1)-complexity network coding algorithms for grid networks which can achieve such optimality. We also present low-complexity heuristics for random-topology networks. Simulation results show that network coding algorithms outperform forwarding algorithms in most cases.
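The energy saving behind network-coded exchange can be seen in the smallest possible example: a relay between two nodes broadcasts the XOR of their messages once instead of forwarding each message separately. This is a textbook illustration of the principle, not the paper's grid-network construction:

```python
# Two endpoints A and B exchange messages through a relay.
# Plain forwarding: the relay transmits twice (a to B, b to A).
# Network coding: the relay broadcasts a ^ b once; each endpoint
# decodes by XORing with its own message, which cancels out.

def relay_encode(a, b):
    return a ^ b

def decode(coded, own):
    return coded ^ own  # own message cancels, leaving the other's

msg_a, msg_b = 0b1011, 0b0110
coded = relay_encode(msg_a, msg_b)   # one broadcast instead of two
assert decode(coded, msg_a) == msg_b
assert decode(coded, msg_b) == msg_a
print(bin(coded))  # 0b1101
```

Decoding here is a single XOR per received packet, i.e. constant work per node, which is the kind of O(1)-complexity decoding the abstract targets for low-complexity devices.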