• Title/Summary/Keyword: complexity level

Cost Driver Analysis in General Hospitals Using Simultaneous Equation Model and Path Model (연립방정식모형과 경로모형을 이용한 종합병원의 원가동인 분석)

  • 양동현;이원식
    • Health Policy and Management, v.14 no.1, pp.89-120, 2004
  • The purpose of this empirical study is to test hypotheses that identify the cost drivers of indirect costs in general hospitals in Korea. Various case studies have suggested that overhead costs are driven by volume and complexity variables, but it has remained unclear how these variables are structurally related and how their cost impacts can be estimated in practice. A unique feature of the research is the treatment of complexity as an endogenous variable. It is hypothesized that the level of hospital complexity, both in terms of the number of services provided (i.e., "breadth" complexity) and the intensity of individual overhead services (i.e., "depth" complexity), is determined simultaneously with the level of costs needed to support that complexity. Data used in this study were obtained from the databases of the Korea Health Industry Development Institute and the Health Insurance Review Agency and analyzed using a simultaneous equation model and a path model. The results show that the volume and complexity variables are all statistically significant drivers of general hospital overhead costs. Although this study documents that the level of service complexity is a significant determinant of hospital overhead costs, caution should be exercised in interpreting this as support for the cost accounting procedures associated with ABC.
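
As a loose illustration of the estimation idea (not the authors' model or data), the sketch below shows two-stage least squares in Python: an endogenous complexity variable is first projected on an instrument, and the fitted values then enter the cost equation. All variable names and numbers are hypothetical.

```python
# Minimal 2SLS sketch for a cost equation with endogenous complexity.
# Variables (z, complexity, volume, cost) are hypothetical stand-ins.
import numpy as np

rng = np.random.default_rng(0)
n = 200

z = rng.normal(size=n)                      # instrument (e.g., regional case mix)
u = rng.normal(size=n)                      # shared shock making complexity endogenous
complexity = 1.0 + 0.8 * z + u
volume = rng.normal(5.0, 1.0, size=n)
cost = 2.0 + 1.5 * complexity + 0.7 * volume + u + rng.normal(size=n)

def ols(X, y):
    """Ordinary least squares via least-squares solve."""
    return np.linalg.lstsq(X, y, rcond=None)[0]

# Stage 1: project the endogenous regressor on the instrument + exogenous variables.
X1 = np.column_stack([np.ones(n), z, volume])
complexity_hat = X1 @ ols(X1, complexity)

# Stage 2: replace complexity by its fitted values in the cost equation.
X2 = np.column_stack([np.ones(n), complexity_hat, volume])
beta = ols(X2, cost)
print("intercept, complexity, volume:", beta)   # roughly recovers (2.0, 1.5, 0.7)
```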

SOC Verification Based on WGL

  • Du, Zhen-Jun;Li, Min
    • Journal of Korea Multimedia Society, v.9 no.12, pp.1607-1616, 2006
  • The growing market for multimedia and digital signal processing requires significant data-path portions in SoCs. However, the common models for verification are not suitable for SoCs. A novel model, WGL (Weighted Generalized List), is proposed, based on the general-list decomposition of polynomials, with three different weights and manipulation rules introduced to achieve node sharing and canonicity. Timing parameters and operations on them are also considered. Examples show that the word-level WGL is the only model to linearly represent the common word-level functions and that the bit-level WGL is especially suitable for arithmetic-intensive circuits. The model is proved to be a uniform and efficient model for both bit-level and word-level functions. Then, based on the WGL model, a backward-construction logic-verification approach is presented, which reduces the time and space complexity for multipliers to polynomial complexity (time complexity less than $O(n^{3.6})$ and space complexity less than $O(n^{1.5})$) without hierarchical partitioning. Finally, a construction methodology for word-level polynomials is presented in order to implement complex high-level verification; it combines order computation and coefficient solving and adopts an efficient backward approach. The construction complexity is much lower than that of existing methods, e.g., the construction time for multipliers grows at a power of less than 1.6 in the size of the input word without increasing the maximal space required. The WGL model and the verification methods based on it show their theoretical and practical significance in SoC design.
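
The WGL structure itself is specific to the paper, but the node sharing and canonicity it relies on rest on the same mechanism as other canonical graph/list representations: a unique table (hash-consing). A minimal Python sketch of that mechanism follows; the node fields are hypothetical, not the paper's definition.

```python
# Illustrative unique table enforcing node sharing: structurally identical
# nodes are created exactly once, which (with normalization rules on the
# weights) is what gives a representation its canonicity.
class Node:
    __slots__ = ("var", "low", "high", "weight")
    def __init__(self, var, low, high, weight):
        self.var, self.low, self.high, self.weight = var, low, high, weight

_unique_table = {}

def make_node(var, low, high, weight):
    """Return a shared node for (var, children, weight); reuse if it exists."""
    key = (var, id(low), id(high), weight)
    node = _unique_table.get(key)
    if node is None:
        node = _unique_table[key] = Node(var, low, high, weight)
    return node

# Two structurally identical requests yield the same object:
leaf = make_node("x0", None, None, 1)
assert make_node("x1", leaf, None, 2) is make_node("x1", leaf, None, 2)
```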

The Effect of Occupational Information on the Cognitive Complexity of Adolescents (직업정보제공방식의 차이에 따른 청소년의 직업인지복잡성의 증대효과)

  • Lee, Yok
    • Korean Journal of Child Studies, v.12 no.2, pp.67-77, 1991
  • An investigation of the effect of occupational information on vocational cognitive complexity was conducted with 331 male and female adolescents in ninth grade. There were 2 experimental groups and 1 control group. Experimental group I was given only occupational information sheets (written-form information), while group II was given occupational information through verbal instruction in addition to the occupational information sheets. A modified form of the cognitive complexity grid originally developed by Bodden (1970) was utilized to collect data on the subjects' vocational cognitive complexity. ANOVA and Scheffé tests revealed significant differences between experimental group II and the other groups in vocational cognitive complexity. The cognitive complexity level of experimental group I and the control group for the most aspired occupation was significantly lower than for the least aspired occupation; for experimental group II, however, it was higher for the most aspired occupation than for the least aspired one. The results suggest that merely giving occupational information to adolescents may not be effective; it may be effective only when the method of delivery is active enough to induce adolescents' self-confirming cognitive processes.
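
As a minimal sketch of the statistical procedure named above (one-way ANOVA followed by Scheffé post-hoc comparisons), the fragment below runs both on simulated scores for two experimental groups and a control; the group means and sizes are hypothetical, not the study's data.

```python
# One-way ANOVA plus Scheffé post-hoc comparisons on simulated group scores.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
g1 = rng.normal(10.0, 2.0, 30)   # written information only
g2 = rng.normal(12.5, 2.0, 30)   # written + verbal instruction
g3 = rng.normal(10.2, 2.0, 30)   # control
groups = [g1, g2, g3]

k = len(groups)
n_total = sum(len(g) for g in groups)
f_stat, p = stats.f_oneway(*groups)
print(f"ANOVA: F={f_stat:.2f}, p={p:.4f}")

# Scheffé criterion: a pairwise difference is significant when it exceeds
# sqrt((k-1) * F_crit) times the standard error of that difference.
mse = sum(((g - g.mean()) ** 2).sum() for g in groups) / (n_total - k)
f_crit = stats.f.ppf(0.95, k - 1, n_total - k)
for i in range(k):
    for j in range(i + 1, k):
        diff = abs(groups[i].mean() - groups[j].mean())
        se = np.sqrt(mse * (1 / len(groups[i]) + 1 / len(groups[j])))
        print(f"group {i+1} vs {j+1}: diff={diff:.2f}, "
              f"significant={diff > np.sqrt((k - 1) * f_crit) * se}")
```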

Determining the complexity level of proceduralized tasks in a digitalized main control room using the TACOM measure

  • Inseok Jang;Jinkyun Park
    • Nuclear Engineering and Technology, v.54 no.11, pp.4170-4180, 2022
  • The task complexity (TACOM) measure was previously developed to quantify the complexity of proceduralized tasks conducted by nuclear power plant operators. Following its development, the appropriateness of the TACOM measure has been validated by investigating the relationship between TACOM scores and three kinds of human performance data, namely response times, human error probabilities, and subjective workload scores. However, the information reflected in quantified TACOM scores is still insufficient to determine the levels of complexity of proceduralized tasks for human reliability analysis (HRA) applications. In this regard, the objective of this study is to suggest criteria for determining levels of task complexity based on a logistic regression between human error occurrences in digitalized main control rooms and TACOM scores. Analysis results confirmed a reliable relationship between TACOM scores and the likelihood of human error occurrence. This result strongly implies that the TACOM measure can be used to identify levels of task complexity, which could be applicable to various research domains including HRA.
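
A minimal sketch of the regression step described above, under hypothetical data: fit a logistic model of error occurrence (0/1) on TACOM scores and read off predicted error probabilities, from which complexity-level cutoffs could be derived. The score range and coefficients are assumptions for illustration.

```python
# Logistic regression of human error occurrence on TACOM scores (simulated).
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
tacom = rng.uniform(1.0, 7.0, 300).reshape(-1, 1)       # hypothetical TACOM scores
p_true = 1 / (1 + np.exp(-(1.2 * tacom.ravel() - 5.0)))  # assumed true relationship
error = rng.binomial(1, p_true)                          # observed error occurrence

model = LogisticRegression().fit(tacom, error)

# Predicted human error probability at a given TACOM score:
for s in (2.0, 4.0, 6.0):
    prob = model.predict_proba([[s]])[0, 1]
    print(f"TACOM={s}: P(error)={prob:.3f}")
```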

A Study on the Relationships between Complexity and Preference by Perceptual-cognitive and Affective Judgement - Focused on the Commercial Interior Design - (지각적-인지적 판단과 감정적 판단에 따른 복잡성과 선호도의 관계 - 상업공간의 실내디자인을 중심으로 -)

  • Choi Eun-Hee;Kwon Young-Gull
    • Korean Institute of Interior Design Journal, v.15 no.3 s.56, pp.173-183, 2006
  • Design is inseparably related to aesthetics. In spite of that, it is difficult to pin down the precise aesthetic variables that affect the aesthetic value of a space or environment. Therefore, this study set out to find the relationships between aesthetic variables under perceptual and affective judgement in space design, focusing on the complexity and preference variables. The research found that a low level of 'arousing' together with high levels of the affective-dimension variables 'pleasant' and 'relaxing' evoked high preference. High preference also appeared in space-design cases with high unity, order, and clarity and with low contrast and complexity, which are variables of the perceptual dimension. Complexity, one of Kaplan's preference variables, is inversely proportional to space preference. Thus, space design with high complexity evokes high levels of the 'exciting' and 'arousing' affective responses and a relatively low level of the 'relaxing' response. Additionally, it was confirmed that the factor most influential on complexity was diverse components, rather than visual richness or ornamentation.

A Low Complexity Multi-level Sphere Decoder for MIMO Systems with QAM signals

  • Pham, Van-Su;Yoon, Gi-Wan
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference, 2008.10a, pp.890-893, 2008
  • In this paper, we present a low-complexity modified multi-level sphere decoder (SD) for multiple-input multiple-output (MIMO) systems employing quadrature amplitude modulation (QAM) signals. The proposed decoder, exploiting the multi-level structure of the QAM signal scheme, first decomposes the high-level constellation into low-level 4-QAM constellations, so-called sub-constellations. It then deploys SD in the sub-constellations in parallel. In addition, in the searching stage, it uses a low-complexity optimal sorting method. Computer simulation results show that the proposed decoder provides near-optimal maximum-likelihood (ML) performance while significantly reducing the computational load.
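
For illustration, the sketch below implements a basic depth-first sphere decoder over a 4-QAM sub-constellation, the building block the proposed decoder runs in parallel; the multi-level decomposition and the sorting stage are omitted, and the 2x2 setup is a hypothetical example.

```python
# Depth-first sphere decoding of y = H s + n over 4-QAM, via QR decomposition.
import numpy as np

QPSK = np.array([1 + 1j, 1 - 1j, -1 + 1j, -1 - 1j]) / np.sqrt(2)  # 4-QAM symbols

def sphere_decode(H, y, radius=np.inf):
    """Return the 4-QAM vector s minimizing ||y - H s||."""
    n = H.shape[1]
    Q, R = np.linalg.qr(H)
    y_tilde = Q.conj().T @ y            # ||y - Hs|| == ||y_tilde - Rs||
    best = {"s": None, "r2": radius}

    def search(level, s, dist2):
        if dist2 >= best["r2"]:
            return                      # prune branches outside the sphere
        if level < 0:
            best["s"], best["r2"] = s.copy(), dist2
            return
        for sym in QPSK:
            s[level] = sym
            resid = y_tilde[level] - R[level, level:] @ s[level:]
            search(level - 1, s, dist2 + abs(resid) ** 2)

    search(n - 1, np.zeros(n, dtype=complex), 0.0)
    return best["s"]

H = (np.random.randn(2, 2) + 1j * np.random.randn(2, 2)) / np.sqrt(2)
s = QPSK[[0, 3]]
y = H @ s + 0.05 * (np.random.randn(2) + 1j * np.random.randn(2))
print(np.allclose(sphere_decode(H, y), s))   # True at this low noise level
```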

High Speed Motion Match Utilizing A Multi-Resolution Algorithm (다중해상도 알고리즘을 이용한 고속 움직임 정합)

  • Joo, Heon-Sik
    • Journal of the Korea Society of Computer and Information, v.12 no.2 s.46, pp.131-139, 2007
  • This paper proposed a multi-resolution algorithm, whose search points and complexity were compared with those of the block match algorithm, along with its speed-up. The proposed multi-resolution NTSS-3 Level algorithm was then compared with its targets, the TSS-3 Level algorithm and the NTSS algorithm, and proved superior in search points and speed-up. Specifically, the proposed NTSS-3 Level algorithm was two to three times better in search points and two to four times better in complexity calculation than the block match algorithm, and two times better in speed-up. The proposed multi-resolution NTSS-3 Level algorithm also showed excellent PSNR results in addition to its search-point and speed-up performance.
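
As background for the comparison above, the sketch below implements the classic three-step search (TSS) for block motion matching; the NTSS-3 Level method refines this family with a multi-resolution pyramid, which is not shown. The frames and block coordinates are a hypothetical test pattern.

```python
# Three-step search: evaluate 9 candidates around the current best vector,
# halving the step size (4 -> 2 -> 1) each round.
import numpy as np

def sad(a, b):
    """Sum of absolute differences between two blocks."""
    return np.abs(a.astype(int) - b.astype(int)).sum()

def three_step_search(ref, cur, top, left, size=16):
    """Return the (dy, dx) motion vector for the block of `cur` at (top, left)."""
    block = cur[top:top + size, left:left + size]
    dy = dx = 0
    step = 4
    while step >= 1:
        best = (sad(block, ref[top + dy:top + dy + size,
                               left + dx:left + dx + size]), dy, dx)
        for ddy in (-step, 0, step):
            for ddx in (-step, 0, step):
                y, x = top + dy + ddy, left + dx + ddx
                if 0 <= y <= ref.shape[0] - size and 0 <= x <= ref.shape[1] - size:
                    cost = sad(block, ref[y:y + size, x:x + size])
                    if cost < best[0]:
                        best = (cost, dy + ddy, dx + ddx)
        _, dy, dx = best
        step //= 2
    return dy, dx

# Smooth test pattern: a Gaussian blob shifted by a known amount between frames.
yy, xx = np.mgrid[0:64, 0:64]
ref = (255 * np.exp(-((yy - 32) ** 2 + (xx - 32) ** 2) / 200)).astype(np.uint8)
cur = (255 * np.exp(-((yy - 30) ** 2 + (xx - 35) ** 2) / 200)).astype(np.uint8)
print(three_step_search(ref, cur, 16, 16))   # expected (2, -3)
```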

Effects of Phonetic Complexity and Articulatory Severity on Percentage of Correct Consonant and Speech Intelligibility in Adults with Dysarthria (조음복잡성 및 조음중증도에 따른 마비말장애인의 자음정확도와 말명료도)

  • Song, HanNae;Lee, Youngmee;Sim, HyunSub;Sung, JeeEun
    • Phonetics and Speech Sciences, v.5 no.1, pp.39-46, 2013
  • This study examined the effects of phonetic complexity and articulatory severity on the Percentage of Correct Consonants (PCC) and speech intelligibility in adults with dysarthria. Speech samples of thirty-two words from the APAC (Assessment of Phonology and Articulation of Children) were collected from 38 dysarthric speakers at one of two levels of articulatory severity (mild or mild-moderate). PCC and speech intelligibility scores were calculated for each of the four levels of phonetic complexity. Two-way mixed ANOVA revealed that (1) the mild-severity group showed significantly higher PCC and speech intelligibility scores than the mild-moderate group, (2) PCC at phonetic complexity level 4 was significantly lower than at the other levels, and (3) an interaction effect of articulatory severity and phonetic complexity was observed only for PCC. Pearson correlation analysis demonstrated that the degree of correlation between PCC and speech intelligibility varied depending on the level of articulatory severity and phonetic complexity. The clinical implications of these findings are discussed.
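
For reference, the PCC metric used above reduces to a simple ratio: consonants produced correctly divided by consonants attempted, times 100. A minimal sketch, assuming phoneme-aligned transcriptions; the phoneme inventory and word pair are hypothetical placeholders, not APAC items.

```python
# Percentage of Correct Consonants over aligned target/production transcriptions.
def percent_correct_consonants(targets, productions, consonants):
    """targets/productions: lists of phoneme sequences (assumed position-aligned);
    consonants: the set of phonemes counted as consonants."""
    correct = attempted = 0
    for target, produced in zip(targets, productions):
        for t, p in zip(target, produced):
            if t in consonants:
                attempted += 1
                correct += (t == p)
    return 100.0 * correct / attempted

# Toy example with romanized phonemes: /t/ -> [k] substitution gives PCC = 50.
consonants = {"p", "t", "k", "s", "m", "n"}
print(percent_correct_consonants([["p", "a", "t"]], [["p", "a", "k"]], consonants))
```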

Nonlinear Quality Indices Based on a Novel Lempel-Ziv Complexity for Assessing Quality of Multi-Lead ECGs Collected in Real Time

  • Zhang, Yatao;Ma, Zhenguo;Dong, Wentao
    • Journal of Information Processing Systems, v.16 no.2, pp.508-521, 2020
  • We compared a novel encoding Lempel-Ziv complexity (ELZC) with three common complexity algorithms, i.e., approximate entropy (ApEn), sample entropy (SampEn), and classic Lempel-Ziv complexity (CLZC), in order to determine a satisfactory complexity measure and corresponding quality indices for assessing the quality of multi-lead electrocardiograms (ECGs). First, we calculated the aforementioned algorithms on six artificial time series to compare their performance in discerning randomness and the inherent irregularity within time series. Then, to analyze the sensitivity of the algorithms to the content level of different noises within the ECG, we investigated their trends in five artificial synthetic noisy ECGs containing different noises at several signal-to-noise ratios. Finally, three quality indices based on the ELZC of the multi-lead ECG were proposed to assess the quality of 862 real 12-lead ECGs from the MIT databases. The results showed that the ELZC could discern randomness and the inherent irregularity within the six artificial time series and could also reflect the content level of different noises within the five artificial synthetic ECGs. The AUCs of the three ELZC quality indices were statistically significant (>0.500). The ELZC and its three corresponding indices are thus more suitable for multi-lead ECG quality assessment than the other three algorithms.
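
Of the measures compared above, the classic Lempel-Ziv complexity (CLZC) has a compact definition: the number of phrases in the LZ76 parsing of a (typically binarized) sequence, where each phrase is the shortest extension not reproducible from the preceding text. A minimal sketch follows; the paper's ELZC modifies the encoding step and is not shown.

```python
# Classic Lempel-Ziv (LZ76) complexity of a symbol string.
def lz_complexity(s):
    """Count LZ76 phrases: extend each phrase while it can still be copied
    (with overlap) from earlier text; a final partial phrase counts as one."""
    phrases, i, n = 0, 0, len(s)
    while i < n:
        k = 1
        while i + k <= n and s[i:i + k] in s[:i + k - 1]:
            k += 1
        phrases += 1
        i += k
    return phrases

# "0001101001000101" parses as 0 | 001 | 10 | 100 | 1000 | 101 -> complexity 6.
print(lz_complexity("0001101001000101"))
```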

Predicting Learning Achievements with Indicators of Perceived Affordances Based on Different Levels of Content Complexity in Video-based Learning

  • Dasom KIM;Gyeoun JEONG
    • Educational Technology International, v.25 no.1, pp.27-65, 2024
  • The purpose of this study was to identify differences in learning patterns according to content complexity in video-based learning environments and to derive the variables that have an important effect on learning achievement within particular learning contexts. To achieve these aims, we observed and collected data on learners' cognitive processes through perceived affordances, using behavioral logs and eye movements as specific indicators. These two types of reaction data were collected from 67 male and female university students who watched two learning videos, classified by task complexity, through the video learning player. The results showed that when the content complexity level was low, learners tended to navigate using other learners' digital logs, whereas when it was high, they tended to control the learning process and directly generate their own logs. In addition, using prediction models derived for each degree of content complexity, we identified the variables influencing learning achievement: in the low-complexity group these were related to video playback and annotation, while in the high-complexity group they were related to active navigation of the learning video. This study not only applied novel variables in the field of educational technology, but also attempted to provide qualitative observations on the learning process based on a quantitative approach.
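
A minimal sketch of the per-group modeling step described above, with hypothetical feature names and simulated data (the paper's actual indicators and model family are not specified here): fit one achievement model per content-complexity group and rank the log/gaze variables by importance.

```python
# Per-group prediction of achievement from behavioral-log/gaze features (simulated).
import numpy as np
from sklearn.ensemble import RandomForestRegressor

rng = np.random.default_rng(3)
features = ["playback_count", "annotation_count", "navigation_events", "fixation_time"]

for group in ("low_complexity", "high_complexity"):
    X = rng.normal(size=(34, len(features)))                 # hypothetical learners
    y = 0.5 * X[:, 0] + 0.3 * X[:, 2] + rng.normal(scale=0.5, size=34)
    model = RandomForestRegressor(random_state=0).fit(X, y)
    ranked = sorted(zip(features, model.feature_importances_), key=lambda t: -t[1])
    print(group, ranked[:2])                                 # top predictors per group
```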