• Title/Summary/Keyword: complexity level


The Software Complexity Estimation Method in Algorithm Level by Analysis of Source code (소스코드의 분석을 통한 알고리즘 레벨에서의 소프트웨어 복잡도 측정 방법)

  • Lim, Woong;Nam, Jung-Hak;Sim, Dong-Gyu;Cho, Dae-Sung;Choi, Woong-Il
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.47 no.5
    • /
    • pp.153-164
    • /
    • 2010
  • A program consumes energy by executing its instructions, and the amount of consumed power is largely proportional to algorithm complexity, so it can be estimated from complexity information. The complexity of software is generally estimated with a microprocessor simulator, but simulation takes a long time because the simulator is software that models the hardware, and it only provides quantitative information about computational complexity. In this paper, we propose a complexity estimation method that analyzes software at the source-code level and derives the complexity metric mathematically. The function-wise complexity metrics give detailed information about where computation is concentrated within each function. The performance of the proposed method is compared with the results of the microprocessor simulator SimpleScalar. The software used for the performance test comprises the 4×4 integer transform, intra prediction, and motion estimation of the latest video codec, H.264/AVC. The number of executed instructions is used as the quantitative measure, and the errors relative to the SimpleScalar results are about 11.6%, 9.6%, and 3.5%, respectively.
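
To make the idea of function-wise, source-level complexity counting concrete, here is a minimal sketch (not the paper's metric, which analyzes C source and validates against SimpleScalar instruction counts): it statically counts arithmetic-operator nodes per function with Python's ast module, and the OP_WEIGHTS table is a made-up stand-in for per-instruction costs.

```python
import ast

# Hypothetical per-operation weights standing in for instruction costs.
OP_WEIGHTS = {ast.Add: 1, ast.Sub: 1, ast.Mult: 3, ast.Div: 10}

def function_complexity(source: str) -> dict:
    """Return a rough, statically counted operation cost per function.

    Note: a real estimator would also multiply by estimated loop trip counts.
    """
    costs = {}
    for node in ast.walk(ast.parse(source)):
        if isinstance(node, ast.FunctionDef):
            cost = 0
            for sub in ast.walk(node):
                if isinstance(sub, ast.BinOp):
                    cost += OP_WEIGHTS.get(type(sub.op), 1)
            costs[node.name] = cost
    return costs

example = """
def sad_4x4(a, b):
    total = 0
    for i in range(4):
        for j in range(4):
            total = total + abs(a[i][j] - b[i][j])
    return total
"""
print(function_complexity(example))  # {'sad_4x4': 2} with the toy weights
```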

A Comparison of the Cognitive Effect of Three-dimensional Images on a Computer Monitor and a Mixed Reality Device (컴퓨터 모니터와 혼합현실기기의 3차원 이미지 인지 효과 비교 연구)

  • Choi, Sung-Jin;Liu, Shu-Jun
    • Journal of KIBIM
    • /
    • v.13 no.4
    • /
    • pp.45-53
    • /
    • 2023
  • The educational benefits and potential of XR as a new medium are well recognized. However, there are still limitations in understanding the specific effects of XR compared to the more widely used representation of images on computer monitors. This study therefore aims to demonstrate the differences in effectiveness between the two technologies and to draw implications from a cognitive comparison of three-dimensional objects represented on a flat surface and virtually. The study used a quantitative research method with an experiment involving two independent groups, and the results were tested using regression analysis. The results showed that for two-dimensional objects of low complexity the computer monitor may be more effective, but above a certain level of complexity the effectiveness of learning through the monitor tends to decrease rapidly. In contrast, the group that used extended reality technology showed relatively high comprehension compared to the monitor group even as complexity increased; in particular, unlike the monitor group's rapidly declining comprehension, the extended reality group's comprehension decreased only gradually with the level of complexity, suggesting potential for compatibility and predictability in the use of the technology.

New Video Compression Method based on Low-complexity Interpolation Filter-bank (저 복잡도 보간 필터 뱅크 기반의 새로운 비디오 압축 방법)

  • Nam, Jung-Hak;Jo, Hyun-Ho;Sim, Dong-Gyu;Choi, Byeong-Doo;Cho, Dae-Sung
    • Journal of the Institute of Electronics Engineers of Korea SP
    • /
    • v.47 no.5
    • /
    • pp.165-174
    • /
    • 2010
  • The H.264/AVC standard achieves better compression performance than previous standards, but it also increases the computational complexity of the codec. Various techniques recently included in the KTA software developed by VCEG further increase this complexity. In particular, the adaptive interpolation filter more than doubles the interpolation complexity in exchange for coding efficiency. In this paper, we propose a low-complexity filter bank to speed up decoding while preserving coding gain. The filter bank consists of a simple fixed filter for low complexity and an adaptive interpolation filter for high coding efficiency, and motion compensation is performed with the optimal filter selected at the macroblock or frame level. Experimental results show coding efficiency similar to that of the existing adaptive interpolation filter, with a decoding speed-up of approximately 12% for the entire decoder.
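
As a rough, hypothetical illustration of choosing between a fixed and an adaptive interpolation filter per block, the sketch below works on a 1-D signal: it interpolates half-pel samples with the standard H.264/AVC 6-tap filter and with placeholder "adaptive" coefficients, then keeps whichever filter yields the smaller sum of absolute differences per block. The adaptive coefficients and block size are assumptions, not the paper's filters.

```python
import numpy as np

# H.264/AVC 6-tap half-pel filter (fixed) and a hypothetical adaptive filter.
FIXED = np.array([1, -5, 20, 20, -5, 1]) / 32.0
ADAPTIVE = np.array([2, -7, 21, 21, -7, 2]) / 32.0  # placeholder coefficients

def half_pel(samples: np.ndarray, taps: np.ndarray) -> np.ndarray:
    """Interpolate half-pel positions of a 1-D signal with a 6-tap filter."""
    return np.convolve(samples, taps, mode="valid")

def choose_filter_per_block(reference, target_half_pel, block=8):
    """Return 'fixed' or 'adaptive' for each block by minimizing SAD."""
    fixed_hp = half_pel(reference, FIXED)
    adapt_hp = half_pel(reference, ADAPTIVE)
    decisions = []
    for start in range(0, len(target_half_pel) - block + 1, block):
        seg = slice(start, start + block)
        sad_fixed = np.abs(fixed_hp[seg] - target_half_pel[seg]).sum()
        sad_adapt = np.abs(adapt_hp[seg] - target_half_pel[seg]).sum()
        decisions.append("fixed" if sad_fixed <= sad_adapt else "adaptive")
    return decisions

ref = np.random.default_rng(0).integers(0, 256, 64).astype(float)
target = half_pel(ref, FIXED) + np.random.default_rng(1).normal(0, 1, 59)
print(choose_filter_per_block(ref, target))
```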

Moderating Effect of Structural Complexity on the Relationship between Surgery Volume and in Hospital Mortality of Cancer Patients (일부 암 종의 수술량과 병원 내 사망률의 관계에서 구조적 복잡성의 조절효과)

  • Youn, Kyungil
    • Health Policy and Management
    • /
    • v.24 no.4
    • /
    • pp.380-388
    • /
    • 2014
  • Background: The volume of surgery has been examined as a major source of variation in outcomes after surgery. This study investigated the direct effect of surgery volume on in-hospital mortality and the moderating effect of structural complexity (the level of diversity and sophistication of the technology a hospital applies in patient care) on the volume-outcome relationship. Methods: Discharge summary data of 11,827 cancer patients who underwent surgery and were discharged during a month-long period in 2010 and 2011 were analyzed. The analytic model included independent variables such as the surgery volume of a hospital, structural complexity measured by the number of diagnoses a hospital examined, and their interaction term. This study used a hierarchical logistic regression model to test for an association between hospital complexity and mortality rates and for the moderating effect on the volume-outcome relationship. Results: As structural complexity increased, the probability of in-hospital mortality after cancer surgery decreased. The interaction term between surgery volume and structural complexity was also statistically significant, and the interaction effect was strongest among patients who had surgery in low-volume hospitals. Conclusion: Structural complexity and surgery volume should be considered simultaneously when studying the volume-outcome relationship and when developing policies that aim to reduce mortality after cancer surgery.
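
A minimal sketch of a logistic model with a volume-by-complexity interaction term of the kind described above, using statsmodels on simulated discharge-level data; the variable names (death, volume, complexity) and the data-generating values are assumptions, and the study's actual model is hierarchical.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 500
volume = rng.integers(10, 500, n)        # annual cancer surgeries per hospital
complexity = rng.integers(40, 250, n)    # number of distinct diagnoses examined
# Simulated true log-odds of in-hospital death, with an interaction effect.
log_odds = 1.0 - 0.004 * volume - 0.006 * complexity + 0.00001 * volume * complexity
death = rng.binomial(1, 1 / (1 + np.exp(-log_odds)))
df = pd.DataFrame({"death": death, "volume": volume, "complexity": complexity})

# Logit model with main effects and the volume x complexity interaction term.
model = smf.logit("death ~ volume * complexity", data=df).fit(disp=0)
print(model.params)
```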

Predicting CEFR Levels in L2 Oral Speech, Based on Lexical and Syntactic Complexity

  • Hu, Xiaolin
    • Asia Pacific Journal of Corpus Research
    • /
    • v.2 no.1
    • /
    • pp.35-45
    • /
    • 2021
  • With the wide spread of the Common European Framework of Reference (CEFR) scales, many studies attempt to apply them in routine teaching and rater training, while more evidence regarding criterial features at different CEFR levels is still urgently needed. The current study aims to explore complexity features that distinguish and predict CEFR proficiency levels in oral performance. Using a quantitative, corpus-based approach, this research analyzed lexical and syntactic complexity features over 80 transcriptions (covering the A1, A2, and B1 CEFR levels and native speakers) based on an interview test, the Standard Speaking Test (SST). ANOVA and correlation analysis were conducted to exclude insignificant complexity indices before the discriminant analysis. As a result, distinctive differences in complexity between CEFR speaking levels were observed, and with a combination of six major complexity features as predictors, 78.8% of the oral transcriptions were classified into the appropriate CEFR proficiency levels. This further confirms the possibility of predicting the CEFR level of L2 learners from objective linguistic features. The study can serve as an empirical reference in language pedagogy, especially for L2 learners' self-assessment and teachers' prediction of students' proficiency levels, and it offers implications for the validation of rating criteria and the improvement of rating systems.
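
A compact sketch of the discriminant-analysis step described above, using scikit-learn's LinearDiscriminantAnalysis on simulated lexical and syntactic complexity features; the three feature names and the toy data are assumptions, not the study's SST transcriptions or its six selected predictors.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
levels = np.repeat(["A1", "A2", "B1", "native"], 20)  # 80 transcriptions
# Hypothetical complexity features rising with proficiency, plus noise.
base = {"A1": 1.0, "A2": 1.6, "B1": 2.2, "native": 3.0}
X = np.column_stack([
    [base[lv] for lv in levels] + rng.normal(0, 0.4, 80),      # lexical diversity
    [base[lv] * 3 for lv in levels] + rng.normal(0, 1.0, 80),  # clauses per T-unit
    [base[lv] * 5 for lv in levels] + rng.normal(0, 2.0, 80),  # words per clause
])

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, levels, cv=5).mean()
print(f"cross-validated classification accuracy: {acc:.1%}")
```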

System-level Function and Architecture Codesign for Optimization of MPEG Encoder

  • Choi, Jin-Ku;Togawa, Nozomu;Yanagisawa, Masao;Ohtsuki, Tatsuo
    • Proceedings of the IEEK Conference
    • /
    • 2002.07c
    • /
    • pp.1736-1739
    • /
    • 2002
  • Advances in semiconductor, hardware, and software technologies enable the integration of ever more complex systems and increase design complexity. As system design becomes more complicated, system-level design based on IP blocks and processor models is needed more than design at the RTL or lower levels. In this paper, we present a novel approach to system-level design that satisfies the various required constraints, together with an optimization method for an image encoder based on the codesign of function, algorithm, and architecture. In addition, we show an MPEG-4 encoder as a design case study. The best tradeoffs between algorithm and architecture are necessary to deliver a design satisfying performance and area constraints. The evaluations provide an effective optimization of motion estimation, which accounts for a large share of the computation in the MPEG-4 encoder module.


Cube selection using function complexity and minimization of two-level Reed-Muller expressions (함수복잡도를 이용한 큐브선택과 이단계 리드뮬러표현의 최소화)

  • Lee, Gueesang
    • Journal of the Korean Institute of Telematics and Electronics A
    • /
    • v.32A no.6
    • /
    • pp.104-110
    • /
    • 1995
  • In this paper, an effective method for the minimization of two-level Reed-Muller expressions by cube selection that considers functional complexity is presented. In contrast to previous methods, which use xlinking operations to join two cubes for minimization, the cube-selection method selects cubes one at a time until they cover the ON-set of the given function. This method works for most benchmark circuits, but for parity-type functions it shows poor performance. To solve this problem, a cost function that computes the functional complexity, rather than only the size of the ON-set of the function, is used. The optimization therefore considers how the true minterms are grouped together so that they can be realized by only a small number of cubes; in other words, it considers how the function is changed and how the change affects the next optimization step. Experimental results show better performance in many cases, including parity-type functions, compared to previous results.
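
To illustrate the cube-selection idea on a toy scale, the sketch below greedily picks cubes for an exclusive-or sum-of-products: selecting a cube XOR-toggles the minterms it covers, and the cost of a candidate is simply the size of the remaining residual ON-set. The paper's functional-complexity cost is more elaborate; the three-variable, parity-style example function is made up.

```python
from itertools import product

N = 3  # number of input variables (toy example)

def cube_minterms(cube):
    """Expand a cube like ('1', '-', '0') into the set of minterms it covers."""
    choices = [(c,) if c in "01" else ("0", "1") for c in cube]
    return set(product(*choices))

def greedy_esop(on_set, candidate_cubes):
    """Greedily pick cubes; each pick XOR-toggles its minterms in the residue."""
    residue = set(on_set)  # minterms where the residual function is still 1
    chosen = []
    while residue:
        # Cost after picking a cube = size of the residual ON-set (symmetric diff).
        best = min(candidate_cubes, key=lambda c: len(residue ^ cube_minterms(c)))
        new_residue = residue ^ cube_minterms(best)
        if len(new_residue) >= len(residue):
            break  # no cube improves the cost; stop
        chosen.append(best)
        residue = new_residue
    return chosen, residue

# f = x0 XOR x1 over three variables (a parity-type function), given as an ON-set.
on_set = {m for m in product("01", repeat=N) if m[0] != m[1]}
cubes = list(product("01-", repeat=N))
selection, leftover = greedy_esop(on_set, cubes)
print(selection, "uncovered:", leftover)
```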


Improved Method for the Macroblock-Level Deblocking Scheme

  • Le, Thanh Ha;Jung, Seung-Won;Baek, Seung-Jin;Ko, Sung-Jea
    • ETRI Journal
    • /
    • v.33 no.2
    • /
    • pp.194-200
    • /
    • 2011
  • This paper presents a deblocking method for video compression in which blocking artifacts are effectively extracted and eliminated based on both spatial- and frequency-domain operations. First, we use a probabilistic approach to analyze the performance of the conventional macroblock-level deblocking scheme. Then, based on the results of the analysis, an algorithm to reduce the computational complexity is introduced. Experimental results show that the proposed algorithm outperforms conventional video coding methods in terms of computational complexity while maintaining coding efficiency.

Do the Technostress Creators Predict Job Satisfaction and Teacher Efficacy of Primary School Teachers in Korea?

  • LEE, Mignon;LIM, Kyu Yon
    • Educational Technology International
    • /
    • v.21 no.1
    • /
    • pp.69-95
    • /
    • 2020
  • The purpose of this research is to analyze the predictive power of the five technostress creators (techno-overload, techno-invasion, techno-complexity, techno-insecurity, and techno-uncertainty) on the job satisfaction and teacher efficacy of primary school teachers in Korea when they incorporate mobile technology into teaching. A questionnaire was designed to measure teachers' stress from technology, job satisfaction, and teacher efficacy. Data were collected from 164 teachers, and multiple regression analysis was conducted to examine which areas of technostress explained job satisfaction and teacher efficacy. The results showed that techno-complexity alone predicted both job satisfaction and teacher efficacy. A likely reason is that teachers first need to understand how to incorporate mobile technology into teaching before they can feel overloaded, invaded, insecure, or uncertain about it; that is, techno-complexity precedes the other constructs, so the only stress factor that affected them was understanding the complexity of mobile technology. This calls for adequate training and support from schools and governments so that teachers can fully incorporate technology into teaching.
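
A minimal sketch of the multiple-regression step described above, fitted with statsmodels on simulated questionnaire scores; the column names mirror the five technostress creators, and the sample values (including the simulated effect of techno-complexity) are assumptions rather than the study's data.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(7)
n = 164  # matches the reported sample size; the values themselves are simulated
creators = ["overload", "invasion", "complexity", "insecurity", "uncertainty"]
df = pd.DataFrame({c: rng.normal(3.0, 0.8, n) for c in creators})
# Simulate job satisfaction driven mainly by techno-complexity, plus noise.
df["job_satisfaction"] = 4.5 - 0.6 * df["complexity"] + rng.normal(0, 0.5, n)

model = smf.ols(
    "job_satisfaction ~ overload + invasion + complexity + insecurity + uncertainty",
    data=df,
).fit()
print(model.params)
```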

An Adaptive Decoding Algorithm Using the Differences Between Level Radii for MIMO Systems (다중 송수신 안테나 시스템에서 단계별 반경의 차이를 이용한 적응 복호화 알고리즘)

  • Kim, Sang-Hyun;Park, So-Ryoung
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.35 no.7C
    • /
    • pp.618-627
    • /
    • 2010
  • In this paper, we propose an adaptive K-best algorithm in which the number K of candidates is changed according to the differences between level radii. We also compare the bit error performance and complexity of the proposed algorithm with those of several conventional K-best algorithms, where complexity is defined as the total number of candidates whose partial Euclidean distances have to be calculated. The proposed algorithm adaptively decides K at each level by eliminating from the candidate set the symbols whose radius differences are larger than a threshold, and the maximum or the average of the differences can be adopted as that threshold. The proposed decoding algorithm shows better bit error performance and lower complexity than a conventional K-best decoding algorithm with constant K, and achieves similar bit error performance with lower complexity than other adaptive K-best algorithms.
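
A schematic numpy sketch of the level-wise pruning rule described in the abstract: at each detection level the candidates are sorted by partial Euclidean distance, and those whose distance exceeds the best by more than a threshold (here the mean of the differences; the maximum could be used instead) are dropped, subject to a maximum K. The PAM alphabet, channel model, and K_MAX value are illustrative assumptions, not the paper's simulation setup.

```python
import numpy as np

rng = np.random.default_rng(3)
ALPHABET = np.array([-3, -1, 1, 3], dtype=float)  # 4-PAM per real dimension
N_T, K_MAX = 4, 8

def adaptive_kbest(y, R, k_max=K_MAX):
    """Detect symbols level by level, pruning candidates by PED differences."""
    n = R.shape[0]
    # Each survivor: (partial symbol vector built from the bottom, accumulated PED).
    survivors = [(np.empty(0), 0.0)]
    for level in range(n - 1, -1, -1):
        expanded = []
        for sym_tail, ped in survivors:
            for s in ALPHABET:
                tail = np.concatenate(([s], sym_tail))
                est = R[level, level:] @ tail  # upper-triangular R after QR
                expanded.append((tail, ped + (y[level] - est) ** 2))
        expanded.sort(key=lambda t: t[1])
        peds = np.array([p for _, p in expanded])
        diffs = peds - peds[0]
        threshold = diffs.mean()  # average of radius differences as the threshold
        keep = [cand for cand, d in zip(expanded, diffs) if d <= threshold]
        survivors = keep[:k_max]
    return survivors[0][0]  # best full-length candidate

# Toy upper-triangular channel (as after QR decomposition) and noisy observation.
x_true = rng.choice(ALPHABET, N_T)
R = np.triu(rng.normal(0, 1, (N_T, N_T))) + 2 * np.eye(N_T)
y = R @ x_true + rng.normal(0, 0.1, N_T)
print("sent:", x_true, "detected:", adaptive_kbest(y, R))
```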