• Title/Summary/Keyword: complexity analysis

Efficient Modifications of Cubic Convolution Interpolation Based on Even-Odd Decomposition (짝수 홀수 분해법에 기초한 CCI의 효율적인 변형)

  • Cho, Hyun-Ji;Yoo, Hoon
    • The Transactions of The Korean Institute of Electrical Engineers / v.63 no.5 / pp.690-695 / 2014
  • This paper presents a modified CCI image interpolation method based on even-odd decomposition (EOD). The CCI method is a well-known technique for interpolating images. Although it provides better image quality than linear interpolation, its complexity remains a problem. To remedy this, the paper introduces an analysis of the even-odd decomposition of CCI and then proposes a reduced-complexity CCI interpolation that also provides better image quality in terms of PSNR. To evaluate the proposed method, we conduct experiments and a complexity comparison. The results indicate that our method not only outperforms the existing methods by up to 43% in terms of MSE but also requires 37% less computing time than the CCI method.
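
For context, the baseline the paper modifies is Keys' cubic convolution, which weights the four nearest samples with a piecewise-cubic kernel. A minimal 1-D sketch of that baseline (not of the paper's EOD variant, whose details are not given in the abstract):

```python
import numpy as np

def keys_kernel(s, a=-0.5):
    """Keys' piecewise-cubic convolution kernel; a = -0.5 is the common choice."""
    s = abs(s)
    if s < 1:
        return (a + 2) * s**3 - (a + 3) * s**2 + 1
    if s < 2:
        return a * s**3 - 5 * a * s**2 + 8 * a * s - 4 * a
    return 0.0

def cci_1d(samples, x):
    """Interpolate at fractional position x from the 4 nearest samples."""
    i = int(np.floor(x))
    clamp = lambda j: min(max(j, 0), len(samples) - 1)  # repeat edge samples
    return sum(samples[clamp(i + k)] * keys_kernel(x - (i + k))
               for k in range(-1, 3))

print(cci_1d([0.0, 1.0, 4.0, 9.0, 16.0], 1.5))  # 2.25
```

Roughly speaking, the even-odd decomposition rewrites this four-tap sum in terms of its symmetric and antisymmetric parts so that intermediate products can be shared; the precise savings are what the paper quantifies.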

Do the Technostress Creators Predict Job Satisfaction and Teacher Efficacy of Primary School Teachers in Korea?

  • LEE, Mignon;LIM, Kyu Yon
    • Educational Technology International / v.21 no.1 / pp.69-95 / 2020
  • The purpose of this research is to analyze the predictive power of the five technostress creators (techno-overload, techno-invasion, techno-complexity, techno-insecurity, and techno-uncertainty) for the job satisfaction and teacher efficacy of primary school teachers in Korea who incorporated mobile technology into their teaching. A questionnaire was designed to measure teachers' levels of technostress, job satisfaction, and teacher efficacy. Data were collected from 164 teachers. Multiple regression analysis was conducted to determine which areas of technostress explained varying degrees of job satisfaction and teacher efficacy. The results showed that techno-complexity alone predicted both job satisfaction and teacher efficacy. The reason techno-complexity was the only predictor is that teachers would first have needed to understand how to incorporate mobile technology into teaching before feeling overloaded, invaded, insecure, or uncertain about it; that is, techno-complexity precedes the other constructs. The only stress factor that affected them was therefore understanding the complexity of mobile technology. This calls for adequate training and support from schools and governments so that teachers can fully incorporate technology into their teaching.
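
As a methodological aside, the analysis reads like a standard multiple regression of a satisfaction score on the five creator scales. A sketch with invented stand-in data (not the study's survey responses):

```python
import numpy as np

# Invented stand-ins for the five technostress creator scores of 164 teachers:
# columns = overload, invasion, complexity, insecurity, uncertainty.
rng = np.random.default_rng(0)
X = rng.normal(size=(164, 5))
y = 4.0 - 0.6 * X[:, 2] + rng.normal(scale=0.5, size=164)  # job satisfaction

X1 = np.column_stack([np.ones(len(X)), X])         # prepend an intercept column
beta, *_ = np.linalg.lstsq(X1, y, rcond=None)      # ordinary least squares
print("intercept + coefficients:", beta.round(2))  # only column 2 should matter
```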

Managing Mega-Project Complexity in Five Dimensions

  • Gransberg, Douglas D.;Jeong, H. David
    • International conference on construction engineering and project management / 2015.10a / pp.6-9 / 2015
  • Traditional project management theory is based on a three-dimensional life-cycle approach in which the project manager seeks to optimize the dimensions of cost, schedule, and technical quality (or design). This paper reports the findings of a case-study analysis of two complex mega-projects in Michigan, which confirm the findings of previous research and illustrate the use of a framework for five-dimensional project management (5DPM) for conceptualizing a complex project's scope of work. The framework adds two new dimensions by recognizing that the project's social/political context and its financial arrangements create complexity. The paper also demonstrates a methodology for graphically displaying a project's complexity to better understand and prioritize the available resources. The result is a "complexity footprint" that may help a complex-project manager identify the boundary between controllable and uncontrollable project impacts. The paper finds that applying 5DPM to the two case-study projects has given the project delivery team a tool that is adding real value to the complex-project management process.
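
The abstract does not specify how the paper draws its graphic; purely as a toy illustration, a five-dimensional "complexity footprint" could be charted as a radar plot of hypothetical 0-10 scores:

```python
import numpy as np
import matplotlib.pyplot as plt

dims = ["Cost", "Schedule", "Technical", "Context", "Financing"]  # 5DPM axes
scores = [6, 8, 5, 9, 7]                 # invented scores, not case-study data

angles = np.linspace(0, 2 * np.pi, len(dims), endpoint=False).tolist()
ax = plt.subplot(polar=True)
ax.plot(angles + angles[:1], scores + scores[:1])        # close the polygon
ax.fill(angles + angles[:1], scores + scores[:1], alpha=0.2)
ax.set_xticks(angles)
ax.set_xticklabels(dims)
ax.set_title("Toy complexity footprint")
plt.show()
```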

A Comparison Study of Structure Behavior of Flexible Riser Using Numerical and Theoretical Methods (유연식 라이저에 대한 유한요소법과 이론적 방법에 의한 구조 거동의 비교 연구)

  • Yim, Ki-Ho;Jang, Beom-Seon;Yoo, Dong-Hyun
    • Journal of the Society of Naval Architects of Korea / v.53 no.4 / pp.258-265 / 2016
  • A flexible riser consists of several layers with different materials, shapes, and functions. Properly designed layers can carry the design loads safely, and the combined properties of the layers make the cross-sectional behavior of the flexible riser complex. These cross-sectional (unit) properties are inputs to the global analysis of the flexible riser. There are several approaches to calculating them: experimental, numerical, and theoretical methods. This paper presents numerical and theoretical analyses of a 2.5-inch flexible riser for which the construction details and experimental data under tension, external pressure, and bending moment have already been produced. In addition, comparisons of stiffness and stress are provided. The stress analysis, in particular, could support research on the ultimate strength or fatigue strength of flexible risers.
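
On the theoretical side, a common first-order estimate (ignoring radial contraction and layer interaction) credits each helical armour layer with an axial stiffness contribution of n·E·A·cos³α for n tendons of cross-section A, modulus E, and lay angle α. The layer data below are invented, not the paper's 2.5-inch riser:

```python
import math

def layer_axial_stiffness(n, E, A, alpha_deg):
    """First-order axial stiffness (EA) contribution of one helical layer:
    n tendons, Young's modulus E [Pa], tendon area A [m^2], lay angle alpha."""
    a = math.radians(alpha_deg)
    return n * E * A * math.cos(a) ** 3

# Invented example layers: (tendons, modulus, area, lay angle)
layers = [(40, 210e9, 12e-6, 35.0),    # inner tensile armour
          (44, 210e9, 12e-6, -35.0)]   # outer armour, opposite lay direction
EA = sum(layer_axial_stiffness(*layer) for layer in layers)
print(f"riser axial stiffness ~ {EA / 1e6:.1f} MN")
```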

Conflict Detection for Multi-agent Motion Planning using Mathematical Analysis of Extended Collision Map (확장충돌맵의 수학적 분석을 이용한 다개체의 충돌탐지)

  • Yoon, Y.H.;Choi, J.S.;Lee, B.H.
    • The Journal of Korea Robotics Society / v.2 no.3 / pp.234-241 / 2007
  • Effective tools that can alleviate the complexity and computational-load problems of collision-free motion planning for multi-agent systems have long been demanded in the robotics field. To reduce the complexity, the extended collision map (ECM), which adopts a decoupled approach and prioritization, has already been proposed. In the ECM, however, the collision regions representing potential collisions between robots are computed numerically, so the complexity problem is not completely resolved. In this paper, we propose a mathematical analysis of the extended collision map; as a result, we formulate the collision region as an equation in 5-8 variables. For the mathematical analysis, we introduce the following realistic assumptions: the path of each robot can be approximated by a straight line or an arc, and every robot moves with uniform velocity or constant acceleration near the intersection between paths. Our result reduces the computational complexity compared with the previous result without losing optimality, because we use simple but exact equations for the collision regions. The result is widely applicable to coordinated multi-agent motion planning.
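
A minimal sketch of the simplest case covered by the paper's assumptions, two robots on straight-line paths with uniform velocities, solves a quadratic for the interval during which they are closer than a safety distance d. This is the generic closest-approach formulation, not the paper's ECM equations:

```python
import numpy as np

def collision_window(p1, v1, p2, v2, d):
    """Time interval during which two point robots with constant velocities
    come within safety distance d of each other (None if they never do)."""
    dp = np.asarray(p1, float) - np.asarray(p2, float)
    dv = np.asarray(v1, float) - np.asarray(v2, float)
    a, b, c = dv @ dv, 2 * dp @ dv, dp @ dp - d * d
    if a == 0:                       # identical velocities: gap never changes
        return (0.0, np.inf) if c <= 0 else None
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                  # the paths never get that close
    r = np.sqrt(disc)
    return ((-b - r) / (2 * a), (-b + r) / (2 * a))

print(collision_window([0, 0], [1, 0], [10, 1], [-1, 0], d=2.0))
```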

Complexity Analysis of the Viking Labeled Release Experiments

  • Bianciardi, Giorgio;Miller, Joseph D.;Straat, Patricia Ann;Levin, Gilbert V.
    • International Journal of Aeronautical and Space Sciences / v.13 no.1 / pp.14-26 / 2012
  • The only extraterrestrial life-detection experiments ever conducted were the three that were components of the 1976 Viking Mission to Mars. Of these, only the Labeled Release (LR) experiment obtained a clearly positive response. In this experiment, a $^{14}C$-radiolabeled nutrient was added to the Mars soil samples. Active soils exhibited rapid, substantial gas release. The gas was probably $CO_2$ and, possibly, other radiocarbon-containing gases. We have applied complexity analysis to the Viking LR data. Measures of mathematical complexity permit deep analysis of data structure along continua including signal vs. noise, entropy vs. negentropy, periodicity vs. aperiodicity, and order vs. disorder. We have employed seven complexity variables, all derived from LR data, to show that Viking LR active responses can be distinguished from controls via cluster analysis and other multivariate techniques. Furthermore, the Martian LR active-response data cluster with known biological time series, while the control data cluster with purely physical measures. We conclude that the complexity pattern seen in the active experiments strongly suggests biology, while the different pattern in the control responses is more likely to be non-biological. Control responses that exhibit relatively low initial order rapidly devolve into near-random noise, while the active experiments exhibit higher initial order which decays only slowly. This suggests a robust biological response. These analyses support the interpretation that the Viking LR experiment did detect extant microbial life on Mars.
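
The seven complexity variables are not listed in the abstract; as one concrete example of this family of measures, Lempel-Ziv complexity counts the distinct phrases needed to reconstruct a binarized series, scoring low for ordered signals and high for noise:

```python
import numpy as np

def lz76(s):
    """Number of phrases in the Lempel-Ziv (1976) parsing of string s."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # grow the current phrase while it can be copied from prior history
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1
        i += l
    return c

rng = np.random.default_rng(1)
ordered = "01" * 500                                 # periodic series
noisy = "".join(rng.choice(list("01"), size=1000))   # random series
print(lz76(ordered), lz76(noisy))                    # low vs. high complexity
```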

An Efficient In-Place Block Rotation Algorithm and its Complexity Analysis (효율적 In-Place Block Rotation 알고리즘과 복잡도 분석)

  • Kim, Pok-Son;Kutzner, Arne
    • Journal of the Korean Institute of Intelligent Systems / v.20 no.3 / pp.428-433 / 2010
  • The notion "block rotation" denotes the operation of exchanging two consecutive sequences of elements uv into vu. Three block rotation algorithms are already well known: the BlockRotation, Juggling, and Reversal algorithms. Recently we presented a novel block rotation algorithm called QuickRotation. In this paper we compare QuickRotation with these three known block rotation algorithms. The comparison covers a complexity analysis as well as benchmarking, and shows that a switch to QuickRotation is almost always advantageous.
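
Of the three classical algorithms named, the Reversal algorithm is the simplest to sketch: reverse u, reverse v, then reverse the whole array, all in place with O(n) swaps. QuickRotation itself is not reproduced here:

```python
def _reverse(a, lo, hi):
    """Reverse a[lo:hi] in place."""
    hi -= 1
    while lo < hi:
        a[lo], a[hi] = a[hi], a[lo]
        lo, hi = lo + 1, hi - 1

def rotate_reversal(a, k):
    """Rotate list a so that u v becomes v u, where u = a[:k].
    Classic Reversal algorithm: reverse u, reverse v, reverse everything."""
    _reverse(a, 0, k)
    _reverse(a, k, len(a))
    _reverse(a, 0, len(a))

a = [1, 2, 3, 4, 5, 6, 7]
rotate_reversal(a, 3)
print(a)  # [4, 5, 6, 7, 1, 2, 3]
```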

A Study on Performance Analysis of Digital Signature Based on the Security (안전성에 근거를 둔 디지털서명 성능분석에 관한 연구)

  • 이지영
    • Journal of the Korea Society of Computer and Information / v.4 no.2 / pp.39-45 / 1999
  • In this paper we examine the cryptanalysis of digital signatures and compare them using complexity measures based on the hard problems underlying their security, the discrete logarithm problem and the factorization problem. In particular, the paper addresses computational complexity so that performance, data size, and processing speed can be compared through simulation.
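
As a generic illustration of how such comparisons are run (not the paper's simulation), one can time the core operation of RSA-style signatures, modular exponentiation, at a few modulus sizes:

```python
import time
import secrets

# Toy timing of modular exponentiation; illustrative only, not a benchmark
# of any scheme from the paper.
for bits in (512, 1024, 2048):
    n = secrets.randbits(bits) | (1 << (bits - 1)) | 1   # odd, full-size modulus
    m = secrets.randbits(bits - 1)                       # message representative
    d = secrets.randbits(bits - 1)                       # private exponent
    t0 = time.perf_counter()
    pow(m, d, n)
    print(f"{bits}-bit modexp: {(time.perf_counter() - t0) * 1e3:.2f} ms")
```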

Analysis and Countermeasure on RSA Algorithm Having High Attack Complexity in Collision-Based Power Analysis Attack (충돌 전력 분석 공격에 높은 공격 복잡도를 갖는 RSA 알고리즘에 대한 취약점 분석 및 대응기법)

  • Kim, Suhri;Kim, Taewon;Jo, Sungmin;Kim, HeeSeok;Hong, Seokhie
    • Journal of the Korea Institute of Information Security & Cryptology / v.26 no.2 / pp.335-344 / 2016
  • Power analysis is known to be one of the most powerful attacks in side-channel analysis. Among power analysis attacks, the single-trace attack has been widely studied recently, since it uses a single power-consumption trace to recover the secret key of a public-key cryptosystem. Recently, Sim et al. proposed a new exponentiation algorithm for the RSA cryptosystem with higher attack complexity to prevent single-trace attacks. In this paper we analyze the vulnerability of the exponentiation algorithm described by Sim et al., who applied message blinding and a random exponent-splitting method to $2^t$-ary exponentiation for higher attack complexity. However, the private key can be revealed using information exposed during generation of the pre-computation table. We also describe a modified algorithm that provides higher attack complexity against collision attacks. The proposed algorithm minimizes the reuse of values during exponentiation to provide security against single-collision attacks.
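
Of the two countermeasures mentioned, message blinding is easy to sketch: the device never exponentiates the attacker-chosen message directly, which breaks the correlations a collision or power attack relies on. The toy parameters below are illustrative only, and the exponent-splitting step is omitted:

```python
import secrets

def blinded_rsa_exp(m, d, e, n):
    """RSA exponentiation with message blinding: compute m^d mod n via a
    randomized intermediate m' = m * r^e mod n, then unblind with r^-1."""
    while True:
        r = secrets.randbelow(n)
        if r > 1:
            try:
                r_inv = pow(r, -1, n)    # modular inverse (Python 3.8+)
                break
            except ValueError:
                continue                 # r not invertible mod n; retry
    blinded = (m * pow(r, e, n)) % n     # m' = m * r^e mod n
    s = pow(blinded, d, n)               # (m')^d = m^d * r  (mod n)
    return (s * r_inv) % n               # unblind: m^d mod n

# Toy parameters, far too small for real use:
p, q, e = 61, 53, 17
n = p * q
d = pow(e, -1, (p - 1) * (q - 1))
assert blinded_rsa_exp(42, d, e, n) == pow(42, d, n)
```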

A Study on the Propriety of the Medical Insurance Fee Schedule of Surgical Operations - In Regard to the Relative Price System and the Classification of the Price Unit of Insurance Fee Schedule - (수술수가의 적정성에 관한 연구 - 상대가격체계와 항목분류를 중심으로 -)

  • Oh Jin Joo
    • Journal of Korean Public Health Nursing / v.2 no.2 / pp.21-44 / 1988
  • In Korea, fee-for-service reimbursement has been used since the beginning of the medical insurance system in 1977, and the importance of the relative value unit is currently being investigated. The purpose of this study was to determine the propriety of the differences in fees for different surgical services and the appropriateness of the classification of the insurance fee schedule. The specific subjects and methodology were as follows. 1. The propriety of the Relative Price System (RPS). 1) Choice of sample operations: sample operations were selected and classified by specialists in general surgery, who classified 32 items; for the same group of operations, the Insurance Fee Schedule (IFS) classified the operations into 24 separate items. To investigate the propriety of the RPS, the 24 items classified by the IFS were examined. 2) Evaluation of the complexity of surgery: the data were collected from 94 specialists in general surgery by mail survey from November 1 to 15, 1986. Several independent variables (age, location, number of beds, university hospital, and whether the medical institution takes residents) were also investigated to analyze the characteristics of surgical complexity. 3) Complexity and time calculations: time data were collected from the records of Seoul National University Hospital, and the cost per operation was calculated through cost-finding methods. 4) Analysis of the propriety of the RPS of the Insurance Fee Schedule: the RPS of the sample operations was regressed on the cost, time, and complexity relative value systems (RVS) separately; the coefficient of determination indicates the degree of variation in the RPS of the Insurance Fee Schedule explained by each RVS. 2. The appropriateness of the classification of the Insurance Fee Schedule. 1) Choice of sample operations: the items that differed between the classification of the specialists and the classification of the medical Insurance Fee Schedule were chosen. 2) Comparisons of cost, time, and complexity between the items were made to evaluate which classification was more appropriate. The findings can be summarized as follows. 1. The coefficient of determination of the regression of the RPS on the cost RVS was 0.58, on the time RVS 0.65, and on the complexity RVS 0.72. This means that the RPS of the Insurance Fee Schedule is improper with respect to cost, time, and complexity alike, and indicates that the RPS must be reshaped according to a standard element. In this study, the correlation coefficients among the cost, time, and complexity Relative Value Systems were very high, suggesting that the RPS could be reshaped according to any one standard element; considering ease of measurement, time was thought to be the most appropriate. 2. The classifications of the specialists and of the Insurance Fee Schedule were compared with respect to cost, time, and complexity separately. For complexity, ANOVA was performed, and for the others the values under the different classifications were compared. The result was that the classification of the specialists was more reasonable and that the classification of the Insurance Fee Schedule inappropriately grouped several operations into one price unit.
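
The regression step can be illustrated with invented stand-in data (the study's actual relative values are not reproduced in the abstract); the coefficient of determination is computed from the regression residuals:

```python
import numpy as np

def r_squared(x, y):
    """Coefficient of determination for a simple linear regression of y on x."""
    slope, intercept = np.polyfit(x, y, 1)
    resid = y - (slope * x + intercept)
    return 1 - (resid @ resid) / ((y - y.mean()) @ (y - y.mean()))

# Invented stand-ins for the 24 sample operations' relative values:
rng = np.random.default_rng(0)
complexity_rvs = rng.uniform(1, 10, 24)                       # complexity RVS
rps = 0.8 * complexity_rvs + rng.normal(scale=1.0, size=24)   # fee-schedule RPS
print(f"R^2 = {r_squared(complexity_rvs, rps):.2f}")
```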
