• Title/Abstract/Keyword: Complexity analysis


짝수 홀수 분해법에 기초한 CCI의 효율적인 변형 (Efficient Modifications of Cubic Convolution Interpolation Based on Even-Odd Decomposition)

  • 조현지;유훈
    • 전기학회논문지 / Vol. 63, No. 5 / pp.690-695 / 2014
  • This paper presents a modified CCI image interpolation method based on even-odd decomposition (EOD). The CCI method is a well-known technique for interpolating images. Although the method provides better image quality than linear interpolation, its complexity is still a problem. To remedy this, the paper introduces an analysis of the EOD of CCI and then proposes a CCI interpolation with reduced complexity that provides better image quality in terms of PSNR. To evaluate the proposed method, we conduct experiments and a complexity comparison. The results indicate that our method not only outperforms the existing methods by up to 43% in terms of MSE but also requires 37% less computing time than the CCI method.
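The paper's EOD optimization is not reproduced in the abstract, but the underlying cubic convolution method is the standard Keys kernel. A minimal 1-D sketch for orientation (function and variable names are illustrative, not taken from the paper):

```python
def keys_kernel(x, a=-0.5):
    """Keys' cubic convolution kernel (a = -0.5 is the common choice)."""
    x = abs(x)
    if x <= 1:
        return (a + 2) * x**3 - (a + 3) * x**2 + 1
    if x < 2:
        return a * x**3 - 5 * a * x**2 + 8 * a * x - 4 * a
    return 0.0

def cci_interpolate(samples, t):
    """Interpolate a 1-D signal at real-valued position t from 4 neighbours."""
    i = int(t)   # integer part (t >= 0 assumed)
    s = t - i    # fractional offset in [0, 1)
    total = 0.0
    for k in range(-1, 3):
        idx = min(max(i + k, 0), len(samples) - 1)  # clamp at the borders
        total += samples[idx] * keys_kernel(s - k)
    return total
```

Because the four kernel weights come in mirrored pairs around the fractional offset, splitting the samples into even and odd parts lets multiplications be shared; that sharing is the kind of reduction the paper quantifies.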

Do the Technostress Creators Predict Job Satisfaction and Teacher Efficacy of Primary School Teachers in Korea?

  • LEE, Mignon;LIM, Kyu Yon
    • Educational Technology International / Vol. 21, No. 1 / pp.69-95 / 2020
  • The purpose of this research is to analyze the predictive power of the five technostress creators - techno-overload, techno-invasion, techno-complexity, techno-insecurity, and techno-uncertainty - on the job satisfaction and teacher efficacy of primary school teachers in Korea when they incorporate mobile technology into teaching. A questionnaire was designed to measure teachers' level of stress from technology, job satisfaction, and teacher efficacy. Data were collected from 164 teachers. Multiple regression analysis was conducted to determine which areas of technostress led to varying degrees of job satisfaction and teacher efficacy. The results showed that techno-complexity alone predicted both job satisfaction and teacher efficacy. Techno-complexity was the only predictor presumably because teachers first need to understand how to incorporate mobile technology into teaching before they can feel overloaded, invaded, insecure, or uncertain about it; that is, techno-complexity precedes the other constructs. Therefore, the only stress factor that affected them was understanding the complexity of mobile technology. This calls for adequate training and support from schools and governments so that teachers can fully incorporate technology into teaching.

Managing Mega-Project Complexity in Five Dimensions

  • Gransberg, Douglas D.;Jeong, H. David
    • 국제학술발표논문집 / The 6th International Conference on Construction Engineering and Project Management / pp.6-9 / 2015
  • Traditional project management theory is based on a three-dimensional life-cycle approach in which the project manager seeks to optimize the dimensions of cost, schedule, and technical quality (or design). This paper reports the findings of a case study analysis of two complex mega-projects in Michigan which confirm the findings of previous research and illustrate the use of a framework for five-dimensional project management (5DPM) for conceptualizing a complex project's scope of work. The framework adds two new dimensions by recognizing that a project's social/political context and its financial arrangements create complexity. The paper also demonstrates a methodology for graphically displaying a project's complexity to better understand and prioritize the available resources. The result is a "complexity footprint" that may help a complex project manager identify the boundary between controllable and uncontrollable project impacts. The paper finds that applying 5DPM to the two case study projects has given the project delivery team a tool that adds real value to the complex project management process.


유연식 라이저에 대한 유한요소법과 이론적 방법에 의한 구조 거동의 비교 연구 (A Comparison Study of Structure Behavior of Flexible Riser Using Numerical and Theoretical Methods)

  • 임기호;장범선;유동현
    • 대한조선학회논문집 / Vol. 53, No. 4 / pp.258-265 / 2016
  • A flexible riser consists of several layers with different materials, shapes, and functions. Properly designed layers can carry the design load safely, and the properties of each layer contribute to the complexity of the flexible riser. Such complexity/unit properties are inputs for the global analysis of a flexible riser. There are several approaches to calculating the complexity of a flexible riser: experimental, numerical, and theoretical methods. This paper derives the complexity from numerical and theoretical analyses for a 2.5-inch flexible riser whose details and experimental data under tension, external pressure, and bending moment have already been published. In addition, comparisons of stiffness and stress are provided. In particular, the stress analysis could lead to research on the ultimate strength or fatigue strength of flexible risers.

확장충돌맵의 수학적 분석을 이용한 다개체의 충돌탐지 (Conflict Detection for Multi-agent Motion Planning using Mathematical Analysis of Extended Collision Map)

  • 윤영환;최정식;이범희
    • 로봇학회논문지 / Vol. 2, No. 3 / pp.234-241 / 2007
  • Effective tools that can alleviate the complexity and computational-load problems in collision-free motion planning for multi-agent systems have long been in demand in the robotics field. To reduce the complexity, the extended collision map (ECM), which adopts a decoupled approach and prioritization, was previously proposed. In the ECM, the collision regions that represent the potential collisions of robots are calculated numerically, so the complexity problem is not completely resolved. In this paper, we propose a mathematical analysis of the extended collision map; as a result, we formulate the collision region as an equation with 5-8 variables. For the mathematical analysis, we introduce realistic assumptions: the path of each robot can be approximated by a straight line or an arc, and every robot moves with uniform velocity or constant acceleration near the intersection between paths. Our result reduces the computational complexity compared with the previous result without losing optimality, because we use simple but exact equations for the collision regions. This result is widely applicable to coordinated multi-agent motion planning.
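Under the abstract's assumptions (straight-line paths, uniform velocity near the intersection), a pairwise collision test reduces to an interval-overlap check. This is a hedged sketch of that idea only, not the paper's actual 5-8-variable formulation:

```python
def occupancy_window(dist, speed, radius):
    """Time interval during which a robot moving at constant speed along a
    straight path stays within `radius` of the path intersection point."""
    return (dist - radius) / speed, (dist + radius) / speed

def in_conflict(d1, v1, d2, v2, radius):
    """Two robots conflict if their occupancy windows at the intersection overlap."""
    a1, b1 = occupancy_window(d1, v1, radius)
    a2, b2 = occupancy_window(d2, v2, radius)
    return a1 < b2 and a2 < b1  # open-interval overlap test
```

Because the windows are closed-form in the robots' distances and speeds, no sampling of the collision region is needed, which mirrors the paper's point about replacing numerical computation with exact equations.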


Complexity Analysis of the Viking Labeled Release Experiments

  • Bianciardi, Giorgio;Miller, Joseph D.;Straat, Patricia Ann;Levin, Gilbert V.
    • International Journal of Aeronautical and Space Sciences / Vol. 13, No. 1 / pp.14-26 / 2012
  • The only extraterrestrial life detection experiments ever conducted were the three which were components of the 1976 Viking Mission to Mars. Of these, only the Labeled Release (LR) experiment obtained a clearly positive response. In this experiment, $^{14}C$-radiolabeled nutrient was added to the Mars soil samples. Active soils exhibited rapid, substantial gas release. The gas was probably $CO_2$ and, possibly, other radiocarbon-containing gases. We have applied complexity analysis to the Viking LR data. Measures of mathematical complexity permit deep analysis of data structure along continua including signal vs. noise, entropy vs. negentropy, periodicity vs. aperiodicity, order vs. disorder, etc. We have employed seven complexity variables, all derived from LR data, to show that Viking LR active responses can be distinguished from controls via cluster analysis and other multivariate techniques. Furthermore, Martian LR active response data cluster with known biological time series while the control data cluster with purely physical measures. We conclude that the complexity pattern seen in active experiments strongly suggests biology, while the different pattern in the control responses is more likely to be non-biological. Control responses that exhibit relatively low initial order rapidly devolve into near-random noise, while the active experiments exhibit higher initial order which decays only slowly. This suggests a robust biological response. These analyses support the interpretation that the Viking LR experiment did detect extant microbial life on Mars.
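The abstract does not name its seven complexity variables, but one widely used measure of order vs. disorder in a symbolized (e.g. binarized) time series is Lempel-Ziv complexity; a sketch offered purely as background, not as one of the paper's variables:

```python
def lz_complexity(s):
    """Number of phrases in the Lempel-Ziv (1976) exhaustive parsing of a
    symbol string: low for ordered series, high for noise-like ones."""
    i, c, n = 0, 0, len(s)
    while i < n:
        l = 1
        # grow the current phrase while it already occurs in what was seen so far
        while i + l <= n and s[i:i + l] in s[:i + l - 1]:
            l += 1
        c += 1   # one more distinct phrase
        i += l
    return c
```

A periodic series parses into a handful of phrases while a random one keeps producing new phrases, which is the kind of contrast the paper exploits between active and control runs.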

효율적 In-Place Block Rotation 알고리즘과 복잡도 분석 (An Efficient In-Place Block Rotation Algorithm and its Complexity Analysis)

  • 김복선;쿠츠너 아네
    • 한국지능시스템학회논문지 / Vol. 20, No. 3 / pp.428-433 / 2010
  • Given two consecutive sequences u and v, a "block rotation" is the operation that transforms uv into vu. Three block rotation algorithms have previously been introduced: "BlockRotation", "Juggling", and the "Reversal" algorithm. Recently we introduced a new block rotation algorithm named QuickRotation. In this paper, we compare QuickRotation with these existing algorithms. Through complexity analysis as well as benchmarking, we demonstrate the superiority of the QuickRotation algorithm.
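QuickRotation itself is not reproduced in the abstract, but the classical Reversal baseline it is compared against is easy to sketch: reverse u, reverse v, then reverse the whole array.

```python
def _reverse(a, lo, hi):
    """Reverse a[lo..hi] in place."""
    while lo < hi:
        a[lo], a[hi] = a[hi], a[lo]
        lo, hi = lo + 1, hi - 1

def block_rotate(a, k):
    """Transform uv into vu in place, where u = a[:k] and v = a[k:]."""
    _reverse(a, 0, k - 1)          # uv -> (u^R)v
    _reverse(a, k, len(a) - 1)     # (u^R)v -> (u^R)(v^R)
    _reverse(a, 0, len(a) - 1)     # ((u^R)(v^R))^R = vu
```

Each element is swapped at most twice, so the Reversal algorithm runs in O(n) time with O(1) extra space; the comparison in the paper is about constant factors among in-place rotation algorithms.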

안전성에 근거를 둔 디지털서명 성능분석에 관한 연구 (A Study on Performance Analysis of Digital Signature Based on the Security)

  • 이지영
    • 한국컴퓨터정보학회논문지 / Vol. 4, No. 2 / pp.39-45 / 1999
  • This paper analyzes encryption techniques for digital signatures and compares cryptosystems whose security is based on the computational difficulty of problems such as the discrete logarithm and integer factorization. In particular, component analysis by computational load, data-size comparison, and processing speed were compared and examined by simulation.


충돌 전력 분석 공격에 높은 공격 복잡도를 갖는 RSA 알고리즘에 대한 취약점 분석 및 대응기법 (Analysis and Countermeasure on RSA Algorithm Having High Attack Complexity in Collision-Based Power Analysis Attack)

  • 김수리;김태원;조성민;김희석;홍석희
    • 정보보호학회논문지 / Vol. 26, No. 2 / pp.335-344 / 2016
  • Among side-channel attacks, power analysis is known to be the most practical and powerful technique. The single-trace attack, a powerful power analysis that recovers the secret information of a public-key cryptosystem from a single power trace, has been actively studied recently. Most recently, Sim et al. introduced a new exponentiation algorithm with high resistance to such attacks. In this paper, we analyze a vulnerability in the algorithm proposed by Sim et al., which has high attack complexity against single-trace attacks. Although the algorithm achieves high attack complexity by applying the window method to message blinding and exponent splitting, we show that the secret information can be recovered from information leaked during the precomputation phase. We also propose a new exponentiation algorithm that patches this vulnerability while retaining high attack complexity against single-trace attacks. The proposed algorithm minimizes the reuse, during the actual exponentiation, of values computed in the precomputation phase, guaranteeing high resistance to collision attacks.
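The two countermeasures the abstract names, message blinding and exponent splitting, can be sketched in their generic textbook form (a toy illustration, not Sim et al.'s windowed algorithm; variable names are ours):

```python
import math
import random

def blinded_signature(m, d, n, e):
    """Compute the RSA value m^d mod n with message blinding and exponent
    splitting, so neither the raw input nor the whole exponent is used directly."""
    # message blinding: randomize the exponentiation input with r^e
    while True:
        r = random.randrange(2, n - 1)
        if math.gcd(r, n) == 1:
            break
    m_blind = (m * pow(r, e, n)) % n
    # exponent splitting: d = d1 + d2 with a fresh random d1 each call
    d1 = random.randrange(1, d)
    d2 = d - d1
    s_blind = (pow(m_blind, d1, n) * pow(m_blind, d2, n)) % n
    # unblind: (m * r^e)^d = m^d * r (mod n), so multiply by r^{-1}
    return (s_blind * pow(r, -1, n)) % n
```

Both randomizations change the intermediate values from trace to trace, which is what raises the attack complexity of collision-based power analysis; the paper's contribution concerns how the precomputation for the windowed variant must be protected as well.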

수술수가의 적정성에 관한 연구 - 상대가격체계와 항목분류를 중심으로 - (A Study on the Propriety of the Medical Insurance Fee Schedule of Surgical Operations - In Regard to the Relative Price System and the Classification of the Price Unit of Insurance Fee Schedule -)

  • 오진주
    • 한국보건간호학회지 / Vol. 2, No. 2 / pp.21-44 / 1988
  • In Korea, fee-for-service reimbursement has been adopted since the beginning of the medical insurance system in 1977, and the importance of the relative value unit is currently being investigated. The purpose of this study was to find out the level of propriety of the differences in the fees for different surgical services, and the appropriateness of the classification of the insurance fee schedule. The specific subjects and procedural methodology are as follows:
    1. The propriety of the Relative Price System (RPS). 1) Choice of sample operations: sample operations were selected and classified by specialists in general surgery, and the number of items they classified was 32. For the same group of operations, the Insurance Fee Schedule (IFS) classified the operations into 24 separate items. To investigate the propriety of the RPS, one purpose of this study was to examine the 24 items classified by the IFS. 2) Evaluation of the complexity of surgery: the data used in this study were collected from 94 specialists in general surgery by mail survey from November 1 to 15, 1986. Several independent variables (age, location, number of beds, university hospital, and whether the medical institution accepts residents) were also investigated to analyze the characteristics of surgical complexity. 3) Complexity and time calculations: time data were collected from the records of Seoul National University Hospital, and the cost per operation was calculated through cost-finding methods. 4) Analysis of the propriety of the RPS of the Insurance Fee Schedule: the RPS of the sample operations was regressed on the cost, time, and complexity relative value systems (RVS) separately. The coefficient of determination indicates the degree of variation in the RPS of the Insurance Fee Schedule explained by the cost, time, and complexity RVS separately.
    2. The appropriateness of the classification of the Insurance Fee Schedule. 1) Choice of sample operations: the items which differed between the classification of the specialists and the classification of the medical Insurance Fee Schedule were chosen. 2) Comparisons of cost, time, and complexity between the items were done to evaluate which classification was more appropriate.
    The findings of the study can be summarized as follows: 1. The coefficient of determination of the regression of the RPS on the cost RVS was 0.58, on the time RVS 0.65, and on the complexity RVS 0.72. This means that the RPS of the Insurance Fee Schedule is improper with respect to cost, time, and complexity separately, and thus that the RPS must be re-shaped according to a standard element. In this study, the correlation coefficients of the cost, time, and complexity RVS were very high, which suggests that the RPS could be reshaped according to any one standard element. Considering ease of measurement, time was thought to be the most appropriate. 2. The classifications of the specialists and of the Insurance Fee Schedule were compared with respect to cost, time, and complexity separately. For complexity, ANOVA was done, and the others were compared by the differing values of the two classifications. The result was that the classification of the specialists was more reasonable and that the classification of the Insurance Fee Schedule inappropriately grouped several operations into one price unit.
