• Title/Abstract/Keywords: Complexity Analysis


Pattern-based Design and Safety Analysis for Networked Supervisory Medical Systems

  • 강우철
    • 정보과학회 컴퓨팅의 실제 논문지 / Vol. 20, No. 10 / pp. 561-566 / 2014
  • There is growing demand to raise the efficiency of medical services and the safety of patients by integrating and controlling medical devices through communication capabilities and interoperability. However, if the integration of medical devices over a network and a control computer is not carried out systematically, the complexity of the system rises sharply, which can increase the risk of safety accidents. This paper classifies safe forms of interaction between medical devices into patterns, and uses these patterns to design the control architecture of network-integrated medical devices hierarchically and to check its safety. The proposed technique is implemented as a plug-in for the Architecture Analysis and Description Language (AADL). By integrating medical devices virtually in the AADL environment and checking whether their architecture is safe, safety can be verified before actual development. (A hypothetical illustration of such a pattern check is sketched below.)
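
The paper realizes this check as an AADL plug-in; as a purely hypothetical illustration of the underlying idea (not the paper's implementation), the Python sketch below verifies that every device interaction declared in an integrated-system model matches one of a set of pre-approved safe interaction patterns. All component and pattern names are invented for this example.

```python
from dataclasses import dataclass

# Hypothetical catalogue of approved interaction patterns between a supervisor
# and networked medical devices (all names are illustrative, not from the paper).
SAFE_PATTERNS = {
    ("supervisor", "infusion_pump", "command_with_ack"),
    ("pulse_oximeter", "supervisor", "periodic_report"),
    ("supervisor", "ventilator", "command_with_timeout"),
}

@dataclass(frozen=True)
class Interaction:
    source: str    # sending component
    target: str    # receiving component
    pattern: str   # interaction style declared in the architecture model

def unsafe_interactions(model):
    """Return every declared interaction that matches no approved pattern."""
    return [i for i in model if (i.source, i.target, i.pattern) not in SAFE_PATTERNS]

if __name__ == "__main__":
    model = [
        Interaction("supervisor", "infusion_pump", "command_with_ack"),
        Interaction("supervisor", "ventilator", "fire_and_forget"),  # not approved
    ]
    for violation in unsafe_interactions(model):
        print("unsafe interaction:", violation)
```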

An Analysis of Memory Access Complexity for HEVC Decoder

  • 조송현;김영남;송용호
    • 전자공학회논문지 / Vol. 51, No. 5 / pp. 114-124 / 2014
  • HEVC is the latest video coding standard, developed by the JCT-VC. It provides roughly twice the subjective coding efficiency of H.264/AVC. Since one of the main goals of HEVC development was to code UHD video efficiently, HEVC is expected to be widely used for UHD content. Because decoding such high-resolution video generates a large volume of memory accesses, a decoding system needs a high-bandwidth memory system and internal communication architecture. To identify these requirements, this paper analyzes the memory access complexity of the HEVC decoder. We first measured the amount of memory access of software HEVC decoders running on an embedded processor and on a desktop. We also analyzed the data flow of the HEVC decoder and built a memory bandwidth model for it. According to the measurements, the software decoders performed 6.9-40.5 GB/s of DRAM accesses, and the analysis indicates that a hardware decoder requires 2.4 GB/s of DRAM bandwidth. (A back-of-the-envelope illustration of this kind of bandwidth arithmetic is sketched below.)
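
The bandwidth figures above come from the paper's measurements and data-flow model. Purely as an illustration of the kind of arithmetic such a model involves, the sketch below estimates DRAM bandwidth from assumed values (4K resolution, 8-bit 4:2:0 sampling, 60 fps, and an assumed traffic-amplification factor); none of these parameters are taken from the paper.

```python
def dram_bandwidth_gb_per_s(width, height, fps, bytes_per_pixel=1.5,
                            access_factor=4.0):
    """Rough DRAM bandwidth estimate for a video decoder.

    bytes_per_pixel: 1.5 corresponds to 8-bit 4:2:0 sampling.
    access_factor:   assumed ratio of total DRAM traffic (reference reads,
                     reconstruction writes, display reads, ...) to the raw
                     size of one decoded frame; purely illustrative.
    """
    frame_bytes = width * height * bytes_per_pixel
    return frame_bytes * fps * access_factor / 1e9

# Example: 3840x2160 at 60 fps with the assumed factor of 4 -> about 3 GB/s
print(f"{dram_bandwidth_gb_per_s(3840, 2160, 60):.1f} GB/s")
```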

An Improved Method of Two Stage Linear Discriminant Analysis

  • Chen, Yarui;Tao, Xin;Xiong, Congcong;Yang, Jucheng
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 12, No. 3 / pp. 1243-1263 / 2018
  • Two-stage linear discriminant analysis (TSLDA) is a feature extraction technique for the small sample size problem in image recognition. TSLDA retains all subspace information of the between-class and within-class scatter. However, the feature information in the four subspaces may not be entirely beneficial for classification, and the regularization procedure that TSLDA uses to eliminate singular matrices has high time complexity. To address these drawbacks, this paper proposes an improved two-stage linear discriminant analysis (Improved TSLDA). Improved TSLDA uses a selection and compression method to extract the most useful feature information from the four subspaces and form an optimal projection space, defining a single Fisher criterion to measure the importance of each individual feature vector (see the sketch below). It also applies an approximation matrix method to eliminate the singular matrices and reduce the time complexity. Comparative experiments on five face databases and one handwritten digit database validate the effectiveness of Improved TSLDA.
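
As a rough illustration of the single-Fisher-criterion idea, scoring each candidate projection vector by its between-class to within-class scatter ratio and keeping the best ones, here is a minimal NumPy sketch. The scatter matrices, the exact scoring rule, and the selection strategy used in the paper may differ; the code is only an assumed reading of the abstract.

```python
import numpy as np

def fisher_score(w, Sb, Sw, eps=1e-12):
    """Fisher-style importance of a single projection vector w:
    between-class scatter along w divided by within-class scatter along w."""
    w = w / (np.linalg.norm(w) + eps)
    return float(w @ Sb @ w) / (float(w @ Sw @ w) + eps)

def select_top_vectors(candidates, Sb, Sw, k):
    """Keep the k candidate vectors with the largest Fisher score;
    their columns form the projection matrix."""
    ranked = sorted(candidates, key=lambda w: fisher_score(w, Sb, Sw), reverse=True)
    return np.stack(ranked[:k], axis=1)

# Tiny usage example with random symmetric scatter matrices (illustrative only)
rng = np.random.default_rng(0)
A, B = rng.standard_normal((5, 5)), rng.standard_normal((5, 5))
Sb, Sw = A @ A.T, B @ B.T + np.eye(5)
W = select_top_vectors(list(np.eye(5)), Sb, Sw, k=2)
print(W.shape)  # (5, 2)
```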

The Influence of a New Product's Innovative Attributes and Planned Obsolescence on Consumer Purchase Intention

  • 박철주
    • 유통과학연구 / Vol. 13, No. 8 / pp. 81-90 / 2015
  • Purpose - To vitalize a market or develop a new one, companies frequently release new products, often by shortening the time to market (the release period). This research investigates consumers' intention to purchase new products at the time of release, in light of the release speed. Research Design, Data, and Methodology - The research examines the influence of relative advantage, complexity, and compatibility, the innovative attributes of new products proposed by Rogers, on purchase intention, as well as how this influence is moderated by the speed of obsolescence of the old product, captured by its use period and use frequency. The research hypotheses are tested through empirical analysis. Results - Relative advantage (H1) and compatibility (H3) had a statistically significant positive influence on new product purchase intention. Complexity (H2) also had a statistically significant positive influence, contrary to its hypothesized negative sign. For the moderating effect of the old product use period: H4-1 was not supported, since the difference between the path coefficients from relative advantage to purchase intention for the low and high use-period groups was not statistically significant; H5-1 was likewise not supported for the path from complexity to purchase intention. For the moderating effect of the old product use frequency: H4-2 was supported, since the difference between the path coefficients from relative advantage to purchase intention for the low and high use-frequency groups was statistically significant; H5-2 (complexity) and H6-2 (compatibility) were not supported, since the corresponding differences were not statistically significant. Conclusion - Among the hypotheses on the moderating effects of the old product use period and use frequency, only H4-2 was statistically significant. Future research should review these moderating-effect hypotheses in detail, identify the cause of the results, and connect them to new research.

Side Channel Analysis with Low Complexity in the Diffusion Layer of Block Cipher Algorithm SEED

  • 원유승;박애선;한동국
    • 정보보호학회논문지 / Vol. 27, No. 5 / pp. 993-1000 / 2017
  • Considering the availability of embedded devices, combined countermeasures such as first-order masking with hiding, which can provide both security and efficiency, are quite attractive. In particular, for efficiency the combined countermeasure can be applied only to the confusion and diffusion layers of the first and last rounds, while the middle rounds carry no first-order masking or other countermeasure. In this paper, we propose a state-of-the-art side-channel analysis with low complexity that targets the output of the diffusion layer. In general, an attacker cannot choose the diffusion-layer output as the attack target because of the high attack complexity. We show that the attack complexity can be reduced when the diffusion layer of the block cipher is composed of AND operations, taking SEED as the target algorithm. Exploiting the correlation between the S-box output and the diffusion-layer output, an attack complexity of $2^{32}$ can be reduced to $2^{16}$ (see the counting sketch below). Moreover, compared with the usual approach of targeting the S-box output, we demonstrate on simulated traces that the number of required traces can be reduced by 43~98%. Furthermore, while the conventional method failed to recover the correct key from 100,000 traces collected on a real device, 8,000 traces were sufficient to find the correct key with the proposed method.
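
The complexity claim above can be made concrete with a simple counting sketch: ranking hypotheses for a 32-bit intermediate value jointly requires on the order of $2^{32}$ guesses, whereas a correlation that lets the two 16-bit halves be recovered independently costs only about $2 \times 2^{16}$ guesses. The snippet below only illustrates this counting argument; it is not the attack from the paper.

```python
# Counting sketch: joint key-hypothesis search vs. divide-and-conquer search.
# Purely illustrative; it does not reproduce the attack described in the paper.

joint_hypotheses = 2 ** 32            # guess a 32-bit intermediate all at once
split_hypotheses = 2 ** 16 + 2 ** 16  # guess each 16-bit half independently

print(f"joint search : {joint_hypotheses:,} hypotheses")
print(f"split search : {split_hypotheses:,} hypotheses "
      f"(~{joint_hypotheses // split_hypotheses:,}x fewer)")
```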

Modeling strength of high-performance concrete using genetic operation trees with pruning techniques

  • Peng, Chien-Hua;Yeh, I-Cheng;Lien, Li-Chuan
    • Computers and Concrete / Vol. 6, No. 3 / pp. 203-223 / 2009
  • Regression analysis (RA) can establish an explicit formula to predict the strength of High-Performance Concrete (HPC), but the accuracy of the formula is poor. Back-Propagation Networks (BPNs) can establish a highly accurate model to predict the strength of HPC, but cannot generate an explicit formula. Genetic Operation Trees (GOTs) can establish an explicit formula whose accuracy lies between the two aforementioned approaches. However, the formula produced by a GOT is often so complicated that its substantive meaning cannot be explained. This study developed a Backward Pruning Technique (BPT) to reduce the complexity of the GOT formula: each variable at a tip node of the operation tree is replaced with the median of that variable over the training data belonging to the node, and the node whose replacement gives the most accurate result on the test dataset is pruned (a minimal sketch of this idea follows below). Such pruning reduces formula complexity while maintaining accuracy. 404 experimental datasets were used to compare the accuracy and complexity of the three model-building techniques, RA, BPN, and GOT. Results show that the pruned GOT generates a simple and accurate formula for predicting the strength of HPC.
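
A minimal, self-contained sketch of the backward-pruning idea as described in the abstract: each variable leaf (tip node) of an operation tree is tentatively replaced by the training-set median of its variable, and a replacement is kept only if it does not worsen test error. The tree representation, interfaces, and acceptance rule here are assumptions for illustration, not the paper's implementation.

```python
import copy
import statistics

# Minimal expression-tree node: an operator with children, a variable leaf
# (tip node), or a constant. The representation is illustrative only.
class Node:
    def __init__(self, op=None, children=(), var=None, const=None):
        self.op, self.children, self.var, self.const = op, list(children), var, const

    def eval(self, row):
        if self.const is not None:
            return self.const
        if self.var is not None:
            return row[self.var]
        a, b = (c.eval(row) for c in self.children)
        return {"+": a + b, "-": a - b, "*": a * b}[self.op]

    def tips(self):
        if self.var is not None:
            yield self
        for c in self.children:
            yield from c.tips()

def rmse(tree, rows, target):
    return (sum((tree.eval(r) - r[target]) ** 2 for r in rows) / len(rows)) ** 0.5

def backward_prune_once(tree, train, test, target):
    """Replace one variable tip with its training median if that does not
    worsen test error, keeping the formula simpler at equal accuracy."""
    best, best_err = tree, rmse(tree, test, target)
    for i in range(len(list(tree.tips()))):
        cand = copy.deepcopy(tree)
        tip = list(cand.tips())[i]
        med = statistics.median(r[tip.var] for r in train)
        tip.const, tip.var = med, None
        err = rmse(cand, test, target)
        if err <= best_err:
            best, best_err = cand, err
    return best

# Tiny usage example with an assumed formula (x0 * x1 + x2) and toy rows
tree = Node("+", [Node("*", [Node(var=0), Node(var=1)]), Node(var=2)])
rows = [{0: float(a), 1: 2.0, 2: 1.0, "y": a * 2.0 + 1.0} for a in range(10)]
pruned = backward_prune_once(tree, rows, rows, target="y")
```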

Effect of IT Acceptance Will and Stress on Task Performance

  • 강소라;김유정;전방지
    • Journal of Information Technology Applications and Management / Vol. 18, No. 4 / pp. 119-130 / 2011
  • When an organization adopts a new technology, the performance gained through that technology is expected to depend on the attitudes of its members, since technology produces performance only if the members use it willingly. The members therefore need to devote themselves to acquiring and using the new technology. This attitude is called "IT acceptance will" (Ranarajan et al., 2005), and this research examines the effect of an individual's IT acceptance will on performance. Specifically, it analyzes what increases IT acceptance will and what effect IT acceptance will has on the stress caused by organizational and job uncertainty, namely job complexity and role conflict, that accompanies new IT adoption. A survey was conducted from July to September 2009, targeting employees of government offices and public institutions in the Kyunggi and Kyungnam provinces; of the 370 responses collected, 344 were used in the final analysis. The results show that personal innovativeness and client orientation improved an individual's IT acceptance will, which in turn improved performance through the new IT adoption. The new IT adoption also increased job complexity and role conflict, but a negative effect of this stress on performance was not found, as IT acceptance will diminished the stress. Based on these results, practical and academic implications and the limitations of the study are discussed.

Delay Bound Analysis of Networks based on Flow Aggregation

  • 정진우
    • 한국인터넷방송통신학회논문지 / Vol. 20, No. 1 / pp. 107-112 / 2020
  • This study analyzes a flow-aggregation-based delay-guarantee framework that extends the minimal interleaved regulator (IR) concept introduced by the asynchronous traffic shaping (ATS) technology being standardized in the IEEE 802.1 time-sensitive networking (TSN) task group (TG). The framework applies an IR at the output port of each unit network to prevent burst accumulation, while aggregating flows within a unit network according to their input/output port pair to reduce complexity. Through numerical analysis on networks with various parameters, we show that the proposed low-complexity framework performs better than, or comparably to, the existing Integrated Services (IntServ) framework. In particular, the larger the aggregate flows and the unit networks, the better the performance. (A generic delay-bound calculation of the kind underlying such analyses is sketched below.)
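
As generic background (not the paper's specific model): in network-calculus analyses of this kind, a leaky-bucket constrained flow with burst $b$ and rate $r$ traversing a rate-latency server with service rate $R \ge r$ and latency $T$ has a per-hop delay bound of $b/R + T$. The sketch below simply evaluates that classical bound with assumed numbers.

```python
def rate_latency_delay_bound(burst_bits, service_rate_bps, latency_s):
    """Classical network-calculus delay bound for a leaky-bucket flow
    (burst b) served by a rate-latency node (rate R, latency T): b/R + T."""
    return burst_bits / service_rate_bps + latency_s

# Assumed example: 12 kbit burst, 10 Mbit/s service rate, 0.5 ms node latency
bound_s = rate_latency_delay_bound(12_000, 10_000_000, 0.0005)
print(f"per-hop delay bound: {bound_s * 1000:.2f} ms")
```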

Computation and Communication Efficient Key Distribution Protocol for Secure Multicast Communication

  • Vijayakumar, P.;Bose, S.;Kannan, A.;Jegatha Deborah, L.
    • KSII Transactions on Internet and Information Systems (TIIS) / Vol. 7, No. 4 / pp. 878-894 / 2013
  • Secure multimedia multicast applications involve group communications in which group membership requires secure dynamic key generation and updating operations. Such operations usually consume considerable computation time, so a key distribution protocol with reduced computation time is necessary for multicast applications. In this paper, we propose a new key distribution protocol that focuses on two aspects. The first is reducing computation complexity by performing fewer multiplication operations during key updating, using a ternary-tree approach, and by optimizing the remaining multiplications with the existing Karatsuba divide-and-conquer approach for fast multiplication (a textbook sketch of Karatsuba follows below). The second is reducing the amount of key-update information communicated to the group members. The proposed algorithm is evaluated in terms of computation and communication complexity, and a comparative performance analysis against various key distribution protocols is provided, showing that it reduces computation and communication time significantly.
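
The protocol above leans on the standard Karatsuba divide-and-conquer trick to cut the cost of large-number multiplications during key updates. Independently of the paper's protocol, a textbook Python sketch of Karatsuba is shown below; the cutover threshold of 1024 is an arbitrary choice for illustration.

```python
def karatsuba(x: int, y: int) -> int:
    """Textbook Karatsuba multiplication: three recursive multiplications on
    half-size operands instead of four, giving ~O(n^1.585) bit complexity."""
    if x < 1024 or y < 1024:                 # small operands: plain multiply
        return x * y
    half = max(x.bit_length(), y.bit_length()) // 2
    x_hi, x_lo = divmod(x, 1 << half)
    y_hi, y_lo = divmod(y, 1 << half)
    a = karatsuba(x_hi, y_hi)                        # high parts
    b = karatsuba(x_lo, y_lo)                        # low parts
    c = karatsuba(x_hi + x_lo, y_hi + y_lo) - a - b  # cross term
    return (a << (2 * half)) + (c << half) + b

# Quick self-check against Python's built-in big-integer multiplication
import random
u, v = random.getrandbits(512), random.getrandbits(512)
assert karatsuba(u, v) == u * v
```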