• Title/Summary/Keyword: Complexity science

Impossible Differential Cryptanalysis on DVB-CSA

  • Zhang, Kai;Guan, Jie;Hu, Bin
    • KSII Transactions on Internet and Information Systems (TIIS) / v.10 no.4 / pp.1944-1956 / 2016
  • The Digital Video Broadcasting-Common Scrambling Algorithm (DVB-CSA) is an ETSI-designated algorithm for protecting MPEG-2 signal streams, and it is in universal use. Its structure is a typical hybrid symmetric cipher, combining a stream cipher part and a block cipher part. Although its entropy is 64 bits, no effective cryptanalytic results have been published so far. This paper studies the security of CSA against impossible differential cryptanalysis: a 20-round impossible differential for the block cipher part is proposed, and a flaw in the cipher structure is revealed. When the block cipher part (CSA-BC) is attacked alone, reduced to 21 rounds, 16 bits of the initial key can be recovered with data complexity O(2^44.5), computational complexity O(2^22.7), and memory complexity O(2^10.5). Exploiting the structural flaw, an attack on CSA with the block cipher part reduced to 21 rounds is also proposed; it recovers 8 bits of the key with computational complexity O(2^21.7), data complexity O(2^43.5), and memory complexity O(2^10.5). Taking both the block cipher part and the stream cipher part of CSA into consideration, these are, as far as we know, the best results on CSA to date.
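The key-filtering idea behind impossible differential cryptanalysis can be sketched in miniature: a key guess is discarded as soon as some ciphertext pair decrypts, under that guess, to a difference the cipher can provably never produce. The 4-bit S-box, one-round toy cipher, and "impossible" difference below are hypothetical stand-ins, unrelated to the real CSA block cipher.

```python
# Toy impossible-differential key filtering (illustrative only; the S-box,
# round function, and impossible difference are hypothetical, not CSA's).
SBOX = [0x6, 0x4, 0xC, 0x5, 0x0, 0x7, 0x2, 0xE,
        0x1, 0xF, 0x3, 0xD, 0x8, 0xA, 0x9, 0xB]
INV = [SBOX.index(v) for v in range(16)]  # inverse S-box

def last_round(m, k):
    """Final round of the toy cipher: S-box, then key XOR."""
    return SBOX[m] ^ k

def filter_keys(cipher_pairs, impossible_diff):
    """Keep only key guesses that never decrypt a pair to the
    impossible intermediate difference."""
    survivors = set(range(16))
    for c1, c2 in cipher_pairs:
        for k in list(survivors):
            m1, m2 = INV[c1 ^ k], INV[c2 ^ k]  # partial decryption under guess k
            if m1 ^ m2 == impossible_diff:
                survivors.discard(k)  # this guess implies an impossible event
    return survivors
```

By construction the correct key can never be eliminated, since no real pair ever exhibits the impossible difference under it; wrong keys are sieved out as more pairs are processed.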

A Hybrid Texture Coding Method for Fast Texture Mapping

  • Cui, Li;Kim, Hyungyu;Jang, Euee S.
    • Journal of Computing Science and Engineering / v.10 no.2 / pp.68-73 / 2016
  • An efficient texture compression method is proposed based on a block matching process between the current block and previously encoded blocks. Texture mapping is widely used to improve the quality of rendering results in real-time applications. For fast texture mapping, it is important to find an optimal trade-off between compression efficiency and computational complexity. Low-complexity methods (e.g., ETC1 and DXT1) have often been adopted in real-time rendering applications because conventional compression methods (e.g., JPEG) achieve a high compression ratio at the cost of high complexity. We propose a block matching-based compression method that achieves a higher compression ratio than ETC1 and DXT1 while keeping computational complexity lower than that of JPEG. A comparison between the proposed method and existing compression methods confirms the expected performance of the proposed method.
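The core block-matching idea can be sketched as follows: each block is coded either as an index into the already-encoded blocks (when a close match exists) or stored raw. The block size, SAD distance metric, and threshold are illustrative assumptions, not the paper's actual parameters.

```python
# Minimal block-matching texture coder sketch (hypothetical parameters).

def sad(a, b):
    """Sum of absolute differences between two equal-length blocks."""
    return sum(abs(x - y) for x, y in zip(a, b))

def encode(blocks, threshold):
    coded, history = [], []
    for blk in blocks:
        # search previously encoded blocks for the best match
        best = min(range(len(history)),
                   key=lambda i: sad(history[i], blk), default=None)
        if best is not None and sad(history[best], blk) <= threshold:
            coded.append(("ref", best))       # cheap: a single block index
        else:
            coded.append(("raw", list(blk)))  # fallback: raw samples
        history.append(list(blk))
    return coded

def decode(coded):
    out = []
    for kind, payload in coded:
        out.append(list(out[payload]) if kind == "ref" else list(payload))
    return out
```

The threshold directly exposes the compression/quality trade-off the abstract mentions: a larger threshold yields more "ref" tokens (better ratio) at the cost of larger approximation error.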

Complexity Reduced Blind Subspace Channel Estimation for DS/CDMA DMB Downlink (DS/CDMA DMB 하향 링크에서 복잡도가 감소된 블라인드 부분 공간 채널 추정)

  • Yang Wan-Chul;Lee Byung-Seub
    • The Journal of Korean Institute of Electromagnetic Engineering and Science / v.15 no.9 / pp.863-871 / 2004
  • In this paper, we propose a subspace channel estimation technique for the DS/CDMA DMB downlink, which reduces numerical complexity by using matched-filter outputs. The reduction in complexity is considerable when the channel length is small and the system is moderately loaded. Previously proposed subspace-based blind channel estimation algorithms suffer from high numerical complexity for systems with large spreading gains. Although the proposed algorithm incurs a slight performance loss, the loss becomes negligible for large observation lengths. Performance is evaluated through simulations and the derivation of the analytical MSE.
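The general flavor of subspace estimation can be illustrated on a toy problem: the channel vector is recovered (up to scale) as the dominant eigenvector of a received-signal correlation matrix, found here by power iteration on a 2x2 case. Real blind subspace methods operate on the noise subspace of a much larger matrix; the dimensions, channel, and noise level below are hypothetical.

```python
# Toy subspace-style channel estimation: dominant eigenvector by power
# iteration (illustrative 2x2 problem, not the paper's algorithm).

def power_iteration(R, iters=100):
    """Return the unit-norm dominant eigenvector of a 2x2 matrix R."""
    v = [1.0, 0.0]
    for _ in range(iters):
        w = [R[0][0] * v[0] + R[0][1] * v[1],
             R[1][0] * v[0] + R[1][1] * v[1]]
        norm = (w[0] ** 2 + w[1] ** 2) ** 0.5
        v = [w[0] / norm, w[1] / norm]
    return v

# Correlation matrix R = h h^T + sigma^2 I for a unit-norm channel h.
h = [0.8, 0.6]
sigma2 = 0.01
R = [[h[0] * h[0] + sigma2, h[0] * h[1]],
     [h[1] * h[0], h[1] * h[1] + sigma2]]
```

Because the noise term only shifts all eigenvalues by sigma^2, the dominant eigenvector of R still points along h, which is what the estimator exploits.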

POLYNOMIAL COMPLEXITY OF PRIMAL-DUAL INTERIOR-POINT METHODS FOR CONVEX QUADRATIC PROGRAMMING

  • Liu, Zhongyi;Sun, Wenyu;De Sampaio, Raimundo J.B.
    • Journal of applied mathematics & informatics / v.27 no.3_4 / pp.567-579 / 2009
  • Recently, Peng et al. proposed a primal-dual interior-point method with a new search direction and self-regular proximity function for LP. This new large-update method has the best theoretical performance to date, with polynomial complexity of $O(n^{\frac{q+1}{2q}}\log\frac{n}{\varepsilon})$. In this paper we use this search direction to propose a primal-dual interior-point method for convex quadratic programming (QP). We overcome the difficulty in analyzing the complexity of primal-dual interior-point methods for convex quadratic programming, and obtain the same polynomial complexity of $O(n^{\frac{q+1}{2q}}\log\frac{n}{\varepsilon})$ for convex QP.
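The exponent in the bound can be unpacked slightly. With barrier degree $q \ge 1$,

```latex
\mathcal{O}\!\left(n^{\frac{q+1}{2q}}\log\frac{n}{\varepsilon}\right),
\qquad
\frac{q+1}{2q} \;=\; \frac{1}{2} + \frac{1}{2q},
```

so the exponent decreases toward $1/2$ as $q$ grows; choosing $q$ on the order of $\log n$ makes the extra factor $n^{1/(2q)}$ an $O(1)$ constant, and the bound effectively becomes $O(\sqrt{n}\log\frac{n}{\varepsilon})$. This is a standard observation about self-regular large-update methods, not a claim made in the abstract itself.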

Evaluation of Planning Transparence of User Interface Reflecting State Schemas (스키마 개념을 도입한 사용자 계획수립의 용이도 평가)

  • Yoon, Wan Chul
    • Journal of the Korean Operations Research and Management Science Society / v.17 no.2 / pp.45-54 / 1992
  • It has become increasingly important to design user interfaces with low complexity. The cognitive limitations of users severely restrict the utility of highly intelligent but complex modern systems. Since humans are known to use schemas to reduce cognitive complexity, imposing good consistency on an interface design, which may help the user form useful schemas, provides powerful control over complexity. This paper presents a research effort to develop a quantitative method for evaluating the interface complexity that the user would experience in planning his or her course of action. Taking into account the user's potential schemas, a quantitative measure based on information theory was developed to assess navigational complexity. This approach does not rely on the subjective judgment of the researcher, as most schemes dealing with user schemas do. The proposed method may benefit the rapid prototyping approach to designing a better user interface by allowing handy assessment of the design.
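A toy version of an entropy-based interface-complexity measure: the planning complexity at each decision point is taken as the Shannon entropy of the user's next-action choice distribution, and the interface score is the average over decision points. The action sets and probabilities below are hypothetical, and the paper's actual measure additionally conditions on user schemas.

```python
import math

def entropy(probs):
    """Shannon entropy (bits) of a discrete choice distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def interface_complexity(decision_points):
    """Average per-decision entropy across an interface's decision points."""
    return sum(entropy(p) for p in decision_points) / len(decision_points)

# A consistent interface (one dominant action per state) scores lower than
# an inconsistent one (near-uniform choices), matching the schema argument.
consistent = [[0.9, 0.05, 0.05], [0.9, 0.1]]
inconsistent = [[1 / 3, 1 / 3, 1 / 3], [0.5, 0.5]]
```

The appeal of an information-theoretic score, as the abstract notes, is that it replaces the researcher's subjective judgment with a quantity computable from the interface's navigation structure.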

Performance Comparison of Channelization Schemes for Flexible Satellite Transponder with Digital Filter Banks (디지털 필터뱅크 기반 플렉서블 위성중계기를 위한 채널화 기법의 성능비교 연구)

  • Lee, Dong-Hun;Kim, Ki-Seon
    • Journal of the Korea Institute of Military Science and Technology / v.13 no.3 / pp.405-412 / 2010
  • The purpose of this paper is to compare the complexity and assess the flexibility of competing transponder architectures for satellite communication services. For performance comparison, we consider three channelization techniques: a digital down converter (DDC) based on the cascaded integrator-comb (CIC) filter, a tuneable pipeline frequency transform (T-PFT) based on a tree structure (TS), and variable oversampled complex-modulated filter banks (OCM-FB) based on the polyphase FFT (P-FFT). The comparison begins by presenting the basic architecture of each channelization method and includes analytical expressions for the number of multiplications as a measure of computational complexity. The analytical results show that the DDC with CIC filter imposes a heavy computational burden but offers perfect flexibility. The T-PFT based on the TS provides almost perfect flexibility with lower complexity than the CIC-based DDC for a large number of sub-channels. The OCM-FB based on the P-FFT shows high flexibility and the best computational complexity of the three approaches.
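A back-of-the-envelope comparison shows why a polyphase-FFT bank wins on multiplications per input sample against per-channel down-conversion. The cost models below are textbook approximations (mixer plus FIR per channel versus a shared polyphase prototype plus an amortized radix-2 FFT), not the exact expressions derived in the paper.

```python
import math

def ddc_mults_per_sample(n_ch, taps):
    """Per-channel DDC: one mixer multiply plus a taps-long FIR,
    repeated for every channel on every input sample."""
    return n_ch * (taps + 1)

def pfb_mults_per_sample(n_ch, taps):
    """Polyphase FFT bank: each input sample passes through one polyphase
    branch (taps/n_ch multiplies); an n_ch-point radix-2 FFT costing
    (n_ch/2)*log2(n_ch) multiplies runs once per n_ch input samples."""
    return taps / n_ch + 0.5 * math.log2(n_ch)
```

For 32 channels and a 128-tap prototype, the per-channel DDC model costs 32 × 129 = 4128 multiplications per input sample, while the filter-bank model costs only 128/32 + 2.5 = 6.5, which is the qualitative ordering the paper's analysis reports.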

Complexity-Reduced Algorithms for LDPC Decoder for DVB-S2 Systems

  • Choi, Eun-A;Jung, Ji-Won;Kim, Nae-Soo;Oh, Deock-Gil
    • ETRI Journal / v.27 no.5 / pp.639-642 / 2005
  • This paper proposes two complexity-reduced algorithms for a low-density parity-check (LDPC) decoder. First, sequential decoding using a partial group is proposed. It has the same hardware complexity and requires fewer iterations with little performance loss. The amount of performance loss can be determined by the designer, based on a trade-off with the desired reduction in complexity. Second, an early detection method for reducing the computational complexity is proposed. Using a confidence criterion, some bit nodes and check-node edges are detected early during decoding. Once the edges are detected, no further iteration over them is required; thus early detection reduces the computational complexity.
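The early-detection idea can be illustrated with a toy hard-decision bit-flipping decoder: decoding exits as soon as the confidence criterion (here, simply "every parity check satisfied") is met, instead of always spending the full iteration budget. This is a deliberately simplified stand-in for the sum-product decoder the letter analyzes, using a (7,4) Hamming parity-check matrix rather than a real DVB-S2 LDPC code.

```python
# H: parity-check matrix of the (7,4) Hamming code (toy stand-in for LDPC).
H = [
    [1, 0, 1, 0, 1, 0, 1],
    [0, 1, 1, 0, 0, 1, 1],
    [0, 0, 0, 1, 1, 1, 1],
]

def decode(word, max_iters=10):
    """Bit flipping with an early-detection exit on a clear syndrome."""
    word = list(word)
    for _ in range(max_iters):
        unsat = [i for i, row in enumerate(H)
                 if sum(h * b for h, b in zip(row, word)) % 2 == 1]
        if not unsat:          # confidence criterion met: stop iterating early
            return word
        # flip the bit involved in the most unsatisfied checks
        counts = [sum(H[i][j] for i in unsat) for j in range(len(word))]
        word[counts.index(max(counts))] ^= 1
    return word
```

On a clean or lightly corrupted word the loop exits after zero or one flips, which is the complexity saving early detection is after.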

DIGITAL WATERMARKING BASED ON COMPLEXITY OF BLOCK

  • Funahashi, Keita;Inazumi, Yasuhiro;Horita, Yuukou
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2009.01a / pp.678-683 / 2009
  • Much research [1] has been conducted on embedding digital watermarks in brightness. A prerequisite for a digital watermark is that the image quality does not change even if the volume of the embedded information increases. Generally, noise in complex areas of an image is less perceptible than noise in flat areas. Thus, we present a method for watermarking an image by embedding into complex areas by priority. The proposed method achieves higher image quality of digital watermarking than methods that do not take block complexity into consideration, although its PSNR is lower than that of a method not based on block complexity.
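A minimal sketch of complexity-ordered embedding: blocks are ranked by a simple activity measure and the payload bits go into the most complex blocks first, where noise is least visible. Variance as the complexity measure and LSB replacement as the embedding rule are illustrative choices, not the paper's actual scheme.

```python
# Complexity-ordered watermark embedding sketch (hypothetical scheme).

def variance(block):
    """Simple per-block activity measure."""
    m = sum(block) / len(block)
    return sum((x - m) ** 2 for x in block) / len(block)

def embed(blocks, bits):
    """Embed one bit per block, most complex blocks first
    (LSB of the block's first sample)."""
    order = sorted(range(len(blocks)),
                   key=lambda i: variance(blocks[i]), reverse=True)
    out = [list(b) for b in blocks]
    for idx, bit in zip(order, bits):
        out[idx][0] = (out[idx][0] & ~1) | bit
    return out, order[:len(bits)]

def extract(blocks, order):
    """Read the embedded bits back in embedding order."""
    return [blocks[i][0] & 1 for i in order]
```

Flat blocks are only touched once all complex blocks are used up, so for small payloads the visually sensitive regions stay unchanged.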

Complexity Theory and Organization Management (복잡성 이론과 기업경영: 프랙탈 경영방식을 중심으로)

  • Lee, Jang-Woo;Park, Hying-Gyu
    • Korean Management Science Review / v.15 no.2 / pp.239-257 / 1998
  • Facing the globalization of the world economy, intense market competition, and radical changes in information technology, firms are obliged to create a new type of organization characterized by flexibility and adaptability to new and dynamic environments. This paper briefly reviews theories of complexity in physics and discusses their implications for the management of business organizations. It draws core concepts from complexity theory, such as cooperative phenomena, self-organization, adaptation, positive feedback, and the butterfly effect, and attempts to identify their implications for business management. In particular, it suggests principles of 'fractal' management, which apply the fractal structure to the business organization.

Nature and Prospect of Complexity Paradigm (복잡계 패러다임의 특성과 전망)

  • Kim Mun-Cho
    • Journal of Science and Technology Studies / v.3 no.2 s.6 / pp.1-27 / 2003
  • The complexity paradigm is a scientific amalgam that aims to unite a range of theoretical perspectives and research agendas across the natural and social sciences. Proponents of the complexity paradigm lay claim to an increasing number of areas of study, including artificial life, interpersonal networks, the internal/international patterning of organizations, and the mapping of cyberspace, all of which can be subsumed under the title 'complexity turn.' Owing to the idea of the open system, the complexity paradigm has developed a number of new concepts, themes, and perspectives that help account for the complex mechanisms of living and non-living systems. A complex system exhibits a number of properties such as disequilibrium, nonlinearity, dissipative structure, self-organization, fractal geometry, autopoiesis, and coevolution. Following a brief introduction to the theoretical development, these properties are succinctly discussed. The complexity turn has provided a wealth of insights that enable the analysis of system operations of any kind, and it contributes much to illuminating the workings of social systems as well; the most remarkable attempt may be Niklas Luhmann's 'neofunctional system theory.' The merits and shortcomings of the complexity paradigm are examined and its future prospects assessed, with the conclusion that the complexity paradigm will continue to be useful both as an effective transdisciplinary framework and as a powerful analytical tool.
