• Title/Summary/Keyword: Information Complexity


A Complexity Metric for Web Documentation Based on Entropy (엔트로피를 기반으로한 Web 문서들의 복잡도 척도)

  • Kim, Kap-Su
    • Journal of The Korean Association of Information Education
    • /
    • v.2 no.2
    • /
    • pp.260-268
    • /
    • 1998
  • In this paper, I propose a metric model for measuring the complexity of Web documents written in HTML and XML. The complexity of Web documentation affects its understandability, which is an important metric in the maintenance and reuse of Web documents. More understandable documents have a greater effect on WEI. The proposed metric uses entropy to represent the degree of information flow between Web documents; the proposed documentation complexity measures the information flow in a Web document based on the information-passing relationships between Web document files. I evaluate the proposed metric against the complexity properties proposed by Weyuker and measure the document complexity, and I show its effectiveness by analyzing the correlation between the number of document files and document complexity.

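The entropy computation at the core of such a metric can be sketched as follows. This is a minimal illustration assuming the metric is Shannon entropy over the distribution of link-based information flows between document files; the paper's exact formulation (and the meaning of WEI) is not given in the abstract.

```python
import math
from collections import Counter

def flow_entropy(links):
    """Shannon entropy (in bits) of the distribution of information flows.

    `links` is a list of (source, target) pairs between document files;
    each occurrence of a pair counts as one unit of information flow.
    """
    counts = Counter(links)
    total = sum(counts.values())
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# Hypothetical site: index.html links out to three files, one of them twice.
flows = [("index", "a"), ("index", "b"), ("index", "c"), ("index", "a")]
# flow_entropy(flows) -> 1.5 bits
```

A site whose flows are spread evenly over many file pairs yields higher entropy, i.e. higher complexity under this reading.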

A Structural Complexity Metric for Web Application based on Similarity (유사도 기반의 웹 어플리케이션 구조 복잡도)

  • Jung, Woo-Sung;Lee, Eun-Joo
    • Journal of the Korea Society of Computer and Information
    • /
    • v.15 no.8
    • /
    • pp.117-126
    • /
    • 2010
  • Software complexity is used to evaluate a target system's maintainability. Existing complexity metrics for web applications are count-based, so it is hard for them to incorporate the understandability of developers or maintainers. To make up for this shortcoming, entropy theory can be applied to define complexity; however, that approach assumes the information quantity of each page is identical. In this paper, the structural complexity of a web application is defined based on information theory and similarity. In detail, the proposed complexity is defined using entropy, as in the previous approach, but the information quantity of individual pages is defined using similarity: a page that is similar to many pages has a smaller information quantity than a page that is dissimilar to the others. Furthermore, various similarity measures can be used for various views, which yields many-sided complexity measures. Finally, several complexity properties are applied to verify the proposed metric, and case studies show its applicability.
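The similarity-weighted information quantity can be sketched as below. The abstract does not give the paper's formula, so this assumes the information quantity of a page is the negative log of its average similarity to the other pages, which reproduces the stated property that a page similar to many others carries less information.

```python
import math

def info_quantity(sim, i):
    """Information quantity of page i from a pairwise similarity matrix
    `sim` (values in (0, 1]). Sketch only: -log2 of the average similarity
    of page i to every other page."""
    others = [sim[i][j] for j in range(len(sim)) if j != i]
    p = sum(others) / len(others)
    return -math.log2(p)

# Hypothetical 3-page application: page 0 resembles the others more
# than page 2 does, so page 2 carries more information.
sim = [[1.0, 1.0, 0.5],
       [1.0, 1.0, 0.5],
       [0.5, 0.5, 1.0]]
```

Summing (or averaging) these quantities over all pages would give one entropy-style structural complexity; different similarity measures plug into `sim` to give the "many-sided" variants the abstract mentions.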

Low Complexity Vector Quantizer Design for LSP Parameters

  • Woo, Hong-Chae
    • The Journal of the Acoustical Society of Korea
    • /
    • v.17 no.3E
    • /
    • pp.53-57
    • /
    • 1998
  • Spectral information in a speech coder should be quantized with sufficient accuracy to keep the output speech perceptually transparent. In a low-bit-rate speech coder, spectral information is usually transformed into the corresponding line spectrum pair (LSP) parameters and is often quantized with a vector quantization algorithm. As the vector quantization algorithm generally has high complexity in the optimal code vector search routine, complexity reduction in that routine is investigated using the ordering property of the line spectrum pair. When the proposed complexity reduction algorithm is applied to the well-known split vector quantization algorithm, a 46% complexity reduction is achieved in the distortion measure computation.

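One way the LSP ordering property (frequencies are strictly increasing within a vector) can prune the search is sketched below: in split VQ, any candidate for the second split whose first element does not exceed the last quantized element of the first split can be rejected before its distortion is computed. This is an assumed illustration of the idea, not the paper's exact algorithm.

```python
def split_vq_search(x2, codebook2, prev_last):
    """Nearest-codevector search for the second split of a split VQ.

    `x2`: the second sub-vector of LSP parameters to quantize.
    `codebook2`: candidate code vectors for that split.
    `prev_last`: last element of the already-quantized first split.
    Candidates violating the LSP ordering constraint are skipped,
    saving the full distortion computation for them.
    """
    best, best_d = None, float("inf")
    for idx, c in enumerate(codebook2):
        if c[0] <= prev_last:   # ordering violated: skip distortion computation
            continue
        d = sum((a - b) ** 2 for a, b in zip(x2, c))
        if d < best_d:
            best, best_d = idx, d
    return best, best_d

# Hypothetical codebook; the first entry is rejected without any arithmetic.
codebook2 = [(0.1, 0.2), (0.4, 0.5), (0.6, 0.7)]
```

The fraction of candidates rejected this way is where a reduction like the reported 46% in distortion computation would come from.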

TOPOLOGICAL COMPLEXITY OF SEMIGROUP ACTIONS

  • Yan, Xinhua;He, Lianfa
    • Journal of the Korean Mathematical Society
    • /
    • v.45 no.1
    • /
    • pp.221-228
    • /
    • 2008
  • In this paper, we study the complexity of semigroup actions using complexity functions of open covers. The main results are as follows: (1) A dynamical system is equicontinuous if and only if any open cover has bounded complexity; (2) Weak-mixing implies scattering; (3) We get a criterion for the scattering property.
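A plausible formalization of the complexity function used here, following the open-cover complexity framework of Blanchard, Host, and Maass (the paper's exact definitions may differ): for an action of a semigroup $S$ on a compact space $X$, a finite open cover $\mathcal{U}$, and a finite subset $F \subset S$, set

```latex
c(\mathcal{U}, F) \;=\; N\!\Big(\bigvee_{s \in F} s^{-1}\mathcal{U}\Big),
```

where $N(\cdot)$ denotes the minimal cardinality of a subcover. Under this reading, result (1) says the action is equicontinuous if and only if $\sup_{F} c(\mathcal{U}, F) < \infty$ for every finite open cover $\mathcal{U}$.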

Contextual Factors Affecting the Information Sharing through Information Systems (정보시스템을 통한 정보공유에 영향을 미치는 상황요인)

  • Kang, Jae-Jung
    • Asia pacific journal of information systems
    • /
    • v.11 no.2
    • /
    • pp.141-158
    • /
    • 2001
  • This paper examines the effects of environmental uncertainty, structural decentralization, formalization, complexity, and task interdependence on information sharing through information systems. 197 firms in Korea were surveyed and analyzed to investigate the relationship between the contextual variables and information sharing. The result of a multiple regression analysis shows that task interdependence, structural decentralization, and complexity are significant factors influencing information sharing. Additional analysis shows that task interdependence and structural decentralization are the major factors in the service industry, while task interdependence and structural complexity are the major factors in the manufacturing industry.

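The multiple regression behind such a result can be sketched in a few lines. The data here is synthetic (the study's survey data is not available), the variable names are taken from the abstract, and the coefficients are invented for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 197  # sample size matching the study's 197 firms; the data itself is synthetic

# Columns: task interdependence, structural decentralization, complexity
X = rng.normal(size=(n, 3))
beta_true = np.array([0.6, 0.4, 0.3])        # invented effect sizes
y = X @ beta_true + rng.normal(scale=0.5, size=n)  # information-sharing score

Xd = np.column_stack([np.ones(n), X])         # add an intercept column
beta_hat, *_ = np.linalg.lstsq(Xd, y, rcond=None)
# beta_hat[1:] estimates the three contextual-factor coefficients
```

Significance of each factor would then be judged from the coefficient estimates and their standard errors, which is the form of evidence the abstract reports.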

DISTRIBUTED ALGORITHMS SOLVING THE UPDATING PROBLEMS

  • Park, Jung-Ho;Park, Yoon-Young;Choi, Sung-Hee
    • Journal of applied mathematics & informatics
    • /
    • v.9 no.2
    • /
    • pp.607-620
    • /
    • 2002
  • In this paper, we consider the updating problems of reconstructing the biconnected components and the weighted shortest paths in response to topology changes of the network. We propose two distributed algorithms. The first algorithm solves the updating problem of reconstructing the biconnected components after several processors and links are added and deleted. Its bit complexity is $O((n'+a+d)\log n')$, its message complexity is $O(n'+a+d)$, its ideal time complexity is $O(n')$, and its space complexity is $O(e \log n + e' \log n')$. The second algorithm solves the updating problem of reconstructing the weighted shortest paths; its message complexity and ideal-time complexity are both $O(u^2+a+n')$.
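For reference, the structure the first algorithm maintains can be computed centrally with the classic Hopcroft-Tarjan DFS; the sketch below is that centralized version, not the paper's distributed message-passing algorithm.

```python
def biconnected_components(adj):
    """Biconnected components of a simple undirected graph given as
    {vertex: [neighbors]}, via the Hopcroft-Tarjan DFS with an edge stack.
    Each component is returned as a list of its edges."""
    disc, low, stack, comps = {}, {}, [], []
    timer = [0]

    def dfs(u, parent):
        disc[u] = low[u] = timer[0]
        timer[0] += 1
        for v in adj[u]:
            if v == parent:
                continue
            if v not in disc:               # tree edge
                stack.append((u, v))
                dfs(v, u)
                low[u] = min(low[u], low[v])
                if low[v] >= disc[u]:       # u separates this subtree: pop one component
                    comp = []
                    while True:
                        e = stack.pop()
                        comp.append(e)
                        if e == (u, v):
                            break
                    comps.append(comp)
            elif disc[v] < disc[u]:         # back edge
                stack.append((u, v))
                low[u] = min(low[u], disc[v])

    for u in adj:
        if u not in disc:
            dfs(u, None)
    return comps
```

On a triangle 0-1-2 with a pendant edge 2-3, this yields two components: the bridge {(2,3)} and the three-edge cycle. The distributed version's job is to repair this decomposition incrementally after links and processors change, rather than recompute it from scratch.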

A High-Speed 2-Parallel Radix-$2^4$ FFT Processor for MB-OFDM UWB Systems (MB-OFDM UWB 통신 시스템을 위한 고속 2-Parallel Radix-$2^4$ FFT 프로세서의 설계)

  • Lee, Jee-Sung;Lee, Han-Ho
    • Proceedings of the IEEK Conference
    • /
    • 2006.06a
    • /
    • pp.533-534
    • /
    • 2006
  • This paper presents the architecture design of a high-speed, low-complexity 128-point radix-$2^4$ FFT processor for ultra-wideband (UWB) systems. The proposed architecture provides a high throughput rate and low hardware complexity by using a 2-parallel data-path scheme and a single-path delay-feedback (SDF) structure. This paper presents the key ideas applied to the design, especially those for achieving a high throughput rate and reducing hardware complexity. The proposed FFT processor has been designed and implemented in a 0.18-${\mu}m$ CMOS technology with a supply voltage of 1.8 V. Its throughput rate is up to 1 Gsample/s while requiring much lower hardware complexity.

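For reference, the transform such a processor computes can be sketched in software with a plain radix-2 decimation-in-time FFT. The paper's contribution is the radix-$2^4$ 2-parallel SDF *hardware* organization of this computation, which the sketch does not model.

```python
import cmath

def fft(x):
    """Recursive radix-2 decimation-in-time FFT (length must be a power of 2).
    A software reference for the 128-point transform; the radix-2^4
    2-parallel single-path delay-feedback architecture restructures this
    same computation for 1 Gsample/s throughput in hardware."""
    n = len(x)
    if n == 1:
        return list(x)
    even, odd = fft(x[0::2]), fft(x[1::2])
    tw = [cmath.exp(-2j * cmath.pi * k / n) * odd[k] for k in range(n // 2)]
    return ([even[k] + tw[k] for k in range(n // 2)] +
            [even[k] - tw[k] for k in range(n // 2)])
```

A higher radix (here $2^4$) reduces the number of non-trivial twiddle-factor multipliers per output, which is the main source of the hardware savings the abstract claims.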

Complexity based Sensing Strategy for Spectrum Sensing in Cognitive Radio Networks

  • Huang, Kewen;Liu, Yimin;Hong, Yuanquan;Mu, Junsheng
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.13 no.9
    • /
    • pp.4372-4389
    • /
    • 2019
  • Spectrum sensing has attracted much attention due to its significant contribution to idle spectrum detection in cognitive radio networks. However, complexity-based sensing strategies for spectrum sensing have seldom been discussed in depth. Motivated by this, this paper is devoted to a complexity-based sensing strategy for spectrum sensing. Firstly, three efficiency functions are defined to estimate the sensing efficiency of a spectrum sensing scheme. Then a novel sensing strategy is proposed that accounts for both sensing performance and computational complexity. After that, the proposed sensing strategy is extended to the energy detector, the cyclostationary feature detector, the covariance matrix detector, and the cooperative spectrum detector. The proposed sensing strategy provides a novel insight into sensing performance estimation through its consideration of both sensing capacity and sensing complexity. Simulations analyze the three efficiency functions and the optimal sensing strategy for the energy, cyclostationary feature, and covariance matrix detectors.
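Of the detectors named, the energy detector is the simplest and can be sketched directly; the abstract does not specify the three efficiency functions, so none is reproduced here, only the O(N)-per-decision detector whose low complexity motivates the trade-off.

```python
def energy_detect(samples, threshold):
    """Energy detection: declare the channel occupied when the average
    sample power exceeds a threshold. Costs O(N) per decision, versus the
    far costlier covariance-matrix and cyclostationary-feature tests that
    buy better detection at low SNR."""
    energy = sum(abs(s) ** 2 for s in samples) / len(samples)
    return energy > threshold
```

A complexity-based strategy in the abstract's sense would weigh each detector's probability of detection against a cost like this per-decision operation count when choosing which test to run.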

Low-Complexity Sub-Pixel Motion Estimation Utilizing Shifting Matrix in Transform Domain

  • Ryu, Chul;Shin, Jae-Young;Park, Eun-Chan
    • Journal of Electrical Engineering and Technology
    • /
    • v.11 no.4
    • /
    • pp.1020-1026
    • /
    • 2016
  • Motion estimation (ME) algorithms supporting quarter-pixel accuracy have been introduced in the state-of-the-art H.264/AVC video compression standard to retain detailed motion information for high video quality. Conventional sub-pixel ME algorithms in the spatial domain face a common problem of computational complexity because of their embedded interpolation schemes. This paper proposes a low-complexity sub-pixel motion estimation algorithm in the transform domain that utilizes a shifting matrix. Simulations are performed to compare the performance of spatial-domain and transform-domain ME algorithms in terms of peak signal-to-noise ratio (PSNR) and the number of bits per frame. The results confirm that the transform-domain approach not only improves video quality and compression efficiency, but also remarkably alleviates the computational complexity compared to the spatial-domain approach.
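For context, the spatial-domain baseline that the transform-domain method competes with is integer-pel block matching, sketched below with the sum of absolute differences (SAD) criterion; the paper's shifting-matrix formulation and its sub-pixel refinement are not reproduced here.

```python
def full_search_sad(ref, cur, bx, by, bs, r):
    """Spatial-domain full-search block matching with SAD.

    `ref`, `cur`: reference and current frames as 2-D lists.
    `(bx, by)`: top-left corner of the current block, `bs`: block size,
    `r`: search range in pixels. Returns the best (dx, dy) and its SAD.
    Sub-pixel accuracy would additionally require interpolating `ref`,
    which is exactly the cost the transform-domain approach avoids.
    """
    h, w = len(ref), len(ref[0])
    best, best_sad = (0, 0), float("inf")
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            y0, x0 = by + dy, bx + dx
            if y0 < 0 or x0 < 0 or y0 + bs > h or x0 + bs > w:
                continue  # candidate block falls outside the frame
            sad = sum(abs(cur[by + i][bx + j] - ref[y0 + i][x0 + j])
                      for i in range(bs) for j in range(bs))
            if sad < best_sad:
                best, best_sad = (dx, dy), sad
    return best, best_sad
```

Each quarter-pixel candidate in the spatial domain multiplies this cost by an interpolation step, which is why moving the shift into the transform domain pays off.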

Impossible Differential Cryptanalysis on DVB-CSA

  • Zhang, Kai;Guan, Jie;Hu, Bin
    • KSII Transactions on Internet and Information Systems (TIIS)
    • /
    • v.10 no.4
    • /
    • pp.1944-1956
    • /
    • 2016
  • The Digital Video Broadcasting-Common Scrambling Algorithm (DVB-CSA) is an ETSI-designated algorithm designed for protecting MPEG-2 signal streams, and it is universally used. Its structure is a typical hybrid symmetric cipher combining a stream cipher part and a block cipher part; although the key entropy is 64 bits, no effective cryptanalytic results have been published up to now. This paper studies the security of CSA against impossible differential cryptanalysis: a 20-round impossible differential for the block cipher part is proposed and a flaw in the cipher structure is revealed. When we attack the block cipher part (CSA-BC) alone, reduced to 21 rounds, recovering 16 bits of the initial key takes data complexity $O(2^{44.5})$, computational complexity $O(2^{22.7})$, and memory complexity $O(2^{10.5})$. Exploiting the structural flaw, an attack on CSA with the block cipher part reduced to 21 rounds is proposed with computational complexity $O(2^{21.7})$, data complexity $O(2^{43.5})$, and memory complexity $O(2^{10.5})$, recovering 8 bits of the key. Considering both the block cipher part and the stream cipher part of CSA, this is, as far as we know, the best published result on CSA.