• Title/Summary/Keyword: Complexity-Weighted


An Efficient Skipping Method of H.264/AVC Weighted Prediction for Various Illuminating Effects (다양한 영상의 밝기 효과에 대해 효과적으로 적응하는 H.264/AVC의 가중치 예측 생략 방법)

  • Choi, Ji-Ho; SunWoo, Myung-Hoon
    • Journal of the Institute of Electronics Engineers of Korea SP, v.47 no.5, pp.206-211, 2010
  • This paper describes a skipping method for handling various illumination effects in video sequences. Weighted prediction in H.264/AVC improves coding efficiency and image quality; however, it imposes a heavy computational overhead on the entire system, so reducing its computational complexity is important. Compared to the weighted prediction method in H.264/AVC, the proposed method can decrease the bitrate by up to 15%. Moreover, the proposed algorithm can reduce computational complexity by up to 30% compared to localized weighted prediction, which does not skip unnecessary calculations.
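
The abstract does not spell out the skipping criterion, so the sketch below only illustrates the general idea in Python: estimate a global weight/offset by least squares, but skip that work when the mean-luminance difference between frames suggests no illumination change. The threshold `dc_threshold` and the frame-level decision rule are hypothetical, not taken from the paper.

```python
import numpy as np

def should_skip_weighted_prediction(cur_frame, ref_frame, dc_threshold=1.5):
    """Hypothetical skip rule: if the mean-luminance (DC) difference between
    the current and reference frames is negligible, assume no global
    illumination change and skip weighted-prediction parameter estimation."""
    return abs(float(cur_frame.mean()) - float(ref_frame.mean())) < dc_threshold

def weighted_prediction_params(cur_frame, ref_frame):
    """Least-squares fit of cur ~ w * ref + o (one global weight and offset)."""
    ref = ref_frame.astype(np.float64).ravel()
    cur = cur_frame.astype(np.float64).ravel()
    A = np.stack([ref, np.ones_like(ref)], axis=1)
    (w, o), *_ = np.linalg.lstsq(A, cur, rcond=None)
    return w, o

# Usage: only pay the estimation cost when an illumination change is likely.
cur = np.random.randint(0, 256, (64, 64))
ref = np.clip(cur * 0.8 + 10, 0, 255)        # simulated global fade
if not should_skip_weighted_prediction(cur, ref):
    w, o = weighted_prediction_params(cur, ref)
    print(round(float(w), 3), round(float(o), 3))
```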

DISTRIBUTED ALGORITHMS SOLVING THE UPDATING PROBLEMS

  • Park, Jung-Ho; Park, Yoon-Young; Choi, Sung-Hee
    • Journal of Applied Mathematics & Informatics, v.9 no.2, pp.607-620, 2002
  • In this paper, we consider the updating problems of reconstructing the biconnected components and the weighted shortest paths in response to a topology change of the network. We propose two distributed algorithms. The first algorithm solves the updating problem that reconstructs the biconnected components after several processors and links are added and deleted. Its bit complexity is O((n'+a+d) log n'), its message complexity is O(n'+a+d), its ideal-time complexity is O(n'), and its space complexity is O(e log n + e' log n'). The second algorithm solves the updating problem that reconstructs the weighted shortest paths. Its message complexity and ideal-time complexity are both O(u^2 + a + n').
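
The paper's distributed algorithms are not reproduced in the abstract; the following is only a centralized sketch of the updating problem itself, using networkx as an assumed dependency to recompute the biconnected components and the single-source weighted shortest paths after a topology change.

```python
# Centralized sketch of the updating problem (not the distributed algorithms
# of the paper): after links are added and deleted, recompute the
# biconnected components and the weighted shortest paths from a source.
import networkx as nx   # assumed dependency, used only for reference results

G = nx.Graph()
G.add_weighted_edges_from([(1, 2, 3.0), (2, 3, 1.0), (3, 4, 2.0), (1, 3, 5.0)])

# Topology change: one link deleted, one link added.
G.remove_edge(1, 3)
G.add_edge(2, 4, weight=4.0)

print(list(nx.biconnected_components(G)))                      # node sets
print(nx.single_source_dijkstra_path_length(G, 1, weight="weight"))
```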

A Software Complexity Measurement Technique for Object-Oriented Reverse Engineering (객체지향 역공학을 위한 소프트웨어 복잡도 측정 기법)

  • Kim, Jongwan; Hwang, Chong-Sun
    • Journal of KIISE: Software and Applications, v.32 no.9, pp.847-852, 2005
  • Over the last decade, numerous complexity measurement techniques for object-oriented (OO) software systems have been proposed to manage the effects of OO code. These techniques are typically based on source-code analysis, such as WMC (Weighted Methods per Class) and LCOM (Lack of Cohesion in Methods), and are limited to counting the number of member functions in C++. In contrast, we suggest a new weighting method that also considers the number of parameters, the return value, and its data type. We then present an effective complexity measurement technique based on the weights of class interfaces to provide guidelines for measuring the class complexity of OO code in reverse engineering. The results of this research show that the proposed complexity measure, ECC (Enhanced Class Complexity), is consistent and accurate in a C++ environment.
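
As a rough illustration of interface-weighted class complexity, the sketch below scores each method by its parameter types and return type and sums the scores over a class. The type weights are hypothetical placeholders; the actual ECC weighting scheme is defined in the paper itself.

```python
# Hypothetical per-type weights; the paper defines the real ECC weights.
TYPE_WEIGHT = {"void": 0.0, "int": 1.0, "double": 1.5, "pointer": 2.0, "object": 2.5}

def method_complexity(param_types, return_type):
    """Weight a method by its parameter types and its return type."""
    w_params = sum(TYPE_WEIGHT.get(t, 1.0) for t in param_types)
    w_return = TYPE_WEIGHT.get(return_type, 1.0)
    return 1.0 + w_params + w_return      # 1.0 = base cost of the method itself

def class_complexity(methods):
    """Interface-weighted class score: sum of the method complexities."""
    return sum(method_complexity(p, r) for p, r in methods)

# Example: a C++ class with two methods described as (param types, return type).
shape = [(["double", "double"], "void"), (["pointer"], "int")]
print(class_complexity(shape))
```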

Low Complexity Hybrid Interpolation Algorithm using Weighted Edge Detector (가중치 윤곽선 검출기를 이용한 저 복잡도 하이브리드 보간 알고리듬)

  • Kwon, Hyeok-Jin; Jeon, Gwang-Gil; Jeong, Je-Chang
    • The Journal of Korean Institute of Communications and Information Sciences, v.32 no.3C, pp.241-248, 2007
  • In predictive image coding, an LS (Least Squares)-based adaptive predictor is an efficient method for improving the prediction of image edges. This paper proposes a hybrid interpolation scheme with a weighted edge detector. A hybrid approach that switches between bilinear interpolation and EDI (Edge-Directed Interpolation) is proposed in order to reduce the overall computational complexity, while keeping the objective and subjective quality similar to that of bilinear interpolation and EDI. Experimental results demonstrate that this hybrid interpolation method, which utilizes a weighted edge detector, achieves a reduction in complexity with minimal degradation of the interpolation results.
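
A minimal sketch of the switching idea, for the quincunx step of 2x interpolation: a cheap edge measure based on the two diagonal differences decides between a plain bilinear average and an edge-directed average. The threshold and the specific edge measure are assumptions, not the paper's weighted edge detector.

```python
import numpy as np

def interpolate_center(block2x2, edge_threshold=20.0):
    """Interpolate the pixel at the centre of a 2x2 neighbourhood.

    Hypothetical switching rule: if both diagonal differences are small the
    cheap bilinear average is used; otherwise the average is taken along the
    diagonal with the smaller difference, mimicking edge-directed
    interpolation on the quincunx lattice."""
    a, b = float(block2x2[0, 0]), float(block2x2[0, 1])
    c, d = float(block2x2[1, 0]), float(block2x2[1, 1])
    d1 = abs(a - d)                       # 135-degree diagonal difference
    d2 = abs(b - c)                       # 45-degree diagonal difference
    if max(d1, d2) < edge_threshold:      # flat area: bilinear average
        return (a + b + c + d) / 4.0
    return (b + c) / 2.0 if d1 > d2 else (a + d) / 2.0   # follow the edge

print(interpolate_center(np.array([[100, 101], [99, 100]])))   # flat -> 100.0
print(interpolate_center(np.array([[200, 120], [125, 40]])))   # edge -> 122.5
```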

An Efficient Distributed Algorithm for the Weighted Shortest-path Updating Problem (최단 경로 갱신문제를 해결하는 분산알고리듬)

  • Park, Jeong-Ho; Lee, Gyeong-O; Gang, Gyu-Cheol
    • The Transactions of the Korea Information Processing Society, v.7 no.6, pp.1778-1784, 2000
  • We consider the weighted shortest-path updating problem, that is, the problem of reconstructing the weighted shortest paths in response to a topology change of the network. This paper proposes a distributed algorithm that reconstructs the weighted shortest paths after several processors and links are added and deleted. Its message complexity and ideal-time complexity are both O(p^2 + q + n'), where n' is the number of processors in the network after the topology change, q is the number of added links, and p is the total number of processors in the biconnected components (of the network before the topology change) containing the deleted or added links.
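
For reference, here is a centralized (non-distributed) sketch of the updating problem: a plain Dijkstra baseline plus an incremental relaxation that handles a single added link. Deleting a link that lies on a shortest path would still require recomputation in this simplified form; function names and structure are illustrative only.

```python
import heapq

def dijkstra(adj, src):
    """Single-source shortest paths; adj[u] = {v: weight} for an undirected graph."""
    dist = {src: 0.0}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

def update_after_edge_insert(adj, dist, u, v, w):
    """Incremental update when a single undirected edge (u, v, w) is added:
    relax the new edge and propagate any improvement outward."""
    adj.setdefault(u, {})[v] = w
    adj.setdefault(v, {})[u] = w
    heap = []
    for a, b in ((u, v), (v, u)):
        nd = dist.get(a, float("inf")) + w
        if nd < dist.get(b, float("inf")):
            dist[b] = nd
            heapq.heappush(heap, (nd, b))
    while heap:                           # Dijkstra-style propagation
        d, x = heapq.heappop(heap)
        if d > dist.get(x, float("inf")):
            continue
        for y, wy in adj.get(x, {}).items():
            nd = d + wy
            if nd < dist.get(y, float("inf")):
                dist[y] = nd
                heapq.heappush(heap, (nd, y))
    return dist

adj = {1: {2: 3.0}, 2: {1: 3.0, 3: 1.0}, 3: {2: 1.0, 4: 2.0}, 4: {3: 2.0}}
dist = dijkstra(adj, 1)                               # {1: 0, 2: 3, 3: 4, 4: 6}
dist = update_after_edge_insert(adj, dist, 1, 4, 2.5)  # new link shortens node 4
print(dist)
```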


Weighted Distance-Based Quantization for Distributed Estimation

  • Kim, Yoon Hak
    • Journal of Information and Communication Convergence Engineering, v.12 no.4, pp.215-220, 2014
  • We consider quantization optimized for distributed estimation, where a set of sensors at different sites collect measurements of the parameter of interest, quantize them, and transmit the quantized data to a fusion node, which then estimates the parameter. Here, we propose an iterative quantizer design algorithm with a weighted distance rule that allows us to reduce a system-wide metric, such as the estimation error, by constructing quantization partitions with their optimal weights. We show that the search for the weights, the most expensive computational step in the algorithm, can be conducted in a sequential manner without compromising convergence, leading to a significant reduction in design complexity. Our experiments demonstrate that the proposed algorithm achieves improved performance over traditional quantizer designs. The benefit of the proposed technique is further illustrated by experiments that provide similar estimation performance at much lower complexity than recently published algorithms.
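
The abstract does not give the exact weighted distance rule, so the sketch below is a generic Lloyd-style scalar quantizer in which each quantization level carries a weight in the assignment cost; the paper's sequential weight-optimization step is replaced here by fixed, user-supplied weights.

```python
import numpy as np

def weighted_lloyd(samples, n_levels=4, weights=None, iters=50):
    """Lloyd-style scalar quantizer design with per-level weights in the
    distance rule: a sample x is assigned to argmin_i w_i * (x - c_i)^2.
    The weight-update rule of the paper is not reproduced here."""
    rng = np.random.default_rng(0)
    codebook = np.sort(rng.choice(samples, n_levels, replace=False))
    w = np.ones(n_levels) if weights is None else np.asarray(weights, float)
    assign = np.zeros(len(samples), dtype=int)
    for _ in range(iters):
        # Partition step: weighted nearest-centroid assignment.
        cost = w[None, :] * (samples[:, None] - codebook[None, :]) ** 2
        assign = cost.argmin(axis=1)
        # Centroid step: each level moves to the mean of its partition.
        for i in range(n_levels):
            if np.any(assign == i):
                codebook[i] = samples[assign == i].mean()
    return codebook, assign

samples = np.random.default_rng(1).normal(size=2000)
codebook, assign = weighted_lloyd(samples, weights=[1.0, 2.0, 2.0, 1.0])
print(np.round(codebook, 3))
```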

Fuzzy Linguistic Approach for Evaluating Task Complexity in Nuclear Power Plant (원자력발전소에서의 작업복잡도를 평가하기 위한 퍼지기반 작업복잡도 지수의 개발)

  • Jung, Kwang-Tae; Jung, Won-dea; Park, Jin-Kyun
    • Journal of the Korean Society of Safety, v.20 no.1 s.69, pp.126-132, 2005
  • The purpose of this study is to propose a method for evaluating task complexity using CIFs (Complexity Influencing Factors). We developed a method in which CIFs are used to evaluate task complexity through a fuzzy linguistic approach; that is, a fuzzy linguistic multi-criteria method for assessing task complexity in a specific task situation is proposed. The CIF ratings were assessed in linguistic terms, which are described by fuzzy numbers with triangular and trapezoidal membership functions. A fuzzy weighted-average algorithm based on the extension principle was employed to aggregate these fuzzy numbers. Finally, the method was validated experimentally: TCIM (Task Complexity Index Method) was shown to be an efficient way to evaluate task complexity, since the correlation coefficient between task performance time and TCI (Task Complexity Index) was 0.699.
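
A simplified sketch of the aggregation step: linguistic ratings are mapped to triangular fuzzy numbers, combined by a weighted average, and defuzzified into a crisp index. The linguistic scale and the weights are hypothetical, and crisp weights are used instead of the extension-principle computation with fuzzy weights described in the abstract.

```python
from dataclasses import dataclass

@dataclass
class TriangularFuzzy:
    """Triangular fuzzy number (left, mode, right)."""
    l: float
    m: float
    r: float

# Hypothetical linguistic scale for rating a complexity-influencing factor.
SCALE = {
    "low":    TriangularFuzzy(0.0, 0.0, 0.3),
    "medium": TriangularFuzzy(0.2, 0.5, 0.8),
    "high":   TriangularFuzzy(0.7, 1.0, 1.0),
}

def fuzzy_weighted_average(ratings, weights):
    """Weighted average of triangular fuzzy ratings with *crisp* weights.
    (The paper aggregates with fuzzy weights via the extension principle,
    which needs alpha-cut optimization; crisp weights keep this sketch simple.)"""
    total = sum(weights)
    l = sum(w * t.l for t, w in zip(ratings, weights)) / total
    m = sum(w * t.m for t, w in zip(ratings, weights)) / total
    r = sum(w * t.r for t, w in zip(ratings, weights)) / total
    return TriangularFuzzy(l, m, r)

def defuzzify(tfn):
    """Centroid of a triangular fuzzy number -> crisp task-complexity index."""
    return (tfn.l + tfn.m + tfn.r) / 3.0

tci = defuzzify(fuzzy_weighted_average(
    [SCALE["high"], SCALE["medium"], SCALE["low"]], weights=[0.5, 0.3, 0.2]))
print(round(tci, 3))
```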

Weighted Constrained One-Bit Transform Method for Low-Complexity Block Motion Estimation

  • Choi, Youngkyoung; Kim, Hyungwook; Lim, Sojeong; Yu, Sungwook
    • ETRI Journal, v.34 no.5, pp.795-798, 2012
  • This letter proposes a new low-complexity motion estimation method. The proposed method classifies the various non-matching pixel pairs into several categories and assigns an appropriate weight to each category in the matching stage. As a result, it can significantly improve performance compared to conventional methods while adding only one 1-bit addition and two Boolean operations per pixel.
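
A rough sketch of weighting non-matching pixel pairs in a one-bit-transform matching cost: bit-plane mismatches are split into two categories by the constraint masks and weighted differently. The transform here uses the block mean instead of the standard multi-band-pass filter, and the weights are placeholders, not the paper's values.

```python
import numpy as np

def one_bit_transform(block, threshold=15):
    """Simplified 1-bit transform: compare against the block mean; the
    constraint mask flags pixels that differ strongly from the mean,
    in the spirit of the constrained 1-bit transform."""
    mean = block.mean()
    bitplane = block > mean
    constraint = np.abs(block - mean) > threshold
    return bitplane, constraint

def weighted_nmp_cost(cur, ref, w_strong=2, w_weak=1):
    """Weighted number of non-matching points: mismatches where both
    constraint bits are set count more (hypothetical weights)."""
    cb, cc = one_bit_transform(cur)
    rb, rc = one_bit_transform(ref)
    mismatch = cb ^ rb                     # XOR of the two bit-planes
    strong = mismatch & cc & rc            # both pixels are "reliable"
    weak = mismatch & ~(cc & rc)
    return w_strong * strong.sum() + w_weak * weak.sum()

cur = np.random.randint(0, 256, (16, 16))
ref = np.roll(cur, 1, axis=1)              # candidate block shifted by 1 pixel
print(weighted_nmp_cost(cur, ref))
```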

Application of Nonuniform Weighted Distribution Method to Enhancing Signal Processing Effect of Subband Spatial-Temporal Adaptive Filter

  • Vuong Le Quoc; Tai Pham Trong
    • Proceedings of the IEEK Conference, Summer 2004, pp.97-102
  • A very complicated problem in spatial processing is the effect of fading (multipath and delay spread) and co-channel interference (CCI). Fading is one of the principal causes of inter-symbol interference (ISI). The spatial-temporal adaptive filter (STAF) has been taken as a solution to this problem because it can suppress both types of interference. However, the performance of STAF suffers from some fundamental limitations, among them the slow convergence of the adaptive process and the computational complexity, because STAF must process a large amount of information in both space and time. One way to overcome these limitations is to use a subband spatial-temporal adaptive filter (SSTAF). SSTAF reduces computational complexity by pruning signal samples, and thus loses some information in time, which attenuates the output SINR of the SSTAF. This article analyses an optimal solution to this problem by introducing an SSTAF with a nonuniform weighted distribution.
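
The paper concerns spatial-temporal (array) processing; the single-channel sketch below only illustrates the subband-plus-nonuniform-weighting idea: the signal is split into decimated polyphase branches, each branch is adapted independently with NLMS, and the branch outputs are recombined with nonuniform weights. The decomposition, step sizes, and weights are all assumptions, not the SSTAF of the paper.

```python
import numpy as np

def nlms(x, d, taps=8, mu=0.5, eps=1e-6):
    """Normalized LMS adaptive filter; returns its output while tracking d."""
    w = np.zeros(taps)
    y = np.zeros(len(x))
    for n in range(taps, len(x)):
        u = x[n - taps:n][::-1]                  # most recent samples first
        y[n] = w @ u
        w += mu * (d[n] - y[n]) * u / (u @ u + eps)
    return y

def subband_adaptive_filter(x, d, n_bands=4, band_weights=None):
    """Single-channel stand-in for a subband adaptive filter with a
    nonuniform weighted distribution: decimated polyphase branches are
    adapted independently and recombined with nonuniform branch weights."""
    w = (np.ones(n_bands) if band_weights is None
         else np.asarray(band_weights, dtype=float))
    y = np.zeros(len(x))
    for b in range(n_bands):
        xb, db = x[b::n_bands], d[b::n_bands]    # crude "subband" split
        y[b::n_bands] = w[b] * nlms(xb, db)      # nonuniform branch weight
    return y

rng = np.random.default_rng(0)
x = rng.normal(size=4000)
d = np.convolve(x, [0.5, -0.3, 0.1], mode="same")   # unknown channel to track
y = subband_adaptive_filter(x, d, band_weights=[1.2, 1.0, 1.0, 0.8])
```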


A Software Size Estimation Using Weighted FFP (가중치를 적용한 FFP 소프트웨어 규모 측정)

  • Park, Juseok
    • Journal of Internet Computing and Services, v.6 no.2, pp.37-47, 2005
  • Most methods of estimating software size are based on the functions provided to customers, and complexity is considered when assigning a score to each function. The FFP technique has the advantage of applying to a broad range of domains, such as data management, real-time systems, and algorithmic software, but it has the disadvantage of estimating size without weights for the necessary function elements. This paper proposes a method for estimating software size that considers the complexity of each function element in the full function point calculation, applied to both newly developed and maintenance projects. The validity of the proposed method was demonstrated using surveyed function-point data: weighting the function elements, the attributes used in software size estimation, yielded better size estimates than applying other weights.
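
A minimal sketch of a complexity-weighted FFP-style count: the standard data-movement element names are used, but the weights are hypothetical placeholders rather than the values derived in the paper.

```python
# Hypothetical complexity weights for FFP function elements (data movements).
FFP_WEIGHTS = {
    "entry": 1.0,        # data entering the process
    "exit": 1.0,         # data leaving the process
    "read": 1.2,         # data read from storage
    "write": 1.3,        # data written to storage
}

def weighted_ffp_size(element_counts, weights=FFP_WEIGHTS):
    """Software size as a complexity-weighted sum of function-element counts
    instead of the unweighted count used by plain FFP."""
    return sum(weights.get(name, 1.0) * count
               for name, count in element_counts.items())

# Example: a small maintenance project described by its counted elements.
print(weighted_ffp_size({"entry": 12, "exit": 9, "read": 20, "write": 7}))
```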
