• Title/Summary/Keyword: Information Complexity

Search results: 5,349 (processing time: 0.026 seconds)

High-throughput Low-complexity Mixed-radix FFT Processor using a Dual-path Shared Complex Constant Multiplier

  • Nguyen, Tram Thi Bao;Lee, Hanho
    • JSTS:Journal of Semiconductor Technology and Science
    • /
    • v.17 no.1
    • /
    • pp.101-109
    • /
    • 2017
  • This paper presents a high-throughput low-complexity 512-point eight-parallel mixed-radix multipath delay feedback (MDF) fast Fourier transform (FFT) processor architecture for orthogonal frequency division multiplexing (OFDM) applications. To decrease the number of twiddle factor (TF) multiplications, a mixed-radix $2^4/2^3$ FFT algorithm is adopted. Moreover, a dual-path shared canonical signed digit (CSD) complex constant multiplier using a multi-layer scheme is proposed for reducing the hardware complexity of the TF multiplication. The proposed FFT processor is implemented using TSMC 90-nm CMOS technology. The synthesis results demonstrate that the proposed FFT processor can lead to a 16% reduction in hardware complexity and higher throughput compared to conventional architectures.
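
As a side note on the CSD technique named above: a canonical signed digit representation encodes a constant with digits {-1, 0, +1} and no two adjacent non-zero digits, which minimizes the add/subtract operations a constant multiplier needs. A minimal Python sketch of the encoding (illustrative only; the paper's dual-path multi-layer multiplier is a hardware design, not reproduced here):

```python
def csd_digits(n):
    """Return CSD (non-adjacent form) digits of a non-negative integer,
    least-significant digit first; each digit is -1, 0, or +1."""
    digits = []
    while n != 0:
        if n % 2 == 1:
            d = 2 - (n % 4)   # +1 if n % 4 == 1, -1 if n % 4 == 3
            n -= d
        else:
            d = 0
        digits.append(d)
        n //= 2
    return digits

def csd_value(digits):
    """Reconstruct the integer from its CSD digits."""
    return sum(d * (1 << i) for i, d in enumerate(digits))
```

For example, 7 encodes as 8 - 1 (digits [-1, 0, 0, 1]), so multiplying by 7 needs one subtraction instead of two additions.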

Adaptive De-interlacing Algorithm using Method Selection based on Degree of Local Complexity (지역 복잡도 기반 방법 선택을 이용한 적응적 디인터레이싱 알고리듬)

  • Hong, Sung-Min;Park, Sang-Jun;Jeong, Je-Chang
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.36 no.4C
    • /
    • pp.217-225
    • /
    • 2011
  • In this paper, we propose an adaptive de-interlacing algorithm based on the degree of local complexity. Conventional intra-field de-interlacing algorithms perform differently depending on how they find the edge direction. In particular, the FDD (Fine Directional De-interlacing) algorithm outperforms the others, but its computational complexity is too high. To alleviate these problems, the proposed algorithm selects the most efficient de-interlacing method among the LA (Line Average), MELA (Modified Edge-based Line Average), and LCID (Low-Complexity Interpolation Method for De-interlacing) algorithms, all of which combine low complexity with good performance. The selection is trained on the DoLC (Degree of Local Complexity). Simulation results show that the proposed algorithm not only has low complexity but also achieves better objective and subjective image quality than conventional intra-field methods.
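
Of the candidate interpolators named in the abstract, LA (line average) is the simplest: each missing pixel is the mean of the pixels directly above and below. A minimal sketch, assuming 8-bit grayscale rows as plain lists (the MELA/LCID variants and the DoLC-based selector are not reproduced here):

```python
def line_average(above, below):
    """Interpolate a missing interlaced line as the per-pixel
    average of the lines above and below it."""
    return [(a + b) // 2 for a, b in zip(above, below)]
```

For instance, line_average([10, 20, 30], [30, 20, 10]) yields [20, 20, 20]; edge-directed methods such as MELA replace this vertical pair with a pair chosen along the detected edge direction.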

Distributed video coding complexity balancing method by phase motion estimation algorithm (단계적 움직임 예측을 이용한 분산비디오코딩(DVC)의 복잡도 분배 방법)

  • Kim, Chul-Keun;Kim, Min-Geon;Suh, Doug-Young;Park, Jong-Bin;Jeon, Byeung-Woo
    • Journal of Broadcast Engineering
    • /
    • v.15 no.1
    • /
    • pp.112-121
    • /
    • 2010
  • Distributed video coding is a coding paradigm that, in contrast with conventional video coding, allows complexity to be shared between encoder and decoder. We propose a complexity balancing method between encoder and decoder based on a phased motion estimation algorithm. The encoder performs partial motion estimation and transfers the result to the decoder, which then performs motion estimation within a narrow range. When the encoder can afford some additional complexity, complexity balancing becomes possible. The proposed method also reveals the relationship between complexity balancing and coding efficiency: coding efficiency improves faster with added encoder complexity than with added decoder complexity. The proposed method can thus control complexity and coding efficiency according to device resources and channel conditions.
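
The encoder/decoder split described above can be illustrated as a two-phase block search: a coarse, large-step search (the encoder's share) followed by a narrow refinement around the coarse result (the decoder's share). A minimal sketch with hypothetical block and frame sizes, not the paper's actual algorithm:

```python
def sad(block, ref, dy, dx):
    """Sum of absolute differences between block and ref at offset (dy, dx)."""
    return sum(abs(block[y][x] - ref[dy + y][dx + x])
               for y in range(len(block)) for x in range(len(block[0])))

def search(block, ref, candidates):
    """Return the candidate offset with the lowest SAD."""
    return min(candidates, key=lambda mv: sad(block, ref, mv[0], mv[1]))

def two_phase_me(block, ref, step=2, radius=1):
    """Phase 1 (encoder): coarse grid search with a large step.
    Phase 2 (decoder): refinement within +/- radius of the coarse result."""
    h = len(ref) - len(block)
    w = len(ref[0]) - len(block[0])
    coarse = search(block, ref, [(y, x) for y in range(0, h + 1, step)
                                 for x in range(0, w + 1, step)])
    fine = [(y, x)
            for y in range(max(0, coarse[0] - radius), min(h, coarse[0] + radius) + 1)
            for x in range(max(0, coarse[1] - radius), min(w, coarse[1] + radius) + 1)]
    return search(block, ref, fine)
```

Shifting work between the phases (larger step, smaller radius, or vice versa) is exactly the complexity-balancing knob the abstract describes.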

On the Signal Power Normalization Approach to the Escalator Adaptive Filter Algorithms

  • Kim Nam-Yong
    • The Journal of Korean Institute of Communications and Information Sciences
    • /
    • v.31 no.8C
    • /
    • pp.801-805
    • /
    • 2006
  • A normalization approach to coefficient adaptation in the escalator (ESC) filter structure, which conventionally employs the least mean square (LMS) algorithm, is introduced. Using a Taylor expansion of the local error signal, a normalized form of the ESC-LMS algorithm is derived. Compared with the conventional ESC-LMS algorithm, which estimates input power for its time-varying convergence coefficient with a single-pole low-pass filter, the proposed method reduces computational complexity by 50% without performance degradation.

Revolution of nuclear energy efficiency, economic complexity, air transportation and industrial improvement on environmental footprint cost: A novel dynamic simulation approach

  • Ali, Shahid;Jiang, Junfeng;Hassan, Syed Tauseef;Shah, Ashfaq Ahmad
    • Nuclear Engineering and Technology
    • /
    • v.54 no.10
    • /
    • pp.3682-3694
    • /
    • 2022
  • The expansion of a country's ecological footprint generates resources for economic development. China's import bill and carbon footprint can be reduced by investing in green transportation and energy technologies. Since a sustainable environment depends on halting climate change, the current study investigates nuclear energy efficiency, economic complexity, air transportation, and industrial improvement for reducing the environmental footprint. Using data spanning the years 1983-2016, the dynamic autoregressive distributed lag simulation method demonstrates the short- and long-term variability in the impact of the regressors on the ecological footprint. The findings reveal that economic complexity has a statistically significant impact on China's ecological footprint, and that the industrial improvement process also benefits it. In the short term, air travel has a negative impact on the ecological footprint, but this effect diminishes over time. Additionally, energy innovation is negative and substantial in both the short and long run, demonstrating its positive role in reducing the ecological footprint. Policy implications can be drawn across economic complexity, industrial improvement, air transportation, energy innovation, and ecological impact to achieve sustainability goals.

Development of a Functional Complexity Reduction Concept of MMIS for Innovative SMRs

  • Gyan, Philip Kweku;Jung, Jae Cheon
    • Journal of the Korean Society of Systems Engineering
    • /
    • v.17 no.2
    • /
    • pp.69-81
    • /
    • 2021
  • Human performance and increased automation issues in advanced Small Modular Reactors (SMRs) are critical to numerous stakeholders in the nuclear industry because of their undesirable implications for the Man Machine Interface Systems (MMIS) complexity of Generation IV SMRs; the design of future SMRs must address these problems. Multi Agent Systems (MAS) are now used in the industrial sector to solve complex problems, so incorporating this technology into the proposed innovative SMR (I-SMR) design will contribute greatly to decision making during plant operations and reduce both the number of MCR operating crew and human errors, although it is expected to introduce an increased level of complexity. The tools used to analyze the system for complexity reduction are McCabe's cyclomatic complexity metric and the Henry-Kafura information flow metric. Throughout, the systems engineering approach guides the engineering of the complexity reduction concept for the system in its entirety.
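
McCabe's metric, named in the abstract, is easy to state: for a control-flow graph with E edges, N nodes, and P connected components, V(G) = E - N + 2P. A minimal sketch with a hypothetical control-flow graph (the paper's MMIS models are not reproduced):

```python
def cyclomatic_complexity(edges, num_nodes, components=1):
    """McCabe's cyclomatic complexity: V(G) = E - N + 2P."""
    return len(edges) - num_nodes + 2 * components

# Hypothetical control-flow graph of a single if/else:
#   entry -> cond, cond -> then, cond -> else, then -> exit, else -> exit
IF_ELSE = [("entry", "cond"), ("cond", "then"), ("cond", "else"),
           ("then", "exit"), ("else", "exit")]
```

Here V(G) = 5 - 5 + 2*1 = 2, matching the graph's single decision point; straight-line code scores 1, and each added branch raises the score by one.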

Determining the complexity level of proceduralized tasks in a digitalized main control room using the TACOM measure

  • Inseok Jang;Jinkyun Park
    • Nuclear Engineering and Technology
    • /
    • v.54 no.11
    • /
    • pp.4170-4180
    • /
    • 2022
  • The task complexity (TACOM) measure was previously developed to quantify the complexity of proceduralized tasks conducted by nuclear power plant operators. Following its development, the TACOM measure's appropriateness has been validated by investigating the relationship between TACOM scores and three kinds of human performance data: response times, human error probabilities, and subjective workload scores. However, the information reflected in quantified TACOM scores is still insufficient to determine the complexity levels of proceduralized tasks for human reliability analysis (HRA) applications. The objective of this study is therefore to suggest criteria for determining task complexity levels based on a logistic regression between human error occurrences in digitalized main control rooms and TACOM scores. The analysis confirms that the likelihood of human error occurrence can be estimated from the TACOM score, which strongly implies that the TACOM measure can be used to identify levels of task complexity applicable to various research domains, including HRA.
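
The logistic-regression criterion described above maps a score to an error probability via p = 1 / (1 + exp(-(b0 + b1 * score))), and complexity levels follow by binning p. A minimal sketch with hypothetical coefficients and cut points (the study's fitted values are not reproduced here):

```python
import math

def error_probability(score, beta0=-6.0, beta1=1.2):
    """Logistic model of human error probability vs. TACOM score.
    beta0/beta1 are hypothetical placeholders, not the paper's fit."""
    return 1.0 / (1.0 + math.exp(-(beta0 + beta1 * score)))

def complexity_level(score, cutpoints=(0.05, 0.5), beta0=-6.0, beta1=1.2):
    """Bin a score into low/medium/high by its predicted error probability."""
    p = error_probability(score, beta0, beta1)
    if p < cutpoints[0]:
        return "low"
    return "medium" if p < cutpoints[1] else "high"
```

The probabilities stay in (0, 1) and increase monotonically with the score, which is the property the study relies on to turn a continuous TACOM score into discrete HRA complexity levels.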

Task Complexity of Movement Skills for Robots (로봇 운동솜씨의 작업 복잡도)

  • Kwon, Woo-Young;Suh, Il-Hong;Lee, Jun-Goo;You, Bum-Jae;Oh, Sang-Rok
    • The Journal of Korea Robotics Society
    • /
    • v.7 no.3
    • /
    • pp.194-204
    • /
    • 2012
  • Measuring the task complexity of a movement skill is important for evaluating how difficult the task is for an autonomous robot to learn or imitate. Although many complexity measures have been proposed in fields such as neuroscience, physics, computer science, and biology, little attention has been paid to robotic tasks. To measure the complexity of a robotic task, we propose an information-theoretic measure for the task complexity of movement skills. By modeling proprioceptive as well as exteroceptive sensor data as multivariate Gaussian distributions, the movements of a task can be modeled probabilistically. Additionally, the complexity of temporal variations is modeled by sampling in time and treating each sample as an individual random variable. To evaluate the proposed complexity measure, several experiments are performed on real robotic movement tasks.
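
An information-theoretic measure of the kind described can be grounded in the differential entropy of a multivariate Gaussian, h = 0.5 * ln((2*pi*e)^k * det(Sigma)): the wider the sensor-data distribution, the higher the entropy, and hence the "complexity". A minimal sketch for the 2-D case (the paper's full sensor model and temporal sampling are not reproduced):

```python
import math

def gaussian_entropy_2d(cov):
    """Differential entropy (in nats) of a 2-D Gaussian with covariance
    matrix cov: h = 0.5 * ln((2*pi*e)^2 * det(cov))."""
    det = cov[0][0] * cov[1][1] - cov[0][1] * cov[1][0]
    if det <= 0:
        raise ValueError("covariance must be positive definite")
    return 0.5 * math.log(((2 * math.pi * math.e) ** 2) * det)
```

The identity covariance gives h = ln(2*pi*e), about 2.84 nats; scaling the covariance up (a more variable movement) raises the entropy, and summing such terms over time samples gives a temporal complexity in the spirit of the abstract.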

Reduced Complexity Schnorr-Euchner Sphere Decoders in MIMO Applications

  • Le Minh-Tuan;Pham Van-Su;Mai Linh;Yoon Gi-Wan
    • Journal of information and communication convergence engineering
    • /
    • v.4 no.2
    • /
    • pp.79-83
    • /
    • 2006
  • We present two techniques based on lookup tables to reduce the complexity of the well-known Schnorr-Euchner (SE) sphere decoder (SD) without introducing performance degradation. With the aid of lookup tables, the computational load caused by SE enumeration and decision feedback is reduced at the cost of higher storage capacity. Simulation results are provided to verify the performance and complexity of the proposed decoders.
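
One way a lookup table can trim SE enumeration cost is by precomputing, for each quantized input value, the constellation points in order of increasing distance, so the runtime search is a single index operation. A minimal sketch for a 4-PAM alphabet with a hypothetical quantization step (not the paper's exact tables):

```python
LEVELS = (-3, -1, 1, 3)    # 4-PAM constellation (one real dimension)
STEP = 0.25                # quantization step of the table input
LO, HI = -4.0, 4.0         # input range covered by the table

def _order(v):
    """Constellation points sorted by distance from v (SE enumeration order)."""
    return tuple(sorted(LEVELS, key=lambda s: abs(s - v)))

# Precomputed table: quantized input index -> enumeration order.
TABLE = [_order(LO + i * STEP) for i in range(int((HI - LO) / STEP) + 1)]

def se_order(v):
    """Look up the enumeration order for input v (clamped to the table range)."""
    i = int(round((min(max(v, LO), HI) - LO) / STEP))
    return TABLE[i]
```

The memory/computation trade-off is visible directly: a finer STEP means a larger table but an enumeration order that more closely tracks the exact sort.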

Low-Complexity Maximum-Likelihood Decoder for V-BLAST Architecture

  • Le, Minh-Tuan;Pham, Van-Su;Mai, Linh;Yoon, Gi-Wan
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference
    • /
    • v.9 no.1
    • /
    • pp.126-130
    • /
    • 2005
  • In this paper, a low-complexity maximum-likelihood (ML) decoder based on QR decomposition, called the real-valued LCMLDec decoder or RVLCMLDec for short, is proposed for the Vertical Bell Labs Layered Space-Time (V-BLAST) architecture, a promising candidate for providing high data rates in future fixed wireless communication systems [1]. Computer simulations, in comparison with other detection techniques, show that the proposed decoder provides the V-BLAST scheme with ML performance at low detection complexity.
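
QR-based ML detection in outline: factor the channel H = QR, rotate the received vector to z = Q^T y, and minimize ||z - Rs||^2 over candidate symbol vectors; the upper-triangular R lets the metric accumulate layer by layer. A minimal 2x2 real-valued BPSK sketch (exhaustive over the 4 candidates, rather than the paper's reduced-complexity search):

```python
import itertools
import math

def qr_2x2(H):
    """Gram-Schmidt QR factorization of a real 2x2 matrix: H = Q R."""
    (a, b), (c, d) = H
    n1 = math.hypot(a, c)
    q1 = (a / n1, c / n1)
    r12 = q1[0] * b + q1[1] * d
    u = (b - r12 * q1[0], d - r12 * q1[1])
    n2 = math.hypot(*u)
    q2 = (u[0] / n2, u[1] / n2)
    Q = [[q1[0], q2[0]], [q1[1], q2[1]]]
    R = [[n1, r12], [0.0, n2]]
    return Q, R

def ml_detect(H, y, alphabet=(-1.0, 1.0)):
    """Exhaustive ML detection of s from y = H s (+ noise) via QR."""
    Q, R = qr_2x2(H)
    z = (Q[0][0] * y[0] + Q[1][0] * y[1],   # z = Q^T y
         Q[0][1] * y[0] + Q[1][1] * y[1])
    def metric(s):
        e1 = z[0] - (R[0][0] * s[0] + R[0][1] * s[1])
        e2 = z[1] - R[1][1] * s[1]          # R[1][0] is zero
        return e1 * e1 + e2 * e2
    return min(itertools.product(alphabet, repeat=2), key=metric)
```

Because Q is orthogonal, ||z - Rs|| equals ||y - Hs||, so the QR form changes nothing about which candidate wins; it only restructures the metric so that partial sums can be pruned, which is where the low-complexity variants save work.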
