• Title/Summary/Keyword: adaptive weighting

Search Results: 113

Self-Adaptive Performance Improvement of Novel SDD Equalization Using Sigmoid Estimate and Threshold Decision-Weighted Error (시그모이드 추정과 임계 판정 가중 오차를 사용한 새로운 SDD 등화의 자기적응 성능 개선)

  • Oh, Kil Nam
    • Journal of the Korea Academia-Industrial cooperation Society / v.17 no.8 / pp.17-22 / 2016
  • For the self-adaptive equalization of higher-order QAM systems, this paper proposes a new soft decision-directed (SDD) algorithm that opens the eye patterns quickly and significantly reduces the steady-state error level when applied to the initial equalization stage with completely closed eye patterns. For M-QAM applications, the proposed method minimizes the computational complexity of conventional SDD by estimating the symbol from only the two constellation symbols closest to the observation, which greatly simplifies the soft decision independently of the QAM order. In the symbol estimation, it increases the reliability of the estimates by exploiting the favorable properties of the sigmoid function, thereby avoiding the erroneous estimates of a hard threshold function. In addition, initialization performance is improved by weighting the error used to update the equalizer with the threshold-function symbol decision, which extends the range of error fluctuations. As a result, the proposed method remarkably improves the computational complexity and the initialization and convergence properties of traditional SDD. Through simulations for 64-QAM and 256-QAM under multipath channel conditions with additive noise, the usefulness of the proposed methods was confirmed by comparing the performance of the proposed 2-SDD and two forms of weighted 2-SDD with CMA.
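The two-nearest-symbol soft estimate with a sigmoid weight described in this abstract can be sketched roughly as follows for one real (I or Q) dimension of a QAM constellation. This is a minimal illustration, not the paper's algorithm: the function names and the logistic `scale` parameter are assumptions.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def two_symbol_soft_estimate(y, levels, scale=0.5):
    """Soft symbol estimate from the two constellation levels nearest
    to the observation y (one real dimension of a QAM symbol).

    A logistic (sigmoid) weight interpolates between the two nearest
    levels, so the soft decision never involves the full M-ary
    alphabet -- only its two closest members."""
    levels = np.sort(np.asarray(levels, dtype=float))
    d = np.abs(levels - y)                      # distance to every level
    i1, i2 = np.argsort(d)[:2]                  # indices of the two nearest
    lo = min(levels[i1], levels[i2])
    hi = max(levels[i1], levels[i2])
    # sigmoid weight driven by how far y sits from the midpoint
    w = sigmoid((y - 0.5 * (lo + hi)) / scale)
    return lo + (hi - lo) * w
```

Because only two candidate symbols enter the estimate, the cost per update is independent of the QAM order, which mirrors the complexity argument in the abstract.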

A Study on Projection Image Restoration by Adaptive Filtering (적응적 필터링에 의한 투사영상 복원에 관한 연구)

  • 김정희;김광익
    • Journal of Biomedical Engineering Research / v.19 no.2 / pp.119-128 / 1998
  • This paper describes a filtering algorithm that employs a priori information on SPECT lesion detectability to filter degraded projection images prior to backprojection reconstruction. In this algorithm, we determined m minimum detectable lesion sizes (MDLSs) by assuming m object contrasts chosen uniformly in the range 0.0-1.0, based on a signal/noise model that expresses the detection capability of SPECT in terms of physical factors. A best estimate of a given projection image is formed as a weighted combination of the subimages produced by m optimal filters, each designed to maximize the local S/N ratio for its MDLS lesion. These subimages show relatively greater resolution recovery and relatively less noise reduction as the MDLS decreases, and the weighting of each subimage is controlled by the difference between the subimage and the maximum-resolution-recovered projection image. The proposed filtering algorithm was tested on SPECT image reconstruction problems and produced good results. In particular, the algorithm showed an adaptive effect: it approximately averages the filter outputs in homogeneous areas, while in textured lesion areas of the reconstructed image it depends sensitively on each filter's strength in preserving and enhancing contrast.

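The difference-controlled weighting of filter subimages described in this abstract can be illustrated with a minimal sketch. The names and the exponential weighting of per-pixel deviations are assumptions; the paper's actual weighting rule may differ.

```python
import numpy as np

def adaptive_weighted_combination(subimages, reference, beta=1.0):
    """Combine m filtered subimages into one estimate, weighting each
    pixel of each subimage by its closeness to a reference image
    (e.g. the most resolution-recovered subimage).  Pixels where a
    subimage deviates strongly from the reference are down-weighted."""
    subs = np.asarray(subimages, dtype=float)   # shape (m, H, W)
    diff = np.abs(subs - reference)             # per-pixel deviation
    w = np.exp(-beta * diff)                    # small deviation -> large weight
    w /= w.sum(axis=0, keepdims=True)           # normalize over the m filters
    return (w * subs).sum(axis=0)
```

In homogeneous areas where all subimages agree, the weights become nearly uniform and the combination approaches a plain average, matching the adaptive effect noted in the abstract.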

Assessment of water use vulnerability in the unit watersheds using TOPSIS approach with subjective and objective weights (주관적·객관적 가중치를 활용한 TOPSIS 기반 단위유역별 물이용 취약성 평가)

  • Park, Hye Sun;Kim, Jeong Bin;Um, Myoung-Jin;Kim, Yeonjoo
    • Journal of Korea Water Resources Association / v.49 no.8 / pp.685-692 / 2016
  • This study aimed to develop an indicator-based approach to assess water use vulnerability in watersheds and applied it to the unit watersheds within the Han River watershed. The vulnerability indices comprised three sub-components (exposure, sensitivity, adaptive capacity) with respect to water use, made up of 16 water use indicators. We then estimated the vulnerability indices using the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). We collected environmental and socio-economic data from national statistics databases and combined them with results simulated by the Soil and Water Assessment Tool (SWAT) model. For estimating the weights of each indicator, expert surveys were used for the subjective weights and the data-based Shannon entropy method for the objective weights. By comparing the vulnerability ranks and analyzing the rank correlation between the two methods, we evaluated the vulnerabilities of the Han River watershed. For water use, vulnerable watersheds showed high water use and high water leakage ratios. The indices from both weighting methods showed generally similar spatial distributions. These results suggest that considering different weighting methods is important for reliably assessing water use vulnerability in watersheds.
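The objective-weighting step used in this study, Shannon's entropy method followed by TOPSIS, follows standard textbook formulas and can be sketched as below. The function names are illustrative; rows are watersheds and columns are indicators.

```python
import numpy as np

def entropy_weights(X):
    """Shannon-entropy objective weights for a decision matrix X:
    indicators with more dispersion across watersheds get more weight."""
    P = X / X.sum(axis=0)
    P = np.where(P == 0, 1e-12, P)              # avoid log(0)
    k = 1.0 / np.log(X.shape[0])
    e = -k * (P * np.log(P)).sum(axis=0)        # entropy per indicator
    d = 1.0 - e                                 # degree of divergence
    return d / d.sum()

def topsis(X, w, benefit):
    """TOPSIS closeness scores in [0, 1]: vector-normalize, weight,
    then measure distance to the ideal and anti-ideal solutions."""
    R = X / np.sqrt((X ** 2).sum(axis=0))
    V = R * w
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti) ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)
```

The subjective (expert-survey) weights would simply replace the output of `entropy_weights` in the `topsis` call, which is how the two weighting schemes can be compared on the same indicator matrix.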

Time- and Frequency-Domain Block LMS Adaptive Digital Filters: Part Ⅰ- Realization Structures (시간영역 및 주파수영역 블럭적응 여파기에 관한 연구 : 제1부- 구현방법)

  • Lee, Jae-Chon;Un, Chong-Kwan
    • The Journal of the Acoustical Society of Korea / v.7 no.4 / pp.31-53 / 1988
  • In this work we study extensively the structures and performance characteristics of block least mean-square (BLMS) adaptive digital filters (ADFs) that can be realized efficiently using the fast Fourier transform (FFT). The weights of a BLMS ADF realized with the FFT can be adjusted either in the time domain or in the frequency domain, leading to the time-domain BLMS (TBLMS) algorithm or the frequency-domain BLMS (FBLMS) algorithm, respectively. In Part I of the paper, we first present new results on the overlap-add realization and the number-theoretic transform realization of FBLMS ADFs. Then, we study how the concepts of frequency-weighting the error signals and self-orthogonalizing the weight adjustment can be incorporated into FBLMS ADFs, and also into TBLMS ADFs. As a result, we show that the TBLMS ADF can be made to have the same fast convergence speed as the self-orthogonalizing FBLMS ADF. Next, based on the properties of the sectioning operations in weight adjustment, we discuss unconstrained FBLMS algorithms that save two FFT operations in both the overlap-save and overlap-add realizations. Finally, we investigate by computer simulation the effects of different parameter values and algorithms on the convergence behavior of FBLMS and TBLMS ADFs. In Part II of the paper, we will analyze the convergence characteristics of the TBLMS and FBLMS ADFs.

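A minimal overlap-save FBLMS sketch in the spirit of this paper, showing both the constrained weight adjustment and the unconstrained variant that saves two FFTs per block. The step size and block length are illustrative assumptions, not values from the paper.

```python
import numpy as np

def fblms(x, d, N, mu=0.01, constrained=True):
    """Overlap-save frequency-domain block LMS with filter length N,
    block length N, and 2N-point FFTs.  With constrained=True the
    gradient is projected back onto an N-tap time-domain filter; the
    unconstrained variant skips that projection, saving two FFTs per
    block at the cost of circular-convolution effects."""
    W = np.zeros(2 * N, dtype=complex)          # frequency-domain weights
    y = np.zeros(len(d))
    for k in range(1, len(x) // N):
        xk = x[(k - 1) * N:(k + 1) * N]         # previous block + current block
        X = np.fft.fft(xk)
        yk = np.real(np.fft.ifft(X * W))[N:]    # overlap-save: keep last N outputs
        y[k * N:(k + 1) * N] = yk
        e = d[k * N:(k + 1) * N] - yk
        E = np.fft.fft(np.concatenate([np.zeros(N), e]))
        g = np.conj(X) * E                      # frequency-domain gradient estimate
        if constrained:
            gt = np.real(np.fft.ifft(g))
            gt[N:] = 0.0                        # gradient constraint (sectioning)
            g = np.fft.fft(gt)
        W = W + mu * g
    return y
```

Self-orthogonalization, mentioned in the abstract, would additionally scale each bin's update by an estimate of the inverse input power in that bin; it is omitted here for brevity.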

Locally adaptive intelligent interpolation for population distribution modeling using pre-classified land cover data and geographically weighted regression (지표피복 데이터와 지리가중회귀모형을 이용한 인구분포 추정에 관한 연구)

  • Kim, Hwahwan
    • Journal of the Korean association of regional geographers / v.22 no.1 / pp.251-266 / 2016
  • Intelligent interpolation methods such as dasymetric mapping are considered the best way to disaggregate zone-based population data, since they observe and utilize the internal variation within each source zone. This research reviews the advantages and problems of dasymetric mapping and presents a geographically weighted regression (GWR) based method that accounts for the spatial heterogeneity of the population density-land cover relationship. The locally adaptive intelligent interpolation method can make use of readily available ancillary information in the public domain without additional data processing. In the case study, we use the pre-classified National Land Cover Dataset 2011 to test the performance of the proposed method (the GWR-based multi-class dasymetric method) against four other popular population estimation methods: areal weighting interpolation, pycnophylactic interpolation, the binary dasymetric method, and the globally fitted ordinary least squares (OLS) based multi-class dasymetric method. The GWR-based multi-class dasymetric method outperforms all the others. This is attributed to the fact that spatial heterogeneity is accounted for when determining the density parameters of the land cover classes.

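The core GWR computation, a weighted least-squares fit at every location with distance-decay weights, can be sketched as follows. The Gaussian kernel and the names are illustrative assumptions; the study's actual kernel and bandwidth selection are not specified in the abstract.

```python
import numpy as np

def gwr_coefficients(coords, x, y, bandwidth):
    """At each location, fit weighted least squares with Gaussian
    distance-decay weights, so the regression coefficients (e.g.
    density parameters per land-cover class) vary over space instead
    of being fitted once globally as in OLS."""
    coords = np.asarray(coords, float)
    X = np.column_stack([np.ones(len(coords)), np.asarray(x, float)])  # intercept + predictor
    y = np.asarray(y, float)
    betas = []
    for c in coords:
        dist = np.linalg.norm(coords - c, axis=1)
        w = np.exp(-0.5 * (dist / bandwidth) ** 2)   # Gaussian kernel weights
        XtW = X.T * w                                # weight each observation
        betas.append(np.linalg.solve(XtW @ X, XtW @ y))
    return np.array(betas)                           # one coefficient vector per location
```

A multi-class dasymetric version would use one predictor column per land-cover class, giving each class a locally varying density parameter.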

Color-Depth Combined Semantic Image Segmentation Method (색상과 깊이정보를 융합한 의미론적 영상 분할 방법)

  • Kim, Man-Joung;Kang, Hyun-Soo
    • Journal of the Korea Institute of Information and Communication Engineering / v.18 no.3 / pp.687-696 / 2014
  • This paper presents a semantic object extraction method using a user's stroke input together with color and depth information. It is assumed that a semantically meaningful object is enclosed by a few strokes from the user and has similar depths throughout. In the proposed method, the region of interest (ROI) is determined from the stroke input, and the object is then extracted using color and depth information. Specifically, the method consists of two steps. The first step is over-segmentation inside the ROI using color and depth information. The second step is object extraction, in which the over-segmented regions are classified into object and background regions according to their depths. For the over-segmentation step, we propose a new marker extraction method with two contributions: an adaptive thresholding scheme that maximizes the number of segmented regions, and an adaptive weighting scheme for the color and depth components in the computation of the morphological gradients required for marker extraction. For object extraction, we classify the over-segmented regions into object and background regions in order from the boundary regions to the inner regions, comparing the average depth of each region with the average depth of all regions already classified as object. Experimental results demonstrate that the proposed method yields reasonable object extraction results.
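The adaptive weighting of color and depth components can be illustrated with a simple sketch. Finite-difference gradients stand in for the paper's morphological gradients, and the energy-based weight is an assumption, not the authors' rule.

```python
import numpy as np

def grad_mag(img):
    """Gradient magnitude via finite differences (stand-in for the
    morphological gradient used in the paper)."""
    gy, gx = np.gradient(img.astype(float))
    return np.hypot(gx, gy)

def combined_gradient(color, depth):
    """Fuse color and depth gradients with an adaptive weight: the
    modality carrying more overall gradient energy contributes more
    to the fused gradient used for marker extraction."""
    gc, gd = grad_mag(color), grad_mag(depth)
    ec, ed = gc.sum(), gd.sum()
    alpha = ec / (ec + ed + 1e-12)              # adaptive weight in [0, 1]
    return alpha * gc + (1.0 - alpha) * gd, alpha
```

When the depth map is flat (no depth edges), the weight shifts almost entirely to the color gradient, and vice versa, which is the qualitative behavior an adaptive weighting scheme is meant to provide.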

The Proxy Variables Selection of Vulnerability Assessment for Agricultural Infrastructure According to Climate Change (논문 - 기후변화에 따른 농업생산기반 재해 취약성 평가를 위한 대리변수 선정)

  • Kim, Sung-Jae;Park, Tae-Yang;Kim, Sung-Min;Kim, Sang-Min
    • KCID journal / v.18 no.2 / pp.33-42 / 2011
  • Climate change affects not only the average temperature but also the intensity and frequency of extreme events such as floods and droughts. Damage to agricultural infrastructure is also expected to increase as rainfall intensity and frequency rise with climate change. To strengthen climate change adaptation capacity, it is necessary to identify the vulnerability of a society's physical infrastructure and to develop appropriate adaptation strategies for infrastructure management, because facilities related to human settlements are generally vulnerable to climate change, and adaptive public infrastructure would reduce damage and repair costs. Developing mitigation strategies for agricultural infrastructure against climatic hazards is therefore very important, yet few studies have addressed agricultural infrastructure vulnerability assessment and adaptation strategies. The concept of vulnerability, however, is difficult to define operationally, because vulnerability spans many aspects (biological, socioeconomic, etc.) across various sectors. Much vulnerability research has therefore used indicators, which lend themselves to standardization and aggregation. In this study, for the vulnerability assessment of agricultural infrastructure, three categories (climate exposure, sensitivity, and adaptation capacity) were defined, composed of 16 sub-categories and 49 proxy variables. A database for each proxy variable was established by local administrative province. Future studies are required to define the weighting factors and the standardization method used to calculate the vulnerability indicator for agricultural infrastructure under climate change.

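Since the abstract leaves weighting and standardization to future work, the following is only a generic sketch of how proxy variables might be min-max standardized and aggregated into exposure, sensitivity, and adaptive-capacity components with assumed equal weights; the exposure + sensitivity - capacity form is one common convention, not the study's published formula.

```python
import numpy as np

def minmax(x):
    """Min-max standardization of one proxy variable across provinces."""
    x = np.asarray(x, float)
    return (x - x.min()) / (x.max() - x.min())

def vulnerability_index(exposure, sensitivity, capacity, w=(1/3, 1/3, 1/3)):
    """Composite vulnerability indicator per province: standardize each
    proxy-variable column, average within each category, then combine
    as weighted exposure + sensitivity minus adaptive capacity."""
    E = np.mean([minmax(c) for c in np.asarray(exposure, float).T], axis=0)
    S = np.mean([minmax(c) for c in np.asarray(sensitivity, float).T], axis=0)
    A = np.mean([minmax(c) for c in np.asarray(capacity, float).T], axis=0)
    return w[0] * E + w[1] * S - w[2] * A
```

With 49 proxy variables, each category argument would simply carry more columns; the aggregation logic is unchanged.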

Search Algorithm for Efficient Optimal Path based on Time-weighted (시간 가중치 기반 효율적인 최적 경로 탐색 기법 연구)

  • Her, Yu-sung;Kim, Tae-woo;Ahn, Yonghak
    • Journal of the Korea Convergence Society / v.11 no.2 / pp.1-8 / 2020
  • In this paper, we propose an optimal path search algorithm between each node and a midpoint that applies time weighting. Services that suggest a meeting point usually derive it from the users' locations alone, which is inefficient in terms of time because only location is considered. To solve this problem of existing location-based search methods, the proposed algorithm sets weights between each node and the midpoint that reflect both the users' location information and the required travel time, and then uses them to search for an optimal path. In addition, to increase search efficiency, it achieves high accuracy by setting the weights adaptively to the given information. Experimental results show that the proposed algorithm finds the optimal path to the midpoint better than the existing location-based method.
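A time-weighted meeting-point search of the kind described can be sketched with shortest travel times on a weighted graph. Dijkstra's algorithm and the sum-of-travel-times criterion are illustrative choices, not necessarily the authors' exact weighting.

```python
import heapq

def dijkstra(graph, src):
    """Shortest travel times from src in a weighted graph of the form
    {node: [(neighbor, minutes), ...]}."""
    dist = {src: 0.0}
    pq = [(0.0, src)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                            # stale queue entry
        for v, w in graph.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(pq, (nd, v))
    return dist

def time_weighted_midpoint(graph, users):
    """Meeting point chosen by travel *time*, not geometric distance:
    the node minimizing the sum of all users' travel times."""
    tables = [dijkstra(graph, u) for u in users]
    nodes = set().union(*(set(t) for t in tables))
    return min(nodes, key=lambda n: sum(t.get(n, float("inf")) for t in tables))
```

Replacing the sum with a maximum would instead minimize the worst-case travel time, another plausible way to weight the midpoint by time.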

AN EXPERIMENTAL STUDY ON THE CHANGE OF CONDYLE HEAD AFTER MANDIBULAR RAMUS OBLIQUE OSTEOTOMY (하악지 사선골절단술 후 하악두의 변화에 관한 실험적 연구)

  • Cha, Seon-Kyung;Kim, Yeo-Gab
    • Maxillofacial Plastic and Reconstructive Surgery / v.14 no.1_2 / pp.65-76 / 1992
  • This study was designed to observe the adaptive changes of the mandibular condyle in response to condylar displacement in adult animals. Sixteen rabbits weighing about 3.5 kg were selected. Four rabbits were preserved as the control group, and the experimental animals were divided into three groups: 2 weeks, 4 weeks, and 8 weeks. The experimental animals underwent oblique osteotomy on both mandibular rami with internal wiring at the mandibular border. They were sacrificed at 2, 4, and 8 weeks after the oblique osteotomy, and the mandibular condyles were carefully dissected out to produce tissue specimens. The specimens were fixed in 10% neutral formalin solution for 24 hours, rinsed with phosphate buffer solution, and decalcified with 5% nitric acid for 15 days. They were then dehydrated in an alcohol series and embedded in paraffin in the usual manner. The embedded specimens were sectioned at 4-6 μm on a microtome, stained with hematoxylin-eosin and azan stains, and observed under a light microscope. The following results were observed when there was positional change of the condylar head after mandibular ramus oblique osteotomy in the adult rabbit: 1. The density of chondrocytes was generally increased in the condylar cartilage, and the thickness of the condylar cartilage was slightly increased on the posterosuperior aspect of the mandibular condyle. 2. The density of chondrocytes was increased in the proliferative zone, so that the fibrous articular zone, proliferative zone, and hypertrophic zone were clearly distinguished. 3. Active endochondral bone formation occurred in the mandibular condyle.


A Novel Two-Stage Training Method for Unbiased Scene Graph Generation via Distribution Alignment

  • Dongdong Jia;Meili Zhou;Wei WEI;Dong Wang;Zongwen Bai
    • KSII Transactions on Internet and Information Systems (TIIS) / v.17 no.12 / pp.3383-3397 / 2023
  • Scene graphs serve as semantic abstractions of images and play a crucial role in enhancing visual comprehension and reasoning. However, the performance of scene graph generation (SGG) is often compromised by biased data in real-world situations. Many existing systems use a single stage of learning for both feature extraction and classification, while some employ class-balancing strategies such as re-weighting, data resampling, and transfer learning from head to tail classes. In this paper, we propose a novel approach that decouples the feature extraction and classification phases of the scene graph generation process. For feature extraction, we leverage a transformer-based architecture and design an adaptive calibration function specifically for predicate classification, which dynamically adjusts the classification scores for each predicate category. Additionally, we introduce a Distribution Alignment technique that balances the class distribution after the feature extraction phase reaches a stable state, thereby facilitating the retraining of the classification head. Importantly, our Distribution Alignment strategy is model-independent and requires no additional supervision, making it applicable to a wide range of SGG models. Using the scene graph diagnostic toolkit on Visual Genome with several popular models, our model achieved significant improvements over previous state-of-the-art methods. Compared to the TDE model, ours improves mR@100 by 70.5% for the PredCls task, by 84.0% for SGCls, and by 97.6% for SGDet.
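As a point of reference for the class-balancing strategies mentioned in this abstract, a common post-hoc re-weighting device adjusts classification logits by the log of the class prior. This is a generic sketch of that device, not the paper's Distribution Alignment method.

```python
import numpy as np

def logit_adjust(logits, class_counts, tau=1.0):
    """Post-hoc class-frequency adjustment of classification logits:
    subtracting tau * log(prior) penalizes head (frequent) predicate
    classes and boosts tail (rare) ones at decision time."""
    prior = np.asarray(class_counts, float)
    prior = prior / prior.sum()
    return logits - tau * np.log(prior)
```

Like the Distribution Alignment strategy described above, this adjustment is model-independent: it operates purely on the classifier's output scores and the observed class distribution.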