Title/Summary/Keyword: Weighted Distance


A Study on 1-D Bit-Serial Array Processor Design for Code-String Matching Using a MWLD Algorithm (MWLD 알고리즘을 이용한 문자열정합 1차원 Bit-Serial 어레이 프로세서의 설계)

  • 박종진;김은원;조원경
    • Journal of the Korean Institute of Telematics and Electronics B, v.29B no.2, pp.1-8, 1992
  • This paper proposes a Modified Weighted Levenshtein Distance (MWLD) algorithm for the design of a code-string matching processor. The proposed MWLD algorithm is realized as a one-dimensional bit-serial array processor that performs pattern matching using a Hamming distance. The processor is applied to character recognition with real-time input, achieving a recognition rate of 98.65% for Hangul strokes.
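
The abstract does not spell out the recurrence, but a Weighted Levenshtein Distance is the classic edit-distance dynamic program with per-operation costs. A minimal Python sketch follows; the default unit weights and the function name are illustrative, and the paper's hardware mapping onto a 1-D bit-serial array (with a Hamming-distance match test) is not reproduced here.

```python
def weighted_levenshtein(a, b, w_ins=1.0, w_del=1.0, w_sub=1.0):
    """Dynamic-programming weighted Levenshtein distance between strings."""
    m, n = len(a), len(b)
    prev = [j * w_ins for j in range(n + 1)]  # distances against the empty prefix of a
    for i in range(1, m + 1):
        curr = [i * w_del] + [0.0] * n
        for j in range(1, n + 1):
            sub = prev[j - 1] + (0.0 if a[i - 1] == b[j - 1] else w_sub)
            curr[j] = min(prev[j] + w_del,      # delete a[i-1]
                          curr[j - 1] + w_ins,  # insert b[j-1]
                          sub)                  # substitute or match
        prev = curr
    return prev[n]

print(weighted_levenshtein("kitten", "sitting"))  # 3.0 with unit weights
```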

Weighted Distance-Based Quantization for Distributed Estimation

  • Kim, Yoon Hak
    • Journal of information and communication convergence engineering, v.12 no.4, pp.215-220, 2014
  • We consider quantization optimized for distributed estimation, where a set of sensors at different sites collect measurements of the parameter of interest, quantize them, and transmit the quantized data to a fusion node, which then estimates the parameter. Here, we propose an iterative quantizer design algorithm with a weighted distance rule that allows us to reduce a system-wide metric, such as the estimation error, by constructing quantization partitions with their optimal weights. We show that the search for the weights, the most expensive computational step in the algorithm, can be conducted sequentially without compromising convergence, leading to a significant reduction in design complexity. Our experiments demonstrate that the proposed algorithm achieves improved performance over traditional quantizer designs. The benefit of the proposed technique is further illustrated by experiments showing similar estimation performance with much lower complexity than recently published algorithms.
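
As a concrete illustration of the alternating structure described above (build partitions under a weighted distance, then search the weights one at a time), here is a hedged Python sketch for scalar measurements. The weighted squared-error metric, the candidate weight grid, and all parameter values are illustrative assumptions rather than the paper's exact design.

```python
import numpy as np

def distortion(samples, codewords, weights):
    """System-wide metric (illustrative): mean weighted squared error."""
    d = weights[None, :] * (samples[:, None] - codewords[None, :]) ** 2
    return d.min(axis=1).mean()

def design_quantizer(samples, levels=4, iters=20, grid=(0.5, 1.0, 2.0, 4.0)):
    rng = np.random.default_rng(0)
    codewords = np.sort(rng.choice(samples, size=levels, replace=False))
    weights = np.ones(levels)
    for _ in range(iters):
        # Partition step: assign each sample to the nearest codeword
        # under the weighted distance rule.
        d = weights[None, :] * (samples[:, None] - codewords[None, :]) ** 2
        assign = d.argmin(axis=1)
        for k in range(levels):
            cell = samples[assign == k]
            if cell.size:
                codewords[k] = cell.mean()
        # Sequential weight search: optimize one weight at a time over a
        # small candidate grid, keeping the choice that lowers distortion.
        for k in range(levels):
            cand = [distortion(samples, codewords,
                               np.where(np.arange(levels) == k, g, weights))
                    for g in grid]
            weights[k] = grid[int(np.argmin(cand))]
    return codewords, weights

samples = np.random.default_rng(1).normal(size=2000)
print(design_quantizer(samples))
```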

Negative Exponential Disparity Based Robust Estimates of Ordered Means in Normal Models

  • Bhattacharya, Bhaskar;Sarkar, Sahadeb;Jeong, Dong-Bin
    • Communications for Statistical Applications and Methods, v.7 no.2, pp.371-383, 2000
  • Lindsay (1994) and Basu et al. (1997) show that another density-based distance, called the negative exponential disparity (NED), is an excellent competitor to the Hellinger distance (HD) in generating an asymptotically fully efficient and robust estimator. Bhattacharya and Basu (1996) consider estimation of the locations of several normal populations when an order relation between them is known to be true. They empirically show that the robust HD-based weighted likelihood estimators compare favorably with the M-estimators based on Huber's $\psi$ function, the Gastwirth estimator, and the trimmed mean estimator. In this paper we investigate the performance of the weighted likelihood estimator based on the NED as a robust alternative to that based on the HD. The NED-based estimator is found to be quite competitive in the settings considered by Bhattacharya and Basu.
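
For concreteness, a small numerical sketch of the two disparities being compared may help. With Pearson residual $\delta = d/f - 1$, the NED can be written as $\sum_x (e^{-\delta(x)} - 1 + \delta(x)) f(x)$ and the Hellinger distance (in Lindsay's twice-squared convention) as $\sum_x 2(\sqrt{\delta(x)+1} - 1)^2 f(x)$. The Python sketch below discretizes a unit-variance normal model and scans candidate means; the grid, the crude minimum-disparity scan, and the model are illustrative assumptions, and the paper's weighted likelihood machinery and order restriction are not reproduced.

```python
import numpy as np
from scipy.stats import norm

def disparity(data, mu, G, grid=np.linspace(-10, 10, 201)):
    """Disparity between a histogram of `data` and a N(mu, 1) model."""
    width = grid[1] - grid[0]
    edges = np.append(grid, grid[-1] + width) - width / 2  # bins centered on grid
    d, _ = np.histogram(data, bins=edges, density=True)
    f = norm.pdf(grid, loc=mu)
    delta = d / f - 1.0                                    # Pearson residual
    return np.sum(G(delta) * f) * width

G_ned = lambda delta: np.exp(-delta) - 1.0 + delta
G_hd  = lambda delta: 2.0 * (np.sqrt(delta + 1.0) - 1.0) ** 2

data = np.random.default_rng(0).normal(loc=1.0, size=500)
# A crude minimum-disparity estimate of the mean scans candidate locations.
mus = np.linspace(0, 2, 81)
print("NED estimate:", mus[np.argmin([disparity(data, m, G_ned) for m in mus])])
print("HD  estimate:", mus[np.argmin([disparity(data, m, G_hd) for m in mus])])
```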

Minimizing the Average Distance of Separated Points on the Plane in the L1-Distance

  • Kim, Jae-Hoon
    • Journal of information and communication convergence engineering, v.10 no.1, pp.1-4, 2012
  • Given points in the plane separated by a line, called a wall, we aim to make a gate in the wall to connect the separated points to each other. In this setting, the problem is to find a location for the gate that minimizes the average distance between the points. The problem is a variant of the well-known facility location problem, which is extensively studied in operations research, location theory, theoretical computer science, and related fields. In this paper, we consider the $L^1$-distance between points in the plane. The points are projected onto the wall, so the problem is transformed into a proximity problem for points on a line. We then show that the transformed problem is related to the weighted median problem for points on a line, which yields an O(n log n)-time algorithm, sketched below.
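
To make the reduction concrete, here is a hedged Python sketch under simplifying assumptions: the wall is a vertical line, `left_y` and `right_y` are the projections (y-coordinates) of the two point sets onto it, and every opposite-side pair routes through the single gate. The along-wall cost of a pair (p, q) through a gate at g is |p − g| + |q − g|, so each left projection effectively carries weight |right| and vice versa, and sorting keeps the whole computation within the stated O(n log n) bound.

```python
def weighted_median(values, weights):
    """Smallest v whose cumulative weight reaches half the total."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    half, acc = sum(weights) / 2.0, 0.0
    for i in order:
        acc += weights[i]
        if acc >= half:
            return values[i]

def gate_position(left_y, right_y):
    """Gate location minimizing the total L1 distance over all pairs."""
    values = list(left_y) + list(right_y)
    weights = [len(right_y)] * len(left_y) + [len(left_y)] * len(right_y)
    return weighted_median(values, weights)

# Projections of the two separated point sets onto the wall.
print(gate_position([0.0, 2.0, 9.0], [1.0, 5.0]))  # 2.0
```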

Weighted Distance De-interlacing Algorithm Based on EDI and NAL (EDI와 NAL 알고리듬을 기반으로 한 거리 가중치 비월주사 방식 알고리듬)

  • Lee, Se-Young;Ku, Su-Il;Jeong, Je-Chang
    • The Journal of Korean Institute of Communications and Information Sciences, v.33 no.9C, pp.704-711, 2008
  • This paper proposes a new de-interlacing method that yields an effective visual improvement. The proposed algorithm takes distance weights into account and builds on the previously developed EDI (Edge Dependent Interpolation) and NAL (New Adaptive Linear interpolation) algorithms. The de-interlacing method is divided into two main parts: first, the edge direction is found using information from nearby pixels; then, the missing pixels are interpolated along the determined edge direction. In this paper, after predicting the edge with the EDI algorithm, the missing pixels are interpolated using distance weights based on the NAL algorithm. Experimental results indicate that the proposed algorithm is superior to conventional algorithms in terms of both objective and subjective criteria.
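
A hedged sketch of the general idea, for a single missing line between two known field lines: an EDI-style direction search followed by a distance-weighted blend. The candidate direction set, the matching cost, and the inverse-distance weighting are illustrative assumptions; the paper's actual EDI and NAL rules are more elaborate.

```python
import numpy as np

def deinterlace_line(up, down, directions=(-2, -1, 0, 1, 2)):
    """Interpolate the missing line between known lines `up` and `down`."""
    w = len(up)
    out = np.empty(w)
    for x in range(w):
        # EDI-style step: pick the direction whose two samples match best.
        best_d, best_cost = 0, abs(up[x] - down[x])
        for d in directions:
            if 0 <= x - d < w and 0 <= x + d < w:
                cost = abs(up[x - d] - down[x + d])
                if cost < best_cost:
                    best_d, best_cost = d, cost
        # Distance-weighted step: blend the edge-directed average with the
        # vertical average, weighting each by the inverse distance of its
        # samples from the missing pixel (illustrative weighting).
        edge_avg = 0.5 * (up[x - best_d] + down[x + best_d])
        vert_avg = 0.5 * (up[x] + down[x])
        w_edge = 1.0 / np.hypot(1.0, abs(best_d))
        w_vert = 1.0
        out[x] = (w_edge * edge_avg + w_vert * vert_avg) / (w_edge + w_vert)
    return out

up   = np.array([10., 10., 10., 90., 90., 90.])
down = np.array([10., 10., 90., 90., 90., 90.])
print(deinterlace_line(up, down))
```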

A Performance Analysis of the Face Recognition Based on PCA/LDA on Distance Measures (거리 척도에 따른 PCA/LDA기반의 얼굴 인식 성능 분석)

  • Song Young-Jun;Kim Young-Gil;Ahn Jae-Hyeong
    • Journal of the Korea Academia-Industrial cooperation Society, v.6 no.3, pp.249-254, 2005
  • In this paper, we analyze the recognition performance of PCA/LDA under different distance measures, applying fourteen distance measures to the ORL face database. For PCA, the Manhattan distance and the weighted SSE distance give the highest face recognition performance; for PCA/LDA, the angle-based distance and the modified SSE distance do. PCA/LDA also reduces the dimensionality further than PCA. Therefore, the PCA/LDA method combined with the angle-based distance achieves the best recognition performance with the fewest dimensions on the ORL face database.
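
Of the fourteen measures compared, the two winners have compact standard forms; a hedged Python sketch of nearest-neighbor matching with the Manhattan and angle-based distances follows. The toy gallery vectors are illustrative, and the paper's weighted and modified SSE variants are not reproduced here.

```python
import numpy as np

def manhattan(x, y):
    """L1 distance between two feature vectors."""
    return np.abs(x - y).sum()

def angle_distance(x, y):
    # 1 - cosine similarity: small when the feature vectors point the same
    # way, which suits projection-based features such as PCA/LDA outputs.
    return 1.0 - np.dot(x, y) / (np.linalg.norm(x) * np.linalg.norm(y))

def nearest_neighbor(query, gallery, dist):
    """Index of the gallery feature vector closest to the query."""
    return min(range(len(gallery)), key=lambda i: dist(query, gallery[i]))

gallery = [np.array([1.0, 0.2]), np.array([0.1, 1.0])]
print(nearest_neighbor(np.array([0.9, 0.3]), gallery, angle_distance))  # 0
print(nearest_neighbor(np.array([0.9, 0.3]), gallery, manhattan))       # 0
```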

Optimally Weighted Cepstral Distance Measure for Speech Recognition (음성 인식을 위한 최적 가중 켑스트랄 거리 측정 방법)

  • 김원구
    • Proceedings of the Acoustical Society of Korea Conference, 1994.06c, pp.133-137, 1994
  • In this paper, a method for designing an optimal weight function for the weighted cepstral distance measure is proposed. A conventional weight function, or cepstral lifter, is obtained experimentally depending on the spectral components to be emphasized. The proposed method instead minimizes the error between the word reference patterns and the training data. To compare the proposed optimal weight function with conventional functions, speech recognition systems based on Dynamic Time Warping and Hidden Markov Models were constructed for speaker-independent isolated word recognition experiments. The results show that the proposed method gives better performance than conventional weight functions.
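
A hedged sketch of the distance itself: a weighted cepstral distance is a weighted squared Euclidean distance between cepstral vectors, $d(c, c') = \sum_i w_i (c_i - c'_i)^2$. The raised-sine lifter below is one conventional, experimentally chosen weight function of the kind the paper argues against; the paper's optimal weights, learned from training data, are not reproduced.

```python
import numpy as np

def raised_sine_lifter(order, L=None):
    """Conventional raised-sine cepstral lifter w_i = 1 + (L/2) sin(pi*i/L)."""
    L = L or order
    i = np.arange(1, order + 1)
    return 1.0 + (L / 2.0) * np.sin(np.pi * i / L)

def weighted_cepstral_distance(c_ref, c_test, w):
    """Squared weighted Euclidean distance between two cepstral vectors."""
    return np.sum(w * (c_ref - c_test) ** 2)

order = 12
w = raised_sine_lifter(order)
c1, c2 = np.random.default_rng(0).normal(size=(2, order))  # toy cepstra
print(weighted_cepstral_distance(c1, c2, w))
```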

Design of 3D Laser Radar Based on Laser Triangulation

  • Yang, Yang;Zhang, Yuchen;Wang, Yuehai;Liu, Danian
    • KSII Transactions on Internet and Information Systems (TIIS), v.13 no.5, pp.2414-2433, 2019
  • The aim of this paper is to design a 3D laser radar prototype based on laser triangulation. The mathematical model of distance sensitivity is derived, and a pixel-distance conversion formula is discussed and used to complete the 3D scanning. A center-position extraction algorithm for the laser spot is proposed, and the errors of the line laser, camera distortion, and installation are corrected using the proposed weighted average algorithm. Finally, a three-dimensional analytic algorithm is given to transform the measured distances into point cloud data. The experimental results show that this 3D laser radar can accomplish 3D object scanning and 3D environment reconstruction tasks. In addition, the experiments show that the product of the camera focal length and the baseline length is the key factor influencing measurement accuracy.
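
The closing observation follows directly from the triangulation geometry: with baseline b, focal length f, and laser-spot image offset x, similar triangles give Z = f·b/x, so the sensitivity |dZ/dx| = f·b/x² scales with the product f·b. A hedged Python sketch with illustrative sensor parameters:

```python
def triangulation_distance(f_mm, baseline_mm, offset_px, pixel_pitch_mm):
    """Distance (mm) of the laser spot, from similar triangles Z = f*b/x."""
    x_mm = offset_px * pixel_pitch_mm          # spot offset on the sensor, in mm
    return f_mm * baseline_mm / x_mm

def distance_sensitivity(f_mm, baseline_mm, offset_px, pixel_pitch_mm):
    """Change in distance per one-pixel change in the spot position."""
    x_mm = offset_px * pixel_pitch_mm
    return f_mm * baseline_mm * pixel_pitch_mm / x_mm ** 2

# Illustrative parameters: 8 mm lens, 100 mm baseline, 6 um pixel pitch.
f, b, pitch = 8.0, 100.0, 0.006
print(triangulation_distance(f, b, 200, pitch))  # ~666.7 mm
print(distance_sensitivity(f, b, 200, pitch))    # ~3.33 mm per pixel there
```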

Machine-printed Numeral Recognition using Weighted Template Matching with Chain Code Trimming (체인 코드 트리밍과 가중 원형 정합을 이용한 인쇄체 숫자 인식)

  • Jung, Min-Chul
    • Journal of Intelligence and Information Systems, v.13 no.4, pp.35-44, 2007
  • This paper proposes a new weighted template matching method for machine-printed numeral recognition. The proposed weighted template matching emphasizes the features of a pattern using an adaptive Hamming distance on local feature areas, improving the recognition rate, whereas plain template matching processes an input image as one global feature. Template matching is also vulnerable to random noise that produces ragged outlines of a pattern when it is binarized; to remove these ragged outlines, this paper offers a chain code trimming method that corrects specific chain codes within the chain codes of the inner and outer contours of a pattern. The experiment compares the confusion matrices of plain template matching and of the proposed weighted template matching with chain code trimming. The results show that the proposed method noticeably improves the recognition rate of machine-printed numerals.
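
A hedged sketch of the weighted matching step: an adaptive Hamming distance in which pixels inside designated local feature areas count more than the rest of the pattern. The weight values, the toy 5×5 patterns, and the feature mask are illustrative assumptions; chain code trimming itself is not reproduced here.

```python
import numpy as np

def weighted_hamming(image, template, feature_mask, w_feature=3.0, w_other=1.0):
    """Weighted count of mismatching pixels between two binary patterns."""
    mismatch = (image != template)
    weights = np.where(feature_mask, w_feature, w_other)
    return float((mismatch * weights).sum())

def classify(image, templates, feature_masks):
    """Return the index of the template with the smallest weighted distance."""
    scores = [weighted_hamming(image, t, m)
              for t, m in zip(templates, feature_masks)]
    return int(np.argmin(scores))

# Toy 5x5 "numerals" 0 and 1, with a feature area where the classes differ.
t0 = np.array([[0,1,1,1,0],[0,1,0,1,0],[0,1,0,1,0],[0,1,0,1,0],[0,1,1,1,0]])
t1 = np.array([[0,0,1,0,0],[0,1,1,0,0],[0,0,1,0,0],[0,0,1,0,0],[0,1,1,1,0]])
mask = (t0 != t1)
print(classify(t0, [t0, t1], [mask, mask]))  # 0
```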

Co-registration of PET-CT Brain Images using a Gaussian Weighted Distance Map (가우시안 가중치 거리지도를 이용한 PET-CT 뇌 영상정합)

  • Lee, Ho;Hong, Helen;Shin, Yeong-Gil
    • Journal of KIISE: Software and Applications, v.32 no.7, pp.612-624, 2005
  • In this paper, we propose a surface-based registration using a Gaussian weighted distance map for PET-CT brain image fusion. Our method consists of three main steps: extraction of feature points, generation of a Gaussian weighted distance map, and measurement of weight-based similarity. First, we segment the head using inverse region growing and remove noise attached to the head using region-growing-based labeling, in the PET and CT images respectively; we then extract the feature points of the head using a sharpening filter. Second, a Gaussian weighted distance map is generated from the feature points of the CT images; this leads the feature points to converge robustly to the optimal location even under large geometric displacement. Third, weight-based cross-correlation searches for the optimal location using the Gaussian weighted distance map of the CT images and the feature points extracted from the PET images. In our experiments, we generate a software phantom dataset to evaluate the accuracy and robustness of our method, and use a clinical dataset to assess computation time and perform visual inspection. The accuracy test evaluates the root-mean-square error on arbitrarily transformed software phantom datasets. The robustness test evaluates whether the weight-based cross-correlation achieves its maximum at the optimal location on software phantom datasets with large geometric displacement and noise. Experimental results show that our method gives more accurate and more robust convergence than conventional surface-based registration.
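
A hedged sketch of the central data structure: a Gaussian-weighted distance map built from CT feature points, i.e., a Euclidean distance transform passed through a Gaussian so the similarity score decays smoothly with distance, which is what gives the wide, smooth basin of convergence. The sigma, grid size, and toy feature points are illustrative; the paper's full search over rigid transformations is not reproduced.

```python
import numpy as np
from scipy.ndimage import distance_transform_edt

def gaussian_weighted_distance_map(shape, feature_points, sigma=5.0):
    """Map that is 1.0 on feature points and decays with distance to them."""
    features = np.zeros(shape, dtype=bool)
    for y, x in feature_points:
        features[y, x] = True
    dist = distance_transform_edt(~features)   # distance to nearest feature
    return np.exp(-(dist ** 2) / (2.0 * sigma ** 2))

def similarity(weight_map, points):
    """Mean map value at the (transformed) PET feature point positions."""
    return float(np.mean([weight_map[y, x] for y, x in points]))

ct_features = [(10, 10), (10, 40), (40, 25)]
wmap = gaussian_weighted_distance_map((50, 50), ct_features)
# PET feature points nearly aligned with the CT features score close to 1.
print(similarity(wmap, [(11, 10), (9, 41), (40, 26)]))
```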