• Title/Summary/Keyword: 거리가중치 (distance weight)

Search results: 394

Co-registration of PET-CT Brain Images using a Gaussian Weighted Distance Map (가우시안 가중치 거리지도를 이용한 PET-CT 뇌 영상정합)

  • Lee, Ho;Hong, Helen;Shin, Yeong-Gil
    • Journal of KIISE:Software and Applications, v.32 no.7, pp.612-624, 2005
  • In this paper, we propose a surface-based registration method that uses a Gaussian weighted distance map for PET-CT brain image fusion. Our method consists of three main steps: extraction of feature points, generation of a Gaussian weighted distance map, and weight-based similarity measurement. First, we segment the head in the PET and CT images using inverse region growing and remove segmentation noise with region-growing-based labeling; we then extract feature points of the head using a sharpening filter. Second, a Gaussian weighted distance map is generated from the feature points of the CT images, which lets the feature points converge robustly on the optimal location even under a large geometric displacement. Third, weight-based cross-correlation searches for the optimal location using the Gaussian weighted distance map of the CT images and the feature points extracted from the PET images. In our experiments, we generated a software phantom dataset to evaluate the accuracy and robustness of the method, and used a clinical dataset for computation time and visual inspection. Accuracy was assessed by the root-mean-square error on arbitrarily transformed software phantom data; robustness was assessed by whether the weight-based cross-correlation reaches its maximum at the optimal location under large geometric displacement and noise. Experimental results showed that our method converges more accurately and robustly than conventional surface-based registration.
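The pipeline this abstract describes can be sketched in miniature: a Gaussian weighted distance map is built from CT feature points, and a weight-based cross-correlation sums the map values under (transformed) PET feature points, peaking when the two sets align. The function names and the `sigma` parameter are illustrative assumptions, not the paper's implementation:

```python
import math

def gaussian_weighted_distance_map(rows, cols, feature_points, sigma=2.0):
    """Map whose value at each pixel is a Gaussian of the distance to the
    nearest feature point: 1.0 on a feature point, decaying smoothly with
    distance, so the similarity surface stays informative even under a
    large geometric displacement."""
    gmap = [[0.0] * cols for _ in range(rows)]
    for y in range(rows):
        for x in range(cols):
            d2 = min((y - py) ** 2 + (x - px) ** 2
                     for (py, px) in feature_points)
            gmap[y][x] = math.exp(-d2 / (2.0 * sigma ** 2))
    return gmap

def weighted_cross_correlation(gmap, points):
    """Score of a (transformed) PET feature-point set against the CT map:
    the sum of map values under the points, maximal when the sets align."""
    return sum(gmap[y][x] for (y, x) in points)
```

A registration search would repeatedly transform the PET points and keep the transform that maximizes `weighted_cross_correlation`.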

Surface Curvature Based 3D Face Image Recognition Using Depth Weighted Hausdorff Distance (표면 곡률을 이용하여 깊이 가중치 Hausdorff 거리를 적용한 3차원 얼굴 영상 인식)

  • Lee Yeung hak;Shim Jae chang
    • Journal of Korea Multimedia Society, v.8 no.1, pp.34-45, 2005
  • In this paper, a person verification system based on a depth-weighted Hausdorff distance (DWHD) over the surface curvature of the face is proposed. The Hausdorff distance measures the correspondence between two point sets. The approach first locates the nose tip, the most protruding shape on the face; for feature recognition in a 3D face image, the frontal posture must be normalized after the face area is extracted from the original image. Binary images are then extracted by thresholding the surface curvature values, which carry person-specific depth and surface information. The proposed DWHD measure is used to compare the two pixel sets because it is simple and robust. In the experiments, the minimum-curvature variant, which has a low pixel distribution, achieved a recognition rate of 98%, the best among the proposed methods.
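A depth-weighted Hausdorff distance can be reduced to a few lines: each point's nearest-neighbour distance is scaled by a per-point depth weight, so protruding regions such as the nose tip dominate the comparison. This is an illustrative sketch; the paper's exact weighting may differ:

```python
import math

def depth_weighted_hausdorff(A, B, depth):
    """Depth-weighted Hausdorff distance (DWHD) between two 2D point sets
    A and B. `depth` maps a point to its depth weight (default 1.0), so
    high-depth points contribute more to the averaged distance."""
    def directed(P, Q):
        total, wsum = 0.0, 0.0
        for p in P:
            d = min(math.dist(p, q) for q in Q)  # nearest neighbour in Q
            w = depth.get(p, 1.0)                # depth weight of p
            total += w * d
            wsum += w
        return total / wsum
    return max(directed(A, B), directed(B, A))
```

With an empty `depth` mapping this degenerates to a plain averaged Hausdorff distance.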


Semantic Similarity Measures Between Words within a Document using WordNet (워드넷을 이용한 문서내에서 단어 사이의 의미적 유사도 측정)

  • Kang, SeokHoon;Park, JongMin
    • Journal of the Korea Academia-Industrial cooperation Society, v.16 no.11, pp.7718-7728, 2015
  • Semantic similarity between words can be applied in many fields, including computational linguistics, artificial intelligence, and information retrieval. In this paper, we present a weighted method for measuring the semantic similarity between words in a document. The method uses the edge distance and depth of WordNet and computes the similarity on the basis of document information: word term frequencies (TF) and word concept frequencies (CF). Each word's weight is calculated from its TF and CF in the document, and the measure combines the edge distance between the words, the depth of their subsumer, and the word weights. We compared our scheme with other methods experimentally, and the proposed method outperformed the other similarity measures. Methods based only on a simple shortest distance or depth have difficulty representing or merging this information; by considering the shortest distance, the depth, and the word information in the document together, the proposed method improves performance.
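A measure of this shape can be sketched as a product of a length term, a depth term, and a document-derived word weight. The functional form and the constants `alpha` and `beta` below are illustrative assumptions, not the paper's published formula:

```python
import math

def semantic_similarity(path_len, subsumer_depth, w1=1.0, w2=1.0,
                        alpha=0.2, beta=0.45):
    """Combine WordNet edge distance, subsumer depth, and document-derived
    word weights into one score. `path_len` is the shortest edge distance
    between the two senses, `subsumer_depth` the depth of their lowest
    common subsumer, and `w1`, `w2` the per-word weights computed from TF
    and CF in the document."""
    length_term = math.exp(-alpha * path_len)      # decays with edge distance
    depth_term = math.tanh(beta * subsumer_depth)  # grows with subsumer depth
    return ((w1 + w2) / 2.0) * length_term * depth_term
```

Identical senses under a deep subsumer score near 1.0; the score falls as the path lengthens or the shared subsumer becomes shallower.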

Salt and Pepper Noise Removal using Modified Distance Weight Filter (변형된 거리가중치 필터를 이용한 Salt and Pepper 잡음제거)

  • Lee, Hwa-Yeong;Kim, Nam-Ho
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference, 2022.05a, pp.441-443, 2022
  • With the development of IT technology, image processing is now used in fields such as image analysis, image recognition, and factory automation. Salt-and-pepper noise arises from various external factors while an image is acquired or transmitted, degrading image quality, so noise removal is essential. Various methods have been proposed to remove salt-and-pepper noise; representative examples include the AF, MF, and A-TMF. However, these conventional filters remove noise poorly in high-density noise environments. In this paper, we therefore propose an algorithm that first performs a noise judgment on each pixel: pixels judged to be noise are estimated with a modified distance weight filter, while non-noise pixels keep their original values. To evaluate the proposed algorithm, we compare and analyze it against existing algorithms using PSNR.
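The judge-then-estimate structure can be sketched as follows: extreme pixel values (0 or 255) are treated as noise and re-estimated from the non-noise neighbours, each weighted by the inverse of its distance to the centre. This is a simplified stand-in for the paper's modified distance weight filter, not its exact formulation:

```python
import math

def distance_weight_filter(img, y, x, radius=1):
    """Estimate a noisy pixel from the non-noise neighbours in a
    (2r+1) x (2r+1) window, weighting each neighbour by the inverse of
    its distance to the centre pixel."""
    num, den = 0.0, 0.0
    for dy in range(-radius, radius + 1):
        for dx in range(-radius, radius + 1):
            ny, nx = y + dy, x + dx
            if (dy, dx) == (0, 0):
                continue
            if not (0 <= ny < len(img) and 0 <= nx < len(img[0])):
                continue
            v = img[ny][nx]
            if v in (0, 255):             # skip salt/pepper neighbours
                continue
            w = 1.0 / math.hypot(dy, dx)  # inverse-distance weight
            num += w * v
            den += w
    return round(num / den) if den else img[y][x]

def remove_salt_and_pepper(img):
    """Noise judgment first: only pixels at the extremes (0 or 255) are
    treated as noise and re-estimated; all other pixels are kept."""
    out = [row[:] for row in img]
    for y in range(len(img)):
        for x in range(len(img[0])):
            if img[y][x] in (0, 255):
                out[y][x] = distance_weight_filter(img, y, x)
    return out
```

Leaving non-noise pixels untouched is what preserves detail relative to filters such as the MF, which smooth every pixel.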


A Study of Document Ranking Algorithms in a P-norm Retrieval System (P-norm 검색의 문헌 순위화 기법에 관한 실험적 연구)

  • 고미영;정영미
    • Journal of the Korean Society for Information Management, v.16 no.1, pp.7-30, 1999
  • This study develops effective document ranking algorithms for a P-norm retrieval system that can be implemented on a Boolean retrieval system without major difficulty, using non-statistical term weights based on document structure. It also enhances performance by introducing a rank adjustment process that rearranges the ranks of retrieved documents according to the similarity between the top-ranked documents and the rest. Of the non-statistical term weight algorithms, this study uses field weight and term-pair distance weight. For the rank adjustment process, five retrieval experiments were performed, ranging from using one record for the similarity measurement to using the first five records. The results show that non-statistical term weights are highly effective and that the rank adjustment process further enhances performance.
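For reference, the standard P-norm scoring functions such a system builds on can be written directly; in this study the term weights would come from the non-statistical field and term-pair distance weights rather than from collection statistics:

```python
def pnorm_or(weights, p=2.0):
    """P-norm score of an OR clause over term weights in [0, 1]."""
    n = len(weights)
    return (sum(w ** p for w in weights) / n) ** (1.0 / p)

def pnorm_and(weights, p=2.0):
    """P-norm score of an AND clause: the complement of the OR of the
    complements, so a single zero weight pulls the score down sharply."""
    n = len(weights)
    return 1.0 - (sum((1.0 - w) ** p for w in weights) / n) ** (1.0 / p)
```

As `p` grows, both operators approach strict Boolean behaviour; at `p = 1` both reduce to the plain average.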


Layout Criteria of an Access Mode's on and off Facility at Multiple Transfer Centers (복합환승센터 접근교통수단의 승하차 시설배치기준)

  • Kim, Si-Gon
    • KSCE Journal of Civil and Environmental Engineering Research, v.32 no.2D, pp.95-101, 2012
  • Layout criteria are developed for the boarding and alighting facilities of access modes at multiple transfer centers. The criteria concern where the facilities of the main mode and the access modes should be located relative to one another, with the goal of minimizing the total distance between them. In the distance calculation, stairs are treated as more difficult than open space, while escalators and elevators are treated as easier. Considering the number of passengers moving between boarding and alighting facilities, the weighted average distance is proposed as the measure of effectiveness (MOE) for the layout criteria at multiple transfer centers. Finally, the criteria are applied to the existing Kimpo airport terminal and some improvements are suggested.
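The MOE reduces to a passenger-weighted average of difficulty-scaled distances. The difficulty factors below are assumptions for illustration, not the paper's calibrated figures:

```python
# Difficulty factors relative to open space (assumed values): stairs are
# harder than open space, escalators and elevators easier.
FACTOR = {"open": 1.0, "stairs": 1.5, "escalator": 0.7, "elevator": 0.7}

def effective_distance(segments):
    """Sum of (length, kind) segments scaled by walking-difficulty factors."""
    return sum(length * FACTOR[kind] for length, kind in segments)

def weighted_average_distance(flows):
    """Layout MOE: flows is a list of (passengers, effective_distance)
    pairs; the average distance is weighted by passenger volume."""
    total_pd = sum(n * d for n, d in flows)
    total_p = sum(n for n, _ in flows)
    return total_pd / total_p
```

Comparing candidate layouts then amounts to comparing their `weighted_average_distance` values.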

A Detection Technique for Credit-card Robbery using Time Weight and Distance-based Graph (시간가중치와 거리기반 도표를 이용한 신용카드 도난 분실 탐지 기법)

  • 나용찬;나연묵
    • Proceedings of the Korean Information Science Society Conference, 2001.10a, pp.229-231, 2001
  • With the recent growth of economic activity, most adults carry several credit cards, and lost or stolen cards have become a problem for card companies. Existing detection systems identified lost or stolen cards through routine checks such as theft reports and by detecting a sudden increase in spending, which makes small fraudulent transactions hard to detect. The detection system proposed in this paper builds a training set using an outlier technique and detects lost or stolen cards using time weights and distance-based graphs. In the amount-time graph, a weight is computed from the difference between transaction request times; in the place and spending-category graph, the arrangement of spending categories is rearranged around 8 p.m., a reference point obtained from the training set. The proposed system detects even small fraudulent transactions well and is more accurate than previous systems.


Advanced HEED protocol using distance weight in USN (USN 환경에서 거리 가중치를 사용한 개선된 HEED 프로토콜)

  • Jeoung, Su-Hyung;Yoo, Hae-Young
    • Journal of the Korea Academia-Industrial cooperation Society, v.10 no.2, pp.370-376, 2009
  • Recently, routing protocols have been studied vigorously for ubiquitous sensor networks, and hierarchical routing protocols in particular have proven practical and attracted interest. We therefore analyze a weakness of HEED and propose a new protocol that resolves it. The proposed protocol puts more weight on the remaining energy than HEED does when electing the cluster head (CH), and the elected CH is replaced by a new node when its remaining energy falls below 50%, so all nodes come to use their energy fairly. The proposed protocol improves the cluster survival rate by about 30%, and CH replacement is more effective because a node with a small response time is selected.
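The election and rotation rule the abstract describes can be sketched as follows. The tuple layout, the tie-break on distance, and the function names are assumptions for illustration, not the protocol's specification:

```python
def elect_cluster_head(nodes):
    """Elect the cluster head from (node_id, residual_energy_ratio,
    distance_to_centroid) tuples: highest residual energy first, ties
    broken toward the node closest to the cluster centroid (the
    distance weight)."""
    return max(nodes, key=lambda n: (n[1], -n[2]))[0]

def maybe_rotate(current_head_energy_ratio, nodes):
    """Re-elect when the current head's residual energy drops below 50%,
    as the abstract describes; otherwise keep the current head (None)."""
    if current_head_energy_ratio < 0.5:
        return elect_cluster_head(nodes)
    return None
```

Rotating the CH this way spreads the relay burden, which is what drives the claimed improvement in cluster survival.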

Weighted Distance De-interlacing Algorithm Based on EDI and NAL (EDI와 NAL 알고리듬을 기반으로 한 거리 가중치 비월주사 방식 알고리듬)

  • Lee, Se-Young;Ku, Su-Il;Jeong, Je-Chang
    • The Journal of Korean Institute of Communications and Information Sciences, v.33 no.9C, pp.704-711, 2008
  • This paper proposes a new de-interlacing method that yields an efficient visual improvement. The proposed algorithm considers a distance weight and builds on the previously developed EDI (Edge Dependent Interpolation) and NAL (New Adaptive Linear interpolation) algorithms. The de-interlacing method has two main parts: first, the edge direction is found using information from nearby pixels; then, missing pixels are interpolated along the decided edge direction. In this paper, after the edge is predicted with the EDI algorithm, missing pixels are interpolated using the distance weight based on the NAL algorithm. Experimental results indicate that the proposed algorithm is superior to conventional algorithms in both objective and subjective criteria.
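A distance-weighted interpolation of one missing pixel can be sketched as follows: two samples follow the detected edge direction and two are the plain vertical neighbours, each weighted by the inverse of its distance to the missing pixel. This is an illustrative reduction of the EDI + NAL scheme, not the paper's exact filter:

```python
import math

def distance_weighted_pixel(x, above, below, edge_offset):
    """Interpolate pixel `x` of a missing interlaced line from the scan
    lines `above` and `below`. `edge_offset` is the horizontal shift of
    the detected edge (0 for a vertical edge); diagonal samples sit
    farther away and so receive smaller weights."""
    diag = math.hypot(1.0, edge_offset)   # distance of the edge samples
    samples = [
        (above[x + edge_offset], diag),
        (below[x - edge_offset], diag),
        (above[x], 1.0),
        (below[x], 1.0),
    ]
    num = sum(v / d for v, d in samples)
    den = sum(1.0 / d for _, d in samples)
    return num / den
```

With `edge_offset = 0` all four samples coincide with the vertical neighbours and the result is their plain average.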

The Comparison of Estimation Methods for the Missing Rainfall Data with spatio-temporal Variability (시공간적 변동성을 고려한 강우의 결측치 추정 방법의 비교)

  • Kim, Byung-Sik;Noh, Hui-Seong;Kim, Hung-Soo
    • Journal of Wetlands Research, v.13 no.2, pp.189-197, 2011
  • This paper reviews the application of data-driven methods, distance-weighted methods (IDWM, IEWM, CCWM, ANN), and a radar-data method to the estimation of missing rainfall data. To evaluate these methods, statistics were compared using radar and station rainfall data from the Imjin River basin. The RMSE values calculated for CCWM and ANN ranged from 1.4 to 1.79 mm, while the RMSE values of estimates based on radar rainfall data ranged from 0.05 to 2.26 mm. Radar rainfall data capture spatial characteristics better than station rainfall data, and the results suggest that estimates based on radar data can improve the estimation of missing rainfall data.
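As a reference point for the distance-weighted family compared above, classic inverse distance weighting (the IDWM approach) estimates a missing value from surrounding gauges, with nearer gauges weighted more heavily. A generic sketch, not the paper's exact configuration:

```python
import math

def idw_estimate(target, stations, power=2.0):
    """Inverse-distance-weighted estimate of rainfall at `target` (x, y)
    from (x, y, value) gauge records. `power` controls how quickly a
    gauge's influence falls off with distance."""
    num, den = 0.0, 0.0
    for (x, y, v) in stations:
        d = math.hypot(target[0] - x, target[1] - y)
        if d == 0.0:
            return v                 # target coincides with a gauge
        w = 1.0 / d ** power
        num += w * v
        den += w
    return num / den
```

The radar-based alternative in the paper replaces these point gauges with gridded rainfall fields, which is why it captures spatial variability that station-only weighting misses.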