• Title/Summary/Keyword: scale normalization

Improvement of Face Recognition Rate by Normalization of Facial Expression (표정 정규화를 통한 얼굴 인식율 개선)

  • Kim, Jin-Ok
    • The KIPS Transactions:PartB
    • /
    • v.15B no.5
    • /
    • pp.477-486
    • /
    • 2008
  • Facial expression, which changes face geometry, usually has an adverse effect on the performance of a face recognition system. To improve the face recognition rate, we propose a facial expression normalization method that diminishes the difference in expression between probe and gallery faces. Two approaches are used for facial expression modeling and normalization from single still images, based on a generic facial muscle model and without the need for large image databases. The first approach estimates the geometry parameters of linear muscle models to obtain a biologically inspired model of the facial expression, which may then be changed intuitively. The second approach uses RBF (Radial Basis Function) based interpolation and warping to normalize the face to an unexpressed state according to the given expression. As a preprocessing stage for face recognition, these approaches achieve significantly higher recognition rates than the un-normalized case with the eigenface approach, local binary patterns, and a grey-scale correlation measure.
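
The abstract's second approach rests on RBF interpolation and warping. As a hedged illustration only (not the paper's implementation), the sketch below warps an image so that a set of expressed landmarks moves to a neutral configuration; the landmark arrays are assumed inputs.

```python
"""Minimal sketch: RBF-based warping toward a neutral expression (assumed inputs)."""
import numpy as np
from scipy.interpolate import RBFInterpolator
from scipy.ndimage import map_coordinates


def normalize_expression(image, expressed_pts, neutral_pts):
    """Warp `image` so that `expressed_pts` land on `neutral_pts`.

    image         : 2-D grayscale array
    expressed_pts : (N, 2) landmark coordinates (row, col) in the input
    neutral_pts   : (N, 2) target coordinates of the same landmarks
    """
    # Backward mapping: for every output (neutral) pixel we need the source
    # (expressed) coordinate, so the RBF is fitted from neutral -> expressed.
    rbf = RBFInterpolator(neutral_pts, expressed_pts, kernel="thin_plate_spline")

    rows, cols = image.shape
    grid = np.stack(np.meshgrid(np.arange(rows), np.arange(cols),
                                indexing="ij"), axis=-1).reshape(-1, 2)
    src = rbf(grid)                                   # sampling positions in the source image

    warped = map_coordinates(image, [src[:, 0], src[:, 1]], order=1, mode="nearest")
    return warped.reshape(rows, cols)
```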

A RST Resistant Logo Embedding Technique Using Block DCT and Image Normalization (블록 DCT와 영상 정규화를 이용한 회전, 크기, 이동 변환에 견디는 강인한 로고 삽입방법)

  • Choi Yoon-Hee;Choi Tae-Sun
    • Journal of the Korea Institute of Information Security & Cryptology
    • /
    • v.15 no.5
    • /
    • pp.93-103
    • /
    • 2005
  • In this paper, we propose an RST-resistant robust logo embedding technique for multimedia copyright protection. Geometric manipulations are challenging attacks in that they introduce little quality degradation yet make the detection process very complex and difficult. Embedding the watermark directly in the normalized image suffers from a smoothing effect caused by the interpolation performed during image normalization. This can be avoided by using the image normalization technique only to estimate the transform parameters, instead of embedding in the normalized image. Conventional RST-resistant schemes that use a full-frame transform suffer from the absence of effective perceptual masking methods. Thus, we adopt the 8×8 block DCT and calculate masking using a spatio-frequency localization of the 8×8 block DCT coefficients. Simulation results show that the proposed algorithm is robust against various signal processing operations, compression, and geometric manipulations.
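
For context, a minimal sketch of the 8×8 block DCT stage mentioned above is given here; it is only the block transform, not the paper's embedding, perceptual masking, or normalization-based parameter estimation.

```python
"""Minimal sketch: 8x8 block DCT / inverse block DCT of a grayscale image."""
import numpy as np
from scipy.fft import dctn, idctn


def block_dct(image, block=8):
    """Return the block DCT-II coefficients of `image` (H, W array)."""
    h, w = image.shape
    h, w = h - h % block, w - w % block              # drop ragged edges
    coeffs = np.empty((h, w), dtype=float)
    for r in range(0, h, block):
        for c in range(0, w, block):
            coeffs[r:r + block, c:c + block] = dctn(
                image[r:r + block, c:c + block].astype(float), norm="ortho")
    return coeffs


def block_idct(coeffs, block=8):
    """Inverse of `block_dct` (expects dimensions that are multiples of `block`)."""
    out = np.empty_like(coeffs)
    for r in range(0, coeffs.shape[0], block):
        for c in range(0, coeffs.shape[1], block):
            out[r:r + block, c:c + block] = idctn(
                coeffs[r:r + block, c:c + block], norm="ortho")
    return out
```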

Robust Object Tracking based on Kernelized Correlation Filter with multiple scale scheme (다중 스케일 커널화 상관 필터를 이용한 견실한 객체 추적)

  • Yoon, Jun Han;Kim, Jin Heon
    • Journal of IKEEE
    • /
    • v.22 no.3
    • /
    • pp.810-815
    • /
    • 2018
  • The kernelized correlation filter algorithm has yielded meaningful accuracy for object tracking. However, because it uses a fixed-size template, it cannot cope with scale changes of the tracked object. In this paper, we propose a method that tracks objects by finding the best scale for each frame from multi-scale correlation filter response values, using nearest-neighbor interpolation and Gaussian normalization. The candidate scale values for the next frame are updated from the optimal scale of the previous frame, and the optimal scale of the next frame is then found again. For accuracy comparison, the validity of the proposed method is verified using the VOT2014 data used with the existing kernelized correlation filter algorithm.
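
A hedged sketch of such a per-frame scale search follows. The `correlation_response` callable stands in for the kernelized correlation filter response and is an assumption, not the cited paper's code; the zero-mean/unit-variance step is one possible reading of "Gaussian normalization".

```python
"""Minimal sketch: multi-scale search around a KCF-style tracker (assumed response function)."""
import numpy as np
from scipy.ndimage import zoom


def best_scale(frame, center, template_size, scales, correlation_response):
    """Return the scale whose resized patch gives the highest normalized response."""
    th, tw = template_size
    cy, cx = center
    scores = []
    for s in scales:
        ph, pw = int(round(th * s)), int(round(tw * s))
        y0, x0 = cy - ph // 2, cx - pw // 2
        patch = frame[max(y0, 0):y0 + ph, max(x0, 0):x0 + pw]
        # Nearest-neighbour interpolation back to the fixed template size.
        patch = zoom(patch, (th / patch.shape[0], tw / patch.shape[1]), order=0)
        scores.append(correlation_response(patch).max())
    scores = np.asarray(scores, dtype=float)
    # Zero-mean / unit-variance normalisation of the per-scale scores.
    scores = (scores - scores.mean()) / (scores.std() + 1e-12)
    return scales[int(np.argmax(scores))]
```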

Path Loss Characterization in Tunnel Using Ray Launching Method at 2.6 GHz (Ray-Launching 기법을 이용한 2.6 GHz 대역의 터널 내 경로손실 특성 분석)

  • Kim, Do-Youn;Jo, Han-Shin;Yook, Jong-Gwan;Park, Han-Kyu
    • Proceedings of the Korea Electromagnetic Engineering Society Conference
    • /
    • 2003.11a
    • /
    • pp.33-37
    • /
    • 2003
  • This paper presents the characteristics of large-scale fading in a tunnel environment. The ray-launching method is used to analyze the propagation characteristics of the tunnel. For a curved tunnel, the concept of RDN (Ray Density Normalization) is introduced in order to obtain more accurate results. For our purposes, the tunnel structure is assumed to be either straight or curved with a rectangular cross-section. Large-scale fading results are presented for several tunnel cases.
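
The ray-launching/RDN simulation itself is beyond a short snippet. As a hedged aside, not the paper's method, large-scale fading results like these are commonly summarised with the log-distance model PL(d) = PL(d0) + 10·n·log10(d/d0); the sketch below fits the path-loss exponent n by least squares.

```python
"""Minimal sketch: fitting a log-distance path-loss model to sampled path-loss data."""
import numpy as np


def fit_path_loss_exponent(distances_m, path_loss_db, d0=1.0):
    """Return (PL(d0) in dB, path-loss exponent n) from sample points."""
    x = 10.0 * np.log10(np.asarray(distances_m, dtype=float) / d0)
    y = np.asarray(path_loss_db, dtype=float)
    n, pl0 = np.polyfit(x, y, 1)      # y ≈ pl0 + n * x
    return pl0, n
```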

A Study on MAD-Based Relative Radiometric Normalization of Time-Series KOMPSAT-2 Multispectral Imagery for Change Detection (변화지역 탐지를 위한 시계열 KOMPSAT-2 다중분광 영상의 MAD 기반 상대복사 보정에 관한 연구)

  • Yeon, Jong-Min;Kim, Hyun-Ok;Yoon, Bo-Yeol
    • Journal of the Korean Association of Geographic Information Studies
    • /
    • v.15 no.3
    • /
    • pp.66-80
    • /
    • 2012
  • It is necessary to normalize spectral image values derived from multi-temporal satellite data to a common scale in order to apply remote sensing methods for change detection, disaster mapping, crop monitoring, etc. There are two main approaches: absolute radiometric normalization and relative radiometric normalization. This study focuses on multi-temporal satellite image processing using relative radiometric normalization. Three scenes of KOMPSAT-2 imagery were processed using the Multivariate Alteration Detection (MAD) method, which has the particular advantage of selecting PIFs (Pseudo-Invariant Features) automatically by canonical correlation analysis. The scenes were then used to detect disaster areas over Sendai, Japan, which was hit by a tsunami on 11 March 2011. The case study showed that the automatic extraction of areas changed by the tsunami from relatively normalized satellite data via the MAD method was achieved with a high level of accuracy. In addition, the relative normalization of multi-temporal satellite imagery produced better results for rapidly mapping disaster-affected areas with an increased level of confidence.
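
The MAD-based automatic PIF selection is not reproduced here. As a hedged sketch of the normalization step that follows it, once a PIF mask is available (assumed input below), relative radiometric normalization reduces to fitting a per-band linear gain and offset that maps the subject image onto the reference image.

```python
"""Minimal sketch: relative radiometric normalization given a PIF mask (assumed input)."""
import numpy as np


def relative_normalize(subject, reference, pif_mask):
    """subject, reference: (bands, H, W) arrays; pif_mask: boolean (H, W) of invariant pixels."""
    normalized = np.empty_like(subject, dtype=float)
    for b in range(subject.shape[0]):
        x = subject[b][pif_mask].astype(float)
        y = reference[b][pif_mask].astype(float)
        gain, offset = np.polyfit(x, y, 1)      # y ≈ gain * x + offset on the PIFs
        normalized[b] = gain * subject[b] + offset
    return normalized
```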

A Corpus-based Study of Translation Universals in English Translations of Korean Newspaper Texts (한국 신문의 영어 번역에 나타난 번역 보편소의 코퍼스 기반 분석)

  • Goh, Gwang-Yoon;Lee, Younghee (Cheri)
    • Cross-Cultural Studies
    • /
    • v.45
    • /
    • pp.109-143
    • /
    • 2016
  • This article examines distinctive linguistic shifts in translational English in an effort to verify the validity of the translation universals hypotheses, including simplification, explicitation, normalization, and leveling-out, which have been most heavily explored to date. A large-scale study involving comparable corpora of translated and non-translated English newspaper texts was carried out to typify particular linguistic attributes inherent in translated texts. The main findings are as follows. First, using the parameters of STTR, top-to-bottom frequency words, and mean sentence lengths, translational instances of simplification were detected across the translated English newspaper corpora. In contrast, the proportion of function words produced contrary results, which in turn suggests that this feature might not constitute an effective test of the hypothesis. Second, it was found that the use of connectives was more salient in original English newspaper texts than in translated English texts, which is incompatible with the explicitation hypothesis. Third, as an indicator of translational normalization, lexical bundles were found to be more pervasive in translated texts than in non-translated texts, which is expected under and therefore supports the normalization hypothesis. Finally, the standard deviations of both STTR and mean sentence lengths turned out to be higher in translated texts, indicating that the translated English newspaper texts were less leveled out within the same corpus group, which is opposed to what the leveling-out hypothesis postulates. Overall, the results suggest that not all four hypotheses qualify for the label of translation universals, or at least that some translational predictors are not feasible enough to evaluate the effectiveness of the translation universals hypotheses.
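
One of the simplification indicators mentioned above is the standardized type-token ratio (STTR). Below is a hedged, minimal sketch of how STTR is typically computed over fixed-size chunks; the whitespace tokenizer is an assumption and is not what the study's corpora would have used.

```python
"""Minimal sketch: standardized type-token ratio over fixed-size chunks."""


def sttr(tokens, chunk_size=1000):
    """Mean type-token ratio over consecutive chunks of `chunk_size` tokens."""
    ratios = []
    for i in range(0, len(tokens) - chunk_size + 1, chunk_size):
        chunk = tokens[i:i + chunk_size]
        ratios.append(len(set(chunk)) / chunk_size)
    return sum(ratios) / len(ratios) if ratios else 0.0


# Toy usage with a naive whitespace tokenizer (illustrative only):
# score = sttr(open("corpus.txt").read().lower().split(), chunk_size=1000)
```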

A Desirability Function-Based Multi-Characteristic Robust Design Optimization Technique (호감도 함수 기반 다특성 강건설계 최적화 기법)

  • Jong Pil Park;Jae Hun Jo;Yoon Eui Nahm
    • Journal of Korean Society of Industrial and Systems Engineering
    • /
    • v.46 no.4
    • /
    • pp.199-208
    • /
    • 2023
  • The Taguchi method is one of the most popular approaches for design optimization such that performance characteristics become robust to uncontrollable noise variables. However, most previous applications of the Taguchi method have addressed single-characteristic problems, whereas problems with multiple characteristics are more common in practice. The multi-criteria decision making (MCDM) problem is to select the optimal alternative among multiple alternatives by integrating a number of criteria that may conflict with each other. Representative MCDM methods include TOPSIS (Technique for Order of Preference by Similarity to Ideal Solution), GRA (Grey Relational Analysis), PCA (Principal Component Analysis), fuzzy logic systems, and so on. Therefore, numerous approaches have been proposed to deal with multi-characteristic design problems by combining the original Taguchi method with MCDM methods. In an MCDM problem, multiple criteria generally have different measurement units, which means that there may be large differences in the physical values of the criteria, ultimately making it difficult to integrate the measurements across criteria. Therefore, normalization techniques are usually utilized to convert the different units of the criteria into one common scale. Four normalization techniques are commonly used in MCDM problems: vector normalization and linear scale transformation (max-min, max, or sum). However, these normalization techniques have several shortcomings and do not adequately incorporate practical considerations. For example, if a certain alternative has the maximum data value for a certain criterion, that alternative is selected as the solution in the original process; however, if this maximum value does not satisfy the degree of fulfillment required by the designer or customer, the alternative may not actually be an acceptable solution. To solve this problem, this paper employs the desirability function proposed in our previous research. The desirability function uses an upper limit and a lower limit in the normalization process, and the threshold points that establish these limits express the degree of fulfillment required by the designer or customer. This paper proposes a new design optimization technique for multi-characteristic design problems by integrating the Taguchi method and our desirability functions. Finally, the proposed technique is able to obtain an optimal solution that is robust with respect to the multiple performance characteristics.
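
As a hedged illustration of limit-based desirability normalization (the exact functional form and limits of the cited approach may differ), the sketch below maps a larger-the-better characteristic to [0, 1] using designer-specified lower/upper limits and combines several characteristics with a geometric mean.

```python
"""Minimal sketch: desirability-style normalization with limits and geometric-mean aggregation."""
import numpy as np


def desirability_larger_is_better(y, lower, upper, shape=1.0):
    """0 below `lower`, 1 above `upper`, power curve in between (Derringer-Suich style)."""
    d = (np.asarray(y, dtype=float) - lower) / (upper - lower)
    return np.clip(d, 0.0, 1.0) ** shape


def overall_desirability(desirabilities):
    """Geometric mean of the per-characteristic desirabilities."""
    d = np.asarray(desirabilities, dtype=float)
    return float(np.exp(np.mean(np.log(np.maximum(d, 1e-12)))))


# Example: combine two normalized characteristics into one score.
# D = overall_desirability([desirability_larger_is_better(52, 40, 60),
#                           desirability_larger_is_better(0.8, 0.5, 1.0)])
```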

An intelligent system for isomorphic transformation pattern recognition

  • Xie, Qiusheng;Kobayashi, Akira
    • Institute of Control, Robotics and Systems: Conference Proceedings
    • /
    • 1990.10b
    • /
    • pp.939-944
    • /
    • 1990
  • Recognizing isomorphic transformation patterns, such as scale-changed, translated, and rotated patterns, is an old, difficult, but interesting problem. Much research has been done by eminent pioneers, with normalization as the dominant approach. However, there seems to be no perfect system that can even recognize 90°-multiple rotation isomorphic transformation patterns for real needs. Here, as a new challenge, we propose a method for recognizing 90°-multiple rotation isomorphic and symmetry isomorphic transformation patterns.
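
For orientation only, and not the intelligent system proposed in the paper, the sketch below shows the elementary brute-force idea of testing a pattern against all four 90°-multiple rotations (and, optionally, its mirror) of a stored template.

```python
"""Minimal sketch: matching a binary pattern up to 90-degree-multiple rotation and mirror symmetry."""
import numpy as np


def matches_up_to_rotation(pattern, template, allow_mirror=True):
    """True if `pattern` equals `template` under some k*90° rotation, optionally mirrored."""
    candidates = [np.rot90(pattern, k) for k in range(4)]
    if allow_mirror:
        flipped = np.fliplr(pattern)
        candidates += [np.rot90(flipped, k) for k in range(4)]
    return any(np.array_equal(c, template) for c in candidates)
```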

Image Scale Normalization Based on the Face Detection (얼굴 영역 검출 기반한 영상 크기 정규화)

  • 이혜현;임은경;김민환
    • Proceedings of the Korea Multimedia Society Conference
    • /
    • 2004.05a
    • /
    • pp.267-270
    • /
    • 2004
  • This paper proposes a method that extracts the face region from face images of various sizes and, based on the extracted face region, normalizes the image to the specifications of an ID photograph. For face region extraction, candidate face regions are first obtained by skin-color-based region expansion, and the final face region is then determined using the locations of facial features and statistics on face shape. The proportional relationship between the sizes of the extracted face region and the background region is normalized by applying rules derived from a statistical survey of ID photographs. The validity of the proposed method was confirmed by experiments on 130 images with various backgrounds.
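
A hedged sketch of the scale-normalization step is given below. The skin-color face detection and the statistically derived face-to-background ratios of the paper are not reproduced; the face box, output size, and ratio parameters are hypothetical placeholders.

```python
"""Minimal sketch: scale-normalizing an image to ID-photo proportions from a detected face box."""
import numpy as np
from scipy.ndimage import zoom


def normalize_to_id_photo(image, face_box, out_size=(472, 354),
                          face_height_ratio=0.55, face_top_ratio=0.12):
    """Crop around `face_box` = (top, left, height, width) and resize to `out_size` = (H, W).

    The face is placed so that it occupies `face_height_ratio` of the output
    height, starting `face_top_ratio` of the crop height from the top.
    """
    top, left, fh, fw = face_box
    out_h, out_w = out_size
    crop_h = int(round(fh / face_height_ratio))
    crop_w = int(round(crop_h * out_w / out_h))
    crop_top = int(round(top - face_top_ratio * crop_h))
    crop_left = int(round(left + fw / 2 - crop_w / 2))

    crop = image[max(crop_top, 0):crop_top + crop_h,
                 max(crop_left, 0):crop_left + crop_w]
    # Bilinear resize of the crop to the target ID-photo dimensions.
    return zoom(crop, (out_h / crop.shape[0], out_w / crop.shape[1]), order=1)
```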

Keypoint Detection Using Normalized Higher-Order Scale Space Derivatives (스케일 공간 고차 미분의 정규화를 통한 특징점 검출 기법)

  • Park, Jongseung;Park, Unsang
    • Journal of KIISE
    • /
    • v.42 no.1
    • /
    • pp.93-96
    • /
    • 2015
  • The SIFT method is well known for robustness against various image transformations and is widely used for image retrieval and matching. The SIFT method extracts keypoints using scale-space analysis, which differs from conventional keypoint detection methods that depend only on the image space. The SIFT method has also been extended to use higher-order scale-space derivatives to increase the number of keypoints detected. The detection of these additional keypoints was shown to provide a performance gain in image retrieval experiments. Herein, a sigma-based normalization method for keypoint detection using higher-order scale-space derivatives is introduced.
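
In scale-space theory, an order-n Gaussian derivative computed at scale sigma is normalized by multiplying by sigma^n so that responses are comparable across scales. The sketch below shows this standard normalization, not the paper's exact keypoint-detection pipeline.

```python
"""Minimal sketch: sigma-normalized scale-space derivatives."""
import numpy as np
from scipy.ndimage import gaussian_filter


def normalized_derivative(image, sigma, order_y, order_x):
    """sigma^(order_y + order_x) times the Gaussian derivative of `image` at scale `sigma`."""
    n = order_y + order_x
    d = gaussian_filter(image.astype(float), sigma, order=(order_y, order_x))
    return (sigma ** n) * d


def normalized_laplacian(image, sigma):
    """Scale-normalized Laplacian of Gaussian, the classic blob-detection response."""
    return (normalized_derivative(image, sigma, 2, 0)
            + normalized_derivative(image, sigma, 0, 2))
```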