• Title/Summary/Keyword: Korean normalization

Search Results: 927

Design and Implementation of Binary Image Normalization Hardware for High Speed Processing (고속 처리를 위한 이진 영상 정규화 하드웨어의 설계 및 구현)

  • 김형구;강선미;김덕진
    • Journal of the Korean Institute of Telematics and Electronics B
    • /
    • v.31B no.5
    • /
    • pp.162-167
    • /
    • 1994
  • Binary image normalization is useful in several areas of image processing, and a high-speed processing method with a hardware implementation is especially valuable. Normalizing each character in character recognition consumes a large share of the processing time. This research was therefore carried out as part of a high-speed OCR (optical character reader) implementation: the normalization hardware was organized as a pipeline with the host computer to exploit temporal parallelism, using a general-purpose CPU, the MC68000. Experiments show that the normalization speed of the hardware is sufficient for a high-speed OCR whose recognition speed exceeds 140 characters per second.

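
The size-normalization step that this hardware accelerates can be sketched in software. The following is a minimal nearest-neighbor rescaling of a binary character image to a fixed output size; it is an illustrative sketch only, not the paper's pipelined MC68000 design.

```python
def normalize_binary_image(img, out_h, out_w):
    """Scale a binary image (list of rows of 0/1 pixels) to a fixed
    out_h x out_w size using nearest-neighbor sampling."""
    in_h, in_w = len(img), len(img[0])
    return [
        [img[r * in_h // out_h][c * in_w // out_w] for c in range(out_w)]
        for r in range(out_h)
    ]

# A 2x2 image scaled up to 4x4: each source pixel maps to a 2x2 block.
src = [[1, 0],
       [0, 1]]
print(normalize_binary_image(src, 4, 4))
# [[1, 1, 0, 0], [1, 1, 0, 0], [0, 0, 1, 1], [0, 0, 1, 1]]
```

In an OCR front end, every segmented character would be passed through such a resampling so the recognizer always sees a fixed-size input; the paper's contribution is doing this in hardware, overlapped with the host via a pipeline.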

Physical Artifact Correction in Nuclear Medicine Imaging: Normalization and Attenuation Correction (핵의학 영상의 물리적 인공산물보정: 정규화보정 및 감쇠보정)

  • Kim, Jin-Su;Lee, Jae-Sung;Cheon, Gi-Jeong
    • Nuclear Medicine and Molecular Imaging
    • /
    • v.42 no.2
    • /
    • pp.112-117
    • /
    • 2008
  • Artifact corrections, including normalization and attenuation correction, are important for quantitative analysis in nuclear medicine imaging. Normalization is the process of ensuring that all lines of response (LORs) joining detectors in coincidence have the same effective sensitivity; failure to account for variations in LOR sensitivity leads to bias and high-frequency artifacts in the reconstructed images. Attenuation correction compensates for the physical fact that photons emitted by the radiopharmaceutical interact with tissue and other materials as they pass through the body. In this paper, we review several approaches to normalization and attenuation correction.
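
The normalization step described above amounts to dividing each LOR's measured counts by that LOR's relative sensitivity. A toy sketch with hypothetical values (not from the paper):

```python
def normalize_lor_counts(counts, sensitivities):
    """Normalization correction: divide each line-of-response (LOR)
    count by its measured relative sensitivity, so that all LORs
    behave as if they had the same effective sensitivity."""
    return [c / s for c, s in zip(counts, sensitivities)]

# Two LORs viewed the same activity, but the second detector pair is
# only half as sensitive; after normalization both report the same value.
counts = [100.0, 50.0]
sens = [1.0, 0.5]
print(normalize_lor_counts(counts, sens))  # [100.0, 100.0]
```

In practice the sensitivity factors come from a dedicated normalization scan (e.g. of a uniform source), which is precisely the measurement the review discusses.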

Selective pole filtering based feature normalization for performance improvement of short utterance recognition in noisy environments (잡음 환경에서 짧은 발화 인식 성능 향상을 위한 선택적 극점 필터링 기반의 특징 정규화)

  • Choi, Bo Kyeong;Ban, Sung Min;Kim, Hyung Soon
    • Phonetics and Speech Sciences
    • /
    • v.9 no.2
    • /
    • pp.103-110
    • /
    • 2017
  • The pole filtering concept has been successfully applied to cepstral feature normalization techniques for noise-robust speech recognition. In this paper, we propose applying pole filtering selectively, only to the speech intervals, to further improve recognition performance for short utterances in noisy environments. Experimental results on the AURORA 2 task with clean-condition training show that the proposed selectively pole-filtered cepstral mean normalization (SPFCMN) and selectively pole-filtered cepstral mean and variance normalization (SPFCMVN) yield error rate reductions of 38.6% and 45.8%, respectively, compared to the baseline system.
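
For context, the baseline the paper builds on is plain cepstral mean and variance normalization (CMVN): per cepstral coefficient, subtract the utterance mean and divide by the standard deviation. This sketch shows only that baseline, not the proposed selective pole filtering.

```python
import math

def cmvn(frames):
    """Cepstral mean and variance normalization over one utterance.
    frames: list of cepstral vectors (lists of floats), one per frame."""
    n = len(frames)
    dims = len(frames[0])
    out = [row[:] for row in frames]
    for d in range(dims):
        col = [f[d] for f in frames]
        mean = sum(col) / n
        var = sum((x - mean) ** 2 for x in col) / n
        std = math.sqrt(var) or 1.0  # guard against a constant coefficient
        for i in range(n):
            out[i][d] = (frames[i][d] - mean) / std
    return out

# Two 1-dimensional frames: mean 2.0, std 1.0, so they map to -1 and +1.
print(cmvn([[1.0], [3.0]]))  # [[-1.0], [1.0]]
```

Short utterances are exactly where this estimate of the mean and variance becomes unreliable, which motivates the paper's refinement of restricting the pole-filtered statistics to speech intervals.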

Correction for Hangul Normalization in Unicode (유니코드 환경에서의 올바른 한글 정규화를 위한 수정 방안)

  • Ahn, Dae-Hyuk;Park, Young-Bae
    • Journal of KIISE:Software and Applications
    • /
    • v.34 no.2
    • /
    • pp.169-177
    • /
    • 2007
  • Hangul text normalization in the current Unicode standard produces incorrect Hangul syllables when precomposed modern Hangul syllables are used together with old Hangul composed from conjoining Hangul Jamo and compatibility Hangul Jamo. The problem arises because the Unicode normalization forms map compatibility Hangul Jamo and Hangul symbols incorrectly, and because the Unicode Hangul composition rules permit conjoining Hangul Jamo to be mixed with precomposed Hangul syllables. This stems from insufficient consideration of old Hangul, and of Hangul code processing generally, when the Unicode normalization form specifications were written. In this paper, we study Hangul codes in the Unicode environment, focusing on normalization problems that affect the Web, XML, and IDN today, and we propose modifications to the Hangul normalization methods and Hangul composition rules for correct Hangul normalization in Unicode.
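
The conjoining-versus-compatibility Jamo distinction at the heart of the paper can be observed directly with Python's `unicodedata` module: conjoining Jamo compose into a precomposed syllable under NFC, while visually similar compatibility Jamo do not.

```python
import unicodedata

# Conjoining Jamo U+1100 (choseong GIYEOK) + U+1161 (jungseong A)
# compose under NFC into the precomposed syllable U+AC00 (GA).
conjoining = "\u1100\u1161"
composed = unicodedata.normalize("NFC", conjoining)
print(hex(ord(composed)))  # 0xac00

# Compatibility Jamo U+3131 + U+314F look similar but have no canonical
# composition, so NFC leaves them alone and the two strings stay distinct.
compat = "\u3131\u314f"
print(unicodedata.normalize("NFC", compat) == composed)  # False
```

This is exactly the kind of divergence that causes trouble for Web, XML, and IDN processing: two inputs a Korean reader may regard as the same text normalize to different code point sequences.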

Supervised Rank Normalization for Support Vector Machines (SVM을 위한 교사 랭크 정규화)

  • Lee, Soojong;Heo, Gyeongyong
    • Journal of the Korea Society of Computer and Information
    • /
    • v.18 no.11
    • /
    • pp.31-38
    • /
    • 2013
  • Feature normalization as a pre-processing step is widely used in classification problems to reduce the effect of differing scales across feature dimensions and the error that results. Most existing methods, however, assume some distribution over the feature values. Worse, they do not use the labels of the data points and therefore cannot guarantee that the normalization is optimal for classification. This paper proposes supervised rank normalization, which combines rank normalization with a supervised learning technique. Like rank normalization, the proposed method assumes no feature distribution, and it uses the class labels of nearest neighbors to reduce classification error. Since an SVM tries to draw a decision boundary in the middle of the class-overlap zone, reducing the data density in that zone helps the SVM find a decision boundary with lower generalization error. These claims are verified through experimental results.
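
As background, plain (unsupervised) rank normalization replaces each feature value by its rank, scaled to [0, 1], so no distributional assumption is needed. This sketch shows only that baseline; the paper's supervised variant, which additionally adjusts ranks using neighbors' class labels, is not reproduced here.

```python
def rank_normalize(values):
    """Map each value in one feature dimension to its rank scaled
    to [0, 1]. Distribution-free: only the ordering matters."""
    n = len(values)
    order = sorted(range(n), key=lambda i: values[i])
    ranks = [0.0] * n
    for rank, idx in enumerate(order):
        ranks[idx] = rank / (n - 1) if n > 1 else 0.0
    return ranks

# Ordering, not magnitude, determines the result.
print(rank_normalize([10.0, -5.0, 100.0]))  # [0.5, 0.0, 1.0]
```

Note that an extreme outlier (100.0 here) lands at 1.0 rather than stretching the scale, which is why rank normalization is robust where min-max scaling is not.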

A Local Alignment Algorithm using Normalization by Functions (함수에 의한 정규화를 이용한 local alignment 알고리즘)

  • Lee, Sun-Ho;Park, Kun-Soo
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.34 no.5_6
    • /
    • pp.187-194
    • /
    • 2007
  • A local alignment algorithm compares two strings and finds a substring pair with size l and similarity s. To find a pair with both sufficient size and high similarity, existing normalization approaches maximize the ratio of the similarity to the size. In this paper, we introduce normalization by functions, which maximizes f(s)/g(l), where f and g are non-decreasing functions. These functions are determined by experiments comparing DNA sequences, in which normalization by functions finds appropriate local alignments. For the previous algorithm, which measures similarity by the longest common subsequence, we show that it can also maximize the function-normalized score f(s)/g(l) with no loss of time.
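
The objective f(s)/g(l) can be made concrete with a brute-force sketch: enumerate substring pairs, score each by LCS similarity s and combined length l, and keep the maximizer. This is purely illustrative (exponentially slower than the paper's algorithm), with f and g chosen arbitrarily for the demo.

```python
from functools import lru_cache

def lcs_len(a, b):
    """Length of the longest common subsequence: the similarity s."""
    @lru_cache(maxsize=None)
    def rec(i, j):
        if i == 0 or j == 0:
            return 0
        if a[i - 1] == b[j - 1]:
            return rec(i - 1, j - 1) + 1
        return max(rec(i - 1, j), rec(i, j - 1))
    return rec(len(a), len(b))

def best_normalized_alignment(x, y, f, g):
    """Brute-force search for the substring pair maximizing f(s)/g(l),
    where l is the combined length of the two substrings."""
    best, best_pair = float("-inf"), None
    for i in range(len(x)):
        for j in range(i + 1, len(x) + 1):
            for p in range(len(y)):
                for q in range(p + 1, len(y) + 1):
                    s = lcs_len(x[i:j], y[p:q])
                    l = (j - i) + (q - p)
                    score = f(s) / g(l)
                    if score > best:
                        best, best_pair = score, (x[i:j], y[p:q])
    return best, best_pair

# With f(s) = s^2 and g(l) = l, longer exact matches beat trivial
# single-character hits: the "ATTA"/"ATTA" pair scores 16/8 = 2.0.
best, pair = best_normalized_alignment("GATTACA", "ATTA",
                                       lambda s: s * s, lambda l: l)
print(best, pair)  # 2.0 ('ATTA', 'ATTA')
```

The choice f(s) = s^2 illustrates the point of normalization by functions: with the plain ratio s/l, a single matching character ties the long match at 0.5, whereas a superlinear f rewards sufficient size as well as high similarity.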

URL Signatures for Improving URL Normalization (URL 정규화 향상을 위한 URL 서명)

  • Soon, Lay-Ki;Lee, Sang-Ho
    • Journal of KIISE:Databases
    • /
    • v.36 no.2
    • /
    • pp.139-149
    • /
    • 2009
  • In the standard URL normalization mechanism, URLs are normalized syntactically through a set of predefined steps. In this paper, we propose complementing standard URL normalization with semantically meaningful metadata of the web pages: the body text and the page size, both of which can be extracted during HTML parsing. Results from our first exploratory experiment indicate that body text is effective in identifying equivalent URLs. Hence, in the second experiment, given a URL that has undergone standard normalization, we construct its URL signature by hashing the body text of the associated web page with Message-Digest algorithm 5 (MD5); URLs that share identical signatures are considered equivalent in our scheme. The results of the second experiment show that the proposed URL signatures reduced redundant URLs by a further 32.94% compared with standard URL normalization alone.
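
The two stages can be sketched in a few lines: a handful of the standard syntactic steps (lowercasing the scheme and host, dropping the default port, defaulting an empty path to "/"), followed by an MD5 signature over the page's body text. This is a simplified sketch, not the paper's full step set or crawler.

```python
import hashlib
from urllib.parse import urlsplit, urlunsplit

def standard_normalize(url):
    """A few standard syntactic normalization steps: lowercase the
    scheme and host, drop the default port 80 for http, default an
    empty path to '/', and discard the fragment."""
    parts = urlsplit(url)
    netloc = parts.netloc.lower()
    if parts.scheme == "http" and netloc.endswith(":80"):
        netloc = netloc[:-3]
    return urlunsplit((parts.scheme, netloc, parts.path or "/",
                       parts.query, ""))

def url_signature(body_text):
    """Semantic URL signature: MD5 over the page's body text.
    URLs whose pages share a signature are treated as equivalent."""
    return hashlib.md5(body_text.encode("utf-8")).hexdigest()

a = standard_normalize("HTTP://Example.COM:80")
b = standard_normalize("http://example.com/")
print(a == b)  # True: syntactic normalization unifies these
print(url_signature("same body") == url_signature("other body"))  # False
```

Two syntactically different URLs that survive normalization as distinct strings would still collapse to one entry whenever their pages hash to the same body-text signature, which is how the scheme removes the extra 32.94% of redundant URLs.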

Dynamic Contrast Enhanced MRI and Intravoxel Incoherent Motion to Identify Molecular Subtypes of Breast Cancer with Different Vascular Normalization Gene Expression

  • Wan-Chen Tsai;Kai-Ming Chang;Kuo-Jang Kao
    • Korean Journal of Radiology
    • /
    • v.22 no.7
    • /
    • pp.1021-1033
    • /
    • 2021
  • Objective: To assess the expression of vascular normalization genes in different molecular subtypes of breast cancer and to determine whether molecular subtypes with higher vascular normalization gene expression can be identified using dynamic contrast-enhanced (DCE) magnetic resonance imaging (MRI) and intravoxel incoherent motion (IVIM) diffusion-weighted imaging (DWI). Materials and Methods: This prospective study evaluated 306 women (mean age ± standard deviation, 50 ± 10 years), recruited between January 2014 and August 2017, who had de novo breast cancer larger than 1 cm in diameter (308 tumors). DCE MRI followed by IVIM DWI using 11 different b-values (0 to 1200 s/mm2) was performed on a 1.5T MRI system. The Tofts model and segmented biexponential IVIM analysis were used. For each tumor, the molecular subtype (according to six [I-VI] subtypes and PAM50 subtypes) and the expression profiles of genes for vascular normalization, pericytes, and normal vascular signatures were determined using freshly frozen tissue. Statistical associations between imaging parameters and molecular subtypes were examined using logistic or linear regression with a significance level of p = 0.05. Results: Breast cancer subtypes III and VI and PAM50 subtypes luminal A and normal-like exhibited higher expression of genes for vascular normalization, pericyte markers, and the normal vessel function signature (p < 0.001 for all) compared to the other subtypes. Subtypes III and VI and PAM50 subtypes luminal A and normal-like, versus the remaining subtypes, showed significant associations with Ktrans, kep, vp, and IAUGCBN90 on DCE MRI, with relatively smaller values in the former. The subtype grouping was also significantly associated with D, with relatively less restricted diffusion in subtypes III and VI and PAM50 subtypes luminal A and normal-like.
Conclusion: DCE MRI and IVIM parameters may identify molecular subtypes of breast cancers with differing vascular normalization gene expression.

A Study on the Negotiation on Management Normalization of GM Korea through the Two-Level Games (양면게임 이론으로 분석한 한국GM 경영정상화 협상연구)

  • Lee, Ji-Seok
    • Korea Trade Review
    • /
    • v.44 no.1
    • /
    • pp.31-44
    • /
    • 2019
  • This study examines the negotiation over the normalization of GM Korea's management between the Korean government and GM in terms of an external negotiation game and an internal negotiation game, using Putnam's two-level game theory. Changes in GM's win-set and its negotiation strategy are also analyzed. The analysis suggests implications for an optimal negotiation strategy for mutual cooperation between multinational corporations and host governments in the global business environment. First, the Korean government's negotiation strategy for the normalization of GM Korea's management, centered on a cautious position, can shift toward either concession or opposition depending on changes in the situation and in government policy. Second, GM maximizes its bargaining power through brinkmanship tactics, exploiting labor-market stability, the Korean government's biggest weakness, while maintaining a typical win-set reduction strategy. GM can restructure at any time as part of its global management strategy: as long as the Korean government provides financial support, it will maintain its local plants, but it may withdraw them the moment that support stops. In negotiating the normalization of GM Korea's management, it is therefore necessary to prepare countermeasures for a range of scenarios and to maintain a balance so that policy does not tilt to either side.