Title/Summary/Keyword: normalization factor


Shape Recognition Using Skeleton Image Based on Mathematical Morphology (수리형태론적 스켈리턴 영상을 이용한 형상인식)

  • Jang, Ju-Seok;Son, Yun-Gu
    • The Transactions of the Korea Information Processing Society / v.3 no.4 / pp.883-898 / 1996
  • In this paper, we propose an improved shape recognition method that enhances the quality of a pattern recognition system by compressing the source images. In the proposed method, we reduce the data volume by skeletonizing the source images using mathematical morphology, and then match patterns after applying translation and scale normalization and achieving rotation invariance on the transformed images. Through scale normalization, shape recognition becomes possible with a minimal number of pixels by assigning weights to the skeleton pixels. Because the source images are replaced by skeleton images, the amount of data and the computational load are reduced dramatically, so recognition becomes much faster even with a smaller memory capacity. Through experiments, we investigated the optimum scale factor and obtained good results when realizing the pattern recognition system.
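
As a rough illustration of the translation and scale normalization step described above, the following sketch normalizes a weighted skeleton point set; the coordinates, weights, and function names are illustrative and not taken from the paper.

```python
import numpy as np

def normalize_skeleton(coords, weights):
    """Translate a weighted skeleton to its centroid and rescale it so
    that its weighted RMS radius is 1 (translation/scale normalization)."""
    coords = np.asarray(coords, dtype=float)
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    centroid = (coords * w[:, None]).sum(axis=0)   # translation normalization
    centered = coords - centroid
    rms = np.sqrt((w * (centered ** 2).sum(axis=1)).sum())
    return centered / rms                          # scale normalization

# Example: a small synthetic skeleton, weighted more heavily at a junction.
pts = [(0, 0), (1, 0), (2, 0), (2, 1)]
wts = [1.0, 2.0, 1.0, 1.0]
print(normalize_skeleton(pts, wts))
```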


Standardized Surveying Method of Rural Amenity Resources with Database Normalization Technique (자료정규화를 통한 농촌어메니티자원 조사표의 표준화)

  • Kim, Sang-Bum;Rhee, Sang-Young;Jung, Nam-Su;Lee, Ji-Min;Cho, Soon-Jae;Lee, Jeong-Jae
    • Journal of Korean Society of Rural Planning / v.10 no.4 s.25 / pp.1-7 / 2004
  • In Korea, rural communities have become unstable as agriculture declines. To address this problem, there have been attempts to revitalize rural communities by maintaining rural amenities. However, it is difficult to use rural amenities as a development factor for promoting rural communities because little research has quantified them. In this study, a method for quantifying rural amenities is suggested using a database normalization technique. The previous thirty-seven surveying items for rural amenity resources are formally reduced to five common surveying items, seven resources, and eleven surveying tables. Finally, an overall rural amenity resource map with surveying data for rural development is suggested.
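
To make the database-normalization idea concrete, here is a toy sketch that factors a flat survey table into a common-items table and a resource table linked by a key; the field names and records are hypothetical, not the paper's actual surveying items.

```python
# Flat survey records with repeated common items (hypothetical fields).
flat_records = [
    {"village": "A", "surveyor": "Kim", "date": "2004-05-01",
     "resource_type": "landscape", "detail": "terraced paddies"},
    {"village": "A", "surveyor": "Kim", "date": "2004-05-01",
     "resource_type": "heritage", "detail": "old pavilion"},
]

# Common surveying items are factored out once per survey (normalization
# removes the repetition); resources reference the survey by id.
surveys = {}       # (village, surveyor, date) -> survey_id
resources = []     # one row per resource, referencing survey_id
for rec in flat_records:
    key = (rec["village"], rec["surveyor"], rec["date"])
    survey_id = surveys.setdefault(key, len(surveys))
    resources.append({"survey_id": survey_id,
                      "resource_type": rec["resource_type"],
                      "detail": rec["detail"]})

print(surveys)
print(resources)
```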

Application of Vocal Properties and Vocal Independent Features to Classifying Sasang Constitution (음성 특성 및 음성 독립 변수의 사상체질 분류로의 적용 방법)

  • Kim, Keun-Ho;Kang, Nam-Sik;Ku, Bon-Cho;Kim, Jong-Yeol
    • Journal of Sasang Constitutional Medicine / v.23 no.4 / pp.458-470 / 2011
  • 1. Objectives: Vocal characteristics are commonly considered an important factor in determining the Sasang constitution and health condition. We sought a classification procedure that distinguishes the constitution objectively and quantitatively by analyzing the characteristics of a subject's voice while excluding noise and error. 2. Methods: In this study, we extract vocal features from voice samples selected with prior information, remove outliers, minimize correlated features, correct the features by normalization according to gender and age, and construct discriminant functions adapted to gender and age from those features to improve diagnostic accuracy. 3. Results and Conclusions: The discriminant functions produced about 45% accuracy in classifying the constitution for every age interval and gender, a diagnostic accuracy that is meaningful given that it results from the voice alone.
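
A minimal sketch of the gender/age correction step described above: each vocal feature is z-scored within its (gender, age) group before any discriminant analysis. The group labels, features, and values below are illustrative assumptions.

```python
import numpy as np

def groupwise_zscore(X, groups):
    """Z-score each row's features using the mean/std of its own group."""
    X = np.asarray(X, dtype=float)
    out = np.empty_like(X)
    for g in set(groups):
        idx = [i for i, gi in enumerate(groups) if gi == g]
        mu = X[idx].mean(axis=0)
        sd = X[idx].std(axis=0) + 1e-12   # guard against zero variance
        out[idx] = (X[idx] - mu) / sd
    return out

# Example: pitch (Hz) and speech-rate features for four speakers.
X = [[110.0, 4.2], [180.0, 5.0], [120.0, 4.0], [175.0, 5.3]]
groups = ["male_40s", "female_40s", "male_40s", "female_40s"]
print(groupwise_zscore(X, groups))
```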

Optimized Integer Cosine Transform (최적화 정수형 여현 변환)

  • 이종하;김혜숙;송인준;곽훈성
    • Journal of the Korean Institute of Telematics and Electronics B / v.32B no.9 / pp.1207-1214 / 1995
  • We present an optimized integer cosine transform (OICT) as an alternative to the conventional discrete cosine transform (DCT), together with a fast computational algorithm. In the actual implementation of the OICT, we use techniques similar to those of the orthogonal integer transform (OIT). The normalization factors are approximated by a single factor while keeping the reconstruction error at the best tolerable level. With a single normalization factor, both the forward and inverse transforms are performed using only integers. However, since many sets of integers can be selected in this manner, the best OICT matrix is obtained by minimizing the Hilbert-Schmidt norm while admitting a fast computational algorithm. Using matrix decomposition, a fast algorithm for efficient computation of the order-8 OICT is developed that requires only 20 integer multiplications. This enables a high-performance 2-D DCT processor in which floating-point operations are replaced by integer operations. We also ran simulations to test the performance of the order-8 OICT in terms of transform efficiency, maximum reducible bits, and mean square error for the Wiener filter. When the results are compared with those of the DCT and OIT, the OICT outperforms them all. Furthermore, when the conventional DCT coefficients are reduced to 7 bits like those of the OICT, the reconstructed images are critically impaired because the original DCT loses its orthogonality, whereas the 7-bit OICT maintains a zero mean square reconstruction error.
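
The single-normalization-factor property can be illustrated with a toy integer transform: an integer matrix J with J @ J.T = k*I lets both forward and inverse transforms run in pure integer arithmetic, with one scalar applied at the end. The order-4 Walsh-Hadamard matrix below is only a stand-in; the paper's OICT additionally approximates the DCT basis and minimizes the Hilbert-Schmidt norm.

```python
import numpy as np

J = np.array([[1,  1,  1,  1],
              [1,  1, -1, -1],
              [1, -1, -1,  1],
              [1, -1,  1, -1]], dtype=np.int64)

k = int((J @ J.T)[0, 0])        # all rows share the squared norm k = 4
x = np.array([3, 1, 4, 1], dtype=np.int64)
y = J @ x                       # forward transform: integers only
x_rec = (J.T @ y) / k           # inverse: integers, then one scaling
assert np.allclose(x_rec, x)    # exact reconstruction
print(y, x_rec)
```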


Step Size Normalization for Maximum Cross-Correntropy Algorithms (최대 상호코렌트로피 알고리듬을 위한 스텝사이즈 정규화)

  • Kim, Namyong
    • The Journal of Korean Institute of Communications and Information Sciences / v.41 no.9 / pp.995-1000 / 2016
  • The maximum cross-correntropy (MCC) algorithm with a set of random symbols keeps its optimum weights undisturbed by impulsive noise, unlike MSE-based algorithms; its main factor is known to be the input magnitude controller (IMC), which adjusts the input intensity according to the error power. In this paper, a normalization of the step size of the MCC algorithm by the power of the IMC output is proposed. The IMC output power is tracked recursively through a single-pole low-pass filter. In simulations under impulsive noise with two different multipath channels, the steady-state MSE and convergence speed of the proposed algorithm are enhanced by about 1 dB and 500 samples, respectively, compared to the conventional MCC algorithm.
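
A minimal sketch of the proposed step-size normalization, under the assumption that the IMC scales the input vector by a Gaussian function of the error, g(e) = exp(-e^2/(2*sigma^2)); the step size is then divided by the IMC output power tracked with a single-pole low-pass filter. The channel, constants, and variable names are illustrative, and the paper's impulsive-noise scenario is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)
N, L = 4000, 8
mu, sigma, beta, eps = 0.25, 4.0, 0.99, 1e-8

x = rng.standard_normal(N)               # input signal
h = rng.standard_normal(L)               # unknown channel to identify
d = np.convolve(x, h)[:N]                # desired signal (noise-free here)
w = np.zeros(L)
p = 1.0                                  # tracked IMC output power

for n in range(L - 1, N):
    u = x[n - L + 1:n + 1][::-1]         # most recent L input samples
    e = d[n] - w @ u                     # error
    g = np.exp(-e**2 / (2 * sigma**2))   # IMC gain: damps impulsive errors
    v = g * u                            # IMC output: intensity-adjusted input
    p = beta * p + (1 - beta) * (v @ v)  # single-pole low-pass power estimate
    w += (mu / (p + eps)) * e * v        # step size normalized by IMC power

print(np.round(w - h, 3))                # residual should be near zero
```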

The Effect of Specimen Size in Charpy Impact Testing (샬피 충격시험에 있어서 시험편 크기의 영향)

  • Kim, Hoon;Kim, Joo-Hark;Chi, Se-Hwan;Hong, Jun-Hwa
    • Transactions of the Korean Society of Mechanical Engineers A / v.21 no.1 / pp.93-103 / 1997
  • Charpy V-notch impact tests were performed on full-, half-, and third-size specimens from two ferritic SA 508 Cl. 3 steels for nuclear pressure vessels. New normalization factors were proposed to predict the upper-shelf energy (USE) and the ductile-brittle transition temperature (DBTT) of full-size specimens from data measured on sub-size specimens. The factors for the USE and the DBTT are $(Bb^2/K_t)$ and $(Bb/R)^{1/2}$, respectively, where $B$ is the width, $b$ the ligament size, $K_t$ the elastic stress concentration factor, and $R$ the notch root radius. These correlations successfully estimated the USE and DBTT of the full-size specimens from sub-size specimen data. In addition, the size effects were studied to develop correlations among absorbed energy, lateral expansion (LE), and displacement. It was also found that the LE can be estimated from the displacement obtained in the instrumented impact test, and that the displacement can serve as a criterion for the toughness of the steels corresponding to changes in their yield strength.
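
As a worked example of the proposed normalization factors, the sketch below predicts a full-size USE from a hypothetical sub-size measurement using the ratio of B*b^2/Kt values; all dimensions, the Kt value, and the measured energy are illustrative assumptions, not the paper's data.

```python
# Specimen geometries (illustrative values; units: mm).
full = {"B": 10.0, "b": 8.0, "Kt": 3.3}   # full-size Charpy V-notch
half = {"B": 5.0,  "b": 4.0, "Kt": 3.3}   # a hypothetical sub-size geometry

def use_factor(s):
    """Normalization factor B*b^2/Kt proposed for the upper-shelf energy."""
    return s["B"] * s["b"] ** 2 / s["Kt"]

use_half_measured = 12.0                  # sub-size USE in J (hypothetical)
use_full_predicted = use_half_measured * use_factor(full) / use_factor(half)
print(f"predicted full-size USE: {use_full_predicted:.1f} J")   # 96.0 J
```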

Combined Normalized and Offset Min-Sum Algorithm for Low-Density Parity-Check Codes (LDPC 부호의 복호를 위한 정규화와 오프셋이 조합된 최소-합 알고리즘)

  • Lee, Hee-ran;Yun, In-Woo;Kim, Joon Tae
    • Journal of Broadcast Engineering / v.25 no.1 / pp.36-47 / 2020
  • Improved belief-propagation-based algorithms, such as the normalized min-sum algorithm (NMSA) and the offset min-sum algorithm (OMSA), are widely used to decode LDPC (Low-Density Parity-Check) codes because they are less computationally complex and work well even at low SNR (Signal-to-Noise Ratio). However, these algorithms perform well only when an appropriate normalization factor or offset value is used. A recently proposed method that uses a CMD (Check Node Message Distribution) chart and the least-squares method has an advantage in computational complexity over other approaches to obtaining optimal coefficients, and it can derive coefficients for each iteration. In this paper, we apply this method and propose an algorithm that derives a combination of normalization factor and offset value for a combined normalized and offset min-sum algorithm, to further improve the decoding of LDPC codes. Simulations on the ATSC 3.0 LDPC codes of the next-generation broadcasting standard show that a combined normalized and offset min-sum algorithm using the proposed coefficients as correction coefficients achieves the best BER performance among the compared decoding algorithms.
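
A minimal sketch of the combined check-node update, assuming a normalization factor alpha and an offset beta applied together as m = sign_product * max(alpha * min_magnitude - beta, 0). The coefficients below are illustrative; the paper derives per-iteration values from a CMD chart and least squares.

```python
import numpy as np

def check_node_update(msgs, alpha=0.8, beta=0.15):
    """Combined normalized/offset min-sum update for one check node.
    msgs: incoming variable-to-check LLRs; returns outgoing LLRs."""
    msgs = np.asarray(msgs, dtype=float)
    out = np.empty_like(msgs)
    for i in range(len(msgs)):
        others = np.delete(msgs, i)                  # exclude own message
        sign = np.prod(np.sign(others))
        mag = alpha * np.min(np.abs(others)) - beta  # normalize, then offset
        out[i] = sign * max(mag, 0.0)                # clamp at zero
    return out

print(check_node_update([1.2, -0.4, 2.5, -3.1]))
```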

A Feature Extraction Method by Simultaneous Diagonalization (동시절각화에 의한 다변수군간 특징추출의 일수법)

  • ;安居院猛
    • Journal of the Korean Institute of Telematics and Electronics / v.15 no.4 / pp.14-19 / 1978
  • A method is presented for extracting features from two multivariate classes by using coordinate systems transformed by one-class and mixture normalization algorithms. Some properties and implementation results of this technique are described, and the extracted features are compared with factor analysis results. In the feature extraction sense, this method proves more powerful than factor analysis.
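
A compact sketch of the simultaneous-diagonalization idea described above: whiten the data with one class's covariance (the one-class normalization), then rotate so that the mixture covariance is also diagonal, giving a single coordinate system in which both matrices are diagonal. The synthetic data and dimensions are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
X1 = rng.standard_normal((200, 3)) @ np.diag([3.0, 1.0, 0.5])   # class 1
X2 = rng.standard_normal((200, 3)) @ np.diag([0.5, 2.0, 1.0])   # class 2

C1 = np.cov(X1, rowvar=False)                    # class-1 covariance
Cm = np.cov(np.vstack([X1, X2]), rowvar=False)   # mixture covariance

# Step 1: whitening transform from C1 (maps C1 to the identity).
vals, vecs = np.linalg.eigh(C1)
W = vecs @ np.diag(vals ** -0.5)
# Step 2: diagonalize the whitened mixture covariance.
vals2, U = np.linalg.eigh(W.T @ Cm @ W)
T = W @ U                                        # combined transform

print(np.round(T.T @ C1 @ T, 2))                 # ~ identity
print(np.round(T.T @ Cm @ T, 2))                 # ~ diagonal
```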


Artificial intelligence (AI) based analysis for global warming mitigations of non-carbon emitted nuclear energy productions

  • Tae Ho Woo
    • Nuclear Engineering and Technology / v.55 no.11 / pp.4282-4286 / 2023
  • Nuclear energy production is estimated by machine learning as a mathematical quantification, with a neural network as the main algorithm propagating data from input to output. Nuclear energy is compared with the traditional carbon-emitting energy sources, oil and coal. An artificial intelligence (AI) oriented algorithm, akin to the intelligence of a robot, is applied to the modeling, in which the mimicking of biological neurons is used in the mathematical calculations. Graphs of nuclear priority weighted by a climate factor and of carbon dioxide mitigation weighted by a climate factor are produced by dividing the carbon dioxide quantities by the weighting. The nuclear priority and CO2 mitigation values are dimensionless quantities normalized to 1.0 in 2010; they change to 24.318 and 0.0657, respectively, by 2040. Thus, this study indicates that carbon dioxide emissions could be reduced.
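
The normalization mentioned at the end of the abstract can be shown in a few lines: each index series is divided by its 2010 value so that both equal 1.0 at the baseline. Only the 2010 and 2040 ratios follow the abstract; the raw and intermediate numbers below are invented for illustration.

```python
# Baseline normalization of the two indices to 1.0 in 2010.
series = {
    "nuclear_priority": {2010: 2.0,   2020: 10.0, 2030: 30.0, 2040: 48.636},
    "co2_mitigation":   {2010: 200.0, 2020: 90.0, 2030: 40.0, 2040: 13.14},
}
for name, s in series.items():
    base = s[2010]
    print(name, {yr: round(v / base, 4) for yr, v in s.items()})
# nuclear_priority reaches 24.318 and co2_mitigation 0.0657 in 2040.
```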

PARAFAC Tensor Reconstruction for Recommender System based on Apache Spark (아파치 스파크에서의 PARAFAC 분해 기반 텐서 재구성을 이용한 추천 시스템)

  • Im, Eo-Jin;Yong, Hwan-Seung
    • Journal of Korea Multimedia Society / v.22 no.4 / pp.443-454 / 2019
  • In recent years, there has been active research on recommender systems that consider three or more inputs in addition to users and items, forming a multi-dimensional array known as a tensor. The main issue with using tensors is that they contain many missing values, making them sparse. To address this, the tensor can be decomposed by a tensor decomposition algorithm into lower-dimensional arrays called factor matrices; the tensor is then rebuilt from the factor matrices, filling the originally empty cells with predicted values. This is called tensor reconstruction. In this paper, we propose a user-based Top-K recommender system using normalized PARAFAC tensor reconstruction, which factorizes a tensor into factor matrices and then reconstructs the tensor. Before decomposition, the original tensor is normalized along each dimension to reduce overfitting. Using a real-world dataset, this paper demonstrates the processing of a large amount of data and implements the recommender system on Apache Spark. The study also confirms that recommendation performance is improved through normalization of the tensor.
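
A minimal sketch of the reconstruction step, assuming the PARAFAC factor matrices for users, items, and contexts have already been learned (random stand-ins here): the tensor is rebuilt from the factors and queried for user-based Top-K recommendations. The paper's Spark pipeline and per-dimension normalization are omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
n_users, n_items, n_ctx, rank = 4, 6, 3, 2
A = rng.random((n_users, rank))    # user factor matrix
B = rng.random((n_items, rank))    # item factor matrix
C = rng.random((n_ctx, rank))      # context factor matrix

# PARAFAC reconstruction: T[u, i, c] = sum_r A[u, r] * B[i, r] * C[c, r]
T = np.einsum("ur,ir,cr->uic", A, B, C)

def top_k(user, ctx, k=3):
    """Items with the highest predicted scores for (user, context)."""
    return np.argsort(T[user, :, ctx])[::-1][:k]

print(top_k(user=0, ctx=1))
```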