• Title/Summary/Keyword: normalization method

Search Result 640, Processing Time 0.037 seconds

Global Environmental Impacts Assessment of Power Generation Technologies with LCA Method (LCA를 통한 국내 발전기술의 글로벌 환경성 평가)

  • Chung Whan-Sam;Kim Seong-Ho;Kim Tae-Woon
    • Journal of Energy Engineering
    • /
    • v.14 no.2 s.42
    • /
    • pp.140-146
    • /
    • 2005
  • In this study, a quantitative environmental impact assessment of various power generation technologies was performed with the life cycle assessment (LCA) method. LCA is regarded as a useful tool for analyzing diverse environmental impacts at the local, regional, and global scales. Nuclear, coal, and LNG power systems were investigated because together they account for over 90% of the domestic electricity supply in Korea; wind power was also included as a representative of Korean renewable energy systems. According to the three geographical scales, environmental impacts were categorized into eight types. For these impact categories, characterization was carried out to compare the environmental burdens of the power systems under consideration. Normalization was then performed to gain a better understanding of the relative sizes of the impact categories.
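
The normalization step the abstract describes divides each characterized impact score by a category-specific reference value so that categories with different units become comparable. A minimal sketch (all numbers are illustrative, not from the study):

```python
# Sketch of LCA normalization: each characterized impact score is divided
# by a reference value so categories with different units can be compared.
# All values below are illustrative, not taken from the study.

characterized = {                 # impact per kWh, category-specific units
    "global_warming": 850.0,      # g CO2-eq
    "acidification": 2.1,         # g SO2-eq
    "eutrophication": 0.4,        # g PO4-eq
}
reference = {                     # reference totals in the same units
    "global_warming": 12000.0,
    "acidification": 80.0,
    "eutrophication": 30.0,
}

def normalize(char_scores, ref_values):
    """Return dimensionless normalized scores per impact category."""
    return {cat: char_scores[cat] / ref_values[cat] for cat in char_scores}

normalized = normalize(characterized, reference)
```

After normalization the scores are dimensionless, so the relative size of the impact categories can be compared directly.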

Human Activity Recognition using View-Invariant Features and Probabilistic Graphical Models (시점 불변인 특징과 확률 그래프 모델을 이용한 인간 행위 인식)

  • Kim, Hyesuk;Kim, Incheol
    • Journal of KIISE
    • /
    • v.41 no.11
    • /
    • pp.927-934
    • /
    • 2014
  • In this paper, we propose an effective method for recognizing daily human activities from a stream of three-dimensional body poses, which can be obtained using Kinect-like RGB-D sensors. The body pose data provided by the Kinect SDK or OpenNI may suffer from both view variance and scale variance, since the poses are represented in a 3D Cartesian coordinate system whose origin is located at the center of the Kinect. To obtain view-invariant and scale-invariant features, we transform the pose data into a spherical coordinate system whose origin is placed at the center of the subject's hip, and then perform scale normalization using the length of the subject's arm. To effectively represent the complex internal structure of high-level daily activities, we utilize the Hidden-state Conditional Random Field (HCRF), one of the probabilistic graphical models. Through various experiments on two datasets, KAD-70 and CAD-60, we demonstrate the high performance of our method and of the implemented system.
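
The view- and scale-invariant feature idea can be sketched as follows (our own re-implementation of the transformation the abstract describes, not the authors' code): translate each joint so the hip center is the origin, convert to spherical coordinates, and divide the radius by the subject's arm length.

```python
import math

# Map a 3D joint into a view/scale-invariant representation:
# hip-centered spherical coordinates with arm-length-normalized radius.

def to_invariant(joint, hip_center, arm_length):
    """Return (r, theta, phi) relative to the hip center,
    with the radius normalized by arm length."""
    x = joint[0] - hip_center[0]
    y = joint[1] - hip_center[1]
    z = joint[2] - hip_center[2]
    r = math.sqrt(x * x + y * y + z * z)
    theta = math.acos(z / r) if r > 0 else 0.0   # polar angle
    phi = math.atan2(y, x)                       # azimuth
    return (r / arm_length, theta, phi)

feat = to_invariant((1.0, 0.0, 0.0), (0.0, 0.0, 0.0), 0.5)
```

Because the angles are measured relative to the subject's own hip and the radius is in arm-length units, the same pose yields the same feature regardless of where the sensor is placed or how large the subject is.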

Comparative Study on the Attenuation of P and S Waves in the Crust of the Southeastern Korea (한국 남동부 지각의 P파와 S파 감쇠구조 비교연구)

  • Chung, Tae-Woong
    • Journal of the Korean earth science society
    • /
    • v.22 no.2
    • /
    • pp.112-119
    • /
    • 2001
  • The Yangsan fault in southeastern Korea has been receiving increasing attention for its seismic activity. In this fault region, using the extended coda-normalization method on 707 seismograms of local earthquakes, fitted values of $Q_P^{-1}$ and $Q_S^{-1}$ were obtained as $0.009f^{-1.05}$ and $0.004f^{-0.70}$, respectively. These results indicate that $Q_P^{-1}$ and $Q_S^{-1}$ in southeastern Korea are among the lowest levels in the world, although the exponent values agree well with those in other areas. The low $Q^{-1}$ is related not to the movement of the Yangsan fault but to a tectonically inactive setting, as in a shield area.
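
The coda-normalization method rests on the observation that dividing the direct P- or S-wave amplitude by the coda amplitude at a fixed lapse time cancels source and site terms; the slope of $\ln(A_{direct}\,r/A_{coda})$ versus distance $r$ then yields $Q^{-1}$. A toy sketch with synthetic data (our illustration, not the authors' code):

```python
import math

# Recover Q^-1 from the regression slope -pi*f/(Q*v) of log amplitude
# ratios versus hypocentral distance (coda-normalization idea, sketched
# with noise-free synthetic data).

def qinv_from_slope(slope, freq, velocity):
    """Convert the regression slope -pi*f/(Q*v) into Q^-1."""
    return -slope * velocity / (math.pi * freq)

f, v, q_inv_true = 6.0, 3.5, 0.002           # Hz, km/s, target Q^-1
distances = [20.0, 40.0, 60.0, 80.0]         # km
y = [-math.pi * f * q_inv_true / v * r for r in distances]  # ln ratios

# Least-squares slope through the origin: sum(x*y) / sum(x*x)
slope = sum(r * yi for r, yi in zip(distances, y)) / \
        sum(r * r for r in distances)
q_inv_est = qinv_from_slope(slope, f, v)
```

With noise-free data the fit recovers the input $Q^{-1}$ exactly; real seismograms require averaging over many events, as in the 707-seismogram dataset above.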

A Study on the Attenuation of High-frequency P and S Waves in the Crust of the Southeastern Korea using the Seismic Data in Deok-jung Ri (덕정리 지진자료를 이용한 한국남동부지역 지각의 P, S파 감쇠구조 연구)

  • Chung, Tae-Woong;Sato, Haruo
    • Journal of the Korean Geophysical Society
    • /
    • v.3 no.3
    • /
    • pp.193-200
    • /
    • 2000
  • The attenuation characteristics ($Q^{-1}$) are important factors representing the physical properties of the Earth's interior, and are essential for the quantitative prediction of strong ground motion. Based on 156 earthquakes, including 76 single-station records at the seismic station located at Deok-jung Ri, southeastern Korea, we made simultaneous measurements of P- and S-wave attenuation ($Q_P^{-1}$ and $Q_S^{-1}$) by means of the extended coda-normalization method. The estimated $Q_P^{-1}$ and $Q_S^{-1}$ decreased from $1\times10^{-2}$ and $9\times10^{-3}$ at 1.5 Hz to $6\times10^{-4}$ and $5\times10^{-4}$ at 24 Hz, respectively. This can be expressed as $Q_P^{-1}=0.01f^{-1.07}$ and $Q_S^{-1}=0.01f^{-1.03}$, which indicates strong frequency dependence.
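
A power law such as $Q^{-1}=Q_0 f^{-n}$ is obtained from per-frequency measurements by a least-squares fit in log-log space. A sketch with synthetic values consistent with the abstract's $Q_P^{-1}=0.01f^{-1.07}$ (our illustration):

```python
import math

# Fit q_inv = q0 * f**(-n) by linear regression on (log f, log q_inv).

def fit_power_law(freqs, q_inv):
    """Return (q0, n) for q_inv = q0 * f**(-n)."""
    xs = [math.log(f) for f in freqs]
    ys = [math.log(q) for q in q_inv]
    n_pts = len(xs)
    mx = sum(xs) / n_pts
    my = sum(ys) / n_pts
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    return math.exp(intercept), -slope

freqs = [1.5, 3.0, 6.0, 12.0, 24.0]
q_inv = [0.01 * f ** -1.07 for f in freqs]   # synthetic, noise-free
q0, n = fit_power_law(freqs, q_inv)
```

The exponent $n$ close to 1 is what the abstract calls "strong frequency dependence."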

Embedded-type Search Function with Feedback for Smartphone Applications (스마트폰 애플리케이션을 위한 임베디드형 피드백 지원 검색체)

  • Kang, Moonjoong;Hwang, Mintae
    • Journal of the Korea Institute of Information and Communication Engineering
    • /
    • v.21 no.5
    • /
    • pp.974-983
    • /
    • 2017
  • In this paper, we discuss a search function that can be embedded in Android-based applications. We used BM25 to suppress insignificant but frequent words such as postpositions; the pivoted length normalization technique to resolve the search-priority problem related to each item's length; and Rocchio's method, which pulls items inferred to be related to the query closer to the query vector in a vector space model, to support implicit feedback. The index operation is divided into two methods: a simple index to support offline operation and a complex index for online operation. The implementation uses a query inference function that guesses the user's future input by collating the present input with indexed data, which allows it to handle and correct user errors. The implementation can thus be easily adopted into smartphone applications to improve their search functions.
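
The role of BM25 in the pipeline above can be seen from the standard Okapi formula (sketched below; this is the textbook formulation, not the authors' code): term-frequency saturation keeps overly frequent words from dominating, and the document-length term in the denominator is what length normalization adjusts.

```python
import math

# Minimal Okapi BM25 over tokenized documents.

def bm25_score(query_terms, doc, corpus, k1=1.2, b=0.75):
    n_docs = len(corpus)
    avg_len = sum(len(d) for d in corpus) / n_docs
    score = 0.0
    for term in query_terms:
        df = sum(1 for d in corpus if term in d)   # document frequency
        if df == 0:
            continue
        idf = math.log((n_docs - df + 0.5) / (df + 0.5) + 1.0)
        tf = doc.count(term)
        # b controls how strongly document length is normalized
        denom = tf + k1 * (1 - b + b * len(doc) / avg_len)
        score += idf * (tf * (k1 + 1)) / denom
    return score

corpus = [["search", "engine", "index"],
          ["index", "update", "index"],
          ["smartphone", "app"]]
s = bm25_score(["index"], corpus[1], corpus)
```

Because `tf` appears in both numerator and denominator, the score saturates as a term repeats, which is what suppresses high-frequency function words.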

Normalization of Spectral Magnitude and Cepstral Transformation for Compensation of Lombard Effect (롬바드 효과의 보정을 위한 스펙트럼 크기의 정규화와 켑스트럼 변환)

  • Chi, Sang-Mun;Oh, Yung-Hwan
    • The Journal of the Acoustical Society of Korea
    • /
    • v.15 no.4
    • /
    • pp.83-92
    • /
    • 1996
  • This paper describes Lombard-effect compensation and noise suppression to reduce speech recognition errors in noisy environments. The Lombard effect is represented by the variation of the spectral envelope of an energy-normalized word and the variation of overall vocal intensity. The variation of the spectral envelope can be compensated by a linear transformation in the cepstral domain, while the variation of vocal intensity is canceled by spectral magnitude normalization. Spectral subtraction is used to suppress noise contamination, and band-pass filtering is used to emphasize dynamic features. To understand the Lombard effect and verify the effectiveness of the proposed method, speech data were collected in simulated noisy environments. Recognition experiments were conducted with noise from automobile cabins, an exhibition hall, downtown telephone booths, crowded streets, and computer rooms; the experiments confirmed the effectiveness of the proposed method.
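
The two signal-level operations named above are simple per-frame computations. An illustrative re-implementation (not the authors' code): spectral subtraction removes an estimated noise magnitude per frequency bin, and magnitude normalization cancels the overall vocal-intensity change of Lombard speech.

```python
# Per-frame spectral-magnitude processing sketches.

def spectral_subtract(magnitudes, noise_est, floor=0.01):
    """Subtract the noise estimate per bin, flooring at a small value
    to avoid negative magnitudes."""
    return [max(m - n, floor) for m, n in zip(magnitudes, noise_est)]

def normalize_magnitude(magnitudes):
    """Scale the spectrum to unit energy, canceling overall intensity."""
    energy = sum(m * m for m in magnitudes) ** 0.5
    return [m / energy for m in magnitudes]

clean = spectral_subtract([1.0, 0.5, 0.2], [0.1, 0.1, 0.1])
unit = normalize_magnitude(clean)
```

After normalization the frame has unit energy, so a louder (Lombard) rendition of the same word maps to the same normalized spectrum.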

A Method for Same Author Name Disambiguation in Domestic Academic Papers (국내 학술논문의 동명이인 저자명 식별을 위한 방법)

  • Shin, Daye;Yang, Kiduk
    • Journal of the Korean BIBLIA Society for library and Information Science
    • /
    • v.28 no.4
    • /
    • pp.301-319
    • /
    • 2017
  • The task of author name disambiguation involves identifying an author who appears under different names, or different authors who share the same name. Author name disambiguation is important for correctly assessing authors' research achievements and finding experts in given areas, as well as for the effective operation of scholarly information services such as citation indexes. In this study, we performed error correction and normalization of the data and applied rule-based author name disambiguation, comparing it with a baseline machine-learning disambiguation to see whether human intervention could improve the machine-learning performance. The corrected and normalized email-based disambiguation improved on machine learning by over 0.1 in F-measure, demonstrating the potential of human pattern identification and inference, which enabled the data correction and normalization process as well as the formation of the rule-based disambiguation, to complement the weaknesses of machine learning and improve author name disambiguation results.
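
A toy sketch of the email-based rule idea (our illustration, not the study's actual rules): after normalizing the records, two entries with the same name string are merged only when their normalized e-mail addresses also match.

```python
# Rule-based merge decision for two author records, using normalized
# e-mail addresses to disambiguate identical name strings.

def normalize_email(email):
    """Lower-case and strip whitespace so trivial variants compare equal."""
    return email.strip().lower()

def same_author(rec_a, rec_b):
    if rec_a["name"] != rec_b["name"]:
        return False
    return normalize_email(rec_a["email"]) == normalize_email(rec_b["email"])

a = {"name": "Kim, Minsu", "email": "MSKim@example.ac.kr "}
b = {"name": "Kim, Minsu", "email": "mskim@example.ac.kr"}
c = {"name": "Kim, Minsu", "email": "minsu.kim@other.example"}
```

Here `a` and `b` are identified as one author despite the differently-cased e-mail, while `c`, a different person with the same name, stays separate; normalization is what makes the rule robust to entry errors.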

Normalizing interval data and their use in AHP (구간데이터 정규화와 계층적 분석과정에의 활용)

  • Kim, Eun Young;Ahn, Byeong Seok
    • Journal of Intelligence and Information Systems
    • /
    • v.22 no.2
    • /
    • pp.1-11
    • /
    • 2016
  • Entani and Tanaka (2007) presented a new approach for obtaining interval evaluations suitable for handling uncertain data. Their approach is characterized by the normalization of interval data and thus the elimination of redundant bounds, and interval global weights in AHP are derived from such normalized interval data. In this paper, we present a heuristic method for finding the extreme points of interval data, which extends the method of Entani and Tanaka (2007) and also helps to obtain normalized interval data. In the second part of the paper, we show that the solutions to the linear program for interval global weights can be obtained by simple inspection. In addition, the absolute dominance proposed by the authors is extended to pairwise dominance, which makes it possible to identify additional dominated alternatives under the same information.
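
For intuition, one common normalization rule for interval weights divides each lower bound by (its own lower bound plus the others' upper bounds) and each upper bound by (its own upper bound plus the others' lower bounds). This is a standard interval-AHP formulation, not necessarily Entani and Tanaka's exact construction:

```python
# Normalize a list of interval weights (lower, upper) so that some
# crisp weight vector inside the intervals can sum to one.

def normalize_intervals(intervals):
    """intervals: list of (lower, upper); returns normalized bounds."""
    out = []
    for i, (lo, up) in enumerate(intervals):
        other_up = sum(u for j, (_, u) in enumerate(intervals) if j != i)
        other_lo = sum(l for j, (l, _) in enumerate(intervals) if j != i)
        out.append((lo / (lo + other_up), up / (up + other_lo)))
    return out

weights = normalize_intervals([(1.0, 2.0), (2.0, 3.0), (1.0, 1.0)])
```

The resulting bounds are dimensionless and free of redundancy in the sense that tightening any bound further would exclude some feasible crisp weight vector.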

Comparison of Histogram Equalization Techniques Using Normalization in Thoracic Computed Tomography (흉부 컴퓨터 단층 촬영에서 정규화를 사용한 다양한 히스토그램 평준화 기법을 비교)

  • Lee, Young-Jun;Min, Jung-Whan
    • Journal of radiological science and technology
    • /
    • v.44 no.5
    • /
    • pp.473-480
    • /
    • 2021
  • The purpose of this study was to propose a method for improving image quality in CT and X-ray scans, especially in the lung region. We also examined image parameters before and after applying histogram equalization (HE), such as the mean and median values of the histogram. These techniques are used on many types of medical images, such as chest X-ray and low-dose computed tomography (CT), and also to intensify tiny anatomical structures such as vessels, lung nodules, airways, and pulmonary fissures. The proposed technique consists of two main steps implemented in MATLAB (R2021a). First, normalization is applied to correct the base image and actively rearrange the intensity contrast. Second, the contrast-limited adaptive histogram equalization (CLAHE) method is used to enhance small details, textures, and the local contrast of the image. The results show improved HE techniques and their advantages over traditional HE. The paper concludes that the various HE-related techniques can be helpful in many processes, especially image pre-processing for machine learning (ML) and deep learning (DL).
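
The two steps can be sketched in NumPy (an illustrative re-implementation, not the authors' MATLAB code): min-max normalization maps intensities to [0, 255], and global histogram equalization then maps each intensity through the normalized cumulative histogram. CLAHE additionally clips the histogram and works on local tiles, which this sketch omits.

```python
import numpy as np

def min_max_normalize(img):
    """Rescale intensities to [0, 255] (rounded to integer levels)."""
    img = img.astype(np.float64)
    return np.rint((img - img.min()) / (img.max() - img.min()) * 255.0)

def histogram_equalize(img, levels=256):
    """Map intensities through the normalized CDF of the histogram."""
    flat = img.astype(np.int64).ravel()
    hist = np.bincount(flat, minlength=levels)
    cdf = hist.cumsum()
    cdf = cdf / cdf[-1]                       # normalize CDF to [0, 1]
    return (cdf[flat] * (levels - 1)).reshape(img.shape)

img = np.array([[0, 0, 1], [1, 2, 3]])
eq = histogram_equalize(min_max_normalize(img).astype(np.int64))
```

Equalization spreads the frequently occurring intensities over a wider range, which is why it raises local contrast in low-contrast regions such as lung parenchyma.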

LSTM based sequence-to-sequence Model for Korean Automatic Word-spacing (LSTM 기반의 sequence-to-sequence 모델을 이용한 한글 자동 띄어쓰기)

  • Lee, Tae Seok;Kang, Seung Shik
    • Smart Media Journal
    • /
    • v.7 no.4
    • /
    • pp.17-23
    • /
    • 2018
  • We propose an LSTM-based RNN model that effectively performs automatic word spacing. For long or noisy sentences, which are known to be difficult to handle in neural network learning, we defined proper input and decoding data formats, and added dropout, bidirectional multi-layer LSTM, layer normalization, and an attention mechanism to improve performance. Although the Sejong corpus contains some spacing errors, the noise-robust learning model developed in this study, trained without overfitting thanks to dropout, returned meaningful results on Korean word spacing and its patterns. The experimental results showed that the LSTM sequence-to-sequence model achieves an F1-measure of 0.94, better than the rule-based deep-learning method of GRU-CRF.
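
Of the additions listed above, layer normalization is the simplest to state: each feature vector is normalized to zero mean and unit variance over its feature axis, then scaled and shifted by learned parameters. A plain NumPy sketch (not the authors' implementation):

```python
import numpy as np

def layer_norm(x, gain, bias, eps=1e-5):
    """Normalize over the last (feature) axis, then scale and shift."""
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return gain * (x - mean) / np.sqrt(var + eps) + bias

x = np.array([[1.0, 2.0, 3.0], [10.0, 10.0, 10.0]])
y = layer_norm(x, gain=1.0, bias=0.0)
```

Unlike batch normalization, the statistics are computed per time step over the features, which is why the technique stabilizes recurrent models such as the LSTM stack used here.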