• Title/Summary/Keyword: normalization method


Implementation of Text Summarize Automation Using Document Length Normalization (문서 길이 정규화를 이용한 문서 요약 자동화 시스템 구현)

  • 이재훈;김영천;이성주
    • Proceedings of the Korean Institute of Intelligent Systems Conference / 2001.12a / pp.51-55 / 2001
  • With the rapid growth of the World Wide Web and electronic information services, information is becoming available on-line at an incredible rate. One result is the oft-decried information overload. No one has time to read everything, yet we often have to make critical decisions based on what we are able to assimilate. The technology of automatic text summarization is becoming indispensable for dealing with this problem. Text summarization is the process of distilling the most important information from a source to produce an abridged version for a particular user or task. Information retrieval (IR) is the task of searching a set of documents for query-relevant documents; text summarization, by analogy, can be considered the task of searching a document, viewed as a set of sentences, for topic-relevant sentences. In this paper, we show that document length normalization, a technique taken from information retrieval, yields document information that is more reliable and better suited to the query. Experimental results on newspaper articles show that the document length normalization method is superior to methods that use the query alone. (A minimal length-normalization sketch follows this entry.)

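The abstract does not spell out the normalization formula it uses, so the following is only a minimal sketch of one common IR-style scheme, pivoted length normalization of sentence scores under a simple bag-of-words model; the function name, the slope constant, and the toy sentences are illustrative assumptions, not the authors' method.

```python
import math
from collections import Counter

def sentence_scores(sentences, query_terms, slope=0.25):
    """Score sentences by query-term overlap with pivoted length
    normalization (a common IR heuristic); the pivot is the average
    sentence length over the document."""
    tokenized = [s.lower().split() for s in sentences]
    avg_len = sum(len(t) for t in tokenized) / len(tokenized)
    scores = []
    for tokens in tokenized:
        tf = Counter(tokens)
        raw = sum(1 + math.log(tf[q]) for q in query_terms if tf[q] > 0)
        # Pivoted length normalization: long sentences are penalized and
        # short ones slightly boosted, relative to the average length.
        norm = (1.0 - slope) + slope * (len(tokens) / avg_len)
        scores.append(raw / norm)
    return scores

# Usage: rank sentences and keep the top-k as the summary.
sents = ["normalization adjusts scores by document length",
         "the weather was pleasant",
         "length normalization improves retrieval and summarization"]
print(sentence_scores(sents, ["normalization", "length"]))
```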

Image Classification Method using Independent Component Analysis and Normalization (독립성분해석과 정규화를 이용한 영상분류 방법)

  • Hong, Jun-Sik;Ryu, Jeong-Woong
    • Journal of KIISE: Software and Applications / v.28 no.9 / pp.629-633 / 2001
  • In this paper, we improve noise tolerance in image classification by combining ICA (Independent Component Analysis) with normalization. When noise is added to the raw image data, the degree of noise tolerance is N(0, 0.4) for PCA and N(0, 0.53) for ICA. When the normalization preprocessing step is used, however, the degree of noise tolerance rises to N(0, 0.75), showing the improvement in classification noise tolerance. (An illustrative ICA-plus-normalization sketch follows this entry.)

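The paper's exact pipeline is not given in the abstract; the sketch below assumes a generic setup, scikit-learn's FastICA followed by per-sample unit-length normalization and a nearest-neighbour classifier, with random arrays standing in for image data.

```python
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.neighbors import KNeighborsClassifier

def ica_normalized_features(X, n_components=16, seed=0):
    """Project flattened images (rows of X) onto independent components,
    then scale each feature vector to unit length as a simple
    normalization step."""
    ica = FastICA(n_components=n_components, random_state=seed)
    S = ica.fit_transform(X)                        # independent components
    S /= np.linalg.norm(S, axis=1, keepdims=True)   # per-sample normalization
    return S, ica

# Usage sketch with synthetic data standing in for image vectors.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 64))
y_train = rng.integers(0, 4, size=100)
feats, ica = ica_normalized_features(X_train)
clf = KNeighborsClassifier(n_neighbors=3).fit(feats, y_train)

# Add Gaussian noise and evaluate on the same labels.
X_noisy = X_train + rng.normal(scale=0.4, size=X_train.shape)
S_noisy = ica.transform(X_noisy)
S_noisy /= np.linalg.norm(S_noisy, axis=1, keepdims=True)
print("accuracy on noisy copies:", clf.score(S_noisy, y_train))
```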

Short Term Sensor's Drift Analysis and Compensation Using Internal Normalization (내부 최적화를 이용한 화학 센서의 단기 드리프트 분석 및 보정)

  • Jeon, Jin-Young;Baek, Jong-Hyun;Byun, Hyung-Gi
    • Journal of Sensor Science and Technology / v.24 no.4 / pp.270-273 / 2015
  • One of the main problems when working with chemical sensors is the lack of repeatability and reproducibility of the sensor response. If this problem is not properly taken into consideration, the stability and reliability of a system using chemical sensors are degraded. In this paper, we analyze the short-term drift of the sensor and propose a compensation method that reduces the effects of the drift in order to improve the stability and reliability of the chemical sensor. The drift was analyzed with a trend-line graph, and the coefficient of variation (CV) was used to quantify it. The drift was then compensated by internal normalization, and the value of CV decreased after compensation. (A small sketch of CV and internal normalization follows this entry.)
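
As a rough illustration of the two quantities mentioned, here is a hedged sketch that assumes "internal normalization" means dividing each sensor's response by the summed response of the whole sensor array at the same measurement; the paper's actual definition may differ.

```python
import numpy as np

def coefficient_of_variation(responses):
    """CV = standard deviation / mean; a unitless measure of how much
    repeated sensor responses drift around their average."""
    responses = np.asarray(responses, dtype=float)
    return responses.std(ddof=1) / responses.mean()

def internal_normalization(array_responses):
    """Divide each sensor's response by the summed response of the array
    at that measurement, so a common multiplicative drift cancels out."""
    array_responses = np.asarray(array_responses, dtype=float)
    return array_responses / array_responses.sum(axis=1, keepdims=True)

# Usage: rows are repeated measurements, columns are sensors in the array.
raw = np.array([[1.00, 2.00, 0.500],
                [1.10, 2.20, 0.550],    # +10% common drift
                [0.95, 1.90, 0.475]])   # -5% common drift
print("CV before:", coefficient_of_variation(raw[:, 0]))
print("CV after: ", coefficient_of_variation(internal_normalization(raw)[:, 0]))
```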

Comparative Analysis for Emotion Expression Using Three Methods Based by CNN (CNN기초로 세 가지 방법을 이용한 감정 표정 비교분석)

  • Yang, Chang Hee;Park, Kyu Sub;Kim, Young Seop;Lee, Yong Hwan
    • Journal of the Semiconductor & Display Technology / v.19 no.4 / pp.65-70 / 2020
  • CNN techniques relevant to emotion detection include the basic CNN algorithm, batch normalization, and dropout. We present the methods and data of three experiments in this paper; the training database and the test database are set up differently. The first experiment extracts emotions using batch normalization, which compensates for shifts in the distribution of activations. The second experiment extracts emotions using dropout, which also speeds up computation. The third experiment uses a plain CNN with convolution and max-pooling. All three results show a low detection rate; to address this, we will develop a deep learning algorithm using feature extraction methods specialized to the image processing field. (A combined sketch of the three ingredients follows this entry.)
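
The three experiments are described separately in the paper; the sketch below simply combines the three ingredients (convolution with max-pooling, batch normalization, dropout) in one small PyTorch model. The layer sizes, the 48x48 grayscale input, and the 7 emotion classes are illustrative assumptions, not the paper's architecture.

```python
import torch
import torch.nn as nn

class EmotionCNN(nn.Module):
    """Small convolutional classifier combining convolution + max-pooling,
    batch normalization, and dropout; sizes are illustrative."""
    def __init__(self, num_classes=7):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),
            nn.BatchNorm2d(16),        # normalizes activations per mini-batch
            nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1),
            nn.BatchNorm2d(32),
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Dropout(0.5),           # randomly zeroes features during training
            nn.Linear(32 * 12 * 12, num_classes),
        )

    def forward(self, x):
        return self.classifier(self.features(x))

# Usage with a batch of 48x48 grayscale face crops.
model = EmotionCNN()
logits = model(torch.randn(8, 1, 48, 48))
print(logits.shape)  # torch.Size([8, 7])
```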

Annual Yearly Load Forecasting by Using Seasonal Load Characteristics With Considering Weekly Normalization (주단위 정규화를 통하여 계절별 부하특성을 고려한 연간 전력수요예측)

  • Cha, Jun-Min;Yoon, Kyoung-Ha;Ku, Bon-Hui
    • Proceedings of the KIEE Conference / 2011.07a / pp.199-200 / 2011
  • Load forecasting is very important for power system analysis and planning. This paper proposes yearly load forecasting that considers weekly normalization and seasonal load characteristics. Each weekly peak load is normalized and the average value is calculated, and the new hourly peak loads are collected by season. This method was used for yearly load forecasting, and the error rate was obtained by comparing the actual data with the forecast data. (A minimal weekly-normalization sketch follows this entry.)

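A minimal sketch of the weekly normalization step as described: each week's hourly load is divided by that week's peak and the normalized weeks are averaged into one profile. The seasonal grouping and the forecasting model itself are not reproduced, and all numbers are synthetic.

```python
import numpy as np

def weekly_normalized_profile(hourly_load, hours_per_week=168):
    """Normalize each week's hourly load by that week's peak, then average
    the normalized weeks into a representative weekly profile; re-scaling
    the profile by a forecast peak gives an hourly forecast."""
    hourly_load = np.asarray(hourly_load, dtype=float)
    n_weeks = len(hourly_load) // hours_per_week
    weeks = hourly_load[: n_weeks * hours_per_week].reshape(n_weeks, hours_per_week)
    normalized = weeks / weeks.max(axis=1, keepdims=True)  # weekly normalization
    return normalized.mean(axis=0)

# Usage: two synthetic weeks with the same shape but different peak levels.
week_shape = 0.6 + 0.4 * np.sin(np.linspace(0, 14 * np.pi, 168)) ** 2
load = np.concatenate([100 * week_shape, 120 * week_shape])
profile = weekly_normalized_profile(load)
forecast_peak = 130.0
print("forecast hourly loads:", (forecast_peak * profile)[:5])
```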

Automatic 3D Head Pose-Normalization using 2D and 3D Interaction (자동 3차원 얼굴 포즈 정규화 기법)

  • Yu, Sun-Jin;Kim, Joong-Rock;Lee, Sang-Youn
    • Proceedings of the IEEK Conference / 2007.07a / pp.211-212 / 2007
  • Pose variation presents a significant problem in 2D face recognition. To solve this problem, various approaches use a 3D face acquisition system able to generate multi-view images; however, this creates another pose estimation problem in terms of normalizing the 3D face data. This paper presents a 3D head pose-normalization method using 2D and 3D interaction. The proposed method uses 2D information from the AAM (Active Appearance Model) and 3D information from a 3D normal vector. To verify the performance of the proposed method, we designed an experiment using 2.5D face recognition. Experimental results show that the proposed method is robust against pose variation. (A sketch of the normal-vector alignment step follows this entry.)

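The AAM-based 2D landmarking is not reproduced here; the sketch below only illustrates a generic 3D normal-vector step, fitting a least-squares plane to a face point cloud and rotating its normal onto the camera axis, with a synthetic planar patch as input. It is an illustration of the idea, not the paper's algorithm.

```python
import numpy as np

def pose_normalize(points):
    """Rotate a 3D point cloud so that the least-squares plane normal of
    the points aligns with the +Z (camera) axis."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    # The right singular vector with the smallest singular value is the plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    if normal[2] < 0:
        normal = -normal
    target = np.array([0.0, 0.0, 1.0])
    v = np.cross(normal, target)
    c = float(np.dot(normal, target))
    if np.isclose(c, 1.0):                  # already frontal
        return centered
    vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    # Rodrigues-style rotation taking `normal` onto `target`.
    R = np.eye(3) + vx + vx @ vx * (1.0 / (1.0 + c))
    return centered @ R.T

# Usage: a tilted planar patch of points becomes (approximately) frontal.
rng = np.random.default_rng(1)
xy = rng.uniform(-1, 1, size=(200, 2))
tilted = np.c_[xy, 0.3 * xy[:, 0]]         # points on the plane z = 0.3 * x
print(np.abs(pose_normalize(tilted)[:, 2]).max())  # close to 0 after normalization
```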

Analysis on Topographic Normalization Methods for 2019 Gangneung-East Sea Wildfire Area Using PlanetScope Imagery (2019 강릉-동해 산불 피해 지역에 대한 PlanetScope 영상을 이용한 지형 정규화 기법 분석)

  • Chung, Minkyung;Kim, Yongil
    • Korean Journal of Remote Sensing / v.36 no.2_1 / pp.179-197 / 2020
  • Topographic normalization reduces terrain effects on reflectance by adjusting the brightness values of image pixels so that pixels covering the same land cover take equal values. Topographic effects are induced by the imaging conditions and tend to be large in high mountainous regions. Image analysis over mountainous terrain, such as wildfire damage assessment, therefore requires appropriate topographic normalization techniques to yield accurate results. However, most previous studies evaluated topographic normalization on satellite images with moderate-to-low spatial resolution, so the alleviation of topographic effects on multi-temporal high-resolution images has not been dealt with sufficiently. In this study, terrain normalization was evaluated for each band to select the optimal combination of techniques for rapid and accurate wildfire damage assessment using PlanetScope images. PlanetScope has considerable potential in the disaster management field, as it supports rapid image acquisition by providing daily 3 m resolution images with global coverage. For comparison, seven widely used topographic normalization methods were applied to both pre-fire and post-fire images. The analysis of the bi-temporal images suggests the optimal combination of techniques that can be applied to images with different land-cover composition. The vegetation index was then calculated from the images after topographic normalization with the proposed method. The wildfire damage detection results, obtained by thresholding the index, showed improvements in detection accuracy for both object-based and pixel-based image analysis. In addition, a burn severity map was constructed to verify the effects of topographic correction on a continuous distribution of brightness values. (A sketch of one common correction, the C-correction, follows this entry.)
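
The seven compared methods are not listed in the abstract, so the sketch below shows just one widely used technique, the C-correction, with synthetic cos(i) values standing in for DEM-derived illumination; it illustrates the general idea, not the paper's selected combination.

```python
import numpy as np

def c_correction(reflectance, cos_i, solar_zenith_deg):
    """Topographic normalization with the C-correction:
    rho_corr = rho * (cos(theta_z) + c) / (cos(i) + c),
    where i is the local solar incidence angle on the slope and c is the
    intercept/slope ratio of the regression of reflectance against cos(i)."""
    cos_sz = np.cos(np.deg2rad(solar_zenith_deg))
    slope, intercept = np.polyfit(cos_i.ravel(), reflectance.ravel(), 1)
    c = intercept / slope
    return reflectance * (cos_sz + c) / (cos_i + c)

# Usage: synthetic band whose brightness depends linearly on illumination.
rng = np.random.default_rng(0)
cos_i = rng.uniform(0.3, 1.0, size=(100, 100))            # from DEM + sun angles
band = 0.05 + 0.20 * cos_i + rng.normal(0, 0.005, cos_i.shape)
corrected = c_correction(band, cos_i, solar_zenith_deg=35.0)
print(band.std(), corrected.std())   # terrain-induced variance is reduced
```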

An Experimental Study on the Relationship between Deformation and Relative Settlement for Weathered-granite (화강풍화토의 변형계수와 상대침하 관계식에 관한 실험적 연구)

  • Park, Yong-Boo
    • Land and Housing Review / v.4 no.1 / pp.125-131 / 2013
  • Plate load test results are used to predict the actual bearing capacity and settlement of shallow foundations, but there is no field estimation method for weathered igneous soil and rock. Therefore, plate load tests on weathered igneous soil and rock were carried out in this study to derive a settlement prediction equation. Instead of the usual analysis method, the load versus relative settlement (s/B, where s is the settlement and B is the breadth of the plate) was used to normalize the load-settlement curve. As a result of normalization by the load versus relative settlement concept, the curves were consistent regardless of plate diameter, and a relationship between the in-situ soil condition and the test results was suggested. (A small s/B normalization sketch follows this entry.)
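
A small sketch of the s/B normalization itself: settlement divided by plate breadth so that curves from different plate sizes can be overlaid on one dimensionless axis. The load and settlement values are invented for illustration, not the paper's measurements.

```python
import numpy as np

def relative_settlement_curve(load, settlement, plate_breadth):
    """Return the load values and the dimensionless relative settlement s/B,
    so curves from different plate sizes can be compared directly."""
    return np.asarray(load, float), np.asarray(settlement, float) / plate_breadth

# Usage: two plates of different breadth tested on the same ground.
load  = np.array([50, 100, 150, 200])            # kPa
s_300 = np.array([1.5, 3.3, 5.4, 8.1])           # mm, B = 300 mm plate
s_600 = np.array([3.0, 6.6, 10.8, 16.2])         # mm, B = 600 mm plate
_, sb_300 = relative_settlement_curve(load, s_300, 300.0)
_, sb_600 = relative_settlement_curve(load, s_600, 600.0)
print(np.allclose(sb_300, sb_600))   # the normalized curves collapse together
```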

Improving the CONTES method for normalizing biomedical text entities with concepts from an ontology with (almost) no training data at BLAH5

  • Ferre, Arnaud;Ba, Mouhamadou;Bossy, Robert
    • Genomics & Informatics / v.17 no.2 / pp.20.1-20.5 / 2019
  • Entity normalization, or entity linking in the general domain, is an information extraction task that aims to annotate/bind multiple words/expressions in raw text with semantic references, such as concepts of an ontology. An ontology consists minimally of a formally organized vocabulary or hierarchy of terms that captures knowledge of a domain. Presently, machine-learning methods, often coupled with distributional representations, achieve good performance; however, they require large training datasets, which are not always available, especially for tasks in specialized domains. CONTES (CONcept-TErm System) is a supervised method that addresses entity normalization with ontology concepts using small training datasets. CONTES has some limitations: it does not scale well to very large ontologies, it tends to overgeneralize predictions, and it lacks valid representations for out-of-vocabulary words. Here, we propose to assess different methods to reduce the dimensionality of the ontology representation. We also propose to calibrate parameters to make the predictions more accurate, and to address the problem of out-of-vocabulary words with a specific method. (A stripped-down projection sketch follows this entry.)
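
CONTES itself also exploits the ontology's structure and real word embeddings; the sketch below is a stripped-down illustration of the core idea only, a least-squares linear projection from mention embeddings to a concept space followed by nearest-concept lookup, with random vectors standing in for embeddings and hypothetical concept identifiers.

```python
import numpy as np

def train_projection(mention_vecs, concept_vecs):
    """Least-squares linear map from mention-embedding space to concept
    space, learned from a small supervised set of (mention, concept) pairs."""
    W, *_ = np.linalg.lstsq(mention_vecs, concept_vecs, rcond=None)
    return W

def link_entity(mention_vec, W, concept_matrix, concept_ids):
    """Project the mention and return the nearest concept by cosine similarity."""
    p = mention_vec @ W
    sims = (concept_matrix @ p) / (
        np.linalg.norm(concept_matrix, axis=1) * np.linalg.norm(p) + 1e-12)
    return concept_ids[int(np.argmax(sims))]

# Toy usage: random vectors stand in for word embeddings and ontology concepts.
rng = np.random.default_rng(0)
concept_ids = ["ONT:0001", "ONT:0002", "ONT:0003"]   # hypothetical identifiers
concepts = rng.normal(size=(3, 8))                   # concept-space vectors
mentions = rng.normal(size=(6, 12))                  # six labelled mention embeddings
labels = np.array([0, 1, 2, 0, 1, 2])
W = train_projection(mentions, concepts[labels])
print(link_entity(mentions[3], W, concepts, concept_ids))  # expected: ONT:0001
```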

A Study on Comparison of Normalization and Weighting Method for Constructing Index about Flood (홍수관련 지표 산정을 위한 표준화 및 가중치 비교 연구)

  • Baeck, Seung-Hyub;Choi, Si-Jung;Hong, Seung-Jin;Kim, Dong-Phil
    • Journal of Wetlands Research / v.13 no.3 / pp.411-426 / 2011
  • When constructing composite indicators, the component variables should be normalized and weighted to make them comparable and evaluable. The field, however, lacks a distinct methodology, and simply applying a universally popular method is common: in most research, index builders do not compare and analyze various normalization and weighting schemes but develop indicators and indices with a single chosen method. In this study, various normalization and weighting methods are applied to the indices in order to analyze how much each affects the index, to identify their individual characteristics, and to derive a more reasonable approach that can help future research. Five normalization methods and four weighting schemes were compared and analyzed. The results differ depending on the normalization method applied, and the Z-score method best reflects the characteristics of the variables. Between weighting methods, the calculated results show little difference, and the ranking of the indices did not change significantly. Based on these results, it may be better to provide index builders with a set of normalization and weighting methods that reflect the characteristics of their variables when building flood indices. (A small comparison sketch follows this entry.)
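
The five normalization methods and four weighting schemes are not enumerated in the abstract; the sketch below compares just two common normalizations (min-max and Z-score) under a simple weighted sum, with synthetic indicator values, to show how the choice of normalization can change an index ranking.

```python
import numpy as np

def min_max(x):
    """Rescale a variable to [0, 1]; sensitive to outliers at the extremes."""
    x = np.asarray(x, float)
    return (x - x.min()) / (x.max() - x.min())

def z_score(x):
    """Standardize to zero mean and unit variance, preserving relative spread."""
    x = np.asarray(x, float)
    return (x - x.mean()) / x.std(ddof=1)

def composite_index(indicators, weights):
    """Weighted sum of already-normalized indicator columns."""
    w = np.asarray(weights, float)
    return np.asarray(indicators, float) @ (w / w.sum())

# Usage: three flood-related indicators for four regions (synthetic values).
raw = np.array([[120., 0.30, 5.0],
                [ 80., 0.55, 2.0],
                [200., 0.10, 9.0],
                [150., 0.40, 4.0]])
for norm in (min_max, z_score):
    normed = np.column_stack([norm(raw[:, j]) for j in range(raw.shape[1])])
    idx = composite_index(normed, weights=[0.5, 0.3, 0.2])
    print(norm.__name__, np.argsort(-idx))   # region ranking under each scheme
```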