• Title/Summary/Keyword: Shannon Entropy

Assessment of water use vulnerability in the unit watersheds using TOPSIS approach with subjective and objective weights (주관적·객관적 가중치를 활용한 TOPSIS 기반 단위유역별 물이용 취약성 평가)

  • Park, Hye Sun;Kim, Jeong Bin;Um, Myoung-Jin;Kim, Yeonjoo
    • Journal of Korea Water Resources Association / v.49 no.8 / pp.685-692 / 2016
  • This study aimed to develop an indicator-based approach to assessing water use vulnerability in watersheds and applied it to the unit watersheds within the Han River watershed. The vulnerability index comprised three sub-components (exposure, sensitivity, adaptive capacity) with respect to water use and was built from 16 water use indicators. We then estimated vulnerability indices using the Technique for Order of Preference by Similarity to Ideal Solution (TOPSIS). We collected environmental and socio-economic data from national statistics databases and combined them with results simulated by the Soil and Water Assessment Tool (SWAT) model. To estimate the weight of each indicator, expert surveys were used for the subjective weights and the data-based Shannon entropy method for the objective weights. By comparing the vulnerability ranks and analyzing the rank correlation between the two methods, we evaluated the vulnerabilities across the Han River watershed. For water use, vulnerable watersheds showed high water use and high water leakage ratios. The indices from both weighting methods showed a broadly similar spatial distribution. These results suggest that considering different weighting methods is important for reliably assessing water use vulnerability in watersheds.
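
A minimal sketch of the TOPSIS step and the comparison of subjective and objective weightings described above, not the study's code: the 4x3 decision matrix, the cost/benefit flags, and both weight vectors are hypothetical stand-ins for the 16 water-use indicators (in the paper the objective weights would come from the Shannon entropy method).

```python
# Illustrative only: hypothetical decision matrix and weights, not the study's data.
import numpy as np
from scipy.stats import spearmanr

def topsis(X, w, benefit):
    """Return TOPSIS closeness coefficients (higher = closer to the ideal solution)."""
    R = X / np.sqrt((X ** 2).sum(axis=0))          # vector-normalised matrix
    V = R * w                                      # weighted normalised matrix
    ideal = np.where(benefit, V.max(axis=0), V.min(axis=0))
    anti  = np.where(benefit, V.min(axis=0), V.max(axis=0))
    d_pos = np.sqrt(((V - ideal) ** 2).sum(axis=1))
    d_neg = np.sqrt(((V - anti)  ** 2).sum(axis=1))
    return d_neg / (d_pos + d_neg)

X = np.array([[0.62, 3.1, 45.0],                   # rows: unit watersheds (hypothetical)
              [0.48, 2.4, 60.0],                   # cols: exposure / sensitivity /
              [0.71, 4.0, 30.0],                   #       adaptive-capacity indicators
              [0.55, 2.9, 52.0]])
benefit = np.array([False, False, True])           # adaptive capacity is larger-is-better

w_subjective = np.array([0.50, 0.30, 0.20])        # e.g. from an expert survey
w_objective  = np.array([0.25, 0.45, 0.30])        # e.g. from the Shannon entropy method

cc_sub = topsis(X, w_subjective, benefit)
cc_obj = topsis(X, w_objective, benefit)
rho, _ = spearmanr(cc_sub, cc_obj)                 # rank correlation between the two methods
print("subjective closeness:", np.round(cc_sub, 3))
print("objective  closeness:", np.round(cc_obj, 3))
print("Spearman rank correlation:", round(float(rho), 3))
```

Ranking the alternatives by each closeness vector and checking the Spearman correlation mirrors the paper's comparison of the two weighting schemes.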

The Optimal Partition of Initial Input Space for Fuzzy Neural System : Measure of Fuzziness (퍼지뉴럴 시스템을 위한 초기 입력공간분할의 최적화 : Measure of Fuzziness)

  • Baek, Deok-Soo;Park, In-Kue
    • Journal of the Institute of Electronics Engineers of Korea TE / v.39 no.3 / pp.97-104 / 2002
  • In this paper we describe a method that optimizes the partition of the input space for a fuzzy neural network by means of a measure of fuzziness, and covers the generation of fuzzy rules for each input subspace. The performance of the system is verified for various time intervals of the input. The method divides the input space into several fuzzy regions and, using the Shannon function and a fuzzy entropy function, assigns a degree to each rule generated from the given data for the partitioned subspaces, yielding an optimal knowledge base free of irrelevant rules. In this scheme, the basic idea of the fuzzy neural network is to realize the fuzzy rule base and the reasoning process with a neural network and to adapt the corresponding parameters of the fuzzy control rules by the steepest descent algorithm. Across the tested input intervals, the proposed inference procedure shows that the fast convergence of the root mean square error (RMSE) is due to the optimal partition of the input space.
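
As a rough illustration of the fuzziness measure involved, the sketch below evaluates a De Luca-Termini-style fuzzy entropy built from the Shannon function for a few candidate fuzzy regions; the triangular membership function and the sampled input are assumptions for illustration, not the paper's implementation.

```python
# Illustrative sketch: Shannon function and fuzzy entropy of a fuzzy region.
import numpy as np

def shannon_function(u):
    """S(u) = -u*log2(u) - (1-u)*log2(1-u), with S(0) = S(1) = 0."""
    u = np.clip(u, 1e-12, 1 - 1e-12)
    return -u * np.log2(u) - (1 - u) * np.log2(1 - u)

def fuzzy_entropy(memberships):
    """Average Shannon-function value over the membership degrees of one fuzzy region."""
    return float(np.mean(shannon_function(np.asarray(memberships))))

def tri(x, c, w):
    """Triangular membership function with centre c and half-width w (hypothetical)."""
    return np.maximum(0.0, 1.0 - np.abs(x - c) / w)

x = np.linspace(0.0, 1.0, 101)                  # sampled input variable
for w in (0.2, 0.4, 0.6):
    print(f"half-width {w}: fuzzy entropy = {fuzzy_entropy(tri(x, 0.5, w)):.3f}")
```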

Tonality Design for Sound Quality Evaluation in Printer (프린터 음질평가를 위한 순음도 설계)

  • Kim, Eui-Youl;Lee, Young-Jun;Lee, Sang-Kwon
    • Transactions of the Korean Society for Noise and Vibration Engineering / v.22 no.4 / pp.318-327 / 2012
  • The operating sound radiated from a laser printer includes tonal noise components caused by rotating mechanical parts such as gears, shafts, motors, and fans. The negative effects of these tonal components need to be considered when developing a sound quality index for the quantitative, psycho-acoustic evaluation of emotional satisfaction. However, a previous paper confirmed that the Aures tonality did not correlate sufficiently with the results of jury evaluation, and a sound quality index based on loudness, articulation index, and fluctuation strength has difficulty capturing the effect of the rotating mechanical parts on sound quality. In this paper, to solve this tonality evaluation problem, the calculation algorithm of the Aures tonality was investigated in detail to find the cause of the reduced correlation. A new tonality evaluation model was proposed by modifying and optimizing the masking effect, the loudness ratio, and the shape of the weighting curve in the basic Aures tonality algorithm, and it was applied to two groups of operating sounds to verify its usefulness. As a result, it is confirmed that the proposed tonality evaluation model shows sufficient correlation and is useful for expressing the tonalness of the operating sounds of laser printers. In a following paper, these results will be used as input data for modeling the sound quality index with a classification algorithm.

Statistical Consideration on the Resources of the Countries in the World (세계 각국의 자원에 대한 통계적 고찰)

  • Huh, Moon-Yul;Choi, Byong-Su;Lee, Seung-Chun
    • The Korean Journal of Applied Statistics / v.22 no.1 / pp.41-57 / 2009
  • The paper investigates the resources of 232 countries based on 39 resources measured for these countries. The data used in this work come from various sources such as UN, CIA, World Bank, and OECD reports and the home pages of each country. The purpose of the study is to evaluate which resources are most influential for the wealth of a country, the well-being of the country, or the status of the country's development. For this, data visualization methods are applied. Data visualization techniques, although powerful for exploratory purposes, depend on the user's expertise, and their interpretation also depends on the user. As an objective method of investigation, mutual information based on Shannon's entropy theory is applied here. All the statistical methods employed in this paper are processed with DAVIS (Huh and Song, 2002).
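
A minimal sketch of the objective measure mentioned above, mutual information derived from Shannon entropies of discretised variables; the country-level figures are synthetic stand-ins and the code is not the DAVIS implementation.

```python
# Illustrative sketch: mutual information from Shannon entropies of binned variables.
import numpy as np

def entropy(p):
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def mutual_information(x, y, bins=5):
    """I(X;Y) = H(X) + H(Y) - H(X,Y), estimated from a 2-D histogram of the samples."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()
    px, py = pxy.sum(axis=1), pxy.sum(axis=0)
    return entropy(px) + entropy(py) - entropy(pxy.ravel())

rng = np.random.default_rng(0)
gdp = rng.lognormal(mean=9.0, sigma=1.0, size=200)            # hypothetical country data
energy = gdp * rng.lognormal(mean=0.0, sigma=0.3, size=200)   # related indicator
noise = rng.normal(size=200)                                  # unrelated indicator
print("MI(gdp, energy):", round(mutual_information(gdp, energy), 3))
print("MI(gdp, noise) :", round(mutual_information(gdp, noise), 3))
```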

A Study on Detection of Malicious Android Apps based on LSTM and Information Gain (LSTM 및 정보이득 기반의 악성 안드로이드 앱 탐지연구)

  • Ahn, Yulim;Hong, Seungah;Kim, Jiyeon;Choi, Eunjung
    • Journal of Korea Multimedia Society / v.23 no.5 / pp.641-649 / 2020
  • As the usage of mobile devices increases rapidly, malicious mobile apps (applications) that target mobile users are also increasing. It is challenging to detect these malicious apps using traditional malware detection techniques due to the intelligence of today's attack mechanisms. Deep learning (DL) is an alternative to traditional signature- and rule-based anomaly detection techniques and has therefore been actively used in numerous recent studies on malware detection. In order to develop DL-based defense mechanisms against intelligent malicious apps, feeding recent datasets into DL models is important. In this paper, we develop a DL-based model for detecting intelligent malicious apps using KU-CISC 2018-Android, the most up-to-date dataset consisting of benign and malicious Android apps, which has hardly been addressed in other studies so far. We extract opcode sequences from the Android apps and preprocess them using an N-gram model. We then feed the preprocessed data into an LSTM and apply the concept of Information Gain to improve the performance of malicious app detection. Furthermore, we evaluate our model under numerous scenarios in order to verify its design and performance.
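
The Information Gain step can be illustrated in isolation. The sketch below scores binary n-gram presence features against a benign/malicious label; the features and labels are synthetic, not the KU-CISC 2018-Android data, and the LSTM stage is omitted.

```python
# Illustrative sketch: information gain of binary features with respect to a class label.
import numpy as np

def H(labels):
    """Shannon entropy of a discrete label vector, in bits."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

def information_gain(feature, labels):
    """IG = H(labels) - sum_v P(feature = v) * H(labels | feature = v)."""
    conditional = 0.0
    for v in np.unique(feature):
        mask = feature == v
        conditional += mask.mean() * H(labels[mask])
    return H(labels) - conditional

rng = np.random.default_rng(1)
y = rng.integers(0, 2, size=500)                              # 0 = benign, 1 = malicious
informative = (y ^ (rng.random(500) < 0.1)).astype(int)       # correlates with the label
random_feat = rng.integers(0, 2, size=500)                    # unrelated n-gram feature
print("IG(informative n-gram):", round(information_gain(informative, y), 3))
print("IG(random n-gram)     :", round(information_gain(random_feat, y), 3))
# In the paper's pipeline, highly ranked n-gram features would then be fed to the LSTM.
```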

Prioritizing the locations for hydrogen production using a hybrid wind-solar system: A case study

  • Mostafaeipour, Ali;Jooyandeh, Erfan
    • Advances in Energy Research / v.5 no.2 / pp.107-128 / 2017
  • Energy is a major component of almost all economic, production, and service activities, and rapid population growth, urbanization, and industrialization have led to ever-growing demand for energy. Limited energy resources and the increasingly evident environmental effects of fossil fuel consumption have led to growing awareness of the importance of a larger share of renewable energy sources in countries' energy portfolios. Renewable hydrogen production is a convenient way to store unstable renewable energy sources such as wind and solar energy for use at another place or time. In this study, the suitability of 25 cities located in Iran's western region for renewable hydrogen production is evaluated with multi-criteria decision-making techniques including TOPSIS, VIKOR, ELECTRE, SAW, and fuzzy TOPSIS, as well as hybrid ranking techniques. The choice of a suitable location for centralized renewable hydrogen production involves various technical, economic, social, geographic, and political criteria. This paper describes the criteria affecting the hydrogen production potential in the study region. The determined criteria are weighted with the Shannon entropy method, and the Angstrom model and a wind power model are used to estimate the solar and wind energy production potential, respectively, in each city for each month. Assuming the use of a proton exchange membrane electrolyzer for hydrogen production, the renewable hydrogen production potential of each city is then estimated from the obtained wind and solar energy generation potentials. The rankings obtained with the MCDM methods show that Kermanshah is the best option for renewable hydrogen production, and evaluation of the renewable hydrogen production capacities shows that Gilangharb has the highest capacity among the studied cities.
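
The Shannon entropy weighting named in the abstract can be sketched on its own: criteria whose values are spread unevenly across the candidate cities receive larger objective weights. The criterion matrix below is invented for illustration, and the subsequent MCDM and hybrid ranking stages are omitted.

```python
# Illustrative sketch of the Shannon entropy weight method for MCDM criteria.
import numpy as np

def entropy_weights(X):
    """Objective weights for an alternatives-by-criteria matrix of positive values."""
    P = X / X.sum(axis=0)                            # share of each city per criterion
    m = X.shape[0]
    E = -(P * np.log(P)).sum(axis=0) / np.log(m)     # normalised Shannon entropy per criterion
    d = 1.0 - E                                      # degree of diversification
    return d / d.sum()                               # weights sum to one

# rows: candidate cities; columns might be mean wind speed, solar irradiation, grid distance
X = np.array([[6.1, 5.2, 12.0],
              [5.4, 5.6,  8.0],
              [4.8, 5.5, 20.0],
              [6.5, 5.1, 15.0]])
print("criterion weights:", np.round(entropy_weights(X), 3))
```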

Design of Decision Error Model for Reliability of Sound Quality Analysis and Its Experimental Verification (프린터 음질평가의 신뢰성을 위한 결정오차 모델설계 및 실험적 검증)

  • Kim, Eui-Youl;Lee, Young-Jun;Lee, Sang-Kwon
    • Transactions of the Korean Society for Noise and Vibration Engineering / v.22 no.7 / pp.605-618 / 2012
  • In this study, the possibility of decision error is investigated to identify and improve the reliability of participants in the process of conducting sound quality analysis for laser printers. So far, there has been no way to identify and express the error possibility of an individual participant quantitatively. Thus, a decision error model based on the expectation value between the perceived sounds is proposed. Through experimental verification on laser printers, it was found that the possibility of decision error depends on the normalized difference between the perceived sounds: it is inversely proportional to that difference. When the normalized difference becomes small, the uncertainty between decisions increases, and it becomes difficult to obtain a proper result in the jury evaluation of laser printers. For this reason, the proposed decision error model is added as a step preceding the correlation verification. In contrast to the conventional process, which uses only the correlation-based method, the reliability of each participant is verified first, and then the correlation with the mean response of the participants is verified. It was found that participants who were regarded as having unusual preferences were actually identified as having a reliability problem. Based on the results of this study, the proposed decision error model will be helpful for identifying and improving the reliability of participants in subsequent sound quality analysis studies.

Subsequent application of self-organizing map and hidden Markov models infer community states of stream benthic macroinvertebrates

  • Kim, Dong-Hwan;Nguyen, Tuyen Van;Heo, Muyoung;Chon, Tae-Soo
    • Journal of Ecology and Environment / v.38 no.1 / pp.95-107 / 2015
  • Because an ecological community consists of diverse species that vary nonlinearly with environmental variability, its dynamics are complex and difficult to analyze. To investigate temporal variations of the benthic macroinvertebrate community, we used community data collected from July 2006 to July 2013 at a sampling site in Baenae Stream near Busan, Korea, a clean stream with minimal pollution. First, we used a self-organizing map (SOM) to heuristically derive the states that characterize the biotic condition of the benthic macroinvertebrate communities in the form of time series data. Next, we applied a hidden Markov model (HMM) to fine-tune the states objectively and to obtain the transition probabilities between the states and the emission probabilities that link the states to observable events such as the number of species, the diversity measured by Shannon entropy, and the biological water quality index (BMWP). While the number of species clearly indicated the state of the community, the diversity reflected the state changes after HMM training, along with seasonal variations in a cyclic manner. The BMWP clearly characterized the events corresponding to the different states on the basis of the emission probabilities. Environmental factors such as temperature and precipitation also showed seasonal and cyclic changes according to the HMM. Although the HMM alone could not guarantee the convergence of the training or the precision of the states derived from the field data in this study, deriving the states with the SOM and then fine-tuning them with the HMM elucidated the community states well and could serve as an alternative reference system to reveal the ecological structures in stream communities.
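
Of the observables listed above, the Shannon entropy (diversity) of the community is straightforward to compute; the sketch below shows the standard formula with hypothetical abundance counts, not the authors' data or code.

```python
# Illustrative sketch: Shannon diversity of a benthic macroinvertebrate sample.
import numpy as np

def shannon_diversity(counts):
    """H' = -sum p_i * ln(p_i) over taxa with non-zero abundance."""
    counts = np.asarray(counts, dtype=float)
    p = counts[counts > 0] / counts.sum()
    return float(-np.sum(p * np.log(p)))

summer_sample = [120, 30, 8, 3, 1]          # individuals per taxon (hypothetical)
winter_sample = [40, 38, 35, 30, 27]        # more even community, higher diversity
print("summer H':", round(shannon_diversity(summer_sample), 3))
print("winter H':", round(shannon_diversity(winter_sample), 3))
```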

Normalization and Valuation of Research Evaluation Indicators in Different Scientific Fields

  • Chakoli, Abdolreza Noroozi;Ghazavi, Roghayeh
    • Journal of Information Science Theory and Practice / v.4 no.1 / pp.21-29 / 2016
  • Given the differences in research performance across scientific fields, this study aims to weight and valuate the current indicators used for the evaluation of scientific productions (publications), in order to adjust these indicators relative to each other and make a more precise evaluation of scientific productions possible. This is a scientometrics study using documentary, evaluative, and survey techniques. The statistical population consisted of 106 top Iranian researchers, scientists, and scientific and research managers, whose research résumé information was gathered and analyzed based on the research questions. In order to compare values, the data gathered on the research production performance of the population were weighted using the Shannon entropy method. The importance weights of each type of scientific production according to expert opinions (extracted from other works) were also analyzed, and after adjustment the final weight of each type of scientific production was determined. A pairwise matrix was used to determine the ratios. According to the results, patents (0.142) in the engineering sciences, international articles (0.074) in the sciences, books (0.174) in the humanities and social sciences, and international articles (0.111) in the medical sciences had the highest weights compared with other publication formats. By dividing these weights, the value of each scientific production relative to other productions in the same field and to productions in other fields was calculated. Validation of the results in the studied population showed very high credibility for all investigated indicators in all four fields. Using these values and the normalized ratios of the publication indicators, precise and adjusted results can be achieved, making it feasible to use them in realistic policy making.

A Study on the Multiresolutional Coding Based on Spline Wavelet Transform (스플라인 웨이브렛 변환을 이용한 영상의 다해상도 부호화에 관한 연구)

  • 김인겸;정준용;유충일;이광기;박규태
    • The Journal of Korean Institute of Communications and Information Sciences / v.19 no.12 / pp.2313-2327 / 1994
  • As the communication environment evolves, there is an increasing need for multiresolution image coding. To meet this need, an entropy-constrained vector quantizer (ECVQ) for coding image pyramids obtained with the spline wavelet transform is introduced in this paper. The proposed image compression scheme takes psychovisual features into account in both the space and frequency domains and involves two steps. First, the spline wavelet transform is used to obtain a set of biorthogonal subclasses of images; the original image is decomposed at different scales using a pyramidal algorithm architecture. The decomposition is carried out along the vertical and horizontal directions and keeps constant the number of pixels required to represent the image. Second, following Shannon's rate-distortion theory, the wavelet coefficients are vector quantized using a multiresolution ECVQ codebook. Simulation results showed that, on the LENA image, the proposed method improves quality by about 2.0 dB at 0.5 bpp and by about 0.5 dB at 1.0 bpp compared with an ECVQ based on other wavelets, while reducing blocking effects and edge degradation.
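
The entropy-constrained assignment rule at the heart of an ECVQ can be sketched as follows; the codebook, codeword probabilities, and coefficient vectors are synthetic, and the Lagrangian form (distortion plus λ times the ideal code length) is the standard textbook formulation rather than the paper's exact design.

```python
# Illustrative sketch: entropy-constrained codeword assignment for vector quantization.
import numpy as np

def ecvq_assign(vectors, codebook, code_probs, lam):
    """Index minimising ||x - c_i||^2 + lam * (-log2 p_i) for each input vector."""
    dist = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(axis=2)
    rate = -np.log2(code_probs)                  # ideal codeword lengths in bits
    return np.argmin(dist + lam * rate, axis=1)

rng = np.random.default_rng(2)
vectors = rng.normal(size=(8, 2))                # stand-in wavelet coefficient vectors
codebook = np.array([[0.0, 0.0], [1.5, 1.5], [-1.5, 1.5], [1.5, -1.5]])
code_probs = np.array([0.7, 0.1, 0.1, 0.1])      # frequent codewords are cheap to send
for lam in (0.0, 2.0):
    print(f"lambda = {lam}:", ecvq_assign(vectors, codebook, code_probs, lam))
```

Increasing λ biases assignments toward high-probability (low-rate) codewords, trading distortion for a lower entropy-coded bit rate.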
