• Title/Summary/Keyword: Boltzmann Machine

Optimal Graph Partitioning by Boltzmann Machine (Boltzmann Machine을 이용한 그래프의 최적분할)

  • Lee, Jong-Hee; Kim, Jin-Ho; Park, Heung-Moon
    • Journal of the Korean Institute of Telematics and Electronics, v.27 no.7, pp.1025-1032, 1990
  • We propose a neural-network energy function for optimal graph partitioning and an optimization method for it using a Boltzmann Machine. We compose a Boltzmann Machine from the proposed energy function, and the simulation results show that an optimal solution can be obtained with the energy-function parameters A=50, B=5, C=14, and D=10 and the Boltzmann Machine parameters T0=80 and an annealing-schedule value of 0.07 for a 6-node, 3-partition problem. These results show that the proposed energy function and optimization parameters are feasible for optimal graph partitioning.

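The parameter values quoted in the abstract (penalty weights A, B, C, D, an initial temperature T0, and an annealing value of 0.07) match the usual penalty-plus-cut-cost formulation of graph partitioning on a Boltzmann/Hopfield-style network. The sketch below illustrates that general recipe; the specific energy terms, the cooling schedule, and the example graph are assumptions for illustration, not the authors' exact formulation (the fourth weight D is not reproduced).

```python
import math
import random

# Minimal sketch: k-way graph partitioning by simulated annealing over a
# Boltzmann-machine-style binary state.  The penalty terms and weights A, B, C
# are illustrative assumptions, not the paper's formulation.

def energy(x, edges, n, k, A=50.0, B=5.0, C=14.0):
    """x[i][p] == 1 iff node i is assigned to partition p."""
    E = 0.0
    for i in range(n):                     # each node in exactly one partition
        E += A * (sum(x[i]) - 1) ** 2
    for p in range(k):                     # partitions should be balanced
        E += B * (sum(x[i][p] for i in range(n)) - n / k) ** 2
    for i, j in edges:                     # penalise cut edges
        E += C * (1 - sum(x[i][p] * x[j][p] for p in range(k)))
    return E

def anneal(edges, n, k, T0=80.0, cooling=0.95, steps=20000, seed=0):
    rng = random.Random(seed)
    x = [[0] * k for _ in range(n)]
    for i in range(n):
        x[i][rng.randrange(k)] = 1         # random initial one-hot assignment
    T, E = T0, energy(x, edges, n, k)
    for _ in range(steps):
        i, p = rng.randrange(n), rng.randrange(k)
        x[i][p] ^= 1                       # propose flipping one unit
        E_new = energy(x, edges, n, k)
        if E_new <= E or rng.random() < math.exp((E - E_new) / T):
            E = E_new                      # accept the flip
        else:
            x[i][p] ^= 1                   # reject and undo it
        T = max(T * cooling, 1e-3)         # geometric cooling schedule
    return x, E

# Example: a 6-node ring graph partitioned into 3 parts.
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5), (5, 0)]
assignment, final_energy = anneal(edges, n=6, k=3)
print(assignment, final_energy)
```

The Metropolis acceptance rule accepts uphill moves with probability exp(-ΔE/T), which is what lets a Boltzmann machine escape local minima of the energy during annealing.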

Boltzmann machine using Stochastic Computation (확률 연산을 이용한 볼츠만 머신)

  • 이일완;채수익
    • Journal of the Korean Institute of Telematics and Electronics A, v.31A no.6, pp.159-168, 1994
  • Stochastic computation is adopted to reduce the silicon area of the multipliers when implementing neural networks in VLSI. In addition to this advantage, stochastic computation has inherent random errors, which are exactly what is required for implementing a Boltzmann machine: the random noise is useful for the simulated annealing employed to reach the global minimum. In this paper, we propose a method to implement the Boltzmann machine with stochastic computation and discuss the addition problem in stochastic computation and its simulated annealing in detail. According to this analysis, a Boltzmann machine using stochastic computation is suitable for pattern recognition/completion problems. We have verified these results through simulations of XOR, full-adder, and digit-recognition problems, which are typical pattern recognition/completion problems.

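The abstract relies on the standard unipolar stochastic-computation encoding, in which a probability is represented by a random bit stream and multiplication reduces to a bitwise AND of two independent streams; the finite stream length supplies the random error that the paper reuses as annealing noise. A minimal sketch, with stream length and helper names as assumptions:

```python
import random

# Unipolar stochastic computation: a value p in [0, 1] is encoded as a random
# bit stream whose fraction of 1s is p, so multiplying two values reduces to a
# bitwise AND of two independent streams.

def to_stream(p, length, rng):
    return [1 if rng.random() < p else 0 for _ in range(length)]

def from_stream(bits):
    return sum(bits) / len(bits)

def stochastic_multiply(p, q, length=1024, seed=0):
    rng = random.Random(seed)
    a = to_stream(p, length, rng)          # independent streams for p and q
    b = to_stream(q, length, rng)
    return from_stream([u & v for u, v in zip(a, b)])

print(stochastic_multiply(0.6, 0.5))       # close to 0.30, with stream-length noise
```

Addition does not map onto a single gate as cleanly; it is usually realized as scaled addition through a multiplexer, which is presumably why the abstract treats the addition problem separately.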

Design of Fuzzy k-Nearest Neighbors Classifiers based on Feature Extraction by using Stacked Autoencoder (Stacked Autoencoder를 이용한 특징 추출 기반 Fuzzy k-Nearest Neighbors 패턴 분류기 설계)

  • Rho, Suck-Bum; Oh, Sung-Kwun
    • The Transactions of The Korean Institute of Electrical Engineers, v.64 no.1, pp.113-120, 2015
  • In this paper, we propose a feature extraction method using stacked autoencoders built from restricted Boltzmann machines. The stacked autoencoder is a kind of deep network. Restricted Boltzmann machines (RBMs) are probabilistic graphical models that can be interpreted as stochastic neural networks. In pattern classification, feature extraction is a key issue, and we use the stacked autoencoder networks to extract new features that improve classification performance. After feature extraction, the fuzzy k-nearest neighbors algorithm is used as the classifier for the newly extracted data set. To evaluate the classification ability of the proposed pattern classifier, we conduct experiments on several machine learning data sets.
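
A hedged sketch of the overall pipeline the abstract describes, namely unsupervised feature extraction by stacked RBMs followed by a fuzzy k-nearest-neighbors vote. The use of scikit-learn's BernoulliRBM as a stand-in for the stacked-autoencoder extractor, the layer sizes, k, the fuzzifier m, and the digits data set are all illustrative assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline

X, y = load_digits(return_X_y=True)
X = X / 16.0                                   # scale to [0, 1] for Bernoulli RBMs
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Two stacked RBMs stand in for the stacked-autoencoder feature extractor.
extractor = Pipeline([
    ("rbm1", BernoulliRBM(n_components=128, learning_rate=0.05, n_iter=20, random_state=0)),
    ("rbm2", BernoulliRBM(n_components=64, learning_rate=0.05, n_iter=20, random_state=0)),
])
Z_tr = extractor.fit_transform(X_tr)
Z_te = extractor.transform(X_te)

def fuzzy_knn_predict(Z_train, y_train, z, k=5, m=2.0):
    # Fuzzy k-NN: the k nearest neighbours vote with weights 1 / d^(2/(m-1)).
    d = np.linalg.norm(Z_train - z, axis=1)
    idx = np.argsort(d)[:k]
    w = 1.0 / (d[idx] ** (2.0 / (m - 1.0)) + 1e-12)
    classes = np.unique(y_train)
    scores = [w[y_train[idx] == c].sum() for c in classes]
    return classes[int(np.argmax(scores))]

pred = np.array([fuzzy_knn_predict(Z_tr, y_tr, z) for z in Z_te])
print("accuracy:", (pred == y_te).mean())
```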

Parallel Implementation of One Boltzmann Machine's Algorithm

  • Zhu, H.; Ren, F.; Sun, N.; Eguchi, K.; Tabata, T.
    • Proceedings of the IEEK Conference, 2002.07a, pp.265-268, 2002
  • Parallel computation is a very interesting topic. This paper describes how we successfully apply it to the Boltzmann machine using the Parallel-Transit-Evaluation Method.

Deterministic Boltzmann Machine Based on Nonmonotonic Neuron Model (비단조 뉴런 모델을 이용한 결정론적 볼츠만 머신)

  • 강형원;박철영
    • Proceedings of the IEEK Conference, 2003.07d, pp.1553-1556, 2003
  • In this paper, we evaluate the learning ability of a non-monotonic DBM (Deterministic Boltzmann Machine) network through numerical simulations. The simulation results show that the proposed system has higher performance than the monotonic DBM network model. The non-monotonic DBM network also shows the interesting behavior that the network itself adjusts the number of hidden-layer neurons. A DBM network can be realized with fewer components than other neural network models. These results support the use of non-monotonic neurons in the large-scale integration of neuro-chips.

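The non-monotonic DBM entries in this list (this one and the related conference and journal papers below) share the same idea: replacing the usual monotonic output function of a deterministic, mean-field Boltzmann machine with a non-monotonic one. The abstracts do not specify the activation used, so the sketch below is only a hedged illustration; the function nonmonotonic() and all weights are assumptions, not the authors' model.

```python
import numpy as np

# Sketch of a deterministic (mean-field) Boltzmann machine update with an
# assumed non-monotonic neuron, contrasted with the conventional tanh unit.

def monotonic(u, T=1.0):
    return np.tanh(u / T)                      # conventional DBM output function

def nonmonotonic(u, T=1.0, c=2.0):
    # tanh response damped for large |u|, so the curve rises and then falls back.
    return np.tanh(u / T) * np.exp(-(u / c) ** 2)

def mean_field_step(s, W, b, T=1.0, act=nonmonotonic):
    # Synchronous mean-field update: each unit outputs the deterministic
    # expectation of its stochastic counterpart under the current local field.
    return act(W @ s + b, T)

rng = np.random.default_rng(0)
n = 8
W = rng.normal(0.0, 0.5, (n, n))
W = (W + W.T) / 2.0                            # symmetric weights
np.fill_diagonal(W, 0.0)                       # no self-connections
b = rng.normal(0.0, 0.1, n)

for act in (monotonic, nonmonotonic):
    s = rng.uniform(-1.0, 1.0, n)
    for _ in range(50):
        s = mean_field_step(s, W, b, act=act)
    print(act.__name__, np.round(s, 3))
```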

Learning Ability of Deterministic Boltzmann Machine with Non-Monotonic Neurons (비단조뉴런 DBM 네트워크의 학습 능력에 관한 연구)

  • 박철영;이도훈
    • Proceedings of the Korean Institute of Intelligent Systems Conference, 2001.12a, pp.275-278, 2001
  • In this paper, we evaluate the learning ability of a non-monotonic DBM (Deterministic Boltzmann Machine) network through numerical simulations. The simulation results show that the proposed system has higher performance than the monotonic DBM network model. The non-monotonic DBM network also shows the interesting behavior that the network itself adjusts the number of hidden-layer neurons. A DBM network can be realized with fewer components than other neural network models. These results support the use of non-monotonic neurons in the large-scale integration of neuro-chips.

Performance Improvement of Deterministic Boltzmann Machine Based on Nonmonotonic Neuron (비단조 뉴런에 의한 결정론적 볼츠만머신의 성능 개선)

  • 강형원;박철영
    • Proceedings of the Korea Society for Industrial Systems Conference, 2003.05a, pp.52-56, 2003
  • In this paper, we evaluate the learning ability of a non-monotonic DBM (Deterministic Boltzmann Machine) network through numerical simulations. The simulation results show that the proposed system has higher performance than the monotonic DBM network model. The non-monotonic DBM network also shows the interesting behavior that the network itself adjusts the number of hidden-layer neurons. A DBM network can be realized with fewer components than other neural network models. These results support the use of non-monotonic neurons in the large-scale integration of neuro-chips.

Mild Cognitive Impairment Prediction Model of Elderly in Korea Using Restricted Boltzmann Machine (제한된 볼츠만 기계학습 알고리즘을 이용한 우리나라 지역사회 노인의 경도인지장애 예측모형)

  • Byeon, Haewon
    • Journal of Convergence for Information Technology, v.9 no.8, pp.248-253, 2019
  • Early diagnosis of mild cognitive impairment (MCI) can reduce the incidence of dementia. This study developed an MCI prediction model for the elderly in Korea. The subjects were 3,240 elderly people (1,502 men, 1,738 women) aged 65 and over who participated in the Korean Longitudinal Survey of Aging (KLoSA) in 2012. The outcome variable was defined as MCI prevalence. The explanatory variables were age, marital status, education level, income level, smoking, drinking, regular exercise more than once a week, average participation time in social activities, subjective health, hypertension, and diabetes. The prediction model was developed using a Restricted Boltzmann Machine (RBM) neural network. As a result, age, sex, education level, subjective health, marital status, income level, smoking, drinking, and regular exercise were significant predictors in the RBM-based MCI prediction model for elderly people in Korea. Based on these results, a customized dementia prevention program considering the characteristics of the high-risk group for MCI needs to be developed.

Learning Ability of Deterministic Boltzmann Machine with Non-Monotonic Neurons in Hidden Layer (은닉층에 비단조 뉴런을 갖는 결정론적 볼츠만 머신의 학습능력에 관한 연구)

  • 박철영
    • Journal of the Korean Institute of Intelligent Systems, v.11 no.6, pp.505-509, 2001
  • In this paper, we evaluate the learning ability of a non-monotonic DBM (Deterministic Boltzmann Machine) network through numerical simulations. The simulation results show that the proposed system has higher performance than the monotonic DBM network model. The non-monotonic DBM network also shows the interesting behavior that the network itself adjusts the number of hidden-layer neurons. A DBM network can be realized with fewer components than other neural network models. These results support the use of non-monotonic neurons in the large-scale integration of neuro-chips.

Malwares Attack Detection Using Ensemble Deep Restricted Boltzmann Machine

  • K. Janani; R. Gunasundari
    • International Journal of Computer Science & Network Security, v.24 no.5, pp.64-72, 2024
  • In recent times, cyber attackers can use Artificial Intelligence (AI) to boost the sophistication and scope of their attacks. On the defense side, AI is used to enhance defense plans and to boost the robustness, flexibility, and efficiency of defense systems by adapting to environmental changes and reducing impacts. With increasing developments in information and communication technologies, various exploits emerge as threats to cyber security, and these exploitations change rapidly. Cyber criminals use new, sophisticated tactics to boost their attack speed and scale. Consequently, there is a need for more flexible, adaptable, and robust cyber defense systems that can identify a wide range of threats in real time. In recent years, the adoption of AI approaches has increased and plays a vital role in the detection and prevention of cyber threats. In this paper, an Ensemble Deep Restricted Boltzmann Machine (EDRBM) is developed for the classification of cybersecurity threats in a large-scale network environment. The EDRBM acts as a classification model that identifies malicious flowsets in the large-scale network. A simulation is conducted to test the efficacy of the proposed EDRBM under various malware attacks. The simulation results show that the proposed method achieves a higher classification rate for malicious flowsets than other methods.
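
The abstract does not detail the EDRBM architecture; the sketch below only illustrates the general pattern of several RBM-based classifiers combined by majority vote over flow features. The member count, hyperparameters, and the synthetic placeholder data are assumptions, not the paper's setup.

```python
import numpy as np
from sklearn.ensemble import VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.neural_network import BernoulliRBM
from sklearn.pipeline import Pipeline

def rbm_member(n_components, seed):
    # One ensemble member: RBM feature extraction followed by a linear classifier.
    return Pipeline([
        ("rbm", BernoulliRBM(n_components=n_components, learning_rate=0.05,
                             n_iter=20, random_state=seed)),
        ("clf", LogisticRegression(max_iter=1000)),
    ])

# Placeholder flow features scaled to [0, 1] and placeholder labels
# (1 = malicious flowset, 0 = benign); real data would come from network traces.
rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, (500, 20))
y = (X[:, 0] + X[:, 1] > 1.0).astype(int)

ensemble = VotingClassifier(
    estimators=[(f"m{i}", rbm_member(32 + 16 * i, seed=i)) for i in range(3)],
    voting="hard",                         # majority vote across RBM members
)
ensemble.fit(X, y)
print("train accuracy:", ensemble.score(X, y))
```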