• Title/Summary/Keyword: Feedforward Neural Network (앞먹임 신경망)

Search Results: 2

Neural Predictive Coding for Text Compression Using GPGPU (GPGPU를 활용한 인공신경망 예측기반 텍스트 압축기법)

  • Kim, Jaeju; Han, Hwansoo
    • KIISE Transactions on Computing Practices, v.22 no.3, pp.127-132, 2016
  • Several methods have been proposed in the past to apply artificial neural networks to text compression, but both the networks and the compression targets were limited in size by the hardware available at the time. Although CPUs have become faster, modern GPUs now offer calculation capability an order of magnitude beyond that of CPUs, making it possible to train larger and more complex neural networks in a shorter time. This paper proposes a method that transforms the distribution of the original data with a probabilistic neural predictor. Experiments were performed on a feedforward neural network and on a recurrent neural network with gated recurrent units; the recurrent model outperformed the feedforward network in both compression rate and prediction accuracy.
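The core idea of predictive text compression can be sketched briefly: a model estimates the probability of the next symbol given its context, and an entropy coder then spends about -log2 P(symbol) bits on each symbol, so better prediction directly means better compression. The following is a minimal NumPy sketch under assumed details (a toy corpus, a one-layer softmax predictor over the previous character, plain gradient descent), not the paper's actual model or training setup; it measures the ideal coding cost in bits per symbol before and after training.

```python
import numpy as np

# Toy illustration (assumed setup): a softmax predictor estimates
# P(next char | current char); the ideal entropy-coder cost is
# -log2 P(actual next char), averaged over the text.

rng = np.random.default_rng(0)
text = "abababababababababab"                 # toy corpus
chars = sorted(set(text))
idx = {c: i for i, c in enumerate(chars)}
V = len(chars)

X = np.array([idx[c] for c in text[:-1]])     # context: previous char
Y = np.array([idx[c] for c in text[1:]])      # target: next char

W = rng.normal(scale=0.1, size=(V, V))        # logits[context, next]

def bits_per_symbol(W):
    logits = W[X]
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    # ideal arithmetic-coding cost for the true next characters
    return -np.log2(p[np.arange(len(Y)), Y]).mean()

before = bits_per_symbol(W)
for _ in range(200):                          # plain gradient descent
    logits = W[X]
    p = np.exp(logits - logits.max(axis=1, keepdims=True))
    p /= p.sum(axis=1, keepdims=True)
    grad = p
    grad[np.arange(len(Y)), Y] -= 1.0         # softmax cross-entropy grad
    G = np.zeros_like(W)
    np.add.at(G, X, grad)                     # accumulate per context
    W -= 0.5 * G / len(Y)
after = bits_per_symbol(W)

print(before, after)                          # coding cost drops as the predictor learns
```

As the predictor learns the (here deterministic) alternating pattern, the bits-per-symbol cost falls toward zero, which is the mechanism linking prediction accuracy to compression rate in the abstract above.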

Analyzing Performance and Dynamics of Echo State Networks Given Various Structures of Hidden Neuron Connections (Echo State Network 모델의 은닉 뉴런 간 연결구조에 따른 성능과 동역학적 특성 분석)

  • Yoon, Sangwoong; Zhang, Byoung-Tak
    • KIISE Transactions on Computing Practices, v.21 no.4, pp.338-342, 2015
  • A Recurrent Neural Network (RNN), a machine learning model that can handle time-series data, can take on more varied structures than a feedforward neural network, since an RNN allows hidden-to-hidden connections. This research focuses on the network structure among hidden neurons and examines the information processing capability of RNNs. The time-series learning potential and dynamics of RNNs are investigated on several well-established network structure models. The hidden-neuron network structure is found to have a significant impact on model performance, and the performance variations are generally correlated with the criticality of the network dynamics. In particular, the Preferential Attachment network model showed interesting behavior. These findings provide clues for improving the performance of RNNs.
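An Echo State Network is a recurrent network in which the hidden-to-hidden weight matrix (the "reservoir") is fixed and random, and only a linear readout is trained; the structure of that fixed matrix is exactly what the paper above varies. Below is a minimal NumPy sketch under assumed details (a dense Gaussian reservoir, a sine-wave one-step-prediction task, a ridge-regression readout), not the paper's experimental setup.

```python
import numpy as np

# Minimal Echo State Network sketch (assumed setup): fixed random
# reservoir, trained linear readout, one-step-ahead prediction.

rng = np.random.default_rng(1)
N = 100                                        # reservoir size
W_in = rng.uniform(-0.5, 0.5, size=N)          # input weights (fixed)
W = rng.normal(size=(N, N))                    # hidden-to-hidden weights (fixed)
W *= 0.9 / max(abs(np.linalg.eigvals(W)))      # spectral radius < 1 (echo state property)

t = np.arange(600)
u = np.sin(0.2 * t)                            # input signal
target = u[1:]                                 # predict the next value

# Drive the reservoir and collect its states.
x = np.zeros(N)
states = []
for v in u[:-1]:
    x = np.tanh(W_in * v + W @ x)
    states.append(x.copy())
X = np.array(states)[100:]                     # drop the warm-up transient
y = target[100:]

# Ridge-regression readout: w = (X^T X + lam*I)^{-1} X^T y
lam = 1e-6
w = np.linalg.solve(X.T @ X + lam * np.eye(N), X.T @ y)

pred = X @ w
mse = np.mean((pred - y) ** 2)
print(mse)                                     # small error on this easy task
```

Because only the readout is trained, experiments like those in the abstract can swap in different reservoir connection structures (e.g. random, small-world, preferential attachment) for `W` and compare performance with everything else held fixed.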