• Title/Abstract/Keyword: hebbian learning

Search results: 23

Long-term Synaptic Plasticity: Circuit Perturbation and Stabilization

  • Park, Joo Min;Jung, Sung-Cherl;Eun, Su-Yong
    • The Korean Journal of Physiology and Pharmacology
    • /
    • Vol. 18, No. 6
    • /
    • pp.457-460
    • /
    • 2014
  • At central synapses, activity-dependent synaptic plasticity plays a crucial role in information processing, storage, learning, and memory under both physiological and pathological conditions. One widely accepted model of learning and information processing in the brain is Hebbian plasticity: long-term potentiation (LTP) and long-term depression (LTD). LTP and LTD are, respectively, activity-dependent enhancement and reduction of synaptic efficacy, and both are rapid, synapse-specific processes. A number of recent studies have focused on the critical importance of another, distinct form of synaptic plasticity, non-Hebbian plasticity, which dynamically adjusts synaptic strength to maintain stability. This process may be very slow and may occur cell-wide. Putting these together, this mini-review draws the important conceptual distinction between Hebbian and non-Hebbian plasticity.
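
The conceptual contrast the review draws can be illustrated with a toy single-neuron model: a fast, synapse-specific Hebbian update paired with a slow, cell-wide homeostatic (non-Hebbian) scaling step. This is only a sketch of the idea, not anything from the paper; the learning rates, the Poisson input statistics, and the target activity level are all assumed values.

```python
import numpy as np

rng = np.random.default_rng(0)

n_pre = 20
w = rng.uniform(0.1, 0.5, n_pre)   # synaptic weights onto one postsynaptic neuron
eta_hebb = 0.0002                  # fast, synapse-specific Hebbian rate (assumed)
target_rate = 2.0                  # desired average postsynaptic activity (assumed)
post_history = []

for step in range(2000):
    pre = rng.poisson(0.5, n_pre).astype(float)   # presynaptic activity
    post = float(w @ pre)                         # postsynaptic response
    post_history.append(post)

    # Hebbian plasticity (LTP/LTD-like): fast and synapse-specific, it
    # strengthens exactly those inputs that co-occur with postsynaptic activity.
    w += eta_hebb * pre * post

    # Non-Hebbian homeostatic plasticity: slower and cell-wide, it multiplicatively
    # rescales *all* of the neuron's weights so that average activity drifts back
    # toward the target, stabilizing the runaway growth the Hebbian term causes.
    if (step + 1) % 50 == 0:
        avg_post = np.mean(post_history[-50:])
        w *= target_rate / max(avg_post, 1e-9)

# Activity is held near the target instead of running away.
print("mean activity over the last 50 steps:", round(float(np.mean(post_history[-50:])), 2))
```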

An Alternative Method of Regression: Robust Modified Anti-Hebbian Learning

  • Hwang, Chang-Ha
    • Journal of the Korean Data and Information Science Society
    • /
    • Vol. 7, No. 2
    • /
    • pp.203-210
    • /
    • 1996
  • A linear neural unit with a modified anti-Hebbian learning rule has been shown to be able to optimally fit curves, surfaces, and hypersurfaces by adaptively extracting the minor component of the input data set. In this paper, we study how to use a robust version of this neural fitting method for linear regression analysis. Furthermore, we compare this method with other methods when the data set is contaminated by outliers.

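The abstract does not spell out the update rule, but the minor-component idea behind this kind of fitting can be sketched as follows: an anti-Hebbian update (a Hebbian update with the sign flipped), combined with explicit weight normalization, drives a linear unit's weight vector toward the direction of least variance of the centered data, which is the normal of the total-least-squares fit. The synthetic data, learning rate, and normalization step below are illustrative assumptions, not the paper's algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 2-D data scattered around the line y = 2x + 1 (illustrative only).
x = rng.uniform(-1.0, 1.0, 500)
y = 2.0 * x + 1.0 + rng.normal(0.0, 0.1, 500)
data = np.column_stack([x, y])
mean = data.mean(axis=0)

# Anti-Hebbian linear unit: the update has the opposite sign of a Hebbian one,
# so with renormalization the weight vector converges to the *minor* component
# of the data, i.e. the normal vector of the fitted line.
w = rng.normal(size=2)
w /= np.linalg.norm(w)
eta = 0.05                                  # learning rate (assumed)

for epoch in range(20):
    for i in rng.permutation(len(data)):
        u = data[i] - mean                  # centered input
        out = w @ u                         # linear unit output
        w -= eta * out * u                  # anti-Hebbian update (note the minus)
        w /= np.linalg.norm(w)              # keep the weight vector unit length

# The fitted line satisfies w . (p - mean) = 0, so slope = -w[0]/w[1].
slope = -w[0] / w[1]
intercept = mean[1] - slope * mean[0]
print(f"recovered line: y = {slope:.2f} x + {intercept:.2f}")   # close to y = 2x + 1
```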

변형하이브리드 학습규칙의 구현에 관한 연구 (A Study on the Implementation of Modified Hybrid Learning Rule)

  • 송도선;김석동;이행세
    • 전자공학회논문지B
    • /
    • Vol. 31B, No. 12
    • /
    • pp.116-123
    • /
    • 1994
  • A modified hybrid learning rule (MHLR) is proposed, derived by combining the Back Propagation (BP) algorithm, which is known as an excellent classifier, with a modified Hebbian rule obtained by changing the original Hebbian rule, which is a good feature extractor. The network architecture of the MHLR is a multi-layer neural network. The weights of the MHLR are computed as the sum of the BP weights and the modified Hebbian weights between the input layer and the hidden layer, and as the BP weights alone between the hidden layer and the output layer. To evaluate the performance, BP, MHLR, and the hybrid learning rule (HLR) are simulated by the Monte Carlo method. As a result, MHLR is the best in recognition rate and HLR is second. In learning speed, HLR and MHLR are much the same, while BP is relatively slow.

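A rough software sketch of the scheme, under stated assumptions: the input-to-hidden weights receive both the back-propagation gradient and a Hebbian correlation term (stabilized Oja-style here, since the exact "modified Hebbian" form is not given in the abstract), while the hidden-to-output weights are trained by back-propagation alone. The XOR task, network size, and learning rates are illustrative.

```python
import numpy as np

rng = np.random.default_rng(2)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy XOR problem, purely illustrative.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(0.0, 1.0, (2, 8)), np.zeros(8)   # input -> hidden
W2, b2 = rng.normal(0.0, 1.0, (8, 1)), np.zeros(1)   # hidden -> output
eta, lam = 0.5, 0.005    # BP rate and Hebbian mixing factor (both assumed)

for epoch in range(10000):
    H = sigmoid(X @ W1 + b1)                 # forward pass
    Y = sigmoid(H @ W2 + b2)

    dY = (Y - T) * Y * (1.0 - Y)             # backprop deltas (squared error)
    dH = (dY @ W2.T) * H * (1.0 - H)

    # hidden -> output: conventional back-propagation only
    W2 -= eta * H.T @ dY
    b2 -= eta * dY.sum(axis=0)
    # input -> hidden: back-propagation ...
    W1 -= eta * X.T @ dH
    b1 -= eta * dH.sum(axis=0)
    # ... plus a Hebbian (input x hidden-activity) term, kept bounded with an
    # Oja-style decay; the paper's exact modified-Hebbian form is an assumption.
    W1 += lam * (X.T @ H - W1 * (H * H).sum(axis=0))

print(np.round(Y.ravel(), 2))   # outputs after training; plain BP alone drives
                                # this toward [0, 1, 1, 0]
```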

상관관계를 이용한 홉필드 네트웍의 VLSI 구현 (VLSI Implementation of Hopfield Network using Correlation)

  • 오재혁;박성범;이종호
    • The Korean Institute of Electrical Engineers (KIEE): Conference Proceedings
    • /
    • Proceedings of the 1993 KIEE Summer Conference, A
    • /
    • pp.254-257
    • /
    • 1993
  • This paper presents a new method to implement Hebbian learning in an artificial neural network. In the Hebbian learning algorithm, the complexity in terms of multiplications is high. To save chip area, we consider a new learning circuit. By calculating the similarity, or correlation, between $X_i$ and $O_i$, a large portion of the circuitry commonly used in conventional neural networks becomes unnecessary in this new Hebbian learning circuit, named COR. The output signals of COR are applied to weight-storage capacitors to directly control the voltages of the capacitors. The weighted sum, ${\Sigma}W_{ij}O_j$, is realized by multipliers whose output currents are summed on one line, which goes to the learning circuit or the output circuit. The drain current of the multiplier can produce positive or negative synaptic weights. A pass transistor selects either learning mode or recall mode. The layout of a learnable six-neuron fully connected Hopfield neural network is designed and simulated using PSPICE. The network memorizes and retrieves the patterns correctly in the presence of minor noise.

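At the algorithmic level, the circuit realizes Hebbian (correlation/outer-product) weight storage and weighted-sum recall in a small Hopfield network. The snippet below is just that software counterpart for a six-neuron network; it says nothing about the analog VLSI implementation itself, and the stored patterns are illustrative.

```python
import numpy as np

# Two bipolar patterns for a six-neuron, fully connected Hopfield network.
patterns = np.array([
    [ 1, -1,  1, -1,  1, -1],
    [ 1,  1, -1, -1,  1,  1],
])

n = patterns.shape[1]
W = np.zeros((n, n))
for p in patterns:
    W += np.outer(p, p)          # Hebbian learning: W_ij accumulates the
np.fill_diagonal(W, 0)           # correlation of neurons i and j; no self-coupling

def recall(state, sweeps=10):
    """Asynchronous recall: each neuron takes the sign of its weighted sum."""
    state = state.copy()
    for _ in range(sweeps):
        for i in range(n):
            h = W[i] @ state     # weighted sum  sum_j W_ij * O_j
            state[i] = 1 if h >= 0 else -1
    return state

noisy = np.array([1, 1, 1, -1, 1, -1])   # first pattern with its second bit flipped
print(recall(noisy))                     # recovers [ 1 -1  1 -1  1 -1]
```

With two stored patterns on six neurons the network is well below the usual Hopfield capacity of roughly 0.14 N patterns, so a single flipped bit is corrected reliably.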

A Robust Principal Component Neural Network

  • Changha Hwang;Park, Hyejung;A, Eunyoung-N
    • Communications for Statistical Applications and Methods
    • /
    • Vol. 8, No. 3
    • /
    • pp.625-632
    • /
    • 2001
  • Principal component analysis (PCA) is a multivariate technique falling under the general title of factor analysis. The purpose of PCA is to identify the dependence structure behind a multivariate stochastic observation in order to obtain a compact description of it. In engineering, PCA is utilized mainly for data compression and restoration. In this paper we propose a new robust Hebbian algorithm for robust PCA. This algorithm is based on a hyperbolic tangent function due to Hampel et al. (1989), which is known to be robust in statistics. We perform two experiments to investigate the performance of the new robust Hebbian learning algorithm for robust PCA.

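The paper's exact robust Hebbian algorithm is not reproduced here; the sketch below follows a common robust variant in which the unit's output is passed through a bounded tanh inside an Oja-type Hebbian update, so that grossly outlying samples exert only a limited influence on the learned principal direction. The synthetic data, the learning rate, and the precise placement of the tanh are assumptions.

```python
import numpy as np

rng = np.random.default_rng(3)

# Inliers scattered along the (1, 1) axis plus a small fraction of gross
# outliers along the orthogonal axis (all numbers are illustrative).
t = rng.normal(0.0, 2.0, 500)
inliers = np.column_stack([t, t]) + rng.normal(0.0, 0.3, (500, 2))
u = rng.uniform(-20.0, 20.0, 25)
outliers = np.column_stack([u, -u])
X = np.vstack([inliers, outliers])
X -= X.mean(axis=0)

def hebbian_pca(X, robust, eta=0.002, epochs=30, seed=0):
    """One-unit Hebbian (Oja-type) PCA; with robust=True the unit's output is
    passed through tanh so that each sample's influence is bounded."""
    r = np.random.default_rng(seed)
    w = r.normal(size=X.shape[1])
    w /= np.linalg.norm(w)
    for _ in range(epochs):
        for i in r.permutation(len(X)):
            y = w @ X[i]
            g = np.tanh(y) if robust else y    # bounded vs. unbounded influence
            w += eta * g * (X[i] - y * w)      # Hebbian/Oja update
            w /= np.linalg.norm(w)
    return w

axis = np.array([1.0, 1.0]) / np.sqrt(2.0)     # the inliers' principal axis
for robust in (False, True):
    w = hebbian_pca(X, robust)
    angle = np.degrees(np.arccos(min(1.0, abs(float(w @ axis)))))
    label = "robust" if robust else "plain "
    print(label, "Hebbian PCA, angle to inlier axis:", round(float(angle), 1), "deg")
# The plain rule is dominated by the few large outliers; the tanh version stays
# much closer to the inlier axis.
```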

신경 진동자를 이용한 한글 문자의 인식 속도의 개선에 관한 연구 (A study for improvement of Recognition velocity of Korean Character using Neural Oscillator)

  • Kwon, Yong-Bum;Lee, Joon-Tark
    • Korean Institute of Intelligent Systems: Conference Proceedings
    • /
    • Proceedings of the 2004 Spring Conference of the Korea Fuzzy Logic and Intelligent Systems Society, Vol. 14, No. 1
    • /
    • pp.491-494
    • /
    • 2004
  • Neural oscillators can be applied to oscillatory systems such as image recognition, voice recognition, estimation of weather fluctuations, and analysis of geological fluctuations in nature, and they are used principally for pattern recognition of image information. Conventional BPL (Back-Propagation Learning) and MLNN (Multi-Layer Neural Network) approaches are not well suited to oscillatory systems because these algorithms complicate the learning structure, involve tedious procedures, and suffer from sluggish convergence. However, these problems can be easily solved by using the synchrony characteristic of a neural oscillator with a PLL (Phase-Locked Loop) function and a simple Hebbian learning rule. In addition, the recognition speed for Korean characters can be improved by using the neural oscillator's learning accelerator factor $\eta_{ij}$.

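A toy sketch of the ingredients the abstract names, with a Kuramoto-style phase model standing in for the neural oscillator (the paper's actual oscillator model and its accelerator factor $\eta_{ij}$ are not specified in the abstract, so the forms used here are assumptions): a simple Hebbian rule, scaled per connection by an $\eta_{ij}$ factor, sets the couplings from a stored pattern, and the PLL-like phase dynamics then pull the oscillators into that pattern's phase relationship.

```python
import numpy as np

rng = np.random.default_rng(4)

n = 16                               # one oscillator per pixel of a tiny 4x4 glyph
s = rng.choice([-1.0, 1.0], n)       # stored bipolar pattern (illustrative)

# Simple Hebbian rule for the couplings, scaled by a per-connection learning
# "accelerator" factor eta_ij (its actual form in the paper is not given in the
# abstract; a symmetric constant-plus-noise choice is assumed here).
eta_ij = 0.1 * (1.0 + 0.5 * rng.random((n, n)))
eta_ij = 0.5 * (eta_ij + eta_ij.T)
K = np.zeros((n, n))
for _ in range(10):                       # a few presentations of the pattern
    K += eta_ij * np.outer(s, s) / n      # Hebbian: co-active pixels couple positively
np.fill_diagonal(K, 0.0)

# PLL-like phase dynamics (Kuramoto-style stand-in): each oscillator's phase is
# pulled toward locking with the oscillators it is coupled to.
omega = 2.0 * np.pi * (1.0 + 0.02 * rng.normal(size=n))   # natural frequencies
theta = rng.uniform(0.0, 2.0 * np.pi, n)                  # random initial phases
dt = 0.01
for _ in range(4000):
    diff = theta[None, :] - theta[:, None]        # diff[i, j] = theta_j - theta_i
    theta += dt * (omega + (K * np.sin(diff)).sum(axis=1))

# Readout: once the network locks onto the stored pattern, the phases split into
# two anti-phase groups matching s, and this order parameter approaches 1.
order = np.abs(np.mean(s * np.exp(1j * theta)))
print("phase-synchronization order parameter:", round(float(order), 2))
```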

Robustness를 형성시키기 위한 Hybrid 학습법칙을 갖는 다층구조 신경회로망 (Multi-layer Neural Network with Hybrid Learning Rules for Improved Robust Capability)

  • 정동규;이수영
    • 전자공학회논문지B
    • /
    • Vol. 31B, No. 8
    • /
    • pp.211-218
    • /
    • 1994
  • In this paper we develop a hybrid learning rule to improve the robustness of multi-layer perceptrons. In most neural networks the activation of a neuron is determined by a nonlinear transformation of the weighted sum of inputs to the neuron. By investigating the behaviour of hidden-layer activations, a new learning algorithm is developed for improved robustness of multi-layer perceptrons. Unlike other methods, which reduce network complexity by putting restrictions on the synaptic weights, our method, based on error back-propagation, increases the complexity of the underlying problem by imposing a saturation requirement on the hidden-layer neurons. We also found that the additional gradient-descent term for this requirement corresponds to the Hebbian rule, so our algorithm incorporates Hebbian learning into the error back-propagation rule. Computer simulation demonstrates fast learning convergence as well as improved robustness for classification and hetero-association of patterns.

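One plausible reading of the saturation requirement is a penalty $\sum_j h_j(1-h_j)$ on the sigmoid hidden activations, which is smallest when the activations saturate at 0 or 1; its gradient has the Hebbian form of presynaptic activity times a function of the postsynaptic activity, matching the correspondence the abstract describes. The helper below (the penalty form, rates, and variable names are assumptions) shows how such a term would be folded into the hidden-layer weight update of an ordinary backprop step, e.g. in place of the plain `W1` update in the XOR sketch further above.

```python
import numpy as np

def hybrid_hidden_update(X, H, dH_bp, W1, eta=0.1, beta=0.01):
    """One input->hidden weight update that combines the usual BP gradient with
    the gradient of an assumed saturation penalty sum_j h_j * (1 - h_j).

    X     : (N, n_in)     input batch (presynaptic activities)
    H     : (N, n_hid)    sigmoid hidden activations (postsynaptic activities)
    dH_bp : (N, n_hid)    back-propagated deltas at the hidden layer
    W1    : (n_in, n_hid) input-to-hidden weights
    """
    # Gradient of the saturation penalty with respect to the hidden pre-activations:
    #   d[h(1-h)]/dz = (1 - 2h) * h * (1 - h)
    dH_sat = (1.0 - 2.0 * H) * H * (1.0 - H)
    # The extra term X.T @ dH_sat is a presynaptic-times-postsynaptic
    # (Hebbian-type) product: descending it pushes active units (h > 0.5) to be
    # excited more strongly by their inputs and inactive units less so.
    return W1 - eta * (X.T @ dH_bp) - eta * beta * (X.T @ dH_sat)
```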

다층신경망에서 하이브리드 학습 규칙의 구현에 관한 연구 (A Study on the Implementation of Hybrid Learning Rule for Neural Network)

  • 송도선;김석동;이행세
    • 한국음향학회지
    • /
    • Vol. 13, No. 4
    • /
    • pp.60-68
    • /
    • 1994
  • This paper proposes a hybrid learning rule for multi-layer feed-forward neural networks that combines the Hebbian learning rule, which excels at extracting features from the input, with the BP algorithm, which excels at pattern classification. Unlike a multi-layer perceptron (MLP) trained with error back-propagation alone, the hybrid (Hebbian + BP) rule applies error back-propagation and Hebbian learning together during training: it is used to update all connection weights except those of the output layer, to which only the conventional error back-propagation rule is applied. The Hebbian rule is excluded from the output layer because convergence of learning is not guaranteed when it is applied there in multi-layer training. To evaluate the proposed hybrid rule, it was applied to several region-classification problems, and the results show that it outperforms conventional BP. In terms of learning speed, it converged much faster than conventional BP; in one example, training with the proposed hybrid rule required only 2/10 of the iterations needed by conventional BP. The recognition rate of the proposed rule was also up to about $0.77\%$ higher than that of BP.

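Structurally, the rule amounts to a per-layer update in which every layer receives the usual back-propagation gradient and every layer except the output additionally receives a Hebbian pre-times-post term. The helper below is only a schematic of that structure; the learning rate, the mixing factor `lam`, and the plain pre-times-post form of the Hebbian term are assumptions (in practice such a term is typically kept small or stabilized, e.g. with an Oja-style decay as in the MHLR sketch above).

```python
import numpy as np

def hybrid_bp_hebbian_step(weights, acts, deltas, eta=0.2, lam=0.01):
    """One sweep of a hybrid (Hebbian + BP) update over a feed-forward network.

    weights : list of L weight matrices, weights[l] maps layer l to layer l+1
    acts    : list of L+1 activation batches, acts[0] being the input batch
    deltas  : list of L back-propagated delta batches, deltas[l] for layer l+1
    """
    L = len(weights)
    for l in range(L):
        grad_bp = acts[l].T @ deltas[l]                   # ordinary BP gradient
        weights[l] -= eta * grad_bp
        if l < L - 1:                                     # every layer except output:
            weights[l] += lam * acts[l].T @ acts[l + 1]   # add a Hebbian pre*post term
    return weights
```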

저주파 필터 특성을 갖는 다층 구조 신경망을 이용한 시계열 데이터 예측 (Time Series Prediction Using a Multi-layer Neural Network with Low Pass Filter Characteristics)

  • Min-Ho Lee
    • Journal of Advanced Marine Engineering and Technology
    • /
    • Vol. 21, No. 1
    • /
    • pp.66-70
    • /
    • 1997
  • In this paper a new learning algorithm for curvature smoothing and improved generalization in multi-layer neural networks is proposed. To enhance the generalization ability, a constraint term on the hidden-neuron activations is added to the conventional output error, which gives multi-layer neural networks curvature-smoothing characteristics. When the total cost, consisting of the output error and the hidden error, is minimized by gradient-descent methods, the additional descent term yields not only Hebbian learning but also synaptic weight decay. The algorithm therefore combines error back-propagation, Hebbian learning, and weight decay, while the additional computational requirement over standard error back-propagation is negligible. Computer simulation of time-series prediction with the Santa Fe competition data shows that the proposed learning algorithm gives much better generalization performance.

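The claim that a single extra penalty term yields both a Hebbian update and synaptic weight decay can be illustrated with an assumed quadratic penalty on the hidden pre-activations (the paper's exact constraint term is not given in the abstract): its gradient is literally a presynaptic-times-postsynaptic product, and for roughly whitened inputs that product reduces on average to a weight-decay term. The sizes and the penalty weight below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(5)

N, n_in, n_hid = 256, 8, 4
X = rng.normal(size=(N, n_in))           # roughly whitened inputs
W1 = rng.normal(0.0, 0.3, (n_in, n_hid))
lam = 0.01                               # penalty weight (assumed)

Z = X @ W1                               # hidden pre-activations
# Gradient of (lam / 2) * mean_n sum_j z_nj^2 with respect to W1:
extra_grad = lam * X.T @ Z / N           # Hebbian product: presynaptic x_i * postsynaptic z_j

# Algebraically this equals lam * (X.T X / N) @ W1, and since X.T X / N is close
# to the identity for whitened inputs, it behaves on average like lam * W1,
# i.e. synaptic weight decay.
print(np.allclose(extra_grad, lam * (X.T @ X / N) @ W1))      # True
print(round(float(np.abs(extra_grad - lam * W1).mean()), 4))  # small residual
```

In a training loop this `extra_grad` would simply be subtracted from `W1` alongside the ordinary back-propagation gradient, giving the combination of back-propagation, Hebbian learning, and weight decay that the abstract describes.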

Recognition of the Korean Character Using Phase Synchronization Neural Oscillator

  • Lee, Joon-Tark;Kwon, Yang-Bum
    • Journal of Advanced Marine Engineering and Technology
    • /
    • Vol. 28, No. 2
    • /
    • pp.347-353
    • /
    • 2004
  • Neural oscillators can be applied to oscillatory systems such as the analysis of image information, voice recognition, etc. Conventional learning algorithms (neural networks trained with the EBPA (Error Back-Propagation Algorithm)) are not well suited to oscillatory systems with complicated input patterns because of their overly complex structure. However, these problems can be easily solved by using the synchrony characteristic of a neural oscillator with a PLL (Phase-Locked Loop) function and a simple Hebbian learning rule. Therefore, this paper introduces a technique for recognizing Korean characters using a phase-synchronization neural oscillator and presents simulation results.
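
As a very rough illustration of recognition by phase synchronization (a toy stand-in, not the paper's oscillator network): map the pixels of each character to phases, and score a test character against every stored template by how coherently their phases lock; the template with the highest phase-locking value is the recognized character. All sizes and patterns below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(6)

n = 64                                     # 8x8 binary glyphs
templates = rng.integers(0, 2, (3, n))     # three stored "characters"
test = templates[1].copy()
flip = rng.choice(n, 5, replace=False)
test[flip] = 1 - test[flip]                # noisy version of character 1

def phase_locking_value(a, b):
    """Coherence of the phase differences between two binary patterns when
    pixels are mapped to phases 0 (off) and pi (on)."""
    return float(np.abs(np.mean(np.exp(1j * np.pi * (a - b)))))

scores = [phase_locking_value(test, t) for t in templates]
print([round(s, 2) for s in scores])
print("recognized character:", int(np.argmax(scores)))   # 1, despite the noise
```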