• Title/Abstract/Keywords: evolutionary neural networks

Search results: 85 (processing time 0.03 s)

A New Architecture of Genetically Optimized Self-Organizing Fuzzy Polynomial Neural Networks by Means of Information Granulation

  • Park, Ho-Sung;Oh, Sung-Kwun;Ahn, Tae-Chon
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings / ICROS 2005 ICCAS / pp.1505-1509 / 2005
  • This paper introduces a new architecture of genetically optimized self-organizing fuzzy polynomial neural networks by means of information granulation. The conventional SOFPNNs developed so far are based on mechanisms of self-organization and evolutionary optimization. The augmented genetically optimized SOFPNN using information granulation (namely IG_gSOFPNN) results in a structurally and parametrically optimized model and comes with a higher level of flexibility than the conventional FPNN. With the aid of information granulation, we determine the initial locations (apexes) of the membership functions and the initial values of the polynomial functions used in the premise and consequence parts of the fuzzy rules, respectively. The GA-based design procedure applied at each layer of the genetically optimized self-organizing fuzzy polynomial neural network leads to the selection of preferred nodes with specific local characteristics (such as the number of input variables, the order of the polynomial, a specific subset of input variables, and the number of membership functions) available within the network. To evaluate the performance of the IG_gSOFPNN, the model is tested on gas furnace process data. A comparative analysis shows that the proposed IG_gSOFPNN achieves higher accuracy and better predictive capability than previously presented intelligent models.

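The GA-based node design described above has to encode, for every fuzzy polynomial neuron, the number of input variables, the polynomial order, the specific subset of inputs, and the number of membership functions. The sketch below shows one way such genes could look and recombine; the bounds, field names, and helpers (`random_fpn_gene`, `crossover`) are illustrative assumptions, not taken from the paper.

```python
import random

# Illustrative bounds; the paper's actual ranges are not stated in the abstract.
MAX_INPUTS = 4           # inputs allowed per fuzzy polynomial neuron (FPN)
MAX_MF = 5               # membership functions per input
POLY_ORDERS = [1, 2, 3]  # linear, quadratic, cubic consequence polynomials

def random_fpn_gene(n_candidates):
    """Encode one FPN as a gene: number of inputs, chosen input subset,
    polynomial order, and membership functions per input."""
    n_inputs = random.randint(1, min(MAX_INPUTS, n_candidates))
    subset = tuple(sorted(random.sample(range(n_candidates), n_inputs)))
    return {
        "inputs": subset,
        "order": random.choice(POLY_ORDERS),
        "n_mf": random.randint(2, MAX_MF),
    }

def crossover(g1, g2):
    """Recombine the scalar fields; the input subset is taken from the first parent."""
    child = dict(g1)
    if random.random() < 0.5:
        child["order"] = g2["order"]
    if random.random() < 0.5:
        child["n_mf"] = g2["n_mf"]
    return child

if __name__ == "__main__":
    population = [random_fpn_gene(n_candidates=6) for _ in range(10)]
    print(population[0], crossover(population[0], population[1]))
```

In the actual IG_gSOFPNN, a fitness derived from each node's modeling error (for example, on the gas furnace data) would drive selection over such genes at every layer.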

A Brief Introduction to Soft Computing

  • Hong Dug Hun;Hwang Changha
    • Korean Statistical Society: Conference Proceedings / Korean Statistical Society 2004 Conference Proceedings / pp.65-66 / 2004
  • The aim of this article is to illustrate what soft computing is and how important it is.


An Optimization of Polynomial Neural Networks using Genetic Algorithm

  • Kim, Dong-Won;Park, Jang-Hyun;Huh, Sung-Hoe;Yoon, Pil-Sang;Park, Gwi-Tae
    • Institute of Control, Robotics and Systems (ICROS): Conference Proceedings / ICROS 2002 ICCAS / pp.61.3-61 / 2002
  • Contents: Abstract, Introduction, Genetic Algorithm, Evolutionary structure optimization of PNN, Simulation result, Conclusion, References


공진화를 이용한 신경회로망의 구조 최적화 (Structure optimization of neural network using co-evolution)

  • 전효병;김대준;심귀보
    • Journal of the Institute of Electronics Engineers of Korea, Series S / Vol. 35S, No. 4 / pp.67-75 / 1998
  • In general, Evolutionary Algorithms (EAs) are referred to as methods of population-based optimization, and they are considered very efficient methods of optimal system design because they provide many opportunities to obtain the global optimal solution. This paper presents a co-evolution scheme for artificial neural networks, which has two different, yet cooperatively working, populations, called the host population and the parasite population, respectively. Using a conventional genetic algorithm, the host population is evolved in the given environment, and the parasite population, composed of schemata, is evolved to find schemata useful to the host population. The structure of the artificial neural network is a diagonal recurrent neural network which has self-feedback loops only in its hidden nodes. To find optimal neural networks, we should take into account the structure of the network as well as its adaptive parameters, the weights of the neurons. Therefore we use a genetic algorithm that searches the structure of the neural network through the co-evolution mechanism, and for weight learning we adopt evolution strategies. As a result of co-evolution, the optimal structure of the neural network is found in a short time with a small population. The validity and effectiveness of the proposed method are examined by applying it to the stabilization and position control of the inverted-pendulum system, and we show that the result of co-evolution is better than that of the conventional genetic algorithm.

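The host/parasite scheme described above can be sketched with a toy example: a bit string stands in for the recurrent-network structure, and a synthetic fitness replaces the inverted-pendulum evaluation. Everything below (lengths, rates, the way schemata are scored) is an illustrative assumption rather than the paper's exact procedure.

```python
import random

STRUCT_LEN = 12  # bits standing in for the recurrent-network structure (toy)

def host_fitness(bits):
    # Stand-in for the inverted-pendulum evaluation used in the paper.
    target = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1, 0, 1]
    return sum(b == t for b, t in zip(bits, target))

def apply_schema(schema, bits):
    """Impose a schema's fixed positions on a host; '*' leaves the bit alone."""
    return [b if s == "*" else int(s) for s, b in zip(schema, bits)]

def evolve(generations=30, n_host=20, n_para=10):
    hosts = [[random.randint(0, 1) for _ in range(STRUCT_LEN)] for _ in range(n_host)]
    paras = [["*" if random.random() < 0.7 else str(random.randint(0, 1))
              for _ in range(STRUCT_LEN)] for _ in range(n_para)]
    for _ in range(generations):
        # Parasite fitness: average improvement a schema gives to the current hosts.
        def para_fit(schema):
            gains = [host_fitness(apply_schema(schema, h)) - host_fitness(h)
                     for h in hosts]
            return sum(gains) / len(gains)
        paras.sort(key=para_fit, reverse=True)
        # Cooperation: inject the best schema into a few hosts.
        for i in range(3):
            hosts[i] = apply_schema(paras[0], hosts[i])
        # Refresh the weaker parasites by mutating the better ones.
        half = n_para // 2
        paras = paras[:half] + [
            [c if random.random() > 0.1 else random.choice(["*", "0", "1"])
             for c in random.choice(paras[:half])]
            for _ in range(n_para - half)
        ]
        # Plain GA step on the hosts: truncation selection + bit-flip mutation.
        hosts.sort(key=host_fitness, reverse=True)
        survivors = hosts[: n_host // 2]
        hosts = survivors + [[b ^ (random.random() < 0.05) for b in p]
                             for p in survivors]
    return max(hosts, key=host_fitness)

if __name__ == "__main__":
    best = evolve()
    print("best structure:", best, "fitness:", host_fitness(best))
```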

진화 신경회로망을 이용한 도립진자 시스템의 안정화 (Evolving Neural Network for Stabilization Control of Inverted Pendulum)

  • 심영진;이준탁
    • Korean Institute of Electrical Engineers (KIEE): Conference Proceedings / KIEE 1999 Summer Conference Proceedings B / pp.963-965 / 1999
  • A linear chromosome combined with a grid-based representation of the network and a new crossover operator allow the evolution of the architecture and the weights simultaneously. In our approach there is no need for a separate weight optimization procedure, and networks with more than one type of activation function can be evolved. In this paper, an evolutionary strategy for a given dual neural controller is introduced, and the simulation results are described in detail through application to the stabilization control of an inverted pendulum system.

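A minimal sketch of a grid-based network encoded as a linear chromosome, where one crossover cut recombines architecture (active cells and their activation functions) and weights together, so no separate weight-optimization phase is needed. The grid size, cell fields, and read-out below are assumptions made only for illustration.

```python
import math
import random

ROWS, COLS = 3, 4  # grid of candidate hidden units (illustrative size)
N_IN = 4           # network inputs
# Cells may use different activation functions: tanh, ReLU, logistic.
ACTS = [math.tanh, lambda x: max(0.0, x), lambda x: 1.0 / (1.0 + math.exp(-x))]

def random_cell():
    """One grid cell: active flag, activation-function index, input weights."""
    return {
        "active": random.random() < 0.5,
        "act": random.randrange(len(ACTS)),
        "w": [random.uniform(-1, 1) for _ in range(N_IN)],
    }

def random_chromosome():
    # Linear chromosome: the ROWS x COLS grid flattened row by row.
    return [random_cell() for _ in range(ROWS * COLS)]

def forward(chrom, x):
    """Sum the outputs of the active cells (a deliberately simple read-out)."""
    out = 0.0
    for cell in chrom:
        if cell["active"]:
            s = sum(w * xi for w, xi in zip(cell["w"], x))
            out += ACTS[cell["act"]](s)
    return out

def grid_crossover(a, b):
    """Cut both grids at the same column and swap the right-hand parts,
    so architecture and weights recombine in a single operation."""
    cut = random.randrange(1, COLS)
    child = []
    for r in range(ROWS):
        row_a = a[r * COLS:(r + 1) * COLS]
        row_b = b[r * COLS:(r + 1) * COLS]
        child.extend(row_a[:cut] + row_b[cut:])
    return child

if __name__ == "__main__":
    p1, p2 = random_chromosome(), random_chromosome()
    child = grid_crossover(p1, p2)
    print(forward(child, [0.1, -0.2, 0.3, 0.5]))
```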

신경회로망의 학습 알고리듬을 이용하여 돌연변이를 수행하는 새로운 진화 프로그래밍 알고리듬 (A New Evolutionary Programming Algorithm using the Learning Rule of a Neural Network for Mutation of Individuals)

  • 임종화;최두현;황찬식
    • Journal of the Institute of Electronics Engineers of Korea, Series C / Vol. 36C, No. 3 / pp.58-64 / 1999
  • Evolutionary programming can be characterized by two components: one is the selection method and the other is the mutation rule. This paper proposes a new evolutionary programming algorithm that performs the mutation operation using the backpropagation learning algorithm of a neural network. In the neural network learning rule, the current error specifies the direction in which an individual of the evolutionary programming should evolve, and the momentum term adds the evolution trend accumulated so far to the individual's variation, so that the global optimum is found quickly. The performance and robustness of the proposed algorithm were verified using standard test functions.

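A small sketch of the idea in the abstract above (translated to English): the mutation step combines an error-gradient term, playing the role of the backpropagation error direction, with a momentum term that carries the individual's evolution trend, here applied to the standard sphere test function. The update constants and the truncation-style selection are assumptions; the abstract does not specify them.

```python
import random

def sphere(x):
    """Standard test function: f(x) = sum(x_i^2), global minimum at the origin."""
    return sum(v * v for v in x)

def sphere_grad(x):
    return [2.0 * v for v in x]

def mutate(ind, prev_step, lr=0.05, momentum=0.9, sigma=0.01):
    """Gradient term plays the role of the backpropagation error direction;
    the momentum term carries the individual's evolution trend so far.
    The constants lr, momentum, and sigma are illustrative."""
    grad = sphere_grad(ind)
    step = [momentum * p - lr * g + random.gauss(0.0, sigma)
            for p, g in zip(prev_step, grad)]
    child = [v + s for v, s in zip(ind, step)]
    return child, step

def evolve(dim=5, pop_size=20, generations=100):
    pop = [[random.uniform(-5, 5) for _ in range(dim)] for _ in range(pop_size)]
    steps = [[0.0] * dim for _ in range(pop_size)]
    for _ in range(generations):
        offspring = [mutate(ind, st) for ind, st in zip(pop, steps)]
        merged = list(zip(pop, steps)) + offspring
        # Truncation selection is used here for brevity; the paper's
        # selection method is not detailed in the abstract.
        merged.sort(key=lambda pair: sphere(pair[0]))
        pop = [p for p, _ in merged[:pop_size]]
        steps = [s for _, s in merged[:pop_size]]
    return pop[0]

if __name__ == "__main__":
    best = evolve()
    print("best:", best, "f(best):", sphere(best))
```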

Genetically Optimized Self-Organizing Fuzzy Polynomial Neural Networks based on Information Granulation and Evolutionary Algorithm

  • 박호성;오성권
    • Korean Institute of Intelligent Systems: Conference Proceedings / Korea Fuzzy Logic and Intelligent Systems Society 2005 Spring Conference Proceedings, Vol. 15, No. 1 / pp.297-300 / 2005
  • In this study, we propose a genetically optimized self-organizing fuzzy polynomial neural network based on information granulation and an evolutionary algorithm (gdSOFPNN) and develop a comprehensive design methodology involving mechanisms of genetic optimization. The proposed gdSOFPNN gives rise to a structurally and parametrically optimized network through an optimal parameter design available within the FPN (viz. the number of input variables, the order of the polynomial, the input variables, the number of membership functions, and the apexes of the membership functions). Here, with the aid of information granulation, we determine the initial locations (apexes) of the membership functions and the initial values of the polynomial functions used in the premise and consequence parts of the fuzzy rules, respectively. The performance of the proposed gdSOFPNN is quantified through experimentation that exploits standard data already used in fuzzy modeling.

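In the abstract above, information granulation fixes the initial apexes of the membership functions and the initial consequence polynomials. A rough sketch is given below, assuming plain k-means-style clustering as the granulation step and a first-order least-squares fit per granule; both are common choices, but they are not confirmed by the abstract.

```python
import numpy as np

def kmeans_1d(x, k, iters=50, seed=0):
    """Plain k-means on one input variable; the cluster centers serve as the
    apexes (centers) of the membership functions."""
    rng = np.random.default_rng(seed)
    centers = rng.choice(x, size=k, replace=False)
    for _ in range(iters):
        labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = x[labels == j].mean()
    centers = np.sort(centers)
    labels = np.argmin(np.abs(x[:, None] - centers[None, :]), axis=1)
    return centers, labels

def initial_polynomial(x, y, labels, cluster):
    """Least-squares fit of a first-order polynomial on the data falling into
    one information granule: used as an initial consequence polynomial."""
    xs, ys = x[labels == cluster], y[labels == cluster]
    A = np.column_stack([np.ones_like(xs), xs])
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return coef  # [intercept, slope]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x = rng.uniform(0, 10, 200)
    y = np.sin(x) + 0.1 * rng.standard_normal(200)
    centers, labels = kmeans_1d(x, k=3)
    print("MF apexes:", centers)
    print("granule 0 polynomial:", initial_polynomial(x, y, labels, 0))
```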

진화연산 기반 CNN 필터 축소 (Evolutionary Computation Based CNN Filter Reduction)

  • 서기성
    • The Transactions of the Korean Institute of Electrical Engineers / Vol. 67, No. 12 / pp.1665-1670 / 2018
  • A convolutional neural network (CNN), one of the deep learning models, has been very successful in a variety of computer vision tasks. The filters of a CNN are generated automatically; however, they can be optimized further, since redundant and less important features may exist. Therefore, the aim of this paper is filter reduction to accelerate and compress CNN models. An evolutionary algorithm is adopted to remove unnecessary filters in order to minimize the parameters of the CNN while maintaining good classification performance. We demonstrate the proposed filter reduction methods by performing experiments on CIFAR-10 data and comparing classification performance. Three approaches are compared and analyzed, and an outlook on potential next steps is suggested.
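A compact sketch of evolutionary filter reduction as described above: a binary mask over a layer's filters is evolved, and fitness trades classification accuracy against the fraction of filters kept. The `surrogate_accuracy` function is a stand-in so the script runs on its own; in practice it would retrain or evaluate the pruned CNN on CIFAR-10. All constants are illustrative.

```python
import random

N_FILTERS = 64  # filters in the convolutional layer being pruned
LAMBDA = 0.5    # trade-off between accuracy and model size

# Toy surrogate for the real evaluation: in practice this would fine-tune or
# test the pruned CNN on CIFAR-10 and return validation accuracy.
IMPORTANCE = [random.random() for _ in range(N_FILTERS)]

def surrogate_accuracy(mask):
    kept = sum(imp for imp, m in zip(IMPORTANCE, mask) if m)
    return kept / sum(IMPORTANCE)

def fitness(mask):
    acc = surrogate_accuracy(mask)      # replace with a real CNN evaluation
    keep_ratio = sum(mask) / N_FILTERS
    return acc - LAMBDA * keep_ratio    # reward accuracy, penalize model size

def mutate(mask, p=0.02):
    return [m ^ (random.random() < p) for m in mask]

def evolve(pop_size=30, generations=100):
    pop = [[random.randint(0, 1) for _ in range(N_FILTERS)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]
        pop = parents + [mutate(random.choice(parents)) for _ in parents]
    return max(pop, key=fitness)

if __name__ == "__main__":
    best = evolve()
    print("filters kept:", sum(best), "/", N_FILTERS, "fitness:", fitness(best))
```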

진화론적 파라미터 동정에 기반한 자기구성 퍼지 다항식 뉴럴 네트워크의 새로운 설계 (A New design of Self Organizing Fuzzy Polynomial Neural Network Based on Evolutionary parameter identification)

  • 박호성;이영일;오성권
    • Korean Institute of Electrical Engineers (KIEE): Conference Proceedings / KIEE 2005 36th Summer Conference Proceedings D / pp.2891-2893 / 2005
  • In this paper, we introduce a new category of Self-Organizing Fuzzy Polynomial Neural Networks (SOFPNN) that is based on a genetically optimized multi-layer perceptron with fuzzy polynomial neurons (FPNs) and discuss its comprehensive design methodology involving mechanisms of genetic optimization. The conventional SOFPNN algorithm tends to produce overly complex networks and a repetitive computational load, owing to the trial-and-error method and/or repetitive parameter adjustment by the designer. In order to generate a structurally and parametrically optimized network, such parameters need to be optimal. In this study, to solve the problems of the conventional SOFPNN, we introduce a new design approach for an evolutionarily optimized SOFPNN. The optimal parameter design available within the FPN (viz. the number of input variables, the order of the polynomial, the input variables, and the number of membership functions) leads to a structurally and parametrically optimized network that is more flexible and has a simpler architecture than the conventional SOFPNN. In addition, we determine the initial apexes of the membership functions by a genetic algorithm.

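The SOFPNN family above grows the network layer by layer and keeps only the preferred nodes at each stage. The sketch below illustrates that self-organizing growth with plain second-order polynomial nodes and exhaustive pairing in place of the fuzzy polynomial neurons and the GA search; it is a simplification for illustration, not the paper's algorithm.

```python
import itertools
import numpy as np

def fit_pd_node(z1, z2, y):
    """Second-order partial description (PD) node:
    y ~ c0 + c1*z1 + c2*z2 + c3*z1*z2 + c4*z1^2 + c5*z2^2."""
    A = np.column_stack([np.ones_like(z1), z1, z2, z1 * z2, z1**2, z2**2])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)
    return A @ coef

def grow_layer(Z, y, width=4):
    """Build candidate nodes from all input pairs and keep the `width` best
    (lowest error); their outputs feed the next layer. A GA would search this
    candidate set instead of enumerating it exhaustively."""
    candidates = []
    for i, j in itertools.combinations(range(Z.shape[1]), 2):
        pred = fit_pd_node(Z[:, i], Z[:, j], y)
        err = np.mean((y - pred) ** 2)
        candidates.append((err, pred))
    candidates.sort(key=lambda c: c[0])
    kept = candidates[:width]
    return np.column_stack([p for _, p in kept]), kept[0][0]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.uniform(-1, 1, size=(200, 5))
    y = np.sin(X[:, 0]) + X[:, 1] * X[:, 2]
    Z = X
    for layer in range(3):
        Z, err = grow_layer(Z, y)
        print(f"layer {layer + 1}: best node MSE = {err:.4f}")
```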

An Optimized Deep Learning Techniques for Analyzing Mammograms

  • Satish Babu Bandaru;Natarajasivan. D;Rama Mohan Babu. G
    • International Journal of Computer Science & Network Security / Vol. 23, No. 7 / pp.39-48 / 2023
  • Breast cancer screening makes extensive use of mammography. Even so, there has been much debate regarding this application's starting age as well as the screening interval. The deep learning technique of transfer learning is employed to transfer the knowledge learnt from source tasks to target tasks. For the resolution of real-world problems, deep neural networks have demonstrated superior performance in comparison with standard machine learning algorithms. The architecture of a deep neural network has to be defined by taking the problem domain knowledge into account, which normally consumes a lot of time as well as computational resources. This work evaluated the efficacy of deep learning networks such as the Visual Geometry Group network (VGGNet), the Residual Network (ResNet), and the Inception network for classifying mammograms. It proposed optimizing ResNet with the Teaching-Learning-Based Optimization (TLBO) algorithm in order to predict breast cancer from mammogram images. The proposed TLBO-ResNet is an optimized ResNet with faster convergence than other evolutionary methods for mammogram classification.
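A brief sketch of the Teaching-Learning-Based Optimization (TLBO) loop mentioned above, with its teacher and learner phases. The toy sphere objective stands in for the real objective (for example, the validation error of the tuned ResNet), and the population size, iteration count, and bounds are illustrative.

```python
import numpy as np

def tlbo(objective, bounds, pop_size=20, iters=100, seed=0):
    """Teaching-Learning-Based Optimization: teacher phase + learner phase.
    `objective` would be, e.g., the validation error of a ResNet configured
    with the candidate hyperparameters; any vector-valued function works."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    X = rng.uniform(lo, hi, size=(pop_size, lo.size))
    f = np.array([objective(x) for x in X])
    for _ in range(iters):
        # Teacher phase: move learners toward the best solution, away from the mean.
        teacher = X[np.argmin(f)]
        TF = rng.integers(1, 3)  # teaching factor, 1 or 2
        Xnew = np.clip(X + rng.random(X.shape) * (teacher - TF * X.mean(axis=0)), lo, hi)
        fnew = np.array([objective(x) for x in Xnew])
        improved = fnew < f
        X[improved], f[improved] = Xnew[improved], fnew[improved]
        # Learner phase: each learner interacts with a random peer.
        for i in range(pop_size):
            j = rng.integers(pop_size)
            if j == i:
                continue
            direction = X[i] - X[j] if f[i] < f[j] else X[j] - X[i]
            cand = np.clip(X[i] + rng.random(lo.size) * direction, lo, hi)
            fc = objective(cand)
            if fc < f[i]:
                X[i], f[i] = cand, fc
    best = np.argmin(f)
    return X[best], f[best]

if __name__ == "__main__":
    # Toy objective standing in for "validation error of the tuned ResNet".
    sphere = lambda x: float(np.sum(x**2))
    x_best, f_best = tlbo(sphere, bounds=([-5, -5, -5], [5, 5, 5]))
    print(x_best, f_best)
```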