Function Approximation Based on a Network with Kernel Functions of Bounds and Locality: An Approach of Non-Parametric Estimation

  • Kil, Rhee-M. (Research Department, ETRI)
  • Published: 1993.10.31

Abstract

This paper presents function approximation based on nonparametric estimation. The estimation model is a three-layered network composed of input, hidden, and output layers. The input and output layers have linear activation units, while the hidden layer has nonlinear activation units, or kernel functions, characterized by boundedness and locality. Using this type of network, a many-to-one function is synthesized over the domain of the input space by a number of kernel functions. In this network, both the necessary number of kernel functions and the parameters associated with them must be estimated. For this purpose, a new method of parameter estimation is considered in which a linear learning rule is applied between the hidden and output layers, while a nonlinear (piecewise-linear) learning rule is applied between the input and hidden layers. The linear learning rule updates the output weights between the hidden and output layers in the Linear Minimum Mean Square Error (LMMSE) sense in the space of kernel functions, while the nonlinear learning rule updates the parameters of the kernel functions based on the gradient of the actual network output with respect to the parameters (especially the shape) of the kernel functions. This approach to parameter adaptation provides near-optimal values of the kernel-function parameters in the sense of minimizing the mean square error. As a result, the suggested nonparametric estimation provides an efficient way of function approximation from the viewpoint of both the number of kernel functions and learning speed.
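
To make the two learning rules concrete, the following is a minimal NumPy sketch of the idea, assuming Gaussian kernels as the bounded, localized hidden units and a fixed number of kernels; the function names, the learning rate, and the batch gradient step are illustrative assumptions, not the paper's exact algorithm (which also estimates the number of kernel functions). The output weights are obtained as the least-squares (LMMSE) solution in the space of kernel activations, while the kernel centers and widths are adjusted along the gradient of the squared output error.

    import numpy as np

    def kernels(X, centers, widths):
        # X: (N, d), centers: (K, d), widths: (K,).
        # Returns the (N, K) matrix of bounded, localized (Gaussian) activations.
        d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
        return np.exp(-d2 / (2.0 * widths ** 2))

    def fit_output_weights(Phi, y):
        # Linear rule: output weights minimizing mean square error
        # over the span of the kernel functions (least-squares solution).
        w, *_ = np.linalg.lstsq(Phi, y, rcond=None)
        return w

    def update_kernel_params(X, y, centers, widths, w, lr=0.01):
        # Nonlinear rule: gradient of the squared output error with respect
        # to the kernel centers and widths, holding the output weights fixed.
        Phi = kernels(X, centers, widths)
        err = Phi @ w - y                              # (N,)
        diff = X[:, None, :] - centers[None, :, :]     # (N, K, d)
        g = (err[:, None] * Phi) * w[None, :]          # (N, K)
        # dPhi/dc_k = Phi * (x - c_k) / width_k^2
        grad_c = (g[:, :, None] * diff / widths[None, :, None] ** 2).sum(axis=0)
        # dPhi/dwidth_k = Phi * ||x - c_k||^2 / width_k^3
        d2 = (diff ** 2).sum(axis=2)
        grad_s = (g * d2 / widths[None, :] ** 3).sum(axis=0)
        centers -= lr * grad_c / len(X)
        widths -= lr * grad_s / len(X)
        return centers, widths

An illustrative use, alternating the two rules so that the output weights are re-solved in the LMMSE sense after each gradient step on the kernel parameters (target function, kernel count, and iteration count are arbitrary choices for the example):

    rng = np.random.default_rng(0)
    X = rng.uniform(0.0, 2.0 * np.pi, size=(200, 1))
    y = np.sin(X[:, 0])
    centers = rng.uniform(0.0, 2.0 * np.pi, size=(8, 1))
    widths = np.full(8, 0.5)
    for _ in range(200):
        w = fit_output_weights(kernels(X, centers, widths), y)
        centers, widths = update_kernel_params(X, y, centers, widths, w)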
