BEGINNER'S GUIDE TO NEURAL NETWORKS FOR THE MNIST DATASET USING MATLAB

  • Kim, Bitna (Department of Mathematics Kangwon National University) ;
  • Park, Young Ho (Department of Mathematics Kangwon National University)
  • Received : 2018.06.07
  • Accepted : 2018.06.20
  • Published : 2018.06.30

Abstract

The MNIST dataset is a database of images of handwritten digits, with each image labeled by an integer from 0 to 9. It is widely used to benchmark the performance of machine learning algorithms. Neural networks for MNIST are regarded as the starting point for studying machine learning algorithms. However, it is not easy to start the actual programming. In this expository article, we give step-by-step instructions for building neural networks for the MNIST dataset using MATLAB.
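As a taste of the kind of program the article builds, here is a minimal MATLAB sketch. It is only an illustration, not the article's own code: it assumes the UFLDL helper functions loadMNISTImages/loadMNISTLabels from mnistHelper.zip [22] and the patternnet function of MATLAB's Neural Network Toolbox [9].

```matlab
% Load MNIST with the UFLDL helper functions [22] (assumed on the path).
images  = loadMNISTImages('train-images-idx3-ubyte');  % 784 x 60000, values in [0,1]
labels  = loadMNISTLabels('train-labels-idx1-ubyte');  % 60000 x 1, digits 0-9
targets = full(ind2vec(labels' + 1));                  % one-hot encoding, 10 x 60000

net = patternnet(25);              % one hidden layer with 25 neurons (arbitrary choice)
net = train(net, images, targets); % train with the toolbox's default algorithm

pred = vec2ind(net(images)) - 1;   % predicted digits for the training set
fprintf('training accuracy: %.2f%%\n', 100 * mean(pred == labels'));
```

The hidden-layer size and the use of patternnet's defaults are illustrative assumptions; the article's networks may be configured differently.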

References

  1. G. Cybenko, Approximation by superpositions of a sigmoidal function, Mathematics of Control, Signals, and Systems 2 (4) (1989), 303-314. https://doi.org/10.1007/BF02551274
  2. M.T. Hagan, M.H. Beale, H.B. Demuth and O.D. Jesus, Neural Network Design, 2nd Ed.
  3. M.T. Hagan, Neural Network Design, free book from http://hagan.okstate.edu/NNDesign.pdf
  4. J. J. Hopfield, Neural networks and physical systems with emergent collective computational abilities, Proceedings of the National Academy of Sciences 79 (1982), 2554-2558. https://doi.org/10.1073/pnas.79.8.2554
  5. K. Hornik, Approximation capabilities of multilayer feedforward networks, Neural Networks 4 (2) (1991), 251-257. https://doi.org/10.1016/0893-6080(91)90009-T
  6. B. Kim, Handwritten digits classification by neural networks with small data, Master's thesis, Kangwon National University, 2018.
  7. P. Kim, MATLAB Deep Learning, Apress, 2017.
  8. T. Kohonen, Correlation matrix memories, IEEE Transactions on Computers 21 (1972), 353-359.
  9. MathWorks, MATLAB documentation, MATLAB version R2016a, 2016.
  10. W.S. McCulloch and W.H. Pitts, A logical calculus of the ideas immanent in nervous activity, Bull. Math. Biophysics 5 (1943), 115-133. https://doi.org/10.1007/BF02478259
  11. M. Minsky and S. Papert, Perceptrons: an introduction to computational geometry, M.I.T. Press, Cambridge, 1969
  12. MNIST, http://yann.lecun.com/exdb/mnist/
  13. A. Ng, Course on machine learning, Coursera, https://www.coursera.org/learn/machine-learning
  14. A. Ng, CS229 lecture notes, http://cs229.stanford.edu
  15. M. Nielsen, Neural networks and deep learning, http://neuralnetworksanddeeplearning.com
  16. UFLDL, Using the MNIST dataset, http://ufldl.stanford.edu/wiki/index.php/Using_the_MNIST_Dataset
  17. T. Rashid, Make Your Own Neural Network, CreateSpace, 2016.
  18. F. Rosenblatt, The perceptron: A probabilistic model for information storage and organization in the brain, Psychological Review 65 (1958), 386-408. https://doi.org/10.1037/h0042519
  19. D.E. Rumelhart and J.L. McClelland, eds., Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1, MIT Press, Cambridge, MA, 1986.
  20. J.R. Shewchuk, An introduction to the conjugate gradient method without the agonizing pain, Technical report, Carnegie Mellon University, 1994.
  21. UFLDL tutorial, Unsupervised feature learning and deep learning, http://deeplearning.stanford.edu/wiki/index.php/UFLDL_Tutorial
  22. http://ufldl.stanford.edu/wiki/resources/mnistHelper.zip
  23. Wikipedia, https://en.wikipedia.org/