http://dx.doi.org/10.11568/kjm.2018.26.2.337

BEGINNER'S GUIDE TO NEURAL NETWORKS FOR THE MNIST DATASET USING MATLAB  

Kim, Bitna (Department of Mathematics, Kangwon National University)
Park, Young Ho (Department of Mathematics, Kangwon National University)
Publication Information
Korean Journal of Mathematics / v.26, no.2, 2018, pp. 337-348
Abstract
The MNIST dataset is a database of handwritten digit images, each labeled with an integer from 0 to 9. It is used to benchmark the performance of machine learning algorithms, and neural networks for MNIST are regarded as the starting point for studying machine learning algorithms. However, it is not easy to start the actual programming. In this expository article, we give step-by-step instructions for building neural networks for the MNIST dataset using MATLAB.
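As a quick preview of the kind of workflow the article walks through, the following is a minimal MATLAB sketch (not the code from the paper) that trains a one-hidden-layer pattern-recognition network on MNIST with the Neural Network Toolbox. It assumes the standard MNIST files from http://yann.lecun.com/exdb/mnist/ have been downloaded and that the loadMNISTImages/loadMNISTLabels helper functions from mnistHelper.zip (see the reference list) are on the MATLAB path; the hidden-layer size of 100 is an arbitrary choice.

% Minimal sketch, not the paper's code: a one-hidden-layer
% pattern-recognition network for MNIST using the Neural Network Toolbox.
Xtrain = loadMNISTImages('train-images-idx3-ubyte');   % 784 x 60000, pixels scaled to [0,1]
ytrain = loadMNISTLabels('train-labels-idx1-ubyte');   % 60000 x 1, digit labels 0..9
Xtest  = loadMNISTImages('t10k-images-idx3-ubyte');    % 784 x 10000
ytest  = loadMNISTLabels('t10k-labels-idx1-ubyte');    % 10000 x 1

Ttrain = full(ind2vec(ytrain' + 1));   % one-hot targets (10 x 60000); shift digits 0..9 to indices 1..10

net = patternnet(100);                 % one hidden layer with 100 neurons (arbitrary choice)
net = train(net, Xtrain, Ttrain);      % default training: scaled conjugate gradient (trainscg)

Ypred = vec2ind(net(Xtest)) - 1;       % winning output unit per column, shifted back to digits 0..9
fprintf('Test accuracy: %.2f%%\n', 100 * mean(Ypred(:) == ytest(:)));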
Keywords
Neural network; Machine learning; Pattern recognition; MNIST dataset; MATLAB
References
1 G. Cybenko, Approximation by superpositions of a sigmoidal function, Mathematics of Control, Signals, and Systems 2 (4) (1989), 303-314.
2 M.T. Hagan, M.H. Beale, H.B. Demuth and O. De Jesus, Neural Network Design, 2nd Ed.
3 M.T. Hagan, Neural Network Design, free book from http://hagan.okstate.edu/NNDesign.pdf
4 J.J. Hopfield, Neural networks and physical systems with emergent collective computational abilities, Proceedings of the National Academy of Sciences 79 (1982), 2554-2558.
5 K. Hornik, Approximation capabilities of multilayer feedforward networks, Neural Networks 4 (2) (1991), 251-257.
6 B. Kim, Handwritten digits classification by neural networks with small data, Master's thesis, Kangwon National University, 2018.
7 P. Kim, MATLAB Deep Learning, Apress, 2017.
8 T. Kohonen, Correlation matrix memories, IEEE Transactions on Computers 21 (1972), 353-359.
9 W.S. McCulloch and W.H. Pitts, A logical calculus of the ideas immanent in nervous activity, Bull. Math. Biophysics 5 (1943), 115-133.
10 M. Minsky and S. Papert, Perceptrons: An Introduction to Computational Geometry, M.I.T. Press, Cambridge, 1969.
11 MNIST, http://yann.lecun.com/exdb/mnist/
12 A. Ng, Course on machine learning, Coursera, https://www.coursera.org/learn/machine-learning
13 A. Ng, CS229 lecture notes, http://cs229.stanford.edu
14 M. Nielsen, Neural networks and deep learning, http://neuralnetworksanddeeplearning.com
15 UFLDL, Using the MNIST dataset, http://ufldl.stanford.edu/wiki/index.php/Using_the_MNIST_Dataset
16 T. Rashid, Make Your Own Neural Network, CreateSpace, 2016.
17 F. Rosenblatt, The perceptron: A probabilistic model for information storage and organization in the brain, Psychological Review 65 (1958), 386-408.
18 MathWorks, MATLAB documentation, MATLAB version R2016a, 2016.
19 D.E. Rumelhart and J.L. McClelland, eds., Parallel Distributed Processing: Explorations in the Microstructure of Cognition, Vol. 1, MIT Press, Cambridge, MA, 1986.
20 J.R. Shewchuk, An introduction to the conjugate gradient method without the agonizing pain, Technical report, Carnegie Mellon University, 1994.
21 UFLDL tutorial, Unsupervised feature learning and deep learning, http://deeplearning.stanford.edu/wiki/index.php/UFLDL_Tutorial
22 http://ufldl.stanford.edu/wiki/resources/mnistHelper.zip
23 Wikipedia, https://en.wikipedia.org/