http://dx.doi.org/10.11568/kjm.2022.30.1.131

GRADIENTS IN A DEEP NEURAL NETWORK AND THEIR PYTHON IMPLEMENTATIONS  

Park, Young Ho (Department of Mathematics, Kangwon National University)
Publication Information
Korean Journal of Mathematics, vol. 30, no. 1, 2022, pp. 131-146
Abstract
This is an expository article about gradients in deep neural networks. It is hard to find a source where the gradients arising in a deep neural network are treated in detail in a systematic and mathematical way. We review and compute the gradients and Jacobians, derive the formulas for the gradients that appear in backpropagation, and implement them in vectorized form in Python.
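The vectorized gradient formulas the abstract refers to are not reproduced on this page. As an illustration only, the sketch below shows the standard vectorized backpropagation step for a single dense layer with a sigmoid activation, Z = WX + b and A = sigmoid(Z), where the columns of X are training examples; the function names and layer setup are assumptions, not the article's actual code.

```python
import numpy as np

def sigmoid(z):
    # Elementwise logistic function.
    return 1.0 / (1.0 + np.exp(-z))

def dense_backward(W, b, X, dA):
    """Vectorized backprop through one dense sigmoid layer (illustrative sketch).

    W: (n_out, n_in) weights, b: (n_out, 1) bias,
    X: (n_in, m) inputs (one example per column),
    dA: (n_out, m) upstream gradient dL/dA.
    Returns (dW, db, dX) with the same shapes as (W, b, X).
    """
    Z = W @ X + b                      # forward pre-activation, shape (n_out, m)
    A = sigmoid(Z)
    dZ = dA * A * (1.0 - A)            # chain rule through sigmoid', elementwise
    dW = dZ @ X.T                      # sum of per-example outer products
    db = dZ.sum(axis=1, keepdims=True) # bias gradient: sum over examples
    dX = W.T @ dZ                      # gradient passed to the previous layer
    return dW, db, dX
```

With the loss convention dA = ∂L/∂A taken exactly (no extra 1/m factor), each returned gradient can be verified against a finite-difference approximation, which is the kind of check the author's supplementary note on verifying gradients performs with PyTorch.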
Keywords
gradients; deep neural networks; backpropagations; machine learning;