Neural network analysis using neuralnet in R


  • Baik, Jaiwook (Department of Statistics·Data Science, Korea National Open University)
  • Received : 2021.01.04
  • Accepted : 2021.01.22
  • Published : 2021.01.31


We investigate multi-layer perceptrons and supervised learning algorithms, and examine how to model functional relationships between covariates and response variables using the neuralnet package. The algorithm applied in this paper continually adjusts the weights, the parameters that minimize an error function based on the comparison of observed and predicted values of the response variable. In the neuralnet package, the activation and error functions can be chosen to suit the situation at hand, while the remaining parameters can be left at their default values. Applying the neuralnet package to the infertility data, we find that, among the four independent variables, age has little influence on infertility. The weights of the fitted network range from -751.6 to 7.25: the intercepts of the first hidden layer are -92.6 and 7.25, and the weights on the covariates age, parity, induced, and spontaneous leading to the first hidden neuron are 3.17, -5.20, -36.82, and -751.6, respectively.

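The analysis described above can be sketched as follows. This is a minimal illustration, assuming R's built-in infert data set (with the binary response case and covariates age, parity, induced, and spontaneous) and two hidden neurons, consistent with the two first-layer intercepts reported; the paper's exact settings may differ, and the fitted weights vary with the random starting values.

```r
library(neuralnet)

set.seed(1)  # training starts from random weights, so fix the seed

# Binary response: cross-entropy error and logistic output
# (linear.output = FALSE); remaining arguments are left at defaults.
nn <- neuralnet(case ~ age + parity + induced + spontaneous,
                data = infert, hidden = 2,
                err.fct = "ce", linear.output = FALSE)

nn$result.matrix   # fitted weights and intercepts, plus error and steps
plot(nn)           # network diagram with the weights on the edges
```

The intercepts of the first hidden layer and the covariate-to-hidden-neuron weights quoted in the abstract are read off rows of `nn$result.matrix`.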


