A Study on the Development of Teaching-Learning Materials for Gradient Descent Method in College AI Mathematics Classes

  • Lee, Sang-Gu (Department of Mathematics, Sungkyunkwan University) ;
  • Nam, Yun (Institute of Basic Science, Sungkyunkwan University) ;
  • Lee, Jae Hwa (Research Institute of Basic Sciences, Sungkyunkwan University)
  • Received: 2023.08.08
  • Reviewed: 2023.09.26
  • Published: 2023.09.30

Abstract

In this paper, we present new teaching and learning materials on the gradient descent method, which is widely used in artificial intelligence algorithms, developed so that it can serve as an AI application example in college mathematics courses. The materials explain the method in detail at the level of college calculus, and the accompanying Python-based SageMath code helps students solve minimization problems easily, even for complicated functions. In connection with real AI applications, we also present a worked example that solves the least squares problem arising in linear regression using gradient descent. This study can be helpful to instructors who teach not only college calculus but also advanced mathematics subjects such as engineering mathematics, numerical analysis, and applied mathematics.
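As a rough illustration of the least-squares example the abstract describes, the following plain-Python sketch (an assumption of this summary, not the paper's SageMath code) fits a line y ≈ w·x + b by stepping opposite the gradient of the mean-squared-error loss:

```python
# Gradient descent on the least-squares problem for simple linear regression.
# Illustrative sketch only: fit y ≈ w*x + b by minimizing the mean of (w*x + b - y)^2.

def gradient_descent(xs, ys, lr=0.01, steps=5000):
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        # Partial derivatives of (1/n) * sum((w*x + b - y)^2) with respect to w and b
        grad_w = (2 / n) * sum((w * x + b - y) * x for x, y in zip(xs, ys))
        grad_b = (2 / n) * sum((w * x + b - y) for x, y in zip(xs, ys))
        # Move each parameter a small step against its gradient
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b

# Data sampled exactly from y = 2x + 1; the iteration should recover w ≈ 2, b ≈ 1.
xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [1.0, 3.0, 5.0, 7.0, 9.0]
w, b = gradient_descent(xs, ys)
```

Because this loss is a convex quadratic in (w, b), gradient descent with a sufficiently small learning rate converges to the unique least-squares solution.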

Keywords

Acknowledgments

This work was supported by the National Research Foundation of Korea (NRF) grant funded by the Korea government (Ministry of Science and ICT) (No. 2021R1F1A1046714).

References

  1. Lee, S.-G., & Lee, J.H. (2019). [BigBook] Basic mathematics for artificial intelligence. Kyobo Book Centre. http://matrix.skku.ac.kr/math4ai/Math4AI.pdf
  2. Lee, S.-G., Lee, J.H., Yoo, J.Y. & Ham, Y. (2022). Multivariable calculus & coding. Kyung Moon Sa. https://buk.io/@kc7895
  3. Lee, S.-G., Lee, J.H. & Ham, Y. (2020a). Artificial intelligence and college mathematics education. Communications of Mathematical Education, 34(1), 1-15.
  4. Lee, S.-G., Lee, J.H., Ham, Y. & Park, K.-E. (2020b). Introductory mathematics for artificial intelligence. Kyung Moon Sa. http://matrix.skku.ac.kr/math4ai-intro/
  5. Choi, Y.-S. (2014). [BigBook] Understanding statistics with R. Kyobo Book Centre.
  6. Choi, W. (2022). Elementary statistics (3rd edition). Kyung Moon Sa.
  7. Bottou, L., Curtis, F.E. & Nocedal, J. (2018). Optimization methods for large-scale machine learning. SIAM Review, 60(2), 223-311.
  8. Boyd, S. & Vandenberghe, L. (2004). Convex optimization. Cambridge University Press.
  9. Burden, R.L. & Faires, J.D. (2010). Numerical Analysis (9th edition). Cengage Learning.
  10. Cauchy, A. (1847). Méthode générale pour la résolution des systèmes d'équations simultanées. C. R. Acad. Sci. Paris, 25, 536-538.
  11. Chapra, S. & Canale, R. (2020). Numerical methods for engineers (8th edition). McGraw-Hill Education.
  12. Dennis, J.E. Jr. & Schnabel, R.B. (1996). Numerical methods for unconstrained optimization and nonlinear equations. SIAM.
  13. Fletcher, R. (1987). Practical methods of optimization (2nd edition). John Wiley and Sons.
  14. Goodfellow, I., Bengio, Y. & Courville, A. (2016). Deep learning. MIT Press. http://www.deeplearningbook.org
  15. Lemarechal, C. (2012). Cauchy and the gradient method. Documenta Mathematica, Extra Vol. Optimization Stories, 251-254.
  16. Nocedal, J. & Wright, S. (2006). Numerical optimization (2nd edition). Springer.
  17. Sun, W. & Yuan, Y.-X. (2010). Optimization theory and methods: Nonlinear programming. Springer.
  18. Wright, S.J. & Recht, B. (2022). Optimization for data analysis. Cambridge University Press.