http://dx.doi.org/10.7471/ikeee.2021.25.3.412

Efficient Path Selection in Continuous Learning Environment  

Park, Seong-Hyeon (Dept. of Embedded Systems Engineering, Incheon National University)
Kang, Seok-Hoon (Dept. of Embedded Systems Engineering, Incheon National University)
Publication Information
Journal of IKEEE / v.25, no.3, 2021, pp. 412-419
Abstract
In this paper, we propose a performance improvement of the LwF method using efficient path selection in a Continuous Learning environment. We compare the performance and structure of the proposed method with those of conventional LwF. For comparison, we experiment with MNIST, EMNIST, Fashion MNIST, and CIFAR10 data configured with different levels of complexity. Experiments show up to a 20% improvement in accuracy for each task, which mitigates the Catastrophic Forgetting phenomenon in Continuous Learning environments.
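Since the proposed method builds on LwF (Learning without Forgetting, reference 3), a minimal sketch of the conventional LwF objective used as the baseline may help readers place the comparison; this is an illustrative PyTorch-style implementation under common assumptions, not the authors' code, and the function name, the weight lambda_old, and the temperature T are chosen here only for illustration.

```python
import torch
import torch.nn.functional as F

def lwf_loss(new_task_logits, new_task_labels,
             old_task_logits, recorded_old_logits,
             lambda_old=1.0, T=2.0):
    """Conventional LwF objective (illustrative sketch):
    new-task cross-entropy plus a knowledge-distillation term that keeps
    the network's old-task outputs close to the responses recorded by the
    original (frozen) network on the new-task data before training."""
    # Standard cross-entropy loss on the new task.
    ce = F.cross_entropy(new_task_logits, new_task_labels)

    # Temperature-softened distillation on the old task's output head.
    old_targets = F.softmax(recorded_old_logits / T, dim=1)
    new_log_probs = F.log_softmax(old_task_logits / T, dim=1)
    kd = F.kl_div(new_log_probs, old_targets, reduction="batchmean") * (T * T)

    return ce + lambda_old * kd
```

The distillation term follows the knowledge-distillation formulation of Hinton et al. (reference 9); the T * T factor is the usual scaling that keeps gradient magnitudes comparable across temperatures.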
Keywords
Neural network; Continuous Learning; Catastrophic Forgetting; Deep Learning; Regularization;
References
1 F. Zenke, B. Poole, and S. Ganguli, "Continual learning through synaptic intelligence," In: Proceedings of the 34th International Conference on Machine Learning-Volume 70, JMLR.org, pp.3987-3995, 2017.
2 K. McRae and PA. Hetherington, "Catastrophic interference is eliminated in pretrained networks," In: Proceedings of the 15th Annual Conference of the Cognitive Science Society, pp.723-728, 1993.
3 Z. Li and D. Hoiem, "Learning without forgetting," IEEE Transactions on Pattern Analysis and Machine Intelligence, Vol.40, No.12, pp.2935-2947, 2017. DOI: 10.1109/TPAMI.2017.2773081
4 YC. Hsu, et al., "Re-evaluating continual learning scenarios: A categorization and case for strong baselines," arXiv preprint arXiv:1810.12488, 2018.
5 IJ. Goodfellow, et al., "An empirical investigation of catastrophic forgetting in gradient-based neural networks," arXiv preprint arXiv:1312.6211, 2013.
6 GI. Parisi, et al., "Continual lifelong learning with neural networks: A review," Neural Networks, Vol.113, pp.54-71, 2019. DOI: 10.1016/j.neunet.2019.01.012
7 RM. French, "Catastrophic forgetting in connectionist networks," Trends in Cognitive Sciences, Vol.3, No.4, pp.128-135, 1999. DOI: 10.1016/S1364-6613(99)01294-2
8 J. Kirkpatrick, et al., "Overcoming catastrophic forgetting in neural networks," Proceedings of the National Academy of Sciences, Vol.114, No.13, pp.3521-3526, 2017.
9 G. Hinton, O. Vinyals, and J. Dean, "Distilling the knowledge in a neural network," in NIPS Workshop, 2014.
10 SH. Park and SH. Kang, "Continual Learning using Data Similarity," Journal of IKEEE, Vol.24, No.2, pp.514-522, 2020. DOI: 10.7471/ikeee.2020.24.2.514