Acknowledgement
This research was conducted as part of the research operating expense support program of the Electronics and Telecommunications Research Institute (ETRI) [21ZS1100, Research on Core Technologies for Self-Improving Integrated Artificial Intelligence].
References
- D. Hassabis et al., "Neuroscience-inspired artificial intelligence," Neuron, vol. 95, no. 2, July 2017, pp. 245-258. https://doi.org/10.1016/j.neuron.2017.06.011
- N.C. Thompson et al., "The computational limits of deep learning," July 2020, arXiv: 2007.05558.
- M.M. Waldrop, "What are the limits of deep learning?," PNAS, vol. 116, no. 4, Jan. 2019, pp. 1074-1077. https://doi.org/10.1073/pnas.1821594116
- D. Heaven, "Deep trouble for deep learning," Nature, vol. 574, no. 7777, Oct. 2019, pp. 163-166. https://doi.org/10.1038/d41586-019-03013-5
- Wikimedia Commons: Components of neuron, https://commons.wikimedia.org/wiki/File:Components_of_neuron.jpg.
- Wikimedia Commons: Connectome extraction procedure, https://commons.wikimedia.org/wiki/File:Connectome_extraction_procedure.jpg.
- Wikimedia Commons: The Human Connectome, https://commons.wikimedia.org/wiki/File:The_Human_Connectome.png.
- https://www.flickr.com/photos/nihgov/46551667272/.
- K. Lucas, "The 'all or none' contraction of the amphibian skeletal muscle fibre," J. Physiol., vol. 38, no. 2-3, 1909, pp. 113-133. https://doi.org/10.1113/jphysiol.1909.sp001298
- W.S. McCulloch et al., "A logical calculus of the ideas immanent in nervous activity," Bull. Math. Biophys., vol. 5, no. 4, 1943, pp. 115-133. https://doi.org/10.1007/BF02478259
- D.O. Hebb, The organization of behavior: A neuropsychological theory, Psychology Press, London, UK, 2005.
- E.L. Bienenstock et al., "Theory for the development of neuron selectivity: Orientation specificity and binocular interaction in visual cortex," J. Neurosci., vol. 2, no. 1, 1982, pp. 32-48. https://doi.org/10.1523/JNEUROSCI.02-01-00032.1982
- E. Oja, "Simplified neuron model as a principal component analyzer," J. Math. Biol., vol. 15, no. 3, 1982, pp. 267-273. https://doi.org/10.1007/BF00275687
- F. Rosenblatt, "The perceptron: A probabilistic model for information storage and organization in the brain," Psychol. Rev., vol. 65, no. 6, 1958, pp. 386-408. https://doi.org/10.1037/h0042519
- F. Rosenblatt, "Principles of neurodynamics. Perceptrons and the theory of brain mechanisms," Cornell Aeronautical Lab, Buffalo NY, USA, 1961.
- M. Minsky and S.A. Papert, Perceptrons: An introduction to computational geometry, MIT Press, London, UK, 2017.
- J.J. Hopfield, "Neural networks and physical systems with emergent collective computational abilities," PNAS, vol. 79, no. 8, 1982, pp. 2554-2558. https://doi.org/10.1073/pnas.79.8.2554
- G.E. Hinton and T.J. Sejnowski, "Learning and relearning in Boltzmann machines," Parallel Distrib. Process.: Explor. Microstruct. Cogn., vol. 2, no. 1, 1986, pp. 282-317.
- G.E. Hinton, S. Osindero, and Y.W. Teh, "A fast learning algorithm for deep belief nets," Neural Comput., vol. 18, no. 7, July 2006, pp. 1527-1554. https://doi.org/10.1162/neco.2006.18.7.1527
- R.M. French, "Catastrophic forgetting in connectionist networks," Trends Cogn. Sci., vol. 3, no. 4, 1999, pp. 128-135. https://doi.org/10.1016/S1364-6613(99)01294-2
- T. Hospedales et al., "Meta-learning in neural networks: a survey," Nov. 2020, arXiv: 2004.05439.
- S. Hochreiter and J. Schmidhuber, "Long short-term memory," Neural Comput., vol. 9, no. 8, 1997, pp. 1735-1780. https://doi.org/10.1162/neco.1997.9.8.1735
- A. Vaswani et al., "Attention is all you need," 2017, arXiv: 1706.03762.
- T.P. Lillicrap et al., "Backpropagation and the brain," Nat. Rev. Neurosci., vol. 21, Apr. 2020, pp. 335-346. https://doi.org/10.1038/s41583-020-0277-3
- T.P. Lillicrap et al., "Random synaptic feedback weights support error backpropagation for deep learning," Nat. Commun., vol. 7, no. 1, Dec. 2016, pp. 1-10.
- Y. Bengio et al., "Towards biologically plausible deep learning," Aug. 2016, arXiv: 1502.04156.
- J.C.R. Whittington et al., "Theories of error back-propagation in the brain," Trends Cogn. Sci., vol. 23, no. 3, Mar. 2019, pp. 235-250. https://doi.org/10.1016/j.tics.2018.12.005
- A. Tavanaei et al., "Deep learning in spiking neural networks," Neural Netw., vol. 111, Mar. 2019, pp. 47-63. https://doi.org/10.1016/j.neunet.2018.12.002
- W. Xiao et al., "Biologically-plausible learning algorithms can scale to large datasets," Dec. 2018, arXiv: 1811.03567.
- M. Akrout et al., "Deep learning without weight transport," Jan. 2020, arXiv: 1904.05391.
- C. Baldassi et al., "Learning may need only a few bits of synaptic precision," Phys. Rev. E, vol. 93, no. 5, May 2016.
- W. Wen et al., "TernGrad: Ternary gradients to reduce communication in distributed deep learning," Dec. 2017, arXiv: 1705.07878.
- M. Rastegari et al., "XNOR-Net: ImageNet classification using binary convolutional neural networks," in Computer Vision-ECCV 2016, vol. 9908, Springer, Cham, Switzerland, 2016, pp. 525-542.
- Y. Yang et al., "Training high-performance and large-scale deep neural networks with full 8-bit integers," Neural Netw., vol. 125, May 2020, pp. 70-82. https://doi.org/10.1016/j.neunet.2019.12.027
- M. Lechner et al., "Neural circuit policies enabling auditable autonomy," Nat. Mach. Intell., vol. 2, Oct. 2020, pp. 642-652. https://doi.org/10.1038/s42256-020-00237-3
- E.D. Adrian et al., "The impulses produced by sensory nerve endings," J. Physiol., vol. 61, no. 4, 1926, pp. 465-483. https://doi.org/10.1113/jphysiol.1926.sp002308
- G.Q. Bi et al., "Synaptic modifications in cultured hippocampal neurons: Dependence on spike timing, synaptic strength, and postsynaptic cell type," J. Neurosci., vol. 18, no. 24, 1998, pp. 10464-10472. https://doi.org/10.1523/jneurosci.18-24-10464.1998
- W. Guo et al., "Neural coding in spiking neural networks: A comparative study for robust neuromorphic systems," Front. Neurosci., vol. 15, 2021.
- S.J. Thorpe, "Spike arrival times: A highly efficient coding scheme for neural networks," Parallel Process. Neural Syst., 1990, pp. 91-94.
- D.E. Feldman, "The spike-timing dependence of plasticity," Neuron, vol. 75, no. 4, 2012, pp. 556-571. https://doi.org/10.1016/j.neuron.2012.08.001
- T. Masquelier et al., "Spike timing dependent plasticity finds the start of repeating patterns in continuous spike trains," PLoS ONE, vol. 3, no. 1, 2008, e1377. https://doi.org/10.1371/journal.pone.0001377
- S.M. Bohte et al., "SpikeProp: Backpropagation for networks of spiking neurons," in Proc. ESANN, Bruges, Belgium, Apr. 2000, pp. 419-424.
- A. Kugele et al., "Efficient processing of spatio-temporal data streams with spiking neural networks," Front. Neurosci., vol. 14, 2020.
- D.S. Bassett et al., "Network neuroscience," Nat. Neurosci., Mar. 2017.
- A. Fornito and E.T. Bullmore, "Connectomic intermediate phenotypes for psychiatric disorders," Front. Psychiatry, Apr. 2012.
- M. Lukosevicius, H. Jaeger, and B. Schrauwen, "Reservoir computing trends," KI - Künstliche Intelligenz, vol. 26, no. 4, May 2012, pp. 365-371.
- H. Jaeger, "The "echo state" approach to analysing and training recurrent neural networks-with an erratum note," Bonn, GMD Tech. Rep. vol. 148, Jan. 2010.
- L. Grigoryeva et al., "Echo state networks are universal," Neural Netw., vol. 108, Dec. 2018, pp. 495-508. https://doi.org/10.1016/j.neunet.2018.08.025
- H. Jaeger, W. Maass, and J. Principe, "Special issue on echo state networks and liquid state machines," Neural Netw., vol. 20, no. 3, Apr. 2007, pp. 287-289. https://doi.org/10.1016/j.neunet.2007.04.001
- O. Sporns, "The human connectome: Origins and challenges," NeuroImage, vol. 80, Oct. 2013. pp. 53-61. https://doi.org/10.1016/j.neuroimage.2013.03.023
- S.W. Oh et al., "A mesoscale connectome of the mouse brain," Nature, vol. 508, no. 7495, Apr. 2014.
- D. Meunier et al., "Hierarchical modularity in human brain functional networks," Front. Neuroinform., vol. 3, 2009.
- N.T. Markov et al., "A weighted and directed interareal connectivity matrix for macaque cerebral cortex," Cereb. Cortex, vol. 24, 2014.
- R.F. Betzel and D.S. Bassett, "Specificity and robustness of long-distance connections in weighted, interareal connectomes," PNAS, vol. 115, no. 2, May 2018.
- F. Damicelli et al., "Brain connectivity meets reservoir computing," bioRxiv preprint, Jan. 2021.
- L.E. Suarez et al., "Learning function from structure in neuromorphic networks," bioRxiv preprint, Nov. 2020, doi: 10.1101/2020.11.10.350876.
- W. Luo and J.-S. Guan, "Do brain oscillations orchestrate memory?," Brain Sci. Adv., vol. 4, no. 1, Oct. 2018, pp. 16-33. https://doi.org/10.26599/bsa.2018.9050008
- R. Guevara Erra et al., "Neural synchronization from the perspective of non-linear dynamics," Front. Comput. Neurosci., Oct. 2017.
- P. Fries, "Rhythms for cognition: Communication through coherence," Neuron, vol. 88, no. 1, Oct. 2015.
- C. Duclos et al., "Brain network motifs are markers of loss and recovery of consciousness," Sci. Rep., vol. 11, Mar. 2020.
- O. Sporns et al., "Motifs in brain networks," PLoS Biol., vol. 2, Nov. 2004.
- D.S. Bassett et al., "Dynamic reconfiguration of human brain networks during learning," PNAS, vol. 108, no. 18, May 2011, pp. 7641-7646.
- M. Pedersen et al., "Multilayer network switching rate predicts brain performance," PNAS, vol. 115, Dec. 2018.
- R.F. Betzel et al., "Generative models for network neuroscience: Prospects and promise," J. R. Soc. Interface, vol. 14, no. 136, Jun. 2017.