Acknowledgement
This work was supported in part by the Research Grant of Kwangwoon University in 2020 and in part by the Basic Science Research Program through the National Research Foundation of Korea (NRF) funded by the Ministry of Education under Grant NRF-2018R1D1A1B07041783.
References
- F. Simonetta, S. Ntalampiras, and F. Avanzini, Multimodal music information processing and retrieval: Survey and future challenges, in Proc. IEEE Int. Workshop Multilayer Music Represent. Process. (Milan, Italy), Jan. 2019, pp. 10-18.
- B. McFee et al., Open-source practices for music signal processing research: Recommendations for transparent, sustainable, and reproducible audio research, IEEE Signal Process. Mag. 36 (2019), no. 1, 128-137. https://doi.org/10.1109/MSP.2018.2875349
- M. A. Casey et al., Content-based music information retrieval: Current directions and future challenges, Proc. IEEE 96 (2008), no. 4, 668-696. https://doi.org/10.1109/JPROC.2008.916370
- T. Greer et al., A multimodal view into music's effect on human neural, physiological, and emotional experience, in Proc. ACM Int. Conf. Multimed. (Nice, France), Oct. 2019, pp. 167-175.
- S. Stober, Toward studying music cognition with information retrieval techniques: Lessons learned from the OpenMIIR initiative, Front. Psychol. 8 (2017), 1255. https://doi.org/10.3389/fpsyg.2017.01255
- T. Nguyen, A. Gibbings, and J. Grahn, Rhythm and beat perception, in Springer Handbook of Systematic Musicology, Springer, Berlin/Heidelberg, Germany, 2018, pp. 507-521.
- S. Ehrlich, C. Guan, and G. Cheng, A closed-loop brain-computer music interface for continuous affective interaction, in Proc. IEEE Int. Conf. Orange Technol. (Singapore, Singapore), Dec. 2017, pp. 176-179.
- D. T. Bishop, M. J. Wright, and C. I. Karageorghis, Tempo and intensity of pre-task music modulate neural activity during reactive task performance, Psychol. Music 42 (2014), no. 5, 714-727. https://doi.org/10.1177/0305735613490595
- S. Stober, T. Prätzlich, and M. Müller, Brain beats: Tempo extraction from EEG data, in Proc. Int. Soc. Music Inf. Retr. (New York City, NY, USA), Aug. 2016, pp. 276-282.
- N. Hurless et al., Music genre preference and tempo alter alpha and beta waves in human non-musicians, Prem. Undergrad. Neurosci. J. 22 (2013), no. 4, 1-11.
- I. Daly et al., Changes in music tempo entrain movement related brain activity, in Proc. IEEE Int. Conf. Eng. Med. Biol. Soc. (Chicago, IL, USA), Aug. 2014, pp. 4595-4598.
- S. Nozaradan et al., EEG frequency-tagging and input-output comparison in rhythm perception, Brain Topogr. 31 (2018), no. 2, 153-160. https://doi.org/10.1007/s10548-017-0605-8
- R. J. Vlek et al., Shared mechanisms in perception and imagery of auditory accents, Clin. Neurophysiol. 122 (2011), no. 8, 1526-1532. https://doi.org/10.1016/j.clinph.2011.01.042
- S. Stober, D. J. Cameron, and J. A. Grahn, Using convolutional neural networks to recognize rhythm stimuli from electroencephalography recordings, in Proc. Adv. Neural Inf. Process. Syst. (Montreal, Canada), Dec. 2014, pp. 1449-1457.
- J. X. Chen, D. M. Jiang, and Y. N. Zhang, A hierarchical bidirectional GRU model with attention for EEG-based emotion classification, IEEE Access 7 (2019), 118530-118540. https://doi.org/10.1109/ACCESS.2019.2936817
- E. S. Salama et al., EEG-based emotion recognition using 3D convolutional neural networks, Int. J. Adv. Comput. Sci. Appl. 9 (2018), no. 8, 329-337.
- C. Tan et al., Attention-based transfer learning for brain-computer interface, in Proc. IEEE Int. Conf. Acoust. Speech Signal Process. (Brighton, UK), May 2019, pp. 1154-1158.
- K. Cho et al., Learning phrase representations using RNN encoder-decoder for statistical machine translation, in Proc. Conf. Empir. Methods Natural Lang. Process. (Doha, Qatar), Oct. 2014, pp. 1724-1734.
- S. Hochreiter and J. Schmidhuber, Long short-term memory, Neural Comput. 9 (1997), no. 8, 1735-1780. https://doi.org/10.1162/neco.1997.9.8.1735
- A. Vaswani et al., Attention is all you need, in Proc. Adv. Neural Inf. Process. Syst. (Long Beach, CA, USA), Dec. 2017, pp. 5998-6008.
- S. Losorelli et al., NMED-T: A tempo-focused dataset of cortical and behavioral responses to naturalistic music, in Proc. Int. Soc. Music Inf. Retr. (Suzhou, China), Oct. 2017, pp. 339-346.
- S. Koelstra et al., DEAP: A database for emotion analysis; using physiological signals, IEEE Trans. Affect. Comput. 3 (2012), no. 1, 18-31. https://doi.org/10.1109/T-AFFC.2011.15
- Y.-H. Kwon, S. B. Shin, and S. D. Kim, Electroencephalography based fusion two-dimensional (2D)-convolution neural networks (CNN) model for emotion recognition system, Sensors 18 (2018), no. 5, 1383. https://doi.org/10.3390/s18051383