References
- L. Cavazos Quero, J. Iranzo Bartolome, S. Lee, E. Han, S. Kim & J. Cho. (2018). An Interactive Multimodal Guide to Improve Art Accessibility for Blind People. In Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility (pp. 346-348). DOI: 10.1145/3234695.3241033
- J. Iranzo Bartolome, L. Cavazos Quero, S. Kim, M. Y. Um & J. Cho. (2019, March). Exploring Art with a Voice Controlled Multimodal Guide for Blind People. In Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction (pp. 383-390). DOI: 10.1145/3294109.3300994
- J. D. Cho et al. (2019). Color Information Transfer Multi-modal Interface Concept Design for People with Visually Impairment to Appreciate Works of Art - Focused on the Case of "Blind-Touch", a Reproduction Art for Blind -. Design Works, 2(2), 44-58. DOI: 10.15187/dw.2019.10.2.2.44
- Multisensory Artworks Exhibition. (2019). BlindTouch (Multisensory Painting Platform for the Blind), Human ICT Convergence (Professor: Jun Dong Cho). Exhibition place: Siloam Center for the Blind, S-Gallery.
- D. B. Faustino, S. Gabriele, R. Ibrahim, A. L. Theus & A. Girouard. (2017, October). SensArt demo: A multisensory prototype for engaging with visual art. In Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces (pp. 462-465). DOI: 10.1145/3132272.3132290
- S. Wang. (2020). Museum as a Sensory Space: A Discussion of Communication Effect of Multi-Senses in Taizhou Museum. Sustainability, 12(7), 3061. DOI: 10.3390/su12073061
- R. Murray Schafer. (1977). The Soundscape: Our Sonic Environment and the Tuning of the World. Rochester, Vermont: Destiny Books.
- R. E. Cytowic. (2002). Synesthesia: A Union of the Senses. Cambridge, MA: MIT Press.
- Y. G. Jeon. (2004). A Study on the Sound Uses to Maximize Visual Images in Digital Media. Master's dissertation. Hansung University, Seoul.
- T. Baumgartner, M. Esslen & L. Jancke. (2006). From emotion perception to emotion experience: Emotions evoked by pictures and classical music. International Journal of Psychophysiology, 60(1), 34-43. DOI: 10.1016/j.ijpsycho.2005.04.007
- C. T. Vi, D. Ablart, E. Gatti, C. Velasco & M. Obrist. (2017). Not just seeing, but also feeling art: Mid-air haptic experiences integrated in a multisensory art exhibition. International Journal of Human-Computer Studies, 108, 1-14. DOI: 10.1016/j.ijhcs.2017.06.004
- Bunker de Lumières. (n.d.). https://www.bunkerdelumieres.com
- K. W. Guk. (2019). Examples of AI technology applications by industry sector. Weekly Technical Trends, 20, 15-27.
- M. Muller-Eberstein & N. van Noord. (2019). Translating Visual Art into Music. In Proceedings of the IEEE International Conference on Computer Vision Workshops. DOI: 10.1109/ICCVW.2019.00378
- Y. Kajihara, S. Ozono & N. Tokui. (2017). Imaginary Soundscape: Cross-Modal Approach to Generate Pseudo Sound Environments. NIPS Workshop.
- Y. Aytar, C. Vondrick & A. Torralba. (2016). Soundnet: Learning sound representations from unlabeled video. In Advances in neural information processing systems (pp. 892-900).
- A. Sharghi, J. S. Laurel & B. Gong. (2017). Query-focused video summarization: Dataset, evaluation, and a memory network based approach. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 4788-4797). DOI: 10.1109/CVPR.2017.229
- A. Howard et al. (2019). Searching for MobileNetV3. In Proceedings of the IEEE International Conference on Computer Vision (pp. 1314-1324).
- S. J. Shin. (1999). Classification of Adjectives. A Collection of Korean Language and Literature Studies at Sookmyung Women's University, 6, 19-40.
- H. W. Jung & K. Nah. (2007). A Study on the Meaning of Sensibility and Vocabulary System for Sensibility Evaluation. Journal of the Ergonomics Society of Korea, 26(3), 17-25. DOI: 10.5143/JESK.2007.26.3.01
- R. Baeza-Yates & B. Ribeiro-Neto. (1999). Modern Information Retrieval (Vol. 463). New York: ACM Press.