[1] S. J. Shin. (1999). Classification of Adjectives. A Collection of Korean Language and Literature Studies at Sookmyung Women's University, 6, 19-40.
[2] H. W. Jung & K. Nah. (2007). A Study on the Meaning of Sensibility and Vocabulary System for Sensibility Evaluation. Journal of the Ergonomics Society of Korea, 26(3), 17-25. DOI: 10.5143/JESK.2007.26.3.01
[3] R. Baeza-Yates & B. Ribeiro-Neto. (1999). Modern Information Retrieval (Vol. 463). New York: ACM Press.
[4] L. Cavazos Quero, J. Iranzo Bartolome, S. Lee, E. Han, S. Kim & J. Cho. (2018). An Interactive Multimodal Guide to Improve Art Accessibility for Blind People. In Proceedings of the 20th International ACM SIGACCESS Conference on Computers and Accessibility (pp. 346-348). DOI: 10.1145/3234695.3241033
[5] J. Iranzo Bartolome, L. Cavazos Quero, S. Kim, M. Y. Um & J. Cho. (2019, March). Exploring Art with a Voice Controlled Multimodal Guide for Blind People. In Proceedings of the Thirteenth International Conference on Tangible, Embedded, and Embodied Interaction (pp. 383-390). DOI: 10.1145/3294109.3300994
[6] J. D. Cho et al. (2019). Color Information Transfer Multi-modal Interface Concept Design for People with Visually Impairment to Appreciate Works of Art - Focused on the Case of "Blind-Touch", a Reproduction Art for Blind -. Design Works, 2(2), 44-58.
7 |
Multisensory Artworks exhibition (2019). Human ICT Convergence, (professor: Jun Dong Cho) BlindTouch (Multisensory Painting Platform for the Blind) Exhibition; Exhibition Place: Siloam Center for the Blind S-Gallery.
|
[8] D. B. Faustino, S. Gabriele, R. Ibrahim, A. L. Theus & A. Girouard. (2017, October). SensArt Demo: A Multisensory Prototype for Engaging with Visual Art. In Proceedings of the 2017 ACM International Conference on Interactive Surfaces and Spaces (pp. 462-465). DOI: 10.1145/3132272.3132290
[9] S. Wang. (2020). Museum as a Sensory Space: A Discussion of Communication Effect of Multi-Senses in Taizhou Museum. Sustainability, 12(7), 3061. DOI: 10.3390/su12073061
[10] R. Murray Schafer. (1977). The Soundscape: Our Sonic Environment and the Tuning of the World. Rochester, VT: Destiny Books.
[11] R. E. Cytowic. (2002). Synesthesia: A Union of the Senses. Cambridge, MA: MIT Press.
[12] https://www.bunkerdelumieres.com
[13] Y. G. Jeon. (2004). A Study on the Sound Uses to Maximize Visual Images in Digital Media. Master's thesis, Hansung University, Seoul.
[14] T. Baumgartner, M. Esslen & L. Jancke. (2006). From Emotion Perception to Emotion Experience: Emotions Evoked by Pictures and Classical Music. International Journal of Psychophysiology, 60(1), 34-43. DOI: 10.1016/j.ijpsycho.2005.04.007
[15] C. T. Vi, D. Ablart, E. Gatti, C. Velasco & M. Obrist. (2017). Not Just Seeing, but Also Feeling Art: Mid-Air Haptic Experiences Integrated in a Multisensory Art Exhibition. International Journal of Human-Computer Studies, 108, 1-14. DOI: 10.1016/j.ijhcs.2017.06.004
[16] K. W. Guk. (2019). Examples of Applications by AI Technology and Industry Sectors. Weekly Technical Trends, 20, 15-27.
[17] M. Müller-Eberstein & N. van Noord. (2019). Translating Visual Art into Music. In Proceedings of the IEEE International Conference on Computer Vision Workshops. DOI: 10.1109/ICCVW.2019.00378
[18] Y. Kajihara, S. Ozono & N. Tokui. (2017). Imaginary Soundscape: Cross-Modal Approach to Generate Pseudo Sound Environments. NIPS Workshop.
[19] Y. Aytar, C. Vondrick & A. Torralba. (2016). SoundNet: Learning Sound Representations from Unlabeled Video. In Advances in Neural Information Processing Systems (pp. 892-900).
[20] A. Sharghi, J. S. Laurel & B. Gong. (2017). Query-Focused Video Summarization: Dataset, Evaluation, and a Memory Network Based Approach. In Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition (pp. 4788-4797). DOI: 10.1109/CVPR.2017.229
[21] A. Howard et al. (2019). Searching for MobileNetV3. In Proceedings of the IEEE International Conference on Computer Vision (pp. 1314-1324).