1. Bentivoglio, A. R., Bressman, S. B., Cassetta, E., Carretta, D., Tonali, P., & Albanese, A. (1997). Analysis of blink rate patterns in normal subjects. Movement Disorders, 12(6), 1028-1034. DOI: 10.1002/mds.870120629
2. Betella, A., & Verschure, P. F. (2016). The affective slider: A digital self-assessment scale for the measurement of human emotions. PLOS ONE, 11(2), e0148037. DOI: 10.1371/journal.pone.0148037
3. Breazeal, C. L. (2002). Designing sociable robots. MIT Press. ISBN: 978-0-262-02510-2
4. Hyeon, Y., Pan, Y. H., & Yoo, H. S. (2019). Analysis of users' emotions on lighting effect of artificial intelligence devices. Korean Society for Emotion and Sensibility, 22(3), 35-46. DOI: 10.14695/kjsos.2018.22.3.35
5. Russell, J. A., & Bullock, M. (1986). Fuzzy concepts and the perception of emotion in facial expressions. Social Cognition, 4(3), 309-341. DOI: 10.1521/soco.1986.4.3.309
6. Baraka, K., Paiva, A., & Veloso, M. (2015). Expressive lights for revealing mobile service robot state. Advances in Intelligent Systems and Computing, 107-119.
7. Cañigueral, R., & Hamilton, A. F. (2019). The role of eye gaze during natural social interactions in typical and autistic people. Frontiers in Psychology, 10, 560. DOI: 10.3389/fpsyg.2019.00560
8. Ekman, P., Friesen, W. V., & Ellsworth, P. H. (1972). What emotion categories can observers judge from facial behavior? Emotion in the Human Face, 57-65. DOI: 10.1016/b978-0-08-016643-8.50024-0
9. Funakoshi, K., Kobayashi, K., Nakano, M., Yamada, S., Kitamura, Y., & Tsujino, H. (2008). Smoothing human-robot speech interactions by using a blinking-light as subtle expression. Proceedings of the 10th International Conference on Multimodal Interfaces (ICMI '08). DOI: 10.1145/1452392.1452452
10. Hömke, P., Holler, J., & Levinson, S. C. (2018). Eye blinks are perceived as communicative signals in human face-to-face interaction. PLOS ONE, 13(12), e0208030. DOI: 10.1371/journal.pone.0208030
11. Terada, K., Yamauchi, A., & Ito, A. (2012). Artificial emotion expression for a robot by dynamic color change. 2012 IEEE RO-MAN: The 21st IEEE International Symposium on Robot and Human Interactive Communication, Paris, France. DOI: 10.1109/ROMAN.2012.6343772
12. Thomas, F., & Johnston, O. (1981). The illusion of life: Disney animation (1st ed.). Abbeville Press.
13. Löffler, D., Schmidt, N., & Tscharn, R. (2018). Multimodal expression of artificial emotion in social robots using color, motion and sound. Proceedings of the 2018 ACM/IEEE International Conference on Human-Robot Interaction (HRI '18). DOI: 10.1145/3171221.3171261
14. Wilms, L., & Oberfeld, D. (2018). Color and emotion: Effects of hue, saturation, and brightness. Psychological Research, 82(5), 896-914. DOI: 10.1007/s00426-017-0880-8
15. Johnson, S. C., & Ma, E. (2005). The role of agent behavior in mentalistic attributions by observers. ROMAN 2005: IEEE International Workshop on Robot and Human Interactive Communication, Nashville, TN, USA (pp. 723-728). DOI: 10.1109/ROMAN.2005.1513865
16. Kishi, T., Otani, T., Endo, N., Kryczka, P., Hashimoto, K., Nakata, K., & Takanishi, A. (2012). Development of expressive robotic head for bipedal humanoid robot. 2012 IEEE/RSJ International Conference on Intelligent Robots and Systems, Vilamoura-Algarve, Portugal (pp. 4584-4589). DOI: 10.1109/IROS.2012.6386050
17. Johnson, S. C., Ok, S. J., & Luo, Y. (2007). The attribution of attention: 9-month-olds' interpretation of gaze as goal-directed action. Developmental Science, 10(5), 530-537. DOI: 10.1111/j.1467-7687.2007.00606.x
18. Kim, J., Kim, Y., & Jo, E. (2020). Effect of color and emotional context on processing emotional information of biological motion. Korean Society for Emotion and Sensibility, 23(3), 63-78. DOI: 10.14695/kjsos.2020.23.3.63
19. Kuhn, G., Tatler, B. W., & Cole, G. G. (2009). You look where I look! Effect of gaze cues on overt and covert attention in misdirection. Visual Cognition, 17(6-7), 925-944. DOI: 10.1080/13506280902826775
20. Lee, J., Yang, H., & Lee, D. (2019). Context modulation effect by affective words influencing on the judgment of facial emotion. Korean Society for Emotion and Sensibility, 22(2), 37-48. DOI: 10.14695/kjsos.2018.22.2.37
21. Mohammad, S. M. (2018). Obtaining reliable human ratings of valence, arousal, and dominance for 20,000 English words. Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (ACL 2018).
22. Jack, R., Garrod, O. B., & Schyns, P. (2014). Dynamic facial expressions of emotion transmit an evolving hierarchy of signals over time. Current Biology, 24(2), 187-192. DOI: 10.1016/j.cub.2013.11.064
23. Posner, J., Russell, J. A., & Peterson, B. S. (2005). The circumplex model of affect: An integrative approach to affective neuroscience, cognitive development, and psychopathology. Development and Psychopathology, 17(3), 715-734. DOI: 10.1017/s0954579405050340
24. Terada, K., Takeuchi, C., & Ito, A. (2013). Effect of emotional expression in simple line drawings of a face on human economic behavior. 2013 IEEE RO-MAN, Gyeongju, Korea (pp. 51-56). DOI: 10.1109/ROMAN.2013.6628530
25. Onchi, E., & Lee, S. H. (2019). Design and evaluation of a spherical robot with emotion-like feedback during human-robot training. Transactions of Japan Society of Kansei Engineering, 19(1), 105-116. DOI: 10.5057/jjske.tjske-d-19-00036
26. Ekman, P., & Friesen, W. V. (2003). Unmasking the face: A guide to recognizing emotions from facial expressions. Malor Books.
27. Hess, U., Blairy, S., & Kleck, R. E. (1997). The intensity of emotional facial expressions and decoding accuracy. Journal of Nonverbal Behavior, 21(4), 241-257. DOI: 10.1023/a:1024952730333
28. Admoni, H., & Scassellati, B. (2017). Social eye gaze in human-robot interaction: A review. Journal of Human-Robot Interaction, 6(1), 25-63.
29. Baron-Cohen, S., Wheelwright, S., & Jolliffe, T. (1997). Is there a "language of the eyes"? Evidence from normal adults, and adults with autism or Asperger syndrome. Visual Cognition, 4(3), 311-331. DOI: 10.1080/713756761
30. Bradley, M. M., & Lang, P. J. (1994). Measuring emotion: The self-assessment manikin and the semantic differential. Journal of Behavior Therapy and Experimental Psychiatry, 25(1), 49-59. DOI: 10.1016/0005-7916(94)90063-9
31. Mori, M., MacDorman, K., & Kageki, N. (2012). The uncanny valley [From the field]. IEEE Robotics & Automation Magazine, 19(2), 98-100. DOI: 10.1109/MRA.2012.2192811
32. Andrist, S., Tan, X. Z., Gleicher, M., & Mutlu, B. (2014). Conversational gaze aversion for humanlike robots. Proceedings of the 2014 ACM/IEEE International Conference on Human-Robot Interaction (HRI '14). DOI: 10.1145/2559636.2559666
33. Song, S., & Yamada, S. (2017). Expressing emotions through color, sound, and vibration with an appearance-constrained social robot. Proceedings of the 2017 ACM/IEEE International Conference on Human-Robot Interaction (HRI '17). DOI: 10.1145/2909824.3020239
34. Kaya, N., & Epps, H. H. (2004). Relationship between color and emotion: A study of college students. College Student Journal, 38(3), 396-405.
35. Kleinsmith, A., & Semsar, A. (2019). Perception of emotion in body expressions from gaze behavior. Extended Abstracts of the 2019 CHI Conference on Human Factors in Computing Systems (CHI EA '19). DOI: 10.1145/3290607.3313062
36. Levinson, S. C. (2016). Turn-taking in human communication - origins and implications for language processing. Trends in Cognitive Sciences, 20(1), 6-14. DOI: 10.1016/j.tics.2015.10.010
37. Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39(6), 1161-1178. DOI: 10.1037/h0077714
38. Sumioka, H., Minato, T., Matsumoto, Y., Salvini, P., & Ishiguro, H. (2013). Design of human likeness in HRI from uncanny valley to minimal design. 2013 8th ACM/IEEE International Conference on Human-Robot Interaction (HRI), Tokyo, Japan (pp. 433-434). DOI: 10.1109/HRI.2013.6483633