http://dx.doi.org/10.14695/KJSOS.2022.25.3.127

Moderating Effects of User Gender and AI Voice on the Emotional Satisfaction of Users When Interacting with a Voice User Interface  

Shin, Jong-Gyu (Institute of Regional Industry and Management, Kumoh National Institute of Technology)
Kang, Jun-Mo (NIT Technology Team, KT)
Park, Yeong-Jin (Major in Industrial Management Engineering, Kumoh National Institute of Technology)
Kim, Sang-Ho (School of Industrial Engineering, Kumoh National Institute of Technology)
Publication Information
Science of Emotion and Sensibility, v.25, no.3, 2022, pp. 127-134
Abstract
This study sought to identify the voice user interface (VUI) design parameters that evoke positive user emotions. Six VUI design parameters that could affect emotional user satisfaction were considered, and the moderating effects of user gender and these parameters were analyzed to determine appropriate conditions for user satisfaction when interacting with a VUI. An interactive VUI system that could modify the six parameters was implemented using the Wizard of Oz experimental method. User emotions were assessed from the users' facial expression data, which were then converted into valence scores. Frequency analysis and chi-square tests found statistically significant moderating effects of user gender and AI voice. These results imply that it is beneficial to consider users' gender when designing voice-based interactions. As general guidelines for future VUI designs, adult/male/high-tone voices are recommended for male users and adult/female/mid-tone voices for female users. Future analyses that consider a wider range of human factors will be able to assess human-AI interactions more precisely from a UX perspective.
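The abstract describes testing for moderating effects via frequency analysis and chi-square tests on valence data. As a minimal sketch of that kind of test, assuming a hypothetical contingency table of positive/negative valence counts per voice condition (none of the numbers or labels below come from the paper):

```python
# Hypothetical sketch of a chi-square test for a moderating effect of AI voice.
# Counts and condition labels are invented for illustration; the paper's actual
# data and facial-expression-to-valence pipeline are not reproduced here.
import numpy as np
from scipy.stats import chi2_contingency

# Contingency table for one user group (e.g., male users):
# rows = AI voice condition, columns = (positive, negative) valence counts.
observed = np.array([
    [34, 16],  # adult/male/high-tone voice
    [21, 29],  # adult/female/mid-tone voice
])

chi2, p, dof, expected = chi2_contingency(observed)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.4f}")
# A significant p-value for one gender group but not the other would suggest
# that user gender moderates the effect of AI voice on emotional valence.
```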
Keywords
Emotional Satisfaction; Voice User Interface; Facial Expression; Moderating Effects; Human Factors