• Title/Summary/Keyword: Emotional User Scenario

12 search results

Scenario Usefulness and Avatar Realism in an Augmented Reality-based Classroom Simulation for Preservice Teacher Training

  • Kukhyeon KIM;Sanghoon PARK;Jeeheon RYU;Taehyeong LIM
    • Educational Technology International, v.24 no.1, pp.1-27, 2023
  • This study examined an augmented reality-based teaching simulation in a mobile application, focusing on how AR-enabled interactions affect users' perceived scenario usefulness and avatar realism. The participants were forty-six undergraduate students, randomly assigned to two equal-sized conditions: an AR-interaction group and a non-interactive video group. The study employed an experimental design with a one-way multivariate analysis of variance with repeated measures. The independent variable was the presence or absence of AR interaction with the mobile application; the dependent variables were avatar realism and scenario usefulness. The measures explored how the student avatar's emotional intensity in a scenario influences user perception. The results showed that participants in the AR-interaction group rated avatar realism significantly higher than those in the non-interactive video group. Participants also rated the high emotional intensity scenario (aggression toward peers) as significantly more useful than the low emotional intensity scenario (classroom disruption).
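
The group comparison described in this abstract can be sketched with simulated data. The paper reports a one-way MANOVA with repeated measures; for brevity this sketch runs a simpler per-variable Welch t-test, and all numbers (means, SDs, the rating scale) are hypothetical stand-ins, not the study's data.

```python
# Simulated sketch of the AR vs. non-interactive-video comparison.
# The paper's actual analysis was a repeated-measures MANOVA; a per-DV
# Welch t-test is used here as a simplified illustration.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
n = 23  # 23 participants per condition (46 total)

# Hypothetical 7-point ratings; the AR group is given a higher mean realism.
ar_realism = rng.normal(5.5, 0.8, n)
video_realism = rng.normal(4.2, 0.8, n)
ar_useful = rng.normal(5.0, 0.9, n)
video_useful = rng.normal(4.6, 0.9, n)

t_real, p_real = stats.ttest_ind(ar_realism, video_realism, equal_var=False)
t_use, p_use = stats.ttest_ind(ar_useful, video_useful, equal_var=False)
print(f"realism:    t={t_real:.2f}, p={p_real:.4f}")
print(f"usefulness: t={t_use:.2f}, p={p_use:.4f}")
```

With the large simulated group difference in realism, the first test comes out significant, mirroring the direction (not the statistics) of the reported result.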

Hi, KIA! Classifying Emotional States from Wake-up Words Using Machine Learning (Hi, KIA! 기계 학습을 이용한 기동어 기반 감성 분류)

  • Kim, Taesu;Kim, Yeongwoo;Kim, Keunhyeong;Kim, Chul Min;Jun, Hyung Seok;Suk, Hyeon-Jeong
    • Science of Emotion and Sensibility, v.24 no.1, pp.91-104, 2021
  • This study explored users' emotional states identified from the wake-up words "Hi, KIA!" using a machine learning algorithm, in the context of passenger cars' voice user interfaces. We targeted four emotional states, namely excited, angry, desperate, and neutral, and created a total of 12 emotional scenarios in the context of car driving. Nine college students participated and recorded sentences as guided by the visualized scenarios. The wake-up words were extracted from the whole sentences, resulting in two data sets. We used the soundgen package and the svmRadial method of the caret package in open-source R code to collect acoustic features of the recorded voices, and performed machine learning-based analysis to determine the predictability of the modeled algorithm. We compared the accuracy on wake-up words (60.19%; 22%~81%) with that on whole sentences (41.51%) across all nine participants for the four emotional categories. Individual differences in accuracy and sensitivity were noticeable, while the selected features were relatively constant. This study provides empirical evidence for the potential application of wake-up words in emotion-driven user experience design for communication between users and artificial intelligence systems.
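
The pipeline this abstract describes (acoustic features fed to an RBF-kernel SVM, which is what caret's svmRadial fits) can be sketched in Python with scikit-learn. The feature matrix below is random stand-in data with an artificial per-class offset, not real soundgen features, and the sizes merely echo the study's 9 speakers x 12 scenarios.

```python
# Sketch of the emotion classification pipeline: acoustic features ->
# standardization -> RBF-kernel SVM (scikit-learn analogue of svmRadial).
# The data is synthetic; only the pipeline shape follows the abstract.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
emotions = ["excited", "angry", "desperate", "neutral"]

# 108 fake utterances (9 speakers x 12 scenarios) x 20 acoustic features,
# with a per-class mean offset so the four classes are separable.
X = np.vstack([rng.normal(loc=i, scale=1.0, size=(27, 20)) for i in range(4)])
y = np.repeat(emotions, 27)

clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"mean CV accuracy: {scores.mean():.2f}")
```

Cross-validated accuracy is the same figure of merit the authors report when comparing wake-up-word clips against whole sentences.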

A Study on System for Analyzing Story of Cinematographic work Based on Estimating Tension of User (감성 상태 기반의 영상 저작물 스토리 분석 시스템 및 분석 방법 개발에 관한 연구)

  • Woo, Jeong-gueon
    • Journal of Engineering Education Research, v.18 no.6, pp.64-69, 2015
  • A video-work story analysis system based on emotional state measurement includes a content provision unit, which provides the story content of a video work; a display unit, which displays the provided content; an emotional state measurement unit, which measures the tense-relaxed emotional state of a viewer watching the displayed story content; a story pattern analysis unit, which maps the measured tense-relaxed emotional state onto the scenes of the provided story content; and a story pattern display unit, which prints out the analysis result or displays it as an image. The emotional state measurement unit measures the tense or relaxed emotional state through one or more of brainwave analysis, vital sign analysis, and ocular state analysis. With this system, a writer can obtain support for further scenario revision, and an investor can obtain support for decision making. Furthermore, the system and analysis method can extract particular patterns in the changes of a viewer's emotional state, compile statistics, and analyze the correlation between a story and the viewer's emotional state.

Quality Evaluation on Emotion Management Support App: A Case on Early Assessment of Emotional Health Issues

  • Anitawati Mohd Lokman;Muhammad Nur Aiman Rosmin;Saidatul Rahah Hamidi;Surya Sumarni Hussein;Shuhaida Mohamed Shuhidan
    • International Journal of Computer Science &amp; Network Security, v.24 no.9, pp.77-84, 2024
  • Emotional health is important for overall health, and those experiencing difficulties should seek professional help. However, the social stigma associated with emotional health, as well as the influence of cultural beliefs, prevents many people from seeking help. This hinders early detection, which is critical for such health issues. It would be extremely beneficial if people could assess their emotional state and express their thoughts without prejudice. Emotional health apps are on the market, but there is little to no evidence-based information on their quality. Hence, this study was conducted to provide an evidence-based quality assessment of an emotional health mobile app. Eleven functional task scenarios were used to assess functional quality, while a System Usability Scale test (n=20) was used to assess usability, customer acceptability, learnability, and satisfaction. The findings show that the emotional health management app is highly efficient and effective, with a high level of user satisfaction. This contributes to the creation of an app that is useful and practical for people experiencing early-stage emotional health issues, as well as for related stakeholders.
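
The System Usability Scale test mentioned above is scored with a fixed formula: each of the ten items is rated 1-5, odd-numbered items contribute (response - 1), even-numbered items contribute (5 - response), and the sum is multiplied by 2.5 to give a 0-100 score. A minimal scorer (the paper's per-dimension breakdown is not reproduced here):

```python
# Standard SUS scoring: ten 1-5 responses -> a 0-100 usability score.
def sus_score(responses):
    """responses: ten 1-5 ratings for SUS items 1..10, in order."""
    if len(responses) != 10 or any(not 1 <= r <= 5 for r in responses):
        raise ValueError("SUS needs ten responses in the range 1-5")
    total = sum(r - 1 if i % 2 == 0 else 5 - r  # 0-based i: even i = odd item
                for i, r in enumerate(responses))
    return total * 2.5

# Most favorable response pattern (5 on odd items, 1 on even items) -> 100.0
print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))
```

Averaging this score over the n=20 respondents yields the kind of overall usability figure the study reports.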

A Study of Integrated Evaluation of System Lighting and User Centered Guideline Development - Focused on the Lighting Design Method for Office Space - (시스템조명 통합평가 및 사용자 맞춤형 가이드라인 개발 연구 - 사무공간 설계방법을 중심으로 -)

  • Kim, Ju-Hyun;Ko, Jae-Kyu;Cho, Mee-Ryoung
    • Korean Institute of Interior Design Journal, v.23 no.6, pp.78-86, 2014
  • Indoor lighting is changing to system lighting converged with IT technology. Office lighting with incandescent and fluorescent lamps was mostly planned around overall illumination levels, but the convergence of LED lighting and IT technology makes it possible to respond to individual user requirements. Providing physically appropriate light together with a lighting environment matched to the user's sensibility, based on the function of the space and the user's behavior, can improve service productivity, reduce energy use, and enhance emotional satisfaction through user-optimized lighting. Thus, user-customized system lighting guidelines that apply integrated indicators of optics and sensibility are required. For the design elements required by users, environmental factors, product characteristics, optical characteristics, and sensibility factors are drawn from office-space design cases and a survey; a design checklist and evaluation indicators are considered to reflect these requirements in the design; and requirement indicators for integrated optical and emotional satisfaction are developed. A purpose-centered design method from the user's viewpoint is applied, through scenarios, to function-focused design, and it should be applied flexibly, as a new lighting design method, to the concept design stage of spatial lighting design and device development. This paper therefore presents user-customized guidelines by combining optics and sensibility evaluation with a design method that integrates requirement conditions and scenarios, for use in lighting content development and design.

The Effect of Interjection in Conversational Interaction with the AI Agent: In the Context of Self-Driving Car (인공지능 에이전트 대화형 인터랙션에서의 감탄사 효과: 자율주행 맥락에서)

  • Lee, Sooji;Seo, Jeeyoon;Choi, Junho
    • The Journal of the Convergence on Culture Technology, v.8 no.1, pp.551-563, 2022
  • This study aims to identify the effects on user experience when the embodied agent in a self-driving car interacts with emotional expressions by using interjections. An experiment was designed with two conditions: the inclusion of interjections in the agent's conversational feedback (with vs. without interjections) and the type of conversation (task-oriented vs. social-oriented). The online experiment was conducted with four video clips of conversation scenario treatments, measuring intimacy, likability, trust, social presence, perceived anthropomorphism, and future intention to use. The results showed that when the agent used interjections, a main effect on social presence was found in both conversation types. When the agent did not use interjections in the task-oriented conversation, trust and future intention to use were higher than when the agent talked with emotional expressions. In the context of conversation with the AI agent in a self-driving car, we found that adding emotional expression through interjections enhanced only social presence, with no effect on the other user experience factors.

Robot's Emotion Generation Model based on Generalized Context Input Variables with Personality and Familiarity (성격과 친밀도를 지닌 로봇의 일반화된 상황 입력에 기반한 감정 생성)

  • Kwon, Dong-Soo;Park, Jong-Chan;Kim, Young-Min;Kim, Hyoung-Rock;Song, Hyunsoo
    • IEMEK Journal of Embedded Systems and Applications, v.3 no.2, pp.91-101, 2008
  • For friendly interaction between humans and robots, emotional interchange has recently become more important, so many researchers investigating emotion generation models have tried to make the robot's emotional state more natural and to improve the models' usability for robot designers. Varied emotion generation is also needed to increase the believability of a robot. In this paper, we use a hybrid emotion generation architecture and define a generalized context input for the emotion generation model so that designers can easily implement it on a robot. We also develop a personality and familiarity model, grounded in psychology, for varied emotion generation. The robot's personality is implemented with the emotional stability dimension of the Big Five, and familiarity is composed of familiarity generation, expression, and learning procedures based on human social relationship theories such as balance theory and social exchange theory. We verify this emotion generation model by implementing it in a 'user calling and scheduling' scenario.
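
The kind of update such a model performs can be sketched as follows: a context input raises an emotion's intensity, modulated by personality (emotional stability) and by familiarity with the user. The update rule, constants, and class layout below are illustrative assumptions, not the authors' actual equations.

```python
# Hypothetical emotion-update sketch loosely following the abstract:
# low emotional stability amplifies reactions; familiarity (learned from
# positive interactions) dampens negative ones. All formulas are invented
# for illustration.
from dataclasses import dataclass, field

@dataclass
class EmotionModel:
    stability: float          # Big-Five emotional stability, 0 (low) .. 1 (high)
    familiarity: float = 0.0  # grows with positive interactions, 0 .. 1
    emotions: dict = field(default_factory=lambda: {"joy": 0.0, "anger": 0.0})

    def on_context(self, emotion: str, intensity: float) -> float:
        gain = 1.0 + (1.0 - self.stability)   # instability amplifies
        if emotion == "anger":
            gain *= 1.0 - 0.5 * self.familiarity  # familiarity softens anger
        level = min(1.0, self.emotions[emotion] + gain * intensity)
        self.emotions[emotion] = level
        return level

    def interact_positively(self):
        # Toy familiarity-learning step, loosely after social exchange theory.
        self.familiarity = min(1.0, self.familiarity + 0.1)

robot = EmotionModel(stability=0.3)
print(robot.on_context("anger", 0.3))  # an unstable robot reacts strongly
robot.interact_positively()
```

The same context input would produce a weaker reaction for a robot with high stability or high familiarity, which is the qualitative behavior the model aims for.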

Visualization using Emotion Information in Movie Script (영화 스크립트 내 감정 정보를 이용한 시각화)

  • Kim, Jinsu
    • Journal of the Korea Convergence Society, v.9 no.11, pp.69-74, 2018
  • Through the convergence of the Internet and various information technologies, it is possible to collect and process vast amounts of information and to exchange knowledge according to users' personal preferences. In particular, users tend to prefer engaging content matched to their preferences through the flow of emotional changes contained in film media. Based on the information presented in a script, users may want to visualize the flow of emotion across the entire film, the flow of emotion in a specific scene, or a specific scene itself, in order to understand it more quickly. In this paper, raw data obtained from a movie web page is transformed, after a refining process, into a standardized scenario format. The refined data is then converted into an XML document so that various information can be obtained easily, and each paragraph is fed into an emotion prediction system to predict the emotions of its sentences. We propose a system that mixes the predicted emotion flow with the amount of information included in the script, so that users can easily understand the changes of emotional state between characters across the whole script or in a specific part.
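
The shape of that pipeline (script as XML, paragraphs run through an emotion predictor, flow summarized per scene) can be sketched in a few lines. The XML layout and the keyword-lookup "predictor" below are illustrative stand-ins for the paper's actual format and prediction system.

```python
# Sketch: XML-formatted script -> per-paragraph emotion prediction ->
# per-scene emotion flow. The toy lexicon stands in for a real predictor.
import xml.etree.ElementTree as ET

SCRIPT_XML = """
<script>
  <scene id="1"><paragraph>They laugh and celebrate the happy news.</paragraph></scene>
  <scene id="2"><paragraph>He slams the door, furious and shouting.</paragraph></scene>
</script>
"""

LEXICON = {"happy": "joy", "laugh": "joy", "furious": "anger", "shouting": "anger"}

def predict_emotion(text: str) -> str:
    votes = [emo for word, emo in LEXICON.items() if word in text.lower()]
    return max(set(votes), key=votes.count) if votes else "neutral"

root = ET.fromstring(SCRIPT_XML)
flow = [(scene.get("id"), predict_emotion(scene.findtext("paragraph")))
        for scene in root.iter("scene")]
print(flow)  # per-scene emotion flow: [('1', 'joy'), ('2', 'anger')]
```

Plotting such a per-scene sequence over scene index is the simplest form of the emotion-flow visualization the paper proposes.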

Automatic Generation of Diverse Cartoons using User's Profiles and Cartoon Features (사용자 프로파일 및 만화 요소를 활용한 다양한 만화 자동 생성)

  • Song, In-Jee;Jung, Myung-Chul;Cho, Sung-Bae
    • Journal of KIISE:Software and Applications, v.34 no.5, pp.465-475, 2007
  • With the spread of the Internet, web users express their daily lives in articles, pictures, and cartoons to recollect personal memories or to share their experiences. To ease this recollection and sharing process, this paper proposes methods for generating diverse cartoons from landmark lists that represent the behavior and emotional status of the user. From the priority and causality of each landmark, critical landmarks are selected to compose the cartoon scenario, which is then revised using a story ontology. Using the similarity between cartoon images and each landmark in the revised scenario, a suitable cartoon cut is composed for each landmark. To make the cartoon story more diverse, weather, nightscape, supporting characters, exaggeration, and animation effects are additionally applied. The diversity of the generated cartoons is verified through example scenarios and usability tests.

The Effects of Emotional Interaction with Virtual Student on the User's Eye-fixation and Virtual Presence in the Teaching Simulation (가상현실 수업시뮬레이션에서 가상학생과의 정서적 상호작용이 사용자의 시선응시 및 가상실재감에 미치는 영향)

  • Ryu, Jeeheon;Kim, Kukhyeon
    • The Journal of the Korea Contents Association, v.20 no.2, pp.581-593, 2020
  • The purpose of this study was to examine eye-fixation times on different parts of a student avatar, and virtual presence, across two scenarios in a virtual reality-based teaching simulation, in order to identify where users direct their attention while interacting with a student avatar. By examining where a user gazes during a conversation with the avatar, we gain a better understanding of non-verbal communication. Forty-five college students (21 females and 24 males) participated in the experiment. They had verbal interactions with a student avatar in the teaching simulation under two scenarios. While they were conversing with the virtual character, their eye movements were collected through a head-mounted display with an embedded eye-tracking function. The results revealed significant differences in eye-fixation times: participants gazed longer at the facial expression than at any other area, and the fixation time on the facial expression was longer than on gestures (F=3.75, p&lt;.05). However, virtual presence did not differ significantly between the two scenario levels. These results suggest that users focus on the face more than on gestures when they emotionally interact with a virtual character.