• Title/Summary/Keyword: User Emotion Information

The Study of the Analysis of a User's Perception of Screen Component for Inducing Emotion in the 3D Virtual Reality Environment (3차원 가상현실 환경에서의 감성 유발 화면 구성 요소에 대한 사용자 인식 분석 연구)

  • Han, Hyeong-Jong
    • The Journal of the Korea Contents Association, v.18 no.7, pp.165-176, 2018
  • With the development of information and communication technology, the possibilities of using 3D virtual reality in education have been explored. In particular, the screen composition in virtual reality can induce emotions in the user, which may affect learning. However, there is little research on which aspects of the screen can evoke emotion. The purpose of this study is to analyze users' perceptions of the screen components that induce emotion in a virtual reality learning environment. Using Multi-Dimensional Scaling (MDS), users' perceptions of the main screen of a representative virtual reality learning platform were investigated. As a result, a dimension of depth on the screen and a dimension of avatar dynamics related to movement were identified. This study is meaningful in that it explores technical variables among screen elements in virtual reality content that can induce emotion.
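
A minimal sketch of the MDS step described in this abstract, assuming scikit-learn; the dissimilarity matrix and screen labels below are hypothetical placeholders, not the study's data:

```python
# Minimal MDS sketch: embed pairwise dissimilarity ratings of VR screens
# in two dimensions. All values and labels are hypothetical.
import numpy as np
from sklearn.manifold import MDS

labels = ["screen_A", "screen_B", "screen_C", "screen_D"]
# Hypothetical averaged dissimilarity ratings (symmetric, zero diagonal).
dissimilarity = np.array([
    [0.0, 2.1, 4.3, 3.0],
    [2.1, 0.0, 3.5, 2.8],
    [4.3, 3.5, 0.0, 1.9],
    [3.0, 2.8, 1.9, 0.0],
])

# Two dimensions, matching the depth / avatar-dynamics interpretation above.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)

for name, (x, y) in zip(labels, coords):
    print(f"{name}: dim1={x:.2f}, dim2={y:.2f}")
print("stress:", round(mds.stress_, 3))
```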

A Study on Utilization of Facial Recognition-based Emotion Measurement Technology for Quantifying Game Experience (게임 경험 정량화를 위한 안면인식 기반 감정측정 기술 활용에 대한 연구)

  • Kim, Jae Beom;Jeong, Hong Kyu;Park, Chang Hoon
    • Asia-pacific Journal of Multimedia Services Convergent with Art, Humanities, and Sociology, v.7 no.9, pp.215-223, 2017
  • Various methods for creating interesting games are used in the development process. Because the experiential aspects of play are difficult to measure and analyze, development usually focuses on the parts of the game whose data are easy to quantify. This is a clear limitation, given how important the play experience is. This study proposes a system that recognizes the face of a game user and measures changes in emotion from the recognized information, so that the experience of a player can be quantified easily. The system recognizes emotions from the player's face and records them in real time. The recorded data include timestamps and figures related to the progress of the game, together with numerical values for the emotions recognized from the face. Using these data, it is possible to judge what kind of emotion the game induces in the user at a given point in time. The quantified experiential data produced by the proposed system are expected to help developers build games that match their intentions.
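
A rough sketch of the real-time logging loop described above, assuming OpenCV for capture and face detection; the score_emotions() stub stands in for whatever emotion model the authors actually used:

```python
# Sketch of a real-time face/emotion logging loop. OpenCV handles capture
# and face detection; score_emotions() is a hypothetical stub for the
# actual emotion recognition model.
import csv, time
import cv2

def score_emotions(face_img):
    # Placeholder: a real system would run an emotion classifier here.
    return {"joy": 0.0, "anger": 0.0, "surprise": 0.0, "neutral": 1.0}

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
cap = cv2.VideoCapture(0)

with open("emotion_log.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(["timestamp", "joy", "anger", "surprise", "neutral"])
    start = time.time()
    while time.time() - start < 10:          # log for 10 seconds
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector.detectMultiScale(gray, 1.3, 5)
        for (x, y, w, h) in faces:
            scores = score_emotions(gray[y:y+h, x:x+w])
            writer.writerow([time.time() - start] + list(scores.values()))

cap.release()
```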

ME-based Emotion Recognition Model (ME 기반 감성 인식 모델)

  • Park, So-Young;Kim, Dong-Geun;Whang, Min-Cheol
    • Proceedings of the Korean Institute of Information and Communication Sciences Conference, 2010.05a, pp.985-987, 2010
  • In this paper, we propose a maximum entropy-based emotion recognition model that uses individual average differences. In order to recognize a user's emotion accurately, the proposed model utilizes the difference between the average of the given input physiological signals and the average of each emotion state's signals, rather than the raw input signal alone. To alleviate data sparseness, the model substitutes simple symbols, + (positive number) or - (negative number), for each average-difference value, and calculates the average of the physiological signals per second rather than over the longer total emotion response time. To keep the model easy to construct, it combines this simple average-difference encoding with a maximum entropy model, one of the well-known machine learning techniques.
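
A rough sketch of the +/- average-difference encoding described above, with scikit-learn's LogisticRegression standing in for the maximum entropy classifier; all signal values and per-emotion averages are hypothetical:

```python
# Encode each sample as +/- signs of its difference from per-emotion
# signal averages, then train a logistic regression (a maximum entropy
# model). All numbers below are hypothetical.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Hypothetical per-second averages of two physiological channels
# (e.g. heart rate, skin conductance) for each emotion state.
emotion_avg = {"calm": np.array([70.0, 0.30]),
               "aroused": np.array([85.0, 0.55])}

def encode(sample_avg):
    """Map each (sample avg - emotion avg) to a +1/-1 symbol feature."""
    feats = []
    for avg in emotion_avg.values():
        for diff in sample_avg - avg:
            feats.append(1.0 if diff >= 0 else -1.0)
    return feats

# Hypothetical training samples (per-second channel averages) and labels.
samples = [np.array([68.0, 0.27]), np.array([72.0, 0.33]),
           np.array([88.0, 0.60]), np.array([86.0, 0.57])]
labels = ["calm", "calm", "aroused", "aroused"]

X = np.array([encode(s) for s in samples])
model = LogisticRegression().fit(X, labels)
print(model.predict([encode(np.array([90.0, 0.62]))]))  # predicts 'aroused'
```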

A Study on the Emotion Analysis of Instagram Using Images and Hashtags (이미지와 해시태그를 이용한 인스타그램의 감정 분석 연구)

  • Jeong, Dahye;Gim, Jangwon
    • The Journal of Korean Institute of Information Technology, v.17 no.9, pp.123-131, 2019
  • Social network service users actively express and share their feelings about social issues and content of interest through postings. As a result, the sharing of emotions among individuals and community members on social networks is spreading rapidly, and emotion analysis of user postings is being actively studied. However, there is still insufficient research on emotion analysis for postings that contain multiple emotions. In this paper, we propose a method that analyzes the emotions of Instagram posts using hashtags and images. The method extracts a representative emotion from user posts containing multiple emotions with 66.4% accuracy and 81.7% recall, improving emotion classification performance compared to the previous method.
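
A minimal sketch of one way to combine hashtag and image cues into a representative emotion, in the spirit of the method above; the hashtag lexicon and the image_emotion_scores() stub are hypothetical placeholders, not the authors' resources:

```python
# Combine hashtag-lexicon votes with image-based emotion scores and pick
# the representative emotion for a post. Lexicon and scorer are stubs.
from collections import Counter

HASHTAG_LEXICON = {            # hypothetical hashtag -> emotion mapping
    "#happy": "joy", "#blessed": "joy",
    "#tired": "sadness", "#rainyday": "sadness",
    "#angry": "anger",
}

def image_emotion_scores(image_path):
    # Placeholder for an image emotion classifier (e.g. a CNN).
    return {"joy": 0.6, "sadness": 0.3, "anger": 0.1}

def representative_emotion(hashtags, image_path):
    votes = Counter()
    for tag in hashtags:
        if tag in HASHTAG_LEXICON:
            votes[HASHTAG_LEXICON[tag]] += 1.0
    for emotion, score in image_emotion_scores(image_path).items():
        votes[emotion] += score
    return votes.most_common(1)[0][0] if votes else "neutral"

print(representative_emotion(["#happy", "#rainyday"], "post.jpg"))
```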

A Design and Implementation Digital Vessel Bio Emotion Recognition LED Control System (디지털 선박 생체 감성 인식 LED 조명 제어 시스템 설계 및 구현)

  • Song, Byoung-Ho;Oh, Il-Whan;Lee, Seong-Ro
    • Journal of the Institute of Electronics Engineers of Korea SP, v.48 no.2, pp.102-108, 2011
  • Existing vessel lighting control systems have several problems: complex construction and high establishment and maintenance costs. In this paper, we designed a low-cost, high-performance lighting control system for a digital vessel environment. We propose a system that obtains the user's biological information (pulse, blood pressure, blood sugar, etc.) through wireless sensors, recognizes the user's emotion from these signals, and controls LED lights accordingly. The system classifies emotions using the backpropagation algorithm; trained on 3,000 data sets, it achieved about 88.7% accuracy. The classified emotion is then matched to the most appropriate point in the 20-color emotion model from HP's 'The Meaning of Color', and the red, green, and blue LED lamps are controlled by adjusting their waves or frequencies as well as the brightness and contrast of the lamps. With this method, the system saved about 20% of the electricity consumed.
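
A minimal sketch of the bio-signal-to-LED pipeline described above, with scikit-learn's MLPClassifier standing in for the backpropagation network; the sensor readings, labels, and colour map are hypothetical:

```python
# Bio-signal -> emotion -> LED colour sketch. MLPClassifier is trained by
# backpropagation; readings, labels, and the colour map are hypothetical.
import numpy as np
from sklearn.neural_network import MLPClassifier

# Hypothetical (pulse, systolic BP, blood sugar) readings with labels.
X = np.array([[65, 115, 90], [70, 120, 95], [95, 140, 110], [100, 145, 120]])
y = ["relaxed", "relaxed", "stressed", "stressed"]

clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000, random_state=0)
clf.fit(X, y)

# Hypothetical emotion -> (R, G, B) mapping, standing in for the
# 20-colour emotion model mentioned above.
EMOTION_TO_RGB = {"relaxed": (80, 180, 255), "stressed": (255, 120, 60)}

reading = np.array([[92, 138, 108]])
emotion = clf.predict(reading)[0]
print(emotion, "->", EMOTION_TO_RGB[emotion])
```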

Emotion Recognition Implementation with Multimodalities of Face, Voice and EEG

  • Udurume, Miracle;Caliwag, Angela;Lim, Wansu;Kim, Gwigon
    • Journal of information and communication convergence engineering, v.20 no.3, pp.174-180, 2022
  • Emotion recognition is an essential component of complete interaction between human and machine. The challenges of emotion recognition stem from the fact that emotions are expressed in several forms, such as visual, sound, and physiological signals. Recent advancements in the field show that combined modalities, such as visual, voice, and electroencephalography signals, lead to better results than single modalities used separately. Previous studies have explored the use of multiple modalities for accurate prediction of emotion; however, the number of studies on real-time implementation is limited because of the difficulty of running multiple emotion recognition modalities simultaneously. In this study, we propose an emotion recognition system for real-time implementation. Our model is built with a multithreading block that runs each modality in a separate thread with continuous synchronization. First, we achieved emotion recognition for each modality separately before enabling the multithreaded system. To verify the correctness of the results, we compared the accuracy of unimodal and multimodal emotion recognition in real time. The experimental results demonstrated real-time user emotion recognition with the proposed model and showed the effectiveness of multimodality: the multimodal model obtained an accuracy of 80.1%, compared to unimodal accuracies of 70.9%, 54.3%, and 63.1%.
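
A minimal sketch of the multithreaded design described above: one thread per modality pushes predictions into a queue and a fusion step averages them; the per-modality predictors here are random stubs, not the authors' models:

```python
# One worker thread per modality feeds a shared queue; a fusion step
# averages the newest prediction from each modality.
import queue, random, threading, time

def modality_worker(name, out_q, stop_event):
    while not stop_event.is_set():
        # Placeholder: a real worker would run face/voice/EEG inference here.
        probs = {"happy": random.random(), "sad": random.random()}
        out_q.put((name, probs))
        time.sleep(0.1)

def fuse(predictions):
    fused = {}
    for _, probs in predictions:
        for emotion, p in probs.items():
            fused[emotion] = fused.get(emotion, 0.0) + p / len(predictions)
    return max(fused, key=fused.get)

q, stop = queue.Queue(), threading.Event()
threads = [threading.Thread(target=modality_worker, args=(m, q, stop))
           for m in ("face", "voice", "eeg")]
for t in threads:
    t.start()

time.sleep(1)                      # collect roughly one second of output
stop.set()
for t in threads:
    t.join()

latest = {}                        # keep the newest prediction per modality
while not q.empty():
    name, probs = q.get()
    latest[name] = probs
print("fused emotion:", fuse(list(latest.items())))
```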

What Drives Consumers' Purchase Decisions? : User- and Marketer-generated Content

  • Kim, Yu-Jin
    • Science of Emotion and Sensibility, v.24 no.4, pp.79-90, 2021
  • Consumers play an increasingly active role in the marketing cycle, using social media channels to create, distribute, and consume digital content. In this context, this paper investigates the impact of user- and marketer-generated content on consumer purchase intentions and the design of an effective social media marketing platform. Building on a literature review of social media marketing and consumer purchase intentions, a case study of the social media marketing platform 0.8L was undertaken, combining qualitative and quantitative results from content analysis and a participatory survey. First, about 450 consumer reviews for ten sunscreen products posted on the 0.8L platform were compared with the products' marketer-generated content. Next, 55 subjects participated in a survey regarding purchase intentions toward moisturizing creams on the 0.8L platform. The results indicated that user-generated content (i.e., texts and photos) conveyed more of the personal experience of using the product, whereas marketers focused on distinctive product photos and features. Moreover, customer reviews (particularly high-volume and narrative-format reviews) had more impact on purchase decisions than marketer information in the online cosmetics market. Real users' honest reviews (both positive and negative) were found to aid companies' prompt and straightforward assessment of newly released products. Beyond the importance of customer-driven marketing practices, distinctive user experience design features of a competitive social media marketing platform are identified that facilitate the creation and sharing of sincere customer reviews that resonate with potential buyers.

Validity analysis of the social emotion model based on relation types in SNS (SNS 사용자의 관계유형에 따른 사회감성 모델의 타당화 분석)

  • Cha, Ye-Sool;Kim, Ji-Hye;Kim, Jong-Hwa;Kim, Song-Yi;Kim, Dong-Keun;Whang, Min-Cheol
    • Science of Emotion and Sensibility, v.15 no.2, pp.283-296, 2012
  • The goal of this study is to establish social emotion models for two relation types in social networking services: emotion-sharing relationships and information-sharing relationships. Twenty-six social emotions were extracted by verifying the agreement among 92 emotion words collected from a literature survey. A survey on the 26 emotion words assessed their similarity for each relation type on a 7-point Likert scale. Principal component analysis of the survey data identified 12 representative social emotions for the emotion-sharing relation and 13 for the information-sharing relation. Multidimensional scaling was then used to develop two-dimensional social emotion models for each relation type in an online communication environment. Statistically insignificant factors in the suggested models were removed through structural equation modeling analysis. The validity analysis demonstrated the fitness of the social emotion model for emotion-sharing relationships (CFI: .887, TLI: .885, RMSEA: .094) and for information-sharing relationships (CFI: .917, TLI: .900, RMSEA: .050). In conclusion, this study presents two social emotion models based on two relation types. The findings provide both a reference for evaluating social emotions when designing social networking services and a direction for improving them.
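
A minimal sketch of the principal component analysis step described above, assuming scikit-learn; the emotion words and the Likert rating matrix are hypothetical placeholders:

```python
# Reduce Likert-scale ratings of emotion words with PCA and keep the
# words loading most strongly on each component as representatives.
import numpy as np
from sklearn.decomposition import PCA

words = ["trust", "closeness", "respect", "curiosity", "gratitude"]
# Hypothetical respondents x emotion-words matrix of 7-point ratings.
ratings = np.array([
    [6, 5, 4, 3, 5],
    [7, 6, 5, 2, 6],
    [5, 5, 3, 4, 4],
    [6, 6, 4, 3, 5],
    [4, 4, 2, 5, 3],
])

pca = PCA(n_components=2)
pca.fit(ratings)
print("explained variance:", pca.explained_variance_ratio_.round(2))
for i, comp in enumerate(pca.components_):
    top = [words[j] for j in np.argsort(np.abs(comp))[::-1][:2]]
    print(f"component {i + 1}: {top}")
```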

The Development of Characters with Artificial Emotion through Analyzing Drama characters - With a Korean Drama titled 'The Sons of Sol Pharmacy House' (드라마 대본 분석을 통한 등장인물의 성격이 반영된 인공정서 캐릭터 개발 - '솔약국집 아들들'을 중심으로)

  • Ham, Jun-Seok;Rhee, Shin-Young;Bang, Green;Ko, Il-Ju
    • Science of Emotion and Sensibility, v.15 no.2, pp.239-248, 2012
  • This paper aims to extract the personalities of drama characters from a drama script and apply them to characters driven by an artificial emotion model. The method is as follows. First, we split the drama script into pieces by character. Next, we extract emotion-related terms by matching the results of morpheme analysis against an emotion-term database. We then analyze the dominant emotion from the extracted emotion terms. Finally, we apply the analyzed dominant emotion to the equations of the artificial emotion model. To verify that the artificial emotion characters bear the personalities of the drama characters, we conducted a blind user evaluation: three drama character personalities were applied to artificial emotion characters with identical appearances, and users were asked to match the three artificial emotion characters to the drama characters by personality. The high rate of correct answers confirms the effectiveness of our method of applying personality information extracted from a drama script.
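
A minimal sketch of the dominant-emotion extraction step described above; a real system would use a Korean morpheme analyzer and the authors' emotion-term database, whereas the plain tokenizer and tiny dictionary here are hypothetical stand-ins:

```python
# Count emotion terms in one character's lines and return the dominant
# emotion. The tokenizer and dictionary are simplified placeholders.
from collections import Counter

EMOTION_TERMS = {                 # hypothetical term -> emotion mapping
    "love": "joy", "laugh": "joy", "glad": "joy",
    "cry": "sadness", "sorry": "sadness",
    "shout": "anger", "hate": "anger",
}

def dominant_emotion(character_lines):
    counts = Counter()
    for line in character_lines:
        for token in line.lower().split():
            token = token.strip(".,!?")
            if token in EMOTION_TERMS:
                counts[EMOTION_TERMS[token]] += 1
    return counts.most_common(1)[0][0] if counts else "neutral"

lines_for_character = [
    "I hate it when you shout, I really hate it!",
    "Sorry... I made you cry.",
]
print(dominant_emotion(lines_for_character))   # -> 'anger'
```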

Sentiment Analysis on 'HelloTalk' App Reviews Using NRC Emotion Lexicon and GoEmotions Dataset

  • Simay Akar;Yang Sok Kim;Mi Jin Noh
    • Smart Media Journal, v.13 no.6, pp.35-43, 2024
  • During the post-pandemic period, the interest in foreign language learning surged, leading to increased usage of language-learning apps. With the rising demand for these apps, analyzing app reviews becomes essential, as they provide valuable insights into user experiences and suggestions for improvement. This research focuses on extracting insights into users' opinions, sentiments, and overall satisfaction from reviews of HelloTalk, one of the most renowned language-learning apps. We employed topic modeling and emotion analysis approaches to analyze reviews collected from the Google Play Store. Several experiments were conducted to evaluate the performance of sentiment classification models with different settings. In addition, we identified dominant emotions and topics within the app reviews using feature importance analysis. The experimental results show that the Random Forest model with topics and emotions outperforms other approaches in accuracy, recall, and F1 score. The findings reveal that topics emphasizing language learning and community interactions, as well as the use of language learning tools and the learning experience, are prominent. Moreover, the emotions of 'admiration' and 'annoyance' emerge as significant factors across all models. This research highlights that incorporating emotion scores into the model and utilizing a broader range of emotion labels enhances model performance.
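
A minimal sketch of the feature set-up described above (topic proportions plus lexicon-based emotion counts feeding a random forest), assuming scikit-learn; the reviews, labels, and tiny lexicon are hypothetical stand-ins for the HelloTalk data and the NRC/GoEmotions resources:

```python
# Build features from LDA topic proportions and lexicon-based emotion
# counts, then train a random forest on review sentiment labels.
import numpy as np
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.decomposition import LatentDirichletAllocation
from sklearn.ensemble import RandomForestClassifier

reviews = [
    "great app for finding language partners, love the community",
    "too many ads and the chat keeps crashing, very annoying",
    "helpful correction tools, my speaking improved a lot",
    "spam messages everywhere, support never replies",
]
labels = [1, 0, 1, 0]                       # 1 = positive, 0 = negative

EMOTION_LEXICON = {"love": "admiration", "great": "admiration",
                   "annoying": "annoyance", "spam": "annoyance"}

def emotion_counts(text):
    counts = {"admiration": 0, "annoyance": 0}
    for token in text.lower().split():
        emo = EMOTION_LEXICON.get(token.strip(".,!"))
        if emo:
            counts[emo] += 1
    return [counts["admiration"], counts["annoyance"]]

vec = CountVectorizer()
bow = vec.fit_transform(reviews)
topics = LatentDirichletAllocation(n_components=2, random_state=0).fit_transform(bow)

X = np.hstack([topics, np.array([emotion_counts(r) for r in reviews])])
clf = RandomForestClassifier(n_estimators=50, random_state=0).fit(X, labels)
print("feature importances:", clf.feature_importances_.round(2))
```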