• Title/Summary/Keyword: Emotional Interface


A Study on the Checklist of Emotional Evaluation for MMORPG (MMORPG의 감성평가 체크리스트에 관한 연구)

  • Park, Sang-Jin;Kwak, Hoon-Sung;Seo, Mi-Ra
    • The Journal of the Korea Contents Association / v.6 no.11 / pp.217-224 / 2006
  • As massively multiplayer online games have grown in popularity, the number of titles developed by online game manufacturers has increased exponentially. Qualitative growth, however, has not kept pace with this quantitative expansion: apart from a few leading companies holding competitive advantages, most developers are small enterprises without well-established production procedures, and games are rarely tested rigorously before launch, so even predictable problems are not handled flexibly. At present, testing is limited to checking game features or finding fatal errors in specific parts, and important functions go unverified. This study therefore collects the emotional vocabulary used in games and proposes an evaluation system that measures the emotions users feel while playing, complementing the existing usability-oriented evaluation systems. Through factor analysis, the vocabulary is classified into interactivity, interface, and information factors, and evaluation questions are designed accordingly.
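
The factor-analysis step the abstract describes might be sketched as below. The simulated Likert ratings, the nine-item vocabulary size, and the three-factor solution mirror the interactivity/interface/information split only illustratively, and the plain eigendecomposition of the correlation matrix stands in for whatever statistics package the authors actually used.

```python
import numpy as np

# Hypothetical Likert-scale ratings: 40 respondents x 9 emotion-vocabulary
# items. Item count and values are illustrative; the paper's vocabulary is
# in Korean and its raw data are not public.
rng = np.random.default_rng(0)
ratings = rng.integers(1, 6, size=(40, 9)).astype(float)

def extract_factors(data, n_factors=3):
    """Principal-axis-style extraction: eigendecompose the correlation
    matrix and keep the loadings of the strongest n_factors components."""
    corr = np.corrcoef(data, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(corr)        # ascending order
    order = np.argsort(eigvals)[::-1][:n_factors]  # strongest first
    loadings = eigvecs[:, order] * np.sqrt(np.clip(eigvals[order], 0.0, None))
    return loadings                                # items x factors

loadings = extract_factors(ratings)
print(loadings.shape)  # (9, 3): each item's loading on the 3 factors
```

Items loading strongly on the same factor would then be grouped into one evaluation dimension and phrased as questionnaire items.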


Solar Power Emotional LED Lightening Street Lamps with Multiple Control Sun Tracker (다중 추적식 태양광 발전 감성형 LED 가로등)

  • Lee, Jae-Min;Kim, Yong;Bae, Cheol-Soo;Kwon, Dae-Sig
    • Journal of the Korea Academia-Industrial cooperation Society / v.12 no.2 / pp.920-926 / 2011
  • In this paper, a solar-powered emotional LED lighting street lamp with a multiple-control sun tracker is presented. The proposed system combines a multiple-control sun-tracking function with high-quality emotional LED lamps, and is designed to absorb maximum sunlight using the temperature and humidity sensors of its control circuits. A battery charge-discharge controller is developed for highly efficient use of the battery charger, supporting the utilization of new and renewable energy, and an interface circuit for remote monitoring and control is included. The proposed multi-tracking solar-powered emotional LED street lamp outperforms conventional systems in tracking operation and energy efficiency, and is expected to become a leading model for next-generation solar-powered street lamp systems, because it is a new technology combining a sun-tracking solar power system with an emotional lighting system.
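
Battery charge-discharge control of this kind is commonly a hysteresis scheme; the sketch below, with assumed lead-acid-style voltage thresholds, shows the idea rather than the paper's actual controller.

```python
# Hypothetical voltage thresholds; a real controller is tuned to the
# battery chemistry, which the abstract does not specify.
V_FULL, V_RESUME = 14.4, 13.2     # stop / restart charging
V_CUTOFF, V_RECOVER = 11.0, 12.0  # disconnect / reconnect the LED load

class ChargeDischargeController:
    """Minimal hysteresis sketch of a battery charge-discharge controller."""
    def __init__(self):
        self.charging = True
        self.load_on = True

    def step(self, v_batt):
        # Charge side: stop at the full threshold, resume after the
        # voltage sags back below the resume threshold.
        if self.charging and v_batt >= V_FULL:
            self.charging = False
        elif not self.charging and v_batt <= V_RESUME:
            self.charging = True
        # Load side: protect the battery from deep discharge.
        if self.load_on and v_batt <= V_CUTOFF:
            self.load_on = False
        elif not self.load_on and v_batt >= V_RECOVER:
            self.load_on = True
        return self.charging, self.load_on

ctrl = ChargeDischargeController()
print(ctrl.step(14.5))  # (False, True): battery full, lamp stays on
print(ctrl.step(10.8))  # (True, False): charging resumes, lamp disconnected
```

The two hysteresis bands prevent the charger and the load relay from chattering around a single threshold.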

The Effect of Barge-in Function of In-Vehicle Voice Conversational Interface on Driving Experience - Focus on Car Navigation and Music Services - (차량용 음성대화 인터페이스의 Barge-in 기능이 주행 경험에 미치는 효과 연구 - 내비게이션 및 음악서비스 중심으로 -)

  • Kim, Taek Soo;Kim, Ji Hyun;Choi, Jun Ho
    • Design Convergence Study / v.17 no.1 / pp.17-28 / 2018
  • Manipulating a device by hand while driving is a major factor increasing the risk of accidents, and the design of in-vehicle voice conversational interfaces that can compensate for this is being actively researched. The purpose of this study is to investigate the effect of the barge-in function of an in-vehicle voice interface on user experience. Participants carried out two tasks, one for navigation and one for music play. After each task we conducted a survey measuring usefulness, usability, satisfaction, and emotion as user experience factors. As a result, barge-in was rated better on most experience factors: it had a significant effect on the usability dimension in the navigation task, and significant effects on both the usability and emotional dimensions in the music play task. The barge-in function thus had a positive effect on users' usability and emotional dimensions.
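
A within-subject comparison like the one behind these significance results can be sketched with a paired-samples t test; the 7-point ratings below are invented for illustration and are not the study's data.

```python
import math

# Hypothetical 7-point usability ratings from the same ten participants
# under the two conditions (with and without barge-in).
barge_in    = [6, 5, 6, 7, 5, 6, 6, 5, 7, 6]
no_barge_in = [4, 5, 4, 5, 4, 5, 4, 4, 5, 5]

def paired_t(a, b):
    """Paired-samples t statistic for a within-subject condition comparison."""
    diffs = [x - y for x, y in zip(a, b)]
    n = len(diffs)
    mean = sum(diffs) / n
    var = sum((d - mean) ** 2 for d in diffs) / (n - 1)
    return mean / math.sqrt(var / n)

t = paired_t(barge_in, no_barge_in)
print(round(t, 2))  # 6.33, well above ~2.26, the two-tailed .05 cutoff for df=9
```

A |t| exceeding the critical value for n-1 degrees of freedom is what would justify calling the barge-in effect significant on a given dimension.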

Trends and Implications of Digital Transformation in Vehicle Experience and Audio User Interface (차내 경험의 디지털 트랜스포메이션과 오디오 기반 인터페이스의 동향 및 시사점)

  • Kim, Kihyun;Kwon, Seong-Geun
    • Journal of Korea Multimedia Society / v.25 no.2 / pp.166-175 / 2022
  • Digital transformation is driving many changes in daily life and industry, and the automobile industry is in a similar situation. In some cases, element technologies from areas called the metaverse are also being adopted, such as the 3D animated digital cockpit, around view, and voice AI. Through the growth of the mobile market, the norm of human-computer interaction (HCI) has evolved from keyboard-mouse interaction to the touch screen. The core area has been the graphical user interface (GUI), and recently the audio user interface (AUI) has partially replaced it. Because it is easy to access and intuitive for the user, the AUI is quickly becoming a common part of the in-vehicle experience (IVE) in particular. The benefits of an AUI are that it frees the driver's eyes and hands, uses fewer screens, lowers interaction costs, is more emotional and personal, and is effective for people with low vision. Nevertheless, when and where to apply a GUI or an AUI call for different approaches, because some information is easier to process visually, while in other cases an AUI may be more suitable. This study proposes actively applying an AUI in the near future, based on the context of the various scenes that occur, to improve the IVE.

A Development of Multi-Emotional Signal Receiving Modules for Cellphone Using Robotic Interaction

  • Jung, Yong-Rae;Kong, Yong-Hae;Um, Tai-Joon;Kim, Seung-Woo
    • Proceedings of the Institute of Control, Robotics and Systems Conference / 2005.06a / pp.2231-2236 / 2005
  • The cellular phone (CP) is currently one of the most attractive technologies, and robot technology (RT) is considered one of the most promising next-generation technologies. We present a new technological concept named RCP (Robotic Cellular Phone), which combines RT and CP. RCP consists of three sub-modules: RCP Mobility, RCP Interaction, and RCP Integration. RCP Interaction, the main focus of this paper, is an interactive emotion system that provides the CP with multi-emotional signal receiving functionalities. It is linked with the communication functions of the CP in order to interface between the CP and the user through a variety of emotional models, and is divided into a tactile, an olfactory, and a visual mode. The tactile signal receiving module is designed around patterns and beat frequencies produced by mechanical-vibration conversion of the musical melody, rhythm, and harmony. The olfactory signal receiving module is designed around switching control of perfume-injection nozzles, which can announce an incoming call to the called user through a smell specific to the calling user. The visual signal receiving module is built on motion control of a DC-motored wheel-based system, which can inform the called user of an incoming call through a motion specific to the calling user. In this paper, a prototype system is developed for the multi-emotional signal receiving modes of the CP. We describe the overall structure of the system and provide experimental results for the functional modules.
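
The melody-to-vibration conversion in the tactile module might look like the following sketch; the note table, scaling rule, and motor frequency cap are all assumptions, since the abstract does not give the actual mapping.

```python
# Pitch table for a few note names (Hz); illustrative only.
NOTE_HZ = {"C4": 261.6, "E4": 329.6, "G4": 392.0, "C5": 523.3}

def melody_to_vibration(melody, base_hz=261.6, motor_max_hz=250.0):
    """Scale each note's pitch into a vibration motor's low-frequency
    range and keep the note duration as the vibration burst length.
    The linear scaling and the 250 Hz cap are assumed values."""
    pattern = []
    for note, dur_ms in melody:
        vib_hz = min(motor_max_hz, NOTE_HZ[note] / base_hz * 100.0)
        pattern.append((round(vib_hz, 1), dur_ms))
    return pattern  # list of (vibration frequency Hz, burst length ms)

ring = [("C4", 200), ("E4", 200), ("G4", 400)]
print(melody_to_vibration(ring))  # [(100.0, 200), (126.0, 200), (149.8, 400)]
```

Different calling users could then be assigned different melodies, giving each caller a recognizable tactile signature, in the spirit of the per-caller smells and motions of the other two modes.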


A Development of Multi-Emotional Signal Receiving Modules for Ubiquitous RCP Interaction (유비쿼터스 RCP 상호작용을 위한 다감각 착신기능모듈의 개발)

  • Jang Kyung-Jun;Jung Yong-Rae;Kim Dong-Wook;Kim Seung-Woo
    • Journal of Institute of Control, Robotics and Systems / v.12 no.1 / pp.33-40 / 2006
  • We present a new technological concept named RCP (Robotic Cellular Phone), which combines RT and CP; it is a ubiquitous robot. RCP consists of three sub-modules: RCP Mobility, RCP Interaction, and RCP Integration. RCP Interaction, the main focus of this paper, is an interactive emotion system which provides the CP with multi-emotional signal receiving functionalities. It is linked with the communication functions of the CP in order to interface between the CP and the user through a variety of emotional models, and is divided into a tactile, an olfactory, and a visual mode. The tactile signal receiving module is designed around patterns and beat frequencies produced by mechanical-vibration conversion of the musical melody, rhythm, and harmony. The olfactory signal receiving module is designed around switching control of perfume-injection nozzles, which can announce an incoming call to the called user through a smell specific to the calling user. The visual signal receiving module is built on motion control of a DC-motored wheel-based system, which can inform the called user of an incoming call through a motion specific to the calling user. In this paper, a prototype system is developed for the multi-emotional signal receiving modes of the CP. We describe the overall structure of the system and provide experimental results for the functional modules.

Design and implement of the Educational Humanoid Robot D2 for Emotional Interaction System (감성 상호작용을 갖는 교육용 휴머노이드 로봇 D2 개발)

  • Kim, Do-Woo;Chung, Ki-Chull;Park, Won-Sung
    • Proceedings of the KIEE Conference / 2007.07a / pp.1777-1778 / 2007
  • In this paper, we design and implement a humanoid robot for educational purposes that can collaborate and communicate with humans. We present an affective human-robot communication system for the humanoid robot D2, which we designed to communicate with a human through dialogue. D2 communicates with humans by understanding and expressing emotion using facial expressions, voice, gestures, and posture. Interaction between a human and the robot is made possible through our affective communication framework, which enables the robot to catch the emotional status of the user and respond appropriately; as a result, the robot can engage in natural dialogue with a human. To support interaction through voice, gestures, and posture, the developed educational humanoid robot consists of an upper body, two arms, a wheeled mobile platform, and control hardware including vision and speech capability and various control boards, such as motion control boards and a signal processing board handling several types of sensors. Using the educational humanoid robot D2, we have presented successful demonstrations consisting of manipulation tasks with the two arms, object tracking using the vision system, and communication with humans through the emotional interface, synthesized speech, and the recognition of speech commands.


Development of a General Purpose PID Motion Controller Using a Field Programmable Gate Array

  • Kim, Sung-Su;Jung, Seul
    • Proceedings of the Institute of Control, Robotics and Systems Conference / 2003.10a / pp.360-365 / 2003
  • In this paper, we develop a general-purpose motion controller using an FPGA (Field Programmable Gate Array). Multiple PID controllers are implemented on a single chip as a system-on-chip for multi-axis motion control, and a PC GUI is developed for efficient interface control. Compared with the commercial motion controller LM629, the multiple independent PID controllers give the system several advantages, such as space effectiveness, low cost, and lower power consumption. To test the performance of the proposed controller, a robot hand with three fingers of two joints each is controlled; the finger movements show that position tracking is very effective. A further experiment, balancing an inverted pendulum on a cart, demonstrates the generality of the proposed FPGA PID controller, which maintains the balance of the pendulum well.
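
The per-axis control law is a standard discrete PID; the sketch below is a minimal software model of one such channel, where the gains, sample time, and toy plant are illustrative assumptions, not the paper's FPGA parameters.

```python
class PID:
    """One discrete PID channel, a software model of the per-axis
    controllers replicated on the FPGA."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt                # rectangular integration
        deriv = (err - self.prev_err) / self.dt       # backward difference
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# Drive a pure-integrator plant (position += command * dt) to setpoint 1.0.
pid, pos, dt = PID(2.0, 0.5, 0.1, 0.01), 0.0, 0.01
for _ in range(4000):
    pos += pid.update(1.0, pos) * dt
print(round(pos, 3))  # 1.0: the position settles at the setpoint
```

On the FPGA, each such channel runs in parallel hardware, which is what lets one chip close several axis loops independently at a fixed sample rate.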


Extraction of Speech Features for Emotion Recognition (감정 인식을 위한 음성 특징 도출)

  • Kwon, Chul-Hong;Song, Seung-Kyu;Kim, Jong-Yeol;Kim, Keun-Ho;Jang, Jun-Su
    • Phonetics and Speech Sciences / v.4 no.2 / pp.73-78 / 2012
  • Emotion recognition is an important technology in the field of human-machine interfaces. To apply speech technology to emotion recognition, this study aims to establish a relationship between emotional groups and their corresponding voice characteristics by investigating various speech features, including features related to the speech source and the vocal tract filter. Experimental results show that the statistically significant speech parameters for classifying the emotional groups are mainly related to the speech source: jitter, shimmer, F0 statistics (F0_min, F0_max, F0_mean, F0_std), harmonic parameters (H1, H2, HNR05, HNR15, HNR25, HNR35), and SPI.
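
Two of the source-related measures listed, jitter and shimmer, can be computed from per-cycle pitch periods and peak amplitudes; the sketch below uses the standard local definitions on synthetic values, not the study's recordings.

```python
def jitter_percent(periods_ms):
    """Local jitter: mean absolute difference of consecutive pitch
    periods, relative to the mean period, in percent."""
    diffs = [abs(a - b) for a, b in zip(periods_ms, periods_ms[1:])]
    mean_period = sum(periods_ms) / len(periods_ms)
    return 100.0 * (sum(diffs) / len(diffs)) / mean_period

def shimmer_percent(amplitudes):
    """The same local measure applied to per-cycle peak amplitudes."""
    diffs = [abs(a - b) for a, b in zip(amplitudes, amplitudes[1:])]
    mean_amp = sum(amplitudes) / len(amplitudes)
    return 100.0 * (sum(diffs) / len(diffs)) / mean_amp

periods = [5.0, 5.1, 4.9, 5.05, 4.95]   # ~200 Hz voice, periods in ms
amps = [0.80, 0.78, 0.82, 0.79, 0.81]   # per-cycle peak amplitudes
print(round(jitter_percent(periods), 2),
      round(shimmer_percent(amps), 2))  # 2.75 3.44
```

Feature vectors built from such per-utterance statistics are what a classifier would then use to separate the emotional groups.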

A Preliminary Study for Emotional Expression of Software Robot -Development of Hangul Processing Technique for Inference of Emotional Words- (소프트웨어 로봇의 감성 표현을 위한 기반연구 - 감성어 추론을 위한 한글 처리 기술 개발 -)

  • Song, Bok-Hee;Yun, Han-Kyung
    • Proceedings of the Korea Contents Association Conference / 2012.05a / pp.3-4 / 2012
  • User-centered man-machine interface technology has advanced considerably through the combination of user interface techniques and ergonomics, and continues to progress. Information is now commonly delivered through sound, text, and images, but research on conveying information from an emotional perspective remains limited. In particular, within Human-Computer Interaction, research on conveying emotion through voice or facial expression is at an early stage: emoticons and flashcons are used to deliver emotion, but remain unnatural and mechanical. As groundwork for letting a computer or application software interact with users in a human-friendly way through its own virtual agent (Software Robot, Sobot), this study develops techniques for extracting, classifying, and processing emotional words in Hangul. The aim is to raise users' emotional satisfaction by infusing artificial emotion into the information a computer delivers.
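
The emotional-word extraction step could be sketched as simple lexicon matching; the tiny stem lexicon and category names below are illustrative assumptions, and real Hangul processing would need the morphological handling the paper develops.

```python
# Illustrative emotion-word stem lexicon; the paper builds its own
# classification of Korean emotional vocabulary.
EMOTION_LEXICON = {
    "기쁘": "joy", "즐겁": "joy",
    "슬프": "sadness", "우울": "sadness",
    "화나": "anger", "짜증": "anger",
}

def tag_emotions(text):
    """Return the emotion categories whose word stems appear in the text.
    Substring matching on stems sidesteps Hangul conjugation only crudely
    (e.g. it misses the irregular form 즐거운); real processing would use
    a morphological analyzer."""
    return sorted({cat for stem, cat in EMOTION_LEXICON.items() if stem in text})

print(tag_emotions("오늘은 정말 기쁘고 즐거운 하루였다"))  # ['joy']
print(tag_emotions("우울하고 짜증나는 날씨다"))  # ['anger', 'sadness']
```

The detected categories would then drive the Sobot's artificial-emotion expression when the text is delivered to the user.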
