• Title/Summary/Keyword: 실험애니메이션 (experimental animation)

Search Results: 257

Development of Korean Music Multimedia Contents for Preschooler - With Priority to Animation - (유아용 한국음악 멀티미디어 콘텐츠 개발 연구 - 애니메이션을 중심으로 -)

  • Choi, Yoo-Mi
    • The Journal of the Korea Contents Association
    • /
    • v.7 no.2
    • /
    • pp.132-141
    • /
    • 2007
  • Traditional music education for preschoolers is an important stage in forming human character. Because children's intelligence and emotions develop rapidly at this age, such education has value not only for improving knowledge but also for helping children understand the unique sentiment of the Korean people. Under present circumstances, however, there are not enough contents for traditional music education. We therefore researched and analyzed existing educational contents, adapted them for a multimedia environment in the form of animation, and conducted a series of experiments with kindergarten children using interactive animations familiar to them. The children became interested in the contents, and after a lesson with the short bamboo flute content they showed clear improvement in playing the instrument. This suggests that Korean music educational content realized as animation can improve educational effectiveness by eliciting children's enjoyment and interest.

Development of Emotion Contents Recommender System for Improvement of Sentimental Status (감정 및 정서상태 전이를 위한 감성 컨텐츠 추천 시스템 개발)

  • Park, Myon-Woong;Ha, Sung-Do;Jeong, Do-Un;Lyoo, In-Kyoon;Ahn, Seong-Min
    • Science of Emotion and Sensibility
    • /
    • v.10 no.1
    • /
    • pp.1-11
    • /
    • 2007
  • An infotainment service intended to enhance human emotion is introduced in this paper. The service is to be installed on a robot that helps elderly persons live a comfortable and enjoyable life. The research started by defining undesirable emotional states in everyday life, and psychological skills to cope with those states were sought. A methodology for providing emotion contents that reflect the coping skills is then suggested. Based on Cognitive Behavior Therapy, the coping skills are used to edit animation clips, and a movie recommendation system that utilizes the edited clips is being developed. The development process is described, in which emotion elements are taken into consideration in addition to user preference as criteria for recommendation.

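The recommendation criterion described in the abstract above — user preference combined with how well a clip counters the user's current emotional state — can be sketched as follows. This is a hypothetical illustration, not the paper's implementation; the field names, weighting scheme, and `uplift` scores are assumptions.

```python
# Hypothetical sketch of emotion-aware recommendation: rank clips by a
# weighted sum of user preference and how well the clip's content is
# expected to improve the user's current (undesirable) emotional state.
def rank_clips(clips, current_emotion, alpha=0.5):
    # clips: list of dicts with a 'preference' score in [0,1] and an
    # 'uplift' map of per-emotion improvement scores, e.g.
    # {'preference': 0.8, 'uplift': {'sad': 0.9}}
    def score(clip):
        return (alpha * clip["preference"]
                + (1 - alpha) * clip["uplift"].get(current_emotion, 0.0))
    return sorted(clips, key=score, reverse=True)
```

With `alpha` at 0.5, a clip strongly matched to the user's emotional state can outrank a clip the user merely prefers, which mirrors the paper's point that emotion elements are considered in addition to preference.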

Parallel Rendering of High Quality Animation based on a Dynamic Workload Allocation Scheme (작업영역의 동적 할당을 통한 고화질 애니메이션의 병렬 렌더링)

  • Rhee, Yun-Seok
    • Journal of the Korea Society of Computer and Information
    • /
    • v.13 no.1
    • /
    • pp.109-116
    • /
    • 2008
  • Although many studies on parallel rendering based on PC clusters have been done, most of them do not cope with non-uniform scenes in which the locations of 3D models are biased. In this work, we built a PC cluster system with POV-Ray, a free public-domain rendering package, and developed an adaptive load-balancing scheme to optimize parallel efficiency. In particular, we noticed that a frame of a 3D animation is closely coherent with its adjacent frames, so the distribution of computation can be estimated from the computation time of the previous frame. Experimental results with two real animation data sets show that the proposed scheme reduces execution time by 40% compared to a simple static partitioning scheme.

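The frame-coherence idea above — use last frame's measured costs to repartition the next frame's work — can be sketched with a greedy contiguous partition of image rows. This is a minimal illustration under assumed data shapes, not the paper's scheme: `row_cost` stands in for per-row render times estimated from the previous frame.

```python
def partition_rows(row_cost, n_nodes):
    """Split rows into n_nodes contiguous strips of near-equal estimated cost.

    row_cost: per-row render cost estimated from the previous frame,
    exploiting frame-to-frame coherence of the animation.
    Returns a list of (start, end) half-open row ranges, one per node.
    """
    total = sum(row_cost)
    target = total / n_nodes          # ideal cost per node
    bounds, acc, start = [], 0.0, 0
    for i, c in enumerate(row_cost):
        acc += c
        # close a strip once it reaches its fair share of the work
        if acc >= target and len(bounds) < n_nodes - 1:
            bounds.append((start, i + 1))
            start, acc = i + 1, 0.0
    bounds.append((start, len(row_cost)))
    return bounds
```

A static scheme would always cut the image into equal-height strips; here a strip covering a complex region (high `row_cost`) shrinks, which is the behavior the 40% speedup relies on.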

3-D Facial Animation on the PDA via Automatic Facial Expression Recognition (얼굴 표정의 자동 인식을 통한 PDA 상에서의 3차원 얼굴 애니메이션)

  • Lee Don-Soo;Choi Soo-Mi;Kim Hae-Hwang;Kim Yong-Guk
    • The KIPS Transactions:PartB
    • /
    • v.12B no.7 s.103
    • /
    • pp.795-802
    • /
    • 2005
  • In this paper, we present a facial expression recognition-synthesis system that automatically recognizes seven basic emotions and renders a face in a non-photorealistic style on a PDA. To recognize facial expressions, we first detect the face area within the image acquired from the camera, then apply a normalization procedure for geometric and illumination corrections. To classify a facial expression, we found that combining Gabor wavelets with an enhanced Fisher model gives the best result; in our case, the output is a weighting over the seven emotions. This weighting information, transmitted to the PDA via a mobile network, is used for non-photorealistic facial expression animation. To render a 3-D avatar with a unique facial character, we adopted a cartoon-like shading method. We found that facial expression animation using emotional curves expresses the timing of an expression more effectively than linear interpolation.
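The contrast drawn above between linear interpolation and emotional curves can be illustrated with a simple easing function. The smoothstep curve below is an assumption for illustration — the paper's actual emotional curves are not specified in the abstract — but it shows the key difference: a slow onset and release around a fast apex, rather than constant speed.

```python
def linear(t):
    # constant-speed interpolation of an expression weight over t in [0,1]
    return t

def emotional_curve(t):
    # Smoothstep-style ease: the expression builds slowly, peaks quickly,
    # and settles slowly, which reads as more natural timing than a line.
    return t * t * (3.0 - 2.0 * t)
```

Both functions map an animation parameter `t` to a blend weight between the neutral and target expression; only the timing profile differs.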

A Study on the Fabrication of Facial Blend Shape of 3D Character - Focusing on the Facial Capture of the Unreal Engine (3D 캐릭터의 얼굴 블렌드쉐입(blendshape)의 제작연구 -언리얼 엔진의 페이셜 캡처를 중심으로)

  • Lou, Yi-Si;Choi, Dong-Hyuk
    • The Journal of the Korea Contents Association
    • /
    • v.22 no.8
    • /
    • pp.73-80
    • /
    • 2022
  • Facial expression is an important means of conveying character in movies and animations, and facial capture technology supports faster and more effective production of facial animation for 3D characters. Blendshape techniques are the most widely used method for producing high-quality 3D facial animation, but traditional blendshape production often takes a long time. The purpose of this study is therefore to shorten the blendshape production period while achieving results not far behind traditional production. In this paper, a method that transfers blendshapes across models is compared with the traditional method of making blendshapes, and the validity of the new method is verified. Using the kit boy character from Unreal Engine as the test subject, this study conducted facial capture tests with the two blendshape production techniques and compared and analyzed the resulting facial effects.

Analysis of facial expressions using three-dimensional motion capture (3차원동작측정에 의한 얼굴 표정의 분석)

  • 박재희;이경태;김봉옥;조강희
    • Proceedings of the ESK Conference
    • /
    • 1996.10a
    • /
    • pp.59-65
    • /
    • 1996
  • The human face is where human emotion is expressed most clearly, so there has long been an effort to study facial expressions in connection with emotion. Recently it has become possible to study facial expressions by measuring changes in facial temperature, by measuring facial muscle movement with electromyography (EMG), and by image or motion analysis. In this study, changes in human facial expression were measured with a three-dimensional motion analysis system. Two experiments were designed: in the first, subjects were asked to make smiling, surprised, angry, and neutral expressions, which were then measured; in the second, subjects were shown a comedy film and a horror film and their changes in expression were measured. Five adult males participated. Because appropriate emotion-inducing stimuli could not be presented, the full set of experiments and analyses for the six basic expressions originally intended (happiness, sadness, disgust, fear, anger, surprise) could not be carried out, and more refined experimental preparation covering the remaining expressions is required in future work. Such research can be applied to emotional (Kansei) engineering, consumer response measurement, computer animation, and information display.


The Effect of Integrated Instruction for Improving English Ability (Listening & Reading Skills) of Elementary School Students: Using Science Animation (초등학생의 융합수업이 영어 능력(듣기, 읽기) 향상에 미치는 영향: 과학 애니메이션의 활용)

  • Park, Mee-Hwa;Sohng, Hae Sung
    • Journal of the Korea Convergence Society
    • /
    • v.10 no.5
    • /
    • pp.133-140
    • /
    • 2019
  • The purpose of this study is to examine the effect of integrated English instruction using science animation on Korean sixth-grade elementary students' English ability. Twenty-seven students received ten minutes of integrated English instruction before their regular class, while the other students did review activities instead. As a pre-test, the students took the sixth-grade diagnostic evaluation administered by the C office of education in 2018; ten months later, as a post-test, they took the first-grade middle school diagnostic evaluation administered by the C office of education in 2017. The results showed that both the English ability and the affective attitude of the students who received the integrated instruction improved. These results suggest that integrated English instruction can contribute to improving Korean elementary students' English ability.

A Study on Expression Analysis of Animation Character Using Action Units(AU) (Action Units(AU)를 사용한 애니메이션 캐릭터 표정 분석)

  • Shin, Hyun-Min;Weon, Sun-Hee;Kim, Gye-Young
    • Proceedings of the Korean Society of Computer Information Conference
    • /
    • 2009.01a
    • /
    • pp.163-167
    • /
    • 2009
  • In this paper, the facial components of 2D animation characters with various face shapes are extracted and their expressions analyzed in two main stages. In the first stage, the dynamic mesh model used in existing face recognition and expression recognition research is simplified into an optimal standard mesh model for application to character faces, and this model is used to extract the position and shape information of the facial components. In the second stage, using the three facial components extracted in the previous stage (eyebrows, eyes, mouth), twelve of the 44 Action Units (AU) defined in the Facial Action Coding System (FACS) are used to analyze and define five basic facial expressions of a character. The expression analysis accuracy for the five basic expressions was measured with the AUs defined in this paper, and the validity of the proposed AU definitions is demonstrated by experiments on different characters.

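The AU-to-expression step described above can be sketched as a set-overlap classifier. The AU numbers below follow common FACS conventions for illustration and are not necessarily the paper's 12-AU subset; the scoring rule is likewise an assumption.

```python
# Hypothetical FACS-style scoring: map detected Action Units to a basic
# expression by how completely the expression's defining AU set is matched.
EXPRESSION_AUS = {
    "happiness": {6, 12},          # cheek raiser + lip corner puller
    "surprise":  {1, 2, 5, 26},    # brow raisers + upper lid raiser + jaw drop
    "anger":     {4, 5, 7, 23},
    "sadness":   {1, 4, 15},
    "disgust":   {9, 15, 16},
}

def classify(detected_aus):
    # score = fraction of an expression's AUs found among the detected AUs
    def score(expr):
        aus = EXPRESSION_AUS[expr]
        return len(aus & detected_aus) / len(aus)
    return max(EXPRESSION_AUS, key=score)
```

For animation characters the detector feeding `detected_aus` would work on the extracted eyebrow, eye, and mouth shapes rather than photographic features, which is the adaptation the paper focuses on.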

Modeling of Cloth Material for Garment Animation (의상 애니메이션을 위한 직물 소재의 모델링)

  • 이상곤;남양희
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2002.10d
    • /
    • pp.418-420
    • /
    • 2002
  • Since the 1980s, research in computer graphics has sought to generate the natural movement of clothing. Physical models such as the finite-element continuum model and the mass-spring model have appeared, and plausible cloth motion has been obtained by combining them with numerical integration and accurate collision detection and response. However, these studies mostly dealt with standard shape deformation and could not depict the differences arising from material properties such as thickness, texture, and stiffness. In this paper, we survey and analyze the fabric material properties studied in clothing science, and propose and test a method for extracting the characteristic parameters of fabrics through visual similarity evaluation between simulated virtual cloth and real cloth. As a result, animation patterns distinguishable by representative cloth type could be generated.

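The mass-spring model mentioned above is the standard starting point for this line of work, and the material parameters it exposes (stiffness, damping) are exactly the knobs the paper's parameter-extraction method tunes per fabric. The sketch below is a minimal explicit-Euler step, not the paper's simulator; the function signature and parameter names are assumptions.

```python
import numpy as np

def step(pos, vel, springs, rest, k, damping, dt, mass=1.0, g=-9.8):
    """One explicit-Euler step of a mass-spring cloth.

    pos, vel: (N, 3) particle positions and velocities
    springs:  list of (i, j) particle index pairs
    rest:     rest length of each spring
    k, damping: stiffness and velocity damping -- the fabric parameters
    """
    force = np.zeros_like(pos)
    force[:, 1] += mass * g                       # gravity on every particle
    for (i, j), r in zip(springs, rest):
        d = pos[j] - pos[i]
        length = np.linalg.norm(d)
        f = k * (length - r) * d / length         # Hooke's law along the spring
        force[i] += f
        force[j] -= f
    vel = (vel + dt * force / mass) * damping     # integrate and damp velocity
    return pos + dt * vel, vel
```

A stiff, heavily damped parameter set behaves like denim; a soft, lightly damped one like silk — the visual-similarity evaluation in the paper is what selects these values per real fabric.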

Implementation of Emotion Recognition System using Internet Phone (인터넷 폰을 이용한 감성인식 시스템 구현)

  • Kwon, Byeong-Heon;Seo, Burm-Suk
    • Journal of Digital Contents Society
    • /
    • v.8 no.1
    • /
    • pp.35-40
    • /
    • 2007
  • In this paper, we introduce an emotion recognition system and a character display that expresses emotion. We propose methods for extracting the characteristic parameters that express the user's emotion and for inferring emotion through a pattern-matching algorithm. We also implemented a display platform that recognizes the caller's emotion over an Internet phone and displays character animations expressing the emotion inferred by the recognition algorithm.
