• Title/Summary/Keyword: Interactive Character Animation (대화형 캐릭터 애니메이션)

Search Results: 10 (processing time: 0.024 seconds)

Control of a Three-Dimensional Character Animation Based on H-Anim (H-Anim 기반의 3차원 캐릭터 애니메이션 제어)

  • Kim, Young-Shin;Lee, Min-Geun;Lee, Myeong-Won
    • Journal of the Korea Computer Graphics Society
    • /
    • v.13 no.4
    • /
    • pp.1-6
    • /
    • 2007
  • In this paper, we describe a method of controlling the animation of 3D characters according to the ISO/IEC 19774 (H-Anim) specification, which has been released by the Web3D Consortium and ISO/IEC SC24 WG6. The animation structure of an H-Anim character can be defined and modified in our H-Anim editor program. Our H-Anim animator generates the character's motion automatically from motion parameters entered interactively at the character's joints. This paper focuses on the development of a motion generation tool for human-like characters defined by H-Anim structures.

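The joint-parameter control described in the abstract above can be illustrated with a toy forward-kinematics sketch. All names, the 2-D simplification, and the interpolation scheme are illustrative assumptions, not the actual H-Anim editor's API:

```python
import math

# A skeleton is a hierarchy of joints; a motion is produced by interpolating
# per-joint rotation parameters and propagating rotations down the hierarchy.

class Joint:
    def __init__(self, name, offset, children=()):
        self.name = name
        self.offset = offset          # (x, y) offset from parent; 2D for brevity
        self.children = list(children)

def world_positions(joint, parent_pos=(0.0, 0.0), parent_rot=0.0,
                    angles=None, out=None):
    """Forward kinematics: accumulate rotations down the hierarchy."""
    out = {} if out is None else out
    angles = angles or {}
    rot = parent_rot + angles.get(joint.name, 0.0)
    ox, oy = joint.offset
    # Rotate this joint's offset by the accumulated parent rotation.
    x = parent_pos[0] + ox * math.cos(parent_rot) - oy * math.sin(parent_rot)
    y = parent_pos[1] + ox * math.sin(parent_rot) + oy * math.cos(parent_rot)
    out[joint.name] = (x, y)
    for c in joint.children:
        world_positions(c, (x, y), rot, angles, out)
    return out

def interpolate_keys(key0, key1, t):
    """Linear interpolation between two sets of joint angles (0 <= t <= 1)."""
    return {k: key0[k] + (key1[k] - key0[k]) * t for k in key0}

# A toy arm: shoulder -> elbow -> wrist.
wrist = Joint("wrist", (1.0, 0.0))
elbow = Joint("elbow", (1.0, 0.0), [wrist])
shoulder = Joint("shoulder", (0.0, 0.0), [elbow])

pose = interpolate_keys({"shoulder": 0.0, "elbow": 0.0},
                        {"shoulder": math.pi / 2, "elbow": 0.0}, 0.5)
positions = world_positions(shoulder, angles=pose)
```

Sampling `t` over a sequence of key poses would yield the kind of automatically generated joint motion the abstract mentions.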

Control of a Three-Dimensional Character Animation Based on H-Anim (H-Anim 기반의 3차원 캐릭터 애니메이션 제어)

  • Kim, Young-Shin;Lee, Min-Geun;Lee, Myeong-Won
    • Proceedings of the Korean Information Science Society Conference
    • /
    • 2007.06b
    • /
    • pp.194-198
    • /
    • 2007
  • This paper describes a method for generating and controlling the animation of 3D characters modeled on ISO/IEC 19774 (H-Anim), the international standard jointly established by the Web3D Consortium and ISO/IEC SC24 WG6. An H-Anim character's animation structure can be defined and modified interactively with the H-Anim editor. In the H-Anim animator interface, motion parameters can be entered interactively at the character's joints; motion is generated accordingly, and the character animation is visualized.


The Development of Reenactment System (사건 재연 시스템 개발)

  • 윤여천;변혜원;전성규
    • Proceedings of the Korean Society of Broadcast Engineers Conference
    • /
    • 2001.11b
    • /
    • pp.45-49
    • /
    • 2001
  • This paper introduces the development of a news event reenactment system that can quickly and conveniently produce animations featuring multiple virtual characters, in order to effectively convey to viewers news events that are difficult to capture with real cameras, such as fires and murders. By controlling the movements of multiple virtual characters with pre-built motion and model libraries, the system enables rapid production of virtual character animation without expensive motion capture equipment or actors. Virtual character motion control in the reenactment system consists of three stages: motion assignment, motion generation, and motion editing. In the motion assignment stage, virtual characters and background models are selected from the model library according to the event scenario, and the characters' motions are selected from the motion library. In the motion generation stage, locomotion is connected to the motions describing the specified event, producing smooth character motion. In the motion editing stage, the position and pose of a virtual character at a specific time are adjusted to control interactions between virtual characters, or between a character and the virtual environment, more precisely. For ease of use, the system provides both interactive and script-based interfaces, and it was developed as plug-in software for the 3D graphics package Maya in order to exploit Maya's high-performance graphics capabilities.

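The motion-generation stage described above, where locomotion is connected to event motions, can be sketched as a simple cross-fade stitch. The clip contents (1-D pose values) and blend length are illustrative, not the system's actual motion representation:

```python
# Stitch two library clips with a short cross-fade so the transition from
# one motion to the next is smooth.

def stitch(clip_a, clip_b, blend=4):
    """Concatenate clip_b after clip_a, cross-fading over `blend` frames."""
    if blend == 0:
        return clip_a + clip_b
    head, tail = clip_a[:-blend], clip_a[-blend:]
    out = list(head)
    for i in range(blend):
        w = (i + 1) / (blend + 1)          # weight ramps toward clip_b
        out.append(tail[i] * (1 - w) + clip_b[i] * w)
    out.extend(clip_b[blend:])
    return out

walk    = [0.0, 0.0, 0.0, 0.0, 0.0, 0.0]   # locomotion clip
gesture = [1.0, 1.0, 1.0, 1.0, 1.0, 1.0]   # event-specific clip
motion = stitch(walk, gesture, blend=2)
```

A real system would blend joint rotations (e.g., quaternions) per frame, but the scheduling logic is the same.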

Development of Intelligent Messenger for Affective Interaction of Content Robot (콘텐츠 로봇의 감성적 반응을 위한 지능형 메신저 개발)

  • Park, Bum-Jun;So, Su-Hwan;Park, Tae-Keun
    • The Journal of the Korea Contents Association
    • /
    • v.10 no.9
    • /
    • pp.9-17
    • /
    • 2010
  • Much research has recently been conducted on robots and interactive characters that respond appropriately to a user's affect. In this paper, we develop an intelligent messenger that provides appropriate responses to text input according to the user's intention and affect. To respond properly, the intelligent messenger adopts methods for recognizing the user's speech act and affect, and it uses an AIML-based interactive script whose entries carry additional tags expressing affect and speech act. If the intelligent messenger finds a suitable reply in the interactive script, it displays the reply in a dialog window, and an animated character expresses an emotion assimilated to the user's affect. If the animated character is synchronized with a content robot over a wireless link, a robot in the same space as the user can provide an emotional response.
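An affect-tagged dialogue script in the spirit of the approach above can be sketched as a pattern table. The patterns, speech-act labels, replies, and emotion tags are invented for illustration; the paper's actual AIML script format is not reproduced here:

```python
# Each script entry pairs trigger keywords with a reply and an emotion tag
# that the animated character (or synchronized robot) should express.

SCRIPT = [
    # (trigger keywords, speech act, reply, emotion tag)
    ({"hello", "hi"},    "greet",    "Hello! How are you today?", "happy"),
    ({"sad", "unhappy"}, "state",    "I'm sorry to hear that.",   "sympathetic"),
    ({"bye", "goodbye"}, "farewell", "Goodbye, see you soon!",    "calm"),
]

def respond(text):
    """Return (reply, emotion) for the first matching script entry."""
    words = set(text.lower().split())
    for keywords, _act, reply, emotion in SCRIPT:
        if words & keywords:
            return reply, emotion
    return "Could you tell me more?", "neutral"   # fallback reply

reply, emotion = respond("I feel sad today")
```

The emotion tag would drive the character animation, while the reply text goes to the dialog window and TTS.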

Motion Patches (모션 패치)

  • Choi, Myung-Geol;Lee, Kang-Hoon;Lee, Je-Hee
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.33 no.1_2
    • /
    • pp.119-127
    • /
    • 2006
  • Real-time animation of human figures in virtual environments is an important problem in the context of computer games and virtual environments. Recently, the use of large collections of captured motion data has added increased realism to character animation. However, when the virtual environment is large and complex, the effort of capturing motion data in a physical environment and adapting it to an extended virtual environment is the bottleneck for achieving interactive character animation and control. We present a new technique for allowing our animated characters to navigate through a large virtual environment that is constructed from a small set of building blocks. The building blocks can be tiled or aligned in a repeating pattern to create a large environment. We annotate each block with a motion patch, which specifies what motions are available to animated characters within the block. We demonstrate the versatility and flexibility of our approach through examples in which multiple characters are animated and controlled at interactive rates in large, complex virtual environments.
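The core idea above, annotating each building block with the motions available inside it, can be sketched with a small lookup table. Block types, motion names, and the grid layout are illustrative, not the paper's data:

```python
# Each building-block type carries a "motion patch": the motions an
# animated character may perform while inside a block of that type.

PATCHES = {
    "floor":  ["walk", "run"],
    "stairs": ["climb_up", "climb_down"],
    "bench":  ["sit", "stand_up"],
}

# A small environment tiled from the block types above (rows x columns).
ENVIRONMENT = [
    ["floor", "floor", "stairs"],
    ["floor", "bench", "stairs"],
]

def available_motions(row, col):
    """Motions available to a character in the block at (row, col)."""
    return PATCHES[ENVIRONMENT[row][col]]
```

Because annotation is per block type, tiling the same few blocks into an arbitrarily large environment requires no additional motion capture.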

Development of A News Event Reenactment System (사건재연 시스템 개발)

  • 윤여천;변혜원;전성규;박창섭
    • Journal of Broadcast Engineering
    • /
    • v.7 no.1
    • /
    • pp.21-27
    • /
    • 2002
  • This paper presents a news event reenactment system (NERS), which generates virtual character animations in a quick and convenient manner. NERS can thus be used to produce computer graphics (CG) scenes of news events that are hard to photograph, such as fires, traffic accidents, and murder cases. Using a large set of captured motion data and CG model data, the system produces an appropriate animation of virtual characters directly, without any motion capture device or actors in the authoring stage. NERS is designed to make virtual characters move along user-defined paths, to stitch motions smoothly, and to modify the position and pose of a virtual character's articulations in a specific frame. A virtual character can therefore be controlled precisely so as to interact with the virtual environment and other characters. NERS provides both an interactive and a script-based (MEL: Maya Embedded Language) interface so that users can operate the system conveniently. The system has been implemented as a plug-in for the commercial CG tool Maya (Alias/Wavefront) in order to make use of its advanced functions.

Dragging Body Parts in 3D Space to Direct Animated Characters (3차원 공간 상의 신체 부위 드래깅을 통한 캐릭터 애니메이션 제어)

  • Lee, Kang Hoon;Choi, Myung Geol
    • Journal of the Korea Computer Graphics Society
    • /
    • v.21 no.2
    • /
    • pp.11-20
    • /
    • 2015
  • We present a new interactive technique for directing the motion sequences of an animated character by dragging a specific body part to a desired location in the three-dimensional virtual environment via a hand motion tracking device. The motion sequences of our character are synthesized by reordering subsequences of captured motion data based on a well-known graph representation. For each new input location, our system samples the space of possible future states by unrolling the graph into a spatial search tree, and retrieves one of the states at which the dragged body part of the character comes closer to the input location. We minimize the difference between each pair of successively retrieved states, so that the user is able to anticipate which states will be found by varying the input location and, as a result, to quickly reach the desired states. The usefulness of our method is demonstrated through experiments with breakdance, boxing, and basketball motion data.
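The search described above, unrolling a motion graph into a shallow tree of future states and retrieving the one whose dragged body part lies nearest the input location, can be sketched as follows. The graph, states, and hand positions are toy data, not the paper's capture database:

```python
# Motion graph: each state lists the states reachable from it.
GRAPH = {
    "idle":  ["step", "reach"],
    "step":  ["idle", "reach"],
    "reach": ["idle"],
}
# Position of the dragged body part (the hand) in each state.
HAND_POS = {"idle": (0.0, 0.0), "step": (1.0, 0.0), "reach": (2.0, 1.0)}

def dist(p, q):
    return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

def closest_state(start, target, depth=3):
    """Unroll GRAPH from `start` to `depth` levels and return the state
    whose hand position is nearest to `target`."""
    frontier, seen = [start], {start}
    best, best_d = start, dist(HAND_POS[start], target)
    for _ in range(depth):
        nxt = []
        for s in frontier:
            for t in GRAPH[s]:
                if t in seen:
                    continue
                seen.add(t)
                nxt.append(t)
                d = dist(HAND_POS[t], target)
                if d < best_d:
                    best, best_d = t, d
        frontier = nxt
    return best
```

In the actual system the tree nodes are whole motion subsequences and the search also penalizes jumps between successively retrieved states; this sketch keeps only the nearest-state retrieval.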

On the Development of Animated Tutoring Dialogue Agent for Elementary School Science Learning (초등과학 수업을 위한 애니메이션 기반 튜터링 다이얼로그 에이전트 개발)

  • Jeong, Sang-Mok;Han, Byeong-Rae;Song, Gi-Sang
    • Journal of The Korean Association of Information Education
    • /
    • v.9 no.4
    • /
    • pp.673-684
    • /
    • 2005
  • In this research, we developed a "computer tutor" that mimics a human tutor with an animated tutoring dialogue agent, and we integrated the agent into teaching-learning material for the elementary science subject. The developed system is a natural-language-based teaching-learning system using one-to-one dialogue. It analyzes a student's answer and then provides an appropriate answer or question after comparing the student's answer with elementary-school-level achievement standards. When the agent gives a question or answer, it uses a TTS (Text-to-Speech) function. The agent also has an animated human tutor face for providing more human-like feedback. The developed dialogue interface was applied to 64 sixth-grade students. The test results show that the test group's average score is higher than the control group's by 10.797 points. This shows that, unlike conventional web courseware, our "ask-answer" process and the animated character with a human tutor's emotional expressions attract students and help them become immersed in the courseware.


An Interactive Character Animation and Data Management Tool (대화형 캐릭터 애니메이션 생성과 데이터 관리 도구)

  • Lee, Min-Geun;Lee, Myeong-Won
    • The KIPS Transactions:PartA
    • /
    • v.8A no.1
    • /
    • pp.63-69
    • /
    • 2001
  • In this paper, we present an interactive 3D character modeling and animation tool that includes a data management facility for editing animations. It includes an animation editor for changing animation sequences according to the modified structure of a 3D object in the object structure editor. The animation tool can produce motion data independently of any modeling tool, including our own. Unlike conventional 3D graphics tools that model objects from geometrically calculated data, our tool models 3D geometry and animation data by interactively approximating the real object from 2D images. Some applications do not need a precise representation, but rather an easier way to obtain an approximate model that looks similar to the real object; our tool is appropriate for such applications. This paper focuses on data management for enhancing automation and convenience when editing a motion or when mapping a motion to another character.


Interactive Animation of Articulated Bodies using a Procedural Method (절차적 방법을 이용한 다관절체의 대화형 동작생성)

  • Bae, Hui-Jeong;Baek, Nak-Hun;Lee, Jong-Won;Yu, Gwan-U
    • Journal of KIISE:Computer Systems and Theory
    • /
    • v.28 no.12
    • /
    • pp.620-631
    • /
    • 2001
  • In interactive environments, including computer games and virtual reality applications, there is an increasing need for interactive control of articulated body motions. Recently, physically based methods, including constrained dynamics techniques, have been introduced in this area to produce more realistic animation sequences. However, they can hardly achieve real-time control of articulated bodies because of their heavy computations. In this paper, we present a procedural method for the interactive animation of articulated bodies. In our method, each object of the constrained body is first moved according to its physical properties and external forces, without considering any constraints. Then, the locations of the objects are adjusted to satisfy the given constraints. By adopting this two-stage approach, we avoid solving large linear systems of equations and finally achieve interactive animation of articulated bodies. We also present a few example animation sequences that were generated interactively on PC platforms. This method can easily be applied to character animation in virtual environments.

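The two-stage procedure above, an unconstrained move followed by a constraint-satisfying adjustment, can be sketched in a position-based style on a pinned chain. The masses, link lengths, time step, and iteration count are illustrative assumptions, not the paper's formulation:

```python
import math

def step(points, lengths, dt=0.1, gravity=-9.8, iters=10):
    """One simulation step for a chain of points; points[0] is pinned."""
    # Stage 1: unconstrained move (explicit integration of gravity only).
    moved = [(x, y + gravity * dt * dt) for (x, y) in points]
    moved[0] = points[0]                       # root joint stays fixed
    # Stage 2: iteratively adjust positions to restore the link lengths,
    # avoiding any large linear system.
    for _ in range(iters):
        for i, rest in enumerate(lengths):
            (x0, y0), (x1, y1) = moved[i], moved[i + 1]
            dx, dy = x1 - x0, y1 - y0
            d = math.hypot(dx, dy) or 1e-9
            corr = (d - rest) / d
            if i == 0:                         # root pinned: move child only
                moved[1] = (x1 - dx * corr, y1 - dy * corr)
            else:                              # split correction between ends
                moved[i] = (x0 + dx * corr / 2, y0 + dy * corr / 2)
                moved[i + 1] = (x1 - dx * corr / 2, y1 - dy * corr / 2)
    return moved

# A horizontal two-link chain sagging under gravity for one step.
chain = [(0.0, 0.0), (1.0, 0.0), (2.0, 0.0)]
new_chain = step(chain, lengths=[1.0, 1.0])
```

Because the adjustment is a cheap local iteration rather than a global solve, a step like this can run at interactive rates, which matches the motivation stated in the abstract.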