• Title/Summary/Keyword: Interactive character animation


An Interactive Character Animation and Data Management Tool (대화형 캐릭터 애니메이션 생성과 데이터 관리 도구)

  • Lee, Min-Geun; Lee, Myeong-Won / The KIPS Transactions: Part A / v.8A no.1 / pp.63-69 / 2001
  • In this paper, we present an interactive 3D character modeling and animation system that includes a data management tool for editing the animation. It provides an animation editor for changing animation sequences according to the modified structure of a 3D object in the object structure editor. The animation tool can produce motion data independently of any modeling tool, including our own. Unlike conventional 3D graphics tools that model objects from geometrically calculated data, our tool builds 3D geometric and animation data by interactively approximating the real object from 2D images. Some applications do not need a precise representation but rather an easy way to obtain an approximate model that looks similar to the real object; our tool is appropriate for such applications. This paper focuses on data management that improves automation and convenience when editing a motion or when mapping a motion to another character.
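The data-management idea of mapping a motion authored for one character onto a differently structured character can be pictured with a minimal sketch. The `MotionClip` layout and the name-based joint mapping below are assumptions made for illustration, not the authors' actual file format or editing workflow.

```python
from dataclasses import dataclass

@dataclass
class MotionClip:
    # per-joint keyframes: joint name -> list of (time, rotation) samples
    curves: dict

def remap_motion(clip: MotionClip, joint_map: dict) -> MotionClip:
    """Map a motion authored for one character structure onto another.

    joint_map: source joint name -> target joint name, edited by the user
    when the two characters' structures differ. Unmapped joints are dropped.
    """
    remapped = {}
    for src_name, keys in clip.curves.items():
        tgt_name = joint_map.get(src_name)
        if tgt_name is not None:
            remapped[tgt_name] = list(keys)  # copy keyframes unchanged
    return MotionClip(curves=remapped)

# Example: reuse an arm-swing motion on a character whose joints are named differently.
walk = MotionClip(curves={"L_Arm": [(0.0, 0.0), (0.5, 45.0)],
                          "R_Arm": [(0.0, 0.0), (0.5, -45.0)]})
mapped = remap_motion(walk, {"L_Arm": "LeftUpperArm", "R_Arm": "RightUpperArm"})
```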


Real-time Interactive Animation System for Low-Priced Motion Capture Sensors (저가형 모션 캡처 장비를 이용한 실시간 상호작용 애니메이션 시스템)

  • Kim, Jeongho; Kang, Daeun; Lee, Yoonsang; Kwon, Taesoo / Journal of the Korea Computer Graphics Society / v.28 no.2 / pp.29-41 / 2022
  • In this paper, we introduce a novel real-time interactive animation system that uses real-time motion input from a low-cost motion-sensing device, the Kinect. Our system generates interaction motions between the user character and a counterpart character in real time. While the motion of the user character is generated by mimicking the user's input motion, the counterpart character's motion is determined as a reaction to the user avatar's motion. During a pre-processing step, our system analyzes the reference motion data and builds a mapping model in advance. At run time, our system first generates initial poses for the two characters and then modifies them to produce plausible interacting behavior. Our experimental results show plausible interaction animations in which the user character performs a modified version of the user's input motion and the counterpart character reacts properly to the user character. The proposed method will be useful for developing real-time interactive animation systems that provide a more immersive experience for users.
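The two-stage run-time flow the abstract describes (generate initial poses, then adjust them for plausible interaction) might be organized roughly as the loop below. The pose representation, the `MappingModel` interface, and the nearest-neighbour lookup are illustrative assumptions, not the authors' actual method.

```python
import numpy as np

class MappingModel:
    """Toy stand-in for the pre-processed mapping model: stores pairs of
    (user pose, counterpart pose) sampled from the reference interaction motion."""
    def __init__(self, user_poses: np.ndarray, counterpart_poses: np.ndarray):
        self.user_poses = user_poses                 # shape (N, D)
        self.counterpart_poses = counterpart_poses   # shape (N, D)

    def react(self, user_pose: np.ndarray) -> np.ndarray:
        # Nearest-neighbour lookup: find the reference user pose most similar
        # to the live input and return the paired counterpart pose.
        dists = np.linalg.norm(self.user_poses - user_pose, axis=1)
        return self.counterpart_poses[int(np.argmin(dists))]

def run_frame(kinect_pose: np.ndarray, model: MappingModel):
    # 1) Initial pose: the user character mimics the sensed input directly.
    user_char_pose = kinect_pose.copy()
    # 2) The counterpart character's pose is chosen to react to the user character.
    counterpart_pose = model.react(user_char_pose)
    # 3) A full system would further edit both poses here (e.g. contact fix-up)
    #    so the interaction stays plausible; this sketch returns them unchanged.
    return user_char_pose, counterpart_pose

# Usage with toy data: two reference pose pairs, one live input frame.
model = MappingModel(np.array([[0.0, 0.0], [1.0, 1.0]]),
                     np.array([[0.5, 0.5], [2.0, 2.0]]))
user_pose, partner_pose = run_frame(np.array([0.9, 1.1]), model)
```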

Linear Interpolation Transition of Character Animation for Immediate 3D Response to User Motion

  • Lim, Sooyeon / International Journal of Contents / v.11 no.1 / pp.15-20 / 2015
  • The purpose of this research is to study transition methods that produce visually smooth animation, without popping, from continuously recognized user information when a user interacts with a virtual 3D character in real time using body motion. When animation transitions are needed owing to a variety of external conditions, user information must be recognized continuously so that the character keeps corresponding to the motion. The proposed method performs linear interpolation of the transition using cross-fade and blending techniques. The normalized playing time of the source animation is used both to automatically calculate the transition interpolation length of the target animation and as the criterion for selecting between the cross-fade and blending techniques. In particular, in the case of blending, a weighting value based on the degree of similarity between the two animations is used as the blending parameter. Accordingly, visually convincing transitions are performed on interactive holographic projection systems.
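As a concrete illustration of the cross-fade idea, the sketch below linearly blends a source pose and a target pose over a transition driven by the source clip's normalized playing time, with a similarity-based weight as the blending parameter. The function names and the per-frame pose format are assumptions made here for illustration, not the paper's exact formulation.

```python
import numpy as np

def pose_similarity(pose_a: np.ndarray, pose_b: np.ndarray) -> float:
    """Similarity in (0, 1]: 1 when the two poses are identical."""
    return 1.0 / (1.0 + np.linalg.norm(pose_a - pose_b))

def transition_pose(source_clip: np.ndarray, target_clip: np.ndarray,
                    source_time: float) -> np.ndarray:
    """Cross-fade from source_clip to target_clip.

    source_clip, target_clip: arrays of shape (frames, dof).
    source_time: normalized playing time of the source clip in [0, 1],
    reused here to drive how far the transition has progressed.
    """
    src_frame = source_clip[int(source_time * (len(source_clip) - 1))]
    tgt_frame = target_clip[int(source_time * (len(target_clip) - 1))]

    # The blend weight grows with normalized playing time and is scaled by the
    # similarity of the two poses, so dissimilar poses transition more conservatively.
    w = source_time * pose_similarity(src_frame, tgt_frame)
    return (1.0 - w) * src_frame + w * tgt_frame

# Usage with toy clips: 4-frame source and target, 3 degrees of freedom each.
src = np.linspace(0.0, 1.0, 12).reshape(4, 3)
tgt = np.linspace(1.0, 2.0, 12).reshape(4, 3)
blended = transition_pose(src, tgt, source_time=0.5)
```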

The Intelligence APP development for children's Kanji character education using Block and Stop motion

  • Jung, Sugkyu / International journal of advanced smart convergence / v.5 no.2 / pp.66-72 / 2016
  • With the growing shift from traditional educational approaches toward the digital classroom, with electronic textbooks and the demands of digital natives, there is a growing need for new methods of teaching Kanji characters to children. The purpose of this study is to help children learn basic Kanji by using stop-motion and block methods, approaching basic Kanji character education with a more innovative and interactive smartphone app. For the children's Kanji education app proposed in this study, 100 basic Kanji characters for children were selected. Each selected Kanji is produced as a stop-motion animation using a variety of techniques, such as storytelling, to better engage children. The intelligent app is designed with image recognition technology: during learning, children take a picture of the assembled block with their smartphone, the app recognizes whether it is assembled correctly, and then plays the animation corresponding to the assembled Kanji character.
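The check-then-play loop the abstract describes (photograph the assembled blocks, verify them, play the matching stop-motion clip) might be organized roughly as below. The mapping table, file names, and injected recognizer are hypothetical stand-ins, not the app's actual implementation.

```python
from typing import Callable, Optional

# Hypothetical mapping from an assembled Kanji to its stop-motion clip.
KANJI_ANIMATIONS = {
    "山": "animations/mountain.mp4",
    "水": "animations/water.mp4",
}

def on_photo_taken(photo_path: str,
                   expected_kanji: str,
                   recognize: Callable[[str], Optional[str]]) -> str:
    """Recognize the Kanji assembled from blocks and choose a clip to play."""
    recognized = recognize(photo_path)
    if recognized == expected_kanji:
        return KANJI_ANIMATIONS[expected_kanji]   # correct: play the Kanji's animation
    return "animations/try_again.mp4"             # incorrect: ask the child to retry

# Usage with a stub recognizer standing in for the image-recognition model:
clip = on_photo_taken("photo.jpg", "山", recognize=lambda path: "山")
```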

Motion Patches (모션 패치)

  • Choi, Myung-Geol; Lee, Kang-Hoon; Lee, Je-Hee / Journal of KIISE: Computer Systems and Theory / v.33 no.1_2 / pp.119-127 / 2006
  • Real-time animation of human figures in virtual environments is an important problem in the context of computer games and virtual environments. Recently, the use of large collections of captured motion data has added realism to character animation. However, when the virtual environment is large and complex, the effort of capturing motion data in a physical environment and adapting it to an extended virtual environment becomes the bottleneck for achieving interactive character animation and control. We present a new technique that allows animated characters to navigate through a large virtual environment constructed from a small set of building blocks. The building blocks can be tiled or aligned with a repeating pattern to create a large environment. We annotate each block with a motion patch, which specifies what motions are available to animated characters within the block. We demonstrate the versatility and flexibility of our approach through examples in which multiple characters are animated and controlled at interactive rates in large, complex virtual environments.
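The idea of annotating each building block with the motions available inside it can be pictured with the small data-structure sketch below. The class names, fields, and tiling scheme are illustrative assumptions, not the paper's actual representation.

```python
from dataclasses import dataclass, field

@dataclass
class MotionPatch:
    """Annotation attached to one building block: the motion clips a character
    may perform while inside that block."""
    clips: list = field(default_factory=list)   # e.g. ["walk_across", "climb_stairs"]

@dataclass
class BuildingBlock:
    kind: str                                   # e.g. "flat", "stairs", "doorway"
    patch: MotionPatch = field(default_factory=MotionPatch)

def build_environment(layout: list, library: dict) -> list:
    """Tile a large environment from a small library of annotated blocks."""
    return [[BuildingBlock(kind, library[kind]) for kind in row] for row in layout]

# Example: a 2x2 environment tiled from two block types that share their motion patches.
library = {
    "flat":   MotionPatch(clips=["walk", "idle"]),
    "stairs": MotionPatch(clips=["climb_up", "climb_down"]),
}
env = build_environment([["flat", "stairs"], ["flat", "flat"]], library)
```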

A Study on Interactive Avatar in Mobile device using facial expression of Animation Character (모바일 기기에서 애니메이션 캐릭터의 얼굴표현을 이용한 인터랙티브 아바타에 관한 연구)

  • Oh Jeong-Seok; Youn Ho-Chang; Jeon Hong-Jun / Proceedings of the Korea Contents Association Conference / 2005.05a / pp.229-236 / 2005
  • This paper is a study of an emotional interactive avatar on a cellular phone. When the user asks the avatar what he or she wants, it answers with facial expressions based on an animation character, so the user can approach the avatar in a friendlier way.


A Design of AI Middleware for Making Interactive Animation Characters (인터랙티브한 애니메이션 캐릭터 제작을 위한 인공지능 미들웨어 설계)

  • Lee, Seung-Sub; Um, Ky-Hyun; Cho, Kyung-Eun / Journal of Korea Game Society / v.8 no.1 / pp.91-101 / 2008
  • Most designers use professional 3D animation tools such as 3DS MAX to create animation manually. This manual method requires a great deal of time and effort, and does not allow animation characters to interact with one another. In this paper, we design an AI middleware in the form of a 3DS MAX plug-in to address these issues. We present an AI expression structure and an internal processing method for this middleware, as well as a method for creating an AI character's structure. The structure is created by drawing figures and lines that represent AI elements. For the experiment, we produced the same animations with the traditional method and with our method, and measured the task volume of both. The results show that the task volume is similar to or higher than that of the traditional method for small-scale tasks, but up to 43% of the task volume is reduced for large-scale tasks. Using the method proposed in this paper, characters in an animation can interact with each other, and the task volume of large-scale tasks is reduced.
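How a drawn structure of figures and lines might map to character behaviour can be sketched as a tiny graph of AI elements. The node types and the evaluation rule below are invented for illustration only and are not the middleware's actual expression structure.

```python
from dataclasses import dataclass, field

@dataclass
class AINode:
    """One 'figure' in the drawn structure: a condition or an action."""
    name: str
    kind: str                                    # "condition" or "action"
    check: callable = None                       # for conditions: state -> bool
    edges: list = field(default_factory=list)    # 'lines' to the next AI elements

def evaluate(node: AINode, state: dict, actions: list):
    """Walk the drawn graph, collecting the actions whose conditions hold."""
    if node.kind == "condition" and not node.check(state):
        return
    if node.kind == "action":
        actions.append(node.name)
    for nxt in node.edges:
        evaluate(nxt, state, actions)

# Example: "if another character is near, wave at it".
wave = AINode("wave", "action")
near = AINode("is_near", "condition", check=lambda s: s["distance"] < 2.0, edges=[wave])
chosen = []
evaluate(near, {"distance": 1.5}, chosen)   # chosen == ["wave"]
```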


A Study on the Spectatorship through the Characters of <The Simpsons> (<심슨가족>의 캐릭터를 통한 관객성 연구)

  • Youm, Dong-Cheol / Cartoon and Animation Studies / s.21 / pp.1-17 / 2010
  • As animation has emerged as a high value-added content business, more studies focus on the fact that the key to the success of an animation is not its story but its characters. This study examines the characters of The Simpsons, a globally loved animation, and analyzes the interactive way it attracts viewers based on spectatorship theory, so that it can help put the nation's TV animation series on the right track. To achieve this goal, it explores various aspects including the concept of the animation character, the relations between ideology and character, and changes in design according to social phenomena; then, based on spectatorship theory, it analyzes and suggests how the characters of The Simpsons fulfill the conditions required to attract viewers. In addition, it examines the wide application of The Simpsons characters. In conclusion, unlike theatrical animation, TV animation has the characteristic that it can easily and repeatedly deliver messages to viewers over a long time; however, domestic TV animation has failed to utilize this advantage. In other words, while its characters have distinct individuality, they are not supported by creative and solid story lines, and they are not as attractive as the characters of The Simpsons, which successfully evoke sympathy from viewers. In animation, arousing sympathy from viewers or audiences is very important, so a character that reflects social discourse well is an integral part of it. Therefore, in-depth and specialized study of animation characters is highly required for the sustainable success and growth of domestic animation.


UCC Cutout Animation Generation using Video Inputs (비디오 입력을 이용한 UCC 컷아웃 애니메이션 생성 기법)

  • Lee, Yun-Jin; Yang, Seung-Jae; Kim, Jun-Ho / The Journal of the Korea Contents Association / v.11 no.6 / pp.67-75 / 2011
  • We propose a novel non-photorealistic rendering technique that generates a cutout animation from a video for UCC (user-created content). Our method consists of four parts. First, we construct an interactive system for building an articulated character. Second, we extract motions from an input video. Third, we transform the character's motions to reflect the characteristics of cutout animations. Fourth, we render the extracted or transformed components in a cutout animation style. We developed a unified system that lets a user easily create a cutout animation from an input video and showed that the system generates cutout animations efficiently.
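The four-part pipeline listed in the abstract could be organized as the skeleton below. The stage functions are empty placeholders standing in for the paper's actual components (interactive character building, motion extraction, cutout-style motion transformation, and cutout rendering); only the overall flow is illustrated.

```python
def build_articulated_character(user_input):
    """Stage 1: interactively assemble an articulated character (placeholder)."""

def extract_motion(video_frames):
    """Stage 2: extract the character's motion from the input video (placeholder)."""

def to_cutout_motion(motion):
    """Stage 3: transform the motion to the flat, hinged look of cutout animation (placeholder)."""

def render_cutout(character, motion):
    """Stage 4: render the character and its motion in cutout style (placeholder)."""

def make_cutout_animation(video_frames, user_input):
    """Run the four stages in order to turn an input video into a cutout animation."""
    character = build_articulated_character(user_input)
    motion = extract_motion(video_frames)
    motion = to_cutout_motion(motion)
    return render_cutout(character, motion)
```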