• Title/Summary/Keyword: 수화 아바타 (sign-language avatar)

Search results: 13

Web-based Text-To-Sign Language Translating System (웹기반 청각장애인용 수화 웹페이지 제작 시스템)

  • Park, Sung-Wook;Wang, Bo-Hyeun
    • Journal of the Korean Institute of Intelligent Systems / v.24 no.3 / pp.265-270 / 2014
  • Hearing-impaired people have difficulty hearing, so it is also hard for them to learn letters that represent sounds and text that conveys complex, abstract concepts. It has therefore been a natural choice for hearing-impaired people to communicate in sign language, which employs facial expressions and hand and body motion. However, the major communication media in daily life are text and speech, which are big obstacles for hearing-impaired people in accessing information, learning, engaging in intellectual activities, and getting jobs. As delivering information via the internet has become common, hearing-impaired people experience even more difficulty in accessing information, since the internet presents information mostly in text form; this intensifies the imbalance in information accessibility. This paper reports a web-based text-to-sign-language translating system that helps web designers use sign language in web page design. Since the system is web-based, web designers can use it from any common computing environment for internet browsing. The system takes the format of a bulletin board as its user interface. When web designers write paragraphs and post them through the bulletin board to the translating server, the server translates the incoming text to sign language, animates it with a 3D avatar, and records the animation in an MP4 file. The file addresses are fetched by the bulletin board, which enables web designers to embed the translated sign-language file into their web pages using HTML5 or JavaScript. We also analyzed the text used by public-service web pages, identified words new to the translating system, and added them to improve translation. This addition is expected to encourage wide and easy acceptance of public-service web pages by hearing-impaired people.
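The abstract above describes embedding the server-produced MP4 file into a web page with HTML5 or JavaScript. A minimal sketch of what that embedding step might look like, assuming a hypothetical file URL (the paper's actual markup and URLs are not given here):

```typescript
// Sketch only: build the HTML5 <video> markup a web designer could
// paste into a page for a translated sign-language MP4.
// The URL passed in is hypothetical, not from the paper.
function signLanguageVideoTag(mp4Url: string): string {
  return `<video controls width="320">` +
         `<source src="${mp4Url}" type="video/mp4">` +
         `Your browser does not support HTML5 video.` +
         `</video>`;
}

// e.g. a file address fetched from the bulletin board (hypothetical)
const tag = signLanguageVideoTag("https://example.org/sign/post-42.mp4");
console.log(tag);
```

The same string could instead be injected via the DOM (`element.innerHTML = tag`) when the page is assembled with JavaScript rather than static HTML.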

A Script Format of Korean Sign Language for Animated Signing Avatar Service (아바타수어 서비스를 위한 한국수어 스크립트 기술)

  • Lee, Han-kyu;Choi, Ji Hoon;AHN, ChungHyun
    • Proceedings of the Korean Society of Broadcast Engineers Conference / 2020.07a / pp.456-458 / 2020
  • Korean Sign Language (KSL) is the language used by Deaf people in Korea; here, a Deaf person means a person with a hearing impairment who uses KSL as an everyday language. Machine-learning-based technologies are being researched and developed to translate sign language, as a language in its own right, to and from other languages. However, because sign language is a video-based language and the grammar and dictionary system of KSL are still being established, KSL translation technology is advancing relatively slowly compared with translation between other pairs of languages. This paper proposes the sign-language script format and data interface specification needed to translate Korean into KSL and render it.

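The abstract proposes a script format and data interface for driving a signing avatar but does not reproduce the format itself. Purely as a hypothetical illustration of what one entry in such a script might carry, under assumed field names (gloss, timing, optional non-manual marker), not the paper's actual specification:

```typescript
// Hypothetical sketch: one timed sign in an avatar-signing script.
// All field names here are assumptions for illustration only.
interface SignScriptEntry {
  gloss: string;        // identifier of the KSL sign (gloss)
  startMs: number;      // when the avatar starts the sign, in ms
  durationMs: number;   // how long the sign is held, in ms
  nonManual?: string;   // optional non-manual marker, e.g. a facial expression
}

// A toy two-sign script.
const script: SignScriptEntry[] = [
  { gloss: "HELLO", startMs: 0, durationMs: 800 },
  { gloss: "THANK-YOU", startMs: 900, durationMs: 1000, nonManual: "smile" },
];

// Total playback length: the latest end time across all entries.
function totalDurationMs(entries: SignScriptEntry[]): number {
  return entries.reduce((end, e) => Math.max(end, e.startMs + e.durationMs), 0);
}
```

A data interface of roughly this shape would let a renderer schedule avatar motions without re-running translation for each playback.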

A Comic Facial Expression Method for Intelligent Avatar Communications in the Internet Cyberspace (인터넷 가상공간에서 지적 아바타 통신을 위한 코믹한 얼굴 표정의 생성법)

  • 이용후;김상운;청목유직
    • Journal of the Institute of Electronics Engineers of Korea CI / v.40 no.1 / pp.59-73 / 2003
  • As a means of overcoming the linguistic barrier between different languages on the internet, a new sign-language communication system using CG animation techniques has been developed and proposed. In the system, the joint angles of the arms and hands corresponding to gestures, as a non-verbal communication tool, are considered. Emotional expression, however, can also play an important role in communication. In particular, a comic expression is more efficient than a realistic facial expression, and the movements of the cheeks and jaws are more important AUs than those of the eyebrows, eyes, mouth, etc. Therefore, in this paper, we designed a 3D emotion editor using a 2D model, and we extract the AUs (here called PAUs) that play a principal role in expressing emotions. We also proposed a method of generating universal emotional expressions with avatar models that have different vertex structures, employing a method of dynamically adjusting the AU movements according to emotional intensities. The proposed system is implemented with Visual C++ and Open Inventor on Windows platforms. Experimental results show that the system could be used as a non-verbal communication means to overcome the linguistic barrier.
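The last abstract mentions dynamically adjusting AU movements according to emotional intensity. A minimal sketch of one way that adjustment could work, assuming simple linear scaling of per-AU displacement weights and invented AU names (the paper's actual PAUs and adjustment rule are not given here):

```typescript
// Sketch only: scale a peak facial pose by an emotional intensity.
// The AU names and the linear-scaling rule are assumptions.
type AUPose = { [au: string]: number };  // AU id -> displacement weight

function scaleByIntensity(peakPose: AUPose, intensity: number): AUPose {
  // Clamp intensity to [0, 1], then scale every AU weight linearly.
  const t = Math.min(1, Math.max(0, intensity));
  const out: AUPose = {};
  for (const au of Object.keys(peakPose)) {
    out[au] = peakPose[au] * t;
  }
  return out;
}

// e.g. a comic "joy" peak emphasizing cheeks and jaw, at half intensity
const joyPeak: AUPose = { cheekRaise: 1.0, jawDrop: 0.6, browRaise: 0.2 };
const halfJoy = scaleByIntensity(joyPeak, 0.5);
```

Because the pose is just a map of weights, the same scaling applies unchanged to avatar models with different vertex structures, as long as each model maps AU ids to its own vertices.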